Bayesian Data Analysis

Reflecting the wide applicability of these methods, the seminar is jointly organized by the working groups of Prof. von der Linden, Prof. Stadlober, Prof. Maass, and Prof. Kubin. Each student works on a selected topic and gives an oral presentation in class during a 45-minute discussion session. Working in small groups of 2 or 3 students is strongly encouraged.

Suggested topics for seminar presentations

  1. Basic concepts of probability (Von der Linden)
    • Random variable, random vector
    • Probability density function
    • Expectation, moments, cumulants
  2. Statistical signal processing - Bayesian data analysis (Von der Linden)
    • Conditional probability, Bayes rule
    • Likelihood, prior, a posteriori densities
    • Motivating applications
    • Bayesian hierarchical models
  3. Hierarchical models - parameter estimation (Koeppl)
    • Priors: entropic, noninformative, conjugate, improper
    • Evidence procedure
    • Marginalization of hyperparameters
    • Evidence procedure versus marginalization
  4. Bayesian inference in Machine Learning (Shutin, Rank, Nessler)
    • Relevance vector machines
    • Gaussian processes
    • Bayesian backpropagation for neural networks
  5. Inverse problems (Frauendorfer, Schwaighofer)
    • Maximum Entropy (Neuber)
    • Outlier-tolerant parameter estimation
    • Form-free reconstruction
    • Adaptive kernel methods
    • Local smoothness
  6. Model comparison
    • Ockham's factor
    • Bretthorst prior
    • Comparison to the Akaike criterion
  7. Monte Carlo techniques in Bayesian data analysis (Mayer Dieter, Sommer Christian)
    • Introduction, applications, advantages/disadvantages
    • Markov-chain-Monte-Carlo methods
    • Tempering (for bi- or multimodal distributions)
    • Prior predictive value (for model comparison)
  8. Pattern recognition, signal processing
    • Bayesian networks (Pernkopf)
    • Hidden Markov Models (Kubin)
    • Dynamic Bayesian networks
    • Kalman Filter
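As a small illustration of the conjugate priors named under topic 3 and the prior/likelihood/posterior machinery of topic 2, the following sketch (not part of the original course material; all function names are my own) applies Bayes' rule with a Beta prior and a Bernoulli likelihood, where conjugacy keeps the posterior in the Beta family:

```python
# Hypothetical example: conjugate Bayesian updating. With a Beta(a, b)
# prior on a success probability and Bernoulli observations, observing
# k successes in n trials gives the posterior Beta(a + k, b + n - k).
def beta_bernoulli_update(a, b, data):
    """Return posterior (a, b) after observing the 0/1 outcomes in `data`."""
    k = sum(data)
    n = len(data)
    return a + k, b + n - k

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution: the posterior point estimate."""
    return a / (a + b)

# Flat prior Beta(1, 1); observe 7 successes in 10 trials.
a_post, b_post = beta_bernoulli_update(1, 1, [1, 1, 1, 0, 1, 1, 0, 1, 1, 0])
print(a_post, b_post)                        # 8 4
print(round(beta_mean(a_post, b_post), 3))   # 0.667
```

Note how the posterior mean (a + k)/(a + b + n) = 8/12 shrinks the raw frequency 7/10 towards the prior mean 1/2, a simple instance of the regularizing role of the prior.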
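The Markov-chain Monte Carlo methods of topic 7 can likewise be sketched with a minimal random-walk Metropolis sampler. This is a hedged illustration, not material from the seminar; the standard-normal target and the step size are assumptions chosen for simplicity.

```python
import math
import random

def log_target(x):
    """Log of the (unnormalized) target density: a standard normal."""
    return -0.5 * x * x

def metropolis(n_samples, step=1.0, seed=0):
    """Random-walk Metropolis: propose x' ~ N(x, step^2), accept with
    probability min(1, p(x')/p(x)); the chain's stationary law is p."""
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return samples

s = metropolis(20000)
mean = sum(s) / len(s)
var = sum((v - mean) ** 2 for v in s) / len(s)
# Sample mean ≈ 0 and variance ≈ 1 for the standard normal target.
```

Working with log densities, as above, avoids underflow for sharply peaked targets; tempering (also listed under topic 7) addresses the complementary problem of chains that get stuck in one mode of a multimodal target.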

List of participants and assigned topics

Date         Participant's name   Presentation
24.03.2003   Von der Linden
31.03.2003   Koeppl               PDF
07.04.2003   Nessler              PPT, PDF
05.05.2003   Shutin               PDF
12.05.2003   Rank                 PDF
19.05.2003   Schwaighofer         PDF
26.05.2003   Frauendorfer         PDF
02.06.2003   Neuber               PDF
16.06.2003   Mayer, Sommer
23.06.2003   Pernkopf, Kubin

References/course material

J. Berger. A catalog of noninformative priors. ISDS Discussion Paper, 1997. [ bib | .pdf ]
J. O. Berger. Statistical decision theory and Bayesian analysis. Springer, New York, 1985. [ bib ]
S. Brandl. Maschinelles Lernen mit Bayes'schen Netzwerken. Seminar Computational Intelligence C, IGI, TU Graz, 2001. [ bib | http ]
G. L. Bretthorst. Bayesian Spectrum Analysis and Parameter Estimation. Lecture Notes in Statistics. Springer-Verlag, 1988. [ bib | .pdf ]
B. Buck and V. A. Macaulay. Maximum Entropy in Action. Oxford Science Publications, 1991. [ bib ]
A. Faul and M. E. Tipping. A variational approach to robust regression. In G. Dorffner, H. Bischof, and K. Hornik, editors, Proceedings of ICANN'01, pages 95-102, 2001. [ bib | .pdf ]
M. Gibbs. Bayesian Gaussian Processes for Regression and Classification. PhD thesis, University of Cambridge, 1997. [ bib | .ps ]
S. F. Gull. Bayesian inductive inference and maximum entropy. In G. J. Erickson and C. R. Smith, editors, Maximum Entropy and Bayesian Methods, Dordrecht the Netherlands, 1988. Kluwer Academic Publishers. [ bib | .pdf ]
S. F. Gull. Bayesian data analysis: Straight-line fitting. In J. Skilling, editor, Maximum Entropy and Bayesian Methods, Dordrecht the Netherlands, 1989. Kluwer Academic Publishers. [ bib | .pdf ]
D. Heckerman. A tutorial on learning with Bayesian networks. Technical report, Microsoft Research, 1996. [ bib | .pdf ]
E. T. Jaynes. Probability Theory: The Logic of Science. bayes.wustl.edu, 2003. [ bib | .pdf ]
J. N. Kapur and H. K. Kesavan. Entropy Optimization Principles with Applications. Academic Press, 1992. [ bib ]
S. M. Kay. Fundamentals of Statistical Signal Processing: Estimation Theory. Prentice Hall, New Jersey, 1993. [ bib ]
D. J. C. Mackay. Bayesian interpolation. Neural Computation, 4(3):415-447, 1992. [ bib | .pdf ]
D. J. C. MacKay. Bayesian Methods for Adaptive Models. PhD thesis, California Institute of Technology, 1992. [ bib | .ps.gz ]
D. J. C. Mackay. Introduction to Monte Carlo methods. In M. I. Jordan, editor, Learning in Graphical Models, NATO Science Series, pages 175-204. Kluwer Academic Press, 1998. [ bib | .pdf ]
D. J. C. MacKay. Comparison of approximate methods for handling hyperparameters. Neural Computation, 11(5):1035-1068, 1999. [ bib | .pdf ]
B. Nessler. Maschinelles Lernen mit Bayes'schen Netzwerken. Seminar Computational Intelligence C, IGI, TU Graz, 2001. [ bib | .pdf ]
C. Rasmussen. Evaluation of Gaussian Processes and other Methods for Non-linear Regression. PhD thesis, University of Toronto, Department of Computer Science, 1996. [ bib | .ps.gz ]
C. P. Robert. The Bayesian Choice: From Decision-Theoretic Foundations to Computational Implementation. Springer Texts in Statistics. Springer-Verlag, 2001. [ bib ]
D. S. Sivia. Data Analysis: A Bayesian Tutorial. Oxford Science Publications, 1998. [ bib ]
H. H. Thodberg. A review of Bayesian neural networks with an application to near infrared spectroscopy. IEEE Transactions on Neural Networks, 7(1):56-72, 1996. [ bib | .pdf ]
M. E. Tipping. Sparse Bayesian learning and the relevance vector machine. Journal of Machine Learning Research, 1:211-244, June 2001. [ bib | .pdf ]
M. Tribus. Rational Descriptions, Decisions, and Designs. Pergamon Press, 1969. [ bib ]
W. von der Linden. Wahrscheinlichkeitstheorie, Statistik und Datenanalyse. Skriptum, 2003. [ bib | .pdf ]
D. Wickmann. Bayes-Statistik. B.I. Wissenschaftsverlag, 1990. [ bib ]
D. H. Wolpert. On the use of evidence in neural networks. In Stephen José Hanson, Jack D. Cowan, and C. Lee Giles, editors, Advances in Neural Information Processing Systems, volume 5, pages 539-546. Morgan Kaufmann, San Mateo, CA, 1993. [ bib | .pdf ]
D. H. Wolpert and C. E. M. Strauss. What Bayes has to say about the evidence procedure. In Maximum Entropy and Bayesian Methods, 1993. [ bib | .pdf ]
 
Created by marian
Last modified 2005-10-25 17:05