Course detail
Bayesian Models for Machine Learning (in English)
FIT-BAYa
Acad. year: 2023/2024
Probability theory and probability distributions, Bayesian Inference, Inference in Bayesian models with conjugate priors, Inference in Bayesian Networks, Expectation-Maximization algorithm, Approximate inference in Bayesian models using Gibbs sampling, Variational Bayes inference, Stochastic VB, Infinite mixture models, Dirichlet Process, Chinese Restaurant Process, Pitman-Yor Process for Language modeling, Practical applications of Bayesian inference
Language of instruction
Number of ECTS credits
Mode of study
Guarantor
Offered to foreign students
Entry knowledge
Rules for evaluation and completion of the course
- Mid-term exam (24 points)
- Submission and presentation of project (25 points)
- Final exam (51 points)
To receive any points from the final exam, you need to score at least 20 points; otherwise the exam is graded 0 points.
Aims
To demonstrate the limitations of Deep Neural Networks (DNNs), which have become a very popular machine learning tool successful in many areas, but which excel only when a sufficient amount of well-annotated training data is available. To present Bayesian models (BMs), which allow robust decisions to be made even with scarce training data, as they take into account the uncertainty in the model parameter estimates. To introduce the concept of latent variables, which makes BMs modular (i.e. more complex models can be built out of simpler ones) and well suited to cases with missing data (e.g. unsupervised learning, where annotations are missing). To build basic skills and intuitions about BMs, and to develop more advanced topics such as approximate inference methods necessary for more complex models and infinite mixture models based on non-parametric BMs. The course is taught in English.
Study aids
Prerequisites and corequisites
Basic literature
Recommended reading
C. Bishop: Pattern Recognition and Machine Learning, Springer, 2006 (EN)
P. Orbanz: Tutorials on Bayesian Nonparametrics: http://stat.columbia.edu/~porbanz/npb-tutorial.html (EN)
S. J. Gershman and D.M. Blei: A tutorial on Bayesian nonparametric models, Journal of Mathematical Psychology, 2012. (EN)
Classification of course in study plans
- Programme IT-MSC-2 Master's
branch MGMe , 0 year of study, winter semester, compulsory-optional
- Programme MIT-EN Master's , 0 year of study, winter semester, compulsory-optional
- Programme MITAI Master's
specialization NSPE , 0 year of study, winter semester, elective
specialization NBIO , 0 year of study, winter semester, elective
specialization NSEN , 0 year of study, winter semester, elective
specialization NVIZ , 0 year of study, winter semester, elective
specialization NGRI , 0 year of study, winter semester, elective
specialization NADE , 0 year of study, winter semester, elective
specialization NISD , 0 year of study, winter semester, elective
specialization NMAT , 0 year of study, winter semester, elective
specialization NSEC , 0 year of study, winter semester, elective
specialization NISY up to 2020/21 , 0 year of study, winter semester, elective
specialization NCPS , 0 year of study, winter semester, elective
specialization NHPC , 0 year of study, winter semester, elective
specialization NNET , 0 year of study, winter semester, elective
specialization NMAL , 0 year of study, winter semester, compulsory
specialization NVER , 0 year of study, winter semester, elective
specialization NIDE , 0 year of study, winter semester, elective
specialization NEMB , 0 year of study, winter semester, elective
specialization NISY , 0 year of study, winter semester, elective
specialization NEMB up to 2021/22 , 0 year of study, winter semester, elective
- Programme IT-MGR-1H Master's
specialization MGH , 0 year of study, winter semester, recommended course
Type of course unit
Lecture
Teacher / Lecturer
Syllabus
- Probability theory and probability distributions
- Bayesian Inference (priors, uncertainty of the parameter estimates, posterior predictive probability)
- Inference in Bayesian models with conjugate priors
- Inference in Bayesian Networks (loopy belief propagation)
- Expectation-Maximization algorithm (with application to Gaussian Mixture Model)
- Approximate inference in Bayesian models using Gibbs sampling
- Variational Bayes inference
- Infinite mixture models, Dirichlet Process, Chinese Restaurant Process
- Pitman-Yor Process for Language modeling
- Practical applications of Bayesian inference
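As a small taste of the "conjugate priors" lecture topic above, the following sketch (illustrative only, not official course material) shows the textbook Beta-Bernoulli conjugate update: a Beta(a, b) prior on a coin's bias is conjugate to the Bernoulli likelihood, so the posterior after observing h heads and t tails is simply Beta(a + h, b + t). The function name is hypothetical.

```python
# Beta-Bernoulli conjugate update: with a Beta(a, b) prior on the
# success probability p and Bernoulli observations, the posterior
# is Beta(a + heads, b + tails) -- no numerical integration needed.

def beta_bernoulli_update(a, b, observations):
    """Return posterior Beta parameters after Bernoulli observations (1 = head)."""
    heads = sum(observations)
    tails = len(observations) - heads
    return a + heads, b + tails

# Start from a uniform prior Beta(1, 1) and observe 7 heads, 3 tails.
a_post, b_post = beta_bernoulli_update(1, 1, [1, 1, 1, 0, 1, 1, 0, 1, 1, 0])
posterior_mean = a_post / (a_post + b_post)  # (1 + 7) / (2 + 10) = 2/3
print(a_post, b_post, posterior_mean)
```

The posterior mean 2/3 sits between the prior mean (1/2) and the empirical frequency (7/10), illustrating how the prior's influence shrinks as data accumulates, which is the key intuition the course builds on for scarce-data regimes.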
Fundamentals seminar
Teacher / Lecturer
Syllabus
Project
Teacher / Lecturer
Syllabus