Course detail
Classification and recognition
FIT-KRD, Acad. year: 2022/2023
Estimation of parameters by Maximum Likelihood and Expectation-Maximization, formulation of the objective function of discriminative training, Maximum Mutual Information (MMI) criterion, adaptation of GMM models, transforms of features for recognition, modelling of feature space using discriminative sub-spaces, factor analysis, kernel techniques, calibration and fusion of classifiers, applications in recognition of speech, video and text.
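The annotation above mentions transforms of features for recognition; the catalog page carries no code, but a minimal sketch of one such transform, PCA projection onto the leading eigenvectors of the feature covariance (all data here is synthetic and illustrative), could look like:

```python
import numpy as np

# Illustrative sketch only: PCA as a feature transform.
# Centre the features, eigendecompose their covariance matrix,
# and project onto the top-2 principal directions.

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))  # correlated synthetic features

Xc = X - X.mean(axis=0)                # centre each feature dimension
cov = (Xc.T @ Xc) / len(X)             # sample covariance matrix
eigval, eigvec = np.linalg.eigh(cov)   # eigenvalues in ascending order
W = eigvec[:, ::-1][:, :2]             # top-2 principal directions
Z = Xc @ W                             # reduced 2-D features
print(Z.shape)                         # (200, 2)
```

By construction, the variance of each projected column equals the corresponding eigenvalue of the covariance matrix, which is why the leading directions retain the most variance.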
State doctoral exam - topics:
- Maximum Likelihood estimation of parameters of a model
- Probability distribution from the exponential family and sufficient statistics
- Linear regression model and its probabilistic interpretation
- Bayesian models considering the probability distribution (uncertainty) of model parameters
- Conjugate priors and their significance in Bayesian models
- Fisher's linear discriminant analysis
- Difference between generative and discriminative classifiers; their pros and cons
- Perceptron and its learning algorithm as an example of linear classifiers
- Generative linear classifier - Gaussian classifier with shared covariance matrix
- Discriminative classifier based on linear logistic regression
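The first exam topic above, Maximum Likelihood estimation of model parameters, has a simple closed form for a Gaussian: the ML mean is the sample average and the ML covariance is the 1/N sample covariance. A minimal sketch on synthetic data (not course material):

```python
import numpy as np

# Illustrative sketch: ML estimates for a multivariate Gaussian.
# Given i.i.d. samples x_1..x_N, the ML mean is the sample mean and
# the ML covariance is the biased (1/N) sample covariance.

rng = np.random.default_rng(0)
X = rng.normal(loc=[1.0, -2.0], scale=[0.5, 2.0], size=(10000, 2))

mu_ml = X.mean(axis=0)            # ML estimate of the mean
Xc = X - mu_ml
sigma_ml = (Xc.T @ Xc) / len(X)   # ML (1/N) covariance estimate

print(mu_ml)               # close to the true mean [1, -2]
print(np.diag(sigma_ml))   # close to the true variances [0.25, 4.0]
```

Note the 1/N normaliser: the ML covariance estimate is biased, which is exactly the kind of point the exam topic probes.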
Language of instruction
Mode of study
Guarantor
Learning outcomes of the course unit
The students will learn to solve general problems of classification and recognition.
Prerequisites
Co-requisites
Planned learning activities and teaching methods
Assessment methods and criteria linked to learning outcomes
Course curriculum
Work placements
Aims
Specification of controlled education, way of implementation and compensation for absences
Recommended optional programme components
Prerequisites and corequisites
Basic literature
Recommended reading
Simon Haykin: Neural Networks and Learning Machines, 3rd edition, Pearson Education, 2016.
Classification of course in study plans
- Programme DIT Doctoral 0 year of study, summer semester, compulsory-optional
- Programme DIT-EN Doctoral 0 year of study, summer semester, compulsory-optional
- Programme CSE-PHD-4 Doctoral
branch DVI4, 0 year of study, summer semester, elective
Type of course unit
Lecture
Teacher / Lecturer
Syllabus
- Estimation of parameters of Gaussian probability distribution by Maximum Likelihood (ML)
- Estimation of parameters of Gaussian Mixture Model (GMM) by Expectation-Maximization (EM)
- Discriminative training, introduction, formulation of the objective function
- Discriminative training with the Maximum Mutual Information (MMI) criterion
- Adaptation of GMM models - Maximum A-Posteriori (MAP), Maximum Likelihood Linear Regression (MLLR)
- Transforms of features for recognition - basics, Principal Component Analysis (PCA)
- Discriminative transforms of features - Linear Discriminant Analysis (LDA) and Heteroscedastic Linear Discriminant Analysis (HLDA)
- Modeling of feature space using discriminative sub-spaces - factor analysis
- Kernel techniques, SVM
- Calibration and fusion of classifiers
- Applications in recognition of speech, video and text
- Student presentations I
- Student presentations II
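The second syllabus item covers EM estimation of GMM parameters. As a hedged illustration (synthetic 1-D data, not course-provided code), one EM iteration for a two-component mixture computes responsibilities in the E-step and re-estimates weights, means, and variances from the soft counts in the M-step:

```python
import numpy as np

# Illustrative sketch: EM for a 1-D, two-component Gaussian mixture.

def gaussian_pdf(x, mu, var):
    """Density of N(mu, var) evaluated elementwise at x."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def em_step(x, w, mu, var):
    # E-step: responsibility of each component for each sample
    lik = np.stack([w[k] * gaussian_pdf(x, mu[k], var[k]) for k in range(2)])
    resp = lik / lik.sum(axis=0)
    # M-step: re-estimate parameters from the expected sufficient statistics
    nk = resp.sum(axis=1)                    # soft counts per component
    w_new = nk / len(x)
    mu_new = (resp @ x) / nk
    var_new = np.array([(resp[k] * (x - mu_new[k]) ** 2).sum() / nk[k]
                        for k in range(2)])
    return w_new, mu_new, var_new

# Synthetic data: two well-separated clusters at -3 and +3
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-3, 1, 500), rng.normal(3, 1, 500)])

w, mu, var = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])
for _ in range(50):
    w, mu, var = em_step(x, w, mu, var)

print(np.sort(mu))   # component means approach [-3, 3]
```

Each iteration is guaranteed not to decrease the data log-likelihood, which is the standard argument for EM convergence discussed under this topic.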
Guided consultation in combined form of studies
Teacher / Lecturer