FSI-SSP, Acad. year: 2017/2018
The course provides an introduction to the theory of stochastic processes. The following topics are covered: types of processes and their basic characteristics, the covariance function, spectral density, stationarity, examples of typical processes, time series and their evaluation, parametric and nonparametric methods, identification of periodic components, and ARMA processes. The methods are applied in a project on time-series evaluation and prediction, supported by the MATLAB computing environment.
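As an illustration of the basic characteristics mentioned above, consider the autocorrelation function of a simple stationary process. The course works in MATLAB; the following Python/numpy sketch is an assumption for illustration only, not course material. It simulates an AR(1) process and compares its sample autocorrelation at lag 1 with the theoretical value ρ(1) = φ.

```python
import numpy as np

# Illustrative sketch only: the course uses MATLAB, but the same idea is shown
# here in Python/numpy (an assumption, not course material).
# Simulate an AR(1) process X_t = phi * X_{t-1} + e_t and compare its sample
# autocorrelation at lag 1 with the theoretical value rho(1) = phi.
rng = np.random.default_rng(1)
phi, n = 0.7, 20000
e = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]

def sample_acf(x, h):
    """Sample autocorrelation at lag h (the usual biased estimator)."""
    xc = x - x.mean()
    return float(xc[h:] @ xc[:-h] / (xc @ xc)) if h else 1.0

print(sample_acf(x, 1))  # close to 0.7 for a long realization
```

For a stationary AR(1) process, ρ(h) = φ^h, so the sample autocorrelation should decay geometrically with the lag.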
Learning outcomes of the course unit
Recommended optional programme components
Brockwell, P. J., Davis, R. A. Introduction to Time Series and Forecasting. 3rd ed. New York: Springer, 2016. 425 pp. ISBN 978-3-319-29852-8. (EN)
Cipra, T. Analýza časových řad s aplikacemi v ekonomii [Time Series Analysis with Applications in Economics]. 1st ed. Praha: SNTL - Nakladatelství technické literatury, 1986. 246 pp. (CS)
Hamilton, J. D. Time Series Analysis. Princeton: Princeton University Press, 1994. 799 pp. ISBN 0-691-04289-6. (EN)
Planned learning activities and teaching methods
Assessment methods and criteria linked to learning outcomes
Language of instruction
Specification of controlled education, way of implementation and compensation for absences
Classification of course in study plans
- Programme IT-MGR-2 Master's
branch MBI, any year of study, summer semester, 4 credits, elective
branch MPV, any year of study, summer semester, 4 credits, elective
branch MSK, any year of study, summer semester, 4 credits, elective
branch MBS, any year of study, summer semester, 4 credits, elective
branch MMI, any year of study, summer semester, 4 credits, elective
branch MMM, any year of study, summer semester, 4 credits, compulsory-optional
- Programme M2A-P Master's
branch M-MAI, 1st year of study, summer semester, 4 credits, compulsory
Type of course unit
Teacher / Lecturer
2. Consistent system of distribution functions, strict and weak stationarity.
3. Moment characteristics: mean and autocorrelation function.
4. Spectral density function (properties).
5. Decomposition model (additive, multiplicative), variance stabilization.
6. Identification of periodic components: periodogram, periodicity tests.
7. Methods of periodic components separation.
8. Methods of trend estimation: polynomial regression, linear filters, splines.
9. Tests of randomness.
10. Best linear prediction, Yule-Walker system of equations, prediction error.
11. Partial autocorrelation function, Durbin-Levinson and innovations algorithms.
12.Linear systems and convolution, causality, stability, response.
13.ARMA processes and their special cases (AR and MA process).
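As an illustration of lecture items 10 and 11 above, the Yule-Walker system for the best linear predictor can be solved recursively by the Durbin-Levinson algorithm. The course uses MATLAB; this numpy version is a hedged sketch for illustration only.

```python
import numpy as np

# Hedged sketch of the Durbin-Levinson algorithm (lecture items 10-11): it
# solves the Yule-Walker equations recursively, one prediction order at a
# time. The course uses MATLAB; this numpy version is an illustration only.
def durbin_levinson(gamma):
    """Given autocovariances gamma(0), ..., gamma(p), return the order-p
    prediction coefficients phi_{p,1..p} and the prediction error v_p."""
    p = len(gamma) - 1
    phi = np.zeros(p)
    v = gamma[0]                            # v_0 = gamma(0)
    for m in range(1, p + 1):
        # reflection coefficient phi_{m,m} = partial autocorrelation at lag m
        k = (gamma[m] - phi[:m - 1] @ gamma[m - 1:0:-1]) / v
        phi[:m - 1] -= k * phi[:m - 1][::-1]  # phi_{m,j} = phi_{m-1,j} - k*phi_{m-1,m-j}
        phi[m - 1] = k
        v *= 1.0 - k * k                      # v_m = v_{m-1} * (1 - phi_{m,m}^2)
    return phi, v

# Sanity check on an AR(1) process with coefficient 0.5 and unit noise
# variance, where gamma(h) = 0.5**h / (1 - 0.25): the algorithm should
# recover coefficients (0.5, 0) and prediction error 1.
gamma = np.array([0.5 ** h for h in range(3)]) / 0.75
phi_hat, v = durbin_levinson(gamma)
print(phi_hat, v)
```

The reflection coefficients k produced at each step are exactly the partial autocorrelations of item 11, which is why the same recursion serves both the prediction and the identification topics.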
Teacher / Lecturer
2. Simulating time series with some typical autocorrelation functions: white noise, coloured noise with lag-one correlation, and series exhibiting a linear trend and/or periodicities.
3. Detecting heteroscedasticity. Variance-stabilizing transformations (power and Box-Cox transforms).
4. Identification of periodic components, periodogram, and testing.
5. Use of the linear regression model for time-series decomposition.
6. Estimation of polynomial degree for trend and separation of periodic components.
7. Denoising by means of linear filtration (moving average): design of optimal weights preserving polynomials up to a given degree, Spencer's 15-point moving average.
8. Filtering by means of stepwise polynomial regression.
9. Filtering by means of exponential smoothing.
11. Simulation, identification, parameter estimation, and verification for the ARMA model.
12. Testing significance of (partial) autocorrelations.
13. Tutorials on student projects.
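Exercise item 4 above centres on the periodogram as a tool for detecting hidden periodicities. A minimal sketch, assuming Python/numpy in place of the MATLAB tools used in class:

```python
import numpy as np

# A minimal periodogram sketch (exercise item 4). The exercises use MATLAB;
# numpy is assumed here purely for illustration.
def periodogram(x):
    """Periodogram I(k) = |DFT(x - mean)|^2 / n at Fourier frequencies k/n."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    dft = np.fft.rfft(x - x.mean())   # subtract the mean: drop the zero frequency
    return np.abs(dft) ** 2 / n

# A hidden periodicity of 10 cycles per record buried in noise shows up as a
# sharp peak at Fourier frequency index 10.
n = 512
t = np.arange(n)
x = np.sin(2 * np.pi * 10 * t / n) + 0.1 * np.random.default_rng(0).standard_normal(n)
I = periodogram(x)
print(int(I.argmax()))
```

In the exercises, the visual peak would then be backed by a formal periodicity test (e.g. Fisher's test), as listed in lecture item 6.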