Babes-Bolyai University of Cluj-Napoca
Faculty of Mathematics and Computer Science
Study Cycle: Master

SUBJECT

Code: MIG1001
Subject: Stochastic Modeling of Data

Section                                              Semester  Hours: C+S+L  Category    Type
Computational Mathematics (in Hungarian)             1         2+1+0         speciality  compulsory
Interdisciplinary Mathematics (in Hungarian)         3         2+1+0         speciality  compulsory
Optimization of Computational Models (in Hungarian)  3         2+1+0         speciality  compulsory
Teaching Staff in Charge
Assoc. Prof. CSATO Lehel, Ph.D., csatol@cs.ubbcluj.ro
Aims
The lecture addresses problems encountered when processing large data sets. We present methods for analysing collections of empirical (real) data: classifying them, discovering hidden relations among their attributes, and building inference systems based on the data.
Content
Stochastic data modelling applies algorithms that exploit the rules of probability to various types of data, allowing a direct representation of the stochastic nature of the real world. To extract information, one usually has to include prior knowledge in the algorithm, and the rules of probability make this inclusion possible: the prior knowledge is the model we want to use. When huge data sets are available, statistical methods carry most of the weight; with few data, much more emphasis must be put on the model. The algorithms also depend on the type of problem we want to solve. The course is intended as a practical introduction to various models from machine learning and aims to familiarise the students with the use of randomness and stochastic models.
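
A minimal Python sketch of how the rules of probability combine prior knowledge with data, assuming a coin with unknown bias, a made-up discrete prior over three candidate biases, and four observed tosses (all values are illustrative):

    # Discrete Bayes update: prior knowledge enters through the prior,
    # the data enter through the likelihood, and the rules of probability
    # combine them. All numbers below are illustrative assumptions.
    candidate_bias = [0.3, 0.5, 0.7]    # hypotheses for P(heads)
    prior = [0.2, 0.6, 0.2]             # prior belief: "probably fair"
    tosses = [1, 1, 0, 1]               # observed data, 1 = heads

    heads = sum(tosses)
    tails = len(tosses) - heads
    likelihood = [b**heads * (1 - b)**tails for b in candidate_bias]

    # Bayes' rule: posterior is proportional to prior times likelihood.
    unnormalised = [p * l for p, l in zip(prior, likelihood)]
    posterior = [u / sum(unnormalised) for u in unnormalised]
    print(posterior)    # probability mass moves towards the 0.7 hypothesis

With few observations the prior dominates the posterior; as more tosses arrive, the likelihood takes over, mirroring the trade-off between model and data described above.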

Thematic overview of the lectures:

• Component analysis (weeks 1-4), ref. [2,4,8,9] (a PCA de-noising sketch follows this list):
  • Basics of statistical modelling: matrices and eigenvalues [9],
  • Principal components used in signal de-noising [2,4],
  • Independent components used in blind source separation of signals [8].
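
A minimal Python sketch of principal-component de-noising via the eigen-decomposition of the sample covariance matrix; the synthetic two-dimensional data, the noise level, and the choice to keep a single component are all illustrative assumptions:

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data: 200 two-dimensional points that essentially lie
    # along one direction, plus isotropic noise (illustrative values).
    t = rng.normal(size=(200, 1))
    X = t @ np.array([[2.0, 1.0]]) + 0.1 * rng.normal(size=(200, 2))

    # Principal components: eigenvectors of the sample covariance matrix.
    Xc = X - X.mean(axis=0)
    eigval, eigvec = np.linalg.eigh(np.cov(Xc, rowvar=False))
    order = np.argsort(eigval)[::-1]          # largest eigenvalue first

    # De-noising: project onto the leading component and map back.
    W = eigvec[:, order[:1]]
    X_denoised = Xc @ W @ W.T + X.mean(axis=0)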

• Bayesian modelling (weeks 5-7), ref. [1,2,3,4,5] (a conjugate-posterior sketch follows this list):
  • Introduction to various estimation methods [1,2,3],
  • Hierarchical model specification and parameter estimation [1,2,4],
  • Bayesian modelling: obtaining the posterior distribution [4,5].
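
A minimal Python sketch of obtaining a posterior distribution in closed form, assuming a conjugate Beta prior on a Bernoulli parameter; the prior pseudo-counts and the observations are illustrative:

    # Beta-Bernoulli conjugacy: a Beta(a, b) prior combined with Bernoulli
    # observations yields a Beta posterior. All numbers are illustrative.
    a, b = 2.0, 2.0                  # prior pseudo-counts (assumed)
    data = [1, 0, 1, 1, 1, 0, 1]     # Bernoulli observations (assumed)

    heads = sum(data)
    tails = len(data) - heads
    a_post, b_post = a + heads, b + tails    # posterior is Beta(a_post, b_post)

    posterior_mean = a_post / (a_post + b_post)
    print(f"posterior: Beta({a_post}, {b_post}), mean {posterior_mean:.3f}")

When no conjugate prior is available, the posterior generally has no closed form and must be approximated, which motivates the estimation methods discussed in this block.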

• Hidden Markov models (HMMs) (weeks 8-11), ref. [6,7] (a forward-algorithm sketch follows this list):
  • HMM definitions [7],
  • Estimating the latent states [7],
  • Applications of HMMs (1): speech recognition [6],
  • Applications of HMMs (2): gene segmentation [7].
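
A minimal Python sketch of the forward algorithm for a two-state HMM, computing the likelihood of an observation sequence by summing over the latent states; the transition, emission, and initial probabilities are made-up numbers:

    import numpy as np

    # Two-state HMM with two observable symbols; all probabilities below
    # are illustrative assumptions.
    A = np.array([[0.9, 0.1],      # state-transition matrix
                  [0.2, 0.8]])
    B = np.array([[0.7, 0.3],      # emission probabilities per state
                  [0.1, 0.9]])
    pi = np.array([0.5, 0.5])      # initial state distribution
    obs = [0, 1, 1, 0]             # observed symbol indices

    # Forward recursion: alpha[i] = P(observations so far, current state i).
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]

    print("sequence likelihood:", alpha.sum())

In practice the recursion is run with per-step scaling or in the log domain to avoid underflow on long sequences, and the Viterbi algorithm, which replaces the sum by a maximum, recovers the most probable latent state path.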

• Gaussian process models (weeks 12-14), ref. [4,5] (a GP regression sketch follows this list):
  • Joint Gaussians, kernel functions, and functional Bayesian models [4,5],
  • Approximations: KL-projection and sparse approximations,
  • Applications of Gaussian process inference.
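
A minimal Python sketch of Gaussian process regression obtained by conditioning a joint Gaussian, following the standard Cholesky-based formulation in [5]; the RBF kernel hyper-parameters, the noise level, and the toy data are illustrative assumptions:

    import numpy as np

    def rbf(x1, x2, lengthscale=1.0, variance=1.0):
        """RBF kernel; hyper-parameter values are assumed, not fitted."""
        d = x1[:, None] - x2[None, :]
        return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

    X = np.array([-2.0, -1.0, 0.0, 1.5])    # training inputs (toy data)
    y = np.sin(X)                            # training targets (toy data)
    Xs = np.linspace(-3.0, 3.0, 50)          # test inputs
    noise = 1e-2                             # assumed observation noise

    # Condition the joint Gaussian of training and test outputs.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    Kss = rbf(Xs, Xs)

    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks.T @ alpha                      # posterior mean at the test inputs
    v = np.linalg.solve(L, Ks)
    cov = Kss - v.T @ v                      # posterior covariance

The Cholesky solve costs O(n^3) in the number of training points, which is exactly the cost that the sparse approximations in this block are designed to reduce.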
References
[1]. Russell S., Norvig P. (2003) Artificial Intelligence: A Modern Approach (Second Edition), Prentice Hall.
[2]. Mitchell T. (1997) Machine Learning, McGraw Hill.
[3]. Bernardo J.M., Smith A.F.M. (2000) Bayesian Theory, John Wiley & Sons.
[4]. MacKay D.J.C. (2003) Information Theory, Inference and Learning Algorithms, Cambridge University Press, http://wol.ra.phy.cam.ac.uk/mackay/itila/book.html.
[5]. Rasmussen C.E., Williams C.K.I. (2006) Gaussian Processes for Machine Learning, The MIT Press.
[6]. Rabiner L.R., Juang B.H. (1986) An introduction to hidden Markov models, IEEE ASSP Magazine, pp. 4-15.
[7]. Durbin R., Eddy S.R., Krogh A., Mitchison G. (1999) Biological Sequence Analysis: Probabilistic Models of Proteins and Nucleic Acids, Cambridge University Press.
[8]. Hyvärinen A., Karhunen J., Oja E. (2001) Independent Component Analysis, Wiley-Interscience.
[9]. Webb A.R. (2002) Statistical Pattern Recognition (Second Edition), John Wiley & Sons.
Assessment
The final grade consists of:
- (40%) a presentation of a topic chosen in the first 6 weeks of the semester,
- (20%) the solutions to the laboratory exercises,
- (40%) an oral examination based on the topics of the lectures and of the seminars presented by the students.