Lehel Csató

Faculty of Mathematics and Informatics
Universitatea BABES-BOLYAI
Str. Mihail Kogalniceanu, Nr. 1
RO-400084 Cluj-Napoca, Romania

E-mail addresses:
lehel . csato a_t cs . ubbcluj . ro
lehel . csato a_t tuebingen . mpg . de

Schedule of classes

In the first semester of 2015/16: WEBLINK. Appointments are possible by e-mail (see above).


Past lectures (with most material online):


My research focuses on various topics in Machine Learning (see e.g. the course by M. Jordan and A. Simma et al.). I am mainly interested in using nonparametric Bayesian methods for analysing data, namely Gaussian processes (Rasmussen & Williams).
A second, related field of research is the application of probabilistic methods in robotics. Together with Razvan Florian, we try to link low-level processing with high-level concepts such as the formation of ontologies and internal representations. The aims include the development of algorithms capable of building robust internal representations of the "learning data", i.e. the environment.

Software packages

  1. NETLAB toolbox for Pattern Recognition
  2. Sparse Online Gaussian Process Toolbox

Links to materials

This section presents links to material that is useful for learning:

Brief Biography

After finishing my undergraduate studies in Computer Science and Mathematics (June 1995) at the Faculty of Mathematics and Computer Science, Babes-Bolyai University, Cluj-Napoca, Romania, I obtained an MSc degree in Computer Science with a specialisation in Artificial Intelligence (June 1996). For two years after graduation I was a research assistant at the Institute of Isotopes of the Hungarian Academy of Sciences, where I studied mathematical models of psychophysical phenomena (categorical perception and hippocampal modelling) under the supervision of András Lőrincz.

In October 1998 I joined the Neural Computing Research Group at Aston University as a PhD student and worked under the supervision of Manfred Opper, studying online learning and its possible extensions to non-parametric inference. My study focused on the computational issues raised by non-parametric methods: namely, that the number of required parameters (or sufficient statistics) scales with the number of training data points. The scaling is at least linear and precludes these methods from being used on datasets of realistic sizes. The area I investigated was Bayesian inference using Gaussian Processes (GPs).

A result of this work is a general method that represents the process using far fewer parameters than would be needed by standard techniques based on the Kimeldorf-Wahba representer theorem. The sparse representation of Gaussian Process models provides, in addition to the representation of the mean function (as in Kimeldorf-Wahba), a representation of the variance of the posterior process (i.e. the posterior kernel). The advantage of estimating the posterior process is that efficient inference algorithms (second-order methods, as in the parametric case) can be used for inference.
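The idea of parameterising both the posterior mean and the posterior kernel by a small set of "basis vectors" can be illustrated with a toy sketch. This is a generic subset-of-regressors-style approximation in Python/NumPy, not the algorithm from the thesis or the toolbox; the squared-exponential kernel, the length-scale, and the choice of basis points are illustrative assumptions:

```python
import numpy as np

def rbf(a, b, ell=1.0):
    # squared-exponential kernel k(a, b) between two 1-D input sets
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

# toy regression data: noisy sine
rng = np.random.default_rng(0)
x = np.linspace(0.0, 5.0, 200)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)

# a small "basis vector" set -- here simply a coarse subset of the inputs;
# a sparse online method would select these points greedily instead
bv = x[::20]                      # 10 basis points instead of 200
sigma2 = 0.1 ** 2                 # assumed noise variance

Kbb = rbf(bv, bv)                 # kernel among basis vectors
Kxb = rbf(x, bv)                  # cross-kernel, data vs. basis vectors

# both the posterior mean and the posterior (co)variance are
# parameterised by the basis set alone:
A = Kbb + Kxb.T @ Kxb / sigma2    # posterior precision over basis weights
Sw = np.linalg.inv(A)             # posterior covariance over basis weights
alpha = Sw @ (Kxb.T @ y) / sigma2 # coefficients of the posterior mean

mean = Kxb @ alpha                               # posterior mean at inputs
var = np.einsum('ij,jk,ik->i', Kxb, Sw, Kxb)     # posterior variance at inputs
```

Here `alpha` plays the role of the mean parameters of the Kimeldorf-Wahba form, while `Sw` parameterises the posterior kernel, so predictions and their uncertainties cost only O(size of basis set) per test point instead of scaling with the full dataset.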

Details can be found in the PhD thesis (click to get the PDF file, or the compressed postscript), finished in March 2002.

From May 2002 I worked on applying and extending sparse learning to various models, and also on a webpage that explains sparse Gaussian Process inference and its applications, which can be found at the following location: Sparse Online Gaussian Processes.

From May 2003 until April 2005 I worked as a post-doc at the Max Planck Institute for Biological Cybernetics, Tübingen, where I worked on the development and assessment of probabilistic models together with Carl Rasmussen.