Campus Event Calendar

Event Entry

New for: D1, D2, D3, D4, D5

What and Who

Complex Probabilistic Models in Machine Learning

Dr. Carl Rasmussen
Max-Planck-Institut für Biologische Kybernetik, Tübingen
MPI-Kolloquium
AG 1, AG 2, AG 3, AG 4, AG 5  
AG Audience

Date, Time and Location

Tuesday, 25 January 2005
13:00
Building 45, Room HS 3
Saarbrücken

Abstract

In machine learning, inference is based on a database of examples rather than explicit rules. Making predictions requires inference in the face of uncertainty, and probabilistic inference is therefore ideally suited as a sound theoretical foundation for machine learning. Classical parametric models have several drawbacks, the most important being the difficulty of setting model complexity in practical problems (the bias/variance trade-off). In a probabilistic (Bayesian) setting, however, over-complex solutions are automatically avoided (Occam's Razor), so the difficult adjustment of model complexity is effectively side-stepped and arbitrarily complex models can be used. For some models, further simplifications can be made by considering the limit of infinite complexity (a limit which does not make sense in the classical setting). For example, the limit of a large neural network is a Gaussian Process (the probabilistic counterpart of the Support Vector Machine), a model in which inference can be handled analytically (which is not possible for a finite neural network). One can also understand this limit as a transition between parametric and non-parametric models. A similar limit can be obtained for unsupervised learning with mixture models using the Dirichlet Process. In the talk I will outline theoretical and practical consequences of these developments, and also touch on the limitations and likely future developments.
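
The analytic inference alluded to above can be made concrete: for Gaussian Process regression with Gaussian observation noise, the predictive mean and variance have closed-form expressions. The following minimal sketch (illustrative, not from the talk; the squared-exponential kernel, its hyperparameters, and the toy data are assumptions) shows exact GP posterior prediction on a 1-D problem.

    import numpy as np

    def rbf_kernel(A, B, lengthscale=1.0, signal_var=1.0):
        # Squared-exponential covariance between two sets of 1-D inputs.
        sqdist = (A[:, None] - B[None, :]) ** 2
        return signal_var * np.exp(-0.5 * sqdist / lengthscale**2)

    def gp_posterior(X_train, y_train, X_test, noise_std=0.1):
        # Exact GP regression: posterior mean and variance in closed form,
        # in contrast to a finite neural network, where exact inference
        # is intractable.
        K = rbf_kernel(X_train, X_train) + noise_std**2 * np.eye(len(X_train))
        K_s = rbf_kernel(X_train, X_test)
        K_ss = rbf_kernel(X_test, X_test)
        mean = K_s.T @ np.linalg.solve(K, y_train)
        cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
        return mean, np.diag(cov)

    # Toy data (hypothetical): noisy observations of sin(x).
    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, 10)
    y = np.sin(X) + 0.1 * rng.standard_normal(10)
    X_star = np.linspace(-3.0, 3.0, 5)
    mu, var = gp_posterior(X, y, X_star)
    print(mu)   # posterior mean at the test points
    print(var)  # posterior marginal variance at the test points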

Bahareh Kadkhodazadeh, 01/25/2005 11:07 -- Created document.