About
The theoretical analysis of systems that learn from data has long been an important topic of study in statistics, machine learning, and information theory. In each of these paradigms, distinct methods have been developed to deal with inference when the models under consideration can be arbitrarily large. Recently, there has been a fruitful cross-fertilization of ideas and proof techniques. To give but one example, minimax-optimal convergence rates for the information-theoretic MDL method were recently proved using ideas from the (computational) PAC-Bayesian paradigm and (statistical) empirical-process techniques. The goal of this workshop is to bring together leading theoreticians to debate, compare, and cross-fertilize ideas from these distinct inductive principles. At the workshop, we will establish a PASCAL special interest group for `merging computational and information-theoretic learning with statistics'.
Uploaded videos:
Empirical Bayesian test for the smoothness · Feb 25, 2007 · 4267 Views
The Complexity of Learning Verification · Feb 25, 2007 · 4000 Views
Support vector machines loss with l1 penalty · Feb 25, 2007 · 5934 Views
Convergence of MDL and Bayesian Methods · Feb 25, 2007 · 4623 Views
Fast Learning Rates for Support Vector Machines · Feb 25, 2007 · 5109 Views
Universal Coding/Prediction and Statistical (In)consistency of Bayesian inferenc... · Feb 25, 2007 · 4609 Views