Workshop on Sparsity in Machine Learning and Statistics, Cumberland Lodge 2009

16 Lectures · Apr 1, 2009

About

Sparse estimation (or sparse recovery) is playing an increasingly important role in the statistics and machine learning communities. Several methods that rely on the notion of sparsity have recently been developed in both fields (e.g. penalty methods such as the Lasso and the Dantzig selector). Many of the key theoretical ideas and much of the statistical analysis of these methods have been developed independently, but there is increasing awareness of the potential for cross-fertilization of ideas between statistics and machine learning.
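To make the penalty methods mentioned above concrete, here is a minimal numpy sketch (not from the workshop materials) of the Lasso solved by cyclic coordinate descent; the helper names `soft_threshold` and `lasso_cd` are illustrative, and the data are synthetic.

```python
import numpy as np

def soft_threshold(rho, lam):
    """Soft-thresholding operator, the proximal map of the L1 penalty."""
    return np.sign(rho) * max(abs(rho) - lam, 0.0)

def lasso_cd(X, y, lam, n_sweeps=200):
    """Minimize (1/2n)||y - Xw||^2 + lam*||w||_1 by cyclic coordinate descent."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n   # per-coordinate curvature (1/n)||X_j||^2
    r = y.copy()                        # residual y - Xw (w starts at zero)
    for _ in range(n_sweeps):
        for j in range(p):
            r += X[:, j] * w[j]         # put coordinate j back into the residual
            rho = X[:, j] @ r / n       # univariate least-squares coefficient
            w[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * w[j]         # remove the updated coordinate again
    return w

# Recover a 2-sparse coefficient vector from noisy linear measurements.
rng = np.random.default_rng(0)
n, p = 100, 20
w_true = np.zeros(p)
w_true[0], w_true[5] = 3.0, -2.0
X = rng.standard_normal((n, p))
y = X @ w_true + 0.1 * rng.standard_normal(n)
w_hat = lasso_cd(X, y, lam=0.1)
```

The L1 penalty drives most coordinates exactly to zero, so `w_hat` is sparse with large entries only on the true support (at the cost of a small shrinkage bias on the active coefficients).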

Furthermore, there are interesting links between Lasso-type methods and boosting (particularly LP-boosting), and there has been renewed interest in sparse Bayesian methods. Sparse estimation is also important in unsupervised methods (sparse PCA, etc.). Recently proposed machine learning techniques for multi-task learning and collaborative filtering impose sparsity constraints on matrices (low rank, structured sparsity, etc.). At the same time, sparsity is playing an important role in various application fields, ranging from image and video reconstruction and compression to speech classification, text analysis, and sound analysis.

The overall goal of the workshop is to bring together machine learning researchers and statisticians working on this timely topic, to encourage the exchange of ideas between both communities, and to discuss further developments and the theoretical underpinnings of these methods.

For detailed information, visit the workshop's website.

Uploaded videos:

53:30 · Sparse Exponential Weighting and Langevin Monte-Carlo · Alexandre Tsybakov · May 06, 2009 · 3574 views

56:50 · Phase transitions phenomenon in Compressed Sensing · Jared Tanner · May 06, 2009 · 5374 views

31:57 · Large Precision Matrix Estimation for Time Series Data with Latent Factor Model · Clifford Lam · May 06, 2009 · 4493 views

59:49 · Fast methods for sparse recovery: alternatives to L1 · Mike Davies · May 06, 2009 · 7307 views

35:13 · Poster Spotlights 1 · May 06, 2009 · 3677 views

49:46 · Multi-Task Learning via Matrix Regularization · Andreas Argyriou · May 06, 2009 · 3625 views

58:23 · Algorithmic Strategies for Non-convex Optimization in Sparse Learning · Tong Zhang · May 06, 2009 · 7814 views

01:01:49 · High-Dimensional Non-Linear Variable Selection through Hierarchical Kernel Learn... · Francis R. Bach · May 06, 2009 · 4385 views

15:56 · Matching Pursuit Kernel Fisher Discriminant Analysis · Tom Diethe · May 06, 2009 · 3950 views

37:59 · Some results for the adaptive Lasso · Sara van de Geer · May 06, 2009 · 6965 views

01:04:48 · Latent Variable Sparse Bayesian Models · David P Wipf · May 06, 2009 · 5696 views

36:48 · Poster Spotlights 2 · May 06, 2009 · 3402 views

54:07 · Sparsity in online multitask/multiview learning · Nicolò Cesa-Bianchi · May 06, 2009 · 3203 views

47:37 · Learning with Many Reproducing Kernel Hilbert Spaces · Ming Yuan · May 06, 2009 · 4419 views

24:38 · Distilled Sensing: Active sensing for sparse recovery · Rui Castro · May 06, 2009 · 4792 views

42:37 · Testing and estimation in a sparse normal means model, with connections to shape... · Jon Wellner · May 06, 2009 · 2875 views