About
Kernels for Multiple Outputs and Multi-task Learning: Frequentist and Bayesian Points of View
Accounting for dependencies between outputs has important applications in several areas. In sensor networks, for example, missing signals from temporarily failing sensors may be predicted through their correlations with signals acquired from other sensors. In geostatistics, the concentration of heavy pollutant metals (for example, copper), which is expensive to measure, can be predicted using inexpensive and oversampled variables (for example, pH data). Multi-task learning is a general learning framework in which it is assumed that learning multiple tasks simultaneously leads to better models and better performance than learning the same tasks individually. By exploiting correlations and dependencies among tasks, it becomes possible to handle common practical situations such as missing data, or to increase the effective amount of data when only a small amount is available per task. In this workshop we consider the use of kernel methods for multiple outputs and multi-task learning. The aim of the workshop is to bring together Bayesian and frequentist researchers to establish common ground and shared goals.
The Workshop homepage can be found at http://intranet.cs.man.ac.uk/mlo/mock09/.
Uploaded videos:
Geostatistics for Gaussian Processes (48:17) · Jan 19, 2010 · 7865 views
Borrowing Strength, Learning Vector Valued Functions and Supervised Dimension Re... (32:33) · Jan 19, 2010 · 3953 views
Gaussian Processes and Process Convolutions from a Bayesian Perspective (01:01:21) · Jan 19, 2010 · 5216 views
Prior Knowledge and Sparse Methods for Convolved Multiple Outputs Gaussian Proce... (41:30) · Jan 19, 2010 · 4894 views
Multi-Task Learning and Matrix Regularization (37:51) · Jan 19, 2010 · 5709 views
Learning Vector Fields with Spectral Filtering (46:34) · Jan 19, 2010 · 4187 views