Multitask Learning Using Nonparametrically Learned Predictor Subspaces

author: Piyush Rai, University of Utah
published: Jan. 19, 2010, recorded: December 2009, views: 4414




Given several related learning tasks, we propose a nonparametric Bayesian model that captures task relatedness by assuming that the task parameters (i.e., weight vectors) share a latent subspace. Crucially, the intrinsic dimensionality of this subspace is not assumed to be known a priori: we place an infinite latent feature model, the Indian Buffet Process, over the subspace and infer its dimensionality automatically. We also propose extensions of this model in which the subspace learning can incorporate labeled (and, if available, unlabeled) examples, or in which the task parameters share a mixture of subspaces instead of a single subspace. The latter property allows learning a nonlinear manifold structure underlying the task parameters, and can also help prevent negative transfer from outlier tasks.
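To make the key prior concrete, here is a minimal, self-contained sketch of sampling a binary task-by-feature matrix from the Indian Buffet Process. Each row indicates which latent basis directions a task's weight vector uses, and the number of columns (the subspace dimensionality) is not fixed in advance. This is only an illustration of the generative prior, not the paper's inference algorithm; all function names here are hypothetical.

```python
import math
import random


def poisson(lam, rng):
    """Draw from a Poisson(lam) distribution (Knuth's method, small lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1


def sample_ibp(num_tasks, alpha, seed=0):
    """Sample a binary matrix Z from the Indian Buffet Process prior.

    Row t marks which latent subspace features task t uses.  Task t
    reuses an existing feature k with probability m_k / t (where m_k is
    the number of earlier tasks using it), then introduces
    Poisson(alpha / t) brand-new features.
    """
    rng = random.Random(seed)
    rows = []            # one 0/1 list per task (ragged until padded)
    feature_counts = []  # m_k: how many tasks use feature k so far
    for t in range(1, num_tasks + 1):
        row = []
        for k, m_k in enumerate(feature_counts):
            take = 1 if rng.random() < m_k / t else 0
            row.append(take)
            feature_counts[k] += take
        new = poisson(alpha / t, rng)  # number of new features for task t
        row.extend([1] * new)
        feature_counts.extend([1] * new)
        rows.append(row)
    # pad earlier rows with zeros for features introduced later
    width = len(feature_counts)
    return [r + [0] * (width - len(r)) for r in rows]
```

A full model along the lines of the abstract would then express each task's weight vector as a combination of the basis vectors its row of Z selects, so that tasks sharing features share subspace structure.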

Download slides: nipsworkshops09_rai_mlun_01.pdf (343.7 KB)
