Deep Learning via Semi-Supervised Embedding

author: Jason Weston, NEC Laboratories America, Inc.
published: Aug. 26, 2009,   recorded: June 2009,   views: 601

Description

We show how nonlinear embedding algorithms, popular for use with shallow semi-supervised learning techniques such as kernel methods, can be applied to deep multi-layer architectures, either as a regularizer at the output layer or on each layer of the architecture. This provides a simple alternative to existing approaches to deep learning while yielding error rates competitive with both those methods and with existing shallow semi-supervised techniques.
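The core idea above can be sketched as a margin-based pairwise embedding loss added to the usual supervised objective. This is a minimal illustration, assuming a common neighbor/non-neighbor hinge formulation; the exact loss and layer placement used in the lecture's paper may differ.

```python
import numpy as np

def embedding_loss(f_i, f_j, is_neighbor, margin=1.0):
    """Pairwise embedding loss on two network outputs (or hidden layers).

    Pulls outputs of 'neighboring' examples (e.g. similar unlabeled
    points) together, and pushes non-neighbors at least `margin` apart.
    One common choice of regularizer; a sketch, not the paper's exact form.
    """
    dist = np.linalg.norm(f_i - f_j)
    if is_neighbor:
        return dist ** 2
    return max(0.0, margin - dist) ** 2

def total_loss(supervised_losses, pair_losses, lam=0.1):
    """Supervised loss on labeled data plus a weighted embedding
    regularizer over labeled and unlabeled pairs (lam is a hypothetical
    trade-off hyperparameter)."""
    return sum(supervised_losses) + lam * sum(pair_losses)
```

Applied at the output layer this regularizes the final representation; applied at every layer it shapes all intermediate representations with the unlabeled data.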

We then generalize this approach to take advantage of sequential data, for both images and text.

For images, we take advantage of the temporal coherence that naturally exists in unlabeled video recordings. That is, two successive frames are likely to contain the same object or objects. We demonstrate the effectiveness of this method in a semi-supervised setting on some pose invariant object and face recognition tasks.
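Temporal coherence supplies the neighbor pairs for free: successive frames of an unlabeled video are treated as neighbors, while temporally distant frames serve as non-neighbors. A minimal sketch of this pair construction (a hypothetical helper, not the lecture's code):

```python
import random

def coherence_pairs(frames, num_random=1, seed=0):
    """Build embedding-training pairs from an unlabeled video.

    Consecutive frames are assumed to show the same object and become
    neighbor pairs (True); randomly sampled non-adjacent frames become
    non-neighbor pairs (False). Assumes `frames` is an ordered sequence.
    """
    rng = random.Random(seed)
    pairs = []
    for t in range(len(frames) - 1):
        pairs.append((frames[t], frames[t + 1], True))  # temporally coherent
        for _ in range(num_random):
            u = rng.randrange(len(frames))
            if abs(u - t) > 1:  # far apart in time: likely different pose/object
                pairs.append((frames[t], frames[u], False))
    return pairs
```

These pairs can then feed the same embedding loss used for the semi-supervised regularizer, so the network learns pose-invariant representations from raw video.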

For text, we describe a unified approach to tagging: a single convolutional neural network architecture that, given a sentence, outputs a host of language-processing predictions: part-of-speech tags, chunks, named entity tags, and semantic roles. State-of-the-art performance is attained by learning word embeddings using a text-specific semi-supervised task called a language model.
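The language-model task is typically trained with a pairwise ranking criterion: a genuine text window should score higher, by some margin, than the same window with its center word replaced by a random word. A minimal sketch of that criterion (the window scores themselves would come from a small neural network over the word embeddings):

```python
def ranking_loss(score_true, score_corrupt, margin=1.0):
    """Ranking criterion for the unsupervised language-model task.

    `score_true` is the network's score for a real text window;
    `score_corrupt` is its score for the window with the center word
    swapped for a random word. The loss is zero once the real window
    outscores the corrupted one by at least `margin`. Minimizing it
    over large unlabeled corpora trains the word embeddings.
    """
    return max(0.0, margin - score_true + score_corrupt)
```

Because no labels are needed, the embeddings can be trained on very large unlabeled text and then shared across all of the tagging tasks.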

Joint work with: Ronan Collobert, Frederic Ratle, Hossein Mobahi, Pavel Kuksa and Koray Kavukcuoglu.
