Language Learning

7 Lectures · Dec 11, 2009

About

Grammar Induction, Representation of Language and Language Learning

Now is the time to revisit some of the fundamental grammar and language learning tasks, such as grammar acquisition, language acquisition, language change, and the general problem of automatically inferring generic representations of language structure in a data-driven manner. Though the underlying problems have long been known to be computationally intractable for the standard representations of the Chomsky hierarchy, such as regular grammars and context-free grammars, progress has been made by modifying or restricting these classes to make them more observable. Generalisations of distributional learning have shown promise in unsupervised learning of linguistic structure, whether using tree-based representations or non-parametric approaches to inference. More radically, significant advances have been made by switching to different representations altogether, such as the work of Clark, Eyraud & Habrard (2008), which addresses language acquisition but has the potential to cross-fertilise a wide range of problems that require data-driven representations of language. Such approaches are starting to make inroads into one of the fundamental problems of cognitive science: learning complex representations that encode meaning. This adds a further motivation for returning to the topic now.

Grammar induction was the subject of intense study in the early days of Computational Learning Theory, with the theory of query learning largely developing out of this research. More recently, new methods of representing language and grammars through complex kernels and probabilistic modelling, together with algorithms such as structured output learning, have enabled machine learning methods to be applied successfully to a range of language-related tasks, from simple topic classification through part-of-speech tagging to statistical machine translation. These methods typically rely on more fluid structures than those derived from formal grammars, and yet are able to compete favourably with classical grammatical approaches that require significant input from domain experts, often in the form of annotated data.
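The distributional-learning idea mentioned above can be illustrated with a small sketch. This is an illustrative toy, not taken from the workshop materials, and the helper names are hypothetical: words are characterised by the local contexts they occur in, and words whose context sets overlap are treated as mutually substitutable, i.e. candidates for the same syntactic category.

```python
# Toy sketch of distributional learning (hypothetical helper names).
from collections import defaultdict

def context_signatures(sentences):
    """Map each word to the set of (left, right) neighbour pairs it occurs with."""
    sigs = defaultdict(set)
    for sent in sentences:
        padded = ["<s>"] + sent + ["</s>"]
        for i in range(1, len(padded) - 1):
            sigs[padded[i]].add((padded[i - 1], padded[i + 1]))
    return dict(sigs)

def same_category(w1, w2, sigs):
    """Crude congruence test: do the two words share at least one context?"""
    return bool(sigs[w1] & sigs[w2])

sents = [["the", "cat", "sleeps"],
         ["the", "dog", "sleeps"],
         ["a", "cat", "runs"]]
sigs = context_signatures(sents)
# "cat" and "dog" both occur in the context (the, sleeps), so they cluster together.
```

Real systems in this line of work, such as Clark, Eyraud & Habrard (2008), operate over substrings and the contexts of whole substrings rather than single words, and attach learnability guarantees to restricted language classes; the sketch above only conveys the basic substitutability intuition.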

The Workshop homepage can be found at http://www.cs.ucl.ac.uk/staff/rmartin/grll09/

Uploaded videos:

Inference for PCFGs and Adaptor Grammars
Mark Johnson · Invited Talk · 46:13 · Jan 19, 2010 · 4940 views

Learning to Disambiguate Natural Language Using World Knowledge
Antoine Bordes · Lecture · 20:59 · Jan 19, 2010 · 4300 views

Language Modeling with Tree Substitution Grammars
Matt Post · Lecture · 16:28 · Jan 19, 2010 · 6186 views

A Preliminary Evaluation of Word Representations for Named-Entity Recognition
Joseph Turian · Lecture · 20:19 · Jan 19, 2010 · 7856 views

Learnable Representations for Natural Language
Alexander Clark · Lecture · 54:42 · Jan 19, 2010 · 6531 views

Learning Languages and Rational Kernels
Mehryar Mohri · Invited Talk · 57:28 · Jan 19, 2010 · 4950 views

Sparsity in Grammar Induction
Jennifer A. Gillenwater · Lecture · 25:07 · Jan 19, 2010 · 6344 views