The Computational Nature of Language Learning

author: Partha Niyogi, Department of Computer Science, University of Chicago
published: Feb. 10, 2012,   recorded: October 2007,   views: 452
Description

Language learning research becomes more robust when it incorporates insights from evolutionary theory, Partha Niyogi demonstrates. The principles of natural selection and variation in a population come into play not only when exploring how children learn language but how languages alter over time.

All languages are learnable and more or less uniformly learnable, says Niyogi: “It doesn’t take 25 years to learn Chinese and two years to learn Bengali.” But in the last 30-odd years, Niyogi says, there’s been a great deal of debate about the most useful models of learning theory, with efforts to explain how the human language faculty makes use of linguistic input at different developmental stages. One model of language acquisition depicts the child, armed with a basic grammar “map,” extrapolating from data (interactions with adults) and assembling the components of language by some algorithm. Analysis has been conducted as if there were “a target grammar, which produces data, and an algorithm which is trying to acquire this target grammar.” But, says Niyogi, “that’s not true in the world.” A child is exposed to lots of variation from within its population; parents and others all produce different grammars, different data sets.

Niyogi believes that an “evolutionary trajectory” links how acquisition happens at an individual level, and how variation in language springs up from one generation to the next. But rather than inheriting the grammar of your parents, you have to learn it. Examining language variation over time as if it were genetic variation, “you get a different mathematical structure…and probabilities start playing an important role.” Small differences “can have very subtle consequences giving rise to bifurcation in nonlinear dynamics of evolution.” For instance, 1000 years ago, the English were speaking a language that’s unrecognizable to us today. How has it come to be that “we have moved so far from that point through learning which is mimicking the previous generation?”

Niyogi explains that within a single population two varying languages may be in competition (say, a German-type and an English-type grammar). While a majority may speak the dominant variant, some children will likely be exposed to a mixture of the two. There’s a “drift” in language use, “and suddenly, what was stable becomes unstable.” In the next generation, even more learners pick up the minority variant. By computing the probability that learners in successive generations acquire the new forms, it becomes possible to track the evolutionary transformation of a language. The “ubiquitous fact of languages is that they change with time,” concludes Niyogi, and “even a slight effect of frequency can wipe out something that looks stable.”
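The competition dynamic described above can be sketched as a simple iterated map. This is not Niyogi’s actual model; the update rule and the bias parameter `b` are illustrative assumptions, chosen only to show how a small, frequency-dependent learning advantage compounds across generations until a stable-looking majority collapses.

```python
def next_generation(x, b):
    """Fraction of the next generation acquiring variant A, given that a
    fraction x of the current generation speaks A and learners pick up A
    with a slight bias b (b > 1 means A is marginally easier to acquire).
    This update rule is a hypothetical sketch, not Niyogi's model."""
    return (b * x) / (b * x + (1.0 - x))

def simulate(x0, b, generations):
    """Iterate the map for a number of generations, returning the
    trajectory of variant A's share of the population."""
    xs = [x0]
    for _ in range(generations):
        xs.append(next_generation(xs[-1], b))
    return xs

# Variant A starts as a 5% minority; a slight learning advantage
# (b = 1.2) lets it overturn the majority over many generations.
trajectory = simulate(x0=0.05, b=1.2, generations=60)
print(f"start: {trajectory[0]:.2f}, after 60 generations: {trajectory[-1]:.2f}")
```

With `b = 1` the population mix never moves; any `b > 1` sends the minority variant to fixation, echoing the point that “even a slight effect of frequency can wipe out something that looks stable.”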
