Early language bootstrapping
Published on Oct 12, 2011 · 5707 views
Human infants spontaneously and effortlessly learn the language(s) spoken in their environment, despite the extraordinary complexity of the task. In the past 30 years, tremendous progress has been made…
Chapter list
Modeling early language acquisition (00:00)
What we say to babies ... (00:21)
Underlying forms - 1 (00:55)
Underlying forms - 2 (01:37)
Standard scenario: the sequential bootstrapping scenario (01:51)
Step 1: When? (03:34)
Problem: not tested on raw unsegmented speech (06:34)
Hidden Markov Models (07:30)
Optimized State Splitting (08:17)
Problem #1: how bad? (10:07)
Problem #1: why? (11:11)
Problem #2: why? (13:06)
Problem #2: how bad? - 1 (15:05)
Problem #2: how bad? - 2 (17:12)
Three ideas to solve problem #2 (17:15)
How to reduce the number of allophones? (18:23)
Problem: Effect of phonotactics (19:32)
Problem: Effect of the number of allophones (20:56)
Limits of KL (21:55)
Idea #2: The linguistic/articulatory filters (22:11)
Implementation of the filters (23:58)
Tests on French (24:49)
French - 1 (25:29)
French - 2 (25:50)
Limits of the linguistic/articulatory filters (26:08)
Work in progress (27:33)
Idea #3: Use top-down information (28:11)
Number of segments in corpus - 1 (29:14)
Number of segments in corpus - 2 (30:08)
But isn't it cheating? - 1 (30:28)
But isn't it cheating? - 2 (30:58)
Solution (31:24)
Number of segments in corpus - 3 (32:10)
The REVISED Sequential Bootstrapping Scenario - 1 (32:47)
Is this psychologically plausible? (35:21)
n-gram pseudo-lexicon (37:25)
The REVISED Sequential Bootstrapping Scenario - 2 (39:10)
The parallel-integrative Bootstrapping scenario (41:00)
Articulation network (41:59)
Thank You (43:17)