
The Goldilocks Principle: Reading Children's Books with Explicit Memory Representations

Published on May 27, 2016 · 2,573 views

We introduce a new test of how well language models capture meaning in children's books. Unlike standard language modelling benchmarks, it distinguishes the task of predicting syntactic function words from that of predicting lower-frequency words, which carry greater semantic content.
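The test described above, the Children's Book Test (CBT), frames reading comprehension as a cloze task: a passage of 20 consecutive sentences forms the context, the 21st sentence has one word removed, and the model must pick that word from a set of candidates. The sketch below illustrates the general shape of such a question; the function name, the distractor-sampling policy, and the `XXXXX` placeholder convention are simplifying assumptions for illustration, not the paper's exact construction procedure.

```python
import random

def make_cbt_question(sentences, answer_index=0, n_candidates=10, seed=0):
    """Turn 21 consecutive sentences into a cloze-style question.

    Returns (context, query, answer, candidates): 20 context sentences,
    a query with one word blanked out, the removed word, and a candidate
    set containing the answer plus distractors drawn from the context.
    """
    context, query_sentence = sentences[:20], sentences[20]
    words = query_sentence.split()
    answer = words[answer_index]
    words[answer_index] = "XXXXX"  # blank out the answer word
    query = " ".join(words)
    # Simplified distractor policy: sample other words seen in the context.
    pool = {w for s in context for w in s.split() if w != answer}
    rng = random.Random(seed)
    candidates = rng.sample(sorted(pool), n_candidates - 1) + [answer]
    rng.shuffle(candidates)
    return context, query, answer, candidates
```

A model is then scored on how often it selects the true answer from the candidates, which is what lets the benchmark separate performance on frequent function words from performance on rarer, semantically loaded words.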


Chapter list

00:00  The Goldilocks Principle
00:09  Context is needed to understand language - 1
00:40  Goldilocks - 1
00:48  Context is needed to understand language - 2
01:14  Goldilocks - 2
01:27  Children’s Book Test - 1
01:36  Children’s Book Test - 2
01:49  Children’s Book Test - 3
02:01  Children’s Book Test - 4
02:11  Children’s Book Test - 5
02:13  Children’s Book Test - 6
02:16  Children’s Book Test - 7
02:20  Children’s Book Test - 8
02:38  Children’s Book Test - 9
03:32  CBT: Importance-weighted evaluation
04:20  What does the CBT add?
05:40  Can humans do the CBT?
05:52  Untitled
06:06  Query + Context
06:23  What about machines?
06:25  Performance comparison
07:21  Memory Networks for machine reading - 1
07:30  Memory Networks for machine reading - 2
07:35  Memory Networks for machine reading - 3
07:44  Memory Networks for machine reading - 4
07:48  Memory Networks for machine reading - 5
08:01  Memory Networks for machine reading - 6
08:08  Memory Networks for machine reading - 7
08:21  Memory Networks for machine reading - 8
08:40  Three ways to represent text in memory
08:54  Lexical Memory
09:17  Window Memory
09:37  Sentence Memory
09:53  Self-supervision for memory retrieval
11:15  Choose the memory* with the correct answer in it
11:16  Results - 1
11:23  Results - 2
13:07  DeepMind reading comprehension benchmark - 1
13:28  DeepMind reading comprehension benchmark - 2
14:58  Conclusions
15:57  Thanks
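Among the three memory representations named in the chapters (lexical, window, and sentence memory), the window variant stores, for each occurrence of a candidate word in the context, a fixed-width window of surrounding words. The sketch below is a simplified illustration of that idea only; the function name, the window-clipping behaviour at passage boundaries, and the choice of a 5-word window are assumptions, not the talk's exact formulation.

```python
def window_memories(context_words, candidates, width=5):
    """Return one (candidate, word-window) memory per candidate occurrence.

    Each memory is a window of up to `width` words centred on an
    occurrence of a candidate answer word, clipped at passage edges.
    """
    half = width // 2
    memories = []
    for i, w in enumerate(context_words):
        if w in candidates:
            lo = max(0, i - half)
            hi = min(len(context_words), i + half + 1)
            memories.append((w, context_words[lo:hi]))
    return memories
```

Keying memories on candidate occurrences like this keeps retrieval focused on the words that could actually answer the query, rather than on every word or sentence in the passage.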