
Domain-Independent Quality Measures for Crowd Truth Disagreement

Published on Nov 28, 2013 · 2214 views

Using crowdsourcing platforms such as CrowdFlower and Amazon Mechanical Turk to gather human annotation data has now become a mainstream process. Such crowd involvement can reduce the time needed to collect annotations.
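
The talk (see the chapter list below) covers disagreement analytics such as sentence clarity and spam detection. As an illustration, here is a minimal Python sketch of cosine-based disagreement measures in the CrowdTruth style; the toy annotation matrix, label set, and helper names are invented for this example, not taken from the talk.

```python
import numpy as np

# Hypothetical annotation matrix for one sentence:
# rows = workers, columns = candidate event types;
# an entry of 1 means the worker selected that label.
annotations = np.array([
    [1, 0, 0],
    [1, 1, 0],
    [1, 0, 0],
    [0, 0, 1],
])

def cosine(u, v):
    """Cosine similarity; returns 0.0 when either vector is all zeros."""
    norm = np.linalg.norm(u) * np.linalg.norm(v)
    return float(u @ v) / norm if norm else 0.0

# Sentence vector: per-label sum over all worker vectors.
sentence_vec = annotations.sum(axis=0)

# Sentence-label score: cosine between the sentence vector and each
# label's unit vector; sentence clarity is the maximum such score.
n_labels = annotations.shape[1]
label_scores = [cosine(sentence_vec, np.eye(n_labels)[i]) for i in range(n_labels)]
clarity = max(label_scores)

# Worker-sentence agreement: cosine between a worker's vector and the
# sentence vector built from all *other* workers.
worker_agreement = [
    cosine(annotations[w], sentence_vec - annotations[w])
    for w in range(annotations.shape[0])
]

print(f"sentence clarity: {clarity:.2f}")
print("worker-sentence agreement:", [round(a, 2) for a in worker_agreement])
```

In this formulation a high clarity score means the workers converge on a single label, while a worker whose annotation vectors rarely align with the rest of the crowd across many sentences becomes a candidate for spam filtering.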

Chapter list

Domain-Independent Quality Measures for Crowd Truth (00:00)
Crowd Truth (01:45)
Background (03:41)
Crowd-Watson Adaptation to Newspaper Event Extraction (06:06)
Example - 1 (06:58)
Example - 2 (06:59)
Example - 3 (07:57)
Role-Filler Taxonomies (08:55)
How do we represent & measure disagreement in a way that it can be harnessed? (10:16)
Event semantics are hard (10:33)
Events have multiple dimensions - 1 (10:43)
Events have multiple dimensions - 2 (11:40)
Each dimension has different granularity (12:51)
Why do people disagree? - 1 (13:45)
Why do people disagree? - 2 (14:01)
Disagreement Analytics (14:56)
Experimental Setting (16:05)
Annotation Example (16:23)
Event Type Disagreement (16:54)
Event Location Disagreement (18:06)
Event Time Disagreement (18:12)
Event Participant Disagreement (18:15)
Comparative Annotation Distribution - 1 (18:19)
Comparative Annotation Distribution - 2 (19:23)
Sentence Clarity (19:34)
Spam Detection (19:54)
What more... (20:09)
Conclusions (20:54)