What Helps Where - And Why? Semantic Relatedness for Knowledge Transfer
published: July 19, 2010, recorded: June 2010, views: 519
Remarkable performance has been reported for recognizing single object classes. Scalability to large numbers of classes, however, remains an important challenge for today's recognition methods. Several authors have promoted knowledge transfer between classes as a key ingredient for addressing this challenge. However, in previous work the decision of which knowledge to transfer has required either manual supervision or at least a few training examples, limiting the scalability of these approaches. In this work we explicitly address the question of how to automatically decide which information to transfer between classes without any human intervention. For this we tap into linguistic knowledge bases to provide the semantic link between the sources (what) and targets (where) of knowledge transfer. We provide a rigorous experimental evaluation of different knowledge bases and state-of-the-art techniques from Natural Language Processing, which goes far beyond the limited use of language in related work. We also give insights into the applicability (why) of different knowledge sources and similarity measures for knowledge transfer.
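The core idea above can be illustrated with a minimal sketch: given an unseen target class, rank known source classes by a path-based semantic relatedness measure over a taxonomy and transfer from the most related one. The toy taxonomy, the `PARENT` table, and all function names below are hypothetical; the paper itself evaluates real linguistic knowledge bases (e.g., WordNet) and more sophisticated measures.

```python
# Hypothetical toy taxonomy standing in for a linguistic knowledge
# base; each entry maps a concept to its parent (is-a) concept.
PARENT = {
    "dog": "canine", "wolf": "canine", "canine": "animal",
    "cat": "feline", "feline": "animal", "animal": "entity",
    "car": "vehicle", "vehicle": "entity",
}

def path_to_root(node):
    """Return the list of concepts from node up to the taxonomy root."""
    path = [node]
    while node in PARENT:
        node = PARENT[node]
        path.append(node)
    return path

def path_similarity(a, b):
    """Simple path-based relatedness: 1 / (1 + shortest-path length
    through the lowest common ancestor of a and b)."""
    pa, pb = path_to_root(a), path_to_root(b)
    for dist_a, ancestor in enumerate(pa):
        if ancestor in pb:
            return 1.0 / (1.0 + dist_a + pb.index(ancestor))
    return 0.0

def transfer_sources(target, known_classes, k=1):
    """Rank known source classes by relatedness to the unseen target
    class and return the top-k candidates for knowledge transfer."""
    ranked = sorted(known_classes,
                    key=lambda c: path_similarity(target, c),
                    reverse=True)
    return ranked[:k]

# For the unseen class "wolf", "dog" is the closest known class.
print(transfer_sources("wolf", ["dog", "cat", "car"]))  # → ['dog']
```

Any graph-based or distributional relatedness measure could be dropped in for `path_similarity`; the point is only that the source selection is fully automatic, with no human supervision or target training examples.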
Download slides: cvpr2010_rohrbach_whw_01.v1.pdf (2.4 MB)
Download slides: cvpr2010_rohrbach_whw_01.ppt (10.6 MB)
Download article: cvpr2010_rohrbach_whw_01.pdf (1.0 MB)