
KADE: Aligning Knowledge Base and Document Embedding Models using Regularized Multi-Task Learning

Published on Nov 22, 2018 · 2772 views

Knowledge Bases (KBs) and textual documents contain rich and complementary information about real-world objects, as well as relations among them. While text documents describe entities in free form, KB…


Chapter list

Aligning Knowledge Base and Document Embedding Models using Regularized Multi-Task Learning - 1 (00:00)
Aligning Knowledge Base and Document Embedding Models using Regularized Multi-Task Learning - 2 (00:17)
Aligning Knowledge Base and Document Embedding Models using Regularized Multi-Task Learning - 3 (00:30)
Associate documents to entities and vice versa (00:47)
Documents and entities describe real-world objects, hence should be aligned (01:06)
What do you need to compare entities to documents? (01:30)
Descriptions can be represented in a vector space (02:41)
Embeddings learn a vector representation such that related object descriptions have similar vectors (03:04)
Embeddings from different models or different instantiations are not comparable (04:37)
KADE: Joining embedding models by regularization (05:57; see the sketch after this list)
KADE doesn’t break document embeddings - 1 (07:19)
KADE doesn’t break document embeddings - 2 (08:26)
KADE learns similar embeddings for document-entity pairs (09:48)
In conclusion, KADE learns a representation such that descriptions in different formats are comparable (11:44)
Future Work (12:06)
Aligning Knowledge Base and Document Embedding Models using Regularized Multi-Task Learning (13:40)
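
To make the "joining embedding models by regularization" chapter concrete, below is a minimal sketch of the general idea, not the authors' implementation: a simplified squared-distance stand-in for a TransE-style KB objective, a randomly initialized table standing in for a document embedding model, and an L2 regularizer that pulls each entity vector toward the vector of its paired document. The toy data, the hyperparameters lr and gamma, and all variable names are hypothetical.

# Illustrative sketch only (assumptions noted above), not the KADE code from the talk.
import numpy as np

rng = np.random.default_rng(0)
dim, n_entities, n_relations, n_docs = 16, 50, 5, 50

# Randomly initialized embedding tables.
E = rng.normal(scale=0.1, size=(n_entities, dim))   # KB entity embeddings
R = rng.normal(scale=0.1, size=(n_relations, dim))  # KB relation embeddings
D = rng.normal(scale=0.1, size=(n_docs, dim))       # document embeddings (stand-in)

# Toy training data: KB triples (h, r, t) and entity-document alignment pairs.
triples = [(rng.integers(n_entities), rng.integers(n_relations), rng.integers(n_entities))
           for _ in range(200)]
pairs = [(i, i) for i in range(n_entities)]  # entity i is described by document i

lr, gamma = 0.05, 0.1  # learning rate and regularization weight (hypothetical values)

for epoch in range(50):
    # 1) TransE-style update: push h + r toward t for observed triples
    #    (simplified squared distance instead of the usual margin ranking loss).
    for h, r, t in triples:
        grad = 2 * (E[h] + R[r] - E[t])   # gradient of ||E[h] + R[r] - E[t]||^2
        E[h] -= lr * grad
        R[r] -= lr * grad
        E[t] += lr * grad

    # 2) Regularization update: pull each entity toward its paired document
    #    (and vice versa), which is what makes the two spaces comparable.
    for e_idx, d_idx in pairs:
        diff = E[e_idx] - D[d_idx]        # gradient direction of gamma * ||E - D||^2
        E[e_idx] -= lr * gamma * diff
        D[d_idx] += lr * gamma * diff

# After training, the entity and document vectors of the same object are close,
# so similarity across the two embedding spaces becomes meaningful.
e0, d0 = E[0], D[0]
print(np.dot(e0, d0) / (np.linalg.norm(e0) * np.linalg.norm(d0)))

In this sketch the regularizer is the only coupling between the two models; each embedding model keeps its own objective, which is the multi-task aspect highlighted in the talk's title.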