Treat Different Negatives Differently: Enriching Loss Functions with Domain and Range Constraints for Link Prediction

Published on Jun 18, 2024 · 58 Views

Knowledge graph embedding models (KGEMs) are used for various tasks related to knowledge graphs (KGs), including link prediction. They are trained with loss functions that consider batches of true and false triples.
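As a rough illustration of what "treating different negatives differently" can mean in practice, the sketch below shows a margin-based ranking loss in which negative triples that violate a relation's domain or range constraint are weighted differently from negatives that respect it. This is a minimal sketch in PyTorch under assumed names: the function signature_aware_margin_loss, the boolean flag violates_signature, and the weighting scheme are illustrative assumptions, not the exact signature-driven loss functions presented in the talk.

import torch

def signature_aware_margin_loss(pos_scores, neg_scores, violates_signature,
                                margin=1.0, weight_violating=0.5, weight_valid=1.0):
    # pos_scores, neg_scores: (batch,) plausibility scores (higher = more plausible).
    # violates_signature: (batch,) bool tensor, True when the corrupted entity
    # breaks the relation's domain or range constraint (hypothetical flag for this sketch).
    # Per-negative weight: treat the two kinds of negatives differently.
    weights = torch.where(violates_signature,
                          torch.full_like(pos_scores, weight_violating),
                          torch.full_like(pos_scores, weight_valid))
    # Standard pairwise margin ranking term, re-weighted per negative.
    per_pair = torch.relu(margin - pos_scores + neg_scores)
    return (weights * per_pair).mean()

# Dummy usage with toy scores:
pos = torch.tensor([2.0, 1.5, 0.8])
neg = torch.tensor([1.0, 1.6, 0.2])
viol = torch.tensor([True, False, True])
print(signature_aware_margin_loss(pos, neg, viol))

Down- or up-weighting signature-violating negatives is only one possible way to inject domain and range knowledge into the loss; the losses discussed in the presentation may be formulated differently.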

Chapter list

Treat Different Negatives Differently: Enriching Loss Functions with Domain and Range Constraints for Link Prediction (00:00)
Introduction (00:16)
Link prediction with knowledge graph embedding models - 1 (00:18)
Link prediction with knowledge graph embedding models - 2 (00:19)
Semantic-enhanced approaches - 1 (00:20)
Semantic-enhanced approaches - 2 (00:46)
Semantic-enhanced approaches - 3 (00:51)
Semantic-enhanced approaches - 4 (00:53)
Signature-driven loss functions (02:34)
Loss functions for link prediction (02:39)
Different kinds of negatives (03:00)
Semantic-enhanced loss functions - 1 (03:58)
Semantic-enhanced loss functions - 2 (04:46)
Semantic-enhanced loss functions - 3 (04:47)
Semantic-enhanced loss functions - 4 (04:55)
Semantic-enhanced loss functions - 5 (05:16)
Semantic-enhanced loss functions - 6 (05:23)
Experiments (05:39)
Evaluation metrics - 1 (05:43)
Evaluation metrics - 2 (05:48)
Evaluation metrics: example - 1 (06:07)
Evaluation metrics: example - 2 (06:29)
Evaluation metrics: example - 3 (06:53)
Evaluation metrics: example - 4 (07:05)
Evaluation metrics: example - 5 (07:08)
Models (07:10)
Datasets (07:20)
Results (07:37)
Global performance - 1 (07:38)
Global performance - 2 (08:00)
Ablation study (09:13)
Ablation study (B1) - 1 (10:06)
Ablation study (B1) - 2 (10:29)
Discussion (10:47)
Treating different negatives differently (RQ1) (10:58)
Impact of signature-driven losses on performance (RQ2) (11:58)
Conclusion (12:43)
Take-home messages and future work - 1 (12:45)
Take-home messages and future work - 2 (12:54)
Thank you for your attention! (13:34)