Top-down Neural Attention by Excitation Backprop

Published on Oct 24, 2016 · 1634 views

We aim to model the top-down attention of a Convolutional Neural Network (CNN) classifier for generating task-specific attention maps. Inspired by a top-down human visual attention model, we propose a new backpropagation scheme, called Excitation Backprop, that passes top-down signals downward through the network hierarchy via a probabilistic Winner-Take-All process.
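To make the per-layer computation concrete, here is a minimal NumPy sketch of the Excitation Backprop rule for a single fully connected layer, assuming non-negative input activations (e.g. ReLU outputs); the function name and signature are illustrative, not taken from the authors' released code.

    import numpy as np

    def excitation_backprop_fc(x, W, p_out, eps=1e-12):
        """Redistribute top-down winning probabilities through one FC layer.

        x     : (n_in,)  non-negative input activations (e.g. after ReLU)
        W     : (n_out, n_in) layer weights (forward pass: y = W @ x)
        p_out : (n_out,) marginal winning probabilities of the output neurons
        Returns the (n_in,) marginal winning probabilities of the input neurons.
        """
        W_plus = np.maximum(W, 0.0)           # only excitatory connections propagate signal
        Z = W_plus @ x + eps                  # per-output normalization: sum_j x[j] * w_ij^+
        return x * (W_plus.T @ (p_out / Z))   # p_in[j] = x[j] * sum_i w_ij^+ * p_out[i] / Z[i]

Starting from a one-hot distribution over the class of interest at the output layer and applying this rule layer by layer yields the attention map at any intermediate layer; the contrastive maps discussed in the talk come from repeating the procedure with the output-layer weights negated and subtracting the result.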

Chapter list

Top-down Neural Attention by Excitation Backprop (00:00)
Motivation/1 (00:14)
Motivation/2 (00:26)
Goal: Generate Top-Down Attention Maps/1 (01:07)
Goal: Generate Top-Down Attention Maps/2 (01:21)
Goal: Generate Top-Down Attention Maps/3 (01:49)
Related Work (02:17)
Contributions (03:07)
The Selective Tuning Model [Tsotsos et al. 1995]/1 (04:04)
The Selective Tuning Model [Tsotsos et al. 1995]/2 (04:34)
Our Approach: Probabilistic Winner-Take-All (04:54)
Excitation Backprop/1 (05:54)
Excitation Backprop/2 (07:17)
Challenge: Responsive to Top-down Signals?/1 (07:32)
Challenge: Responsive to Top-down Signals?/2 (08:05)
Negating the Output Layer for Contrastive Signals/1 (08:23)
Negating the Output Layer for Contrastive Signals/2 (08:28)
Negating the Output Layer for Contrastive Signals/3 (08:31)
Contrastive Maps (08:41)
Evaluation: The Pointing Game (08:58)
Results on VOC07 (GoogleNet) (10:12)
Results on MS COCO (GoogleNet) (10:59)
Qualitative Comparison/1 (11:32)
Qualitative Comparison/2 (12:11)
Top-down Attention from an 18K-Tag Classifier (12:22)
An Interesting Case (13:30)
Phrase Localization (14:08)
Conclusion (14:45)