Neural attention for image captioning: review of outstanding methods

Journal

ARTIFICIAL INTELLIGENCE REVIEW
Volume 55, Issue 5, Pages 3833-3862

Publisher

SPRINGER
DOI: 10.1007/s10462-021-10092-2

Keywords

Image captioning; Attention mechanism; LSTM; Transformer

Abstract

Image captioning is the task of automatically generating sentences that best describe an input image. The most successful techniques for automatic caption generation have recently relied on attentive deep learning models, which vary in how their attention mechanisms are designed. In this survey, we review the literature on attentive deep learning models for image captioning. Rather than offering a comprehensive review of all prior work on deep image captioning models, we explain the various types of attention mechanisms used for the task. The most successful deep learning models for image captioning follow the encoder-decoder architecture, although they differ in how they employ attention. By analyzing performance results from different attentive deep models for image captioning, we aim to identify the most successful types of attention mechanisms. Soft attention, bottom-up attention, and multi-head attention are the types most widely used in state-of-the-art attentive models for image captioning. At present, the best results are achieved by variants of multi-head attention combined with bottom-up attention.
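To make the abstract's central mechanism concrete, the sketch below is a minimal, illustrative NumPy implementation of scaled dot-product attention split across multiple heads. It is not code from any surveyed model: the learned linear projections of a full Transformer layer are omitted, and in a captioning model the rows of K and V would be region features produced by a bottom-up object detector rather than random vectors.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(Q, K, V, num_heads):
    """Scaled dot-product attention computed independently per head.

    Q: (n_q, d_model); K, V: (n_kv, d_model). d_model must be divisible
    by num_heads. Illustrative sketch only: the learned per-head
    projection matrices of a real Transformer layer are omitted.
    """
    n_q, d_model = Q.shape
    d_head = d_model // num_heads
    out = np.empty((n_q, d_model))
    for h in range(num_heads):
        s = slice(h * d_head, (h + 1) * d_head)
        # Similarity of each query to each key, scaled by sqrt(d_head).
        scores = Q[:, s] @ K[:, s].T / np.sqrt(d_head)   # (n_q, n_kv)
        # Attention weights give a convex combination of the values.
        out[:, s] = softmax(scores) @ V[:, s]            # (n_q, d_head)
    return out
```

Because each output row is a softmax-weighted average of value rows, with a single head and identical keys the output reduces to the plain mean of the values; splitting into heads lets each subspace attend to different image regions.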
