Image-text dual neural network with decision strategy for small-sample image classification

Abstract: Small-sample classification is a challenging problem in computer vision. In this work, we show how to efficiently and effectively utilize the semantic information of annotations to improve small-sample classification performance. First, we propose an image-text dual neural network to improve classification performance on small-sample datasets. The proposed model consists of two sub-models: an image classification model and a text classification model. After training the sub-models separately, we design a novel method to fuse the two sub-models rather than simply combining their results. Our image-text dual neural network aims to utilize text information to overcome the difficulty of training deep models on small-sample datasets. Then, we incorporate a decision strategy into the image-text dual neural network to further improve the performance of our original model on few-shot datasets. To demonstrate the effectiveness of the proposed models, we conduct experiments on the LabelMe and UIUC-Sports datasets. Experimental results show that our method outperforms the compared models.
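The abstract describes two separately trained sub-models whose predictions are fused, plus a decision strategy layered on top. The paper's actual fusion method is not given here, so the following is only a minimal illustrative sketch of the general idea: two branches each emit class probabilities, a weighted late fusion combines them, and a (hypothetical) confidence-threshold decision rule falls back to the text branch when the image branch is uncertain. All function names, the weight `alpha`, and the threshold are assumptions, not the authors' design.

```python
import math

def softmax(logits):
    """Convert raw scores into a probability distribution."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def fuse_and_decide(image_logits, text_logits, alpha=0.6, threshold=0.5):
    """Hypothetical late fusion with a confidence-based decision strategy.

    alpha      -- assumed weight favoring the image branch in fusion
    threshold  -- assumed minimum image-branch confidence for fusion
    Returns the predicted class index.
    """
    p_img = softmax(image_logits)
    p_txt = softmax(text_logits)
    if max(p_img) >= threshold:
        # Image branch is confident: use a weighted average of branches.
        fused = [alpha * a + (1 - alpha) * b for a, b in zip(p_img, p_txt)]
    else:
        # Image branch is uncertain: defer to the text branch alone.
        fused = p_txt
    return fused.index(max(fused))

# Confident image branch: fused prediction follows the image scores.
print(fuse_and_decide([5.0, 0.0, 0.0], [0.0, 4.0, 0.0]))  # -> 0
# Uncertain image branch: decision strategy defers to the text branch.
print(fuse_and_decide([0.1, 0.0, 0.0], [0.0, 4.0, 0.0]))  # -> 1
```

The point of the sketch is the control flow, not the numbers: fusing calibrated probabilities (rather than hard labels) lets the text branch compensate when the image branch lacks training data, which is the motivation the abstract gives.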

Keywords: dual neural; image; text dual; classification; image text; small sample

Journal Title: Neurocomputing
Year Published: 2019
