Saliency-Aware Nonparametric Foreground Annotation Based on Weakly Labeled Data

Title:
Saliency-Aware Nonparametric Foreground Annotation Based on Weakly Labeled Data
Journal Title:
IEEE Transactions on Neural Networks and Learning Systems
Keywords:
Publication Date:
01 June 2016
Citation:
X. Cao, C. Zhang, H. Fu, X. Guo and Q. Tian, "Saliency-Aware Nonparametric Foreground Annotation Based on Weakly Labeled Data," in IEEE Transactions on Neural Networks and Learning Systems, vol. 27, no. 6, pp. 1253-1265, June 2016. doi: 10.1109/TNNLS.2015.2488637
Abstract:
In this paper, we focus on annotating the foreground of an image. More precisely, we predict both image-level labels (category labels) and object-level labels (locations) for objects in a target image within a unified framework. Traditional learning-based image annotation approaches are cumbersome because they need to build complex mathematical models and must be retrained whenever the scale of the training data changes considerably. We therefore advocate a nonparametric method, which has shown potential in numerous applications and is attractive for its lightweight training load and scalability. In particular, we exploit salient object windows to describe images, which benefits image retrieval and, thus, the subsequent image-level annotation and localization tasks. Our method, saliency-aware nonparametric foreground annotation, alleviates the need for fully labeled training data and effectively addresses the problem of foreground annotation. The proposed method relies only on retrieval results from the image database; pretrained object detectors are no longer necessary. Experimental results on the challenging PASCAL VOC 2007 and PASCAL VOC 2008 datasets demonstrate the advantages of our method.
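To make the retrieval-and-transfer idea in the abstract concrete, the sketch below shows a minimal nonparametric annotation step: describe a query image by a salient-window feature, retrieve its nearest neighbors from a weakly labeled database, and transfer a category label and window location from them. This is an illustrative assumption-laden sketch, not the authors' exact pipeline; the feature representation, distance metric, k-NN rule, and averaging-based localization are all placeholders chosen for clarity.

```python
# Minimal sketch of retrieval-based (nonparametric) foreground annotation.
# All names and the transfer rule are illustrative assumptions, not the
# paper's exact method.
import numpy as np

def annotate_foreground(query_feature, db_features, db_labels, db_windows, k=5):
    """Transfer an image-level label and an object-level window from the
    k nearest database images to the query.

    query_feature : (d,) feature describing the query's salient window
    db_features   : (n, d) salient-window features of the database images
    db_labels     : length-n list of weak (image-level) category labels
    db_windows    : (n, 4) salient-object boxes (x, y, w, h)
    """
    # Nonparametric step: rank database images by feature distance;
    # no model is trained, so the database can grow without retraining.
    dists = np.linalg.norm(db_features - query_feature, axis=1)
    nn = np.argsort(dists)[:k]

    # Image-level annotation: majority vote over the retrieved labels.
    votes = {}
    for i in nn:
        votes[db_labels[i]] = votes.get(db_labels[i], 0) + 1
    category = max(votes, key=votes.get)

    # Object-level annotation: average the windows of the neighbors
    # that agree with the predicted category.
    agreeing = [i for i in nn if db_labels[i] == category]
    location = np.asarray(db_windows)[agreeing].mean(axis=0)
    return category, location
```

Because annotation reduces to a nearest-neighbor lookup plus label transfer, no pretrained object detector is involved, which mirrors the scalability argument made in the abstract.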
License type:
Publisher copyrights
Funding Info:
Description:
(c) 2016 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.
ISSN:
2162-237X
Files uploaded:

tnnls-2015-p-4669.pdf (4.10 MB, PDF)