Semi-Supervised Few-Shot Segmentation with Noisy Support Images

Title:
Semi-Supervised Few-Shot Segmentation with Noisy Support Images
Journal Title:
2023 IEEE International Conference on Image Processing (ICIP)
Keywords:
Publication Date:
11 September 2023
Citation:
Zhang, R., Zhu, H., Zhang, H., Gong, C., Zhou, J. T., & Meng, F. (2023, October 8). Semi-Supervised Few-Shot Segmentation with Noisy Support Images. 2023 IEEE International Conference on Image Processing (ICIP). https://doi.org/10.1109/icip49359.2023.10222652
Abstract:
Motivated by semi-supervised learning, which uses unlabeled data and pseudo-annotations to improve image classification, this paper proposes a new semi-supervised few-shot segmentation (FSS) framework whose training process uses not only the annotated images but also unlabeled images, e.g., images from other available datasets, to enhance the training of the FSS model. Furthermore, in the test phase, additional support images and pseudo-annotations can be generated by the proposed framework to enrich the support set of novel classes and therefore benefit the inference. However, unlabeled images are not a free lunch. The noisy intra-class and inter-class samples present in the unlabeled images, as well as the interference from low-quality pseudo-annotations, make it difficult to select the correct images and pseudo-annotations for a given class. To this end, we further propose a ranking algorithm consisting of an inter-class confidence term and an intra-class confidence term to efficiently exploit the high-quality pseudo-annotations of each class. Extensive experiments on the COCO-20^i dataset demonstrate that the proposed semi-supervised FSS framework is superior to many state-of-the-art methods.
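To make the abstract's ranking idea concrete, the sketch below shows one plausible way to score pseudo-annotated candidates with an intra-class term (similarity to the target-class prototype) and an inter-class term (margin over the most similar non-target prototype). This is only an illustration under assumed design choices; the function names, cosine-similarity formulation, and the alpha weighting are assumptions and are not taken from the paper.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def rank_pseudo_supports(candidate_feats, target_proto, other_protos, top_k=5, alpha=0.5):
    """Rank pseudo-annotated candidates for one class (illustrative sketch).

    candidate_feats: masked-average-pooled features, one per unlabeled image
                     with its pseudo mask (hypothetical inputs).
    target_proto:    prototype feature of the target class.
    other_protos:    prototypes of the other, potentially interfering classes.
    Returns the indices of the top_k candidates by combined confidence.
    """
    scores = []
    for f in candidate_feats:
        # Intra-class confidence: how close the candidate is to the target class.
        intra = cosine(f, target_proto)
        # Inter-class confidence: margin over the most similar non-target class,
        # penalizing candidates that resemble other classes.
        inter = intra - max(cosine(f, p) for p in other_protos)
        scores.append(alpha * intra + (1.0 - alpha) * inter)
    order = np.argsort(scores)[::-1]
    return order[:top_k].tolist()

# Toy usage with random features (illustration only).
rng = np.random.default_rng(0)
feats = [rng.normal(size=256) for _ in range(20)]
target = rng.normal(size=256)
others = [rng.normal(size=256) for _ in range(3)]
print(rank_pseudo_supports(feats, target, others, top_k=3))
```

In this sketch, only candidates that are both close to the target prototype and clearly separated from other class prototypes receive high scores, which matches the abstract's goal of filtering noisy intra-class and inter-class samples before using them as extra supports.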
License type:
Publisher Copyright
Funding Info:
This research / project is supported by the A*STAR - AME Programmatic Funding
Grant Reference no. : A18A2b0046

This research / project is supported by the A*STAR - Robot HTPO Seed Fund
Grant Reference no. : C211518008

This research / project is supported by the Economic Development Board (EDB) - Space Technology Development Grant
Grant Reference no. : S22-19016-STDP

This work was supported in part by the National Key R&D Program of China under Grant 2021ZD0112000, the National Natural Science Foundation of China under Grant 62271119, the Natural Science Foundation of Sichuan Province under Grant 2023NSFSC1972, and the Natural Science Foundation of Jiangsu Province under Grant BZ2021013.
Description:
© 2023 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
ISBN:
978-1-7281-9835-4
Files uploaded:

icip2023-camera-ready-4.pdf, 3.60 MB, PDF (available on request)