Uncertainty-inspired open set learning for retinal anomaly identification

Title:
Uncertainty-inspired open set learning for retinal anomaly identification
Journal Title:
Nature Communications
Publication Date:
24 October 2023
Citation:
Wang, M., Lin, T., Wang, L., Lin, A., Zou, K., Xu, X., Zhou, Y., Peng, Y., Meng, Q., Qian, Y., Deng, G., Wu, Z., Chen, J., Lin, J., Zhang, M., Zhu, W., Zhang, C., Zhang, D., Goh, R. S. M., … Fu, H. (2023). Uncertainty-inspired open set learning for retinal anomaly identification. Nature Communications, 14(1). https://doi.org/10.1038/s41467-023-42444-7
Abstract:
Failure to recognize samples from classes unseen during training is a major limitation of artificial intelligence (AI) in real-world recognition and classification of retinal anomalies. We establish an uncertainty-inspired open set (UIOS) model, trained on fundus images of 9 retinal conditions. Besides assessing the probability of each category, UIOS also calculates an uncertainty score to express its confidence. With a thresholding strategy, our UIOS model achieves F1 scores of 99.55%, 97.01% and 91.91% on the internal testing set, the external target categories (TC)-JSIEC dataset and the TC-unseen testing set, respectively, compared to F1 scores of 92.20%, 80.69% and 64.74% for the standard AI model. Furthermore, UIOS correctly predicts high uncertainty scores, which would prompt the need for a manual check, on datasets of non-target-category retinal diseases, low-quality fundus images, and non-fundus images. UIOS provides a robust method for real-world screening of retinal anomalies.
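The abstract describes a model that outputs per-category probabilities plus an uncertainty score, with a threshold deciding whether a prediction is accepted or referred for manual review. The sketch below illustrates the general idea using the subjective-logic formulation common in evidential open-set models, where uncertainty is derived from per-class Dirichlet evidence (u = K/S). This is a minimal illustration, not the authors' implementation; the `evidence` inputs and the `threshold` value are assumptions for demonstration only.

```python
import numpy as np

def dirichlet_uncertainty(evidence):
    """Subjective-logic uncertainty from non-negative per-class evidence.

    alpha_k = e_k + 1, S = sum(alpha_k); expected probabilities are
    alpha / S and the uncertainty mass is u = K / S. With zero evidence
    for every class, u = 1 (fully uncertain).
    """
    evidence = np.asarray(evidence, dtype=float)
    k = evidence.size
    alpha = evidence + 1.0
    s = alpha.sum()
    probs = alpha / s   # expected class probabilities
    u = k / s           # uncertainty mass in [0, 1]
    return probs, u

def screen(evidence, threshold=0.3):
    """Accept the top class if uncertainty is low, else refer to a clinician.

    The threshold value here is illustrative, not the paper's setting.
    """
    probs, u = dirichlet_uncertainty(evidence)
    if u > threshold:
        return "manual check", u
    return int(np.argmax(probs)), u

# Strong evidence for one of 9 conditions -> confident prediction.
decision, u = screen([0, 0, 0, 0, 0, 0, 0, 0, 90])
# No evidence (e.g., an out-of-distribution image) -> referred for review.
decision_ood, u_ood = screen([0] * 9)
```

In this formulation an out-of-distribution input that excites little class evidence yields a large uncertainty mass and is routed to a human reviewer, mirroring the screening workflow the abstract describes.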
License type:
Attribution 4.0 International (CC BY 4.0)
Funding Info:
This research / project is supported by the A*STAR - Career Development Fund
Grant Reference no. : C222812010

This research / project is supported by the A*STAR - Central Research Fund
Grant Reference no. : NA

This research / project is supported by the A*STAR - AME Programmatic
Grant Reference no. : A20H4b0141
ISSN:
2041-1723