Guo, E., Fu, H., Zhou, L., & Xu, D. (2023). Bridging Synthetic and Real Images: a Transferable and Multiple Consistency aided Fundus Image Enhancement Framework. IEEE Transactions on Medical Imaging, 1–1. https://doi.org/10.1109/tmi.2023.3247783
Deep learning-based image enhancement models have greatly improved the readability of fundus images, reducing the uncertainty of clinical observations and the risk of misdiagnosis. However, because paired real fundus images at different quality levels are difficult to acquire, most existing methods must train on synthetic image pairs. The domain shift between synthetic and real images inevitably hinders the generalization of such models to clinical data. In this work, we propose an end-to-end optimized teacher-student framework that performs image enhancement and domain adaptation simultaneously. The student network uses synthetic pairs for supervised enhancement, and the framework regularizes the enhancement model to reduce domain shift by enforcing teacher-student prediction consistency on real fundus images, without relying on enhanced ground truth. Moreover, we propose a novel multi-stage multi-attention guided enhancement network (MAGE-Net) as the backbone of both the teacher and student networks. MAGE-Net uses a multi-stage enhancement module and a retinal structure preservation module to progressively integrate multi-scale features while preserving retinal structures, yielding better fundus image quality enhancement. Comprehensive experiments on both real and synthetic datasets demonstrate that our framework outperforms baseline approaches, and our method also benefits downstream clinical tasks.
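The core unsupervised ingredient described above is a teacher-student consistency term computed on real (unpaired) fundus images. The following is a minimal NumPy sketch of that idea only, not the authors' MAGE-Net implementation; the exponential-moving-average (EMA) teacher update is a common mean-teacher choice and is an assumption here, as are the function names.

```python
import numpy as np

def ema_update(teacher_params, student_params, alpha=0.99):
    """Update teacher weights as an EMA of student weights.

    t_new = alpha * t_old + (1 - alpha) * s
    (EMA teacher is an assumed design choice, not stated in the abstract.)
    """
    return {name: alpha * teacher_params[name] + (1 - alpha) * student_params[name]
            for name in teacher_params}

def consistency_loss(teacher_pred, student_pred):
    """Mean-squared consistency between teacher and student enhancements
    of the same real fundus image (no enhanced ground truth required)."""
    return float(np.mean((teacher_pred - student_pred) ** 2))

# Toy usage: identical predictions give zero consistency loss.
t = {"w": np.array([1.0, 2.0])}
s = {"w": np.array([0.0, 0.0])}
t = ema_update(t, s, alpha=0.9)          # teacher drifts slowly toward student
loss = consistency_loss(np.ones(4), np.ones(4))  # -> 0.0
```

In the full framework this consistency term on real images would be combined with a supervised enhancement loss on the synthetic pairs.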
This research/project is supported by the A*STAR Central Research Fund (Grant Reference No.: NA).
This research/project is supported by the A*STAR AME Programmatic Fund (Grant Reference No.: A20H4b0141).
This work was supported by the Australian Research Council (ARC) under Grant DP200103223, by The Hong Kong Jockey Club Charities Trust (No. 2022-0174), and by startup funding from The University of Hong Kong.