Few-Shot Domain Adaptation with Polymorphic Transformers

Journal Title:
Medical Image Computing and Computer Assisted Intervention – MICCAI 2021
Publication Date:
23 September 2021
Li, S., Sui, X., Fu, J., Fu, H., Luo, X., Feng, Y., Xu, X., Liu, Y., Ting, D. S. W., & Goh, R. S. M. (2021). Few-Shot Domain Adaptation with Polymorphic Transformers. Lecture Notes in Computer Science, 330–340. https://doi.org/10.1007/978-3-030-87196-3_31
Deep neural networks (DNNs) trained on one set of medical images often suffer a severe performance drop on unseen test images, due to various domain discrepancies between the training images (source domain) and the test images (target domain), raising a domain adaptation problem. In clinical settings, it is difficult to collect enough annotated target-domain data in a short period. Few-shot domain adaptation, i.e., adapting a trained model with a handful of annotations, is highly practical and useful in this case. In this paper, we propose a Polymorphic Transformer (Polyformer), which can be incorporated into any DNN backbone for few-shot domain adaptation. Specifically, after the polyformer layer is inserted into a model trained on the source domain, it extracts a set of prototype embeddings, which can be viewed as a "basis" of the source-domain features. On the target domain, the polyformer layer adapts by updating only a projection layer, which controls the interactions between image features and the prototype embeddings. All other model weights (except BatchNorm parameters) are frozen during adaptation. Thus, the chance of overfitting the annotations is greatly reduced, and the model can perform robustly on the target domain after being trained on a few annotated images. We demonstrate the effectiveness of Polyformer on two medical segmentation tasks (i.e., optic disc/cup segmentation, and polyp segmentation). The source code of Polyformer is released at https://github.com/askerlee/segtran.
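To illustrate the mechanism the abstract describes, here is a minimal, hypothetical NumPy sketch (not the authors' implementation; see the linked repository for that). It assumes the polyformer layer stores frozen prototype embeddings learned on the source domain, and that few-shot adaptation updates only a projection matrix `W_proj` applied to image features before they cross-attend to the prototypes; all names and shapes here are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class PolyformerLayerSketch:
    """Hypothetical sketch of the Polyformer idea: image features interact
    with a small set of prototype embeddings (a 'basis' of source-domain
    features) via attention. On a new domain, only W_proj is trainable;
    the prototypes and the rest of the backbone stay frozen."""

    def __init__(self, dim, num_prototypes, rng):
        # Learned on the source domain, then frozen during adaptation.
        self.prototypes = rng.standard_normal((num_prototypes, dim))
        # The only weights updated during few-shot adaptation.
        self.W_proj = np.eye(dim)

    def forward(self, feats):
        # feats: (N, dim) flattened image features from the backbone.
        q = feats @ self.W_proj                      # adapt features via the projection
        scores = q @ self.prototypes.T / np.sqrt(q.shape[-1])
        attn = softmax(scores, axis=-1)              # (N, num_prototypes)
        # Re-express each feature as a mixture of source-domain prototypes.
        return attn @ self.prototypes                # (N, dim)
```

Because only `W_proj` (a `dim × dim` matrix) receives gradients on the target domain, the number of adaptable parameters is tiny relative to the backbone, which is the paper's stated reason that overfitting a handful of annotations is unlikely.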
License type:
Publisher Copyright
Funding Info:
This research / project is supported by the A*STAR - Career Development Fund
Grant Reference no. : C210112016

This research / project is supported by the A*STAR - Advanced Manufacturing and Engineering (AME) programme
Grant Reference no. : A18A2b0046
This version of the article has been accepted for publication, after peer review (when applicable) and is subject to Springer Nature’s AM terms of use, but is not the Version of Record and does not reflect post-acceptance improvements, or any corrections. The Version of Record is available online at: https://doi.org/10.1007/978-3-030-87196-3_31
Files uploaded:
polyformer-miccai.pdf (614.43 KB, PDF)