A two-step deep learning method for 3DCT-2DUS kidney registration during breathing

Title:
A two-step deep learning method for 3DCT-2DUS kidney registration during breathing
Journal Title:
Scientific Reports
Publication Date:
08 August 2023
Citation:
Chi, Y., Xu, Y., Liu, H., Wu, X., Liu, Z., Mao, J., Xu, G., & Huang, W. (2023). A two-step deep learning method for 3DCT-2DUS kidney registration during breathing. Scientific Reports, 13(1). https://doi.org/10.1038/s41598-023-40133-5
Abstract:
This work proposed KidneyRegNet, a novel deep registration pipeline for 3D CT and 2D U/S kidney scans acquired during free breathing, comprising a feature network and a 3D–2D CNN-based registration network. The feature network has handcrafted texture feature layers to reduce the semantic gap. The registration network is an encoder-decoder structure with a feature-image-motion (FIM) loss, which enables hierarchical regression at the decoder layers and avoids concatenating multiple networks. It was first pretrained on a retrospective dataset together with a training-data generation strategy, and then adapted to specific patient data via unsupervised one-cycle transfer learning in onsite applications. Experiments were performed on 132 U/S sequences, 39 multi-phase CT images, 210 public single-phase CT images, and 25 pairs of CT and U/S sequences. Registration yielded a mean contour distance (MCD) of 0.94 mm between kidneys on CT and U/S images, and an MCD of 1.15 mm between kidneys on CT and reference CT images. Datasets with small transformations resulted in MCDs of 0.82 and 1.02 mm, respectively; datasets with large transformations resulted in MCDs of 1.10 and 1.28 mm, respectively. This work addressed the difficulties of 3DCT-2DUS kidney registration during free breathing via novel network structures and training strategies.
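
The mean contour distance (MCD) quoted above can be made concrete with a short sketch. The following is a minimal, hypothetical Python/NumPy implementation of an MCD between two contours sampled as 2D point sets; the symmetric nearest-neighbor formulation, the function name, and the toy data are assumptions for illustration, not the authors' evaluation code.

# Minimal sketch (not the authors' evaluation code): mean contour
# distance (MCD) between two kidney contours sampled as point sets.
# The symmetric nearest-neighbor formulation here is an assumption.
import numpy as np

def mean_contour_distance(contour_a: np.ndarray, contour_b: np.ndarray) -> float:
    """Symmetric MCD between (N, 2) and (M, 2) arrays of contour points, in mm."""
    # Pairwise Euclidean distances between all points of the two contours.
    diff = contour_a[:, None, :] - contour_b[None, :, :]
    dists = np.linalg.norm(diff, axis=-1)
    # For each point, take the distance to the nearest point on the other
    # contour, average in each direction, then combine symmetrically.
    a_to_b = dists.min(axis=1).mean()
    b_to_a = dists.min(axis=0).mean()
    return 0.5 * (a_to_b + b_to_a)

if __name__ == "__main__":
    # Toy example: a 30 mm-radius circular contour versus the same
    # contour translated by 1 mm, mimicking a small registration error.
    theta = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
    circle = np.stack([30.0 * np.cos(theta), 30.0 * np.sin(theta)], axis=-1)
    print(f"MCD = {mean_contour_distance(circle, circle + [1.0, 0.0]):.2f} mm")
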
License type:
Attribution 4.0 International (CC BY 4.0)
Funding Info:
This research/project is supported by the A*STAR GAP Fund
Grant Reference no. : ACCL/19-GAP035-R20H
ISSN:
2045-2322