End to End Unsupervised Rigid Medical Image Registration by Using Convolutional Neural Networks

Title:
End to End Unsupervised Rigid Medical Image Registration by Using Convolutional Neural Networks
Journal Title:
2021 43rd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC)
Publication Date:
09 December 2021
Citation:
H. Liu et al., "End to End Unsupervised Rigid Medical Image Registration by Using Convolutional Neural Networks," 2021 43rd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), 2021, pp. 4064-4067, doi: 10.1109/EMBC46164.2021.9630351.
Abstract:
In this paper, we focus on rigid medical image registration using deep learning. Under ultrasound, the motion of some organs, e.g., the liver and kidney, can be modeled as rigid motion. Therefore, when the ultrasound probe remains stationary, the registration between frames can be modeled as rigid registration. We propose an unsupervised method based on Convolutional Neural Networks. The network first estimates the transform parameters from the input image pair, and the moving image is then warped using these parameters. The loss is calculated between the registered image and the fixed image. Experiments on ultrasound data of the kidney and liver verified that the method achieves higher accuracy than traditional methods and is much faster.
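
A minimal sketch of the unsupervised rigid-registration idea described in the abstract, not the authors' implementation: a small CNN predicts rigid transform parameters (a 2-D rotation angle and translations, an assumed parameterisation) from the concatenated fixed/moving pair, the moving image is warped with those parameters via a differentiable sampler, and an image-similarity loss (MSE here, an assumption) is computed against the fixed image, so no ground-truth transforms are needed.

# Sketch only: network depth, loss choice, and rigid parameterisation are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RigidRegNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Predict rotation angle theta and translations (tx, ty).
        self.head = nn.Linear(64, 3)

    def forward(self, fixed, moving):
        x = torch.cat([fixed, moving], dim=1)             # (B, 2, H, W)
        params = self.head(self.features(x).flatten(1))   # (B, 3)
        theta, tx, ty = params[:, 0], params[:, 1], params[:, 2]
        cos, sin = torch.cos(theta), torch.sin(theta)
        # 2x3 affine matrices restricted to rigid motion (rotation + translation).
        mat = torch.stack([
            torch.stack([cos, -sin, tx], dim=1),
            torch.stack([sin,  cos, ty], dim=1),
        ], dim=1)                                          # (B, 2, 3)
        grid = F.affine_grid(mat, moving.size(), align_corners=False)
        warped = F.grid_sample(moving, grid, align_corners=False)
        return warped, params

def train_step(net, optimizer, fixed, moving):
    # Unsupervised: the loss compares the warped moving image to the fixed image.
    warped, _ = net(fixed, moving)
    loss = F.mse_loss(warped, fixed)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

Because the warping is differentiable, the similarity loss back-propagates through the predicted parameters, which is what allows training without annotated transforms.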
License type:
Publisher Copyright
Funding Info:
This research / project is supported by the Agency for Science, Technology and Research.
Grant Reference no. : ACCL/19-GAP035-R20H
Description:
© 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
ISSN:
2694-0604
2375-7477
ISBN:
978-1-7281-1179-7
978-1-7281-1178-0
978-1-7281-1180-3