
Student: 希爾米 (Muhamad Hilmil Muchtar Aditya Pradana)
Thesis Title: An Ensemble 3D Alignment Method in Application to 3D CT-MRI Fusion and 3D Brain Mapping
Advisor: 王靖維 (Ching-Wei Wang)
Committee Members: 郭景明 (Jing-Ming Guo), 李忠興 (Hui-Ching Lin), 王靖維 (Ching-Wei Wang)
Degree: Master
Department: Graduate Institute of Biomedical Engineering, College of Applied Science and Technology
Year of Publication: 2017
Graduation Academic Year: 105
Language: English
Number of Pages: 150
Keywords: Image registration, a new ensemble approach, 3D CT and MRI datasets, Drosophila brain mapping
Abstract:
Image registration plays an important role in medical imaging, treatment and surgery. In the current clinical setting and in healthcare technology development, medical imaging is a vital component of medical applications and health diagnosis. Computed tomography (CT) is useful for acquiring detailed hard-tissue information and is superior to magnetic resonance imaging (MRI) for assessing bony architecture, while MRI is useful for capturing internal soft-tissue details, including organs, blood vessels and nerves. By combining CT and MRI data, medical experts can examine 3D information on both bones and soft tissues. Fusion of CT and MRI assists medical experts in treatment planning and surgery, and it greatly reduces the risk of damaging nerves during surgery. Hence, accurate alignment of CT and MRI is beneficial for clinical applications. In the biotechnology field, aligning images of adult Drosophila brains in three dimensions (3D) is useful for identifying the potential connectivity of neuronal circuits. For studies of behavioural neural activity in parts of the brain, accurate brain alignment is a prerequisite for detecting anatomical features that correlate with behavioural phenotypes. Previous studies show that registration of medical images is challenging because of variations in data appearance, imaging artifacts and complex deformation. Hence, this study presents a new ensemble approach to image registration for medical images, which improves on the performance of the individual base methods. The proposed method is tested on two types of data: multi-modality alignment of 3D CT and MRI datasets and super-high-resolution Drosophila brain mapping. In the evaluation, the proposed method is compared with nine state-of-the-art image registration techniques, and the experimental results show that it consistently performs well on both applications, achieving averaged error distances of 9.444 pixels for the 3D CT-MRI registration of the human datasets and 4.1642 pixels for the Drosophila brain mapping application. In comparison, the benchmark methods obtain averaged error distances of 21.99 to 53.69 pixels and 5.01 to 7.37 pixels for the same two applications, respectively. Based on paired-samples t-tests, the proposed method is significantly better than the benchmark approaches on both the CT-MRI human datasets (p ≤ 0.001) and the Drosophila brain datasets (p ≤ 0.001).
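The abstract reports accuracy as an averaged error distance in pixels and compares methods with a paired-samples t-test. Since the thesis body is not reproduced in this record, the Python sketch below only illustrates that style of evaluation under my own assumptions: landmark coordinates are taken as NumPy arrays, the per-case error values are invented purely for illustration, and the function names are not from the thesis.

```python
import numpy as np
from scipy import stats

def mean_error_distance(predicted, reference):
    """Average Euclidean distance between matched 3D landmarks.

    Both arrays have shape (N, 3): N annotated landmark pairs in pixel units.
    """
    return float(np.mean(np.linalg.norm(predicted - reference, axis=1)))

# Simulated landmark sets, only to exercise the metric (not thesis data).
rng = np.random.default_rng(0)
reference_pts = rng.uniform(0, 255, size=(20, 3))
predicted_pts = reference_pts + rng.normal(0, 2.0, size=(20, 3))
print(f"example landmark error = {mean_error_distance(predicted_pts, reference_pts):.3f} px")

# Hypothetical per-case averaged error distances (one value per test volume)
# for a proposed method and one benchmark; the numbers are made up.
proposed = np.array([9.1, 8.7, 10.2, 9.5, 9.9])
benchmark = np.array([22.4, 25.0, 21.3, 23.8, 24.1])

# Paired-samples t-test: the same cases are registered by both methods,
# so their per-case errors form matched pairs.
t_stat, p_value = stats.ttest_rel(proposed, benchmark)
print(f"proposed mean = {proposed.mean():.3f} px, "
      f"benchmark mean = {benchmark.mean():.3f} px, p = {p_value:.4g}")
```

A paired test is appropriate here because every method is run on the same test cases, so the per-case errors of two methods are matched observations rather than independent samples.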



Table of Contents:
Abstract (iii)
Publication (v)
Acknowledgment (vi)
Table of Contents (vii)
List of Tables (ix)
List of Figures (xi)
1 Introduction (1)
  1.1 Motivation (1)
  1.2 Aim and Objectives (3)
  1.3 Contributions (3)
  1.4 Thesis Organization (4)
2 Related Works (6)
3 Methods (10)
  3.1 Data Normalization (10)
  3.2 Global Registration (11)
  3.3 Base Local Registration Methods (14)
    3.3.1 1st Base Local Registration Method (P1) (16)
    3.3.2 2nd Base Local Registration Method (P2) (20)
    3.3.3 3rd Base Local Registration Method (P3) (21)
  3.4 Model Selection (22)
  3.5 Optimization Parameter Tests (22)
4 Results (26)
  4.1 Data Materials (26)
    4.1.1 Human Data (28)
    4.1.2 Brain Data (30)
    4.1.3 Evaluation Approaches (32)
  4.2 Experimental Results (32)
    4.2.1 Evaluation Result using Human Datasets (34)
    4.2.2 Evaluation Result using Brain Datasets (71)
    4.2.3 Computation Time (78)
5 Conclusion (81)
  5.1 Summary of Contributions (81)
  5.2 System Limitation and Future Work (82)
References (84)
Appendix A (89)
Appendix B (105)
Appendix C (113)
Appendix D (144)
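The outline above lists a global registration step, three base local registration methods (P1 to P3) and a model-selection step (Section 3.4). The thesis text itself is not included in this record, so the sketch below is only a generic illustration of how an ensemble could choose among candidate alignments: it assumes each base method has already produced a warped moving volume as a NumPy array and uses zero-mean normalized cross-correlation against the fixed volume as the selection score. Both the metric and the function names are my assumptions, not the thesis's actual selection rule.

```python
import numpy as np

def normalized_cross_correlation(fixed, warped):
    """Zero-mean normalized cross-correlation between two volumes of equal shape."""
    f = fixed.astype(np.float64).ravel()
    w = warped.astype(np.float64).ravel()
    f -= f.mean()
    w -= w.mean()
    denom = np.linalg.norm(f) * np.linalg.norm(w)
    return float(np.dot(f, w) / denom) if denom > 0 else 0.0

def select_best_alignment(fixed_volume, candidate_warped_volumes):
    """Pick the candidate whose similarity to the fixed volume is highest.

    `candidate_warped_volumes` maps a method label (e.g. "P1", "P2", "P3")
    to the moving volume after that method's registration.
    """
    scores = {name: normalized_cross_correlation(fixed_volume, vol)
              for name, vol in candidate_warped_volumes.items()}
    best = max(scores, key=scores.get)
    return best, scores
```

For multi-modal CT-MRI data, an intensity-based score such as mutual information would normally be preferred over cross-correlation; the structure of the selection loop would stay the same.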

