
Graduate Student: Cheng-Tse Lin (林承澤)
Thesis Title: Using Generative Adversarial Network to Convert Interference Fringes to Wavefront for Aberration Prediction (利用生成對抗網路轉換干涉條紋至波前應用於像差預測)
Advisors: Jong-Woei Whang (黃忠偉), Yi-Yung Chen (陳怡永)
Oral Examination Committee: Jong-Woei Whang (黃忠偉), Yi-Yung Chen (陳怡永), Jui-Chu Lin (林瑞珠), Kung-Jeng Wang (王孔政)
Degree: Master
Department: Graduate Institute of Electro-Optical Engineering, College of Electrical Engineering and Computer Science
Year of Publication: 2021
Graduation Academic Year: 109
Language: Chinese
Number of Pages: 75
Keywords: Interference fringes, Wavefront sensing, Zernike polynomials, Generative Adversarial Network

Aberrations have a decisive influence on the imaging quality of an optical system, which makes their measurement essential. Because aberrations arise mainly from the distortion of the ideal wavefront within the optical system, determining their type and magnitude requires measuring the wavefront of the system and defining the aberrations from that wavefront. Traditional wavefront and aberration measurements involve a variety of complex mathematical algorithms, which limits their speed and versatility. Earlier work in our laboratory therefore proposed using deep learning to predict the aberration coefficients from interference fringes and from phase maps, in the hope of accelerating the whole measurement process; the results showed, however, that coefficients predicted from interference fringes carry a much larger error than those predicted from phase maps.

This study proposes a phase-generation model based on the Generative Adversarial Network (GAN), named PhaseGAN, which exploits the strong image-generation capability of GANs to convert interference fringes into the phase maps of their corresponding wavefronts. The model is verified both with the interference formulas and with wave-optics simulation software, and is combined with the neural network from our previous study. This combination not only confirms the performance of the generative model but also allows the entire aberration-estimation process to be replaced by neural networks, reducing the overall error from interference fringes to aberration measurement.


In designing and analyzing an optical system, aberration is one of the most critical evaluation indicators because it directly determines the imaging quality of the entire system. Aberrations arise mainly from the distortion of the ideal wavefront within the optical system, so to know their type and magnitude we must measure the wavefront of the system and define the aberrations based on it. Traditional wavefront and aberration measurements involve a variety of complex mathematical algorithms, which limits their speed and versatility. Previous work from our laboratory therefore used deep learning to predict the aberration coefficients directly from interference fringes and from phase maps, in order to speed up the entire aberration-measurement process. The results showed, however, that the error of the coefficients predicted from interference fringes is much larger than that of the coefficients predicted from phase maps.
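For context, "defining the aberration based on this wavefront" and the coefficient-prediction networks mentioned above refer to the standard Zernike representation of the wavefront error over the unit pupil (standard textbook notation, not a formula quoted from the thesis text):

\[
W(\rho,\theta) = \sum_{j=1}^{N} c_j \, Z_j(\rho,\theta), \qquad 0 \le \rho \le 1,
\]

where Z_j are the Zernike polynomials, c_j are the aberration coefficients to be estimated, and N is the number of terms retained in the fit.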

This thesis proposes an image-to-image wavefront sensing approach that uses a deep neural network to predict the phase image directly from the corresponding interference fringe image, instead of reconstructing it from Zernike coefficients. The model is based on the Generative Adversarial Network (GAN) and is named PhaseGAN. To train the model, interference fringe images are used as the inputs of the GAN, which conditionally predicts the corresponding phase images as outputs. We numerically evaluate the performance by computing the similarity between the actual phase image and the model output, and optical simulation software is further used to verify the proposed method. This approach acquires the phase image directly and reduces the overall error, improving the accuracy of interference-fringe-based measurement.
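To make the fringe-to-phase mapping concrete, the following is a minimal sketch of the kind of conditional-GAN training step the abstract describes, written in PyTorch. The layer sizes, loss weighting, and helper names (make_generator, make_discriminator, train_step) are illustrative assumptions, not the thesis' actual PhaseGAN architecture or hyperparameters.

```python
# Sketch of a conditional GAN that maps interference fringe images to phase maps.
# All architectural details below are assumptions for illustration only.
import torch
import torch.nn as nn

def make_generator():
    # Toy encoder-decoder: fringe image (1 channel) -> phase map (1 channel).
    return nn.Sequential(
        nn.Conv2d(1, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
        nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
        nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
        nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1), nn.Tanh(),
    )

def make_discriminator():
    # Conditional discriminator sees fringe and phase stacked as 2 channels.
    return nn.Sequential(
        nn.Conv2d(2, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
        nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 1),
    )

G, D = make_generator(), make_discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.5, 0.999))
bce = nn.BCEWithLogitsLoss()
l1 = nn.L1Loss()

def train_step(fringe, phase):
    """One conditional-GAN update: fringe image in, phase map out."""
    fake_phase = G(fringe)

    # Discriminator: real (fringe, phase) pairs vs. generated pairs.
    opt_d.zero_grad()
    real_logit = D(torch.cat([fringe, phase], dim=1))
    fake_logit = D(torch.cat([fringe, fake_phase.detach()], dim=1))
    loss_d = bce(real_logit, torch.ones_like(real_logit)) + \
             bce(fake_logit, torch.zeros_like(fake_logit))
    loss_d.backward()
    opt_d.step()

    # Generator: fool the discriminator and stay close to the true phase map.
    opt_g.zero_grad()
    fake_logit = D(torch.cat([fringe, fake_phase], dim=1))
    loss_g = bce(fake_logit, torch.ones_like(fake_logit)) + 100.0 * l1(fake_phase, phase)
    loss_g.backward()
    opt_g.step()
    return loss_d.item(), loss_g.item()

# Example with random tensors standing in for one (fringe, phase) training batch.
fringe = torch.randn(4, 1, 64, 64)
phase = torch.randn(4, 1, 64, 64)
print(train_step(fringe, phase))
```

In this pix2pix-style setup the discriminator judges (fringe, phase) pairs, so the generator is pushed toward phase maps that are both plausible and consistent with the input fringe. The similarity evaluation the abstract mentions could then be computed between fake_phase and the ground-truth phase with a metric such as SSIM.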

Table of Contents:
Chinese Abstract / Abstract / Acknowledgements / Table of Contents / List of Figures / List of Tables
Chapter 1  Introduction
  1-1 Research Background
  1-2 Research Motivation
  1-3 Thesis Organization
Chapter 2  Literature Review
  2-1 Overview and Measurement of Aberrations
  2-2 Interference Principles and Interferometer Architecture
  2-3 Interference Formula
  2-4 Traditional Wavefront Aberration Estimation Methods
    2-4-1 Converting Interference Fringes to Phase
    2-4-2 Phase Unwrapping
    2-4-3 Polynomial Fitting of the Wavefront
  2-5 Deep-Learning-Based Wavefront Aberration Estimation
  2-6 Generative Adversarial Network (GAN)
  2-7 Improvements to GANs
    2-7-1 Wasserstein GAN
    2-7-2 Spectral Normalization GAN
    2-7-3 Deep Convolution GAN
Chapter 3  Research Methods
  3-1 Goal Setting
  3-2 System Architecture
  3-3 Training Dataset
  3-4 Pilot Experiment
  3-5 Generator
    3-5-1 Effect of Activation Functions
    3-5-2 Effect of Normalization Layers
    3-5-3 Summary
  3-6 Discriminator
    3-6-1 Effect of the Number of Convolutional Layers
    3-6-2 Effect of Discriminator Update Frequency
    3-6-3 Summary
  3-7 Training Environment
  3-8 Neural Network Architecture
  3-9 Training Parameter Settings
Chapter 4  Experimental Results and Discussion
  4-1 Model Testing
  4-2 Model Implementation
    4-2-1 Experimental Setup
    4-2-2 Experimental Results
  4-3 Transfer Learning
  4-4 Transfer Learning Results
Chapter 5  Conclusion and Future Work
  5-1 Conclusion
  5-2 Future Work
References
Appendix


Full text available: 2024/08/23 (campus network)
Full text available: 2026/08/23 (off-campus network)
Full text available: 2026/08/23 (National Central Library: Taiwan NDLTD system)