
Author: Ching-Chun Chang (張清鈞)
Thesis title: Video De-Interlacing Algorithm Based on Deep Learning and Motion Compensation (基於深度學習與運動補償之視訊去交錯演算法)
Advisor: Kai-Lung Hua (花凱龍)
Committee members: Ming-Fang Weng (翁明昉), Yung-Yao Chen (陳永耀), Chuan-Kai Yang (楊傳凱), Mei-Chen Yeh (葉梅珍)
Degree: Master
Department: College of Electrical Engineering and Computer Science, Department of Computer Science and Information Engineering
Year of publication: 2017
Graduation academic year: 105 (ROC calendar)
Language: Chinese
Number of pages: 42
Chinese keywords (translated): neural networks, video interpolation, video de-interlacing, motion estimation, motion compensation
English keywords: Video interpolation, Video de-interlacing, Motion compensation
Access counts: 310 views, 2 downloads
Abstract (translated from Chinese): Because the interlaced video format effectively reduces the bandwidth required for video transmission, it is still used in many broadcast systems (e.g., NTSC, PAL, SECAM). However, the progressive-scan display devices now in widespread use cannot render this format directly, so reconstructing interlaced video remains an important open problem. In this thesis we propose a video de-interlacing algorithm based on deep learning and motion compensation. Our method performs de-interlacing with a two-stage filling process. First, self-validation is applied to the input interlaced video to obtain the stage-one filled values. Next, motion adaptation classifies each initially de-interlaced pixel as belonging to a static or a dynamic region. For pixels in dynamic regions we keep the initial de-interlaced result directly, while pixels in static regions pass through modular neural networks for a second-stage refinement of the filled values. In the experiments, we use 10 CIF video sequences for an objective comparison against other algorithms, and the results show that our algorithm outperforms the competing methods.


Since the interlaced video format can effectively reduce the bandwidth used in video transmission, this technique is still widely used in many broadcast systems, such as NTSC, PAL, and SECAM. However, progressive-scan devices in general do not support this format and need to combine two fields of interlaced video into a single frame, which leads to visual defects. Therefore, a high-quality video de-interlacing method that minimizes these defects is needed.

In this thesis, we propose a novel video de-interlacing algorithm based on deep learning and motion compensation. Our proposed method consists of two steps: first, self-validation is employed to determine the initial de-interlaced result from several candidates; second, the static regions of the initial de-interlaced result are identified and further refined by modular neural networks.

In the experiments, 10 popular CIF video sequences are used for an objective performance evaluation. The experimental results show that our method outperforms the other de-interlacing methods.
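The two-stage pipeline summarized above can be sketched roughly as follows. This is a minimal illustrative sketch, not the thesis implementation: the candidate interpolators (line average and previous-frame copy), the self-validation error proxy, the motion threshold `tau`, and all function names are assumptions for exposition, and the trained modular neural network is replaced by an identity stub.

```python
import numpy as np

def refine_static(row):
    # Placeholder for the second-stage modular-neural-network refinement;
    # a trained model would replace this identity mapping.
    return row

def deinterlace(field, prev_frame, tau=10.0):
    """Two-stage fill sketch: stage 1 picks, per pixel, between a spatial
    (line-average) and a temporal (previous-frame) candidate via a simple
    self-validation test; stage 2 classifies pixels as static or dynamic
    by frame differencing and routes static pixels to the NN stub."""
    H, W = prev_frame.shape
    frame = np.empty((H, W), dtype=np.float64)
    frame[0::2] = field  # known even lines come from the current field
    for y in range(1, H, 2):  # fill the missing odd lines
        above = frame[y - 1]
        below = frame[y + 1] if y + 1 < H else frame[y - 1]
        spatial = 0.5 * (above + below)   # line-average candidate
        temporal = prev_frame[y]          # previous-frame candidate
        # Self-validation proxy: re-estimate the known line above from
        # each candidate and keep the candidate with the smaller error.
        err_s = np.abs(0.5 * (spatial + prev_frame[y - 1]) - above)
        err_t = np.abs(0.5 * (temporal + prev_frame[y - 1]) - above)
        frame[y] = np.where(err_s < err_t, spatial, temporal)
        # Motion adaptation: a small inter-frame difference marks a
        # static region, which is sent to the refinement stage.
        static = np.abs(frame[y] - prev_frame[y]) < tau
        frame[y] = np.where(static, refine_static(frame[y]), frame[y])
    return frame
```

Per the abstract, dynamic pixels keep the stage-one result while static pixels are refined; here that routing is the `np.where` on the `static` mask.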

Table of Contents (translated)
Recommendation Letter .......... 1
Committee Approval Form .......... 2
Chinese Abstract .......... 3
English Abstract .......... 4
Acknowledgments .......... 5
Table of Contents .......... 6
List of Tables .......... 8
List of Figures .......... 9
1 Introduction .......... 10
2 Convolutional Neural Networks .......... 13
  2.1 Image Reconstruction Neural Network .......... 13
  2.2 Hidden-Layer Tuning of the Image Reconstruction Network .......... 14
3 Proposed Algorithm .......... 17
  3.1 Initial De-Interlacing .......... 17
    3.1.1 Adopted De-Interlacing Algorithms .......... 19
    3.1.2 Self-Validation Procedure .......... 22
  3.2 De-Interlacing Enhancement .......... 23
    3.2.1 Convolutional Neural Network Based on Residual Learning .......... 23
    3.2.2 Modular Neural Networks .......... 26
    3.2.3 Motion Compensation .......... 26
    3.2.4 Motion Adaptation .......... 30
4 Experiments and Discussion .......... 32
  4.1 Experimental Settings .......... 32
    4.1.1 Modular Neural Network Settings .......... 32
    4.1.2 Motion Adaptation Settings .......... 32
    4.1.3 Neural Network Layer Settings .......... 33
  4.2 Results and Discussion .......... 33
    4.2.1 Initial De-Interlacing Performance .......... 33
    4.2.2 Performance Analysis of Modular Neural Networks and Motion Adaptation .......... 34
    4.2.3 Comparison with Other Algorithms .......... 34
5 Conclusion .......... 41
References .......... 42


Full text available from 2022/01/24 (campus network)
Full text available from 2027/01/24 (off-campus network)
Full text available from 2027/01/24 (National Central Library: Taiwan thesis system)