
Student: Po-An Chen (陳柏安)
Title: Artistic Rendering for 3D Scenes Using Automatic Texture Element Extraction of Painted Images (利用畫作自動擷取貼圖元素藝術成像三維場景)
Advisor: Chih-Yuan Yao (姚智原)
Committee: Yu-Chi Lai (賴祐吉), Wen-Kai Tai (戴文凱), Chung-Hsien Kuo (郭重顯), Hung-Kuo Chu (朱宏國)
Degree: Master
Department: College of Electrical Engineering and Computer Science - Department of Computer Science and Information Engineering
Year of publication: 2017
Academic year of graduation: 105
Language: Chinese
Pages: 66
Keywords (Chinese): 風格化, 非真實渲染, 筆觸, 樣本驅動
Keywords (English): stylization, non-photorealistic rendering, stroke, example-based
Hits: 241; Downloads: 0
The paintings of renowned artists typically define an iconic aesthetic and spirit; their pictorial elements are distinctive and quickly resonate with the public. Commercial applications already combine these elements with technology: through virtual reality, the content of a flat painting can be transferred into a 3D scene for digital interactive experiences, letting users view the painting immersively within the scene while interacting with its figures and objects to experience its story. However, transferring a painting's style into a scene demands months of manual artistic labor, and existing methods fail to balance transfer efficiency against result fidelity: methods that achieve high similarity are slow, while efficient methods yield stylized results of low similarity. This study therefore proposes a 3D scene painting-stylization system that addresses both transfer efficiency and style consistency.

The system first preprocesses the 2D painting and the 3D scene. Painting preprocessing automatically extracts the painting's strokes: a clustering step extracts color and orientation regions that represent the painting's style, region representatives are selected from this information, and these representatives are further processed with dilation and Graph Cut to produce samples, from which strokes are texture-synthesized and stored in a database. For 3D scene preprocessing, the system computes the object depth, light-and-shadow distribution, lit area, and silhouette information required at render time; because the rendering framework is point-based, the system also provides a point-cloud reconstruction pipeline and computes the corresponding per-point information.

In the rendering stage, strokes from the database are distributed according to the painting's color proportions and each 3D object's share of the scene area. Each object is then shaded by linearly interpolating bright, medium, and dark strokes to match the scene's light distribution, while the painting's flow information lets the rendered 3D scene exhibit the 2D painting's style; object silhouettes and stroke bump effects reinforce the sense of depth.

Finally, the rendered result is applied to a virtual-reality experience that offers an immersive painting-viewing environment while sustaining at least 90 fps to avoid motion sickness. For validation, a user study was conducted: 95.5% of participants correctly identified, among three candidate paintings, the original painting from which the scene's style was transferred, and 98% judged the stroke layouts extracted by this study to be better.
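The shading step above, linearly interpolating bright, medium, and dark strokes against the scene's light distribution, can be sketched as follows. This is a minimal illustration under assumed conventions, not the thesis implementation; the function name, the [0, 1] luminance range, and the three texture arrays are hypothetical:

```python
import numpy as np

def blend_tone_strokes(luminance, dark_tex, mid_tex, bright_tex):
    """Linearly interpolate three stroke textures by per-pixel luminance.

    luminance: (H, W) array in [0, 1] from the scene's light distribution.
    *_tex:     (H, W, 3) stroke texture maps synthesized from the painting.
    Returns an (H, W, 3) in-between map matching the scene's shading.
    """
    lum = np.clip(luminance, 0.0, 1.0)[..., None]
    # Below luminance 0.5 blend dark -> mid; above 0.5 blend mid -> bright.
    t_low = np.clip(lum * 2.0, 0.0, 1.0)
    t_high = np.clip(lum * 2.0 - 1.0, 0.0, 1.0)
    low = (1.0 - t_low) * dark_tex + t_low * mid_tex
    return (1.0 - t_high) * low + t_high * bright_tex
```

Splitting the luminance range at 0.5 is one plausible choice; the actual system could place the medium stroke at any tone threshold derived from the painting.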


    The aesthetic elements of painted artworks are defined by the masterpieces of renowned artists, and their signature style elements easily attract viewers' attention. Many studies have therefore proposed digital interactive art systems: with the latest virtual reality technology, audiences can experience famous paintings as if they were inside the painting themselves. However, creating a scene that resembles a painting requires countless man-hours. To solve this problem, we propose a 3D scene stylization system that automatically extracts the elements of a painting, including color, tone, texture pattern, and rhythm, and applies them to a 3D scene. In the rendering stage, our system preprocesses the scene, analyzing the 3D models' information to create point clouds for a point-based rendering framework. During rendering, strokes are distributed to the 3D models according to the color distribution of the original painting and the ratio of each model's occupied area to the total scene area; the strokes are then blended linearly to generate in-between maps. Using the in-between maps to synthesize brightness changes over a 3D surface yields an initial stylization of the 3D models in which luminance variation is preserved. Our system also considers the painting's flow and normal map so that the target scene resembles the original painting, and uses silhouettes to maintain the shape of each 3D model. After scene stylization is complete, we applied the result to a VR system and achieved 90 fps in our experiments. To evaluate the system, we conducted two user studies: one verifying that our stylization is consistent with the original painting, and one verifying that our stroke sets approximate the reference painting better than those of a competing method.
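The stroke-distribution step described in the abstract (allocating strokes by the painting's color proportions and by each model's share of scene area) might look like the following sketch; the function name, data shapes, and rounding policy are assumptions, not details from the thesis:

```python
def allocate_strokes(color_ratios, model_areas, total_strokes):
    """Distribute a stroke budget across 3D models.

    color_ratios:  the painting's palette proportions per stroke set,
                   summing to 1 (e.g. from color clustering).
    model_areas:   area each model occupies in the scene, any units.
    total_strokes: overall stroke budget for the frame.
    Returns {model_index: {stroke_set_index: stroke_count}}.
    """
    total_area = sum(model_areas)
    plan = {}
    for m, area in enumerate(model_areas):
        # Each model's budget is proportional to its share of scene area.
        budget = round(total_strokes * area / total_area)
        # Within a model, split the budget by the painting's color ratios.
        plan[m] = {s: round(budget * r) for s, r in enumerate(color_ratios)}
    return plan
```

Rounding per model can make counts sum slightly off the total budget; a production system would redistribute the remainder, which is omitted here for brevity.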

    Abstract (Chinese) i
    Abstract (English) ii
    Chapter 1 Introduction 1
    Chapter 2 Related Work 5
    Chapter 3 System Overview 10
    Chapter 4 Style Stroke Extraction 12
    Chapter 5 3D Scene Preprocessing 25
    Chapter 6 Stylized Rendering 32
    Chapter 7 Experimental Results and Discussion 41
    Chapter 8 Conclusion and Future Work 49
    References 50


    Full text available from 2022/08/20 (campus network)
    Full text not authorized for release (off-campus network)
    Full text not authorized for release (National Central Library: Taiwan thesis system)