
Graduate Student: Zheng-Ye She (佘政鄴)
Thesis Title: A Dental Health Solution based on Smart Toothbrush, Computer Vision & Brushing Games (智慧型牙刷解決口腔健康問題基於電腦視覺與遊戲化刷牙)
Advisors: Jeng-Ywan Jeng (鄭正元); Ajeet Kumar (熊艾吉); Yi-Ming Kao (高宜敏)
Oral Defense Committee: Jeng-Ywan Jeng (鄭正元); Shang-Chih Lin (林上智); Ajeet Kumar (熊艾吉); Yi-Ming Kao (高宜敏)
Degree: Master
Department: College of Engineering, Department of Mechanical Engineering
Year of Publication: 2022
Academic Year of Graduation: 110 (2021-2022)
Language: Chinese
Pages: 122
Keywords: Smart Toothbrush; Physiological Information Collection; Electric Toothbrush; Computer Vision; Interactive Somatosensory Games; Gamified Learning
Access count: 352 views; 0 downloads

Research and applications of artificial intelligence, including computer vision, have broadened researchers' horizons. With advances in technology and in computer vision itself, the field has become indispensable across industries, for example in inspection, cancer detection, medicine, communications, logistics, and Industry 4.0.
This study aims to develop a smart toothbrush, an alternative direction for the development of electric toothbrushes. The technology integrates a camera and a fill light into a manual toothbrush, breaking from the conventional approach of embedding a motion mechanism in an electric toothbrush. While the user plays a brushing game, the camera captures the user's oral physiological information, and computer vision analyzes the collected data for oral lesions. This addresses the inconvenience of clinic visits and the delays in timely treatment caused by COVID-19 in recent years, lets users understand their own oral health, and allows the data to be sent to a dentist for advice and treatment, achieving the goal of remote care.
The smart-toothbrush firmware was developed around the ESP-EYE control board's wireless (Wi-Fi) connection, camera parameters, and streaming-video frame rate to ensure clear and stable image capture. A simple companion mobile application was also developed, covering the design of its user interface and implementation of basic app functions. The feasibility of recognizing oral lesions with computer vision was studied: two image-segmentation approaches, an instance-segmentation model and a semantic-segmentation model, were each used to recognize oral lesions in order to determine which computer-vision model performs better and is more suitable for the smart toothbrush. A brushing game was developed to brighten the dull, repetitive daily act of brushing through gamification, using guided animations so that users learn correct brushing motions while playing.
The experimental results show that the smart toothbrush was developed successfully, with good performance and stability in the integration of software and hardware and in the collection of oral physiological information. Computer-vision recognition clearly visualizes the feature regions of oral lesions, and the gamified brushing lessons use animations and oral-lesion monsters to guide users toward correct brushing technique, letting them learn proper brushing while immersed in the game.


Artificial intelligence (AI), including computer vision, combined with application-based research has broadened the horizons of researchers. With the improvement of data-acquisition systems, computer-vision techniques are becoming imperative in almost every field, such as biomedicine (for the detection and treatment of tumors and diseases), communications, logistics, and Industry 4.0.
This research aims to devise a smart toothbrush as a further development of existing electric toothbrushes. The presented methodology integrates a camera and fill light with a manual toothbrush, replacing the previous approach of inserting motion mechanisms into electric toothbrushes, which compromises their performance. The camera video of tooth brushing captures the user's oral physiological information, which is analyzed by computer vision to detect and locate oral diseases, thereby mitigating the treatment delays and inconvenience caused by COVID-19 in recent years. This also helps users understand their own oral health and lets them send the recorded video data to a medical specialist for advice or treatment without physically visiting a medical center or hospital.
The smart-toothbrush software was developed around the ESP-EYE control board's Wi-Fi connection, camera parameters, and streaming-video frame rate to ensure the capture of clear, stable images without blur. A user-friendly application was developed that can live-stream and save the brushing videos and images. Two segmentation techniques (instance and semantic) were used to recognize oral diseases, to explore the feasibility of the approach, and to determine which computer-vision model performs better and is more suitable for use in a smart toothbrush. A brushing game was also developed that adds fun to the otherwise monotonous brushing routine through gamification and uses guided animations so users learn the correct brushing actions and sequence during play.
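The streaming-video frame rate mentioned above can be verified on the client side. As a hedged sketch (the record does not specify the stream format; ESP-EYE example firmware commonly serves MJPEG over HTTP), the following pure-Python helper splits a received byte buffer on the JPEG start/end markers and estimates frames per second over a measurement window. The function names and the MJPEG assumption are illustrative, not taken from the thesis.

```python
SOI = b"\xff\xd8"  # JPEG start-of-image marker
EOI = b"\xff\xd9"  # JPEG end-of-image marker


def split_mjpeg_frames(data: bytes) -> list:
    """Extract complete JPEG frames from a raw MJPEG byte buffer.

    A trailing partial frame (SOI without EOI) is ignored, since the
    rest of its bytes would arrive in the next read from the stream.
    """
    frames = []
    pos = 0
    while True:
        start = data.find(SOI, pos)
        if start == -1:
            break
        end = data.find(EOI, start + 2)
        if end == -1:
            break  # incomplete frame: wait for more bytes
        frames.append(data[start:end + 2])
        pos = end + 2
    return frames


def estimate_fps(frame_count: int, elapsed_seconds: float) -> float:
    """Frames per second over a measurement window."""
    return frame_count / elapsed_seconds if elapsed_seconds > 0 else 0.0
```

In practice the buffer would come from reading the board's HTTP stream in chunks; counting frames over a fixed interval then gives the effective frame rate the thesis tests.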
The experimental results demonstrate the successful development of a smart toothbrush with good performance and stability in both software and hardware, and a collection pipeline for oral physiological information that allows clear recognition and visualization of oral diseases. Furthermore, the tooth-brushing game not only adds fun but also trains users in the correct brushing sequence for healthy teeth.
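Recognition quality of the kind reported above is conventionally quantified with intersection-over-union (IoU) between a predicted lesion mask and its annotation. The record does not include the thesis's evaluation code, so the following is only an illustrative sketch for binary masks stored as nested 0/1 lists:

```python
def mask_iou(pred, target):
    """Intersection-over-union of two binary masks of equal shape.

    Both masks are nested lists of 0/1 values; IoU is the ratio of
    pixels set in both masks to pixels set in either mask.
    """
    intersection = 0
    union = 0
    for pred_row, target_row in zip(pred, target):
        for p, t in zip(pred_row, target_row):
            if p and t:
                intersection += 1
            if p or t:
                union += 1
    return intersection / union if union else 0.0
```

The same metric applies to both the instance-segmentation and semantic-segmentation outputs, which is what makes a side-by-side comparison of the two models meaningful.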

Abstract (Chinese)
Abstract (English)
Acknowledgments
Table of Contents
List of Figures
List of Tables
Chapter 1: Introduction
  1.1 Preface
  1.2 Research Motivation
  1.3 Experimental Workflow
  1.4 Thesis Structure
Chapter 2: Literature Review
  2.1 Oral Health
    2.1.1 Dental Plaque
    2.1.2 Physical and Chemical Plaque Removal
    2.1.3 Solutions to Oral Problems
  2.2 Computer Vision
    2.2.1 Instance Segmentation: SOLOv2
    2.2.2 Semantic Segmentation: YOLOv3
  2.3 Gamification Design
    2.3.1 Eight Core Drives of Gamification
    2.3.2 Interactive Somatosensory Games
Chapter 3: Development of the Smart-Toothbrush Oral-Health System
  3.1 Smart-Toothbrush Hardware
  3.2 Smart-Toothbrush Firmware
    3.2.1 Data Transmission
    3.2.2 Camera Parameter Tuning
  3.3 Mobile Application Development
    3.3.1 User-Interface Design
    3.3.2 Application Function Development
  3.4 Computer-Vision Dataset Construction
    3.4.1 Oral Physiological Information Collection
    3.4.2 Dataset Preprocessing
    3.4.3 Polygon Annotation
    3.4.4 Bounding-Box Annotation
  3.5 Computer Vision Applied to Oral Lesions
    3.5.1 Instance-Segmentation Model
    3.5.2 Semantic-Segmentation Model
  3.6 Gamified Brushing Learning
    3.6.1 Game Architecture and Scenario Design
    3.6.2 Brushing-Region Detection
Chapter 4: Experimental Results and Discussion
  4.1 Smart-Toothbrush Firmware
    4.1.1 Camera Parameter Evaluation
    4.1.2 Streaming Frame-Rate Tests
  4.2 Mobile Application
    4.2.1 User-Interface Evaluation
    4.2.2 Application Function Tests
  4.3 Dataset and Annotation Evaluation
    4.3.1 Discussion of Oral Physiological Information Collection
    4.3.2 Evaluation of Dataset Construction
  4.4 Computer-Vision Recognition of Oral Lesions
    4.4.1 Polygon-Annotation Recognition Results
    4.4.2 Evaluation of the Instance- and Semantic-Segmentation Models
  4.5 Gamified Brushing Learning Results
    4.5.1 Brushing-Region Tests
    4.5.2 Game Testing and Evaluation
Chapter 5: Conclusions and Future Work
  5.1 Conclusions
  5.2 Future Work
References
Appendix 1: Smart Tooth-Cleaning Device

Full text not currently available for download.
Full-text release date: 2027/07/22 (campus network)
Full-text release date: 2027/07/22 (off-campus network)
Full-text release date: 2027/07/22 (National Central Library: Taiwan NDLTD system)