
Graduate Student: Yu-Fen Fu (傅毓芬)
Thesis Title: Intelligent Communication Systems for Handicapped Aphasiacs (肢語障者溝通系統之研究)
Advisors: Hahn-Ming Lee (李漢銘), Cheng-Seen Ho (何正信)
Committee Members: Shyi-Ming Chen (陳錫明), Jung-Gen Wu (吳榮根), Shie-Jue Lee (李錫智), Shian-Shyong Tseng (曾憲雄), Ching-Chi Hsu (許清琦)
Degree: Doctor
Department: College of Electrical Engineering and Computer Science, Department of Computer Science and Information Engineering
Year of Publication: 2010
Academic Year of Graduation: 98 (ROC calendar, 2009-2010)
Language: English
Number of Pages: 45
Keywords (Chinese): 肢/語障者, 數據手套, 手語, 指語, 神經網路
Keywords (English): handicapped aphasiacs, data glove, sign language, finger language, neural network

Finger language differs from sign language. Sign language usually requires large-scale movements, including the positions and trajectories of the fingers, palms, and arms of both hands in three-dimensional space, which is infeasible for physically handicapped people who can only make small finger bends. Finger language, by contrast, can convey meaning through small-scale movements such as the bending of a patient's fingers alone. This thesis develops a communication system suited to handicapped aphasiacs who can perform only small finger bends, so as to solve their problems of communicating with the outside world and operating a computer. Developing an appropriate communication system for them requires attention to both the hardware input device and the software signal processing:
For hardware, we developed a low-cost data glove in which the light from light-emitting diodes (LEDs) travels directly through the air to photo-detectors (PDs). The five-finger signals captured by this glove show higher repeatability, more stable values, and better adjustability of the finger movement range than those of conventional fiber-optic gloves.
For software, we combined the strengths of fuzzy set theory and neural networks to construct two concrete hybrid models as finger language recognition systems. The first hybrid model builds a static finger language recognizer that gives seriously ill patients a simple, fast communication tool. The second hybrid model, paired with a virtual keyboard that lets users operate at their most comfortable speed, builds a system that allows handicapped aphasiacs to perform continuous finger language input.
Our experimental results show that the correctness of the glove's geometric model has been verified: by changing the LED current we can adjust the glove's saturation range and working range to suit a patient's condition. For static finger language recognition, our system achieves 100% recognition accuracy for a given specific user. Finally, for continuous finger language recognition, our system carries out ordinary communication well and serves as a natural, usable tool.
The contributions of this thesis are as follows. First, we developed a low-cost data glove for handicapped aphasiacs that is highly repeatable, adaptable, and sensitive, and that is mechanically simple, economical, robust, and easy to maintain, fully meeting patients' practical needs. Second, we built a static finger language recognition tool for seriously ill patients, letting them express their intentions through predefined individual gestures; this greatly improves the quality of life of this disadvantaged group. Finally, we developed an intelligent general-purpose communication system that provides a comfortable environment for continuous finger language input; beyond everyday communication, it lets handicapped aphasiacs live in the digital world with no difference from ordinary people, and is thus of great significance for the lives and personal development of this disadvantaged group.


Finger language differs from sign language. Sign language usually requires large-scale movements to form a gesture, such as three-dimensional motions of the fingers, palms, and sometimes even the arms of both hands, and is therefore infeasible for disabled people who can manage only slight finger movements. In contrast, finger language is expressed through small-scale hand gestures, requiring no more than a change in how a patient bends his or her fingers. This thesis develops an intelligent communication system for handicapped aphasiacs who can perform only tiny finger movements, so that they can properly interact with the outside world. Developing such a system requires addressing both the input hardware and the recognition software.
For hardware, we have developed a new low-cost data glove that lets the light from light-emitting diodes (LEDs) travel along a direct line of sight through the air to photo-detectors (PDs). Compared with conventional fiber-optic gloves, this glove offers better sensitivity, repeatability, and adjustability (or programmability).
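As a rough illustration of how such a glove's raw signals might be turned into usable finger-bend values, the sketch below normalizes per-finger photo-detector readings against per-session calibration endpoints. The function names and calibration values are hypothetical and not taken from the thesis; this is only a sketch of the general idea.

```python
# Hypothetical sketch: mapping raw photo-detector (PD) readings from the
# five finger channels to normalized bend ratios in [0, 1]. The calibration
# endpoints (readings with fingers flat vs. fully bent) are assumptions.

def calibrate(flat_readings, bent_readings):
    """Return a function that maps raw PD readings to bend ratios in [0, 1]."""
    def normalize(raw):
        bends = []
        for r, flat, bent in zip(raw, flat_readings, bent_readings):
            span = bent - flat
            b = (r - flat) / span if span else 0.0
            bends.append(min(1.0, max(0.0, b)))  # clamp to the valid range
        return bends
    return normalize

# Example per-finger calibration for one session (illustrative values only)
norm = calibrate(flat_readings=[0.2] * 5, bent_readings=[0.9] * 5)
print(norm([0.2, 0.55, 0.9, 0.4, 0.7]))
```

Per-user calibration of this kind is one way a glove could be "adjusted" to the very limited movement range of a particular patient.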
For software, we have integrated fuzzy set theory and neural networks to create two hybrid models for finger language recognition. The first model recognizes static finger gestures to meet the basic communication needs of seriously disabled patients. The second, paired with a virtual keyboard that lets users type at their most comfortable pace, serves as a general communication system through which handicapped aphasiacs can perform continuous finger language input.
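To illustrate the flavor of a fuzzy-plus-classifier hybrid, the sketch below fuzzifies normalized finger bends into linguistic terms (straight, half-bent, bent) using triangular membership functions and scores them against gesture templates. The membership breakpoints and the two-gesture vocabulary are assumptions, and the simple template match stands in for the thesis's neural-network stage; none of these parameters come from the actual system.

```python
# Illustrative sketch of a fuzzy front end for static gesture recognition.
# Membership breakpoints and the gesture table are assumed, not the thesis's.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzify(bend):
    """Map a normalized bend in [0, 1] to (straight, half-bent, bent) degrees."""
    return (tri(bend, -0.5, 0.0, 0.5),
            tri(bend, 0.0, 0.5, 1.0),
            tri(bend, 0.5, 1.0, 1.5))

def classify(bends, gestures):
    """Pick the gesture whose linguistic template best matches the input."""
    def score(template):
        return sum(fuzzify(b)[term] for b, term in zip(bends, template))
    return max(gestures, key=lambda name: score(gestures[name]))

# Hypothetical two-gesture vocabulary: term 0 = straight, term 2 = fully bent
gestures = {"yes": (0, 0, 0, 0, 0), "no": (2, 2, 2, 2, 2)}
print(classify([0.1, 0.0, 0.2, 0.1, 0.0], gestures))
```

Fuzzifying the raw bends first makes the classifier tolerant of the signal variation one would expect from patients with limited motor control.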
Our experiments verify the correctness of the glove's geometric model: by adjusting the LED current, we can control the saturation and working ranges of the glove's signals to adapt it to a specific user. For static finger language recognition, our user-dependent, easily adjusted system achieved 100% accuracy for given specific users in unbiased field experiments. Finally, for continuous finger language recognition, the system performs basic communication tasks well and can serve as a natural, affordable, and viable tool for handicapped aphasiacs.
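One way continuous input at a user-chosen pace could be segmented is a dwell rule: a recognized gesture is committed only after it stays stable for a minimum number of frames, so slower users simply hold a gesture longer. The sketch below assumes such a mechanism; it is illustrative only and is not the thesis's actual segmentation method.

```python
# Assumed dwell-based segmentation of a continuous recognition stream.
# A gesture is committed once it has been held for `hold` consecutive frames.

def segment(frames, hold=3):
    """Emit each gesture exactly once when it is held for `hold` frames."""
    out, last, run = [], None, 0
    for g in frames:
        run = run + 1 if g == last else 1
        last = g
        if run == hold:
            out.append(g)  # commit once per sustained gesture
    return out

stream = ["a", "a", "a", "a", "b", "b", "b", "rest", "c"]
print(segment(stream))  # → ['a', 'b']
```

Because the threshold is a single parameter, it can be raised or lowered per user, which matches the goal of letting each patient operate at the speed most suitable for them.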
The contributions of this work can be summarized as follows. First, we have developed a programmable data glove to assist handicapped aphasiacs whose fingers can barely perform even very limited movements, such as bending. The glove is sensitive, repeatable, and adjustable; it is also mechanically simple, low-cost, and robust, and thus meets practical needs. Second, we have developed a static finger language recognition tool for seriously disabled patients, who can use predefined finger gestures to express their intentions and resume basic communication with other people, greatly improving their quality of life. Finally, we have developed an intelligent system for general communication by handicapped aphasiacs. It provides a suitable environment for continuous finger language input and, beyond serving as a daily communication tool, supports more advanced applications in the digital world. This can play a significant role in further improving both the quality of life and the career development of handicapped aphasiacs.

Abstract (in Chinese)
Abstract (in English)
Acknowledgement (in Chinese)
Contents
List of Tables
List of Figures
1 Introduction
  1.1 Motivation
  1.2 Overview of Dissertation
  1.3 Organization
2 Literature Survey
  2.1 Data Gloves
  2.2 Finger Language
  2.3 Virtual Keyboard
3 Development of a Programmable Data Glove
  3.1 Structure and Working Principle of the Proposed Data Glove
  3.2 Verification of Repeatability of the Proposed Data Glove
  3.3 Concluding Remarks
4 A User-Dependent Easily-Adjusted Static Finger Language System for Handicapped Aphasiacs
  4.1 Structure and Working Principle of the Static Finger Language Recognition
  4.2 System Development
  4.3 Concluding Remarks
5 A General Communication System for Handicapped Aphasiacs
  5.1 Structure and Working Principle of the General Communication System
  5.2 System Development
  5.3 Concluding Remarks
6 Conclusions and Future Research
  6.1 Conclusions
  6.2 Contributions
  6.3 Future Research
References
Bibliography (in Chinese)

