
研究生 (Author): Chen-Haung Yeh (葉陳鴻)
論文名稱 (Title): 三度空間中遠距指向落點之準確度探討及校正系統
Accuracy of goal-directed pointing and adjustment logic system in 3-D space
指導教授 (Advisor): Yung-Hui Lee (李永輝)
口試委員 (Committee members): 相子元; Kong-King Shieh (謝光進)
學位類別 (Degree): Master
系所名稱 (Department): Department of Industrial Management, College of Management
論文出版年 (Year of publication): 2008
畢業學年度 (Academic year of graduation): 96
語文別 (Language): Chinese
論文頁數 (Pages): 108
中文關鍵詞 (Chinese keywords): 指向 (pointing), 影像辨識 (image recognition), 手眼協調 (eye-hand coordination), 瞄準策略 (aiming strategy), 校正 (calibration)
外文關鍵詞 (English keywords): finger gesture recognition, eye-hand coordination, goal-directed aiming, adjustment system
摘要 (Abstract):
    This study set out to establish a calibration logic as the adjustment mechanism of a pointing-gesture image recognition system, so that the machine-vision logic for the extended pointing vector agrees with the point the user is actually aiming at. The approach uses performed pointing-and-aiming movements to obtain the distribution of pointing-vector endpoints on the target plane and to examine the underlying aiming strategies and differences in joint positions. Ten participants performed distal pointing-and-aiming movements under two conditions, with and without visual feedback. In the experiment, VICON cameras captured the 3-D coordinates of reflective markers on each participant's fingers, arm joints, and near both eyes, and these data were processed computationally. Observation was used to identify the key features for classifying the distributions of pointing endpoints. For the precision and accuracy analysis, the goal was to obtain a reference diameter for the target acceptance region under both conditions, in order to trace the causes of the different endpoint-distribution patterns and bias types and to examine and calibrate accuracy.
    From the experimental output and the tracing of its causes, the pointing-endpoint distributions can be divided into three types: (1) dispersed, (2) shifted toward the upper left, and (3) close to the target. As for accuracy and precision, with visual feedback accuracy was affected by the target's angle and distance; without visual feedback it was affected by the target's distance from the center of the screen. Without visual feedback, the bias mainly arises from the relative angle between the pointing vector and the eye-hand aiming vector, and grows in proportion to it. Finally, a multiple-regression calibration toward the eye-hand aiming endpoint was performed, with the data grouped by endpoint-distribution type and by calibration region to reduce dispersion. After calibration, the average deviation between the target bull's-eye and the pointing endpoint dropped from roughly 32.66 cm to roughly 14.23 cm; with per-class calibration the errors dropped further to (1) 13.14 cm, (2) 8.8 cm, and (3) 7.54 cm. These results show that the distribution classification and the region partitioning used in this study each contribute additional error reduction to the calibration system.
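    To make the endpoint computation described above concrete, the following is a minimal sketch of how the extended pointing line defined by two finger markers can be intersected with the target plane. It is not the thesis's actual Matlab program; the function name, marker coordinates, and plane definition are illustrative assumptions.

```python
import numpy as np

def pointing_endpoint(p_base, p_tip, plane_point, plane_normal):
    """Intersect the ray from the finger-base marker through the fingertip
    marker with the target plane; return the 3-D endpoint, or None if the
    ray is (nearly) parallel to the plane or points away from it."""
    d = p_tip - p_base                        # pointing direction
    denom = np.dot(plane_normal, d)
    if abs(denom) < 1e-9:                     # ray parallel to the plane
        return None
    t = np.dot(plane_normal, plane_point - p_base) / denom
    if t < 0:                                 # plane lies behind the finger
        return None
    return p_base + t * d

# Hypothetical marker coordinates in metres, e.g. from motion capture
p_base = np.array([0.10, 0.20, 1.10])         # proximal finger marker
p_tip = np.array([0.12, 0.22, 1.18])          # fingertip marker
plane_point = np.array([0.0, 0.0, 3.0])       # a point on the target plane
plane_normal = np.array([0.0, 0.0, 1.0])      # plane facing the participant

print(pointing_endpoint(p_base, p_tip, plane_point, plane_normal))
```

    In the study the target plane itself carried markers, so plane_point and plane_normal would presumably be derived from those markers rather than being fixed constants as in this example.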


Abstract:
    A logical adjustment system was developed in this study for eye-hand coordination in goal-directed aiming and the finger-gesture recognition logic. The research assumes that the distortion between eye-hand goal-directed aiming and the finger-vector recognition logic can be satisfactorily compensated with a two-dimensional quadratic function. Ten participants were recruited for the study. A total of 12 markers were attached to the joints and tip of the index finger, to the right arm, close to the two eyes, and to the target plane. A Vicon motion analysis system with 8 cameras recorded the goal-directed movements and aiming in two conditions: (1) with visual feedback and (2) without visual feedback. The trajectory of each marker was recorded, and Matlab was then used to compute the endpoints of the extended finger line on the target plane and to investigate the participants' aiming strategies. With visual feedback, accuracy varied with the size and position of the target; without visual feedback, it varied with the distance of the target.
    The finger-pointing distributions clearly fall into three types: (1) dispersed, (2) shifted toward the upper left, and (3) close to the target. These three classes were used to adjust the finger-line endpoints toward the endpoints of eye-hand goal-directed aiming, and the target plane was then partitioned into regions according to the experimental analysis to further reduce the error. This method effectively reduces the positional error by about 90%, lowering the mean aiming error from 32.66 cm to 14.23 cm and shrinking the required target acceptance region.
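    The two-dimensional quadratic compensation and multiple-regression calibration mentioned above could be realized, for example, as an ordinary least-squares fit that maps raw pointing endpoints on the target plane to the corresponding eye-hand aiming endpoints. The sketch below is only illustrative and uses synthetic data; the thesis carried out the calibration in Matlab and additionally fitted it separately per distribution type and per region.

```python
import numpy as np

def quadratic_features(xy):
    """Design matrix for a 2-D quadratic model: [1, x, y, x^2, y^2, xy]."""
    x, y = xy[:, 0], xy[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x**2, y**2, x * y])

def fit_calibration(raw_xy, target_xy):
    """Least-squares coefficients mapping raw pointing endpoints to
    eye-hand aiming endpoints (one coefficient column per axis)."""
    A = quadratic_features(raw_xy)
    coeffs, *_ = np.linalg.lstsq(A, target_xy, rcond=None)
    return coeffs

def apply_calibration(raw_xy, coeffs):
    return quadratic_features(raw_xy) @ coeffs

# Synthetic example: a distorted pointing pattern and its "true" aim points
rng = np.random.default_rng(0)
target_xy = rng.uniform(-50, 50, size=(40, 2))            # aim points (cm)
raw_xy = (1.2 * target_xy + np.array([-8.0, 6.0])
          + 0.002 * target_xy**2 + rng.normal(0, 1, (40, 2)))

coeffs = fit_calibration(raw_xy, target_xy)
corrected = apply_calibration(raw_xy, coeffs)
err_before = np.linalg.norm(raw_xy - target_xy, axis=1).mean()
err_after = np.linalg.norm(corrected - target_xy, axis=1).mean()
print(f"mean error before: {err_before:.2f} cm, after: {err_after:.2f} cm")
```

    Fitting separate coefficient sets for the dispersed, upper-left-shift, and close-to-target groups, and for each calibration region, amounts to calling fit_calibration once per data subset, which is one way the classification and region partitioning can reduce the residual error further.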

    目錄 (Table of Contents)
    Abstract (Chinese); Abstract (English); Acknowledgements; Table of Contents; List of Figures; List of Tables
    Chapter 1  Introduction
      1.1 Background and motivation
      1.2 Research objectives
      1.3 Research framework
      1.4 Research limitations
    Chapter 2  Literature Review
      2.1 Image recognition logic applied to pointing
      2.2 Effects of body-movement characteristics and aiming strategies
      2.3 Characteristics of pointing endpoints
      2.4 Applications of Fitts' law
      2.5 Usage scenarios
      2.6 Calibration logic for pointing endpoints
      2.7 Summary
    Chapter 3  Method
      3.1 Experimental scenario
        3.1.1 Principles of spatial scenario simulation
        3.1.2 Experimental variables
        3.1.3 Experimental hardware and environment
        3.1.4 Participant recruitment
        3.1.5 Experimental design
      3.2 Pointing data collection
        3.2.1 Data distribution
        3.2.2 Use of the 3-D coordinates
      3.3 Coordinate computation
        3.3.1 Extended pointing line
        3.3.2 Target plane
        3.3.3 Data normalization
        3.3.4 Extraction of pointing-and-aiming endpoints
        3.3.5 Relative eye-hand positions
        3.3.6 Classification of pointing-endpoint distributions
    Chapter 4  Results and Analysis
      4.1 Precision analysis
      4.2 Accuracy analysis
      4.3 Analysis of influencing factors
      4.4 Calibration
        4.4.1 Partitioning of calibration regions
        4.4.2 Calibration without visual feedback
        4.4.3 Output comparison
        4.4.4 Special calibration for the dispersed type
    Chapter 5  Discussion
      5.1 Precision and accuracy
      5.2 Causes of bias
      5.3 Sources of interference
      5.4 Calibration results
    Chapter 6  Conclusions and Recommendations
      6.1 Conclusions
      6.2 Recommendations and future work
    References
    Appendix A  Informed-consent form
    Appendix B  Distributions of participants' pointing and aiming endpoints (complete)

    List of Figures
    Fig. 1-1  Image recognition system
    Fig. 1-2  Difference between aiming direction and pointing direction
    Fig. 1-3  Research framework and steps
    Fig. 2-1  Reading the pointing line vector and viewing angle from a 2-D image
    Fig. 2-2  Schematic of the experiment on offsets when fixating the aiming target (fingertip)
    Fig. 2-3  Schematic of the different eye-hand aiming vectors produced by different dominant aiming eyes
    Fig. 2-4  Trajectories and endpoints of distal pointing movements
    Fig. 2-5  Trajectories and stopping endpoints of distal pointing under constrained movement
    Fig. 3-1  Marker placement on the index finger of the dominant hand (schematic)
    Fig. 3-2  Target plane (schematic)
    Fig. 3-3  Experimental environment
    Fig. 3-4  Experimental hardware
    Fig. 3-5  Arrangement of participant and target plane in the experimental environment
    Fig. 3-6  Arrangement of participant and cameras in the experimental environment
    Fig. 3-7  Coordinates produced by Vicon and marker data in the Workstation software
    Fig. 3-8  Extended finger line and its endpoint
    Fig. 3-9  Targets on the plane and the endpoints produced by the extended pointing line
    Fig. 3-10  Top view (90° to the plane) of all targets and endpoints
    Fig. 3-11  Coordinates of all points after normalization
    Fig. 3-12  Movement-velocity waveform of a trial at 0° (schematic)
    Fig. 3-13  Regular, smaller velocity fluctuations on both the X and Y axes at 45°
    Fig. 3-14  More regular X-axis velocity fluctuations after rotation (schematic)
    Fig. 3-15  Pointing-endpoint coordinates extracted at the moment of aiming (with visual feedback)
    Fig. 3-16  Pointing-endpoint coordinates extracted at the moment of aiming (without visual feedback)
    Fig. 3-17  Placement of the 3-D markers and laser (schematic)
    Fig. 3-18  Comparison of endpoints from the eye-hand aiming line and the extended pointing line
    Fig. 3-19  Distribution of pointing endpoints that hit the target (with visual feedback)
    Fig. 3-20  Distribution of pointing endpoints shifted by an equal distance (with visual feedback)
    Fig. 3-21  Distribution of pointing endpoints that hit the target (with visual feedback)
    Fig. 3-22  Distribution of pointing endpoints shifted by an equal distance (with visual feedback)
    Fig. 3-23  Distribution of close-to-target pointing-and-aiming endpoints
    Fig. 4-1  Pointing endpoints before calibration (without visual feedback)
    Fig. 4-2  Eye-hand-line aiming endpoints (without visual feedback)
    Fig. 4-3  Calibrated pointing endpoints vs. eye-hand-line aiming endpoints (without visual feedback)
    Fig. 4-4  Dispersed type: pointing endpoints before calibration
    Fig. 4-5  Dispersed type: eye-hand-line aiming endpoints
    Fig. 4-6  Dispersed type: calibrated pointing endpoints vs. eye-hand-line aiming endpoints
    Fig. 4-7  Upper-left-shift type: pointing endpoints before calibration
    Fig. 4-8  Upper-left-shift type: eye-hand-line aiming endpoints
    Fig. 4-9  Upper-left-shift type: calibrated pointing endpoints vs. eye-hand-line aiming endpoints
    Fig. 4-10  Close-to-target type: pointing endpoints before calibration
    Fig. 4-11  Close-to-target type: eye-hand-line aiming endpoints
    Fig. 4-12  Calibrated pointing endpoints vs. eye-hand-line aiming endpoints
    Fig. 4-13  Finger placement of participant 1
    Fig. 4-14  Schematic of the bias produced when dispersed-type participant 1 did not aim along the eye-hand line
    Fig. 4-15  Pointing-endpoint distribution of dispersed-type participant 1 before calibration
    Fig. 4-16  Pointing-endpoint distribution of dispersed-type participant 1 after calibration
    Fig. 5-1  Finger placement of participant 1
    Fig. 5-2  Upper-left-shift type: top and side views of the angle between the eye-hand and pointing vectors
    Fig. 5-3  Close-to-target type: top and side views of the angle between the eye-hand and pointing vectors

    List of Tables
    Table 3-1  Participants' aiming-strategy data
    Table 3-2  Target names and corresponding angles
    Table 4-1  Precision with and without visual feedback
    Table 4-2  Accuracy with and without visual feedback
    Table 4-3  Significant factors for precision with and without visual feedback
    Table 4-4  Significant factors for accuracy with and without visual feedback
    Table 4-5  Calibration-region classification based on the significant factors
    Table 4-6  Differences between eye-hand aiming and pointing endpoints
    Table 4-7  Accuracy before and after calibration
    Table 4-8  Accuracy before and after the special calibration
    Table 5-1  Eye-hand aiming vs. pointing deviations before and after calibration
    Table 5-2  Accuracy before and after calibration

