
Author: Jing-Yu Wu (吳京諭)
Thesis title: 智慧型交通監控系統:自適性遮罩夜間車輛偵測與追蹤以及基於眼睛狀態分析之駕駛疲勞與分心偵測
Intelligent Traffic Surveillance System: Night-Time Vehicle Detection and Tracking, and Driver Fatigue and Distraction Detection
Advisor: Jing-Ming Guo (郭景明)
Oral defense committee: Chih-Lyang Hwang (黃志良), Nai-Jian Wang (王乃堅), Jiann-Der Lee (李建德), Chih-Hsien Hsia (夏至賢)
Degree: Master
Department: College of Electrical Engineering and Computer Science - Department of Electrical Engineering
Publication year: 2014
Graduating academic year: 102
Language: Chinese
Number of pages: 125
Chinese keywords: night-time vehicle detection, vehicle counting, driver fatigue detection, driver distraction detection, blink rate, percentage of eye closure
Foreign-language keywords: Night-time vehicle detection, vehicle counting, driver drowsiness, driver inattention, blink rate, PERCLOS
Views: 230; Downloads: 14
    The two systems designed in this thesis make two principal contributions to intelligent traffic surveillance: adaptive night-time vehicle detection and tracking, and driver fatigue and distraction detection. Each is summarized below.
    Adaptive night-time vehicle detection and tracking is discussed in two parts: the tracking system and vehicle detection. In the tracking system, each vehicle is assigned one of four states, Appear, Tracked, Leave, and Disappear, and by using the intersection map with the previous frame the state of every vehicle inside the tracking area can be tracked and updated quickly and reliably. For vehicle detection, a verification mechanism judges whether an object entering the tracking area is a vehicle and excludes misdetected vehicles inside the area. Because real roadside cameras differ in focal distance and mounting height, an adaptive mask is designed so that the system can be applied flexibly in different environments. Experimental results show that the proposed method achieves real-time detection and counting while maintaining a satisfactory level of accuracy across different lane environments.
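    The thesis itself contains no source code; the following minimal Python sketch is only meant to make the four-state idea concrete, showing how a vehicle's state could be updated from the intersection of its last bounding box with the current detections and the tracking area. All names, the dictionary layout, and the zero-overlap matching test are illustrative assumptions, not the author's implementation.

    # Hypothetical sketch of the Appear/Tracked/Leave/Disappear update; the
    # matching rule and data layout are assumptions for illustration only.
    from enum import Enum

    class State(Enum):
        APPEAR = 0      # assigned when a verified vehicle first enters the zone
        TRACKED = 1
        LEAVE = 2
        DISAPPEAR = 3

    def intersection_area(a, b):
        """Overlap area of two (x, y, w, h) boxes."""
        x1 = max(a[0], b[0])
        y1 = max(a[1], b[1])
        x2 = min(a[0] + a[2], b[0] + b[2])
        y2 = min(a[1] + a[3], b[1] + b[3])
        return max(0, x2 - x1) * max(0, y2 - y1)

    def update_state(track, detections, tracking_zone):
        """Advance one track by intersecting its last box with the current frame."""
        matched = any(intersection_area(track["box"], d) > 0 for d in detections)
        inside = intersection_area(track["box"], tracking_zone) > 0
        if matched and inside:
            track["state"] = State.TRACKED
        elif matched:
            track["state"] = State.LEAVE      # still detected, but outside the zone
        else:
            track["state"] = State.DISAPPEAR  # no overlap with any current detection
        return track["state"]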
    In the driver fatigue and distraction detection system, three real-time events are detected: distraction, abnormal eye blinking, and dozing off; the driver's current fatigue level is evaluated at the same time. During detection, the blink rate and the percentage of eye closure (PERCLOS) are computed continuously and converted into a fatigue level using regression parameters trained off-line; the purpose of this prior training is to use a regression model to describe the relation between blink rate, PERCLOS, and fatigue level. For detection, the face and eyes are first located with OpenCV, and a classifier trained with its built-in training tool determines whether the eyes are closed, so that PERCLOS and the blink rate can be computed efficiently and abnormal blinking and dozing events identified. Distraction detection covers head-turning and sideways gazing: for head turning, the trajectory of the frontal face is recorded so that, once the face turns away, the most likely turning direction can be inferred; gaze direction is judged from the detected pupil positions by comparing, against the horizontal, the angle of the line connecting the midpoint of the two pupils to the centre of the face. Experimental results show that the system effectively identifies driver distraction, abnormal blinking, and dozing events in real time, and that the trained regression parameters effectively describe the driver's current fatigue level.
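    As a rough illustration of the measures just described, the short Python sketch below computes PERCLOS and the blink rate from a per-frame eye-closed flag and converts them into a fatigue level with linear-regression coefficients. The coefficient names b0, b1, b2, the linear form, and the window handling are placeholders standing in for whatever the off-line training actually produces.

    # Illustrative only: PERCLOS, blink rate, and a linear regression mapping.
    def perclos(closed_flags):
        """Fraction of frames in the window during which the eyes are closed."""
        return sum(closed_flags) / len(closed_flags)

    def blink_rate(closed_flags, fps):
        """Blinks per minute, counted as open-to-closed transitions in the window."""
        blinks = sum(1 for prev, cur in zip(closed_flags, closed_flags[1:])
                     if not prev and cur)
        return blinks * 60.0 * fps / len(closed_flags)

    def fatigue_level(closed_flags, fps, b0, b1, b2):
        """Map (blink rate, PERCLOS) to a fatigue level with trained coefficients."""
        return b0 + b1 * blink_rate(closed_flags, fps) + b2 * perclos(closed_flags)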


    This thesis presents two contributions: 1) night-time vehicle detection and tracking with adaptive mask training, and 2) driver fatigue and distraction detection based on eye-state analysis.
    In the night-time vehicle detection and tracking system, the tracking module classifies each vehicle into one of four states: appearing, tracked, leaving, and disappearing. Tracking is achieved efficiently within the tracking area by matching against the intersection map of the previous frame. In the vehicle detection module, a verification mechanism excludes falsely detected vehicles as they pass through the tracking area. Because, in practice, roadside surveillance cameras capture vehicles at different distances and scales, an adaptive mask is trained to handle this variability. The results show that the system achieves real-time detection and tracking.
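    The adaptive mask training is not spelled out here, so the sketch below only gestures at the flavour of the lamp-pair voting-and-statistics idea named in the table of contents: observed lamp-pair widths are voted into per-row histograms so that each image row learns the typical vehicle width at that distance from the camera. The class name, bin size, and the simple arg-max read-out are assumptions rather than the thesis design.

    # Hypothetical per-row voting table for vehicle (lamp-pair) widths.
    import numpy as np

    class AdaptiveWidthMask:
        def __init__(self, image_height, max_width=400, bin_size=10):
            # one histogram of candidate widths per image row
            self.bins = np.zeros((image_height, max_width // bin_size), dtype=np.int32)
            self.bin_size = bin_size

        def vote(self, row, lamp_pair_width):
            """Accumulate one verified lamp-pair observation at a given image row."""
            idx = min(lamp_pair_width // self.bin_size, self.bins.shape[1] - 1)
            self.bins[row, idx] += 1

        def expected_width(self, row):
            """Centre of the most frequently voted width bin for this row."""
            return (int(np.argmax(self.bins[row])) + 0.5) * self.bin_size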
    In the second part of this thesis, the proposed driver fatigue and distraction detection system evaluates three real-time events (driver distraction, abnormal eye blinking, and drowsiness) together with the driver's fatigue level. The eye blink rate and the percentage of eye closure (PERCLOS) are computed during detection as the main cues for identifying abnormal blinking and drowsiness events; they are also converted into a fatigue level using regression parameters trained off-line, which fit the relation between blink rate, PERCLOS, and fatigue level. The face and open eyes are detected with cascade classifiers trained by AdaBoost, and a self-trained classifier is used to detect closed eyes. Driver distraction covers head turning and eye gazing toward the left or right side. The driver's head is regarded as turned when the frontal face is no longer detected; in that case the trace of head motion is analyzed to estimate the direction of the turn. Sideways eye gazing is detected by comparing, against the horizontal, the angle of the line connecting the midpoint of the two iris centres to the centre of the face. The experimental results show that the system can determine distraction, abnormal eye blink, and drowsiness events in real time, and can evaluate the driver's fatigue level.
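    For illustration only, the sketch below shows one way the gaze-side test could be coded: the midpoint of the two iris centres is joined to the face centre, and sideways gazing is flagged when that line makes too small an angle with the horizontal. The 75-degree threshold and all function and parameter names are assumptions, not values taken from the thesis.

    # Hypothetical gaze-side test based on the iris-midpoint-to-face-centre line.
    import math

    def gaze_side_distracted(left_iris, right_iris, face_center, min_angle_deg=75.0):
        """Flag sideways gazing when the connecting line leans too far from vertical."""
        mid_x = (left_iris[0] + right_iris[0]) / 2.0
        mid_y = (left_iris[1] + right_iris[1]) / 2.0
        dx = mid_x - face_center[0]
        dy = mid_y - face_center[1]
        # angle between the connecting line and the horizontal axis, in degrees
        angle_to_horizontal = math.degrees(math.atan2(abs(dy), abs(dx)))
        return angle_to_horizontal < min_angle_deg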

    Chinese Abstract
    Abstract
    Acknowledgements
    Table of Contents
    List of Figures and Tables
    Chapter 1  Introduction
      1.1  Research Background and Motivation
      1.2  Research Objectives
      1.3  Thesis Organization
    Chapter 2  Literature Review
      2.1  Related Work on Vehicle Detection
      2.2  Related Work on Driver Fatigue and Distraction Detection
        2.2.1  Face Detection
        2.2.2  Eye Detection
        2.2.3  Feature Extraction and Driver Fatigue Evaluation
    Chapter 3  Night-Time Vehicle Detection and Tracking
      3.1  Pre-processing
      3.2  Vehicle Detection and Tracking System
        3.2.1  Vehicle Detection Stage
        3.2.2  Vehicle Tracking Stage
      3.3  Adaptive Mask Training and Update
        3.3.1  Vehicle Lamp-Pair Voting-Learning Mechanism
        3.3.2  Statistics-Based Initial Mask Training
        3.3.3  Mask Update
    Chapter 4  Driver Fatigue and Distraction Detection
      4.1  Head-Orientation Distraction Detection
        4.1.1  Determining Direction by Face-Trace Analysis
        4.1.2  Determining Distraction Status with the Face-Turn Buffer
      4.2  Eye-Gaze Side Distraction Detection
      4.3  Eye Feature Extraction
      4.4  Regression-Analysis Parameter Training
        4.4.1  Training Sample Collection
        4.4.2  Regression Training Parameters and Results
    Chapter 5  Experimental Results
      5.1  Night-Time Vehicle Detection and Tracking Results
      5.2  Driver Fatigue and Distraction Detection Results
        5.2.1  Experimental Design
        5.2.2  Experimental Results
    Chapter 6  Conclusions and Future Work
    References
