
Graduate Student: 邱憲璋 (Shien-chang Chiu)
Thesis Title: Articulated Tracking using Particle Filtering
Advisor: 鮑興國 (Hsing-Kuo Pao)
Oral Examination Committee: 鍾國亮 (Kuo-Liang Chung), 吳怡樂 (Yi-Leh Wu), 李育杰 (Yuh-Jye Lee), 劉庭祿 (Tyng-Luh Liu)
Degree: Master
Department: College of Electrical Engineering and Computer Science - Department of Computer Science and Information Engineering
Year of Publication: 2007
Graduation Academic Year: 95 (ROC calendar, 2006–2007)
Language: English
Pages: 48
Keywords: articulated tracking, particle filtering, neighborhood constraint
  • We propose a method for tracking an articulated body. In our framework,
    articulated tracking is treated as a problem of simultaneously tracking
    multiple objects where, for instance, each object can be an arm, a leg, or
    a body in human motion tracking. Due to the nature of the articulated
    structure, however, these moving parts usually move with fewer degrees of
    freedom than regular independent objects. Borrowing the idea of particle
    filtering, we sample weighted candidates in our multi-object tracking
    problem; a candidate can be, for example, a hypothesis of a waving arm at
    a particular moment and location. Unlike regular particle filtering, each
    weight is also affected by how well the sample obeys the motion
    constraints associated with the articulated structure. The framework is
    divided into several steps. The preprocessing step aims to find the
    articulated parts: we randomly sample moving pixels and use Gaussian
    mixtures to group them into parts, with optical-flow results and pixel
    locations serving as the input features for the mixtures. This is close
    to the abstract notion of object tracking in which a moving object is
    roughly defined as a set of points that share proximity and a "common
    fate". After the articulated parts are obtained, we track them
    simultaneously by a particle filtering approach, with weights adjusted
    according to the motion constraints. The whole process can be done close
    to real time.
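The preprocessing step above can be sketched in code: group randomly sampled moving pixels into parts with a Gaussian mixture over (location, optical-flow) features. This is a minimal two-component EM with isotropic Gaussians on synthetic data; the feature scaling, component count, and initialization here are illustrative assumptions, not the thesis's exact pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for randomly sampled moving pixels: two parts with
# distinct image locations and distinct optical-flow vectors.
# Feature per pixel: (x, y, u, v) = location plus optical flow.
part_a = np.hstack([rng.normal(10.0, 1.0, (100, 2)),
                    rng.normal([+2.0, 0.0], 0.2, (100, 2))])
part_b = np.hstack([rng.normal(30.0, 1.0, (100, 2)),
                    rng.normal([-2.0, 0.0], 0.2, (100, 2))])
X = np.vstack([part_a, part_b])

def em_gmm(X, iters=50):
    """EM for a 2-component isotropic Gaussian mixture."""
    n, d = X.shape
    k = 2
    mu = X[[0, n - 1]].astype(float).copy()   # init: first and last samples
    var = np.full(k, X.var())                 # shared broad initial variance
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility r[i, j] = P(component j | pixel i)
        sq = ((X[:, None, :] - mu[None]) ** 2).sum(-1)          # (n, k)
        logp = -0.5 * sq / var - 0.5 * d * np.log(2 * np.pi * var) + np.log(pi)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixing weights, means, and variances
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r.T @ X) / nk[:, None]
        sq = ((X[:, None, :] - mu[None]) ** 2).sum(-1)
        var = (r * sq).sum(axis=0) / (d * nk)
    return mu, r.argmax(axis=1)

means, labels = em_gmm(X)   # labels assign each sampled pixel to a part
```

Each component then plays the role of one articulated part: pixels that are close in the image and share a "common fate" (similar flow) end up in the same group.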


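The tracking step can likewise be sketched as a particle filter in which each particle's weight is the product of an image likelihood and a term that penalizes violating the articulation (neighborhood) constraint. The Gaussian likelihood surrogate, pure-diffusion dynamics, and soft distance-to-joint penalty below are illustrative assumptions, not the thesis's exact observation or motion models.

```python
import numpy as np

rng = np.random.default_rng(1)

N = 500                        # number of particles
joint = np.array([0.0, 0.0])   # joint shared with the neighboring segment
max_reach = 5.0                # articulation limit: distance from the joint

def likelihood(z, particles, sigma=0.3):
    """Image-likelihood surrogate: Gaussian in the distance to a measurement."""
    d2 = ((particles - z) ** 2).sum(axis=1)
    return np.exp(-0.5 * d2 / sigma ** 2)

def constraint_weight(particles):
    """Soft penalty on particles that stray beyond the joint's reach."""
    dist = np.linalg.norm(particles - joint, axis=1)
    excess = np.maximum(dist - max_reach, 0.0)
    return np.exp(-excess ** 2 / 0.5)

# Target: a point circling the joint at radius 4 (inside the reach limit).
particles = rng.normal(0.0, 2.0, (N, 2))
for t in range(30):
    true_pos = 4.0 * np.array([np.cos(0.1 * t), np.sin(0.1 * t)])
    z = true_pos + rng.normal(0.0, 0.2, 2)       # noisy measurement
    particles += rng.normal(0.0, 0.4, (N, 2))    # diffusion dynamics
    w = likelihood(z, particles) * constraint_weight(particles)
    w /= w.sum()
    particles = particles[rng.choice(N, size=N, p=w)]   # resample

estimate = particles.mean(axis=0)   # posterior mean after the last frame
```

Because the constraint enters the weights multiplicatively, particles that drift beyond the joint's reach are down-weighted and die out at resampling, which is the sense in which the articulated parts move with fewer degrees of freedom than independent objects.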

    1 Introduction
      1.1 Problem proposed
      1.2 Our Framework
      1.3 Thesis outline
    2 Previous Work
      2.1 Optical Flow
      2.2 Gaussian Mixture Model
      2.3 RANdom SAmple Consensus
      2.4 Particle Filtering
    3 Articulated Tracking
      3.1 Preprocessing
      3.2 Tracking through particle filtering
        3.2.1 Observation model for single segment
        3.2.2 Observation model for two connected segments
    4 Experimental Results
      4.1 Ex1: Simple background
        4.1.1 Tracking without neighboring constraint
        4.1.2 Tracking with neighboring constraint
      4.2 Ex2: Complex background
        4.2.1 Tracking without neighboring constraint
        4.2.2 Tracking with neighboring constraint
    5 Conclusion and future work
      5.1 Conclusion
      5.2 Future work

