
Multiple human 3D pose estimation from multiview images

Ershadi Nasab, S. ; Sharif University of Technology

  1. Type of Document: Article
  2. DOI: 10.1007/s11042-017-5133-8
  3. Abstract: Multiple human 3D pose estimation is a challenging task, mainly because of large variations in human scale and pose, fast motions, multiple persons in the scene, and an arbitrary number of visible body parts due to occlusion or truncation. Some of these ambiguities can be resolved by using multiview images, since more evidence of each body part is available across the views. In this work, a novel method for multiple human 3D pose estimation using evidence from multiview images is proposed. The proposed method utilizes a fully connected pairwise conditional random field that contains two types of pairwise terms. The first pairwise term encodes the spatial dependencies among human body joints based on an articulated human body configuration. The second pairwise term is based on the output of a 2D deep part detector. Approximate inference is then performed using the loopy belief propagation algorithm. The proposed method is evaluated on the Campus, Shelf, Utrecht Multi-Person Motion benchmark, Human3.6M, KTH Football II, and MPII Cooking datasets. Experimental results indicate that the proposed method achieves substantial improvements over existing state-of-the-art methods in terms of the probability of correct pose and the mean per joint position error performance measures. © 2017 Springer Science+Business Media, LLC
  4. Keywords: Fully connected model ; Human pose estimation ; Multiview images ; Hardware ; Multimedia systems ; Approximate inference ; Conditional random field ; Graphical model ; Human pose estimations ; Loopy belief propagation ; Multi-view image ; Multiple human ; State-of-the-art methods ; Inference engines
  5. Source: Multimedia Tools and Applications ; 2017 , Pages 1-29 ; 1380-7501 (ISSN)
  6. URL: https://link.springer.com/article/10.1007/s11042-017-5133-8
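For readers unfamiliar with the inference step named in the abstract, below is a minimal sketch of sum-product loopy belief propagation on a toy pairwise graphical model. This is illustrative only: the node names, potentials, and graph are hypothetical, and it is not the authors' implementation, which runs on a fully connected CRF over body-joint candidates across multiple views.

```python
import numpy as np

def loopy_bp(unary, edges, pairwise, n_iters=50):
    """Sum-product loopy belief propagation on a pairwise model.

    unary:    dict node -> 1-D array of non-negative unary potentials
    edges:    list of (i, j) node pairs
    pairwise: dict (i, j) -> 2-D potential matrix (rows index i's states)
    Returns a dict node -> normalized belief (approximate marginal).
    """
    # Messages msgs[(s, t)] from node s to node t, initialized uniform.
    msgs = {}
    neighbors = {n: [] for n in unary}
    for i, j in edges:
        msgs[(i, j)] = np.ones(len(unary[j]))
        msgs[(j, i)] = np.ones(len(unary[i]))
        neighbors[i].append(j)
        neighbors[j].append(i)

    for _ in range(n_iters):
        new = {}
        for i, j in edges:
            for s, t in ((i, j), (j, i)):
                # Product of s's unary term and all incoming messages
                # except the one coming back from the target t.
                prod = unary[s].copy()
                for k in neighbors[s]:
                    if k != t:
                        prod = prod * msgs[(k, s)]
                # Orient the pairwise table so rows index s's states.
                pot = pairwise[(i, j)] if (s, t) == (i, j) else pairwise[(i, j)].T
                m = prod @ pot          # marginalize out s's states
                new[(s, t)] = m / m.sum()
        msgs = new

    beliefs = {}
    for n in unary:
        b = unary[n].copy()
        for k in neighbors[n]:
            b = b * msgs[(k, n)]
        beliefs[n] = b / b.sum()
    return beliefs

# Toy example: two nodes, two states each, a potential favoring agreement.
beliefs = loopy_bp(
    unary={0: np.array([0.7, 0.3]), 1: np.array([0.5, 0.5])},
    edges=[(0, 1)],
    pairwise={(0, 1): np.array([[0.9, 0.1], [0.1, 0.9]])},
)
```

On tree-structured graphs (like this toy chain) the algorithm returns exact marginals; on the loopy fully connected graph used in the paper it is only an approximation, which is why the abstract calls the inference approximate.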