Abstract
There has been significant progress on pose estimation and increasing interest in pose tracking in recent years. At the same time, overall algorithm and system complexity has increased as well, making algorithm analysis and comparison more difficult. This work provides simple and effective baseline methods that are helpful for inspiring and evaluating new ideas for the field. State-of-the-art results are achieved on challenging benchmarks. The code will be available at this https URL.
Publication Info
- Year: 2018
- Type: book-chapter
- Pages: 472-487
- Citations: 2084
- Access: Closed
Identifiers
- DOI: 10.1007/978-3-030-01231-1_29
- PMID: 40895350
- PMCID: PMC12394113
- arXiv: 1804.06208