Metric Regression Forests for Human Pose Estimation
Published on Apr 03, 2014 · 2548 Views
We present a new method for inferring dense data-to-model correspondences, focusing on the application of human pose estimation from depth images. Recent work proposed the use of regression forests …
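For orientation, here is a minimal sketch of the kind of baseline the talk builds on: a generic regression forest that maps per-pixel depth features directly to 3D coordinates on the body model, yielding one dense data-to-model correspondence per pixel. The feature dimensions, synthetic data, and scikit-learn estimator below are illustrative assumptions only, not the method presented in the talk (which trains the forest with a metric-space entropy objective, per the chapter list below).

# Illustrative sketch: forest-based dense correspondence regression.
# Synthetic data stands in for real depth features and model coordinates.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical training set: depth-difference features for N foreground pixels,
# each labelled with the 3D model-surface coordinate it corresponds to.
N, n_features = 5000, 32
X_train = rng.normal(size=(N, n_features))
y_train = rng.normal(size=(N, 3))

forest = RandomForestRegressor(n_estimators=10, max_depth=12, random_state=0)
forest.fit(X_train, y_train)  # multi-output regression: one 3D target per pixel

# At test time every foreground pixel gets a predicted model coordinate,
# which a downstream pose optimizer (e.g. ICP) can consume.
X_test = rng.normal(size=(100, n_features))
correspondences = forest.predict(X_test)  # shape (100, 3)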
Chapter list
Metric Regression Forests for Human Pose Estimation (00:00)
Human Pose Estimation - 1 (00:10)
Human Pose Estimation - 2 (00:22)
From Classification to Regression - 1 (00:34)
From Classification to Regression - 2 (01:04)
From Classification to Regression - 3 (01:30)
From Classification to Regression - 4 (01:34)
From Classification to Regression - 5 (01:42)
From Classification to Regression - 6 (01:54)
Inferring Correspondences - 1 (02:08)
Inferring Correspondences - 2 (02:20)
Inferring Correspondences - 3 (02:22)
Inferring Correspondences - 4 (02:34)
Inferring Correspondences - 5 (02:37)
Parts Classification Objective (02:40)
Retrofitting for Regression - 1 (03:44)
Retrofitting for Regression - 2 (03:59)
Retrofitting for Regression - 3 (04:02)
Retrofitting for Regression - 4 (04:11)
Current Limitations - 1 (04:27)
Current Limitations - 2 (04:42)
Current Limitations - 3 (04:45)
Current Limitations - 4 (04:54)
Current Limitations - 5 (05:13)
How much better can we do? (05:30)
Metric Space (07:01)
Training with a Differential Entropy Objective (07:47)
Kernel Density Estimation - 1 (08:37)
Kernel Density Estimation - 2 (08:49)
Kernel Density Estimation - 3 (09:09)
How to Compute it? - 1 (10:18)
How to Compute it? - 2 (10:23)
How to Compute it? - 3 (10:24)
How to Compute it? - 4 (10:34)
Problems - 1 (10:50)
Problems - 2 (11:12)
Our approach: Discretize and Cache Kernel (11:30)
Approximate differential entropy - 1 (11:59)
Approximate differential entropy - 2 (12:13)
Approximate differential entropy - 3 (12:38)
Approximate differential entropy - 4 (12:42)
Entropy is computed as - 1 (12:53)
Entropy is computed as - 2 (13:10)
Results (13:40)
Optimize the Pose - 1 (14:33)
Optimize the Pose - 2 (14:43)
Optimize the Pose - 3 (14:47)
Pose Accuracy (14:58)
Improvement after ICP (15:15)
Where are we? (16:05)
Conclusions (16:36)
Thank you! (17:24)
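As background for the "Training with a Differential Entropy Objective", "Kernel Density Estimation", and "Entropy is computed as" chapters above, the following is a standard sketch of KDE-based differential entropy estimation over a metric space; the exact kernel, metric, and discretization used in the talk may differ. For the set S of training correspondences reaching a tree node, with y_i a point on the model surface and d a distance on that surface (e.g. geodesic distance):

% Kernel density estimate and its plug-in entropy estimate (illustrative, standard form)
\[
  \hat{p}(y) = \frac{1}{|S|} \sum_{j \in S} k_\sigma\!\bigl(d(y, y_j)\bigr),
  \qquad
  H(S) \approx -\frac{1}{|S|} \sum_{i \in S} \log \hat{p}(y_i),
\]
% A candidate split into left/right children is then scored by the information gain
\[
  G = H(S) - \sum_{c \in \{L, R\}} \frac{|S_c|}{|S|}\, H(S_c).
\]

Evaluating \(\hat{p}\) naively is quadratic in |S|, which motivates approximations such as the discretized, cached kernel mentioned in the "Our approach: Discretize and Cache Kernel" chapter.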