Classification on Riemannian Manifolds
published: Sept. 13, 2010, recorded: August 2010, views: 1960
A large number of natural phenomena can be formulated as inference on differentiable manifolds. In computer vision specifically, such underlying structure emerges in feature selection, pose estimation, structure from motion, appearance tracking, and shape embedding. Unlike Euclidean space, a differentiable manifold is only locally homeomorphic to Euclidean space, so differential-geometric tools apply only within local tangent spaces. This prevents the direct use of conventional methods that require vector norms in classification problems on manifolds, where distances are defined through curves of minimal length connecting two points. Recently we introduced a region covariance descriptor that lies on the Riemannian manifold of positive definite matrices. By training weak classifiers on tangent spaces and combining them via weighted sums anchored at Karcher means, we bootstrap an ensemble of boosted classifiers with logistic loss functions. In this manner, we need neither flatten the manifold nor discover its topology. We demonstrate the new manifold classifiers on human detection and face recognition problems.
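The Karcher mean mentioned in the abstract is the point on the manifold minimizing the sum of squared geodesic distances to a set of samples. For the manifold of symmetric positive definite (SPD) matrices under the affine-invariant metric, it can be computed by a simple fixed-point iteration: map the samples into the tangent space at the current estimate, average the tangent vectors, and map the average back. The sketch below is an illustrative implementation under those standard assumptions, not the authors' exact code; the function names (`spd_logm`, `karcher_mean`, etc.) are hypothetical.

```python
import numpy as np

def _spd_apply(S, fn):
    # Apply a scalar function to an SPD matrix via its eigendecomposition.
    w, V = np.linalg.eigh(S)
    return (V * fn(w)) @ V.T

def spd_logm(S):
    # Matrix logarithm of an SPD matrix.
    return _spd_apply(S, np.log)

def spd_expm(S):
    # Matrix exponential of a symmetric matrix.
    return _spd_apply(S, np.exp)

def spd_sqrt(S):
    # Matrix square root of an SPD matrix.
    return _spd_apply(S, np.sqrt)

def karcher_mean(mats, iters=50, tol=1e-10):
    # Fixed-point iteration for the Karcher (Frechet) mean of SPD matrices
    # under the affine-invariant Riemannian metric.
    M = np.mean(mats, axis=0)          # initialize at the arithmetic mean
    for _ in range(iters):
        Mh = spd_sqrt(M)
        Mih = np.linalg.inv(Mh)
        # Average the samples' tangent vectors at M (in whitened coordinates).
        T = np.mean([spd_logm(Mih @ X @ Mih) for X in mats], axis=0)
        if np.linalg.norm(T) < tol:    # gradient norm small -> converged
            break
        # Exponential map: step from M along the mean tangent direction.
        M = Mh @ spd_expm(T) @ Mh
    return M
```

For commuting matrices (e.g. diagonal ones), this iteration recovers the geometric mean `expm(mean(logm(X_i)))`, which is a useful sanity check; in general the mean has no closed form and the iteration is run to convergence.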