A metric notion of dimension and its applications to learning
published: July 20, 2010, recorded: June 2010, views: 4292
Let us define the dimension of a metric space as the minimum k>0 such that every ball in the metric space can be covered by 2^k balls of half the radius. Besides being applicable to every metric space, this definition has several attractive features. For instance, it coincides with the standard notion of dimension in Euclidean spaces, but it also captures nonlinear structures such as manifolds. Metric spaces of low dimension (under the above definition) occur naturally in many contexts. I will discuss recent theoretical results regarding such metric spaces, including questions such as embeddability, dimension reduction, Nearest Neighbor Search, and large-margin classification, the common thread being that low dimension implies algorithmic efficiency.
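The definition above can be made concrete for a finite point set: greedily build a half-radius net of each ball and take the base-2 logarithm of the largest net found. The sketch below (not from the lecture; the function names and the greedy-net heuristic are our illustrative choices, and the greedy cover only upper-bounds the minimum cover) estimates this quantity for points on a line, whose doubling dimension is constant:

```python
import math
import random

def half_radius_cover_size(points, dist, center, r):
    """Greedily cover the ball B(center, r) with balls of radius r/2.

    Any point of the ball not yet within r/2 of a chosen center becomes
    a new center; the resulting centers form an (r/2)-net, so balls of
    radius r/2 around them cover B(center, r). Returns the cover size
    (an upper bound on the minimum cover size).
    """
    ball = [p for p in points if dist(p, center) <= r]
    centers = []
    for p in ball:
        if all(dist(p, c) > r / 2 for c in centers):
            centers.append(p)
    return len(centers)

def doubling_dimension_estimate(points, dist, radii):
    """Estimate the dimension as log2 of the worst cover size seen
    over all balls B(center, r) with center in points and r in radii."""
    worst = 1
    for center in points:
        for r in radii:
            worst = max(worst, half_radius_cover_size(points, dist, center, r))
    return math.log2(worst)

# Illustrative data: random points on a line, intrinsically 1-dimensional.
random.seed(0)
pts = [(random.uniform(0, 100),) for _ in range(200)]
d = lambda a, b: abs(a[0] - b[0])

est = doubling_dimension_estimate(pts, d, radii=[1, 5, 25])
print(est)  # small constant, independent of the number of points
```

Because the greedy net can use more balls than an optimal cover, the printed estimate slightly exceeds the true dimension, but it stays bounded by a constant as the point set grows, which is exactly the property the algorithmic results in the talk exploit.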
Download slides: icml2010_krauthgamer_amnda_01-1.pdf (215.3 KB)
Download slides: icml2010_krauthgamer_amnda_01.ppt (554.5 KB)