I am interested in algorithms that provide theoretical guarantees and are (ideally ;-)) also efficient and useful in practice. Currently I am working on problems that involve submodular minimization and combinatorial optimization, with applications in machine learning. In general, I am interested in combinatorial and discrete optimization, graphs, clustering, approximation algorithms, graphical models, and applications in computational biology.
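Submodularity is the diminishing-returns property: adding an element to a small set helps at least as much as adding it to a larger one. A minimal sketch checking this for the cut function of a small toy graph (the graph is a made-up example, not from my work):

```python
from itertools import combinations

# Cut function of an undirected graph: f(S) = number of edges crossing S.
# The edge list below is a hypothetical 4-node example.
edges = [(0, 1), (1, 2), (2, 3), (0, 3), (1, 3)]
ground = {0, 1, 2, 3}

def cut(S):
    return sum(1 for u, v in edges if (u in S) != (v in S))

# Verify diminishing returns:
# f(S + e) - f(S) >= f(T + e) - f(T) whenever S is a subset of T and e is not in T.
subsets = [set(c) for r in range(len(ground) + 1)
           for c in combinations(ground, r)]
ok = all(cut(S | {e}) - cut(S) >= cut(T | {e}) - cut(T)
         for S in subsets for T in subsets if S <= T
         for e in ground - T)
print(ok)  # True: graph cut functions are submodular
```

Graph cuts are a standard example; the same exhaustive check works for any set function on a small ground set.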
I have also worked on theoretical aspects of clustering and density estimation, from the perspective of approximation as well as learning theory. We proved an approximation guarantee for (Bregman) tensor clustering. Another project explored clustering via kernel embeddings (MMD), where one separates distributions by (higher-order) moments and not just means, so clusters can lie within each other. In my diploma thesis, I studied some aspects of learning theory and clustering. Demo code for our nearest-neighbor clustering (two clusters) for normalized cut is available here.
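The point of the kernel-embedding view is that MMD with a characteristic kernel distinguishes distributions that agree in their means but differ in higher-order moments. A minimal sketch (a generic biased MMD² estimate with a Gaussian kernel, not the code from the project above):

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian kernel values between rows of X and rows of Y.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-d2 / (2 * sigma**2))

def mmd2(X, Y, sigma=1.0):
    # Biased estimate of squared MMD between the samples X and Y.
    return (gaussian_kernel(X, X, sigma).mean()
            + gaussian_kernel(Y, Y, sigma).mean()
            - 2 * gaussian_kernel(X, Y, sigma).mean())

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, (500, 1))  # N(0, 1)
Y = rng.normal(0.0, 3.0, (500, 1))  # N(0, 9): same mean, different variance
Z = rng.normal(0.0, 1.0, (500, 1))  # a second N(0, 1) sample
print(mmd2(X, Y), mmd2(X, Z))  # the first value is clearly larger
```

A mean-based distance would see X and Y as identical; MMD separates them because the kernel embedding captures the variance difference, which is exactly what lets one cluster nested distributions.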
For my "Studienarbeit" and beyond, I have been working on large-scale methods for kernel ICA. Our optimization methods include stochastic matrix approximations and matrix decompositions, line-search methods and, most recently, FastKICA, a Newton-like method for kernel ICA (joint work with Hao Shen from NICTA). The code is available here.
In addition, I have worked on large-scale SDPs (semidefinite programs).
Online submodular minimization with combinatorial constraints, at Discrete Optimization in Machine Learning.