Learning Deep Hierarchies of Representations
Whereas theoretical work suggests that deep architectures might be computationally and statistically more efficient at representing highly-varying functions, training deep architectures was unsuccessful until the recent advent of algorithms based on unsupervised pre-training of each level of a hierarchically structured model. Several unsupervised criteria and procedures were proposed for this purpose, starting with the Restricted Boltzmann Machine (RBM), which when stacked gives rise to Deep Belief Networks (DBN). Although the partition function of RBMs is intractable, inference is tractable and we review several successful learning algorithms that have been proposed, in particular those using weights that change quickly during learning instead of converging. In addition to being impressive as generative models, deep architectures based on RBMs and other unsupervised learning methods have made an impact by being used to initialize deep supervised neural networks. Even though these new algorithms have enabled training deep models, many questions remain as to the nature of this difficult learning problem. We attempt to shed some light on these questions by comparing different successful approaches to training deep architectures and through extensive simulations investigating explanatory hypotheses. Finally, we describe our current research program, objectives and challenges, regarding learning representations at multiple levels of abstraction, to compare web objects such as images, documents, and search engine requests, comparisons that are at the core of several information retrieval applications.
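The pre-training scheme described above can be sketched in a few lines: each Restricted Boltzmann Machine is trained with single-step contrastive divergence (CD-1), and its inferred hidden probabilities become the input to the next RBM in the stack. This is a minimal illustrative sketch, not the authors' implementation; the function names, layer sizes, and hyperparameters below are assumptions chosen for readability.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, n_hidden, lr=0.1, epochs=50, rng=None):
    """Train one binary RBM with single-step contrastive divergence (CD-1).

    Illustrative sketch only: hyperparameters are arbitrary.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    n_visible = data.shape[1]
    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    b_v = np.zeros(n_visible)  # visible biases
    b_h = np.zeros(n_hidden)   # hidden biases
    for _ in range(epochs):
        # Positive phase: inference is tractable -- hidden units are
        # conditionally independent given the visible layer.
        h_prob = sigmoid(data @ W + b_h)
        h_sample = (rng.random(h_prob.shape) < h_prob).astype(float)
        # Negative phase: one Gibbs step back to a reconstruction,
        # sidestepping the intractable partition function.
        v_recon = sigmoid(h_sample @ W.T + b_v)
        h_recon = sigmoid(v_recon @ W + b_h)
        # CD-1 updates approximate the log-likelihood gradient.
        W += lr * (data.T @ h_prob - v_recon.T @ h_recon) / len(data)
        b_v += lr * (data - v_recon).mean(axis=0)
        b_h += lr * (h_prob - h_recon).mean(axis=0)
    return W, b_v, b_h

def stack_rbms(data, layer_sizes):
    """Greedy layer-wise pre-training: each RBM's hidden probabilities
    become the training data for the next level, yielding a DBN whose
    weights can initialize a deep supervised network."""
    layers, x = [], data
    for n_hidden in layer_sizes:
        W, b_v, b_h = train_rbm(x, n_hidden)
        layers.append((W, b_v, b_h))
        x = sigmoid(x @ W + b_h)  # propagate representation upward
    return layers
```

In practice the stacked weights would then be fine-tuned with supervised backpropagation, which is the initialization effect the abstract refers to.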