Compositional Noisy-Logical Learning
published: Aug. 26, 2009; recorded: June 2009
We describe a new method for learning the conditional probability distribution of a binary-valued variable from labelled training examples. Our proposed Compositional Noisy-Logical Learning (CNLL) approach learns a noisy-logical distribution in a compositional manner. CNLL is an alternative to the well-known AdaBoost algorithm: like AdaBoost, it performs coordinate descent, but on a different error measure. We describe two CNLL algorithms and compare their performance to AdaBoost on two types of problem: (i) noisy-logical data (such as noisy exclusive-or), and (ii) four standard datasets from the UCI repository. Our results show that CNLL outperforms AdaBoost while using significantly fewer weak classifiers, thereby giving a more transparent classifier suitable for knowledge extraction.
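To make the first test problem concrete, the following is a minimal sketch of the noisy exclusive-or data mentioned above. It assumes a simple illustrative noise model (the true XOR label is flipped with probability `eps`); the function name and parameters are ours, not from the paper, and the CNLL algorithm itself is not reproduced here.

```python
import random

def noisy_xor_sample(n, eps=0.1, seed=0):
    """Generate n samples of noisy exclusive-or data.

    Each x is a pair of {0,1} features; the label is y = XOR(x1, x2),
    flipped with probability eps (an illustrative noise model)."""
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        x1, x2 = rng.randint(0, 1), rng.randint(0, 1)
        y = x1 ^ x2
        if rng.random() < eps:
            y = 1 - y  # corrupt the label with probability eps
        data.append(((x1, x2), y))
    return data
```

XOR is a natural stress test for boosting-style methods: no single-feature weak classifier beats chance (the label is near-independent of each feature taken alone), so any accurate classifier must combine several weak classifiers, which is where the number of weak classifiers used becomes a meaningful comparison between CNLL and AdaBoost.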