Embedded Methods

author: Isabelle Guyon, Clopinet
published: July 5, 2007,   recorded: July 2007,   views: 12254


Watch videos:

Part 1 (17:56)
Part 2 (27:09)


This course covers feature selection fundamentals and applications. Students will first be reminded of the basics of machine learning algorithms and the problem of overfitting avoidance. In the wrapper setting, feature selection will be introduced as a special case of the model selection problem. Methods to derive principled feature selection algorithms will be reviewed, as well as heuristic methods that work well in practice. One class will be devoted to feature construction techniques. Finally, a lecture will be devoted to the connections between feature selection and causal discovery. The class will be accompanied by several lab sessions. The course will appeal to students who like playing with data and want to learn practical data analysis techniques. The instructor has ten years of experience consulting for startup companies in the US in pattern recognition and machine learning. Datasets from a variety of application domains will be made available: handwriting recognition, medical diagnosis, drug discovery, text classification, ecology, and marketing.
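To illustrate the embedded approach the lecture's title refers to, the sketch below fits an L1-regularized least-squares model (a lasso) by coordinate descent: features whose coefficients are driven exactly to zero are deselected as a by-product of training itself, with no separate search over feature subsets. This is a generic illustration, not code from the course; the synthetic data, the `lasso_cd` helper, and the penalty value are all illustrative assumptions.

```python
# Sketch of an embedded feature selection method: lasso regression
# solved by coordinate descent with soft-thresholding. The L1 penalty
# zeroes out coefficients of uninformative features during fitting.
import random

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent lasso; returns the weight vector."""
    n, d = len(X), len(X[0])
    w = [0.0] * d
    for _ in range(n_iter):
        for j in range(d):
            rho = 0.0      # correlation of feature j with partial residual
            col_sq = 0.0   # squared norm of column j
            for i in range(n):
                pred_minus_j = sum(w[k] * X[i][k] for k in range(d) if k != j)
                rho += X[i][j] * (y[i] - pred_minus_j)
                col_sq += X[i][j] ** 2
            # soft-thresholding update: small correlations give w[j] = 0
            if rho > lam:
                w[j] = (rho - lam) / col_sq
            elif rho < -lam:
                w[j] = (rho + lam) / col_sq
            else:
                w[j] = 0.0
    return w

random.seed(0)
n = 60
# y depends only on features 0 and 1; features 2-4 are pure noise
X = [[random.gauss(0, 1) for _ in range(5)] for _ in range(n)]
y = [3 * x[0] - 2 * x[1] + random.gauss(0, 0.1) for x in X]
w = lasso_cd(X, y, lam=8.0)
selected = [j for j, wj in enumerate(w) if abs(wj) > 1e-6]
print(selected)  # informative features survive; noise coefficients shrink to 0
```

Contrast this with a wrapper, which would retrain the model on many candidate feature subsets and score each by cross-validation: here selection and fitting are one and the same optimization.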

See Also:

Download slides: L3_featselect2.pdf (339.3 KB)

Download slides: L3_featselect2.ppt (1.3 MB)


