Towards mitigating the class-imbalance problem for partial label learning
published: Nov. 23, 2018, recorded: August 2018, views: 9
Partial label (PL) learning aims to induce a multi-class classifier from training examples, each of which is associated with a set of candidate labels among which only one is valid. Class imbalance is well known to be a major factor degrading the generalization performance of multi-class classifiers, and the problem becomes more pronounced in PL learning because the ground-truth label of each training example is not directly accessible to the learning approach. To mitigate the negative influence of class imbalance on partial label learning, a novel class-imbalance-aware approach named Cimap is proposed, which adapts over-sampling techniques to handle PL training examples. First, for each PL training example, Cimap disambiguates its candidate label set by estimating the confidence of each candidate label being the ground-truth label via weighted k-nearest neighbor aggregation. After that, the original PL training set is replenished for model induction by over-sampling existing PL training examples based on the disambiguation results. Extensive experiments on artificial as well as real-world PL data sets show that Cimap serves as an effective data-level approach to mitigating the class-imbalance problem for partial label learning.
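The two data-level steps described above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the Euclidean distance metric, the 1/(1+d) neighbor weighting, and the equal-count over-sampling target are all assumed details chosen for clarity.

```python
import numpy as np

def knn_label_confidence(X, candidates, n_classes, k=3):
    """Disambiguation step (sketch): for each PL example, estimate the
    confidence of each candidate label via weighted k-NN aggregation.
    A neighbor votes for a class only if that class also appears in the
    example's own candidate set; weights decay with distance."""
    n = len(X)
    conf = np.zeros((n, n_classes))
    for i in range(n):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                       # exclude the example itself
        nbrs = np.argsort(d)[:k]
        weights = 1.0 / (1.0 + d[nbrs])     # closer neighbors weigh more
        for j, w in zip(nbrs, weights):
            for c in candidates[j]:
                if c in candidates[i]:
                    conf[i, c] += w
        s = conf[i].sum()
        if s > 0:
            conf[i] /= s                    # normalize to a distribution
        else:                               # no overlap: fall back to uniform
            conf[i, list(candidates[i])] = 1.0 / len(candidates[i])
    return conf

def replenish(X, candidates, conf, rng=None):
    """Replenishment step (sketch): assign each example its most confident
    class, then over-sample (duplicate) examples of under-represented
    classes until every observed class reaches the majority count."""
    rng = np.random.default_rng(0) if rng is None else rng
    labels = conf.argmax(axis=1)
    counts = np.bincount(labels, minlength=conf.shape[1])
    present = np.where(counts > 0)[0]
    target = counts[present].max()
    X_new, cand_new = list(X), list(candidates)
    for c in present:
        pool = np.where(labels == c)[0]
        for _ in range(target - counts[c]):
            i = rng.choice(pool)
            X_new.append(X[i])
            cand_new.append(candidates[i])
    return np.array(X_new), cand_new
```

On a toy imbalanced PL data set (three examples clustered near one class, one near another), the confidence rows form proper distributions over each candidate set, and replenishment duplicates minority-class examples until class counts are balanced.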