Department of Statistics, National Taipei University
Seminar
Topic: Is Complementary-Label Learning Realistic?
Speaker: Prof. Hsuan-Tien Lin (Department of Computer Science and Information Engineering, National Taiwan University)
Time: April 8, 2026 (Wednesday, 13:10–15:00)
Venue: Room 3F13, College of Business, Sanxia Campus
Abstract
Complementary-Label Learning (CLL) is a weakly supervised learning paradigm that is claimed to be useful in situations where collecting true labels is expensive. This talk begins by walking the audience through two advances in CLL. The first work discovered that the CLL loss design derived from the Unbiased Risk Estimator (URE) suffers from high variance in gradient estimation. To address this, a novel surrogate complementary loss (SCL) framework is proposed, which reduces variance and improves gradient alignment, mitigating overfitting. The second work introduces a new approach to CLL by reducing the problem to estimating the probability of complementary classes. This framework sidesteps the limitations of traditional methods and improves robustness in noisy environments, offering a broader perspective for both deep and non-deep models. Empirical results demonstrate the efficacy of both approaches in improving CLL performance. Finally, the speaker will share some ongoing attempts to make CLL more realistic, including firsthand experience in collecting real-world datasets and releasing an open-source library.
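As a rough illustration of the two loss designs the abstract contrasts, the sketch below implements the uniform-complementary-label URE loss known from the CLL literature and a negative-log surrogate complementary loss. This is an assumption-laden sketch for intuition only, not the speaker's exact formulation; the function names and the choice of cross-entropy as the base loss are illustrative.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with the usual max-shift for numerical stability."""
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def ure_loss(logits, comp_labels, K):
    """URE-style loss under uniformly drawn complementary labels:
    per example, -(K-1) * ell(f(x), ybar) + sum_j ell(f(x), j),
    with ell = cross-entropy. The negative term can drive the
    estimate below zero, a source of the high-variance gradients
    and overfitting mentioned in the abstract."""
    p = softmax(logits)
    ce_all = -np.log(p)                              # (n, K) loss for every class
    ce_bar = ce_all[np.arange(len(p)), comp_labels]  # loss at the complementary label
    return np.mean(-(K - 1) * ce_bar + ce_all.sum(axis=1))

def scl_nl_loss(logits, comp_labels):
    """Surrogate complementary loss (negative-log variant):
    -log(1 - p_ybar). It directly pushes down the probability of
    the complementary class and stays non-negative."""
    p = softmax(logits)
    p_bar = p[np.arange(len(p)), comp_labels]
    return np.mean(-np.log(1.0 - p_bar))
```

For example, with three classes and predicted probabilities (0.49, 0.49, 0.02) against complementary label 2, the URE loss is negative while the surrogate loss remains a small positive value, illustrating why the surrogate formulation behaves better as a training objective.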
~All are welcome~
Cordially invited by the Department of Statistics, National Taipei University
April 7, 2026
Attachment: talk abstract
