Paper Acceptance Announcement: "Transitioning Heads Conundrum: The Hidden Bottleneck in Long-Tailed Class-Incremental Learning" has been accepted at TMLR 2026 (Transactions on Machine Learning Research).
Authors: Rahul Vigneswaran K, Hari Chandana Kuchibhotla, Vineeth N Balasubramanian
Congratulations to all the authors!
Key Highlight: This work introduces DEREK (DEcoupling Representations for Early Knowledge Distillation), a method addressing a previously overlooked challenge in Long-Tailed Class-Incremental Learning (LTCIL): the Transitioning Heads Conundrum.
In LTCIL, head classes that are well-represented in earlier tasks become tail classes in subsequent tasks due to memory constraints, leading to accelerated catastrophic forgetting. DEREK mitigates this by decoupling head and tail learning via specialized expert networks and applying Early Knowledge Distillation before data constraints take effect, preserving rich representations.
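To make the "transitioning heads" phenomenon concrete, here is a minimal, hypothetical sketch (not the paper's code; the function name and numbers are illustrative assumptions): a class with abundant data in its own task retains only a small exemplar budget in every later task, so its effective sample count collapses from head-sized to tail-sized purely because of the replay-memory constraint.

```python
# Hypothetical illustration of the transitioning-heads effect in LTCIL.
# A class introduced in task 0 has its full training data while that task
# is current, but in all subsequent tasks only a fixed per-class exemplar
# budget from the replay buffer remains available.

def effective_counts(task_sizes, buffer_per_class):
    """Samples available per task for a class introduced in task 0."""
    counts = []
    for t, n in enumerate(task_sizes):
        if t == 0:
            counts.append(n)                 # head class: abundant data in its own task
        else:
            counts.append(buffer_per_class)  # later tasks: only replay exemplars remain
    return counts

# A class with 1000 images in task 0 drops to 20 exemplars thereafter,
# i.e. it transitions from head to tail due to memory constraints alone.
print(effective_counts([1000, 1000, 1000], buffer_per_class=20))
```

This is exactly the window DEREK targets: distilling knowledge early, while the class is still data-rich, before the buffer cap takes effect.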
Across 2 LTCIL benchmarks, 12 experimental settings, and 24 baselines, DEREK consistently establishes new state-of-the-art performance.