AlphaDecay: Module-wise Weight Decay for Heavy-Tailed Balancing in LLMs
Di He and 6 other authors
Abstract: Weight decay is a standard regularization technique for training large language models (LLMs). While it is common to assign a uniform decay rate to every layer, this approach overlooks the structural diversity of LLMs and the varying spectral properties across modules. In this paper, we introduce AlphaDecay, a simple yet effective method that adaptively assigns a different weight decay strength to each module of an LLM. Our approach is guided by Heavy-Tailed Self-Regularization (HT-SR) theory, which analyzes the empirical spectral density (ESD) of weight correlation matrices to quantify "heavy-tailedness." Modules exhibiting more pronounced heavy-tailed ESDs, reflecting stronger feature learning, are assigned weaker decay, while modules with lighter-tailed spectra receive stronger decay. By tailoring weight decay assignments to balance these module-wise spectral differences, our method improves performance. Extensive pre-training experiments on models ranging from 60M to 1B parameters demonstrate that AlphaDecay achieves better perplexity and generalization than conventional uniform decay and other adaptive decay baselines. Our code is available at this https URL.
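To make the module-wise assignment concrete, below is a minimal PyTorch sketch of the idea described in the abstract: estimate a heavy-tail exponent for each 2-D weight matrix from the ESD of its correlation matrix, then scale the per-module weight decay so that heavier-tailed modules (smaller exponent) receive weaker decay. The Hill-style tail estimator, the proportional scaling rule, and the function names (`tail_alpha`, `module_wise_decay`) are illustrative assumptions, not the paper's actual implementation, which may use more careful power-law fits and a different mapping from exponent to decay.

```python
import torch

def tail_alpha(weight: torch.Tensor, tail_frac: float = 0.1) -> float:
    """Estimate a heavy-tail exponent of the ESD of W^T W using a simple
    Hill-style estimator on the largest eigenvalues (illustrative only)."""
    W = weight.detach().float()
    # Eigenvalues of the weight correlation matrix are the squared singular values of W.
    eigs = torch.linalg.svdvals(W) ** 2
    eigs, _ = torch.sort(eigs, descending=True)
    k = max(2, int(tail_frac * eigs.numel()))
    tail = eigs[:k]
    # Hill estimator: alpha = 1 + k / sum(log(lambda_i / lambda_k)).
    alpha = 1.0 + k / torch.log(tail / tail[-1]).sum().clamp_min(1e-12)
    return float(alpha)

def module_wise_decay(model: torch.nn.Module, base_decay: float = 0.1):
    """Build optimizer parameter groups where each 2-D weight matrix gets a
    decay scaled by its estimated tail exponent: heavier-tailed modules
    (smaller alpha) receive weaker decay, lighter-tailed modules stronger."""
    matrices = [(n, p) for n, p in model.named_parameters()
                if p.requires_grad and p.ndim == 2]
    alphas = {n: tail_alpha(p) for n, p in matrices}
    mean_alpha = sum(alphas.values()) / len(alphas)
    groups = [{"params": [p],
               # Proportional scaling around the base decay (an assumption).
               "weight_decay": base_decay * alphas[n] / mean_alpha}
              for n, p in matrices]
    # Biases and other 1-D parameters: no decay, following common practice.
    rest = [p for p in model.parameters() if p.requires_grad and p.ndim != 2]
    groups.append({"params": rest, "weight_decay": 0.0})
    return groups

# Example usage with AdamW; decay strengths could also be refreshed periodically during training.
# optimizer = torch.optim.AdamW(module_wise_decay(model, base_decay=0.1), lr=3e-4)
```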
Submission history
From: Di He
[v1] Tue, 17 Jun 2025 14:21:10 UTC (2,991 KB)
[v2] Sun, 22 Jun 2025 12:57:32 UTC (2,991 KB)