Mixture of Low Rank Adaptation with Partial Parameter Sharing for Time Series Forecasting, by Licheng Pan and 7 other authors
Abstract: Multi-task forecasting has become the standard approach for time-series forecasting (TSF). However, we show that it suffers from an Expressiveness Bottleneck: predictions at different time steps share the same representation, leading to unavoidable errors even with optimal representations. To address this issue, we propose a two-stage framework: first, pre-train a foundation model for one-step-ahead prediction; then, adapt it using step-specific LoRA modules. This design enables the foundation model to handle any number of forecast steps while avoiding the expressiveness bottleneck. We further introduce the Mixture-of-LoRA (MoLA) model, which employs adaptively weighted LoRA experts to achieve partial parameter sharing across steps. This approach enhances both efficiency and forecasting performance by exploiting interdependencies between forecast steps. Experiments show that MoLA significantly improves model expressiveness and outperforms state-of-the-art time-series forecasting methods. Code is available at this https URL.
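The abstract describes step-specific LoRA adapters combined by adaptive weights over a shared set of experts. A minimal dependency-free sketch of that idea is below; the class name, the per-step softmax gating, and the initialization details are assumptions for illustration, not the authors' implementation.

```python
import math
import random

def matvec(M, x):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(m_ij * x_j for m_ij, x_j in zip(row, x)) for row in M]

def softmax(z):
    mx = max(z)
    e = [math.exp(v - mx) for v in z]
    s = sum(e)
    return [v / s for v in e]

class MoLALayer:
    """Hypothetical sketch: a frozen base weight W plus K low-rank experts
    (B_k @ A_k), mixed by step-dependent softmax gates so forecast steps
    partially share adapter parameters."""

    def __init__(self, d_in, d_out, rank, n_experts, n_steps, seed=0):
        rng = random.Random(seed)
        # Frozen pre-trained weight (stand-in for the foundation model layer).
        self.W = [[rng.gauss(0, 0.1) for _ in range(d_in)] for _ in range(d_out)]
        # Low-rank expert factors: A_k is (rank x d_in), B_k is (d_out x rank).
        self.A = [[[rng.gauss(0, 0.1) for _ in range(d_in)] for _ in range(rank)]
                  for _ in range(n_experts)]
        # B_k starts at zero so every adapter begins as a no-op (standard LoRA init).
        self.B = [[[0.0] * rank for _ in range(d_out)] for _ in range(n_experts)]
        # One gating-logit vector per forecast step (assumed parameterization).
        self.gate_logits = [[rng.gauss(0, 0.1) for _ in range(n_experts)]
                            for _ in range(n_steps)]

    def forward(self, x, step):
        y = matvec(self.W, x)                      # frozen base prediction
        gates = softmax(self.gate_logits[step])    # adaptive expert weights
        for g, A_k, B_k in zip(gates, self.A, self.B):
            delta = matvec(B_k, matvec(A_k, x))    # low-rank update B_k A_k x
            y = [y_i + g * d_i for y_i, d_i in zip(y, delta)]
        return y
```

Because the `B_k` factors start at zero, the layer initially reproduces the frozen base model exactly for every step; training would then move only the adapter and gate parameters, which is what lets one foundation model serve any number of forecast steps.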
Submission history
From: Licheng Pan [view email]
[v1] Fri, 23 May 2025 13:24:39 UTC (974 KB)
[v2] Tue, 27 May 2025 07:23:28 UTC (966 KB)