As new products emerge daily, recommendation systems must quickly adapt to
possible new domains without requiring extensive retraining.
This work presents “X-Cross” — a novel cross-domain
sequential-recommendation model that recommends products in new domains by
integrating several domain-specific language models; each model is fine-tuned
with low-rank adapters (LoRA). Given a recommendation prompt, X-Cross operates
layer by layer, dynamically refining the representation of each source
language model by integrating knowledge from all the other models. These refined
representations are propagated from one layer to the next, leveraging the
activations from each domain adapter to ensure domain-specific nuances are
preserved while enabling adaptability across domains. Using Amazon datasets for
sequential recommendation, X-Cross achieves performance comparable to a model
that is fine-tuned with LoRA, while using only 25% of the additional
parameters. In cross-domain tasks, such as adapting from the Toys domain to
Tools, Electronics, or Sports, X-Cross demonstrates robust performance while
requiring about 50%-75% less fine-tuning data than LoRA for fine-tuning to be
effective.
Furthermore, X-Cross achieves significant improvement in accuracy over
alternative cross-domain baselines. Overall, X-Cross enables scalable and
adaptive cross-domain recommendations, reducing computational overhead and
providing an efficient solution for data-constrained environments.
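The layer-by-layer refinement described above can be illustrated with a minimal sketch. This is not the paper's implementation: the hidden size, LoRA rank, number of domains, the uniform mixing weights, and the final averaging step are all placeholder assumptions chosen only to show the data flow, where each domain's per-layer LoRA activation is blended with the activations of all other domains before being propagated to the next layer.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, n_domains, n_layers = 8, 2, 3, 2  # hidden size, LoRA rank, domains, layers (illustrative)

# One LoRA adapter (A, B) per domain per layer: the adapter's update is x @ A @ B.
adapters = [
    [(rng.standard_normal((d, r)) * 0.1, rng.standard_normal((r, d)) * 0.1)
     for _ in range(n_domains)]
    for _ in range(n_layers)
]

# Cross-domain mixing weights per layer: how much each source model borrows
# from every other domain's activation. Uniform here purely as a placeholder;
# in practice such weights would be learned.
mix = np.full((n_layers, n_domains, n_domains), 1.0 / n_domains)

def forward(x):
    """Propagate one prompt representation layer by layer, refining each
    domain's activation with a weighted blend of all domains' LoRA outputs."""
    h = np.tile(x, (n_domains, 1))  # one representation per source model
    for layer in range(n_layers):
        # Each domain adapter produces its activation for its own representation.
        acts = np.stack([h[i] + h[i] @ A @ B
                         for i, (A, B) in enumerate(adapters[layer])])
        # Cross-domain refinement: blend activations across all domains,
        # then propagate the refined representations to the next layer.
        h = mix[layer] @ acts
    return h.mean(axis=0)  # fused representation (placeholder fusion step)

out = forward(rng.standard_normal(d))
print(out.shape)  # → (8,)
```

The key point of the sketch is that the extra trainable state is only the small mixing weights on top of the frozen domain adapters, which is consistent with the parameter-efficiency claim in the abstract.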