SANER 2025

Programming languages can benefit from one another through utilizing a pre-trained model for software engineering tasks such as code summarization and method name prediction. Such benefits have been shown in empirical studies on multilingual fine-tuning of Code Language Models (Code-LMs). Although these studies used full fine-tuning, Parameter-Efficient Fine-Tuning (PEFT) has been shown to be a promising alternative for many software engineering tasks in other studies. However, there is limited research on knowledge transfer among languages through PEFT architectures that utilize multiple languages. AdapterFusion is one such architecture: it enhances the performance of a target task in a specific language by learning from similar latent information in other languages. However, our empirical studies show that AdapterFusion primarily attends to the language of the target task, even though multiple languages are involved. We therefore propose AdvFusion, a PEFT-based approach that effectively learns useful information from other languages. AdvFusion first learns from the other programming languages, and only then learns from the language of the target task, adapting the previously learned knowledge to it. We choose two commonly used downstream tasks, code summarization and method name prediction, to evaluate our proposed approach. Our experiments show that AdvFusion outperforms AdapterFusion by up to 1.7 points on the benchmark datasets, and exceeds LoRA by 1.99, 1.26, and 2.16 points for Ruby, JavaScript, and Go, respectively. We open-source our scripts for replication purposes.
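To make the two-phase idea concrete, below is a minimal PyTorch sketch of an AdapterFusion-style layer with a masking step standing in for AdvFusion's first phase. All names, sizes, and the exact masking scheme (BottleneckAdapter, AdvFusionLayer, the bottleneck width, the score masking) are illustrative assumptions, not the paper's actual implementation: in phase one the target-language adapter is masked out of the fusion attention so the layer can only draw on the other languages' adapters; in phase two the mask is lifted and the target language participates.

```python
# A minimal sketch of the two-phase AdvFusion idea described in the abstract.
# Assumptions (not from the paper's released code): adapter design, hidden
# size, and the masking scheme below are illustrative stand-ins.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BottleneckAdapter(nn.Module):
    """One pre-trained per-language adapter (down-project, ReLU, up-project)."""
    def __init__(self, hidden: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden, bottleneck)
        self.up = nn.Linear(bottleneck, hidden)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        return h + self.up(F.relu(self.down(h)))  # residual adapter output

class AdvFusionLayer(nn.Module):
    """Attention over per-language adapter outputs, AdapterFusion-style."""
    def __init__(self, hidden: int, languages: list[str]):
        super().__init__()
        self.languages = languages
        self.adapters = nn.ModuleDict({l: BottleneckAdapter(hidden) for l in languages})
        self.query = nn.Linear(hidden, hidden)
        self.key = nn.Linear(hidden, hidden)
        self.value = nn.Linear(hidden, hidden)

    def forward(self, h: torch.Tensor, mask_lang: str | None = None) -> torch.Tensor:
        # Stack every language adapter's output: (batch, seq, n_lang, hidden).
        outs = torch.stack([self.adapters[l](h) for l in self.languages], dim=2)
        q = self.query(h).unsqueeze(2)                 # (B, S, 1, H)
        k, v = self.key(outs), self.value(outs)        # (B, S, L, H)
        scores = (q * k).sum(-1) / h.size(-1) ** 0.5   # (B, S, L)
        if mask_lang is not None:
            # Phase 1: hide the target language so fusion must learn
            # from the *other* languages' adapters first.
            scores[..., self.languages.index(mask_lang)] = float("-inf")
        attn = scores.softmax(dim=-1).unsqueeze(-1)    # (B, S, L, 1)
        return h + (attn * v).sum(dim=2)               # fused residual

# Two-phase schedule sketched from the abstract, for a Ruby target task:
layer = AdvFusionLayer(hidden=768, languages=["ruby", "javascript", "go", "python"])
h = torch.randn(2, 16, 768)                 # dummy hidden states (B, S, H)
out_phase1 = layer(h, mask_lang="ruby")     # learn from the other languages
out_phase2 = layer(h)                       # then adapt with the target language
```

Under these assumptions, the contrast with plain AdapterFusion is just the schedule: AdapterFusion trains the fusion attention over all adapters at once (and, per our observations, collapses onto the target language's adapter), whereas the masked first phase forces the attention weights to extract useful signal from the other languages before the target-language adapter is reintroduced.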