SANER 2025
Tue 4 - Fri 7 March 2025 Montréal, Québec, Canada
Fri 7 Mar 2025 11:00 - 11:15 at L-1710 - Change Management & Program Comprehension Chair(s): Masud Rahman

Programming languages can benefit from one another through utilizing a pre-trained model for software engineering tasks such as code summarization and method name prediction. Such benefits were shown in empirical studies on multilingual fine-tuning of Code Language Models (Code-LM). Though these studies used full fine-tuning, Parameter Efficient Fine-Tuning (PEFT) has been shown to be a promising alternative approach for many software engineering tasks in other studies. However, there is limited research on knowledge transfer among languages through PEFT architectures that utilize multiple languages. AdapterFusion is one such architecture that enhances the performance of a target task in a specific language by learning from similar latent information from other languages. However, our empirical studies show that AdapterFusion primarily learns from the same language as the target task, even though multiple languages are involved. We therefore propose AdvFusion, an innovative PEFT-based approach that can effectively learn useful information from other languages. AdvFusion first learns the knowledge of other programming languages, before learning from the language corresponding to the target task and adapting the previously learned knowledge to it. We choose two commonly used downstream tasks, code summarization and method name prediction, to evaluate our proposed approach. Our experiments show that AdvFusion outperforms AdapterFusion by up to 1.7 points on the benchmark datasets, and exceeds LoRA by 1.99, 1.26, and 2.16 points for Ruby, JavaScript, and Go, respectively. We open source our scripts for replication purposes.
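The abstract describes an AdapterFusion-style mechanism: a token representation attends over the outputs of per-language adapters, and AdvFusion's first phase withholds the target-language adapter so the fusion layer is forced to draw on other languages. The sketch below is a hypothetical illustration of that idea in plain numpy, not the authors' implementation; the function and parameter names (`fuse`, `Wq`, `Wk`, `mask`) are invented for this example, and the real architecture operates per-layer inside a transformer.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def fuse(hidden, adapter_outs, Wq, Wk, mask=None):
    """Attention-style fusion over per-language adapter outputs.

    hidden:       (d,)   token representation, used as the query
    adapter_outs: (n, d) outputs of n language adapters (keys and values)
    Wq, Wk:       (d, d) learned query/key projections
    mask:         optional boolean (n,); False entries are excluded,
                  e.g. masking the target-language adapter in AdvFusion's
                  first training phase so only other languages contribute
    """
    q = Wq @ hidden                  # project the query: (d,)
    scores = adapter_outs @ (Wk @ q) # one score per adapter: (n,)
    if mask is not None:
        scores = np.where(mask, scores, -np.inf)  # drop masked adapters
    attn = softmax(scores)           # attention weights sum to 1
    return attn @ adapter_outs, attn # weighted mix of adapter outputs
```

In this reading, phase one trains the fusion weights with the target-language adapter masked out, and phase two unmasks it and continues training on the target task.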

Fri 7 Mar

Displayed time zone: Eastern Time (US & Canada)

11:00 - 12:30
11:00
15m
Talk
AdvFusion: Adapter-based Knowledge Transfer for Code Summarization on Code Language Models (Best Paper Award)
Research Papers
Iman Saberi University of British Columbia Okanagan, Amirreza Esmaeili University of British Columbia, Fatemeh Hendijani Fard University of British Columbia, Chen Fuxiang University of Leicester
11:15
15m
Talk
EarlyPR: Early Prediction of Potential Pull-Requests from Forks
Research Papers
XiangChen Wu, Liang Wang Nanjing University, Xianping Tao Nanjing University
11:30
15m
Talk
The Hidden Challenges of Merging: A Tool-Based Exploration
Research Papers
Luciana Gomes UFCG, Melina Mongiovi Federal University of Campina Grande, Brazil, Sabrina Souto UEPB, Everton L. G. Alves Federal University of Campina Grande
11:45
7m
Talk
On the Performance of Large Language Models for Code Change Intent Classification
Early Research Achievement (ERA) Track
Issam Oukay Department of Software and IT Engineering, ETS Montreal, University of Quebec, Montreal, Canada, Moataz Chouchen Department of Electrical and Computer Engineering, Concordia University, Montreal, Canada, Ali Ouni ETS Montreal, University of Quebec, Fatemeh Hendijani Fard University of British Columbia
11:52
15m
Talk
Revisiting Method-Level Change Prediction: Comparative Evaluation at Different Granularities
Reproducibility Studies and Negative Results (RENE) Track
Hiroto Sugimori School of Computing, Institute of Science Tokyo, Shinpei Hayashi Institute of Science Tokyo