AdvFusion: Adapter-based Knowledge Transfer for Code Summarization on Code Language Models
Programming languages can benefit from one another through utilizing a pre-trained model for software engineering tasks such as code summarization and method name prediction. Such benefits were shown in empirical studies on multilingual fine-tuning of Code Language Models (Code-LMs). Though these studies used full fine-tuning, Parameter Efficient Fine-Tuning (PEFT) has been shown to be a promising alternative for many software engineering tasks in other studies. However, there is limited research on knowledge transfer among languages through PEFT architectures that utilize multiple languages. AdapterFusion is one such architecture: it enhances the performance of a target task in a specific language by learning from similar latent information in other languages. However, our empirical studies show that AdapterFusion primarily learns from the same language as the target task, even though multiple languages are involved. We therefore propose AdvFusion, a novel PEFT-based approach that can effectively learn useful information from other languages. AdvFusion first learns the knowledge of other programming languages, and only then learns from the language of the target task, adapting the previously learned knowledge to it. We choose two commonly used downstream tasks, code summarization and method name prediction, to evaluate our proposed approach. Our experiments show that AdvFusion outperforms AdapterFusion by up to a 1.7-point increase on the benchmark datasets, and exceeds LoRA by 1.99, 1.26, and 2.16 points for Ruby, JavaScript, and Go, respectively. We open-source our scripts for replication purposes.
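The mechanism the abstract describes can be pictured as an attention layer that mixes the outputs of per-language adapters, where AdvFusion's first phase holds out the target-language adapter so the fusion weights must exploit the other languages. Below is a minimal PyTorch sketch of this idea; the class names, bottleneck size, and masking scheme are illustrative assumptions of ours, not the authors' released implementation.

import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Standard bottleneck adapter: down-project, nonlinearity, up-project, residual."""
    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.GELU()

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        return h + self.up(self.act(self.down(h)))

class FusionLayer(nn.Module):
    """AdapterFusion-style attention over the outputs of several language adapters."""
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim, hidden_dim)
        self.value = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, h, adapter_outs, mask=None):
        # h: (batch, seq, hidden); adapter_outs: (batch, seq, num_adapters, hidden)
        q = self.query(h).unsqueeze(2)                    # (B, S, 1, H)
        k = self.key(adapter_outs)                        # (B, S, N, H)
        v = self.value(adapter_outs)                      # (B, S, N, H)
        scores = (q * k).sum(-1) / k.size(-1) ** 0.5      # (B, S, N)
        if mask is not None:
            # AdvFusion-style phase 1: hide the target-language adapter so the
            # fusion weights are forced to attend to the other languages.
            scores = scores.masked_fill(~mask, float("-inf"))
        attn = scores.softmax(-1)                         # (B, S, N)
        return h + (attn.unsqueeze(-1) * v).sum(2)        # (B, S, H)

# Toy usage: four language adapters; mask out index 0 (say, Ruby) in phase 1.
B, S, H, N = 2, 16, 768, 4
adapters = nn.ModuleList(BottleneckAdapter(H) for _ in range(N))
fusion = FusionLayer(H)
h = torch.randn(B, S, H)
adapter_outs = torch.stack([a(h) for a in adapters], dim=2)   # (B, S, N, H)
phase1_mask = torch.tensor([False, True, True, True]).expand(B, S, N)
out = fusion(h, adapter_outs, mask=phase1_mask)               # (B, S, H)

In a second phase the mask would be lifted so the fusion layer can also attend to the target-language adapter, matching the two-stage schedule described in the abstract.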
Fri 7 Mar (all times in Eastern Time, US & Canada)
11:00 - 12:30 | Change Management & Program Comprehension
Reproducibility Studies and Negative Results (RENE) Track / Research Papers / Early Research Achievement (ERA) Track at L-1710
Chair(s): Masud Rahman (Dalhousie University)

11:00 | 15m Talk | AdvFusion: Adapter-based Knowledge Transfer for Code Summarization on Code Language Models (Research Papers)
Iman Saberi (University of British Columbia Okanagan), Amirreza Esmaeili (University of British Columbia), Fatemeh Hendijani Fard (University of British Columbia), Chen Fuxiang (University of Leicester)

11:15 | 15m Talk | EarlyPR: Early Prediction of Potential Pull-Requests from Forks (Research Papers)

11:30 | 15m Talk | The Hidden Challenges of Merging: A Tool-Based Exploration (Research Papers)
Luciana Gomes (UFCG), Melina Mongiovi (Federal University of Campina Grande, Brazil), Sabrina Souto (UEPB), Everton L. G. Alves (Federal University of Campina Grande)

11:45 | 7m Talk | On the Performance of Large Language Models for Code Change Intent Classification (Early Research Achievement (ERA) Track)
Issam Oukay (Department of Software and IT Engineering, ETS Montreal, University of Quebec, Montreal, Canada), Moataz Chouchen (Department of Electrical and Computer Engineering, Concordia University, Montreal, Canada), Ali Ouni (ETS Montreal, University of Quebec), Fatemeh Hendijani Fard (University of British Columbia)

11:52 | 15m Talk | Revisiting Method-Level Change Prediction: Comparative Evaluation at Different Granularities (Reproducibility Studies and Negative Results (RENE) Track)
Hiroto Sugimori (School of Computing, Institute of Science Tokyo), Shinpei Hayashi (Institute of Science Tokyo) | DOI · Pre-print