Mixture-of-Experts Low-Rank Adaptation for Multilingual Code Summarization
This program is tentative and subject to change.
As Code Language Models (CLMs) are increasingly used to automate multilingual code intelligence tasks, Full-Parameter Fine-Tuning (FPFT) of CLMs has become a widely adopted approach; however, it is both time-consuming and resource-intensive. Parameter-Efficient Fine-Tuning (PEFT) offers a more efficient alternative to FPFT, but it struggles to capture common features shared across languages, leading to performance degradation. Recent studies have explored mixed-language training with PEFT to avoid this loss of common features, yet such methods can suffer from gradient conflicts caused by diverse language-specific features, yielding suboptimal performance, particularly for low-resource languages. In this paper, we propose Mixture-of-Experts Multilingual Low-Rank Adaptation (MMLoRA). MMLoRA mitigates gradient conflicts while preserving the common features shared across languages by combining a universal expert with a set of specialized linguistic experts. Additionally, we introduce an expert loss function that maintains the diversity of the specialized linguistic experts while balancing their learning progress. Experimental results indicate that MMLoRA achieves state-of-the-art performance in multilingual code summarization while maintaining efficient fine-tuning, with particularly significant improvements on low-resource languages such as Ruby.
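For illustration only, the sketch below shows one way the layer described in the abstract could look in PyTorch: a frozen base projection combined with a shared universal LoRA expert and per-language LoRA experts. The class name, rank, scaling, and hard routing by language tag are assumptions made for this example, not the paper's implementation, and the expert loss function is not shown.

```python
# Minimal sketch (PyTorch) of a LoRA layer combining a shared "universal"
# expert with per-language experts. Names, rank, scaling, and the hard routing
# by language tag are illustrative assumptions; the expert loss is omitted.
import torch
import torch.nn as nn


class MoELoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, languages, rank=8, alpha=16):
        super().__init__()
        self.base = base
        for p in self.base.parameters():      # keep the pretrained weights frozen
            p.requires_grad_(False)
        self.scaling = alpha / rank
        d_in, d_out = base.in_features, base.out_features

        def lora_pair():
            # Low-rank A/B pair; B starts at zero so each expert is a no-op at init.
            return nn.ParameterDict({
                "A": nn.Parameter(torch.randn(rank, d_in) * 0.01),
                "B": nn.Parameter(torch.zeros(d_out, rank)),
            })

        self.universal = lora_pair()          # shared across all languages
        self.experts = nn.ModuleDict({lang: lora_pair() for lang in languages})

    def forward(self, x, lang):
        # Frozen base projection plus the universal and language-specific low-rank updates.
        out = self.base(x)
        for pair in (self.universal, self.experts[lang]):
            out = out + (x @ pair["A"].T) @ pair["B"].T * self.scaling
        return out


# Example: wrap one projection of a code LM and route a Ruby batch to its expert.
layer = MoELoRALinear(nn.Linear(768, 768), languages=["python", "java", "ruby"])
summary_hidden = layer(torch.randn(2, 16, 768), lang="ruby")
```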
Mon 17 Nov (displayed time zone: Seoul)
Session: 14:00 - 15:30

| Time | Type | Title | Track | Authors |
| --- | --- | --- | --- | --- |
| 14:00 | 10m Talk | QuanBench: Benchmarking Quantum Code Generation with Large Language Models | Research Papers | |
| 14:10 | 10m Talk | Token Sugar: Making Source Code Sweeter for LLMs through Token-Efficient Shorthand | Research Papers | Zhensu Sun (Singapore Management University), Chengran Yang (Singapore Management University, Singapore), Xiaoning Du (Monash University), Zhou Yang (University of Alberta, Alberta Machine Intelligence Institute), Li Li (Beihang University), David Lo (Singapore Management University) |
| 14:20 | 10m Talk | FGIT: Fault-Guided Fine-Tuning for Code Generation | Research Papers | Lishui Fan (Zhejiang University), Zhongxin Liu (Zhejiang University), Haoye Wang (Hangzhou City University), Lingfeng Bao (Zhejiang University), Xin Xia (Zhejiang University), Shanping Li (Zhejiang University) |
| 14:30 | 10m Talk | Mixture-of-Experts Low-Rank Adaptation for Multilingual Code Summarization | Research Papers | Tianchen Yu (School of Software Engineering, South China University of Technology), Li Yuan (School of Software Engineering, South China University of Technology, Guangzhou, China), Hailin Huang (South China University of Technology), Jiexin Wang (South China University of Technology), Yi Cai (School of Software Engineering, South China University of Technology, Guangzhou, China) |
| 14:40 | 10m Talk | EfficientEdit: Accelerating Code Editing via Edit-Oriented Speculative Decoding | Research Papers | Peiding Wang (Beihang University), Li Zhang (Beihang University), Fang Liu (Beihang University), Yinghao Zhu (Beihang University), Wang Xu (Tsinghua University), Lin Shi (Beihang University), Xiaoli Lian (Beihang University, China), Minxiao Li (Beihang University), Bo Shen (Huawei Cloud Computing Technologies Co., Ltd.), Binzhang Fu (Huawei Technologies), n.n. Pre-print available |
| 14:50 | 10m Talk | Bias Testing and Mitigation in LLM-based Code Generation | Journal-First Track | Dong Huang (The University of Hong Kong), Jie M. Zhang (King's College London), Qingwen Bu (Shanghai Jiao Tong University), Xiaofei Xie (Singapore Management University), Junjie Chen (Tianjin University), Heming Cui (University of Hong Kong) |
| 15:00 | 10m Talk | FastCoder: Accelerating Repository-level Code Generation via Efficient Retrieval and Verification | Research Papers | Qianhui Zhao (Beihang University), Li Zhang (Beihang University), Fang Liu (Beihang University), Xiaoli Lian (Beihang University, China), Meng Qiaoyuanhe (Beihang University), Ziqian Jiao (Beihang University), Zetong Zhou (Beihang University), Jia Li, Lin Shi (Beihang University). Pre-print available |
| 15:10 | 10m Talk | AlignCoder: Aligning Retrieval with Target Intent for Repository-Level Code Completion | Research Papers | Tianyue Jiang (Sun Yat-sen University), Yanli Wang (Sun Yat-sen University), Yanlin Wang (Sun Yat-sen University), Daya Guo, Ensheng Shi (Huawei), Yuchi Ma (Huawei Cloud Computing Technologies), Jiachi Chen (Sun Yat-sen University), Zibin Zheng (Sun Yat-sen University) |
| 15:20 | 10m Talk | Effectiveness of symmetric metamorphic relations on validating the stability of code generation LLM | Journal-First Track | Chan Pak Yuen (Department of Computer Science, City University of Hong Kong, Kowloon, Hong Kong, China), Jacky Keung (City University of Hong Kong), Zhen Yang (Shandong University) |