ICSE 2023
Sun 14 - Sat 20 May 2023 Melbourne, Australia
Wed 17 May 2023 11:00 - 11:15 at Level G - Plenary Room 1 - AI models for SE Chair(s): Denys Poshyvanyk

As pre-trained models automate many code intelligence tasks, a widely used paradigm is to fine-tune a model on the task dataset for each programming language. A recent study reported that multilingual fine-tuning benefits a range of tasks and models. However, we find that multilingual fine-tuning leads to performance degradation on the recent models UniXcoder and CodeT5.

To alleviate the potential catastrophic forgetting issue in multilingual models, we fix all pre-trained model parameters, insert parameter-efficient adapter structures, and fine-tune only the adapters. Compared with full-model fine-tuning for each programming language, adapter tuning updates only 0.6% of the overall parameters yet yields consistent improvements on code search and summarization tasks, achieving state-of-the-art results. In addition, we experimentally show its effectiveness in cross-lingual and low-resource scenarios. On code summarization, multilingual fine-tuning with only 200 samples per programming language approaches the results of fine-tuning with the entire dataset. Our experiments on three probing tasks show that adapter tuning significantly outperforms full-model fine-tuning and effectively overcomes catastrophic forgetting.
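
The adapter recipe described above can be pictured with a short sketch: freeze every pre-trained weight, insert a small bottleneck module after each Transformer layer, and train only those modules. This is an illustrative sketch of the general technique, not the authors' implementation; the bottleneck size, the hook-based placement, and the assumption of a RoBERTa-style encoder (as in the public microsoft/unixcoder-base checkpoint) are our own choices.

```python
# A minimal adapter-tuning sketch in PyTorch: freeze the pre-trained model and
# train only small bottleneck modules inserted after each Transformer layer.
import torch
import torch.nn as nn


class Adapter(nn.Module):
    """Bottleneck adapter: down-project, non-linearity, up-project, residual add."""

    def __init__(self, hidden_size: int, bottleneck_size: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck_size)
        self.up = nn.Linear(bottleneck_size, hidden_size)
        self.act = nn.GELU()
        # Near-identity initialization keeps the frozen model's behavior at the start.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        return hidden_states + self.up(self.act(self.down(hidden_states)))


def add_adapters_and_freeze(model: nn.Module, hidden_size: int) -> nn.Module:
    # Freeze every pre-trained parameter.
    for p in model.parameters():
        p.requires_grad = False
    # Attach one adapter after each encoder layer via a forward hook.
    # Assumes a BERT/RoBERTa-style `encoder.layer` structure.
    for layer in model.encoder.layer:
        layer.adapter = Adapter(hidden_size)  # registered as a trainable submodule
        layer.register_forward_hook(
            lambda module, inputs, output: (module.adapter(output[0]),) + output[1:]
        )
    return model


if __name__ == "__main__":
    from transformers import AutoModel

    model = AutoModel.from_pretrained("microsoft/unixcoder-base")
    model = add_adapters_and_freeze(model, hidden_size=model.config.hidden_size)

    trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
    total = sum(p.numel() for p in model.parameters())
    print(f"trainable parameters: {trainable}/{total} ({100 * trainable / total:.2f}%)")

    # Only the adapter parameters reach the optimizer.
    optimizer = torch.optim.AdamW(
        (p for p in model.parameters() if p.requires_grad), lr=1e-4
    )
```

Because only the adapter weights receive gradients, one frozen copy of the backbone can be shared while a small adapter is kept per task or per language, which is what makes the approach attractive for multilingual settings.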

Wed 17 May

Displayed time zone: Hobart

11:00 - 12:30
11:00
15m
Talk
One Adapter for All Programming Languages? Adapter Tuning for Multilingual Tasks in Software Engineering
Technical Track
Deze Wang National University of Defense Technology, Boxing Chen, Shanshan Li National University of Defense Technology, Wei Luo, Shaoliang Peng Hunan University, Wei Dong School of Computer, National University of Defense Technology, China, Liao Xiangke National University of Defense Technology
11:15
15m
Talk
CCRep: Learning Code Change Representations via Pre-Trained Code Model and Query Back
Technical Track
Zhongxin Liu Zhejiang University, Zhijie Tang Zhejiang University, Xin Xia Huawei, Xiaohu Yang Zhejiang University
Pre-print
11:30
15m
Talk
Keeping Pace with Ever-Increasing Data: Towards Continual Learning of Code Intelligence Models
Technical Track
Shuzheng Gao Harbin Institute of Technology, Hongyu Zhang The University of Newcastle, Cuiyun Gao Harbin Institute of Technology, Chaozheng Wang Harbin Institute of Technology
11:45
7m
Talk
PCR-Chain: Partial Code Reuse Assisted by Hierarchical Chaining of Prompts on Frozen Copilot
DEMO - Demonstrations
Qing Huang School of Computer Information Engineering, Jiangxi Normal University, Jiahui Zhu School of Computer Information Engineering, Jiangxi Normal University, Zhilong Li School of Computer Information Engineering, Jiangxi Normal University, Zhenchang Xing, Changjing Wang School of Computer Information Engineering, Jiangxi Normal University, Xiwei (Sherry) Xu CSIRO’s Data61
11:52
7m
Talk
Towards Learning Generalizable Code Embeddings using Task-agnostic Graph Convolutional Networks
Journal-First Papers
Zishuo Ding Concordia University, Heng Li Polytechnique Montréal, Weiyi Shang University of Waterloo, Tse-Hsun (Peter) Chen Concordia University
12:00
7m
Talk
deGraphCS: Embedding Variable-based Flow Graph for Neural Code Search
Journal-First Papers
Chen Zeng National University of Defense Technology, Yue Yu College of Computer, National University of Defense Technology, Changsha 410073, China, Shanshan Li National University of Defense Technology, Xin Xia Huawei, Wang Zhiming National University of Defense Technology, Mingyang Geng National University of Defense Technology, Linxiao Bai National University of Defense Technology, Wei Dong School of Computer, National University of Defense Technology, China, Liao Xiangke National University of Defense Technology
12:07
7m
Talk
CodeS: Towards Code Model Generalization Under Distribution Shift
NIER - New Ideas and Emerging Results
Qiang Hu University of Luxembourg, Yuejun Guo University of Luxembourg, Xiaofei Xie Singapore Management University, Maxime Cordy University of Luxembourg, Luxembourg, Lei Ma University of Alberta, Mike Papadakis University of Luxembourg, Luxembourg, Yves Le Traon University of Luxembourg, Luxembourg
12:15
7m
Talk
Towards using Few-Shot Prompt Learning for Automating Model Completion
NIER - New Ideas and Emerging Results
Meriem Ben Chaaben Université de Montréal, DIRO, Lola Burgueño University of Malaga, Houari Sahraoui Université de Montréal