Keeping Pace with Ever-Increasing Data: Towards Continual Learning of Code Intelligence Models
Previous research on code intelligence usually trains a deep learning model on a fixed dataset in an offline manner. However, in real-world scenarios, new code repositories emerge incessantly, and the new knowledge they carry can help provide up-to-date code intelligence services to developers. In this paper, we address the following problem: how can code intelligence models continually learn from ever-increasing data? One major challenge is catastrophic forgetting, meaning that the model easily forgets knowledge learned from previous datasets when learning from a new dataset. To tackle this challenge, we propose REPEAT, a novel method for continual learning of code intelligence models. Specifically, REPEAT addresses catastrophic forgetting with representative exemplar replay and adaptive parameter regularization. The representative exemplar replay component selects informative and diverse exemplars from each dataset and uses them to retrain the model periodically. The adaptive parameter regularization component identifies important parameters in the model and adaptively penalizes changes to them, preserving previously learned knowledge. We evaluate the proposed approach on three code intelligence tasks: code summarization, software vulnerability detection, and code clone detection. Extensive experiments demonstrate that REPEAT consistently outperforms baseline methods on all tasks. For example, REPEAT improves the conventional fine-tuning method by 5.9%, 20.1%, and 2.0% on code summarization, vulnerability detection, and clone detection, respectively.
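The following is a minimal sketch, not the authors' implementation, of how the two ideas in the abstract (replaying a small exemplar buffer and penalizing changes to important parameters) can be combined in a continual training loop. The model, datasets, budget, lambda, and the uniform exemplar-selection heuristic are hypothetical placeholders; REPEAT itself uses an informativeness- and diversity-aware selection and its own importance estimate.

import torch
import torch.nn.functional as F
from torch.utils.data import ConcatDataset, DataLoader, Subset


def select_exemplars(dataset, budget=200):
    """Keep a small replay buffer from a finished task.
    REPEAT selects informative and diverse exemplars; a uniform random
    subset stands in for that selection strategy in this sketch."""
    idx = torch.randperm(len(dataset))[:budget].tolist()
    return Subset(dataset, idx)


def estimate_importance(model, loader):
    """Approximate per-parameter importance with squared gradients
    (a diagonal-Fisher-style estimate), one tensor per parameter."""
    importance = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    model.eval()
    for inputs, labels in loader:
        model.zero_grad()
        F.cross_entropy(model(inputs), labels).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                importance[n] += p.grad.detach() ** 2
    return {n: v / max(len(loader), 1) for n, v in importance.items()}


def train_on_new_task(model, new_data, replay_buffer, old_params, importance,
                      lam=10.0, epochs=1, lr=1e-4):
    """Fine-tune on the new dataset mixed with replayed exemplars while
    penalizing drift of important parameters away from their old values."""
    loader = DataLoader(ConcatDataset([new_data, replay_buffer]),
                        batch_size=32, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for inputs, labels in loader:
            task_loss = F.cross_entropy(model(inputs), labels)
            reg = sum((importance[n] * (p - old_params[n]) ** 2).sum()
                      for n, p in model.named_parameters())
            loss = task_loss + lam * reg
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return model


# Usage sketch: after finishing task t, snapshot parameters, estimate their
# importance, keep a small exemplar buffer, then continue on task t+1.
# `model`, `task_t_data`, and `task_t1_data` are assumed to exist.
# old_params = {n: p.detach().clone() for n, p in model.named_parameters()}
# importance = estimate_importance(model, DataLoader(task_t_data, batch_size=32))
# replay_buffer = select_exemplars(task_t_data, budget=200)
# model = train_on_new_task(model, task_t1_data, replay_buffer, old_params, importance)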
Wed 17 May (displayed time zone: Hobart)
11:00 - 12:30 | AI models for SE
Journal-First Papers / Technical Track / DEMO - Demonstrations / NIER - New Ideas and Emerging Results
Level G - Plenary Room 1. Chair(s): Denys Poshyvanyk (College of William and Mary)
11:00 | 15m Talk | One Adapter for All Programming Languages? Adapter Tuning for Multilingual Tasks in Software Engineering (Technical Track). Deze Wang (National University of Defense Technology), Boxing Chen, Shanshan Li (National University of Defense Technology), Wei Luo, Shaoliang Peng (Hunan University), Wei Dong (School of Computer, National University of Defense Technology, China), Liao Xiangke (National University of Defense Technology)
11:15 | 15m Talk | CCRep: Learning Code Change Representations via Pre-Trained Code Model and Query Back (Technical Track). Zhongxin Liu (Zhejiang University), Zhijie Tang (Zhejiang University), Xin Xia (Huawei), Xiaohu Yang (Zhejiang University). Pre-print
11:30 | 15m Talk | Keeping Pace with Ever-Increasing Data: Towards Continual Learning of Code Intelligence Models (Technical Track). Shuzheng Gao (Harbin Institute of Technology), Hongyu Zhang (The University of Newcastle), Cuiyun Gao (Harbin Institute of Technology), Chaozheng Wang (Harbin Institute of Technology)
11:45 | 7m Talk | PCR-Chain: Partial Code Reuse Assisted by Hierarchical Chaining of Prompts on Frozen Copilot (DEMO - Demonstrations). Qing Huang (School of Computer Information Engineering, Jiangxi Normal University), Jiahui Zhu (School of Computer Information Engineering, Jiangxi Normal University), Zhilong Li (School of Computer Information Engineering, Jiangxi Normal University), Zhenchang Xing, Changjing Wang (School of Computer Information Engineering, Jiangxi Normal University), Xiwei (Sherry) Xu (CSIRO’s Data61)
11:52 | 7m Talk | Towards Learning Generalizable Code Embeddings using Task-agnostic Graph Convolutional Networks (Journal-First Papers). Zishuo Ding (Concordia University), Heng Li (Polytechnique Montréal), Weiyi Shang (University of Waterloo), Tse-Hsun (Peter) Chen (Concordia University)
12:00 | 7m Talk | deGraphCS: Embedding Variable-based Flow Graph for Neural Code Search (Journal-First Papers). Chen Zeng (National University of Defense Technology), Yue Yu (College of Computer, National University of Defense Technology, Changsha 410073, China), Shanshan Li (National University of Defense Technology), Xin Xia (Huawei), Wang Zhiming (National University of Defense Technology), Mingyang Geng (National University of Defense Technology), Linxiao Bai (National University of Defense Technology), Wei Dong (School of Computer, National University of Defense Technology, China), Liao Xiangke (National University of Defense Technology)
12:07 | 7m Talk | CodeS: Towards Code Model Generalization Under Distribution Shift (NIER - New Ideas and Emerging Results). Qiang Hu (University of Luxembourg), Yuejun Guo (University of Luxembourg), Xiaofei Xie (Singapore Management University), Maxime Cordy (University of Luxembourg), Lei Ma (University of Alberta), Mike Papadakis (University of Luxembourg), Yves Le Traon (University of Luxembourg)
12:15 | 7m Talk | Towards using Few-Shot Prompt Learning for Automating Model Completion (NIER - New Ideas and Emerging Results). Meriem Ben Chaaben (Université de Montréal, DIRO), Lola Burgueño (University of Malaga), Houari Sahraoui (Université de Montréal)