ICPC 2022
Mon 16 - Tue 17 May 2022
co-located with ICSE 2022
Sun 15 May 2022 22:37 - 22:44 at ICPC room - Session 2: Program Representation 1 Chair(s): Fatemeh Hendijani Fard

Pre-trained neural Language Models (PTLM), such as CodeBERT, have recently been used in software engineering as models pre-trained on large source code corpora. Their knowledge is transferred to downstream tasks (e.g., code clone detection) via fine-tuning. In natural language processing (NLP), other alternatives for transferring the knowledge of PTLMs have been explored through adapters: compact, parameter-efficient modules inserted into the layers of the PTLM. Although adapters are known to ease adaptation to many downstream tasks compared to fine-tuning, which requires retraining all of the model's parameters, thanks to their plug-and-play nature and parameter efficiency, their usage in software engineering has not been explored.
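
As a rough illustration of the adapter mechanism described above (not necessarily the authors' exact setup), the sketch below uses the adapter-transformers library from AdapterHub, which is an assumption, to insert a Pfeiffer-style bottleneck adapter into RoBERTa and train only the adapter and task head; the checkpoint and adapter names are placeholders.

```python
# Minimal sketch: adapter-based tuning instead of full fine-tuning,
# assuming the adapter-transformers (AdapterHub) fork of HuggingFace Transformers.
from transformers import AdapterConfig, RobertaModelWithHeads, RobertaTokenizer

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = RobertaModelWithHeads.from_pretrained("roberta-base")

# Insert a Pfeiffer-style bottleneck adapter into every transformer layer.
adapter_config = AdapterConfig.load("pfeiffer", reduction_factor=16)
model.add_adapter("clone_detection", config=adapter_config)

# Attach a small classification head for the downstream task
# (binary clone / non-clone decision).
model.add_classification_head("clone_detection", num_labels=2)

# Freeze the pre-trained backbone; only the adapter and head stay trainable.
model.train_adapter("clone_detection")
model.set_active_adapters("clone_detection")

# The model can now be trained with a standard optimization loop,
# updating only a small fraction of the total parameters.
```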

Here, we explore knowledge transfer using adapters, building on the Naturalness Hypothesis proposed by Hindle et al. [12]. We study the bimodality of adapters on two tasks, cloze test and code clone detection, comparing against their benchmarks from the CodeXGLUE platform. These adapters are trained on programming languages and inserted into a PTLM that was pre-trained on English corpora (N-PTLM). Three programming languages, C/C++, Python, and Java, are studied, along with extensive experiments on the best setup for the adapters. Improving the results of the N-PTLM confirms that the adapters succeed in transferring knowledge to software engineering tasks; they are sometimes on par with, or even exceed, a PTLM trained on source code, while being more efficient in terms of the number of parameters, memory usage, and inference time. Our results can open new directions for building smaller models for more software engineering tasks. We open-source all the scripts and the trained adapters.
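
To make the cloze-test setting concrete, the sketch below shows how a programming-language adapter could be activated inside an English-pretrained RoBERTa (the N-PTLM) for masked-token prediction. It again assumes the adapter-transformers library; the adapter name, checkpoint, and Java snippet are purely illustrative, and a real adapter would first be trained on a code corpus with masked language modeling.

```python
# Hypothetical cloze-test (fill-mask) use of a code-language adapter inside an
# English-pretrained RoBERTa; "java_mlm" is a placeholder adapter name.
import torch
from transformers import RobertaForMaskedLM, RobertaTokenizer

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = RobertaForMaskedLM.from_pretrained("roberta-base")

model.add_adapter("java_mlm")            # placeholder: train or load a code adapter in practice
model.set_active_adapters("java_mlm")

snippet = "public static void main(String[] <mask>) { }"
inputs = tokenizer(snippet, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))    # ideally "args" once the adapter is trained on Java
```

Because only the adapter weights differ per language, the same frozen N-PTLM backbone can be reused across C/C++, Python, and Java by swapping the active adapter.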

Sun 15 May

Displayed time zone: Eastern Time (US & Canada)

22:30 - 23:20
Session 2: Program Representation 1 (Research) at ICPC room
Chair(s): Fatemeh Hendijani Fard (University of British Columbia)
22:30
7m
Talk
Zero-Shot Program Representation Learning
Research
Nan Cui (Shanghai Jiao Tong University), Yuze Jiang (Shanghai Jiao Tong University), Xiaodong Gu (Shanghai Jiao Tong University, China), Beijun Shen (School of Electronic Information and Electrical Engineering, Shanghai Jiao Tong University)
Pre-print Media Attached
22:37
7m
Talk
On The Cross-Modal Transfer from Natural Language to Code through Adapter Modules
Research
Divyam Goel (Indian Institute of Technology Roorkee), Ramansh Grover (Delhi Technological University), Fatemeh Hendijani Fard (University of British Columbia)
Pre-print Media Attached
22:44
7m
Talk
Self-Supervised Learning of Smart Contract Representations
Research
Shouliang Yang (School of Software, Shanghai Jiao Tong University), Beijun Shen (School of Electronic Information and Electrical Engineering, Shanghai Jiao Tong University), Xiaodong Gu (Shanghai Jiao Tong University, China)
Pre-print Media Attached
22:51
7m
Talk
An Exploratory Study on Code Attention in BERT
Research
Rishab Sharma (University of British Columbia), Fuxiang Chen (University of British Columbia), Fatemeh Hendijani Fard (University of British Columbia), David Lo (Singapore Management University)
Pre-print Media Attached
22:58
7m
Talk
Accurate Generation of Trigger-Action Programs with Domain-Adapted Sequence-to-Sequence Learning
Research
Imam Nur Bani Yusuf (Singapore Management University), Lingxiao Jiang (Singapore Management University), David Lo (Singapore Management University)
DOI Pre-print Media Attached
23:05
15m
Live Q&A
Q&A: Paper Session 2
Research

