ICSE 2023
Sun 14 - Sat 20 May 2023 Melbourne, Australia
Fri 19 May 2023 17:00 - 17:15 at Meeting Room 103 - Pre-trained and few shot learning for SE. Chair(s): Yiling Lou

Large-scale pre-trained models such as CodeBERT and GraphCodeBERT have attracted widespread attention from both academia and industry. Owing to their superior ability in code representation, they have been further applied to multiple downstream tasks such as clone detection, code search, and code translation. However, these state-of-the-art pre-trained models have also been observed to be susceptible to adversarial attacks: their performance drops significantly under simple perturbations such as renaming variables. This weakness may be inherited by their downstream models and thereby amplified at an unprecedented scale. To this end, we propose ContraBERT, an approach that aims to improve the robustness of pre-trained models via contrastive learning. Specifically, we design nine kinds of semantically equivalent or semantically close mutations on programming language (PL) and natural language (NL) data to construct different variants of each sample. We then continue to train the existing pre-trained models with masked language modeling (MLM) and a contrastive pre-training task on the original samples and their mutated variants to enhance model robustness. Extensive experiments demonstrate that ContraBERT effectively improves the robustness of existing pre-trained models. A further study confirms that these robustness-enhanced models also outperform the original models on four popular downstream tasks.
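For illustration, the contrastive objective over original/mutated pairs can be sketched as an InfoNCE-style loss. This is a minimal PyTorch sketch, not the authors' released code; encode and rename_variables below are hypothetical stand-ins for the pre-trained encoder (e.g., CodeBERT) and one of the nine mutations.

import torch
import torch.nn.functional as F

def info_nce_loss(z_orig: torch.Tensor, z_mut: torch.Tensor, tau: float = 0.07) -> torch.Tensor:
    """Contrastive loss: each original sample's positive is its own mutated
    variant; every other sample in the batch serves as a negative."""
    z_orig = F.normalize(z_orig, dim=-1)              # (B, D) unit-norm embeddings
    z_mut = F.normalize(z_mut, dim=-1)
    logits = z_orig @ z_mut.t() / tau                 # (B, B) cosine similarities / temperature
    targets = torch.arange(z_orig.size(0), device=logits.device)  # positives on the diagonal
    return F.cross_entropy(logits, targets)

# Usage sketch: combined with the MLM loss during continued pre-training.
# z_orig = encode(batch_code)                         # hypothetical encoder call
# z_mut  = encode(rename_variables(batch_code))       # hypothetical mutation (variable renaming)
# loss   = mlm_loss + info_nce_loss(z_orig, z_mut)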

Fri 19 May

Displayed time zone: Hobart

15:45 - 17:15
Pre-trained and few shot learning for SE (Technical Track / Journal-First Papers) at Meeting Room 103
Chair(s): Yiling Lou Fudan University
15:45
15m
Talk
On the validity of pre-trained transformers for natural language processing in the software engineering domain
Journal-First Papers
Alexander Trautsch University of Passau, Julian von der Mosel, Steffen Herbold University of Passau
16:00
15m
Talk
Automating Code-Related Tasks Through Transformers: The Impact of Pre-training
Technical Track
Rosalia Tufano Università della Svizzera Italiana, Luca Pascarella ETH Zurich, Gabriele Bavota Software Institute, USI Università della Svizzera italiana
16:15
15m
Talk
Log Parsing with Prompt-based Few-shot Learning
Technical Track
Van-Hoang Le The University of Newcastle, Hongyu Zhang The University of Newcastle
Pre-print
16:30
15m
Talk
Retrieval-Based Prompt Selection for Code-Related Few-Shot Learning
Technical Track
Noor Nashid University of British Columbia, Mifta Sintaha University of British Columbia, Ali Mesbah University of British Columbia (UBC)
Pre-print
16:45
15m
Paper
An Empirical Study of Pre-Trained Model Reuse in the Hugging Face Deep Learning Model Registry
Technical Track
Wenxin Jiang Purdue University, Nicholas Synovic Loyola University Chicago, Matt Hyatt Loyola University Chicago, Taylor R. Schorlemmer Purdue University, Rohan Sethi Loyola University Chicago, Yung-Hsiang Lu Purdue University, George K. Thiruvathukal Loyola University Chicago and Argonne National Laboratory, James C. Davis Purdue University
Pre-print
17:00
15m
Talk
ContraBERT: Enhancing Code Pre-trained Models via Contrastive Learning
Technical Track
Shangqing Liu Nanyang Technological University, Bozhi Wu Nanyang Technological University, Xiaofei Xie Singapore Management University, Guozhu Meng Institute of Information Engineering, Chinese Academy of Sciences, Yang Liu Nanyang Technological University