ICSE 2023
Sun 14 - Sat 20 May 2023 Melbourne, Australia
Fri 19 May 2023 15:45 - 16:00 at Meeting Room 103 - Pre-trained and few shot learning for SE Chair(s): Yiling Lou

Transformers are the current state of the art for natural language processing in many domains and are gaining traction within software engineering research as well. Such models are pre-trained on large amounts of data, usually from the general domain. However, we have only a limited understanding of the validity of transformers within the software engineering domain, i.e., how good such models are at understanding words and sentences within a software engineering context and how this improves the state of the art. Within this article, we shed light on this complex but crucial issue. We compare BERT transformer models trained with software engineering data against transformers based on general-domain data along multiple dimensions: their vocabulary, their ability to understand which words are missing, and their performance in classification tasks. Our results show that for tasks that require understanding of the software engineering context, pre-training with software engineering data is valuable, while general-domain models are sufficient for general language understanding, also within the software engineering domain.
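As a minimal sketch of the "which words are missing" comparison the abstract describes, the snippet below probes two BERT checkpoints with the Hugging Face `transformers` fill-mask pipeline. `bert-base-uncased` is the real general-domain checkpoint; the software-engineering model id is a placeholder assumption, not the paper's actual artifact, so substitute the checkpoint published with the study. The example sentence is likewise illustrative.

```python
# Compare masked-word predictions of a general-domain BERT and an
# SE-domain BERT on a sentence that requires software engineering context.
from transformers import pipeline

GENERAL_MODEL = "bert-base-uncased"       # real general-domain checkpoint
SE_MODEL = "path/to/se-pretrained-bert"   # placeholder: SE-domain checkpoint

# Masked token [MASK] is BERT's mask; a good SE model should rank
# "exception" highly here, a general model may not.
sentence = "The method throws a null pointer [MASK] when the list is empty."

for label, model_id in [("general", GENERAL_MODEL), ("SE", SE_MODEL)]:
    fill_mask = pipeline("fill-mask", model=model_id)
    predictions = fill_mask(sentence, top_k=3)
    print(label, [(p["token_str"], round(p["score"], 3)) for p in predictions])
```

Running this side by side for a set of domain-specific sentences gives a simple, inspectable view of how much software engineering context each pre-training corpus actually encodes.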

Fri 19 May

Displayed time zone: Hobart

15:45 - 17:15
Pre-trained and few shot learning for SE (Technical Track / Journal-First Papers) at Meeting Room 103
Chair(s): Yiling Lou Fudan University
15:45
15m
Talk
On the validity of pre-trained transformers for natural language processing in the software engineering domain
Journal-First Papers
Alexander Trautsch University of Passau, Julian von der Mosel, Steffen Herbold University of Passau
16:00
15m
Talk
Automating Code-Related Tasks Through Transformers: The Impact of Pre-training
Technical Track
Rosalia Tufano Università della Svizzera Italiana, Luca Pascarella ETH Zurich, Gabriele Bavota Software Institute, USI Università della Svizzera italiana
16:15
15m
Talk
Log Parsing with Prompt-based Few-shot Learning
Technical Track
Van-Hoang Le The University of Newcastle, Hongyu Zhang The University of Newcastle
Pre-print
16:30
15m
Talk
Retrieval-Based Prompt Selection for Code-Related Few-Shot Learning
Technical Track
Noor Nashid University of British Columbia, Mifta Sintaha University of British Columbia, Ali Mesbah University of British Columbia (UBC)
Pre-print
16:45
15m
Paper
An Empirical Study of Pre-Trained Model Reuse in the Hugging Face Deep Learning Model Registry
Technical Track
Wenxin Jiang Purdue University, Nicholas Synovic Loyola University Chicago, Matt Hyatt Loyola University Chicago, Taylor R. Schorlemmer Purdue University, Rohan Sethi Loyola University Chicago, Yung-Hsiang Lu Purdue University, George K. Thiruvathukal Loyola University Chicago and Argonne National Laboratory, James C. Davis Purdue University
Pre-print
17:00
15m
Talk
ContraBERT: Enhancing Code Pre-trained Models via Contrastive Learning
Technical Track
Shangqing Liu Nanyang Technological University, Bozhi Wu Nanyang Technological University, Xiaofei Xie Singapore Management University, Guozhu Meng Institute of Information Engineering, Chinese Academy of Sciences, Yang Liu Nanyang Technological University