ICSE 2023
Sun 14 - Sat 20 May 2023 Melbourne, Australia
Fri 19 May 2023 16:45 - 17:00 at Meeting Room 103 - Pre-trained and few shot learning for SE Chair(s): Yiling Lou

Deep Neural Networks (DNNs) are being adopted as components in software systems. Creating and specializing DNNs from scratch has grown increasingly difficult as state-of-the-art architectures grow more complex. Following the path of traditional software engineering, machine learning engineers have begun to reuse large-scale pre-trained models (PTMs) and fine-tune these models for downstream tasks. Prior works have studied reuse practices for traditional software packages to guide software engineers towards better package maintenance and dependency management. We lack a similar foundation of knowledge to guide behaviors in pre-trained model ecosystems.

In this work, we present the first empirical investigation of PTM reuse. We interviewed 12 practitioners from the most popular PTM ecosystem, Hugging Face, to learn the practices and challenges of PTM reuse. From this data, we model the decision-making process for PTM reuse. Based on the identified practices, we describe useful attributes for model reuse, including provenance, reproducibility, and portability. Three challenges for PTM reuse are missing attributes, discrepancies between claimed and actual performance, and model risks. We substantiate these identified challenges with systematic measurements in the Hugging Face ecosystem. Our work informs future directions on optimizing deep learning ecosystems by automatically measuring useful attributes and potential attacks, and envisions future research on infrastructure and standardization for model registries.

Fri 19 May

Displayed time zone: Hobart

15:45 - 17:15
Pre-trained and few shot learning for SE — Technical Track / Journal-First Papers at Meeting Room 103
Chair(s): Yiling Lou Fudan University
15:45
15m
Talk
On the validity of pre-trained transformers for natural language processing in the software engineering domain
Journal-First Papers
Alexander Trautsch University of Passau, Julian von der Mosel , Steffen Herbold University of Passau
16:00
15m
Talk
Automating Code-Related Tasks Through Transformers: The Impact of Pre-training
Technical Track
Rosalia Tufano Università della Svizzera Italiana, Luca Pascarella ETH Zurich, Gabriele Bavota Software Institute, USI Università della Svizzera italiana
16:15
15m
Talk
Log Parsing with Prompt-based Few-shot Learning
Technical Track
Van-Hoang Le The University of Newcastle, Hongyu Zhang The University of Newcastle
Pre-print
16:30
15m
Talk
Retrieval-Based Prompt Selection for Code-Related Few-Shot Learning
Technical Track
Noor Nashid University of British Columbia, Mifta Sintaha University of British Columbia, Ali Mesbah University of British Columbia (UBC)
Pre-print
16:45
15m
Paper
An Empirical Study of Pre-Trained Model Reuse in the Hugging Face Deep Learning Model Registry
Technical Track
Wenxin Jiang Purdue University, Nicholas Synovic Loyola University Chicago, Matt Hyatt Loyola University Chicago, Taylor R. Schorlemmer Purdue University, Rohan Sethi Loyola University Chicago, Yung-Hsiang Lu Purdue University, George K. Thiruvathukal Loyola University Chicago and Argonne National Laboratory, James C. Davis Purdue University
Pre-print
17:00
15m
Talk
ContraBERT: Enhancing Code Pre-trained Models via Contrastive Learning
Technical Track
Shangqing Liu Nanyang Technological University, Bozhi Wu Nanyang Technological University, Xiaofei Xie Singapore Management University, Guozhu Meng Institute of Information Engineering, Chinese Academy of Sciences, Yang Liu Nanyang Technological University