Logs generated by large-scale software systems provide crucial information for engineers to understand the system status and diagnose system problems. Log parsing, which converts raw log messages into structured data, is the first step to enabling automated log analytics. Existing log parsers extract the common parts of log messages as log templates using statistical features. However, these log parsers often fail to identify the correct templates and parameters because they 1) overlook the semantic meaning of log messages, and 2) require domain-specific knowledge for different log datasets. To address the limitations of existing methods, in this paper, we propose LogPPT to capture the patterns of templates using prompt-based few-shot learning. LogPPT utilises a novel prompt tuning method to recognise keywords and parameters based on a few labelled log messages. In addition, an adaptive random sampling algorithm is designed to select a small yet diverse training set. We have conducted extensive experiments on 16 public log datasets. The experimental results show that LogPPT is effective and efficient for log parsing.
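The adaptive random sampling step is the most self-contained part of the approach, so a minimal sketch of the general idea follows. This is our illustration rather than the paper's exact algorithm: the function name `adaptive_random_sample`, the candidate-pool size, and the token-set Jaccard distance are all assumptions made for the example.

```python
import random

def adaptive_random_sample(logs, k, n_candidates=10, seed=0):
    """Pick k diverse log messages from `logs` to label.

    Greedy diversity sampling: each round draws a random candidate pool
    and keeps the message farthest (by token-set Jaccard distance) from
    everything already selected.
    """
    rng = random.Random(seed)
    k = min(k, len(logs))
    tokens = [set(msg.split()) for msg in logs]

    def distance(a, b):
        union = tokens[a] | tokens[b]
        return 1.0 - len(tokens[a] & tokens[b]) / len(union) if union else 0.0

    selected = [rng.randrange(len(logs))]  # start from one random message
    while len(selected) < k:
        remaining = [i for i in range(len(logs)) if i not in selected]
        pool = rng.sample(remaining, min(n_candidates, len(remaining)))
        # Keep the candidate whose nearest selected neighbour is farthest.
        best = max(pool, key=lambda c: min(distance(c, s) for s in selected))
        selected.append(best)
    return [logs[i] for i in selected]

# Tiny usage example: choose 2 of 3 raw messages for labelling.
print(adaptive_random_sample(
    ["Connection from 10.0.0.1 closed",
     "Connection from 10.0.0.2 closed",
     "Disk /dev/sda1 usage at 91%"],
    k=2))
```

In LogPPT itself, the selected messages would then be labelled and used to prompt-tune a pre-trained language model so that each token is recognised as a keyword or a parameter; the sketch above only covers the sampling stage.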
Fri 19 May (displayed time zone: Hobart)
15:45 - 17:15 | Pre-trained and few-shot learning for SE (Technical Track / Journal-First Papers) at Meeting Room 103. Chair(s): Yiling Lou (Fudan University)
15:45 15m Talk | On the validity of pre-trained transformers for natural language processing in the software engineering domain (Journal-First Papers). Alexander Trautsch (University of Passau), Julian von der Mosel, Steffen Herbold (University of Passau)
16:00 15m Talk | Automating Code-Related Tasks Through Transformers: The Impact of Pre-training (Technical Track). Rosalia Tufano (Università della Svizzera Italiana), Luca Pascarella (ETH Zurich), Gabriele Bavota (Software Institute, USI Università della Svizzera italiana)
16:15 15m Talk | Log Parsing with Prompt-based Few-shot Learning (Technical Track). Pre-print available
16:30 15m Talk | Retrieval-Based Prompt Selection for Code-Related Few-Shot Learning (Technical Track). Noor Nashid (University of British Columbia), Mifta Sintaha (University of British Columbia), Ali Mesbah (University of British Columbia). Pre-print available
16:45 15m Paper | An Empirical Study of Pre-Trained Model Reuse in the Hugging Face Deep Learning Model Registry (Technical Track). Wenxin Jiang (Purdue University), Nicholas Synovic (Loyola University Chicago), Matt Hyatt (Loyola University Chicago), Taylor R. Schorlemmer (Purdue University), Rohan Sethi (Loyola University Chicago), Yung-Hsiang Lu (Purdue University), George K. Thiruvathukal (Loyola University Chicago and Argonne National Laboratory), James C. Davis (Purdue University). Pre-print available
17:00 15m Talk | ContraBERT: Enhancing Code Pre-trained Models via Contrastive Learning (Technical Track). Shangqing Liu (Nanyang Technological University), Bozhi Wu (Nanyang Technological University), Xiaofei Xie (Singapore Management University), Guozhu Meng (Institute of Information Engineering, Chinese Academy of Sciences), Yang Liu (Nanyang Technological University)