ICSE 2024
Fri 12 - Sun 21 April 2024 Lisbon, Portugal

While most existing pre-trained models of code learn source-code features such as code tokens and abstract syntax trees, some other works focus on learning from compiler intermediate representations (IRs). Existing IR-based models typically utilize IR features such as instructions, control and data flow graphs (CDFGs), and call graphs. However, these methods conflate variable nodes and instruction nodes in a CDFG and fail to distinguish different types of flows, and the neural networks they use fail to capture long-distance dependencies and suffer from over-smoothing and over-squashing. To address these weaknesses, we propose FAIR, a Flow type-Aware pre-trained model for IR that employs (1) a novel input representation of IR programs; (2) a Graph Transformer to address the over-smoothing, over-squashing, and long-distance dependency problems; and (3) five pre-training tasks that we specifically propose to enable FAIR to learn the semantics of IR tokens, flow type information, and the overall representation of an IR program. Experimental results show that FAIR achieves state-of-the-art results on four code-related downstream tasks.
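To make the flow-type-aware idea concrete, here is a minimal sketch (in PyTorch) of a Graph Transformer attention layer whose pairwise attention scores receive a learned bias per flow type. The class name, the three flow types, and the scalar-bias mechanism are illustrative assumptions for exposition, not the authors' implementation:

```python
# Sketch (not the FAIR authors' code): a Graph Transformer layer whose global
# self-attention is biased by the *type* of flow connecting two IR nodes.
import torch
import torch.nn as nn
import torch.nn.functional as F

FLOW_TYPES = 3  # assumed: 0 = control flow, 1 = data flow, 2 = call; -1 = no edge

class FlowAwareGraphTransformerLayer(nn.Module):
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.heads = heads
        self.head_dim = dim // heads
        self.qkv = nn.Linear(dim, 3 * dim)
        self.out = nn.Linear(dim, dim)
        # One learned scalar attention bias per (flow type, head);
        # index FLOW_TYPES is reserved for node pairs with no edge.
        self.flow_bias = nn.Embedding(FLOW_TYPES + 1, heads)

    def forward(self, x: torch.Tensor, flow: torch.Tensor) -> torch.Tensor:
        # x:    (nodes, dim)        embeddings of instruction/variable nodes
        # flow: (nodes, nodes) long flow-type id per node pair, -1 where no edge
        n, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q.view(n, self.heads, self.head_dim).transpose(0, 1)  # (h, n, hd)
        k = k.view(n, self.heads, self.head_dim).transpose(0, 1)
        v = v.view(n, self.heads, self.head_dim).transpose(0, 1)

        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5   # (h, n, n)
        flow_idx = flow.masked_fill(flow < 0, FLOW_TYPES)         # "no edge" bucket
        scores = scores + self.flow_bias(flow_idx).permute(2, 0, 1)

        attn = F.softmax(scores, dim=-1)
        out = (attn @ v).transpose(0, 1).reshape(n, d)
        return self.out(out)

# Usage: 5 IR nodes, one control-flow edge 0->1 and one data-flow edge 2->3.
x = torch.randn(5, 64)
flow = torch.full((5, 5), -1, dtype=torch.long)
flow[0, 1], flow[2, 3] = 0, 1
layer = FlowAwareGraphTransformerLayer(64)
print(layer(x, flow).shape)  # torch.Size([5, 64])
```

Biasing global self-attention by edge type, rather than restricting attention to graph neighbors, is one way a Graph Transformer can exploit flow information while still allowing direct long-distance interactions between nodes.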

Wed 17 Apr

Displayed time zone: Lisbon

11:00 - 12:30
Language Models and Generated Code 1
Research Track / New Ideas and Emerging Results at Maria Helena Vieira da Silva
Chair(s): Yiling Lou Fudan University
11:00
15m
Talk
Modularizing while Training: A New Paradigm for Modularizing DNN Models
ACM SIGSOFT Distinguished Paper Award
Research Track
Binhang Qi Beihang University, Hailong Sun Beihang University, Hongyu Zhang Chongqing University, Ruobing Zhao Beihang University, Xiang Gao Beihang University
Pre-print
11:15
15m
Research paper
KnowLog: Knowledge Enhanced Pre-trained Language Model for Log Understanding
Research Track
Lipeng Ma Fudan University, Weidong Yang Fudan University, Bo Xu Donghua University, Sihang Jiang Fudan University, Ben Fei Fudan University, Jiaqing Liang Fudan University, Mingjie Zhou Fudan University, Yanghua Xiao Fudan University
11:30
15m
Talk
FAIR: Flow Type-Aware Pre-Training of Compiler Intermediate Representations
ACM SIGSOFT Distinguished Paper Award
Research Track
Changan Niu Software Institute, Nanjing University, Chuanyi Li Nanjing University, Vincent Ng Human Language Technology Research Institute, University of Texas at Dallas, David Lo Singapore Management University, Bin Luo Nanjing University
Pre-print
11:45
15m
Talk
Unveiling Memorization in Code Models
Research Track
Zhou Yang Singapore Management University, Zhipeng Zhao Singapore Management University, Chenyu Wang Singapore Management University, Jieke Shi Singapore Management University, Dongsun Kim Kyungpook National University, DongGyun Han Royal Holloway, University of London, David Lo Singapore Management University
12:00
15m
Talk
Code Search is All You Need? Improving Code Suggestions with Code Search
ACM SIGSOFT Distinguished Paper Award
Research Track
Junkai Chen Zhejiang University, Xing Hu Zhejiang University, Zhenhao Li Concordia University, Cuiyun Gao Harbin Institute of Technology, Xin Xia Huawei Technologies, David Lo Singapore Management University
12:15
7m
Talk
Expert Monitoring: Human-Centered Concept Drift Detection in Machine Learning Operations
New Ideas and Emerging Results
Joran Leest Vrije Universiteit Amsterdam, Claudia Raibulet Vrije Universiteit Amsterdam, Ilias Gerostathopoulos Vrije Universiteit Amsterdam, Patricia Lago Vrije Universiteit Amsterdam
Pre-print