FSE 2025
Mon 23 - Fri 27 June 2025, Trondheim, Norway
co-located with ISSTA 2025
Tue 24 Jun 2025 17:20 - 17:40 at Aurora A - Fairness and Green Chair(s): Aldeida Aleti

Language models of code have demonstrated remarkable performance across various software engineering and source code analysis tasks. However, their demanding computational resource requirements and the resulting environmental footprint remain significant challenges. This work introduces ALPINE, an adaptive, programming-language-agnostic pruning technique designed to substantially reduce the computational overhead of these models. The proposed method offers a pluggable layer that can be integrated with any Transformer-based model. With ALPINE, input sequences undergo adaptive compression throughout the pipeline, shrinking to as little as one third of their initial length and thereby significantly reducing the computational load. Our experiments on two software engineering tasks, defect prediction and code clone detection, across three language models, CodeBERT, GraphCodeBERT, and UniXCoder, show that ALPINE achieves up to a 50% reduction in FLOPs, a 58.1% decrease in memory footprint, and a 28.1% improvement in throughput on average, leading to a reduction in CO2 emissions of up to 44.85%. Importantly, it reduces computational resources while maintaining up to 98.1% of the original predictive performance. These findings highlight the potential of ALPINE to make language models of code more resource-efficient and accessible while preserving their performance, contributing to the overall sustainability of their adoption in software development. They also shed light on the redundant and noisy information present in source code analysis corpora, as evidenced by the substantial sequence compression ALPINE achieves.
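To make the idea of a pluggable pruning layer concrete, the sketch below shows a generic token-pruning module in PyTorch that could sit between Transformer encoder layers. It is not the ALPINE implementation: the class name TokenPruningLayer, the keep_ratio hyperparameter, and the use of received attention mass as an importance score are illustrative assumptions. The sketch only demonstrates the general mechanism of dropping low-importance tokens so that later layers process a shorter sequence, which is where FLOPs and memory savings come from.

# Illustrative sketch only, not the ALPINE implementation.
import torch
import torch.nn as nn


class TokenPruningLayer(nn.Module):  # hypothetical name
    """Keeps only the highest-scoring tokens so that subsequent
    Transformer layers operate on a shorter sequence."""

    def __init__(self, keep_ratio: float = 0.5):
        super().__init__()
        self.keep_ratio = keep_ratio  # fraction of tokens to retain (assumed hyperparameter)

    def forward(self, hidden_states, attention_mask, attention_probs):
        # hidden_states:   (batch, seq_len, hidden)
        # attention_mask:  (batch, seq_len), 1 = real token, 0 = padding
        # attention_probs: (batch, heads, seq_len, seq_len) from the previous layer
        batch_size, seq_len, hidden_size = hidden_states.shape

        # Score each token by the attention it receives, averaged over heads
        # and query positions (a common proxy for token importance).
        scores = attention_probs.mean(dim=1).mean(dim=1)          # (batch, seq_len)
        scores = scores.masked_fill(attention_mask == 0, -1e9)    # never prefer padding

        k = max(1, int(seq_len * self.keep_ratio))
        keep_idx = scores.topk(k, dim=-1).indices.sort(dim=-1).values  # preserve token order

        # Gather the surviving tokens and their mask entries.
        gather_idx = keep_idx.unsqueeze(-1).expand(-1, -1, hidden_size)
        pruned_states = hidden_states.gather(1, gather_idx)       # (batch, k, hidden)
        pruned_mask = attention_mask.gather(1, keep_idx)          # (batch, k)
        return pruned_states, pruned_mask

In an actual integration, such a layer would be inserted after selected encoder layers so that the compression compounds throughout the pipeline, mirroring the adaptive, stage-wise reduction described in the abstract.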

Tue 24 Jun

Displayed time zone: Amsterdam, Berlin, Bern, Rome, Stockholm, Vienna

16:00 - 17:40
Fairness and Green: Journal First / Research Papers / Demonstrations at Aurora A
Chair(s): Aldeida Aleti Monash University
16:00
10m
Talk
MANILA: A Low-Code Application to Benchmark Machine Learning Models and Fairness-Enhancing Methods
Demonstrations
Giordano d'Aloisio University of L'Aquila
Pre-print Media Attached
16:10
20m
Talk
Fairness Testing of Machine Translation Systems
Journal First
Zeyu Sun Institute of Software, Chinese Academy of Sciences, Zhenpeng Chen Nanyang Technological University, Jie M. Zhang King's College London, Dan Hao Peking University
16:30
20m
Talk
Bias behind the Wheel: Fairness Testing of Autonomous Driving Systems
Journal First
Xinyue Li Peking University, Zhenpeng Chen Nanyang Technological University, Jie M. Zhang King's College London, Federica Sarro University College London, Ying Zhang Peking University, Xuanzhe Liu Peking University
16:50
10m
Talk
FAMLEM, the FAst ModuLar Energy Meter at Code Level
Demonstrations
Max Weber Leipzig University, Johannes Dorn Leipzig University, Sven Apel Saarland University, Norbert Siegmund Leipzig University
17:00
20m
Talk
NLP Libraries, Energy Consumption and Runtime - An Empirical Study
Research Papers
Rajrupa Chattaraj Indian Institute of Technology Tirupati, India, Sridhar Chimalakonda Indian Institute of Technology Tirupati
DOI
17:20
20m
Talk
An adaptive language-agnostic pruning method for greener language models for code
Research Papers
Mootez Saad Dalhousie University, José Antonio Hernández López Linköping University, Boqi Chen McGill University, Daniel Varro Linköping University / McGill University, Tushar Sharma Dalhousie University
DOI Pre-print

Information for Participants
Tue 24 Jun 2025 16:00 - 17:40 at Aurora A - Fairness and Green Chair(s): Aldeida Aleti
Info for room Aurora A:

Aurora A is the first room in the Aurora wing.

When facing the main Cosmos Hall, access to the Aurora wing is on the right, close to the side entrance of the hotel.
