CC 2025
Sat 1 - Sun 2 March 2025
Sat 1 Mar 2025 14:00 - 14:30 at Acacia A - Machine Learning and PL I Chair(s): Sara Achour

Data flow analysis is fundamental to modern program optimization and verification, serving as a critical foundation for compiler transformations. As machine learning increasingly drives compiler tasks, models that can implicitly understand and correctly reason about data flow properties become crucial for maintaining soundness. State-of-the-art machine learning methods, especially graph neural networks (GNNs), struggle to generalize beyond their training scenarios because of their limited ability to perform long propagations. We present DFA-Net, a compiler-specific neural network architecture that generalizes systematically. It emulates the reasoning process of compilers, allowing data flow analyses learned on simple programs to generalize to complex ones. The architecture decomposes a data flow analysis into specialized neural networks for the initialization, transfer, and meet operations, explicitly incorporating compiler-specific knowledge into the model design. DFA-Net thus enables robust ML-driven compiler tasks, demonstrating that compiler-specific neural architectures can generalize data flow analyses. DFA-Net outperforms traditional GNNs on data flow analysis, achieving F1 scores of 0.761 versus 0.009 for data dependencies and 0.989 versus 0.196 for dominators at high complexity levels, while maintaining perfect scores for liveness and reachability analyses, where GNNs struggle significantly.
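The initialization/transfer/meet decomposition the abstract describes is the classic iterative fixed-point scheme for data flow analysis. As a point of reference, here is a minimal sketch of that scheme for one of the analyses mentioned (reachability); all names and the CFG encoding are illustrative assumptions, not taken from the paper.

```python
def analyze_reachability(cfg, entry):
    """Forward reachability over a CFG given as {node: [successor, ...]}.

    Mirrors the three components named in the abstract:
      - initialization: only the entry node starts with the fact True,
      - transfer: a node's fact flows unchanged to its successors,
      - meet: a successor combines incoming facts with logical OR.
    Returns the set of nodes reachable from `entry`.
    """
    # Initialization
    facts = {node: False for node in cfg}
    facts[entry] = True

    # Iterate transfer + meet until a fixed point is reached.
    changed = True
    while changed:
        changed = False
        for node, successors in cfg.items():
            for succ in successors:
                new_fact = facts[succ] or facts[node]  # meet via OR
                if new_fact != facts[succ]:
                    facts[succ] = new_fact
                    changed = True

    return {node for node, reachable in facts.items() if reachable}


# Example: a diamond-shaped CFG with one unreachable node.
cfg = {
    "entry": ["a"],
    "a": ["b", "c"],
    "b": ["d"],
    "c": ["d"],
    "d": [],
    "dead": ["d"],  # never reached from entry
}
print(sorted(analyze_reachability(cfg, "entry")))
```

Other analyses in the abstract (liveness, dominators, data dependencies) fit the same template with different initialization values, transfer functions, and meet operators; DFA-Net's contribution is learning each of these components as a separate neural network rather than hand-coding them.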

Sat 1 Mar

Displayed time zone: Pacific Time (US & Canada)

14:00 - 15:30
Machine Learning and PL I (Main Conference) at Acacia A
Chair(s): Sara Achour Stanford University
14:00
30m
Talk
DFA-Net: A Compiler-Specific Neural Architecture for Robust Generalization in Data Flow Analyses
Main Conference
Alexander Brauckmann University of Edinburgh, Anderson Faustino da Silva State University of Maringá, Jeronimo Castrillon TU Dresden, Germany, Hugh Leather Meta AI Research
14:30
30m
Talk
Finding Missed Code Size Optimizations in Compilers using Large Language Models
Main Conference
15:00
30m
Talk
LLM Compiler: Foundation Language Models for Compiler Optimization
Main Conference
Chris Cummins Meta, Volker Seeker Meta AI Research, Dejan Grubisic Meta, Baptiste Rozière Meta, Jonas Gehring Meta, Gabriel Synnaeve Meta, Hugh Leather Meta AI Research