CC 2023
Sat 25 - Sun 26 February 2023 Montréal, Canada
Sat 25 Feb 2023 11:40 - 12:00 at St. Laurent 3 - Scheduling & Tuning Chair(s): Chen Ding

Loop interchange is an important code optimization that improves data locality and extracts parallelism. While previous research in compilers has tried to automate the selection of which loops to interchange, existing methods have an important limitation: they rely on less precise machine models. This is mainly because building a model that predicts whether interchanging two loops is profitable is challenging, since such a prediction depends on many factors.
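
As a concrete illustration (not taken from the paper), the minimal Python sketch below shows the transformation itself: interchanging the two loops of a 2D reduction changes the memory access order, and whether that is profitable depends on factors such as array layout, strides, and cache sizes, which is part of why a precise analytical model is hard to build.

```python
import numpy as np

def sum_before(a):
    # Original nest: the inner loop walks down a column,
    # so consecutive accesses stride over whole rows.
    n, m = a.shape
    total = 0.0
    for j in range(m):        # outer loop over columns
        for i in range(n):    # inner loop over rows (strided accesses)
            total += a[i, j]
    return total

def sum_after(a):
    # Interchanged nest: the inner loop walks along a row
    # (unit-stride accesses), which is cache-friendlier for
    # row-major storage such as NumPy's default layout.
    n, m = a.shape
    total = 0.0
    for i in range(n):
        for j in range(m):
            total += a[i, j]
    return total

a = np.arange(12, dtype=np.float64).reshape(3, 4)
assert sum_before(a) == sum_after(a)  # the interchange preserves the result here
```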
State-of-the-art methods try to avoid this problem by using a deep-learning based cost model, but they suffer from another limitation: their prediction time scales with the number of loop levels in a given loop nest. This is mainly because they use the model to evaluate all possible loop interchanges (or a subset of the most promising ones).
In this paper, we propose a novel deep-learning model for loop interchange that addresses these limitations. It takes a code representation as input and directly predicts the best pair of loops to interchange. Unlike state-of-the-art deep-learning based cost models, which evaluate all candidate loop pairs and then pick the best one, the proposed model requires constant time to predict the best loop interchange; to our knowledge, it is the first deep learning model with this property. The model is implemented and evaluated in the Tiramisu compiler, a state-of-the-art polyhedral compiler.
We evaluate the proposed model on a benchmark of Tiramisu programs and show an accuracy of 78.57% for 1-shot and 85.71% for 2-shot predictions. Experiments show that our model outperforms the cost model currently used by the Tiramisu compiler by 8.57% in 1-shot accuracy and by 5.71% in 2-shot accuracy, while also reducing the total execution time needed to predict the best pair of loops to interchange.
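
To make the contrast concrete, the hypothetical Python sketch below (not the paper's actual implementation; `cost_model`, `model`, and `code_repr` are assumed names) compares a cost-model-based search, whose number of model evaluations grows with the nest depth, against a direct predictor that returns the loop pair in a single forward pass.

```python
from itertools import combinations

def pick_interchange_with_cost_model(code_repr, depth, cost_model):
    """Cost-model search: one model evaluation per candidate pair,
    i.e. O(depth^2) evaluations for a nest with `depth` loop levels."""
    best_pair, best_cost = None, float("inf")
    for i, j in combinations(range(depth), 2):
        cost = cost_model(code_repr, (i, j))  # predicted cost after interchanging levels i and j
        if cost < best_cost:
            best_pair, best_cost = (i, j), cost
    return best_pair

def pick_interchange_directly(code_repr, model):
    """Direct prediction: a single forward pass returns the loop pair,
    so the number of model evaluations is constant in the nest depth."""
    return model(code_repr)  # e.g. (outer_level, inner_level)
```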

Sat 25 Feb

Displayed time zone: Eastern Time (US & Canada)

11:20 - 12:20
Scheduling & Tuning (Research Papers) at St. Laurent 3
Chair(s): Chen Ding (University of Rochester)
11:20 (20m) Talk
Efficiently Learning Locality Optimizations by Decomposing Transformation Domains
Research Papers
Tharindu Patabandi (University of Utah), Mary Hall (University of Utah)

11:40 (20m) Talk
A Deep Learning Model for Loop Interchange
Research Papers
Lina Mezdour (NYU Abu Dhabi; ESI), Khadidja Kadem (NYU Abu Dhabi; ESI), Massinissa Merouani (NYU Abu Dhabi), Amina Selma Haichour (ESI), Saman Amarasinghe (Massachusetts Institute of Technology), Riyadh Baghdadi (NYU Abu Dhabi)

12:00 (20m) Talk
(De/Re)-Compositions Expressed Systematically via MDH-Based Schedules
Research Papers
Ari Rasch (University of Muenster), Richard Schulze (University of Muenster), Denys Shabalin (Google), Anne Elster (NTNU), Sergei Gorlatch (University of Muenster), Mary Hall (University of Utah)