Code Synthesis for Sparse Tensor Format Conversion and Optimization
Many scientific applications compute using sparse data and store that data in a variety of sparse formats because each format has unique space and performance benefits. Optimizing applications that use sparse data involves translating the sparse data into the chosen format and transforming the computation to iterate over that format. This paper presents a formal definition of sparse tensor formats and an automated approach to synthesize the transformation between formats.
Unlike prior approaches, this approach supports ordering constraints and synthesizes the conversion code in a high-level intermediate representation suitable for applying composable transformations such as loop fusion and temporary storage reduction.
We demonstrate that, using the geometric mean of execution times, the synthesized and optimized COO to CSR code is 2.85x faster than TACO, Intel MKL, and SPARSKIT, while the more complex COO to DIA conversion is 1.4x slower than TACO but faster than SPARSKIT and Intel MKL.
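To make the COO to CSR conversion discussed in the abstract concrete, here is a minimal hand-written Python sketch of that conversion (the paper synthesizes such code automatically; this example, including the `coo_to_csr` name and the (row, col, val) triple representation, is illustrative and not taken from the paper):

```python
def coo_to_csr(n_rows, coo):
    """Convert COO triples (row, col, val) to CSR arrays (rowptr, col, val).

    The ordering constraint matters here: CSR requires entries grouped by
    row, so we count entries per row, prefix-sum the counts into offsets,
    then scatter each entry into its row's next free slot.
    """
    rowptr = [0] * (n_rows + 1)
    for r, _, _ in coo:                 # count nonzeros per row
        rowptr[r + 1] += 1
    for i in range(n_rows):             # prefix sum: counts -> row offsets
        rowptr[i + 1] += rowptr[i]
    col = [0] * len(coo)
    val = [0.0] * len(coo)
    next_slot = rowptr[:-1].copy()      # next free slot within each row
    for r, c, v in coo:
        k = next_slot[r]
        col[k] = c
        val[k] = v
        next_slot[r] += 1
    return rowptr, col, val

# Example: a 3x3 matrix with 4 nonzeros, given in arbitrary COO order.
coo = [(2, 0, 5.0), (0, 1, 1.0), (2, 2, 3.0), (0, 0, 2.0)]
rowptr, col, val = coo_to_csr(3, coo)
# rowptr == [0, 2, 2, 4]: row 0 has 2 entries, row 1 none, row 2 two.
```

The two loops over `coo` are the kind of code the paper's high-level intermediate representation exposes to composable transformations such as loop fusion.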
Mon 27 Feb. Displayed time zone: Eastern Time (US & Canada)
10:00 - 12:00
|Code Generation for In-Place Stencils|
Mohamed Essadki ONERA, Bertrand Michel ONERA, Bruno Maugars ONERA, Oleksandr Zinenko Google, Nicolas Vasilache Google, Albert Cohen Google
|To Pack or Not to Pack: A Generalized Packing Analysis and Transformation|
Caio Salvador Rohwedder University of Alberta, Nathan Henderson University of Alberta, João P. L. De Carvalho University of Alberta, Yufei Chen University of Alberta, Jose Nelson Amaral University of Alberta
|Code Synthesis for Sparse Tensor Format Conversion and Optimization|
Tobi Popoola Boise State University, Tuowen Zhao University of Utah, Aaron St. George Boise State University, Kalyan Bhetwal Boise State University, Michelle Strout University of Arizona, Mary Hall University of Utah, Catherine R. M. Olschanowsky Boise State University
|Looplets: A Language for Structured Coiteration|
Willow Ahrens Massachusetts Institute of Technology, Daniel Donenfeld Massachusetts Institute of Technology, Fredrik Kjolstad Stanford University, Saman Amarasinghe Massachusetts Institute of Technology