CAIN 2023
Mon 15 - Sat 20 May 2023 Melbourne, Australia
co-located with ICSE 2023
Mon 15 May 2023 17:45 - 18:05 at Virtual - Zoom for CAIN - Data & Model Optimization Chair(s): Justus Bogner

Modern AI practices all strive towards the same goal: better results. In the context of deep learning, the term “results” often refers to the accuracy achieved on a competitive problem set. In this paper, we adopt an idea from the emerging field of Green AI: we treat energy consumption as a metric of equal importance to accuracy and seek to eliminate irrelevant tasks and wasted energy. We examine the training stage of the deep learning pipeline from a sustainability perspective, through the study of hyperparameter tuning strategies and model complexity, two factors that heavily impact the overall pipeline’s energy consumption. First, we investigate the effectiveness of grid search, random search and Bayesian optimisation during hyperparameter tuning, and we find that Bayesian optimisation significantly dominates the other strategies. Furthermore, we analyse the architecture of convolutional neural networks in terms of the energy consumption of three prominent layer types: convolutional, linear and ReLU layers. The results show that convolutional layers are by far the most computationally expensive. Additionally, we observe diminishing returns in accuracy for more energy-hungry models: the overall energy consumption of training can be halved by reducing network complexity. In conclusion, we highlight innovative and promising energy-efficient practices for training deep learning models. To expand the application of Green AI, we advocate a shift in the design of deep learning models that considers the trade-off between energy efficiency and accuracy.
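The budget-matched comparison of tuning strategies described in the abstract can be sketched as follows. This is a minimal illustration, not the paper’s method: grid search and random search over a toy objective, where the number of training runs (the evaluation budget) serves as a crude proxy for energy cost. The objective function, its optimum, and all hyperparameter names below are illustrative assumptions; Bayesian optimisation, which the paper finds dominant, would require a surrogate-model library (e.g. scikit-optimize or Optuna) and is omitted here.

```python
import itertools
import random

# Toy stand-in for one full training run: a hypothetical validation
# accuracy as a function of two hyperparameters (learning rate, width).
# The function and its optimum are illustrative assumptions.
def val_accuracy(lr, width):
    return 1.0 - (lr - 0.012) ** 2 * 1e3 - (width - 70) ** 2 * 1e-5

def grid_search(budget):
    # Evenly spaced grid; each evaluation stands in for one training run.
    lrs = [0.001, 0.005, 0.01, 0.05]
    widths = [16, 32, 64, 128]
    trials = list(itertools.product(lrs, widths))[:budget]
    return max(val_accuracy(lr, w) for lr, w in trials)

def random_search(budget, seed=0):
    rng = random.Random(seed)
    best = float("-inf")
    for _ in range(budget):
        lr = 10 ** rng.uniform(-3, -1)   # log-uniform learning rate
        width = rng.randrange(8, 256)
        best = max(best, val_accuracy(lr, width))
    return best

# Equal evaluation budget = equal (proxy) energy cost; compare the
# best accuracy each strategy reaches within that budget.
budget = 16
print(f"grid search:   {grid_search(budget):.4f}")
print(f"random search: {random_search(budget):.4f}")
```

Under this framing, a strategy is "greener" if it reaches the same accuracy in fewer training runs; the paper extends this comparison to Bayesian optimisation and to measured energy rather than an evaluation-count proxy.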

Mon 15 May

Displayed time zone: Hobart

17:15 - 18:45
Data & Model Optimization (Papers / Posters / Industrial Talks) at Virtual - Zoom for CAIN
Chair(s): Justus Bogner University of Stuttgart

Session recording available on YouTube

17:15
15m
Short-paper
Automatically Resolving Data Source Dependency Hell in Large Scale Data Science Projects
Papers
Laurent Boué Microsoft, Pratap Kunireddy Microsoft, Pavle Subotic Microsoft Azure
Pre-print
17:30
15m
Short-paper
Dataflow graphs as complete causal graphs
Papers
Andrei Paleyes Department of Computer Science and Technology, University of Cambridge, Siyuan Guo Max Planck Institute for Intelligent Systems, Bernhard Schölkopf MPI Tuebingen, Neil D. Lawrence Department of Computer Science and Technology, University of Cambridge
Pre-print
17:45
20m
Long-paper
Uncovering Energy-Efficient Practices in Deep Learning Training: Preliminary Steps Towards Green AI (Distinguished Paper Award Candidate)
Papers
Tim Yarally Delft University of Technology, Luís Cruz Delft University of Technology, Daniel Feitosa University of Groningen, June Sallou Delft University of Technology, Arie van Deursen Delft University of Technology
Pre-print
18:05
15m
Short-paper
Prevalence of Code Smells in Reinforcement Learning Projects
Papers
Nicolás Cardozo Universidad de los Andes, Ivana Dusparic Trinity College Dublin, Ireland, Christian Cabrera Department of Computer Science and Technology, University of Cambridge
Pre-print Media Attached
18:20
20m
Long-paper
Automotive Perception Software Development: An Empirical Investigation into Data, Annotation, and Ecosystem Challenges
Papers
Hans-Martin Heyn University of Gothenburg & Chalmers University of Technology, Khan Mohammad Habibullah University of Gothenburg, Eric Knauss Chalmers | University of Gothenburg, Jennifer Horkoff Chalmers and the University of Gothenburg, Markus Borg CodeScene, Alessia Knauss Zenseact AB, Polly Jing Li Kognic AB
Pre-print