ICSE 2021
Mon 17 May - Sat 5 June 2021

Deep learning models, like traditional software systems, provide a large number of configuration options. A deep learning model can be configured with different hyperparameters and neural architectures. Recently, AutoML (Automated Machine Learning) has been widely adopted to automate model training by systematically exploring diverse configurations. However, current AutoML approaches do not take into account the computational constraints imposed by available resources such as memory, device computing power, or execution time. Training with non-conforming configurations can lead to many failed AutoML trial jobs or unsuitable models, wasting significant resources and severely slowing development productivity.

In this paper, we propose DnnSAT, a resource-guided AutoML approach for deep learning models that helps existing AutoML tools efficiently reduce the configuration space ahead of time. DnnSAT can speed up the search process and achieve equal or even better model learning performance because it excludes trial jobs that do not satisfy the constraints, saving resources for additional trials. We formulate the resource-guided configuration space reduction as a constraint satisfaction problem. DnnSAT includes a unified analytic cost model to construct common constraints with respect to the model weight size, number of floating-point operations, model inference time, and GPU memory consumption. It then utilizes an SMT solver to obtain the satisfying configurations of hyperparameters and neural architectures. Our evaluation results demonstrate the effectiveness of DnnSAT in accelerating state-of-the-art AutoML methods (Hyperparameter Optimization and Neural Architecture Search) with an average speedup of 1.19X to 3.95X on public benchmarks. We believe that DnnSAT can make AutoML more practical in a real-world environment with constrained resources.
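The core idea of resource-guided configuration space reduction can be illustrated with a minimal sketch. The function names, search-space bounds, and cost formula below are illustrative assumptions, not DnnSAT's actual implementation: DnnSAT encodes such constraints for an SMT solver, whereas this sketch filters a small space by brute force to show how an analytic cost model excludes infeasible trials before AutoML search begins.

```python
from itertools import product

def weight_bytes(num_layers: int, hidden_units: int) -> int:
    """Analytic cost model (illustrative): float32 weight size of a
    stack of square dense layers, 4 bytes per parameter."""
    return 4 * num_layers * hidden_units * hidden_units

def satisfying_configs(budget_bytes: int):
    """Keep only hyperparameter configurations whose model weight
    size fits the given memory budget."""
    space = product([1, 2, 4, 8],            # candidate num_layers
                    [128, 256, 512, 1024])   # candidate hidden_units
    return [(layers, units) for layers, units in space
            if weight_bytes(layers, units) <= budget_bytes]

# Reduce the space under a hypothetical 16 MB weight budget; an AutoML
# tool would then search only these feasible configurations instead of
# wasting trial jobs on configurations that cannot fit in memory.
feasible = satisfying_configs(16 * 1024 * 1024)
print(len(feasible))  # 15 of the 16 configurations fit the budget
```

An SMT formulation replaces the enumeration with symbolic integer variables and inequality constraints, which scales to configuration spaces far too large to enumerate.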

Fri 28 May

Displayed time zone: Amsterdam, Berlin, Bern, Rome, Stockholm, Vienna

15:05 - 16:05
4.3.4. Configuration of Software Systems: Optimization
Journal-First Papers / Technical Track at Blended Sessions Room 4 (+12h)
Chair(s): Sergio Segura Universidad de Sevilla
15:05
20m
Paper
Resource-Guided Configuration Space Reduction for Deep Learning Models
Technical Track
Yanjie Gao Microsoft Research, Yonghao Zhu Microsoft Research, Hongyu Zhang The University of Newcastle, Haoxiang Lin Microsoft Research, Mao Yang Microsoft Research
15:25
20m
Paper
ConfigMiner: Identifying the Appropriate Configuration Options for Config-related User Questions by Mining Online Forums
Journal-First Papers
Mohammed Sayagh ETS Montreal, University of Quebec, Ahmed E. Hassan School of Computing, Queen's University
15:45
20m
Paper
Whence to Learn? Transferring Knowledge in Configurable Systems using BEETLE
Journal-First Papers
Rahul Krishna Columbia University, USA, Vivek Nair Facebook, USA, Pooyan Jamshidi University of South Carolina, Tim Menzies North Carolina State University, USA

Sat 29 May


03:05 - 04:05
4.3.4. Configuration of Software Systems: Optimization (mirror)
Journal-First Papers / Technical Track at Blended Sessions Room 4
03:05
20m
Paper
Resource-Guided Configuration Space Reduction for Deep Learning Models
Technical Track
Yanjie Gao Microsoft Research, Yonghao Zhu Microsoft Research, Hongyu Zhang The University of Newcastle, Haoxiang Lin Microsoft Research, Mao Yang Microsoft Research
03:25
20m
Paper
ConfigMiner: Identifying the Appropriate Configuration Options for Config-related User Questions by Mining Online Forums
Journal-First Papers
Mohammed Sayagh ETS Montreal, University of Quebec, Ahmed E. Hassan School of Computing, Queen's University
03:45
20m
Paper
Whence to Learn? Transferring Knowledge in Configurable Systems using BEETLE
Journal-First Papers
Rahul Krishna Columbia University, USA, Vivek Nair Facebook, USA, Pooyan Jamshidi University of South Carolina, Tim Menzies North Carolina State University, USA