ICSE 2021
Mon 17 May - Sat 5 June 2021

This program is tentative and subject to change.

Deep learning models, like traditional software systems, expose a large number of configuration options. A deep learning model can be configured with different hyperparameters and neural architectures. Recently, AutoML (Automated Machine Learning) has been widely adopted to automate model training by systematically exploring diverse configurations. However, current AutoML approaches do not take into account the computational constraints imposed by available resources, such as memory, device computing power, or execution time. Training with non-conforming configurations can lead to many failed AutoML trial jobs or unsuitable models, which wastes significant resources and severely slows down development.

In this paper, we propose DnnSAT, a resource-guided AutoML approach for deep learning models that helps existing AutoML tools efficiently reduce the configuration space ahead of time. DnnSAT can speed up the search process and achieve equal or even better model learning performance because it excludes trial jobs that do not satisfy the constraints and saves resources for more trials. We formulate resource-guided configuration space reduction as a constraint satisfaction problem. DnnSAT includes a unified analytic cost model to construct common constraints with respect to model weight size, number of floating-point operations, model inference time, and GPU memory consumption. It then utilizes an SMT solver to obtain satisfying configurations of hyperparameters and neural architectures. Our evaluation results demonstrate the effectiveness of DnnSAT in accelerating state-of-the-art AutoML methods (Hyperparameter Optimization and Neural Architecture Search) with an average speedup of 1.19X to 3.95X on public benchmarks. We believe that DnnSAT can make AutoML more practical in a real-world environment with constrained resources.
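To give a flavor of the idea, the sketch below encodes resource limits as constraints over configuration variables and asks an SMT solver for a satisfying assignment. This is a minimal illustration, not the authors' DnnSAT implementation: it assumes the z3-solver Python package, a toy two-layer fully connected network whose hidden-layer widths are the hyperparameters under search, and weight-size and FLOPs budgets chosen only for illustration.

```python
# Minimal sketch of SMT-based configuration-space pruning (hypothetical example).
from z3 import Int, Solver, And, sat

# Hypothetical configuration variables: hidden-layer widths h1, h2.
h1, h2 = Int("h1"), Int("h2")
input_dim, output_dim = 784, 10

# Analytic cost expressions for a fully connected network
# (weights only, biases ignored for brevity).
weight_count = input_dim * h1 + h1 * h2 + h2 * output_dim
flops_per_sample = 2 * weight_count  # one multiply-add per weight

solver = Solver()
# Candidate ranges offered by the AutoML search space (illustrative).
solver.add(And(h1 >= 64, h1 <= 1024, h2 >= 64, h2 <= 1024))
# Resource constraints: at most ~2M parameters and ~5 MFLOPs per sample
# (numbers chosen only for illustration).
solver.add(weight_count <= 2_000_000)
solver.add(flops_per_sample <= 5_000_000)

if solver.check() == sat:
    model = solver.model()
    print("satisfying configuration:", model[h1], model[h2])
else:
    print("no configuration satisfies the resource constraints")
```

Configurations that the solver rejects would never be submitted as trial jobs, which is how the paper's approach avoids wasted training runs.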


Fri 28 May
Times are displayed in time zone: Amsterdam, Berlin, Bern, Rome, Stockholm, Vienna

15:05 - 16:05
4.3.4. Configuration of Software Systems: Optimization (Journal-First Papers / Technical Track) at Blended Sessions Room 4 (+12h)
Chair(s): Sergio Segura (Universidad de Sevilla)
15:05
20m
Paper
Resource-Guided Configuration Space Reduction for Deep Learning Models (Technical Track)
Yanjie Gao (Microsoft Research), Yonghao Zhu (Microsoft Research), Hongyu Zhang (The University of Newcastle), Haoxiang Lin (Microsoft Research), Mao Yang (Microsoft Research)
15:25
20m
Paper
ConfigMiner: Identifying the Appropriate Configuration Options for Config-related User Questions by Mining Online Forums (Journal-First Papers)
Mohammed Sayagh (ETS Montreal, University of Quebec), Ahmed E. Hassan (School of Computing, Queen's University)
15:45
20m
Paper
Whence to Learn? Transferring Knowledge in Configurable Systems using BEETLE (Journal-First Papers)
Rahul Krishna (Columbia University, USA), Vivek Nair (Facebook, USA), Pooyan Jamshidi (University of South Carolina), Tim Menzies (North Carolina State University, USA)

Sat 29 May
Times are displayed in time zone: Amsterdam, Berlin, Bern, Rome, Stockholm, Vienna

03:05 - 04:05
4.3.4. Configuration of Software Systems: Optimization (Journal-First Papers / Technical Track) at Blended Sessions Room 4
03:05
20m
Paper
Resource-Guided Configuration Space Reduction for Deep Learning Models (Technical Track)
Yanjie Gao (Microsoft Research), Yonghao Zhu (Microsoft Research), Hongyu Zhang (The University of Newcastle), Haoxiang Lin (Microsoft Research), Mao Yang (Microsoft Research)
03:25
20m
Paper
ConfigMiner: Identifying the Appropriate Configuration Options for Config-related User Questions by Mining Online Forums (Journal-First Papers)
Mohammed Sayagh (ETS Montreal, University of Quebec), Ahmed E. Hassan (School of Computing, Queen's University)
03:45
20m
Paper
Whence to Learn? Transferring Knowledge in Configurable Systems using BEETLE (Journal-First Papers)
Rahul Krishna (Columbia University, USA), Vivek Nair (Facebook, USA), Pooyan Jamshidi (University of South Carolina), Tim Menzies (North Carolina State University, USA)
