ICSE 2021
Mon 17 May - Sat 5 June 2021

This program is tentative and subject to change.

Deep neural networks (DNNs) are becoming an integral part of most software systems. Previous work has shown that DNNs have bugs. Unfortunately, existing debugging techniques do not support localizing DNN bugs because of the lack of understanding of model behaviors; the entire DNN model appears as a black box. To address these problems, we propose an approach and a tool that automatically determines whether a model is buggy and identifies the root causes of DNN errors. Our key insight is that historic trends in the values propagated between layers can be analyzed to both detect and localize faults. To that end, we first enable dynamic analysis of deep learning applications, either by converting the program into an imperative representation or, alternatively, by using a callback mechanism. Both mechanisms allow us to insert probes that enable dynamic analysis over the traces produced by the DNN while it is being trained on the training data. We then conduct dynamic analysis over these traces to identify the faulty layer or hyperparameter that causes the error. We propose an algorithm that identifies root causes by capturing numerical errors, monitoring the model during training, and determining the relevance of each layer and parameter to the DNN's outcome. We have collected a benchmark of 40 buggy models and their patches, drawn from real errors in deep learning applications on Stack Overflow and GitHub. This benchmark can be used to evaluate automated debugging tools and repair techniques. We evaluated our approach on this DNN bug-and-patch benchmark, and the results show that it is more effective than the existing debugging approach used in the state-of-the-practice Keras library: our approach detected faults in 34/40 cases, whereas the best debugging approach provided by Keras detected 32/40; our approach localized 21/40 bugs, whereas Keras did not localize any faults.


Tue 25 May
Times are displayed in time zone: Amsterdam, Berlin, Bern, Rome, Stockholm, Vienna.

19:35 - 20:55
1.5.1. Deep Neural Networks: General Issues (Technical Track / Journal-First Papers / SEIP - Software Engineering in Practice) at Blended Sessions Room 1 (+12h)
Chair(s): Ignacio Panach (Universidad de Valencia)
19:35
20m
Paper
Asset Management in Machine Learning: A Survey
SEIP - Software Engineering in Practice
Samuel Idowu (Chalmers | University of Gothenburg), Daniel Strüber (Radboud University Nijmegen), Thorsten Berger (Chalmers | University of Gothenburg)
Pre-print
19:55
20m
Paper
An Empirical Study of Refactorings and Technical Debt in Machine Learning Systems
Technical Track
Yiming Tang (City University of New York (CUNY) Graduate Center), Raffi Khatchadourian (City University of New York (CUNY) Hunter College), Mehdi Bagherzadeh (Oakland University), Rhia Singh (City University of New York (CUNY) Macaulay Honors College), Ajani Stewart (City University of New York (CUNY) Hunter College), Anita Raja (City University of New York (CUNY) Hunter College)
Pre-print Media Attached
20:15
20m
Paper
Logram: Efficient Log Parsing Using n-Gram Dictionaries
Journal-First Papers
Hetong Dai (Concordia University), Heng Li (Polytechnique Montréal), Che-Shao Chen (Concordia University), Weiyi Shang (Concordia University), Tse-Hsun (Peter) Chen (Concordia University)
DOI Pre-print
20:35
20m
Paper
DeepLocalize: Fault Localization for Deep Neural Networks
Technical Track
Mohammad Wardat (Dept. of Computer Science, Iowa State University), Wei Le (Dept. of Computer Science, Iowa State University), Hridesh Rajan (Dept. of Computer Science, Iowa State University)
Pre-print

Wed 26 May
Times are displayed in time zone: Amsterdam, Berlin, Bern, Rome, Stockholm, Vienna.

07:35 - 08:55
07:35
20m
Paper
Asset Management in Machine Learning: A Survey
SEIP - Software Engineering in Practice
Samuel Idowu (Chalmers | University of Gothenburg), Daniel Strüber (Radboud University Nijmegen), Thorsten Berger (Chalmers | University of Gothenburg)
Pre-print
07:55
20m
Paper
An Empirical Study of Refactorings and Technical Debt in Machine Learning Systems
Technical Track
Yiming Tang (City University of New York (CUNY) Graduate Center), Raffi Khatchadourian (City University of New York (CUNY) Hunter College), Mehdi Bagherzadeh (Oakland University), Rhia Singh (City University of New York (CUNY) Macaulay Honors College), Ajani Stewart (City University of New York (CUNY) Hunter College), Anita Raja (City University of New York (CUNY) Hunter College)
Pre-print Media Attached
08:15
20m
Paper
Logram: Efficient Log Parsing Using n-Gram Dictionaries
Journal-First Papers
Hetong Dai (Concordia University), Heng Li (Polytechnique Montréal), Che-Shao Chen (Concordia University), Weiyi Shang (Concordia University), Tse-Hsun (Peter) Chen (Concordia University)
DOI Pre-print
08:35
20m
Paper
DeepLocalize: Fault Localization for Deep Neural Networks
Technical Track
Mohammad Wardat (Dept. of Computer Science, Iowa State University), Wei Le (Dept. of Computer Science, Iowa State University), Hridesh Rajan (Dept. of Computer Science, Iowa State University)
Pre-print
