
Self-adaptive systems increasingly rely on machine learning techniques as black-box models to make decisions even when the target world of interest includes uncertainty and unknowns. Because of the lack of transparency, adaptation decisions, as well as their effect on the world, are hard to explain. This often hinders the ability to trace unsuccessful adaptations back to understandable root causes. In this paper, we introduce our vision of explainable self-adaptation. We demonstrate our vision by instantiating our ideas on a running example in the robotics domain and by showing an automated proof-of-concept process providing human-understandable explanations for successful and unsuccessful adaptations in critical scenarios.
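To make the idea concrete, the sketch below is illustrative only and is not the authors' implementation: it shows one common way to obtain human-understandable explanations for a black-box adaptation policy, namely fitting a shallow decision tree as a surrogate over monitored data and reading its rules as explanations. The robot features (battery level, obstacle density, localization error) and the stand-in policy are hypothetical.

import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)

# Hypothetical monitoring data for a mobile robot: battery level (%),
# obstacle density (0..1), and localization error (m).
X = rng.uniform(low=[0.0, 0.0, 0.0], high=[100.0, 1.0, 2.0], size=(500, 3))

def black_box_policy(x):
    # Stand-in for an opaque ML-based planner: 1 = switch to a safe mode.
    battery, obstacles, loc_err = x
    return int(battery < 20 or (obstacles > 0.7 and loc_err > 1.0))

y = np.array([black_box_policy(x) for x in X])

# Fit a shallow decision tree as an interpretable surrogate of the policy.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# The extracted rules act as a human-readable account of when the system
# adapts (here: the conditions under which it enters safe mode).
print(export_text(surrogate, feature_names=["battery_pct", "obstacle_density", "loc_error_m"]))

In such a setup, an unsuccessful adaptation can be traced back through the surrogate's rules to the monitored conditions that triggered it, which is the kind of root-cause explanation the abstract envisions.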

Preprint (ase22-133.pdf), 627 KiB

Thu 13 Oct

Displayed time zone: Eastern Time (US & Canada)

13:30 - 15:30
Technical Session 28 - Safety-Critical and Self-Adaptive Systems
Industry Showcase / Tool Demonstrations / Research Papers / Late Breaking Results / NIER Track at Room 128
Chair(s): Eunsuk Kang Carnegie Mellon University
13:30
10m
Demonstration
SAFA: A Tool for Supporting Safety Analysis in Evolving Software Systems
Tool Demonstrations
Alberto D. Rodriguez University of Notre Dame, Timothy Newman University of Notre Dame, Katherine R. Dearstyne University of Notre Dame, Jane Cleland-Huang University of Notre Dame
13:40
20m
Research paper
Generating Critical Test Scenarios for Autonomous Driving Systems via Influential Behavior Patterns (Virtual)
Research Papers
Haoxiang Tian Institute of Software, Chinese Academy of Sciences, Guoquan Wu Institute of Software at Chinese Academy of Sciences, China, Jiren Yan Institute of Software, Chinese Academy of Sciences, Yan Jiang Institute of Software, Chinese Academy of Sciences, Jun Wei Institute of Software at Chinese Academy of Sciences; University of Chinese Academy of Sciences, Wei Chen Institute of Software at Chinese Academy of Sciences, China, Shuo Li Institute of Software, Chinese Academy of Sciences, Dan Ye Institute of Software, Chinese Academy of Sciences
14:00
20m
Research paper
Consistent Scene Graph Generation by Constraint Optimization (Virtual)
Research Papers
Boqi Chen McGill University, Kristóf Marussy Budapest University of Technology and Economics, Sebastian Pilarski McGill University, Oszkár Semeráth Budapest University of Technology and Economics, Daniel Varro McGill University / Budapest University of Technology and Economics
14:20
20m
Industry talk
A Drift Handling Approach for Self-Adaptive ML Software in Scalable Industrial Processes (Virtual)
Industry Showcase
Firas Bayram Department of Mathematics and Computer Science, Karlstad University, Sweden, Bestoun S. Ahmed Karlstad University, Sweden, Erik Hallin Uddeholms AB, Sweden, Anton Engman Uddeholms AB, Sweden
Pre-print
14:40
10m
Paper
SML4ADS: An Open DSML for Autonomous Driving Scenario Representation and Generation (Virtual)
Late Breaking Results
Bo Li East China Normal University, Dehui Du East China Normal University, Sicong Chen East China Normal University, Minjun Wei East China Normal University, Chenghang Zheng East China Normal University, Xinyuan Zhang East China Normal University
14:50
10m
Vision and Emerging Results
XSA: eXplainable Self-Adaptation (Virtual)
NIER Track
Matteo Camilli Free University of Bozen-Bolzano, Raffaela Mirandola Politecnico di Milano, Patrizia Scandurra University of Bergamo, Italy
File Attached
15:00
20m
Industry talk
Design-Space Exploration for Decision-Support Software
Industry Showcase
Ate Penders Thales Research & Technology, Ana Lucia Varbanescu University of Twente, Gregor Pavlin Thales Research & Technology, Henk Sips Delft University of Technology