ETAPS 2019
Sat 6 - Thu 11 April 2019 Prague, Czech Republic

The CREST 2019 workshop is the fourth in a series of workshops addressing formal approaches to reasoning about causation in systems engineering. The topics of formally identifying the cause(s) of specific events (usually some form of failure) and of explaining why they occurred are increasingly the focus of several disjoint communities.

The main objective of CREST is to bring together researchers and practitioners from industry and academia in order to enable discussions of how explicit and implicit reasoning about causation is performed. A further objective is to link to the foundations of causal reasoning in the philosophy of science and to causal reasoning performed in other areas of computer science, engineering, and beyond.

Previous editions: CREST 2018, 2017 and 2016.

Accepted Papers

Causality & Control Flow [Robert Künnemann, Deepak Garg, Michael Backes]

Extending Causal Models from Machines into Humans [Severin Kacianka, Amjad Ibrahim, Alexander Pretschner, Alexander Trende, Andreas Lüdtke]

Justification Based Reasoning in Dynamic Conflict Resolution [Werner Damm, Martin Fränzle, Willem Hagemann, Paul Kröger, Astrid Rakow]

Towards A Logical Account of Epistemic Causality [Shakil M. Khan, Mikhail Soutchanski]

Call for Papers

Today’s IT systems, and the interactions between them, are becoming increasingly complex. Power grid blackouts, airplane crashes, failures of medical devices and malfunctioning automotive systems are just a few examples of incidents that affect system safety. They are often due to component failures and unexpected interactions of subsystems under conditions that have not been anticipated during system design and testing. The failure of one component may entail a cascade of failures in other components; several components may also fail independently. In the security domain, localizing the instructions and tracking the agents responsible for information leakage and other system attacks is a central problem. Determining the root cause(s) of a system-level failure and elucidating the exact scenario that led to the failure is today a complex and tedious task that requires significant expertise. Formal approaches for automated causality analysis, fault localization, explanation of events, accountability and blaming have been proposed independently by several communities - in particular, AI, concurrency, model-based diagnosis, software engineering, security engineering and formal methods. Work on these topics has gained significant momentum in recent years.

The goals of this workshop are to bring together and foster exchange between researchers from the different communities, and to present and discuss recent advances and new ideas in the field. Topics of interest include, but are not limited to:

  • foundation of causal reasoning about systems in the philosophy of sciences
  • languages and logics for causal specification and causal analysis
  • definitions of causality and explanation
  • causality analysis on models, programs, and/or traces
  • fault localization
  • causal reasoning in security engineering
  • causality in accident analysis, safety cases and certification
  • fault ascription and blaming
  • accountability, explainability of algorithms and systems
  • applications, implementations, tools and case studies of the above

Submissions should be prepared in EPTCS style, with a length of up to 15 pages. All contributions must be submitted via the EasyChair submission site for CREST 2019. All contributed papers will be reviewed by at least 3 PC members. Revised versions of selected papers will be published as formal post-workshop proceedings in the Electronic Proceedings in Theoretical Computer Science. At least one of the authors of an accepted paper needs to register for the workshop and present the paper in order for it to be included in the post-workshop proceedings.


Sun 7 Apr

Displayed time zone: Amsterdam, Berlin, Bern, Rome, Stockholm, Vienna

14:00 - 15:30
Algebraic approaches to causality (CREST, room S6)

Coalgebras for causality
Matteo Sammartino, University College London

Causality and diagrammatic reasoning
Jean Krivine

(Cancelled) Process theories, string diagrams, and black box causal reasoning
Aleks Kissinger, Radboud University

Speaker: Luke Fenton-Glynn, University College London

Title: Probabilistic Actual Causation

Abstract: Actual causation - the sort of causal relation asserted to hold by claims like ‘the Chicxulub impact caused the Cretaceous-Paleogene extinction event’, ‘the H7N9 virus outbreak was caused by poultry farmers becoming simultaneously infected by bird and human flu strains’, and ‘the collapse of Bridge 9340 on I-35W was caused by the failure of the gusset plates at position U10’ - is of significance to scientists, historians, and tort and criminal lawyers. Progress has been made in explicating the actual causal relation in the deterministic case by means of structural equation models and causal graphs. In this work, I seek to make similar progress concerning the probabilistic case by using probabilistic causal models and associated causal graphs.
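The deterministic starting point mentioned in the abstract can be made concrete with a toy structural equation model. The following sketch is purely illustrative (it is not the speaker's formalism, and the variable names and scenario are invented): it evaluates a small acyclic model and runs a simple "but-for" counterfactual test by intervening on one variable.

```python
# Illustrative sketch: a tiny deterministic structural equation model
# and a but-for counterfactual test. Names and scenario are invented.

def evaluate(equations, exogenous, interventions=None):
    """Solve an acyclic model by applying equations until values stabilize."""
    values = dict(exogenous)
    if interventions:
        values.update(interventions)
    changed = True
    while changed:
        changed = False
        for var, fn in equations.items():
            if interventions and var in interventions:
                continue  # intervened-upon variables are held fixed
            v = fn(values)
            if values.get(var) != v:
                values[var] = v
                changed = True
    return values

# Endogenous equation: fire breaks out if lightning strikes or a match is dropped.
equations = {"fire": lambda v: v["lightning"] or v["match"]}

# Actual world: lightning struck, no match was dropped.
actual = evaluate(equations, {"lightning": True, "match": False})

# But-for test: intervene to remove the lightning strike.
counterfactual = evaluate(equations, {"lightning": True, "match": False},
                          interventions={"lightning": False})
print(actual["fire"], counterfactual["fire"])  # True False: lightning is a but-for cause
```

Well-known overdetermination cases (e.g. two dropped matches) show why the full structural-equation account needs more than this but-for test; the probabilistic generalization discussed in the talk goes further still.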

Speaker: Holger Hermanns, Saarland University

Title: Towards a Science of Perspicuous Computing - Lessons learnt from the Analysis of Automotive Emissions Control Systems

Abstract: From autonomous vehicles to smart homes and cities – increasingly, computer programs participate in actions and decisions that affect humans. However, our understanding of how these applications interact and of what causes a specific automated decision cascade is lagging far behind. It is nowadays virtually impossible to provide scientifically well-founded answers to questions about the exact reasons that lead to a particular decision, let alone about accountability in case of the malfunctioning of, say, an exhaust aftertreatment system in a modern car. The root of the problem is that contemporary systems do not have any built-in concepts to explicate their behaviour. They calculate and propagate outcomes of computations, but are not designed to provide explanations. They are not perspicuous.

This keynote will discuss the need for establishing a science of perspicuous computing as the key to enable comprehension in a cyber-physical world. Concretely, we will discuss lessons learnt from applying model checking, model-based testing and run-time verification approaches to uncover software problems in cars equipped with modern combustion engines. This work is placed in the context of focussed activities that are currently being ramped up as part of the DFG-funded Transregional Collaborative Research Centre 248.

Speaker: Jean Krivine

Title: Causality and diagrammatic reasoning


Speaker: Stefan Leue, University of Konstanz

Title: Analysis, Repair and Causality for Timed Diagnostic Traces

Abstract: I present algorithms and techniques for the repair of timed system models, given as networks of timed automata (NTA). The repair is based on an analysis of timed diagnostic traces (TDTs) that are computed by real-time model checking tools, such as UPPAAL, when they detect the violation of a timed reachability property. We present an encoding of TDTs in quantifier-free linear real arithmetic and use the minimal satisfiability core capabilities of the SMT solver Z3 to compute possible repairs. We then present an admissibility criterion, called functional equivalence, that assesses whether a proposed repair is admissible in the overall context of the NTA. I will then describe the architecture of a proof of concept tool called TarTar, which implements the analysis, repair and admissibility test that I describe. I will finally discuss the relationship of the analysis to causal reasoning in timed systems and present some experimental results.

This is joint work with Martin Koelbl (University of Konstanz) and Thomas Wies (New York University).
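The minimal-core idea at the heart of the repair analysis can be illustrated without an SMT solver. The talk uses Z3's minimal satisfiability core on quantifier-free linear real arithmetic; the sketch below only mimics that idea on an invented toy constraint class (lower/upper bounds on a single clock delay d), using deletion-based core minimization. Constraint names and bounds are hypothetical, not taken from TarTar or UPPAAL.

```python
# Illustrative sketch of minimal unsat cores, on toy interval constraints
# over a single delay d. Not the TarTar/Z3 implementation.
import math

def satisfiable(constraints):
    """Bounds on d are jointly satisfiable iff the largest lower bound
    does not exceed the smallest upper bound."""
    lo = max((b for kind, b in constraints.values() if kind == "ge"), default=-math.inf)
    hi = min((b for kind, b in constraints.values() if kind == "le"), default=math.inf)
    return lo <= hi

def minimal_unsat_core(constraints):
    """Deletion-based minimization: try dropping each constraint;
    keep it only if dropping it would restore satisfiability."""
    core = dict(constraints)
    for name in list(core):
        trial = {k: v for k, v in core.items() if k != name}
        if not satisfiable(trial):
            core = trial  # this constraint is not needed for the conflict
    return core

# Toy "timed diagnostic trace": the delay d must satisfy all of these.
constraints = {
    "guard":     ("ge", 5.0),   # d >= 5
    "invariant": ("le", 2.0),   # d <= 2
    "deadline":  ("le", 10.0),  # d <= 10
}
core = minimal_unsat_core(constraints)
print(sorted(core))  # ['guard', 'invariant']: the conflicting pair to repair
```

A repair then amounts to modifying one of the constraints in the core (e.g. relaxing the invariant's bound) and re-checking admissibility against the whole model, which is where the functional-equivalence criterion from the talk comes in.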

Speaker: Sisi Ma, University of Minnesota

Title: Computational Causal Discovery and its Applications

Abstract: With the rapid accumulation of high variety high volume data, there is an increasing demand for computational methods that can perform effective and systematic knowledge discovery. In this talk, I will give an overview of computational causal discovery methods. These methods aim to discover the underlying data generation processes, i.e. causal mechanisms, from observational data, experimental data, and the combination of the two. The operating principles and the general analytical frameworks of computational causal discovery methods will be introduced, along with examples of their applications in biomedical sciences. I will also discuss causal feature selection, one important intersection of computational causal discovery and supervised learning predictive modeling. Causal feature selection is a class of feature selection methods that utilizes the causal relationships among the outcome and the predictors for feature selection. Causal feature selection methods produce predictor sets that result in robust, parsimonious, and interpretable predictive models.

Speaker: Matteo Sammartino, University College London

Title: Coalgebras for causality

Abstract: In this talk I will present a coalgebraic perspective on causality. I will introduce a general notion of causal operational model in the form of coalgebras over structured sets, including causal information in the form of posets. In this richer setting, the canonical final semantics coincides with history preserving bisimulation. These coalgebras encompass known models of causality for processes and Petri nets, such as causal trees (Darondeau-Degano ’90) and behaviour structures (Trakhtenbrot-Rabinovich ’88). Finally, I will show how the additional structure of coalgebras enables deriving equivalent, but more succinct, automata models via a categorical equivalence.