16th International Workshop on Graph Computation Models
Graphs are common mathematical structures which are visual and intuitive. They constitute a natural and seamless way for system modeling in science, engineering and beyond, including computer science, life sciences, business processes, etc. Graph computation models constitute a class of very high-level models where graphs are first-class citizens. They generalize classical computation models based on strings or trees, such as Chomsky grammars or term rewriting systems. Their mathematical foundation, in addition to their visual nature, facilitates specification, validation and analysis of complex systems. A variety of computation models have been developed using graphs and rule-based graph transformation. These models include features of programming languages and systems, paradigms for software development, concurrent calculi, local computations and distributed algorithms, and biological and chemical computations.

The International Workshop on Graph Computation Models aims at bringing together researchers interested in all aspects of computation models based on graphs and graph transformation. It promotes the cross-fertilizing exchange of ideas and experiences among young and senior researchers from different communities who are interested in the foundations, applications, and implementations of graph computation models and related areas.
Previous editions of the GCM series were held in Natal, Brazil (GCM 2006), in Leicester, UK (GCM 2008), in Enschede, The Netherlands (GCM 2010), in Bremen, Germany (GCM 2012), in York, UK (GCM 2014), in L’Aquila, Italy (GCM 2015), in Vienna, Austria (GCM 2016), in Marburg, Germany (GCM 2017), in Toulouse, France (GCM 2018), in Eindhoven, The Netherlands (GCM 2019), online (GCM 2020 and GCM 2021), in Nantes, France (GCM 2022), in Leicester, UK (GCM 2023) and in Enschede, The Netherlands (GCM 2024).
This program is tentative and subject to change.
Tue 10 Jun (displayed time zone: Amsterdam, Berlin, Bern, Rome, Stockholm, Vienna)
11:00 - 12:30
- 11:00 (30 min, paper): Introducing The Maximum Common Bigraph Problem. Kyle Burns (University of Glasgow), Michele Sevegnani (University of Glasgow), James Trimble (University of Glasgow), Ciaran McCreesh
- 11:30 (30 min, paper): Implementing Binary Search Trees in GP2 (Work in Progress)
- 12:00 (30 min, paper): Comparing Synchronization Blocks with Double Pushout Synchronization. Georg Hinkel (RheinMain University of Applied Sciences, Wiesbaden, Germany)

13:30 - 15:00
- 13:30 (30 min, paper): Termination of Graph Rewriting using Weighted Type Graphs over Non-well-founded Semirings. Qi Qiu (Université Grenoble Alpes)
- 14:00 (30 min, paper): Parsing Hypergraphs using Context-Free Positional Grammars
- 14:30 (30 min, paper): Systems of Graph Formulas and their Equivalence to Alternating Graph Automata. Frank Drewes (Umeå University), Berthold Hoffmann (Universität Bremen), Mark Minas (Universität der Bundeswehr München)
Call for Papers
GCM 2025 solicits papers on all aspects of graph computation models. This includes but is not limited to the following topics:
Foundations
- Models of graph transformation
- Machine-learning techniques for graph transformation
- Analysis and verification of graph transformation systems
- Parallel, concurrent, and distributed graph transformation
- Term graph rewriting
- Formal graph languages
Applications
- Graph-based programming models and visual programming
- Model-driven engineering
- Machine learning
- Evolutionary computation
- Software architectures, validation and evolution
- Databases
- Graph-based security models
- Workflow and business processes
- Social network analysis
- Bioinformatics and computational chemistry
- Quantum computing
- Case studies
Submissions and Publication
Authors are invited to submit papers in two categories:
- Regular papers of at most 16 pages describing innovative contributions.
- Short papers (work in progress, system descriptions, or position papers) of 4 pages.
Papers in PDF format should be submitted electronically via the EasyChair system. Submissions must use the EPTCS LaTeX style. For regular and short papers, simultaneous submission to other conferences with proceedings, as well as submission of material that has already been published elsewhere, is not allowed. The page limits include references. An optional appendix may be added if this is useful for the reviewing process. If a short announcement extensively draws on already published work, a copy of that work is to be attached to the submission.
All submissions will be reviewed by the program committee. Electronic proceedings will be available at the time of the workshop. The authors of selected (regular and short) papers will be invited to submit revised versions for the post-proceedings. The latter will appear in the Electronic Proceedings in Theoretical Computer Science (EPTCS).
Call for Lightning Talks
We are pleased to announce the addition of lightning talks to the GCM workshop program. Lightning talk submissions are invited on the topic of Graph Transformation and AI. They will undergo a lightweight review and will mainly be assessed for their potential to stir discussion on future research of the community. Authors of convincing lightning talks that do not recapitulate published work may be invited to submit a full paper to the GCM post-proceedings. Presentations of lightning talks will be followed by a panel discussion.
Submission Guidelines
- Extended Abstract Length: Lightning talk submissions should be a maximum of 2 pages.
- Content: Contributions may be based on ideas that are related to Graph Transformation and AI, regardless of whether these ideas have already been published elsewhere or are as yet unpublished.
- Presentation Slot: Accepted abstracts will be allotted a 5-minute slot for presentation during the program, followed by an open discussion.
- Submission Process: Submissions must use the EPTCS LaTeX style and be submitted electronically in PDF via the EasyChair system.
Graph Transformation and AI
Artificial Intelligence (AI), and Machine Learning (ML) in particular, is driving significant progress in computer applications. While ML is data-driven, AI also includes rule-based symbolic approaches to modelling and automated reasoning, such as in automated theorem proving, the Semantic Web, and Knowledge Graph technology.
Graph-like data is ubiquitous in applications, for example, in social networks, linked medical datasets, topological and geometric modelling in design, engineering and geography, software models and architectures, molecular modelling, etc.
There is significant interest in ML and AI technology utilising graph-structured data, including a range of Graph Neural Network (GNN) and Graph Transformer (GT) architectures. Data in ML needs to be cleaned, integrated, mapped and translated before training and inference. This process, referred to as data wrangling, also applies to graph data. Hence, areas where rule-based graph transformation and graph-based AI can benefit from each other include:
- AI for Graph Transformation: Graph-based AI to address graph transformation problems, such as inference of graphs, schemas, constraints, rules, rates and probabilities, grammars or control programs from input/output graphs, (timed) graph transition sequences, temporal graphs, or natural language requirements; and analysis, interrogation and explanation of graph transformation models, providing an alternative (e.g. NLP) interface to existing tools or interpreting models directly.
- Graph Transformation for AI: Graph-based AI and data wrangling are graph transformations, although not usually defined in a rule-based way. GNNs operate on graphs to infer node attributes, nodes and/or edges; rule-based graph transformation can provide a common computational model and theory for them. GTs are trained from examples to transform input graphs to output graphs; a rule-based approach could raise the level of abstraction and help explainability. Graph data wrangling can also benefit from concepts, theories and tools developed for model transformations, e.g. triple graph grammars, which support consistency checking, mapping, integration and translation of graph structures.
In both directions, we are interested in exploring the link between data-driven ML and symbolic AI for graph data, where graph transformation, operating on data in a rule-based way, connects ML and symbolic reasoning by:
- Extracting rule-based symbolic specifications from data
- Providing higher-level or domain-specific input to ML in the form of rules and constraints
We invite short Lightning Talk presentations for a session on Graph Transformation and AI at GCM, addressing questions such as:
- Data: How to provide sufficiently large, high-quality training data for graph-based ML techniques? How to use graph transformation techniques to aid ML techniques (e.g. data wrangling, outlier detection, etc.)?
- Computation: How can AI be used to address common graph transformation problems, or what are the new opportunities? How can rule-based graph transformation provide a platform to support, formalise, specify, analyse or implement graph-based AI?
- Analysis: How can rule-based graph transformation bridge the gap between symbolic AI and ML? How to use and evaluate Large Language Models (LLMs) to solve graph-related problems?
Contributions can discuss problems and challenges, potential application scenarios, or solution ideas, including ideas on how to teach this subject.