ICSA 2026
Mon 22 - Fri 26 June 2026

Since the introduction of the Artifacts Evaluation Track in 2021, ICSA has taken important steps toward improving reproducibility and transparency in software architecture research. This track encouraged authors to share their tools, datasets, and replication packages, setting a strong foundation for artifact quality and reusability.

Building on this success, we launch the ICSA 2026 Open Science Challenge.

This challenge is inspired by ongoing discussions, challenges, and meta-research efforts in the software architecture community and beyond.

This includes, among others:

  1. Fostering availability of (reusable) datasets and empirical rigor [Shaw2001; Babar2011; GalsterWeyns2016; KonersmannKaplanKühn2022; GalsterWeyns2023; Kazman2023]
  2. Declaration and guidelines for threats to validity [KonersmannKaplanKühn2022; Verdecchia2023; Lago2024]
  3. Semantic description of research knowledge [KonersmannKaplanKühn2022; VisuLite2023]
  4. Open Science Competitions at REFSQ and RE

Open Science challenges aim to provide long-term advantages for the research communities:

  • More efficient comparison of research contributions through structured artifact annotations.
  • Faster identification and assessment of related work based on open annotated materials.
  • A sustainable community-curated knowledge base supported by shared datasets and replication packages.

This challenge reflects ICSA’s commitment to long-term scientific integrity and aligns with global movements toward Open Science and FAIR principles (Findable, Accessible, Interoperable, Reusable). Together, we can shape a future where software architecture research is not only impactful but also transparent, collaborative, and sustainable.

Who can submit?

We invite all authors of accepted papers in the Research Papers track to engage in this challenge. Fame, honor, awards, and prize money await you.

Why participate?

By embracing Open Science, we aim, among other things, to:

  • Enable meta-analysis and evidence-based practices through standardized annotations
  • Shape the future of scientific publishing and communication
  • Enable effective and efficient knowledge sharing and accelerate innovation by making research outputs findable, reusable, and sustainable
  • Recognize and reward openness through dedicated awards and incentives

ICSA 2026 introduces its first Open Science Competition to promote openness in software architecture research. The initiative supports the global Open Science movement and calls on researchers to adopt FAIR principles and embrace practices that will shape the future of scientific publishing and communication.

The best submission receives the Open Research Knowledge Graph Award (ORKG Open Science Award), including prize money. The Open Science Co-Chairs will involve other reviewers to determine the winners.

Any Questions or Help Needed? Please contact the Open Science Co-Chairs Angelika Kaplan and Oliver Karras.

Challenge: Annotate your paper with SciKGTeX to describe its research contribution.

The accepted paper that is best annotated (see evaluation criteria) with SciKGTeX will be awarded the ORKG Open Science Award, including prize money of 100 €.

SciKGTeX is a LaTeX package that allows authors to semantically annotate the scientific contributions of their research in the LaTeX document at the time of writing. Annotations are stored as structured XMP metadata, making them harvestable by search engines and knowledge graphs. This supports semantic scientific communication and Open Science.
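As a minimal illustration (the sentence and the finding are hypothetical, not from any actual paper), an annotation is placed directly around the relevant passage while writing:

```latex
% A single SciKGTeX annotation embedded in running text (hypothetical finding).
% The annotated phrase is typeset normally and additionally stored
% as XMP metadata in the resulting PDF.
Our evaluation shows that \result{the layered variant reduces average
response time by roughly 15\%} compared to the baseline architecture.
```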

You can find the LaTeX package and more information at the following links:

Task

This challenge of the Open Science Competition asks authors to annotate their papers according to the paper annotation schema, covering all mandatory annotations and, where applicable, the optional ones.

Note: Experiments show that annotating a paper with SciKGTeX takes on average 7.5 minutes (typically 5–10 minutes).

Evaluation Criteria

Submissions will be evaluated by the following criteria:

  • Correctness of the annotated information (with respect to the manuscript).
  • Completeness of the annotations (bonus points for relevant and optional annotations).
  • Conciseness of the information in the annotations.
  • New contributions via issue templates and issue comments, which serve as the deciding criterion when two or more submissions are rated equally on the other criteria.

Participation

For participation, authors must submit their annotated camera-ready paper.

Contributing

Please note that this is a community-driven effort.

We encourage authors to mature, refine, and extend the annotation schema.

Suggestions (e.g., new categories or classes and their descriptions) can be submitted as issues in the Open Science Challenge Repository (see Contributing).

Researchers can comment on existing issues.

Please follow the code of conduct when contributing.

For this challenge, we use a holistic classification and data schema. This schema provides a structured way to describe software architecture research contributions according to their findings, validity, and evidence, ensuring consistency and transparency across submissions.

Paper Annotation Step-by-Step Guide

  1. Add the SciKGTeX files scikgtex.lua and scikgtex.sty to the LaTeX project
  2. Change the compiler to LuaLaTeX
  3. Add the package: \usepackage{scikgtex}
  4. Annotate the Research Field of the Paper (mandatory): \researchfield*{\uri{https://orkg.org/resource/R659055}{Software Architecture and Design}}
  5. Annotate your paper content (text) according to a given schema (see Paper Annotation Schema below)
  6. Recompile to generate a FAIR-annotated PDF
  7. Submit your annotated camera-ready paper
  8. Contribute to the community (see Contributing below)
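The steps above can be sketched as a minimal LaTeX project. This is an illustrative sketch only; the paper content and annotation values are hypothetical placeholders. It assumes scikgtex.sty and scikgtex.lua are in the project directory and the document is compiled with LuaLaTeX:

```latex
\documentclass{article}
\usepackage{scikgtex} % step 3

\begin{document}

% Step 4 (mandatory): the research field is metadata only and does not
% correspond to a visible text passage, hence the starred form.
\researchfield*{\uri{https://orkg.org/resource/R659055}{Software Architecture and Design}}

% Step 5: annotate paper content according to the schema (hypothetical text).
This \contribution{paper class}{validation research} paper investigates
\contribution{research object}{architecture decision records}.
We find that \result{structured decision records improve traceability}.

\end{document} % step 6: recompile to obtain a FAIR-annotated PDF
```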

Paper Annotation Schema

Note: If a text passage in your paper is not suitable for annotation, add an asterisk (*) to the command.

Example:

\researchfield{\uri{https://orkg.org/resource/R659055}{Software Architecture and Design}}

\researchfield*{\uri{https://orkg.org/resource/R659055}{Software Architecture and Design}}

Category: Paper Class

\contribution*{paper class}{“The category of the paper class based on Wieringa et al.”}

Category: Research Object

\contribution{research object}{“The category describes the object(s) under investigation”}

Category: Research Method

\contribution{research method}{“The type of research method used”}

Category: Result

\result{“Your annotated text describing the research result(s)”}

Category: Threats to Validity

\contribution{external validity}{“Your annotated text describing a threat to external validity”}

\contribution{internal validity}{“Your annotated text describing a threat to internal validity”}

\contribution{construct validity}{“Your annotated text describing a threat to construct validity”}

\contribution{confirmability validity}{“Your annotated text describing a threat to confirmability”}

\contribution{repeatability validity}{“Your annotated text describing a threat to repeatability”}

  • Note that you can have multiple annotations of the same type.
  • For descriptions and examples of validity threats, see Threats to Validity.
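As a sketch (the threat descriptions below are hypothetical), several annotations of the same or different types can be placed in a threats-to-validity section:

```latex
% Hypothetical threats-to-validity passage; each annotation wraps the
% sentence in the manuscript that describes the threat.
\contribution{external validity}{Our results are based on three open-source
systems and may not generalize to industrial settings.}
\contribution{external validity}{All subject systems are written in Java.}
\contribution{internal validity}{Manual classification of commits may have
introduced researcher bias.}
```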

Category: Evaluated Property (optional)

\contribution{evaluated property}{“The property evaluated by a research method”}

  • For examples and descriptions of potential properties, see Description of Properties.

Category: Supplementary Materials (mandatory)

\contribution{replication package}{“Your URL to the replication package”}

\contribution{code repository}{“Your URL to the code repository”}

\contribution{dataset}{“Your URL to the dataset”}
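For example, assuming a paper whose artifacts are archived at the placeholder URLs below (use your actual, persistent artifact locations, e.g., a DOI), the mandatory supplementary-material annotations could look like:

```latex
% Placeholder URLs for illustration only.
\contribution{replication package}{https://example.org/replication-package}
\contribution{code repository}{https://example.org/code}
\contribution{dataset}{https://example.org/dataset}
```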
