Call for Papers
High-quality datasets and a robust evaluation framework are essential for the development and evaluation of foundation models (FMs). The Benchmarking track serves as a forum for publishing high-quality research on machine learning datasets and benchmarking results that extend beyond traditional evaluation metrics. In particular, this track encourages publications that advance the frontiers of data quality and benchmarking standards and thereby facilitate the development and assessment of FMs for software engineering (SE).
Scope
This track accepts two types of submissions in the context of software engineering: (1) data papers and (2) benchmarking papers.
1. Data papers are expected to include:
- New datasets, or carefully and thoughtfully designed (collections of) datasets based on previously available data.
- Data generators and reinforcement learning environments.
- Data-centric AI methods and tools, e.g., to measure and improve data quality or utility, or studies in data-centric AI that bring important new insights.
- Advanced practices in data collection and curation, which are of general interest even if the data itself cannot be shared.
- Frameworks for responsible dataset development, audits of existing datasets, and identifying significant problems with existing datasets and their use.
2. Benchmarking papers are expected to include:
- Benchmarks on new or existing metrics, as well as benchmarking tools.
- Systematic analyses of existing systems on novel datasets that yield important new insights.
Criteria
We aim for evaluation criteria specifically suited to data and benchmarking papers.
1. For data papers:
- value, usefulness, and reusability of the datasets or tools;
- quality of the presentation;
- clarity of relation with related work and its relevance to software engineering;
- accessibility of the datasets or tools, i.e., the data can be found and obtained without a personal request, and any required code should be open source.
2. For benchmarking papers:
- the relevance of the proposed benchmark for the FORGE audience;
- the originality of its underlying ideas;
- the quality of the presentation;
- the usefulness of the results;
- the outreach of the proposed tool, metric, or dataset.
Submission Instructions
Regardless of paper type, all papers submitted to this track are limited to a maximum of 4 pages, plus 1 additional page for references.
We encourage all authors to disclose (anonymized and curated) data/artifacts to increase reproducibility and replicability. Note that sharing research artifacts is not mandatory for submission or acceptance. However, sharing is expected to be the default, and non-sharing needs to be justified.
All submissions must be in PDF. The page limit is strict, and it will not be possible to purchase additional pages at any point in the process (including after acceptance).
Submissions must conform to the IEEE conference proceedings template, as specified in the IEEE Conference Proceedings Formatting Guidelines (title in 24pt font and full text in 10pt type). LaTeX users must use \documentclass[10pt,conference]{IEEEtran} without including the compsoc or compsocconf options.
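As a rough illustration only (not an official template), a minimal LaTeX skeleton consistent with these requirements could look as follows; the title and author placeholders are ours, and the anonymized author block reflects the double-anonymous policy described below.

    \documentclass[10pt,conference]{IEEEtran}
    % Do not add the compsoc or compsocconf options.
    \begin{document}
    % Placeholder title; author names are omitted for double-anonymous review.
    \title{Paper Title}
    \author{\IEEEauthorblockN{Anonymous Author(s)}}
    \maketitle
    \begin{abstract}
    Abstract text.
    \end{abstract}
    Body text in 10pt type.
    \end{document}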
Note that we use double-anonymous reviewing. Be sure to remove the list of authors from the submitted paper. If citing your own prior work, please do so in the third person to obscure your relationship with it. For advice, guidance, and explanation about the double-anonymous review process, see the ICSE Research Track’s Q&A page.
All papers must be written in English. The authors are strongly encouraged to use the HotCRP format checker on their submissions. Note that the format checker is not perfect. In particular, it can complain about small fonts in figures, footnotes, or references. As long as the main text follows the requested format, and the figures are readable, the paper will not be rejected for format violations. If you have any concerns, please contact the program chairs.
All papers should be made accessible to people with disabilities. Some guidelines from the SIGACCESS community are available here: https://assets21.sigaccess.org/creating_accessible_pdfs.html.
Please submit your paper on HotCRP: https://forge25-benchmarking.hotcrp.com/
Important dates (AoE)
- Full paper submission deadline: Dec 13, 2024
- Author notification: Jan 14, 2025
- Camera-ready deadline: Feb 21, 2025