MOBILESoft 2023
Dates to be announced
Melbourne, Australia
co-located with ICSE 2023

Call for Papers

Introduction

This Tool Demos and Dataset track welcomes submissions on reusable tools and datasets from either practice or research. These tools and datasets should showcase novel, practical contributions that help software architects, designers, researchers, and engineers jumpstart their own workflows and reproduce earlier work. Authors of each accepted submission will give a short presentation of their solution, followed by a detailed demo session.

Both categories may range from early prototypes to in-house or pre-commercialized products. Authors of regular MOBILESoft papers are also welcome to submit an accompanying MOBILESoft Tool Demos and Dataset paper by adding information about the actual demo. Each contribution must be submitted as a short paper of up to four (4) pages of main text, with an extra page allowed for references.

Dataset Showcase

Dataset showcase submissions are expected to include:

  • A description of the data source;
  • A description of the methodology used to gather the data (including provenance and the tool used to create/generate/gather the data, if any);
  • A description of the storage mechanism, including a schema if applicable;
  • If the data has been used by the authors or others, a description of how this was done, including references to previously published papers;
  • A description of the originality of the dataset (that is, even if the dataset has been used in a published paper, its complete description must be unpublished) and of similar existing datasets (if any);
  • Ideas for further research questions that could be answered using the dataset;
  • Ideas for further improvements that could be made to the dataset; and
  • Any limitations and/or challenges in creating or using the dataset.

Tool Demo Showcase

Tool demo showcase submissions are expected to include:

  • A description of the tool, including its background, motivation, novelty, overall architecture, detailed design, and preliminary evaluation, as well as a link to download or access the tool;
  • A description of the design of the tool and how to use it in practice;
  • Clear installation instructions and an example dataset that allow the reviewers to run the tool;
  • If the tool has been used by the authors or others, a description of how it was used, including references to previously published papers, and ideas for future reuse of the tool; and
  • Any limitations of using the tool.

Formatting and Submission Instructions

All submissions must conform to the MOBILESoft 2023 formatting and submission instructions available at https://www.acm.org/publications/proceedings-template for both LaTeX and Word users. LaTeX users must use the provided acmart.cls and ACM-Reference-Format.bst without modification, enable the conference format in the preamble of the document (i.e., \documentclass[sigconf,review,anonymous]{acmart}), and use the ACM reference format for the bibliography (i.e., \bibliographystyle{ACM-Reference-Format}). The review option adds line numbers, thereby allowing referees to refer to specific lines in their comments. The conference information can be set using the following command: \acmConference[MobileSoft’23]{The 10th International Conference on Mobile Software Engineering and Systems}{Melbourne, VIC, Australia}.
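For LaTeX users, the instructions above can be combined into a preamble along the following lines. This is a minimal sketch, not an official template excerpt: the title is a placeholder, the bibliography file name is illustrative, and authors are intentionally omitted because the anonymous option suppresses author information for double-anonymous review.

```latex
% Minimal sketch following the formatting instructions above.
% The title and bibliography file name are placeholders.
\documentclass[sigconf,review,anonymous]{acmart}

% Conference information as specified in the call:
\acmConference[MobileSoft'23]{The 10th International Conference on
  Mobile Software Engineering and Systems}{Melbourne, VIC, Australia}

\begin{document}

\title{A Placeholder Title}
\maketitle

% ... paper body (at most 4 pages of main text) ...

% ACM reference format for the bibliography:
\bibliographystyle{ACM-Reference-Format}
\bibliography{references}

\end{document}
```

The review option shown here adds the line numbers referees use in their comments, and anonymous hides author information, so both should remain enabled for the submitted version.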

All submissions to the Tool Demo and Dataset track must not exceed 4 pages for the main text, inclusive of all figures, tables, appendices, etc. An extra page is allowed for references. All submissions must be in PDF. The page limit is strict, and it will not be possible to purchase additional pages at any point in the process (including after the paper is accepted).

Papers that do not conform to these guidelines will be desk-rejected before the review process.

Submissions may be made through HotCRP at: https://tools-datasets-mobilesoft-2023.hotcrp.com/

The Tool Demo and Dataset track at MOBILESoft 2023 will adopt a double-anonymous review process. No submitted paper may reveal its authors’ identities. The authors must make every effort to honor the double-anonymous review process; reviewers will likewise be asked to honor it as much as possible. Any author with further questions about double-anonymous reviewing is encouraged to contact the track’s program co-chairs by email. Any submission that does not comply with the double-anonymous review process will be desk-rejected. Further advice, guidance, and explanation about the double-anonymous review process can be found on the ICSE Q&A page: https://conf.researchr.org/track/icse-2023/icse-2023-submitting-to-icse2023--q-a

Review and Evaluation Criteria

Each submission will be reviewed by three members of the program committee. The main evaluation criteria include:

  • The value, usefulness, and reusability of the tools and/or datasets.
  • The quality of the presentation.
  • The clarity of its relation to related work and its relevance to mobile software engineering.
  • The availability of tools and/or datasets.

Tools and Datasets Availability

To promote replicability and to disseminate the advances achieved with the tools and datasets, we strongly encourage making the tools and datasets publicly available for download and use, ideally under an open-source software license. The tool/dataset should be made available at the time the paper is submitted for review, but it will be considered confidential until publication of the paper. Whenever the tool and/or dataset is not made publicly available, the paper must include a clear explanation of why this was not possible.

The tool/dataset should include detailed instructions on how to set up the environment (e.g., a requirements.txt file) and how to use the tool/dataset (e.g., how to run the tool on a working example, how to import the data, or how to access the data once it has been imported).

To further increase the visibility of the presented tools and datasets, we require all authors of accepted papers to produce a screencast presenting their tool or dataset. Accepted papers will be linked with their accompanying screencasts on the Tool Demo and Dataset track website.