CoAST is the 1st International Workshop on Context-aware and Adaptive Software systems Testing. It aims to bring together researchers, practitioners, and tool developers working on topics related to the testing and quality assessment of context-aware software (CAS) and adaptive software (AS) systems.
We also welcome contributions that present novel methodologies, techniques, and tools showing how context-awareness or adaptiveness is dealt with throughout the software testing process. We will favor contributions grounded in industrial experience and applications, and give special consideration to those of an empirical nature. Nonetheless, we welcome industry reports detailing the successful and failed delivery of such systems, with a reflective discussion that can help bridge the gap between industry and academia.
Relevant topics include model-based and model-driven testing of CAS and AS systems; empirical studies and industrial experiences in testing CAS and AS systems; and artificial intelligence and big data analytics applied to CAS and AS testing.
NEWS: This year CoAST will be co-located with INTUITESTBEDS (International Workshop on User Interface Test Automation and Testing Techniques for Event Based Software).
Thu 20 Apr (time zone: Dublin)
16:00 - 17:30
- Activity: Focus Group on Challenges for GUI and Context Aware Testing
- Beyond Combinatorial Interaction Testing: On the need for transition testing in dynamically adaptive context-aware systems
- Deep Industry Use Cases on Context-Aware Adaptive Mobile Systems Experience Testing
Call for Papers
The development of contemporary software systems usually integrates a myriad of other systems and devices into a single unit to provide services to users. In our perspective, contemporary software systems encompass the likes of ubiquitous systems, Industry 4.0, the Internet of Things, smart cities, mobile systems, intelligent environments, and cyber-physical systems. Of particular interest is the capability of some of these systems to adapt their behavior based on the status of the context they are placed in.
This workshop aims to provide researchers and practitioners a forum for exchanging ideas, experiences, understanding of the problems, visions for the future, and promising solutions to the problems in software testing of CAS and AS systems. The workshop will also provide a platform for researchers and developers of testing tools to work together to identify the problems in the theory and practice of test automation, set a plan, and lay the foundation for future development. Moreover, with this workshop, we aim to identify novel, emerging, and still preliminary academic or industrial practices that have yet to be published or studied in mainstream academic literature. We are convinced that industry and academia are currently approaching the domain with practices and methodologies that can interest computer scientists, software engineers, and practitioners engineering context-aware or adaptive software systems.
We solicit novel papers related to the following topics (though not limited to them) in the context of context-aware software (CAS) and adaptive software (AS) systems testing:
- Novel ideas, methodologies or tools to test context-aware software (CAS) and adaptive software (AS) systems.
- Empirical studies, established or initial results, resulting from the testing of context-aware software (CAS) and adaptive software (AS) systems.
- Case studies from industrial collaborations on testing of context-aware software (CAS) and adaptive software (AS) systems.
- Case studies on the application of modern technologies (model-based testing, simulations, artificial intelligence, digital twins, among others) that can be used to support the testing of context-aware software (CAS) and adaptive software (AS) systems.
- Novel secondary or tertiary studies in context-aware software (CAS) and adaptive software (AS) testing.
- Novel methodologies, techniques, and tools to test autonomous driving systems or Advanced Driver Assistance Systems.
- Novel methodologies, techniques, and tools to test cyber-physical software systems.
- Novel methodologies, techniques, and tools to test context-aware and adaptive mobile systems and mobile applications.
- Novel methodologies, techniques, and tools to test satellite, aerospace, drones, unmanned drones, and aeronautical software systems.
Accepted papers will be published as part of the ICST workshop proceedings through the IEEE Digital Library.
Papers can be of one of the following types:
- Full research contributions (max. 8 pages): papers presenting research results or industrial practices in CAS and AS testing.
- Position papers (max. 4 pages): papers presenting new ideas, preliminary results, or important directions in CAS and AS testing.
- Demo papers (max. 4 pages): papers presenting tools for the automation or semi-automation of CAS and AS testing processes.
- Industrial presentations (max. 2 pages): papers describing industrial experiences and case studies in the CAS and AS testing fields.
- Negative results (max. 2 pages): papers describing the application of ideas/tools/approaches that did not yield the expected benefits (in industry or in the lab) or whose lessons learned rendered them unsuitable for testing CAS and AS systems.
The reviewing process is single blind. Therefore, papers do not need to be anonymized. Papers must conform to the two-column IEEE conference publication format and should be submitted via EasyChair.
Templates for LaTeX and Microsoft Word are available from https://www.ieee.org/conferences/publishing/templates.html