Improving Behavior-Driven Development Scenarios: Empirical Evaluation of a Quality Assessment Framework
Behavior-Driven Development (BDD) is an agile practice used to specify expected system behavior for validating a feature. BDD uses scenarios written in structured natural language which, when combined with a user story, express a functional requirement more concisely. However, BDD scenarios often suffer from ambiguity, redundancy, and lack of focus, which limits their effectiveness in validating intended requirements. A recently introduced evaluation framework, Quality Attributes-Based Guidelines for Evaluation (QABAGE), defines seven key attributes (Uniqueness, Integrity, Essentiality, Singularity, Completeness, Clarity, and Focus) to improve the quality of BDD scenarios. Although QABAGE has undergone preliminary ex-ante evaluation with software engineering experts, establishing its acceptance and practical utility remains an ongoing process. Effective design and implementation of such frameworks, which aim to enhance software engineering practice, require both ex-ante and iterative ex-post evaluations to assess their impact across different contexts and conditions. Building on prior research, we empirically evaluate QABAGE by analyzing scenarios written first without and then with the framework, combined with semi-structured interviews that assess its perceived structure and utility. The findings suggest that using QABAGE as a guiding framework enhances the Essentiality and Completeness of BDD scenarios, with participants reporting clearer, more readable scenarios and reduced ambiguity during the scenario-writing process. However, challenges emerged in applying Singularity, particularly in decomposing complex functionalities into distinct, manageable elements within a single scenario. This paper provides insights for improving BDD practices and highlights the need for techniques that bridge communication between technical and non-technical stakeholders.