Gamification is an emerging technique to enhance motivation and performance in traditionally unengaging tasks like software testing. Previous studies have indicated that gamified systems have the potential to improve software testing processes by providing testers with achievements and feedback. However, further evidence of these benefits across different environments, programming languages, and participant groups is required. This paper aims to replicate and validate the effects of IntelliGame, a gamification plugin for IntelliJ IDEA that engages developers in writing and executing tests. The objective is to generalize the benefits observed in earlier studies to new contexts, namely the TypeScript programming language and a larger participant pool. The replication study consists of a controlled experiment with 174 participants, divided into two groups: one using IntelliGame and a control group with no gamification plugin. The study employed a two-group experimental design to compare testing behavior, coverage, mutation scores, and participant feedback between the groups. Data were collected through test metrics and participant surveys, and statistical analysis was performed to assess the significance of the observed differences. Participants using IntelliGame showed higher engagement and productivity in testing practices than the control group, evidenced by more tests written, more frequent test executions, and greater use of testing tools. This ultimately led to better code implementations, highlighting the effectiveness of gamification in improving functional outcomes and motivating users in their testing activities. The replication study confirms that gamification, through IntelliGame, positively impacts software testing behavior and developer engagement in coding tasks. These findings suggest that integrating game elements into the testing environment can be an effective strategy to improve software testing practices.
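The abstract refers to coverage and mutation scores as outcome metrics. For readers unfamiliar with the kind of TypeScript test participants were asked to write, the following is a minimal, purely illustrative Jest sketch; the function and test names are hypothetical and not taken from the experiment's tasks. Covering both the normal path and the error path raises branch coverage and helps kill mutants that weaken input validation.

```typescript
// Hypothetical function under test (illustrative only, not from the study).
export function priceAfterDiscount(price: number, discountPercent: number): number {
  if (price < 0 || discountPercent < 0 || discountPercent > 100) {
    throw new RangeError("invalid input");
  }
  return price * (1 - discountPercent / 100);
}

// Jest tests: exercising both the normal and the error branch improves
// branch coverage and detects mutants that remove the range check.
describe("priceAfterDiscount", () => {
  it("applies the discount to the price", () => {
    expect(priceAfterDiscount(200, 25)).toBeCloseTo(150);
  });

  it("rejects out-of-range discounts", () => {
    expect(() => priceAfterDiscount(100, 150)).toThrow(RangeError);
  });
});
```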
Fri 27 Jun. Displayed time zone: Amsterdam, Berlin, Bern, Rome, Stockholm, Vienna.
14:00 - 15:30 | Gamification, Specifications, and Code Reviews (Research Papers / Tool Demonstrations) at Cosmos 3C
Chair(s): Michael Pradel (University of Stuttgart)

14:00, 25m, Talk: NADA: Neural Acceptance-driven Approximate Specification Mining (Research Papers)
Weilin Luo (Sun Yat-sen University), Tingchen Han (Sun Yat-sen University), Junming Qiu (Sun Yat-sen University), Hai Wan (Sun Yat-sen University), Jianfeng Du (Guangdong University of Foreign Studies), Bo Peng (Sun Yat-sen University), Guohui Xiao (Southeast University), Yanan Liu (Sun Yat-sen University)

14:25, 25m, Talk: Gamifying Testing in IntelliJ: A Replicability Study (Research Papers)
Philipp Straubinger (University of Passau), Tommaso Fulcini (Politecnico di Torino), Giacomo Garaccione (Politecnico di Torino), Luca Ardito (Politecnico di Torino), Gordon Fraser (University of Passau)

14:50, 25m, Talk: DeCoMa: Detecting and Purifying Code Dataset Watermarks through Dual Channel Code Abstraction (Research Papers)
Yuan Xiao (Nanjing University), Yuchen Chen (Nanjing University), Shiqing Ma (University of Massachusetts at Amherst), Haocheng Huang (Soochow University), Chunrong Fang (Nanjing University), Yanwei Chen (Nanjing University), Weisong Sun (Nanyang Technological University), Yunfeng Zhu (Nanjing University), Xiaofang Zhang (Soochow University), Zhenyu Chen (Nanjing University)

15:15, 15m, Demonstration: Teaching Software Testing and Debugging with the Serious Game Sojourner under Sabotage (Tool Demonstrations)
Philipp Straubinger (University of Passau), Tim Greller (University of Passau), Gordon Fraser (University of Passau)
Cosmos 3C is the third room in the Cosmos 3 wing.
When facing the main Cosmos Hall, access to the Cosmos 3 wing is on the left, close to the stairs. The area is reached through a large door marked “3”, which will remain open during the event.