XRintTest: An Automated Framework for User Interaction Testing in Extended Reality Applications
Extended Reality (XR) technologies offer immersive user experiences across diverse application domains, but their spatial interaction paradigms present unique testing challenges. While existing approaches test XR applications by navigating scenes and triggering interactions, they fail to synthesise realistic spatial input from specialised XR devices, such as six-degrees-of-freedom (6DoF) controller gestures, which are essential for modern XR user experiences. To address this gap, we present XRintTest, an automated testing framework for Unity-based XR applications. XRintTest first constructs an XR User Interaction Graph that models interaction targets and the events required to exercise them. Leveraging this graph, it then automatically explores the XR scene under test and generates user interactions. We evaluated XRintTest on XRBench3D, a novel benchmark comprising seven XR scenes with 367 distinct 3D user interactions. XRintTest is highly effective: it achieves 97% coverage of trigger and grab interactions across all scenes, is 9x more effective and 5x more efficient than random exploration, and detects both runtime exceptions and functional defects. We have open-sourced our tool and dataset at https://github.com/ruizhengu/XRintTest and https://github.com/ruizhengu/XRBench3D, respectively. A video demo is available on YouTube at https://youtu.be/K0Q6waE47Us.