User Elicited Hand Gestures for VR-based Navigation of Architectural Designs
Virtual Reality (VR) systems have become an affordable mass-market technology, with ubiquitous links to 3D building design tools such as Building Information Modelling (BIM). This provides architects, designers and clients with immersive experiences of evolving building designs. VR systems typically require physical controllers to manage 3D navigation and to invoke other functions within the virtual environment. A range of mid-air, contactless gesture-detecting technologies exists, offering the potential for more intuitive interaction between a user and the 3D model they are experiencing. This research elicited a hand gesture set for 3D building navigation from interviews with architects and other design professionals. These interviews identified the required navigation actions and corresponding gestures for each action. The gestures proposed by the design professionals were evaluated against established HCI usability factors to determine an initial gesture set, which was then tested in a VR navigation task. An evaluation study examined the memorability, intuitiveness and comfort of the gestures. The data from this study suggest that the developed gesture set is effective.