EXP Navigation
Usability experiment for «Almer Connect» Innobooster project.
The experiment is designed as a web application with a centred menu navigation. The navigation can be controlled equally well by voice commands, motion sensors, or the device's physical buttons. The user is free to choose an interaction method based on personal preference or contextual constraints: the methods can be freely mixed and matched, and all offer the same level of control.

A simple navigation serves as the use case for testing the different input methods and determining which mode users gravitate towards. Visual styling is kept to a minimum to focus attention on the interaction methods. For dynamic menu items, the corresponding voice command is displayed below the label; if no additional voice command is listed, the label itself can be spoken as the command.
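One way to make three input methods interchangeable is to funnel them all into the same set of abstract navigation actions, so the menu logic never knows which method produced an event. The sketch below illustrates this idea; the class and adapter names (`Menu`, `voiceToAction`, `buttonToAction`) and the specific voice vocabulary are assumptions for illustration, not the project's actual code.

```typescript
// All input methods emit the same abstract action, keeping the menu
// logic agnostic to whether voice, head motion, or a button produced it.
type NavigationAction = "next" | "previous" | "select";

class Menu {
  private index = 0;
  constructor(private items: string[]) {}

  // Single handler that every input adapter feeds into.
  handle(action: NavigationAction): void {
    if (action === "next") this.index = (this.index + 1) % this.items.length;
    else if (action === "previous")
      this.index = (this.index - 1 + this.items.length) % this.items.length;
    // "select" would trigger the focused item's behaviour here.
  }

  get current(): string {
    return this.items[this.index];
  }
}

// Hypothetical adapters: each maps a raw input event to a NavigationAction.
const voiceToAction = (command: string): NavigationAction | null => {
  const map: Record<string, NavigationAction> = {
    next: "next",
    back: "previous",
    ok: "select",
  };
  return map[command.toLowerCase()] ?? null;
};

const buttonToAction = (button: "up" | "down" | "confirm"): NavigationAction =>
  button === "down" ? "next" : button === "up" ? "previous" : "select";

// Mixing methods: a voice command and a button press drive the same menu.
const menu = new Menu(["Settings", "Calls", "Help"]);
menu.handle(voiceToAction("next")!); // voice: "next"
menu.handle(buttonToAction("down")); // physical button
console.log(menu.current); // "Help"
```

Because adapters only translate events into actions, adding or removing an input method never touches the navigation logic itself.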
Head movement is mapped directly to the current menu selection, without displaying a pointer object; this avoids nervous cursor jitter caused by inaccurate sensor data. When navigating by head movement, the user confirms a selection either by keeping the item focused for a predefined dwell time or by pressing a physical button.
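The dwell-based confirmation described above can be sketched as a small state machine: focusing the same item starts a timer, changing items resets it, and holding focus past a threshold confirms. The class name, the 1500 ms default, and the `press()` helper are illustrative assumptions, not values from the experiment.

```typescript
// Dwell-based confirmation sketch: focusing one item for `dwellMs`
// confirms it; looking at a different item restarts the timer.
class DwellSelector {
  private focused: number | null = null;
  private focusStart = 0;

  constructor(private dwellMs = 1500) {} // assumed threshold

  // Called whenever head orientation maps to a menu item index.
  // Returns the confirmed index, or null if no confirmation yet.
  focus(item: number, now: number): number | null {
    if (item !== this.focused) {
      this.focused = item; // new item: restart the dwell timer
      this.focusStart = now;
      return null;
    }
    return now - this.focusStart >= this.dwellMs ? item : null;
  }

  // Physical button: confirm the currently focused item immediately.
  press(): number | null {
    return this.focused;
  }
}

const sel = new DwellSelector(1500);
sel.focus(2, 0);    // user looks at item 2 → timer starts, returns null
sel.focus(2, 800);  // still short of the threshold → null
sel.focus(2, 1600); // held ≥ 1500 ms → returns 2 (confirmed)
```

Feeding timestamps in explicitly (rather than reading a clock inside the class) keeps the dwell logic deterministic and easy to test against recorded sensor traces.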