This post follows: Part 5: Applying Scripts to the Scene
Virtual and Mixed Reality systems are typically headgear-based, and the default behavior is for the system to take control of the scene camera’s rotation and position — following the user’s gaze. The usual keyboard and mouse input can be more challenging with a headgear on, so we have to consider alternative options.
Let’s look at the technologies available today:
With the Oculus DK2 you can survive with a keyboard and mouse since you’re tethered to your computer: whether you’re sitting or standing for your VR experience, you can peek below the headgear to use the keyboard or mouse, but that’s not the primary experience you want to target. Since you’ll be in a PC setup, you also have the option of connecting a supported gamepad or console controller. The upcoming Oculus Touch points at a handheld controller approach to supplement the head tracking.
The Oculus Gear VR is a different experience: you’re not tethered, and it provides a touch pad on the side of the headgear. The sample apps rely on head tracking and the touch pad, which works well enough, but after playing several minutes of a shooter game you can almost imagine how X-Men’s Scott Summers (Cyclops) must have felt every time he had to tap his visor to fight. I’m just glad I can pair a Bluetooth gamepad so I don’t have to keep tapping the side of my visor to fire weapons.
Microsoft HoloLens makes input a bit more interesting since it supports hand gestures and voice in a self-contained system. The case study videos also present workstation-integrated input: since you can see through the headgear, you can still use existing peripherals such as the keyboard, mouse, pen, and other devices on the workstation.
Besides headgear-based solutions, we have devices such as Project Tango that sense and track the device’s position in the environment to give a Virtual or Augmented Reality feel without headgear blocking our vision. This, however, limits us to the device’s touch screen and the device camera’s view (similar to gaze with a headgear), since you’re holding the Tango unit to aim at items in the running app and tapping the control points.
The common input experience across these devices is the use of “gaze,” which can be implemented through head tracking, device position tracking, or even scene camera tracking with a game controller. For our sample project, I’ll set up a gaze-based selector coupled with the Xbox One game controller.
Xbox One Controller
The Xbox One game controller is directly supported by the Windows environment, especially when plugged in via USB cable, and is recognized by Unity as a multi-axis, multi-button joystick. My workstation is on Windows 10, so I’m also able to use the controller through the Xbox Wireless Adapter for Windows.
(Android and iOS platforms have game controller options these days that are usually accessible as joysticks, but the axis and button mappings may vary.)
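As a minimal sketch of polling that joystick in Unity: the axis names (“Horizontal”, “Vertical”) are the Input Manager defaults, and the button mapping (JoystickButton0 as the A button on Windows) is an assumption that depends on your Input Manager settings and platform.

```csharp
using UnityEngine;

// Sketch: reading the Xbox One controller through Unity's Input Manager.
// Axis names and the button index are assumptions — verify them against
// your project's Input Manager settings.
public class ControllerProbe : MonoBehaviour
{
    void Update()
    {
        float moveX = Input.GetAxis("Horizontal"); // left stick, X axis
        float moveY = Input.GetAxis("Vertical");   // left stick, Y axis

        // JoystickButton0 is typically the A button on Windows
        if (Input.GetKeyDown(KeyCode.JoystickButton0))
        {
            Debug.Log("Fire!");
        }

        // Ignore small values near the stick's rest position
        if (Mathf.Abs(moveX) > 0.1f || Mathf.Abs(moveY) > 0.1f)
        {
            Debug.Log("Stick: " + moveX + ", " + moveY);
        }
    }
}
```

Attach this to any GameObject in the scene and watch the Console while moving the left stick and pressing A.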
Setup for Gaze and Controller Input
The usual handling for gaze-based input in VR, AR, and Mixed Reality is to take direct control of the scene camera so it tracks head movement for position and rotation/orientation. To retain some level of control, we need to do the following:
- Create a “CameraRig” parent for the scene camera. This gives the developer and user the ability to move and re-orient the camera, since the transforms of the camera itself are controlled by the device
- Create a 3D cursor equivalent for the gaze input (a device SDK may provide a cursor implementation). We are operating in 3D space, and in a headgear setup we usually cannot simply implement a 2D overlay as it would not “feel” right for the viewer; ideally we need a 3D cursor object set at a comfortable distance along the user’s gaze
- Perform a raycast along the user’s gaze to the interactable objects in the scene, similar to the process with mouse or touch input
- Use the joystick axes to reposition objects (or even the camera) and the buttons for actions
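The gaze steps above can be sketched as a single script. This assumes a “CameraRig” parent already wraps the device-driven camera and that a small 3D object (a quad or sphere) is assigned as the gaze cursor; the names and the default distance are illustrative, not taken from any device SDK.

```csharp
using UnityEngine;

// Sketch of a gaze selector: raycast along the camera's forward vector and
// place a 3D cursor either on the hit object or at a comfortable default
// distance. Field names and values are assumptions for illustration.
public class GazeSelector : MonoBehaviour
{
    public Camera sceneCamera;           // device-driven camera under the CameraRig
    public Transform gazeCursor;         // 3D cursor object placed along the gaze
    public float defaultDistance = 3.0f; // comfortable cursor distance in meters

    void Update()
    {
        Ray gaze = new Ray(sceneCamera.transform.position,
                           sceneCamera.transform.forward);
        RaycastHit hit;

        if (Physics.Raycast(gaze, out hit))
        {
            // Snap the cursor onto the interactable object we are looking at,
            // oriented to the surface we hit
            gazeCursor.position = hit.point;
            gazeCursor.rotation = Quaternion.LookRotation(hit.normal);
        }
        else
        {
            // Nothing hit: keep the cursor at a fixed distance along the gaze
            gazeCursor.position = gaze.GetPoint(defaultDistance);
            gazeCursor.rotation = sceneCamera.transform.rotation;
        }
    }
}
```

The raycast only sees objects with colliders, so interactable items in the scene need a Collider component for this approach to select them.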
I’ll discuss how to implement the input system and use it to interact with the sample Mars Base project in the following posts.
For questions, comments, or contact — follow me on Twitter @rlozada