Handling Controller Input for Immersive Mixed Reality Headsets

Microsoft recently started shipping the new immersive Mixed Reality headsets. While they are built with the same spatial tracking technology as the HoloLens, they do not support hand gestures for input.

The current immersive mixed reality headsets are tethered to a PC, so developers have some flexibility with input devices; the typical experience uses an Xbox game controller or the announced Motion Controllers.

The motion controllers are accessible in Unity, with hardware and programming details on the Mixed Reality developer pages and easy-to-use components in the HoloToolkit. Unity 2017.2+ is required to develop for the immersive headsets, so for now you need to configure pre-release software until the final versions ship. There are also changes to the namespaces and classes for Virtual Reality and Universal Windows Platform (UWP) app development in Unity; with Virtual Reality, Augmented Reality, Mixed Reality and even Merged Reality the naming tends to get confusing, so it is all consolidated into the "XR" namespace in UnityEngine.
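For example, scripts written against the older HoloLens APIs mostly just need the using directive updated when moving to 2017.2 (a small sketch; the old namespace below is the Unity 5.x/2017.1 one):

// Unity 5.x / 2017.1
// using UnityEngine.VR.WSA.Input;

// Unity 2017.2+ (consolidated under the "XR" namespace)
using UnityEngine.XR.WSA.Input;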

Motion Controllers are still not generally available, but we can already write code for them using simulation in the Mixed Reality Portal: click the “For Developers” button and enable Headset Emulation.

(Image: headset and controller simulation in the Mixed Reality Portal)

The Simulation environment emulates the Headset and Left/Right Controllers and controller features such as the Thumb Stick, Touch Pad and Buttons.

To access the controller input, set up event handlers on the InteractionManager.

NOTE: Starting with Unity 2017.2.0b8 (Beta 8), the API changes from the InteractionManager.OnSourceXXX events to InteractionManager.InteractionSourceXXX.

using UnityEngine;
using UnityEngine.XR.WSA.Input;

public class InputHandling : MonoBehaviour
{
    public TextMesh StatusText;

    void Start()
    {
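        // Subscribe to the InteractionManager controller events; the handlers are defined in the snippets below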
        InteractionManager.OnSourceDetected += InteractionManager_OnSourceDetected;
        InteractionManager.OnSourceLost += InteractionManager_OnSourceLost;
        InteractionManager.OnSourcePressed += InteractionManager_OnSourcePressed;
        InteractionManager.OnSourceReleased += InteractionManager_OnSourceReleased;
        InteractionManager.OnSourceUpdated += InteractionManager_OnSourceUpdated;
    }
}
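If you pick up 2017.2.0b8 or later (see the note above), the same wiring uses the renamed events, and the event argument types are renamed to match; the following is a sketch based on that naming, so double-check the signatures against the docs for your build:

using UnityEngine;
using UnityEngine.XR.WSA.Input;

public class InputHandlingNewApi : MonoBehaviour
{
    void Start()
    {
        // Renamed events in Unity 2017.2.0b8+; same pattern, new member names
        InteractionManager.InteractionSourceDetected += OnSourceDetected;
        InteractionManager.InteractionSourceUpdated += OnSourceUpdated;
    }

    private void OnSourceDetected(InteractionSourceDetectedEventArgs args)
    {
        Debug.Log($"DETECTED: {args.state.source.kind} Id: {args.state.source.id}");
    }

    private void OnSourceUpdated(InteractionSourceUpdatedEventArgs args)
    {
        Debug.Log($"UPDATED: {args.state.source.kind} Id: {args.state.source.id}");
    }
}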

Controllers have properties and capabilities that can be determined when they are detected, so the specific entries can be checked during the update events. Note that I'm using C# interpolated strings instead of the classic String.Format(); this requires the new .NET 4.6 Scripting Runtime Version and API Compatibility Level.
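If you haven't used the syntax, the interpolated form is equivalent to the classic call:

string kind = "Controller";
int id = 1;
// Classic formatting
var classic = string.Format("Kind: {0} Id: {1}", kind, id);
// C# 6 interpolated string (enabled by the .NET 4.6 scripting runtime setting)
var interpolated = $"Kind: {kind} Id: {id}";

With that aside, here is the detection handler: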

private void InteractionManager_OnSourceDetected(SourceDetectedEventArgs obj)
{
	// TODO: Store capabilities/properties like TouchPad, Thumbstick, Buttons, Tracking etc
	var st = obj.state;
	var source = obj.state.source;

	DisplayText(
		$"DETECTED: {source.kind} Id: {source.id} HandType: {st.handType}\n" +
		$"ThumbStick?: {st.supportsThumbstick} TouchPad?: {st.supportsTouchpad}\n" +
		$"Grasp?: {st.supportsGrasp} Menu?: {st.supportsMenu} Pointing?: {st.supportsPointing}\n"
	);
}

private void InteractionManager_OnSourceUpdated(SourceUpdatedEventArgs obj)
{
	// TODO: Check capabilities before checking properties like TouchPad, Thumbstick, Buttons, Tracking etc
	var st = obj.state;
	var source = obj.state.source;

	var pt = obj.state.properties.location.pointer;
	Vector3 pos;
	Quaternion rot;
	pt.TryGetPosition(out pos);
	pt.TryGetRotation(out rot);

	DisplayText(
		$"UPDATED: {source.kind} Id: {source.id} HandType:{ st.handType}\n" +
		$"ThumbStick?: {st.supportsThumbstick} TouchPad?: {st.supportsTouchpad}\n" +
		$"Grasp?:{st.supportsGrasp} Menu?:{st.supportsMenu} Pointing:{st.supportsPointing}\n" +
		$"ThumbStickPos: {st.thumbstickPosition} TStickPressed?:{st.thumbstickPressed}\n" +
		$"TouchPadPos: {st.touchpadPosition} Touched?:{st.touchpadTouched} Pressed?:{st.touchpadPressed}\n" +
		$"Controller=Pos:{pos} Rot:{rot}"
	);
}
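Both handlers call DisplayText, which isn't shown above; my assumption is that it simply pushes the latest string into the StatusText TextMesh declared earlier, something like:

private void DisplayText(string text)
{
	// Assumed helper: show the latest controller status on the TextMesh field from the first snippet
	if (StatusText != null)
	{
		StatusText.text = text;
	}
}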

Controller input follows the simulation values in the Mixed Reality Portal. There are multiple keyboard, mouse and gamepad equivalents for simulating the controllers. Mouse over the controller UI to show the “Help” panel that displays the equivalent controls.

Another interesting tidbit: we now have handedness, meaning the controllers report whether they are left or right devices. This is an improvement over HoloLens hand tracking, where we had to track source instances ourselves because the API did not report which hand was detected.
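With the 2017.2 naming, handedness lives on the source itself, so a handler can branch on it; here is a sketch assuming the released InteractionSourceHandedness enum (the beta snippets above expose it as st.handType instead):

private void HandleSource(InteractionSourceState state)
{
	// Branch on which hand the controller reports (Unknown / Left / Right)
	if (state.source.handedness == InteractionSourceHandedness.Left)
	{
		Debug.Log("Left motion controller");
	}
	else if (state.source.handedness == InteractionSourceHandedness.Right)
	{
		Debug.Log("Right motion controller");
	}
}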

(Image: controller emulation in the Mixed Reality Portal)

There are multiple controller features to simulate including touch pad, thumb stick, grip, and controller position and rotation tracking — yes, that’s a bit of parity with Oculus and Vive Controllers.

(Image: touchpad emulation on the simulated controller)

This is a quick glimpse of the controller capabilities and the code needed to handle their input. It follows the concepts we learned from HoloLens input, extended a bit for the new motion controllers, plus the ability to test in the simulated environment while we wait for the hardware to arrive.