Getting Started with Immersive Mixed Reality Headset – Part 3: Development

Time to code for Windows Immersive Mixed Reality! The device setup is done, the Mixed Reality Portal works, and inside-out tracking rocks! We're no longer bound by the sensor placement requirements of the Oculus Rift or by setting up HTC Vive lighthouses on high poles. I'm also enjoying the HP Mixed Reality Headset's build quality; the headband feels secure and is cushioned nicely compared to other models I've used.

[Screenshot: Windows Insider Fast Ring OS update]

Updating to the Windows 10 Insider Build (1703-16251.0) took a while, so bring some patience, or run the dev workstation update over a high-speed Internet link onto an SSD OS drive; that's the price paid for being an early adopter.

As a matter of habit, I normally reboot my workstation with the headset disconnected. Now that we're on the latest build, reconnecting the headset kicks off the device driver installation, and then the Mixed Reality Portal starts. This time, I experience the OOBE (Out of Box Experience), which runs me through movement training using the Xbox Controller. It's a quick presentation of movement and selection before you're back in the virtual cliff house (shell).

After all the pieces are in place, including the corresponding Windows SDK Insider Preview (10.0.16232.1000 as of writing), it's time to build our “Hello, World” equivalent for Windows Mixed Reality.

The usual sample code displays a cube in 3D space; rendering the text “Hello, World” is comparatively more complicated in 3D apps. I usually have the cube rotate on an axis just to prove to myself that the code is still running.

Unity 2017 Configuration

The quickest way to set up the test app is through Unity 2017.2; right now that means using the latest beta version (2017.2.0b4) to work with the Mixed Reality Headset.

  1. Start Unity 2017.2 (Windows Mixed Reality Headset Support)
  2. Create a New Project and Save your current scene.
  3. Change the Build Settings and Select “Universal Windows Platform”
    • Target Device : Any
    • Build Type: D3D (Direct3D)
    • SDK: Latest Installed
    • Build and Run on: Local Machine (this will deploy to the headset through the Mixed Reality Portal)
      [Screenshot: Build Settings]
  4. Add the current scene into the build.
  5. On “Player Settings”, ensure the Scripting Backend is set to “.NET”; it may sometimes default to “IL2CPP”. Currently “.NET” is required for the application to deploy and run on the headset.
    [Screenshot: Player Settings – Scripting Backend]
  6. Enable Virtual Reality in the XR Settings Tab
    [Screenshot: Player Settings – XR Settings]
  7. Remember the project name; this is what will be listed on the application menu on the headset.
    [Screenshot: Player Settings – Publishing Settings]
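The Build Settings and Player Settings steps above can also be applied from an editor script. Here's a sketch against the Unity 2017.2 editor API; the menu path and class name are my own, and this must live in an `Editor` folder in the project:

```csharp
using UnityEditor;

// Editor-only sketch: the Build Settings / Player Settings steps above,
// expressed in code instead of the editor UI.
public static class MixedRealityBuildConfig
{
    [MenuItem("Tools/Configure For Mixed Reality")]
    public static void Configure()
    {
        // Build Settings: Universal Windows Platform, Target Device "Any",
        // Build Type "D3D" (the SDK dropdown defaults to "Latest installed")
        EditorUserBuildSettings.SwitchActiveBuildTarget(
            BuildTargetGroup.WSA, BuildTarget.WSAPlayer);
        EditorUserBuildSettings.wsaSubtarget = WSASubtarget.AnyDevice;
        EditorUserBuildSettings.wsaUWPBuildType = WSAUWPBuildType.D3D;

        // Player Settings: ".NET" scripting backend (not IL2CPP)
        // and Virtual Reality enabled under XR Settings
        PlayerSettings.SetScriptingBackend(
            BuildTargetGroup.WSA, ScriptingImplementation.WinRTDotNET);
        PlayerSettings.virtualRealitySupported = true;
    }
}
```

Running the menu item once puts the project in the same state as clicking through steps 3–6 manually.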

Unity Simple 3D Test Project Setup

[Screenshot: Unity 2017.2.0b4 Editor]

    1. Set the Main Camera to Position (0,0,0)
    2. Create a Cube, position it at (0,0,3)
    3. Create a Material “CubeMat” and change its color to one you like, so you don't have to stare at a bright white cube in the headset
    4. Create a simple rotate script for the cube
using UnityEngine;

public class RotateMe : MonoBehaviour
{
    // Rotation speed in degrees per second around the Y axis
    public float TurnRate = 90f;

    void Update()
    {
        // Scale by Time.deltaTime so the rotation is frame-rate independent
        transform.Rotate(Vector3.up * Time.deltaTime * TurnRate);
    }
}
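For reference, steps 1–4 above can also be wired up from a single bootstrap script attached to any object in an empty scene. This is a sketch; the class name and the tint color are arbitrary, and `RotateMe` is the script above:

```csharp
using UnityEngine;

// Sketch: programmatic equivalent of the manual scene setup steps above.
public class TestSceneBootstrap : MonoBehaviour
{
    void Start()
    {
        // Step 1: main camera at the origin (the user's head position)
        Camera.main.transform.position = Vector3.zero;

        // Step 2: a cube 3 meters in front of the camera
        var cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.position = new Vector3(0f, 0f, 3f);

        // Step 3: tint the cube so it isn't a glaring white block
        cube.GetComponent<Renderer>().material.color = new Color(0.2f, 0.5f, 0.8f);

        // Step 4: attach the rotation behaviour
        cube.AddComponent<RotateMe>();
    }
}
```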

Unity “Play” Mode

Mixed Reality Portal works with Unity “Play” Mode when the project is configured as a Universal Windows Platform Virtual Reality Application.

[Screenshot: Unity Play Mode with Mixed Reality Portal]

In the screenshot, the project is running in the Unity Editor while the output is displayed on the Mixed Reality Portal and the headset. Headset tracking is also fed back to Unity to control the Main Camera in the scene. This is similar to the HoloLens Remoting app, except everything runs on the same workstation with a directly connected headset.

Unity Play Mode support is a convenient workflow for testing 3D visuals, head tracking, and application input without a build-and-deploy cycle, as long as the project can execute in the Unity Editor.

Unity Build and Deploy the Application

There are two routes for deploying the application. You can follow the established HoloLens workflow: build the C# project from Unity, then open it in Visual Studio to build and deploy to the device. Personally, I follow this workflow when the project has many scripts that I'll be modifying over the course of development, because it's easier to edit the GameObject scripts in the generated project and perform a build/deploy from there.

The alternative quick route is “Build And Run” from the Unity Editor; there's even a keyboard shortcut (Ctrl-B). On the first build, you will be prompted to select a target folder, and then Unity, Visual Studio, and MSBuild will build the APPX and deploy/run it on the Mixed Reality Portal and headset.
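The “Build And Run” step can itself be scripted with `BuildPipeline`, which is handy for automation. A sketch, assuming a hypothetical scene path and output folder (this also belongs in an `Editor` folder):

```csharp
using UnityEditor;

// Sketch: scripted equivalent of the editor's "Build And Run" command.
public static class BuildAndDeploy
{
    [MenuItem("Tools/Build Mixed Reality App")]
    public static void Build()
    {
        BuildPipeline.BuildPlayer(
            new[] { "Assets/Scenes/Main.unity" }, // the scene added in Build Settings
            "Builds/UWP",                         // hypothetical target folder
            BuildTarget.WSAPlayer,                // Universal Windows Platform
            BuildOptions.AutoRunPlayer);          // deploy and run after building
    }
}
```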

In this Game DVR capture of the Mixed Reality Portal, I move around the shell to a target room using the “Teleport” command on the Xbox One Controller (left stick), open the app menu with the Xbox button, and then select the sample project app using gaze and the Xbox A button.

This 3-part blog series is a quick run-through from unboxing to app development and deployment, showcasing the features and capabilities of the Mixed Reality Headset with the Windows 10 Creators Update and Unity. Check out Windows Mixed-Reality Development on the Microsoft site for additional details.