Creating Virtual and Mixed Reality Apps using Unity, Visual Studio and Azure

This is a multi-part guide to prepare developers for the upcoming consumer release of Virtual Reality and Mixed Reality systems. I won’t be discussing in detail the initial installation and basic setup of the tools since those are covered by other blogs and guides – I’ll be diving into how to put the pieces together.

Part 1: Going beyond the Virtual Block Sandbox

A while back, after my initial introduction to a multi-user operating system called Unix and discovering the games directory with nethack, I thought it would be much nicer with a multi-player option since we already had users on other terminals connected to the same server (beyond being such a useful tool for teaching the muscle memory needed to navigate in vi). Sometime later, with the Internet becoming more prevalent, we ended up with MUD, MUSE and their variations, eventually giving us the MMORPG. We are way beyond text graphics and even computer monitors these days, with tablets and head gear supporting network connectivity, but the appeal of being immersed in a different world or reality is still around – either in a self-contained personal instance or a shared experience with other users.

Software development has advanced significantly, and we now have tools that ease 3D, networking, database development and server hosting. We’re not limited to just exploring a dungeon anymore; we can pick any timeline or universe and even mix magic with science. The same technology can also serve real-world requirements, from showcasing products to providing a detailed, immersive 3D view of objects and places. In the succeeding posts, I’ll discuss how to build our own worlds and realities using readily available tools and systems.

Whether you’re building a game, product showcase or component simulation, the skill and tool requirements extend beyond traditional programming. Nothing is stopping a developer from coding everything needed, but development time and effort can be reduced by understanding the software and systems available today.

Tools of the Trade

  • Unity3D (or Unreal)
  • Visual Studio / Visual Studio Tools for Unity
  • Microsoft Azure
  • 2D/3D Graphics Software


I opted to use Unity3D early on as it was a full-featured game engine supporting deployments to Windows, Android and iOS. It was a popular choice in the indie game developer space and has since been used by mainstream studios. It is also supported by Oculus, HoloLens, Kinect and RealSense. Unity is a free download for the Personal edition and a cost-effective way to get started.

It took a bit of effort to adjust to Unity since it’s no longer just a support library to help with drawing 3D graphics but a full platform. As a programmer, you suddenly lose your typical main() entry point and control of the traditional game loop, and have to rely instead on the internally managed Update() invocation.
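To illustrate, here is a minimal sketch of a Unity script (the class name and rotation speed are arbitrary choices for this example):

```csharp
using UnityEngine;

// Unity instantiates this component and drives its lifecycle --
// there is no main(); Start() runs once and Update() runs every frame.
public class RotateModule : MonoBehaviour
{
    public float degreesPerSecond = 45f;

    void Start()
    {
        // One-time initialization, called before the first Update().
    }

    void Update()
    {
        // Called once per frame by the engine's internally managed loop.
        transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
    }
}
```

Attached to a GameObject, this spins it at a steady rate regardless of frame rate, since Time.deltaTime scales the per-frame rotation.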

These days there are alternative options – Unreal Engine also offers a free route for development supporting multiple target platforms. I will most likely do an Unreal Engine equivalent of this guide in the future.


In the meantime, I’ll be using Unity3D 5.3+ as the game engine for this guide.

Visual Studio

Unity supports JavaScript and C# for its scripting. I will be using C# scripts in Unity and Visual Studio 2015 as the script editor. Visual Studio Tools for Unity further improve the integration and add debug/trace support for code running in the Unity Editor.

Visual Studio Community edition is also a free download (based on usage type).

Microsoft Azure

Our end goal is to create a virtual world, so unless we’re taking the single-player route, we need a back-end server to coordinate the interaction between the multiple users in the shared world. Azure provides the front-end servers and data management solutions.

2D/3D Graphics Software

To create anything beyond graphic primitives such as cubes or spheres, we need to use 2D/3D graphics editing software. Unity lists several supported systems and formats, and you can choose from free options such as Blender, GIMP or even Paint.NET up to professional-grade tools from Adobe and Autodesk. The pro tools can be expensive, but some have started to offer Community and Indie developer pricing models to make them more accessible.

I use Autodesk Maya for my 3D graphics editing as I did a lot of prior 3D work using the Autodesk suite. I switch between Adobe Photoshop, Paint.NET and Autodesk Sketchbook for 2D, along with Autodesk Graphic for vectors and curves.

Defining a Reality

It’s a virtual world, so it can be any place, time or universe you can imagine – steampunk to cyberpunk, a clean utopian world, a disaster-ravaged city or simply a simulated room to showcase products. You even have a choice between a first-person immersive view, a third-person over-the-shoulder view or a top-down view.

For this project, I’ll define the following initial core functionality to get started:

  • Top-down / 3rd person view as the default, with a 1st person view option on VR headgear
  • Multiple concurrent users interacting in the same world
  • Server-hosted world accessible from different client apps, from tablets to headgear

We’re going with a top-down view so we can readily see the world and other users; we can use the same general approach to switch to the 1st person view we’ll set up later for a VR experience. I still prefer the 3rd person/top-down view, especially when the 3D objects are projected as holograms on a table top.

We’ll also use Unity Networking to manage multiple users in the same game scene, connected to a back-end host that handles user and execution states.
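As a taste of what that involves, here is a sketch of a networked player script using Unity Networking (UNET) as shipped with Unity 5.3; the class, field and rates are hypothetical placeholders, not the final project code:

```csharp
using UnityEngine;
using UnityEngine.Networking;

// Sketch of a UNET player object: input is processed only on the
// locally owned instance, while [SyncVar] state is replicated from
// the server to every connected client.
public class CrewMember : NetworkBehaviour
{
    [SyncVar] public int oxygenLevel = 100; // server-authoritative state

    void Update()
    {
        if (!isLocalPlayer)
            return; // only drive the player object we own

        float h = Input.GetAxis("Horizontal");
        float v = Input.GetAxis("Vertical");
        transform.Translate(h * Time.deltaTime, 0f, v * Time.deltaTime);
    }

    [Command] // invoked from the client, executed on the server
    void CmdConsumeOxygen(int amount)
    {
        oxygenLevel -= amount; // SyncVar change propagates to clients
    }
}
```

The pattern to note is the split of responsibilities: clients send intent via [Command] calls, and the server owns the authoritative state.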

If I had the significant resources required to build a multiple planet and star system MMO or a fantasy land of warring kingdoms with dragons, I would be blogging about the development progress – for now let’s scope down the reality to what we can cover over the course of this guide.

Project-1: Creating a Mars Base

Why a Mars Base? Using this scenario, we end up with a self-limiting reality: players are pretty much bound to the modules we set up in the base and the transportation we may later decide to include. We get to simplify our graphics requirements to basic structures for now, get something up and running, and focus on programming. The alternative would have been to create a product showroom or virtual store front – I’d rather we take an interplanetary trip to keep things a bit more interesting, and we get to think about simulating control systems.

Simple Base Design

Our main goal is to configure a shared reality across multiple users, viewable from a 1st and 3rd person perspective, so we’re starting with a simple design and graphics to work out the concept and mechanics before we spend effort on the look and feel of the app. We’re not yet creating realistic 3D models to submit to NASA or a space conglomerate – but we also don’t want to be limited to cubes in our environment. I’ll define the following structures as our core components.

  • Habitation Module
  • AWP (Air and Water Processing) Module
  • Greenhouse Module
  • Power Module with Solar Panels

Modules will be interconnected via access tubes and air-locks, enabling the crew (user/player) to move around the modules, and we’ll include the extra activity of going outside the modules to clean the solar panels or fix the turbines.

Let’s define the requirements of the base.

We have 4 core resource management items (Air, Water, Food and Power):

  • Crew Members need Air, Water and Food to survive (humans are so needy)
  • The AWP Module provides the Air and Water needed by the Crew and Greenhouse
  • The Greenhouse produces Food
  • Modules require Power to function
  • The Power Module produces Power using the Solar Panels
  • Solar Panels require regular maintenance

Base Operations

  • Cleaning Solar Panels to ensure Power Level
  • Harvest Greenhouse for Food
  • Regular Maintenance of AWP Module
  • Module Repairs
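The resource rules and operations above can be sketched as a simple tick-based model in plain C#; all the names, rates and quantities here are illustrative placeholders, not final game values:

```csharp
// Plain C# sketch of the base's resource loop (values are illustrative).
public class BaseResources
{
    public float Air = 100f, Water = 100f, Food = 100f, Power = 100f;
    public float SolarPanelEfficiency = 1f; // degrades without cleaning

    // Advance the simulation by one tick for a given crew size.
    public void Tick(int crewCount)
    {
        // Power Module produces Power, scaled by panel condition.
        Power += 10f * SolarPanelEfficiency;
        SolarPanelEfficiency *= 0.99f; // dust accumulates over time

        // AWP Module consumes Power to produce Air and Water.
        Power -= 5f; Air += 8f; Water += 6f;

        // Greenhouse consumes Power, Air and Water to produce Food.
        Power -= 3f; Air -= 2f; Water -= 2f; Food += 4f;

        // Crew members consume Air, Water and Food.
        Air -= crewCount * 2f;
        Water -= crewCount * 1f;
        Food -= crewCount * 1f;
    }

    // Base operation: cleaning the panels restores their efficiency.
    public void CleanSolarPanels() { SolarPanelEfficiency = 1f; }
}
```

Keeping this logic in a plain class (no UnityEngine dependency) also makes it easy to unit-test outside the engine.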

We begin with this small list of activities to demonstrate how the Base will function and the interaction needed with the user to keep it running. We now have a general concept of what we’re building so we’re off to creating the app.

The thousand miles and the first step

We have an end goal of a Mars Base with a defined set of activities and resource management requirements. How do we start? Let’s go through the initial steps needed to set up the rough Mars Base modules.

  • Creating the Unity Project
  • Creating the 3D Models of the Mars Base Modules
  • Exporting the 3D Models as FBX for Import into Unity


Creating the Unity Project

Instead of screenshots where I describe the steps to take, I include screen capture videos from my YouTube channel to show the process. Simply scrub through the videos for the part you need to follow; I usually annotate the key sections with text and capture at 1080p (1920×1080) 60 fps.

  • Start Unity and Click on the “New” button to create a new project. Make sure to select the “3D” option. Define the project folder in the “Location” and name the project “MarsBase01”. Note that Unity projects are multiple folders and files stored in the “Location” folder.
  • You will most likely want to put your project into a source control system. Unity files, however, are in binary format by default, which becomes a pain to manage in source control; fortunately, we have the option of converting files to text, especially the configuration and setting files.
  • This is a relatively small concept project, so you’re probably not too concerned with source control. Projects get complicated quickly, though, and you’re still better off using a source control solution; there are several options available. You get to at least define a “good state” before making big changes, or experiment a bit and revert as needed. I use Git for my source control needs, backed by either Visual Studio Team Services or GitHub.

Visual Studio supports Git in the IDE for convenience, or you can use a separate client like SourceTree.
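To make the repository manageable, switch Asset Serialization to “Force Text” in Edit > Project Settings > Editor, and ignore Unity’s generated folders. A typical minimal ignore file for a Unity 5.x project might look like this (adjust to your own setup):

```
# Typical minimal .gitignore for a Unity 5.x project
Library/
Temp/
Obj/
Build/

# Visual Studio files Unity regenerates on demand
*.csproj
*.sln
*.userprefs
```

The Library and Temp folders are rebuilt by Unity from the Assets and ProjectSettings folders, so only those two need to be versioned.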

“Hello World”

While we typically follow tradition and start with a new language, tool or system by getting our usual string output to work, in Unity we will be taking the “create a cube” route, as writing text is a bit more complicated than getting a cube to display.
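You can create a cube straight from the editor menu (GameObject > 3D Object > Cube), but the Unity flavor of “Hello World” can also be done in a script; this small sketch (the class name is arbitrary) spawns a built-in primitive at startup:

```csharp
using UnityEngine;

// The Unity equivalent of "Hello World": spawn a cube from script.
public class HelloCube : MonoBehaviour
{
    void Start()
    {
        // Create a built-in primitive and lift it so it sits on the ground plane.
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.position = new Vector3(0f, 0.5f, 0f);
    }
}
```

Attach it to any object in the scene (an empty GameObject works) and press Play to see the cube appear.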

We’ll have to use 3D modeling software to get finer control of our 3D objects/models, as Unity’s editor is limited to primitive shapes. Unity can also directly import the save files of certain applications like Blender and Maya, but that requires the application to be installed on the same computer as Unity; the more versatile approach is to export the 3D models as an FBX file, which can be imported and processed by Unity.

Video Notes

  • We see the difference in “units” handling between Autodesk Maya and Unity. The interpretation of unit scale in Unity is left to the programmer/user, but the default appears to be meters, based on pre-existing defaults like the gravity setting of “9.8” (even though real-world gravity doesn’t look “right” in games).
  • Autodesk Maya’s default unit setting is centimeters; it can be changed in the preference settings and export options. Setting the unit scale to meters allows us to match Unity’s unit scale.
  • I have Unity and Maya installed on the same workstation, so when I’m prototyping I save my Maya scene (.ma) directly into a folder in my Unity project for direct import, without going through the FBX export/import route. The scale only matches if I leave Maya in the “centimeter” per-unit setup.
  • Going through the FBX export route in Maya provides multiple scale settings; setting these to “meters” matches the Unity setup, but typically results in varying values for the imported model’s “Scale Factor”, “File Scale” or even the parent or object local scale values.
  • I find varying values to be a source of potential issues, so I prefer all the model scale settings to be 1. The video presents the process for creating and exporting an object to FBX that imports into Unity with scale settings of 1. The setup I use is to leave Maya in “centimeter” scale but export FBX in “meter” scale, which results in Scale Factor and File Scale set to 1; some modifications may be needed to change the local scale of the imported model object, especially for grouped objects, before creating the prefab.
  • I prefer to use the “Game Exporter” feature in Maya, which allows customized export settings and supports defining frame numbers for animations at export time instead of having to set the values in Unity after import.
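If you want to enforce a consistent scale factor rather than fix each import by hand, Unity’s editor scripting can help; this sketch (placed in an Editor folder; the class name is arbitrary) forces the scale factor on every imported model, assuming all your source files are authored at the same unit scale:

```csharp
using UnityEditor;

// Editor-side sketch: force a consistent scale factor on every
// imported model so we never chase mismatched scale values.
public class ModelScalePostprocessor : AssetPostprocessor
{
    void OnPreprocessModel()
    {
        // assetImporter is the importer for the model being processed.
        ModelImporter importer = (ModelImporter)assetImporter;
        importer.globalScale = 1f; // keep Scale Factor at 1
    }
}
```

This runs automatically whenever a model is imported or re-imported, so the project stays uniform without manual tweaks.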

For questions, comments or contact – follow me on Twitter @rlozada

In the next entry, I’ll discuss how we create the scripts needed for operating the base and work with code.

Next: Part 2: Writing Scripts for Unity