Interacting with Plugins in Unity and Visual Studio


Unity started out as a gaming platform, and with friendly licensing terms for new game developers it quickly gained popularity. New VR devices such as the Oculus Rift, Vive, Cardboard, and Daydream, and sensors like the Intel RealSense and Google Tango, provide a Unity SDK for an easy-to-use development environment that handles user input and displays 2D and 3D graphics.

Microsoft launched the Mixed Reality platform starting with the HoloLens, and Unity continues as one of the supported development systems, working nicely with Visual Studio. An additional consideration: while Unity typically uses Mono to support cross-platform builds, on the Microsoft Windows platform it can build and execute on .NET and IL2CPP (Intermediate Language To C++, for native recompilation) and target the Universal Windows Platform.

In recent projects, I needed to use hardware/GPU functionality and native code libraries while working in Unity, which handled the user interface and core graphics features. One example is hardware H.264 encoding and decoding of camera-view image frames of a graphics scene sent across the network.

Game Runtime

Unity has evolved from its purely gaming roots; however, its runtime is still tied to the traditional concept of a game loop that prioritizes graphics display updates and input response for a user experience of at least 30 frames per second (current systems push this requirement to 60-90 frames per second).

Unlike historical project templates that define an explicit game loop in the application's main entry point, each script attached to a Unity GameObject derives from MonoBehaviour and has event functions such as Update() invoked from the main execution thread for its processing time slice. Concurrency in the Unity scripting world is based on coroutines with yield-waits, closer to cooperative multitasking than to a separate process: a coroutine executes much like Update() and eventually yields control back to the main thread rather than operating as a thread instance. The combined execution time of code in the update event functions and coroutines must stay within about 33 milliseconds to achieve the 30 frames per second response time.
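As a minimal sketch of this cooperative model, a coroutine can spread long-running work across frames by yielding back to the main thread (the loop bounds and chunk size here are arbitrary illustrations, not from the sample project):

```csharp
using System.Collections;
using UnityEngine;

public class CooperativeWork : MonoBehaviour
{
    void Start()
    {
        // Runs on the main thread, interleaved with Update() calls.
        StartCoroutine(ProcessItems());
    }

    IEnumerator ProcessItems()
    {
        for (int i = 0; i < 100000; i++)
        {
            // ... per-item work ...

            // Yield every 1000 items so the ~33 ms frame budget is respected.
            if (i % 1000 == 0)
                yield return null; // resume on the next frame
        }
    }
}
```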

Extending Capabilities with Plugins

Plugins provide a path to utilize hardware resources and capabilities not available through Unity scripts. This ranges from low-level device access (graphics, IO modules, the GPU) to new language and platform features not restricted by Unity script compatibility requirements.

Focusing the discussion, the sample project UnityPluginHandling targets the Universal Windows Platform, also referred to as Windows Store in the Unity Build Platform Selector. We’ll be using C# as the scripting language, Visual Studio 2017 as the editor and compiler, .NET 4.6+ managed code features, and C++ and DirectX for native plugins and low-level graphics. UWP (Universal Apps) is the build route to publish on the Microsoft App Store and to target HoloLens and the Windows Mixed Reality platform.

Managed and Unmanaged Plugins

The Unity script runtime and development is managed code from the developer viewpoint. (IL2CPP brings the execution over to native space, but the source still came from managed code concepts; that’s a different discussion topic.)

Managed code plugins are comparatively easier to use than their unmanaged/native counterparts: all that is needed is for the Class Library DLL to be included in the Unity Editor project folder and configured for the appropriate target platform. The Unity Editor auto-configures the plugin properties as best it can if the DLL is deployed in the “/Plugins” folder, with sub-directories for target build platforms as listed in the documentation. The plugin methods are accessed the same way as other library dependencies; accessible classes are invoked or instantiated with a “using” directive bringing the plugin’s namespace into scope.

Unity C# scripting as of version 5.6.1f1 targets .NET Framework 3.5 compatibility in the editor, so while the scripts are eventually built into a UWP Visual Studio solution that can parse new language features and UWP dependencies, the Unity Editor will report a compiler error on them. Compiler directives are used to designate code sections that use platform-specific calls. Unity also provides an alternative approach using placeholders, but that’s a topic for another day.

using System;
using System.Collections;
using System.Collections.Generic;
using System.Runtime.InteropServices;
using UnityEngine;

using UniversalCSharp;
using UniversalWRCCSharp;
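As a sketch, the plugin namespaces above can be guarded with a compiler directive so the Unity Editor's .NET 3.5 compiler skips them. ENABLE_WINMD_SUPPORT is one symbol Unity defines for Windows Store builds; NETFX_CORE is another commonly used one — which applies depends on the Unity version and scripting backend:

```csharp
#if ENABLE_WINMD_SUPPORT
// UWP-only dependencies: compiled in the Windows Store build,
// skipped by the Unity Editor's .NET 3.5 compiler.
using UniversalCSharp;
using UniversalWRCCSharp;
#endif
```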

Unmanaged/native code plugins require extra steps and consideration, with a slight visit to historical documentation on interoperating between native and managed code. In the fantasy-world equivalent, this is higher magic requiring spells from ancient tomes.

The general requirements are posted in Unity Native Plugins, which covers the InterOp requirements: attributes for methods, parameter passing considerations, and object lifecycle.

Native Plugin methods that are called from Unity should have the following signature.

extern "C" __declspec(dllexport) [return type] [function(params)];


extern "C" __declspec(dllexport) void Add(int x, int y);

Unity provides some convenience macros and headers, located in:

C:\Program Files\Unity\Editor\Data\PluginAPI


To maintain the exported method names in Visual Studio, use a DEF file; otherwise the compiler may prepend an “_” underscore to the names.
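A minimal module-definition (.def) file for this project's exports might look like the following sketch (the library name and export list are assumptions based on the functions shown in this article):

```
LIBRARY UniversalWRCCpp
EXPORTS
    Add
    SetPluginMode
```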

To call the plugin methods from Unity, the library needs to be loaded using DllImport and the method signature declared as extern.
The convention is to define UnityPluginLoad and UnityPluginUnload to inform the plugin on startup and shutdown.

extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API UnityPluginLoad(IUnityInterfaces* unityInterfaces)
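On the Unity script side, the exported Add function shown earlier might be declared as follows. This is a sketch: the DLL name UniversalWRCCpp is assumed, and the plugin name is used without the .dll extension:

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

public class NativePluginCaller : MonoBehaviour
{
    // Maps to: extern "C" __declspec(dllexport) void Add(int x, int y);
    [DllImport("UniversalWRCCpp")]
    private static extern void Add(int x, int y);

    void Start()
    {
        Add(2, 3); // executes inside the native plugin
    }
}
```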


Managed Code Plugins

In the following code example, a managed C# UWP DLL executes a background task to perform processing outside the Unity script execution thread. On the running task, the processing can take its time and invoke a callback on the Unity script when completed.

using System;
using System.Threading.Tasks;

namespace UniversalCSharp
{
    public class PluginTask
    {
        public Action<int, string> OnCallbackEvent;
        private string message;
        private int delay;

        private int counter = 0;
        private Task backgroundTask;

        public PluginTask(string msg, int d)
        {
            message = msg;
            delay = d;
        }

        public void StartThread()
        {
            backgroundTask = Task.Run(() =>
            {
                for (;;)
                {
                    OnCallbackEvent?.Invoke(counter, $"{message}-{DateTime.Now:T}");
                    // Advance the counter and pace the loop with the configured delay.
                    counter++;
                    Task.Delay(delay).Wait();
                }
            });
        }

        public string GetInfo()
        {
            return "UniversalCSharp::GetInfo()";
        }
    }
}
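A hedged sketch of consuming this plugin from a Unity script: the callback arrives on the background task's thread, so the handler only stores the value, and Update() applies it on the main thread (the consumer class and field names here are illustrative, not part of the plugin):

```csharp
using UnityEngine;
using UniversalCSharp;

public class PluginTaskConsumer : MonoBehaviour
{
    private PluginTask pluginTask;
    private volatile string latestMessage;

    void Start()
    {
        pluginTask = new PluginTask("hello", 1000);
        // Invoked on the background task's thread, not the Unity main thread.
        pluginTask.OnCallbackEvent = (count, msg) => latestMessage = msg;
        pluginTask.StartThread();
    }

    void Update()
    {
        // Safe to touch Unity objects here, on the main thread.
        if (latestMessage != null)
            Debug.Log(latestMessage);
    }
}
```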

Native / Unmanaged Code Plugins

The native code example performs graphics operations that modify pixel values on a Direct3D texture in the background. We typically cannot do this in Unity scripts, as it would block the Unity thread if the texture pixel update takes longer than the frame update time slice.

The code is based on the Unity low-level plugin example, updated to execute on a detached thread and to update the graphics based on the render trigger from the Unity script.

Reference Code from: UnityPluginHandling/blob/master/External/Plugins/UniversalWRCCpp/WrcCppClass.cpp

The plugin thread is started from Unity:

extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API SetPluginMode(int mode)
{
	PluginMode = mode;

	if (PluginMode == 2)
	{
		// Starts the display update thread and detaches it to run continuously
		backgroundThread = std::thread(ProcessTestFrameDataThread);
		backgroundThread.detach();
	}
}

The thread modifies the texture image buffer in the background; Unity invokes UpdateSubresource() to apply the image values to the displayed shared texture at the appropriate end-of-frame period of the game update.

void ProcessTestFrameDataThread()
{
	float scaleval = 127.0f;

	while (true)
	{
		if (!g_TextureHandle)
			continue; // no texture registered yet

		//int bufSize = g_TextureWidth * g_TextureHeight * pixelSize;
		const float t = g_Time * 4.0f;
		g_Time += 0.02f;

		isARGBFrameReady = false;

		byte* dst = argbDataBuf;
		for (int y = 0; y < g_TextureHeight; ++y)
		{
			byte* ptr = dst;
			for (int x = 0; x < g_TextureWidth; ++x)
			{
				int vv = int(scaleval +
					(scaleval *
					sinf(x / 7.0f *
					(500.0f / g_TextureWidth)
					+ t))) / 4;

				ptr[0] = vv;
				ptr[1] = vv;
				ptr[2] = vv;
				ptr[3] = 255;

				// To next pixel (our pixels are 4 bpp)
				ptr += 4;
			}
			// To next image row
			dst += pixelSize * g_TextureWidth;
		}

		isARGBFrameReady = true;
	}
}

Unity provides the core plumbing for developing apps; to do more, we need to consider going closer to the hardware and utilize features and functionality we normally would not be able to access from the Unity runtime: threading, the GPU, low-level networking, and even modern language features.