OpenXR Hand Tracking in Unity

Pete Nancollis, Senior Software Engineer

In December 2020, Unity announced that support for OpenXR would be added to Unity 2020.2 as a preview feature. This really opens up the space for developers and consumers, as hand-tracked applications can now be made without being tied to a single platform.

It didn’t take us long to figure out how to get going with OpenXR hand tracking in Unity.

This blog shows you how to set it up and try it for yourself.

Block hands moving in VR

Getting Started

We found that it’s actually pretty easy to use OpenXR hand tracking in Unity.

First of all, there are some basic requirements to get started:

- Microsoft provide full instructions for installing their Mixed Reality Plugin. It’s a good idea to start there to make sure you’re using the most up-to-date version of their guide.
- Be sure to also add the included sample project. There’s no need to use the example scene, but the scripts that come with it are very useful for visualizing hands.
- If you don’t have your Ultraleap Hand Tracking Camera set up already, visit our setup guide to get started.

Setting Up Your Scene in Unity

Once everything’s installed, getting started is very straightforward.

Let’s walk through what’s involved.

Open Unity. Starting with a default new scene, select the Main Camera object and add a Tracked Pose Driver component. This will link the camera transform to the movement of your HMD.
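The step above can also be done from script. Here’s a minimal sketch, assuming the Tracked Pose Driver from Unity’s XR Legacy Input Helpers package (the `UnityEngine.SpatialTracking` namespace); the same settings can be chosen in the Inspector instead:

```csharp
// Sketch: configure the Main Camera to follow the HMD pose.
// Assumes the XR Legacy Input Helpers package is installed.
using UnityEngine;
using UnityEngine.SpatialTracking;

public class HmdCameraSetup : MonoBehaviour
{
    void Awake()
    {
        // Add a Tracked Pose Driver so the camera transform tracks the headset.
        var driver = gameObject.AddComponent<TrackedPoseDriver>();
        driver.SetPoseSource(TrackedPoseDriver.DeviceType.GenericXRDevice,
                             TrackedPoseDriver.TrackedPose.Center);
        driver.trackingType = TrackedPoseDriver.TrackingType.RotationAndPosition;
        driver.updateType = TrackedPoseDriver.UpdateType.UpdateAndBeforeRender;
    }
}
```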

Next, add an Eye Level Scene Origin component to the Main Camera as well. This comes as part of the Mixed Reality package. The OpenXR plugin assumes an eye-level reference space, but by default Unity uses a floor-level reference. This component ensures that your hands appear in front of the camera and not down at your feet.
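To see roughly what the Eye Level Scene Origin component is doing for you, here’s a hedged sketch using Unity’s built-in `UnityEngine.XR` APIs: it asks the XR input subsystem for a device (eye-level) tracking origin instead of the default floor-level one.

```csharp
// Sketch: request an eye-level (device) tracking origin.
// This illustrates the reference-space fix; the Mixed Reality
// component handles this for you.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class EyeLevelOrigin : MonoBehaviour
{
    void Start()
    {
        var subsystems = new List<XRInputSubsystem>();
        SubsystemManager.GetInstances(subsystems);
        foreach (var subsystem in subsystems)
        {
            // Device mode puts the origin at the headset's start pose (eye level),
            // so eye-level-referenced hand data lines up with the camera.
            subsystem.TrySetTrackingOriginMode(TrackingOriginModeFlags.Device);
        }
    }
}
```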

While you’ve got the camera selected, bring the Near Clipping Plane down to 0.01. Now your hands won’t disappear if they get close to your face.
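If you prefer to set this from code, the equivalent one-liner (using `Camera.main`, which finds the camera tagged "MainCamera") is:

```csharp
// Sketch: a 0.01 m near plane keeps hands visible right up close to the HMD.
using UnityEngine;

public class NearClipSetup : MonoBehaviour
{
    void Awake()
    {
        Camera.main.nearClipPlane = 0.01f;
    }
}
```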

We still need some way of seeing our hands, so now copy the OpenXRExtensionHandJointsManager.cs script from the Microsoft Mixed Reality samples for Unity to your project. Next, create an empty GameObject and add the Open XR Extension Hand Joints Manager component to it. This component reads the OpenXR hand data and places spheres at each of the joint locations.
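The core of what that component does can be sketched in a few lines. This is a simplified illustration, assuming the `Microsoft.MixedReality.OpenXR` plugin’s `HandTracker` API that the sample script is built on (the hypothetical class name `SimpleHandJoints` is ours), not a replacement for the sample:

```csharp
// Sketch: locate the left hand's joints each frame and move a sphere to each.
// Do the same with HandTracker.Right for the right hand.
using Microsoft.MixedReality.OpenXR;
using UnityEngine;

public class SimpleHandJoints : MonoBehaviour
{
    readonly HandJointLocation[] _joints = new HandJointLocation[HandTracker.JointCount];
    Transform[] _spheres;

    void Start()
    {
        // Create one small sphere per joint.
        _spheres = new Transform[HandTracker.JointCount];
        for (int i = 0; i < _spheres.Length; i++)
        {
            var sphere = GameObject.CreatePrimitive(PrimitiveType.Sphere).transform;
            sphere.localScale = Vector3.one * 0.01f;
            _spheres[i] = sphere;
        }
    }

    void Update()
    {
        // Query the joint poses for this frame; skip if the hand isn't tracked.
        if (HandTracker.Left.TryLocateHandJoints(FrameTime.OnUpdate, _joints))
        {
            for (int i = 0; i < _joints.Length; i++)
            {
                _spheres[i].SetPositionAndRotation(_joints[i].Pose.position,
                                                   _joints[i].Pose.rotation);
            }
        }
    }
}
```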

Also check out our experimental LeapProvider which allows our Unity Modules to be used with OpenXR hands. Source can be found at this pull request.

That’s all there is to it.

Press play, put on your HMD, and hold up your hands.

We can’t wait to see what you do with it!

Want to stay in the loop on the latest Ultraleap updates? Sign up to our newsletter here.

We welcome feedback on our products and services. Check out our support centre or contact us.
