Pete Nancollis, Senior Software Engineer
In December 2020 Unity announced that support for OpenXR would be added to Unity 2020.2 as a preview feature. This really opens up the space for developers and consumers, as hand-tracked applications can now be built without being tied to a single platform.
It didn’t take us long to figure out how to get going with OpenXR hand tracking in Unity. In fact, it’s surprisingly easy. This blog shows you how to set it up and try it for yourself.
First of all, there are some basic requirements to get started:

- An Ultraleap tracking module
- Our OpenXR API Layer
- An OpenXR runtime – depending on your platform and XR hardware, this could include:
Microsoft provide full instructions for installing their Mixed Reality Plugin. It’s a good idea to start there to make sure you’re using the most up-to-date version of their guide.
Be sure to also add the included sample project. There’s no need to use the example scene, but the scripts that come with it are very useful for visualizing hands.
If you don’t have your Ultraleap tracking module set up already, visit our Hand Tracking Overview page to get started.
Check out the video below – once everything’s installed getting started is very straightforward.
Let’s walk through what’s involved.
Open Unity. Starting with a default new scene, select the Main Camera object and add a TrackedPoseDriver component. This will link the camera transform to the movement of your HMD.
While you’ve got the camera selected, bring the Near Clipping Plane down to 0.01. Now your hands won’t disappear if they get close to your face.
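If you prefer to do this from code rather than in the Inspector, the two camera steps above can be sketched as a single component. This is our own illustrative script (the name `XRCameraSetup` is ours), assuming the Input System package and its `TrackedPoseDriver` are installed, as they are when using the OpenXR plugin:

```csharp
using UnityEngine;
using UnityEngine.InputSystem.XR; // TrackedPoseDriver (Input System variant)

// Attach to the Main Camera. Mirrors the manual steps: add a
// TrackedPoseDriver so the camera follows the HMD, and pull in the
// near clipping plane so hands stay visible close to your face.
[RequireComponent(typeof(Camera))]
public class XRCameraSetup : MonoBehaviour
{
    void Awake()
    {
        // Link the camera transform to the movement of the HMD.
        // Depending on your project setup you may still need to assign
        // the driver's position/rotation input actions in the Inspector.
        if (GetComponent<TrackedPoseDriver>() == null)
        {
            gameObject.AddComponent<TrackedPoseDriver>();
        }

        // 0.01 m near plane keeps hands from being clipped near the face.
        GetComponent<Camera>().nearClipPlane = 0.01f;
    }
}
```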
Next, create a new GameObject and add the OverrideEyeLevel script. This comes as part of the Mixed Reality package. The OpenXR plugin assumes the application uses an eye-level reference space, but by default Unity uses a floor-level reference. This component ensures that your hands appear in front of the camera and not down at your feet.
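Under the hood, this amounts to switching Unity's tracking origin mode. The sample's OverrideEyeLevel script is the supported route; the sketch below is only our own illustration of the underlying Unity XR API (the class name `EyeLevelOrigin` is hypothetical):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Illustrative only: request a device (eye-level) tracking origin
// instead of Unity's default floor-level reference.
public class EyeLevelOrigin : MonoBehaviour
{
    void Start()
    {
        var subsystems = new List<XRInputSubsystem>();
        SubsystemManager.GetInstances(subsystems);
        foreach (var subsystem in subsystems)
        {
            // Device mode tracks relative to the headset's starting pose,
            // which matches the eye-level reference the plugin expects.
            subsystem.TrySetTrackingOriginMode(TrackingOriginModeFlags.Device);
        }
    }
}
```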
We still need some way of seeing our hands, so create another GameObject and add the HandJointManager component. This component reads the OpenXR hand data and places cubes at each of the joint locations.
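To make it concrete, here is a minimal sketch of the kind of thing HandJointManager does, using the `HandTracker` API from Microsoft's Mixed Reality OpenXR package. The component name and cube handling are our own; check the sample scripts for the supported implementation:

```csharp
using Microsoft.MixedReality.OpenXR; // HandTracker, FrameTime, HandJointLocation
using UnityEngine;

// Each frame, query the OpenXR hand joints and move a small cube to
// each joint pose.
public class SimpleHandJoints : MonoBehaviour
{
    // The OpenXR hand tracking extension defines 26 joints per hand.
    private readonly HandJointLocation[] joints = new HandJointLocation[26];
    private Transform[] cubes;

    void Start()
    {
        cubes = new Transform[joints.Length];
        for (int i = 0; i < cubes.Length; i++)
        {
            cubes[i] = GameObject.CreatePrimitive(PrimitiveType.Cube).transform;
            cubes[i].localScale = Vector3.one * 0.01f;
        }
    }

    void Update()
    {
        // Query the right hand; use HandTracker.Left for the other hand.
        if (HandTracker.Right.TryLocateHandJoints(FrameTime.OnUpdate, joints))
        {
            for (int i = 0; i < joints.Length; i++)
            {
                cubes[i].SetPositionAndRotation(
                    joints[i].Pose.position, joints[i].Pose.rotation);
            }
        }
    }
}
```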
Also check out our experimental LeapProvider which allows our Unity Modules to be used with OpenXR hands. Source can be found at this pull request.
That’s all there is to it.
Press play, put on your HMD, and hold up your hands.
We can’t wait to see what you do with it!