
Integration Tips and Resources

Table of contents
  1. Testing for Optimal Tracking
  2. Mounting and Embedding For VR/AR
  3. Other Embedded Solutions
  4. Image API
  5. Multiple Devices
  6. Licensing
  7. Useful Links and Resources

Testing for Optimal Tracking

The VR Visualizer is a powerful diagnostic tool (available from the taskbar menu) that will familiarize you with how our hand tracking technology sees the world. You can flip between desktop and head-mounted modes, as well as see the data frame rate from the device. If the data frame rate is significantly below the values indicated in System Requirements, the hand tracking will appear degraded. This happens occasionally due to USB bandwidth limitations and can usually be remedied by switching to a different USB host controller.
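If you prefer to check the data frame rate programmatically rather than through the Visualizer, a minimal LeapC sketch along the following lines reads it from each tracking event. This assumes the LeapC SDK headers and library are installed and linked; error handling is trimmed for brevity.

```c
/* Minimal LeapC sketch: poll the tracking service and print the data frame rate. */
#include <stdio.h>
#include "LeapC.h"

int main(void) {
    LEAP_CONNECTION connection;
    if (LeapCreateConnection(NULL, &connection) != eLeapRS_Success ||
        LeapOpenConnection(connection) != eLeapRS_Success) {
        fprintf(stderr, "Could not open a connection to the tracking service\n");
        return 1;
    }

    for (;;) {
        LEAP_CONNECTION_MESSAGE msg;
        /* Wait up to 1000 ms for the next event from the service. */
        if (LeapPollConnection(connection, 1000, &msg) != eLeapRS_Success)
            continue;

        if (msg.type == eLeapEventType_Tracking) {
            const LEAP_TRACKING_EVENT *frame = msg.tracking_event;
            /* frame->framerate is the device data rate reported by the service;
               compare it against the figure given in System Requirements. */
            printf("frame %lld: %u hand(s), data rate %.1f fps\n",
                   (long long)frame->tracking_frame_id,
                   frame->nHands,
                   frame->framerate);
        }
    }
}
```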


Mounting and Embedding For VR/AR

The Leap Motion Controller can be easily mounted onto most VR/AR headsets using the VR Developer Mount. If you’re using the Stereo IR 170 Camera Module we have a 3D printable mount available.

Headset mount for Leap Motion Controller or Ultraleap Stereo IR 170

Depending on your headset and application, you may wish to adjust the virtual offset between the device and the user’s eyes within the game engine, for example in a custom experience or headset integration where the virtual hands need to align with the user’s real hands as closely as possible. See our Unity or Unreal documentation for details on adjusting these values.
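Conceptually, the adjustment is a fixed rigid transform from the device origin to the eye/head origin, applied to tracking data before rendering. The sketch below illustrates the idea with a simple translation offset applied to a palm position; the offset values are placeholders, and in practice you set this offset on the tracking provider in Unity or Unreal rather than in your own code.

```c
/* Illustrative only: applying a fixed device-to-eye offset to a tracked palm
   position. The numbers below are placeholders, not recommendations. */
#include <stdio.h>

typedef struct { float x, y, z; } Vec3;

/* Hypothetical offset of the device relative to the eye centre, in millimetres,
   e.g. mounted slightly above and in front of the eyes on an HMD faceplate. */
static const Vec3 kDeviceToEyeOffsetMm = { 0.0f, 20.0f, -80.0f };

static Vec3 device_to_eye_space(Vec3 p_device) {
    Vec3 p_eye = {
        p_device.x + kDeviceToEyeOffsetMm.x,
        p_device.y + kDeviceToEyeOffsetMm.y,
        p_device.z + kDeviceToEyeOffsetMm.z,
    };
    return p_eye;
}

int main(void) {
    Vec3 palm_device = { 12.0f, 150.0f, -40.0f };   /* palm position from tracking */
    Vec3 palm_eye = device_to_eye_space(palm_device);
    printf("palm in eye space: (%.1f, %.1f, %.1f) mm\n",
           palm_eye.x, palm_eye.y, palm_eye.z);
    return 0;
}
```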


Other Embedded Solutions

Embedding the Stereo IR 170 Camera Module or Leap Motion Controller into a larger hardware installation can create a sense of mystery and magic by hiding the underlying technology. It also shields the device from dust, moisture, and accidental damage.


If the user’s hands need to be in an enclosed area, Ultraleap recommends lining the inside surfaces of the space with a dark material that absorbs infrared light. Duvetyne, a fabric used in the film industry for black backdrops, is one suitable option.

If you need to cover the device with glass or clear plastic, the panel should be placed directly on the device’s top window surface. Keep the glass or plastic very clean (smudges will degrade tracking quality), and use the VR Visualizer to check that the panel is not blurring or refracting the light.


Image API

You can access stereo infrared images using the Image API. This can be used for augmented reality video passthrough applications. Combined with retroreflective markers and computer vision techniques, the Image API also makes it possible to track the position of physical objects as well as hands. An experimental software build, LeapUVC, is available to provide access to even more camera controls.
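As a rough illustration of the native side, a LeapC sketch along these lines requests the images policy and reads the dimensions of the stereo pair. Names follow the LeapC headers; the details differ in the Unity and Unreal bindings, so treat this as a sketch rather than a complete example.

```c
/* Minimal LeapC sketch: enable the images policy and read the stereo IR pair.
   Error handling is trimmed for brevity. */
#include <stdio.h>
#include "LeapC.h"

int main(void) {
    LEAP_CONNECTION connection;
    LeapCreateConnection(NULL, &connection);
    LeapOpenConnection(connection);

    int images_requested = 0;

    for (;;) {
        LEAP_CONNECTION_MESSAGE msg;
        if (LeapPollConnection(connection, 1000, &msg) != eLeapRS_Success)
            continue;

        /* Images are only streamed once the images policy has been requested. */
        if (!images_requested && msg.type == eLeapEventType_Connection) {
            LeapSetPolicyFlags(connection, eLeapPolicyFlag_Images, 0);
            images_requested = 1;
        }

        if (msg.type == eLeapEventType_Image) {
            const LEAP_IMAGE_EVENT *evt = msg.image_event;
            /* evt->image[0] and evt->image[1] are the left and right cameras. */
            for (int i = 0; i < 2; ++i) {
                const LEAP_IMAGE *img = &evt->image[i];
                printf("camera %d: %ux%u, %u bytes per pixel\n", i,
                       img->properties.width,
                       img->properties.height,
                       img->properties.bpp);
                /* Raw pixel data starts at (uint8_t*)img->data + img->offset. */
            }
        }
    }
}
```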


Multiple Devices

Whether you’re working on multiuser XR, art installations, location-based entertainment, or anything else using multiple devices, experimental support for multiple interactive spaces is available.

You can run more than one Leap Motion Controller on the same 64-bit Windows computer. To get started, make sure you have sufficient CPU power and enough USB bandwidth to support both devices running at full speed.

The package includes an experimental installer and Unity example code. The devices are not synchronized with each other, but their frames are timestamped, and the examples help you manually calibrate the devices’ relative offsets.
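On the native side, LeapC can enumerate the attached controllers and report each one’s serial number, which is one way to associate tracking data and calibration offsets with a specific device. The sketch below assumes the LeapC SDK and omits most error handling; the experimental multi-device builds add further configuration on top of this.

```c
/* Sketch: list the tracking devices visible to the service and print each
   serial number, so calibration offsets can be tied to a specific device. */
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>
#include "LeapC.h"

int main(void) {
    LEAP_CONNECTION connection;
    LeapCreateConnection(NULL, &connection);
    LeapOpenConnection(connection);

    /* Pump the message loop briefly so the service can report attached devices. */
    for (int i = 0; i < 10; ++i) {
        LEAP_CONNECTION_MESSAGE msg;
        LeapPollConnection(connection, 100, &msg);
    }

    uint32_t count = 0;
    LeapGetDeviceList(connection, NULL, &count);      /* query the device count */
    if (count == 0) {
        printf("no devices found\n");
        return 0;
    }

    LEAP_DEVICE_REF *refs = calloc(count, sizeof(*refs));
    LeapGetDeviceList(connection, refs, &count);      /* fill the device list */

    for (uint32_t i = 0; i < count; ++i) {
        LEAP_DEVICE device;
        if (LeapOpenDevice(refs[i], &device) != eLeapRS_Success)
            continue;

        char serial[64] = {0};
        LEAP_DEVICE_INFO info = { .size = sizeof(info) };
        info.serial = serial;
        info.serial_length = sizeof(serial);
        if (LeapGetDeviceInfo(device, &info) == eLeapRS_Success)
            printf("device %u: serial %s\n", i, info.serial);

        LeapCloseDevice(device);
    }
    free(refs);
    return 0;
}
```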

While there’s no out-of-the-box support for adjacent spaces (where a tracked hand retains the same ID when moving from one device to another) or overlapping spaces (where the same hand can be tracked from multiple angles), this experimental build puts these possibilities within reach. To get started, download the experimental installer and the multi-device Unity Modules, and create your project.


Licensing

Commercial and enterprise customers who are developing with our tracking technology require a license. Learn more at Ultraleap Licensing.