Designing for Hand Tracking

When designing interactions for hand tracking, give careful consideration to priority, usability, comfort, and reliability: interactions are not universal across all applications.

Make interactions physical

Manipulating virtual objects with our hands leverages a lifetime of physical experience, minimising the learning curve for new users.

Space scene in VR with Ultraleap hand tracking

Ultraleap recommends direct physical manipulation of virtual objects wherever possible, rather than abstract gestures. Users can manipulate objects with a broad range of push, pinch, and grab interactions. Whichever fingers and orientations feel most natural to them will work. So, for example, users can grab and throw an object without needing to learn new behaviour.

Ultraleap’s Interaction Engine provides a quick and robust way to implement these flexible, physical interactions.

Leap Motion Cat Explorer - a VR cat

The most intuitive interactions are ones where the virtual objects feel physical, and convey their use clearly through affordances, rather than overlaid prompts and help content.

Because people already have a lifetime of experience manipulating physical objects in the real world, these interactions may require little or no instructional information to use in immersive 3D. This makes for a smoother, more intuitive virtual experience.

Explore interactions:

Try Blocks Demo

Try Cat Explorer Demo

Curved UI panels

Flat UI panels and components can be very useful in XR applications – users can access a lot of features quickly through a main menu, for example. But the 2D menus that suit touchscreens so well need adapting to a more physical, three-dimensional space.

Adding an overall concave curve to panels – particularly larger ones – helps ensure every control can be comfortably reached.
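As an illustration of the concave layout, element positions can be computed so that every control on the panel sits at the same distance from the user. This is a hypothetical sketch, not Ultraleap API code; the default radius and arc angle are assumptions chosen for this example:

```python
import math

def arc_positions(n_items, radius=0.6, arc_deg=60.0):
    """Place n_items UI elements on a concave arc centred on the user.

    Returns (x, z) positions in metres, with the user at the origin and
    the panel centre straight ahead on the +z axis. Every element sits
    exactly `radius` metres from the user, so all are equally reachable.
    """
    if n_items == 1:
        angles = [0.0]
    else:
        step = math.radians(arc_deg) / (n_items - 1)
        start = -math.radians(arc_deg) / 2
        angles = [start + i * step for i in range(n_items)]
    return [(radius * math.sin(a), radius * math.cos(a)) for a in angles]
```

With five elements over a 60° arc, the middle element lands straight ahead and the outermost ones are rotated 30° to either side, all at the same reach distance.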

Green and Blue curved UI panel in VR

We also recommend giving each panel an anchor object that users can grab to reposition it; users quickly learn this behaviour.

Use poses and gestures sparingly

Unlike natural physical interactions, abstract poses cannot leverage a lifetime of physical experience. They usually require instruction via tutorials or demonstrations. Reliably teaching users specific poses is a significant challenge, as each user will tend to perform them with slightly different hand shapes and movements.

If you do decide to include abstract poses in your application, follow these basic guidelines:

  • Use effective tutorials that can be accessed by the user at any time.

  • Make sure gestures and poses are highly distinctive from one another to avoid unintentional activations.

  • Minimise the number of gestures users must learn, and make their purpose consistent throughout.

  • Introduce new gestures gradually, one by one. Even once taught and mastered, most users will likely remember only a small number.
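One way to apply the "highly distinctive" guideline in code is to accept a pose only when the hand is both close to one template and clearly closer to it than to any other. The sketch below is a hypothetical classifier over per-finger curl values; the template names, thresholds, and curl representation are illustrative assumptions, not part of the Ultraleap SDK:

```python
def match_pose(curls, templates, max_dist=0.25, min_margin=0.15):
    """Match per-finger curl values (0 = straight, 1 = fully curled)
    against pose templates. Returns a template name only when the hand
    is close to one template AND clearly closer to it than to any
    other; the margin check avoids unintentional activations when two
    poses are similar."""
    scored = sorted(
        (sum((c - t) ** 2 for c, t in zip(curls, tmpl)) ** 0.5, name)
        for name, tmpl in templates.items()
    )
    best_dist, best_name = scored[0]
    if best_dist > max_dist:
        return None  # not close enough to any known pose
    if len(scored) > 1 and scored[1][0] - best_dist < min_margin:
        return None  # ambiguous between two poses: do not activate
    return best_name

# Hypothetical templates for thumb..pinky curl values.
TEMPLATES = {
    "open_palm": [0.0, 0.0, 0.0, 0.0, 0.0],
    "fist":      [1.0, 1.0, 1.0, 1.0, 1.0],
    "point":     [0.2, 0.0, 1.0, 1.0, 1.0],
}
```

A nearly flat hand matches "open_palm", while a half-curled hand matches nothing at all: refusing to guess is what keeps activations intentional.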

VR hand drawing a purple circle

Gestures and poses work best when used for actions that are valuable to the user. This makes them feel worth learning.

For example, an abstract gesture may be appropriate for enabling locomotion, or for bringing up a main menu UI panel.

Provide as much feedback as possible

Using hands is the most natural way to interact with an immersive 3D environment, but the real-world feedback inherent to pressing a button on a controller must be designed back in. Different types of feedback can guide interactions and increase user confidence.

VR finger pressing numbered buttons

Prioritise features when designing interactions

Defining interactions for an application requires careful consideration of the full feature set and how people will use each feature.

Physical interaction should always be used first where possible. Gestures and poses should be used sparingly for quick access to important or frequently used features.

VR finger selecting colour options in a UI panel

Other features can be accessed via other means. This could be a virtually wearable menu or UI panel, or contextual switching depending on mode or location. If your application contains lots of features, you will need elements like these in order to provide access to everything.

Make use of context

A powerful way to streamline user experience is to utilise context. You can refine user choices by considering where the user is looking, what elements are available to interact with, and what potential and likely actions might follow. This will focus users toward relevant interactions and remove unnecessary clutter.

An example of this is using context to change the content in a virtually wearable menu. In this example, the default menu offers a range of UI panels that can be pulled out into the world. Once users open a shopping interface, the menu changes to offer only options related to shopping, such as product colour customisation.
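The mode-driven menu described above can be sketched as a simple lookup from the current context to the options shown. The menu labels and mode names here are illustrative assumptions, not taken from any Ultraleap application:

```python
# Hypothetical menu definitions; the labels are illustrative only.
MENUS = {
    "default":  ["Browse panels", "Settings", "Help"],
    "shopping": ["Product colour", "Size", "Add to basket"],
}

class WearableMenu:
    """Minimal sketch of a context-driven wearable menu: the visible
    options depend entirely on the current mode."""

    def __init__(self):
        self.mode = "default"

    def set_mode(self, mode):
        if mode not in MENUS:
            raise ValueError(f"unknown mode: {mode}")
        self.mode = mode

    @property
    def options(self):
        # The options list is derived from context, never stored per-menu,
        # so switching modes automatically removes irrelevant clutter.
        return MENUS[self.mode]
```

Entering the shopping interface would call `set_mode("shopping")`, and the menu immediately offers only shopping-related options.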

VR finger selecting shopping options in a UI panel

Consider user comfort

At all times, consider user comfort when placing objects and UI elements. Design interactions in a way that minimises the need to:

  • Perform large and strenuous movements

  • Hold hand poses for extended periods of time

  • Raise the arm above the shoulder

  • Fully extend the arm

Graphic showing hand movements with VR hand tracking

Try to ensure that virtual objects and UI elements are positioned within easy reach and in clear view by default. When opening a UI panel users should never have to look around to find it, or stretch to reach it.

Ultraleap VR skeleton hand pressing a button in VR

Elements that users generally interact with by pushing with their fingers, such as UI panels, should be positioned around 60cm away by default. This should ensure that elements are comfortably in reach for all users.

Virtual objects to be grabbed or pinched should be positioned around 50cm away by default for comfortable reach and grip.

All distance recommendations are from Peebles, L. and Norris, B. (1998). Adultdata: the handbook of adult anthropometric and strength measurements: data for design safety (p. 404). London: Department of Trade and Industry.
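The default distances above can be captured as constants and applied along the user's forward direction when placing a new element. This is an illustrative sketch; the function name and vector convention are assumptions, not Ultraleap API:

```python
# Default comfortable-reach distances from the guidelines above.
PUSH_DISTANCE_M = 0.60   # UI panels pressed with the fingers
GRAB_DISTANCE_M = 0.50   # objects to be grabbed or pinched

def default_position(head_pos, forward, kind="push"):
    """Place a new element at the recommended default distance along
    the user's forward direction.

    head_pos and forward are (x, y, z) tuples in metres; forward should
    be normalised. kind is "push" for finger-pressed UI, "grab" for
    objects to be grabbed or pinched.
    """
    d = PUSH_DISTANCE_M if kind == "push" else GRAB_DISTANCE_M
    return tuple(h + d * f for h, f in zip(head_pos, forward))
```

For a user whose head is at eye height 1.6 m looking straight ahead, a push panel would spawn 60 cm in front of them and a grabbable object 50 cm in front.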

Use appropriate scaling

Virtual objects and UI components such as buttons and sliders should be sized appropriately – will users engage with one finger, a small pinch with two fingers, or the whole hand?

Ultraleap user interface in VR spinning a teapot

One-finger targets should be no smaller than 2cm. A UI component controlled with a pinch would require a slightly larger pinchable element.

Make sure there is enough space around each component for users to perform the expected interaction without accidentally triggering other targets.
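A layout check can enforce both rules at once: the 2cm minimum target size stated above, plus a clearance gap between neighbours. The 1cm default gap below is an illustrative assumption, not an Ultraleap figure, and the 1D (centre, width) layout is a simplification for the sketch:

```python
MIN_ONE_FINGER_TARGET_M = 0.02  # 2 cm minimum, per the guideline above

def validate_targets(targets, min_gap=0.01):
    """Check a row of one-finger targets, each given as (centre, width)
    in metres along one axis. Returns a list of problem descriptions;
    an empty list means the layout passes both checks."""
    problems = []
    for i, (centre, width) in enumerate(targets):
        if width < MIN_ONE_FINGER_TARGET_M:
            problems.append(f"target {i} is smaller than 2 cm")
    ordered = sorted(targets)
    for (c1, w1), (c2, w2) in zip(ordered, ordered[1:]):
        gap = (c2 - w2 / 2) - (c1 + w1 / 2)
        if gap < min_gap:
            problems.append(f"targets too close (gap {gap * 100:.1f} cm)")
    return problems
```

Two 3cm buttons with centres 6cm apart pass; a 1cm button fails the size check.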

The interaction zone

Field of view

The Leap Motion Controller provides a typical field of view of 140×120°, and the Stereo IR 170 provides 170×170°.

VR UI panel

This creates a large interaction zone with reliable hand tracking throughout.

Users can be surrounded by objects, controls, and UI panels within a full 170° (within comfortable reach, of course).
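When deciding where content can safely live, it helps to test whether a point in camera space falls inside the tracking field of view. The sketch below is a rough geometric check, not Ultraleap API code; the camera is assumed to sit at the origin looking along +z:

```python
import math

def in_fov(point, h_fov_deg=170.0, v_fov_deg=170.0):
    """Rough test of whether a point (x, y, z) in camera space falls
    inside the tracking field of view. The camera is at the origin
    looking along +z, with x horizontal and y vertical. The defaults
    match the Stereo IR 170's 170x170 degree field of view; pass
    140.0, 120.0 for the Leap Motion Controller."""
    x, y, z = point
    if z <= 0:
        return False  # behind the camera
    h = math.degrees(math.atan2(abs(x), z))
    v = math.degrees(math.atan2(abs(y), z))
    return h <= h_fov_deg / 2 and v <= v_fov_deg / 2
```

A point straight ahead passes easily, while a point far off to the side and very close to the camera falls outside even the wider FOV's half-angle.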

Preventing occlusion

Occlusion refers to instances where the Ultraleap camera does not have a clear view of a hand and momentarily cannot track it properly. Such moments are rare, but they can happen in certain circumstances.

When one hand is directly between the other hand and the camera, the nearer hand obscures the farther one from the camera’s view.

Grey and Black hand illustration crossing palms

To prevent this, avoid interactions or controls which might prompt users to cross their hands over one another.

At certain angles the arm and wrist can obscure some fingers from the camera’s view.

To prevent this, do not spawn new objects or panels in the dead centre of the field of view, or directly behind users’ hands. Instead, offset them slightly to one side of the user’s gaze. This will ensure the best possible tracking.
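Offsetting a spawn point to one side of the gaze direction is a small rotation of the forward vector. This sketch works in the horizontal plane only; the 15° offset and 60cm distance are illustrative assumptions, not Ultraleap recommendations beyond the reach guideline above:

```python
import math

def spawn_position(head_pos, forward, distance=0.6, offset_deg=15.0):
    """Compute a spawn point slightly to one side of the user's gaze,
    rather than dead centre, so a reaching arm is less likely to
    occlude the hand from the camera.

    head_pos and forward are (x, y, z) tuples; forward should be
    normalised. The forward vector is rotated about the vertical axis
    by offset_deg before being scaled out to the spawn distance."""
    a = math.radians(offset_deg)
    fx, fy, fz = forward
    # Rotate the forward vector about the vertical (y) axis.
    rx = fx * math.cos(a) + fz * math.sin(a)
    rz = -fx * math.sin(a) + fz * math.cos(a)
    hx, hy, hz = head_pos
    return (hx + distance * rx, hy + distance * fy, hz + distance * rz)
```

For a user looking straight ahead, the panel appears about 15cm to the side and still roughly 58cm in front, well within comfortable reach.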

Side and front perspectives of the Ultraleap camera

Some fingers may be obscured when the hand is directly in front of the Ultraleap Hand Tracking Camera.
