Direct and Distant interaction modes

VR applications using hands can operate in one of two distinct interaction modes:

Direct interaction


Included in Ultraleap XR plugins

Direct interaction works much like real life, and as such tends to be immediate and easy for new users to pick up. Users interact with objects and 3D components up close, grabbing items and pushing buttons with their hands and seeing those elements respond appropriately. Objects are picked up simply by grabbing them as you would in real life, buttons depress when pushed, and toggle switches can be flicked with a finger.

Rather than detecting and responding to distinct hand poses, however, direct interaction works best when it allows users to pick up, push and handle elements with a range of subtly different grips, in different positions and orientations. It’s all about the hand making contact with the element in a flexible manner that feels natural. Ultraleap research indicates that for a given interaction - like picking up an object - developers can always expect some natural variation in the way people grip and move it, because everyone’s body, reach, and mannerisms are slightly different. Direct interaction caters for that.
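One way to picture this contact-driven (rather than pose-driven) approach is to treat an object as grasped whenever enough fingertips touch it from roughly opposing sides, no matter which fingers are used. The sketch below is an illustrative simplification, not the algorithm used by Ultraleap's plugins; the `Contact` type and all thresholds are assumptions for the example.

```python
# Minimal sketch: grasp detection from fingertip contacts, not hand poses.
# `Contact`, `min_contacts` and `opposition_threshold` are illustrative
# assumptions, not part of any Ultraleap SDK.
class Contact:
    def __init__(self, position, normal):
        self.position = position  # fingertip contact point on the object
        self.normal = normal      # object surface normal at that point (unit vector)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def is_grasping(contacts, min_contacts=2, opposition_threshold=-0.3):
    """Grasped when at least `min_contacts` fingertips touch the object AND
    some pair of contacts press from roughly opposite sides - so any natural
    grip qualifies, regardless of which fingers form it."""
    if len(contacts) < min_contacts:
        return False
    for i in range(len(contacts)):
        for j in range(i + 1, len(contacts)):
            # Opposing surface normals => the object is squeezed between fingers.
            if dot(contacts[i].normal, contacts[j].normal) < opposition_threshold:
                return True
    return False

# Thumb and index pressing from opposite sides: a valid grip.
pinch = [Contact((0, 0, 0), (1, 0, 0)), Contact((0.05, 0, 0), (-1, 0, 0))]
# Two fingers brushing the same face: contact, but not a grip.
brush = [Contact((0, 0, 0), (0, 1, 0)), Contact((0.02, 0, 0), (0, 1, 0))]
print(is_grasping(pinch))  # True
print(is_grasping(brush))  # False
```

Because the check looks only at where the hand touches the object, subtly different grips, positions and orientations all register as the same interaction.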

To achieve that immediacy, objects and components must be designed realistically, with depth and physicality, and must respond as expected and with appropriate physics. More on designing objects.

Distant interaction

[Image: hands in virtual reality manipulating a glowing block]

Coming soon to Ultraleap XR plugins

Distant interaction - as the name suggests - allows users to pick up objects, hit buttons and use panels from a distance. This is something that we can’t do in real life, and as such there is a small learning curve for new users, requiring help and onboarding material such as the tutorial in Ultraleap’s Blocks demo.

Unlike direct interaction, distant interaction is more binary. With a ray cast from their open hand, users take aim at distant objects or 3D components, then form a pinch or grab gesture to trigger a selection or related action. The element responds with a clear state change and noticeable visual and audio cues. Because the hand registers simply as open or closed, the element reacts accordingly.
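The open/closed logic above can be sketched as a pinch-strength measure plus a small state machine. This is an engine-agnostic illustration under assumed names and thresholds (`pinch_strength`, `engage`, `release`, the distances in metres), not the implementation shipped in Ultraleap's plugins; hysteresis between the engage and release thresholds keeps the binary state from flickering when the pinch hovers near a single cutoff.

```python
import math

def pinch_strength(thumb_tip, index_tip, open_dist=0.08, closed_dist=0.02):
    """Map thumb-to-index distance (metres) onto a 0..1 pinch strength.
    The distances here are illustrative assumptions, not SDK values."""
    d = math.dist(thumb_tip, index_tip)
    t = (open_dist - d) / (open_dist - closed_dist)
    return max(0.0, min(1.0, t))

class DistantSelector:
    """Binary select/release with hysteresis: engage only on a firm pinch,
    release only once the hand has clearly opened again."""
    def __init__(self, engage=0.8, release=0.4):
        self.engage = engage
        self.release = release
        self.selected = None

    def update(self, ray_hit, strength):
        if self.selected is None and ray_hit is not None and strength >= self.engage:
            self.selected = ray_hit   # hand closed while aiming at something: select it
        elif self.selected is not None and strength <= self.release:
            self.selected = None      # hand opened: let go

sel = DistantSelector()
sel.update("cube", pinch_strength((0, 0, 0), (0.015, 0, 0)))  # fingers nearly touching
print(sel.selected)  # cube
sel.update("cube", pinch_strength((0, 0, 0), (0.09, 0, 0)))   # hand open again
print(sel.selected)  # None
```

A real implementation would also drive the visual and audio cues from these state transitions, so the user always sees which element their ray has selected.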

With distant interaction, users can interact with all components on a UI panel, pick up, move, drop and throw objects, as well as do things they can’t in real life, like summon distant objects towards them.


A key consideration for a VR application that uses hands is which interaction mode is appropriate. Direct interaction requires users to be up close to interactive elements, so it often requires a locomotion mechanism to move around. Distant interaction allows users to reach far more from a stationary position, but cannot offer the same flexibility and immediacy in object handling.


Want to learn more about implementing these features in the game engine of your choice? Check out the implementation guides:


Getting Started

Get started manipulating objects with your hands using our XR plugins for Unity and Unreal, where we provide the Interaction Engine and Physics Hands - two physics-based systems designed to make it easy to manipulate objects directly with your hands.

Learn more about these tools, and how to practically apply them, here.

Download XR development plugins