Virtual Objects

Try these examples in your own projects with Ultraleap’s Unity toolkit:

Unity Modules

Virtual Objects – Interaction Engine Example Scene


Well-designed virtual objects are (quite literally) the building blocks of engaging XR experiences. When combined with hand tracking they can be incredibly intuitive to use.

Users already have a lifetime of familiarity with physical objects, so when virtual objects appear and behave similarly to how they would in the physical world, immersion is maintained and little or no instruction is required.

This section covers design guidance for object appearance. For guidance on object behaviour please visit Manipulating Virtual Objects.

Users immediately comprehend direct analogies to real-world counterparts. With clear affordances and instructions they can also grow familiar with novel, coherent interaction patterns. Actions such as re-sizing objects with their hands or storing them in infinite, virtual pockets extend users’ capabilities beyond what is possible in the physical world.


Interaction Engine

Ultraleap’s Interaction Engine is a quick and robust way to implement flexible interactions based on contact between fingers and a virtual object.

Objects of a variety of shapes and textures can be grabbed or otherwise interacted with. The Interaction Engine powers tasks that would be impossible with a controller.
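
In Unity, a minimal sketch of this might look like the following. It assumes the Unity Modules’ InteractionBehaviour component and its grasp events (exact names can vary between versions); the object itself only needs a Rigidbody and a Collider alongside the component.

```csharp
using UnityEngine;
using Leap.Unity.Interaction;   // assumed namespace for the Interaction Engine

// Attach alongside an InteractionBehaviour, a Rigidbody, and a Collider.
// The OnGraspBegin/OnGraspEnd event names are assumptions and may differ
// between versions of the Unity Modules.
[RequireComponent(typeof(InteractionBehaviour))]
public class GrabHighlight : MonoBehaviour
{
    public Color grabbedColor = Color.cyan;

    private InteractionBehaviour _interaction;
    private Renderer _renderer;
    private Color _defaultColor;

    void Awake()
    {
        _interaction = GetComponent<InteractionBehaviour>();
        _renderer = GetComponent<Renderer>();
        _defaultColor = _renderer.material.color;

        // Tint the object while it is grasped so the user gets immediate
        // visual confirmation that the grab has registered.
        _interaction.OnGraspBegin += () => _renderer.material.color = grabbedColor;
        _interaction.OnGraspEnd   += () => _renderer.material.color = _defaultColor;
    }
}
```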


Affordances

How do users figure out what to do when faced with unfamiliar objects and environments? Affordances are the characteristics of an object that signal what actions are possible and how they should be done.

Physical objects convey information about their use. A doorknob affords turning; a chair affords sitting. Well-designed virtual objects provide clues about what can be done, what’s happening, and what is about to happen. Putting grooves in the surface of a virtual ball guides users where to place their fingers to pick it up; buttons can look like they need to be pushed; UI panel elements can look like they can be grabbed and moved.

Users should be able to perceive affordances and instantly understand how to use the items.

If an object’s physics are realistic and respond to any part of the hand, users do not have to learn a specific way of interacting with it.

For example, a large circular button should respond to a press from the palm, a finger, or the back of the user’s hand. It should react and move when pressed.

The buttons provided in Ultraleap’s Interaction Engine work as physically simulated mechanical components. They only need visual cues to indicate to the user that they are “pressable”.
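
As a rough illustration, feedback can be layered on top of the physics simulation rather than replacing it. The InteractionButton type and its OnPress/OnUnpress events below are assumptions about the Interaction Engine’s button component and may be named differently in your version:

```csharp
using UnityEngine;
using Leap.Unity.Interaction;   // assumed namespace for the Interaction Engine

// The button itself is physically simulated, so it reacts to a fingertip,
// a palm, or the back of the hand alike; this script only adds feedback.
[RequireComponent(typeof(InteractionButton), typeof(AudioSource))]
public class ButtonClick : MonoBehaviour
{
    public AudioClip clickClip;

    void Awake()
    {
        var button = GetComponent<InteractionButton>();
        var source = GetComponent<AudioSource>();

        // Audible "click" when the physics simulation reports a press.
        button.OnPress   += () => source.PlayOneShot(clickClip);
        button.OnUnpress += () => { /* reset any visual state here */ };
    }
}
```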

When a specific interaction is required, the right design will guide the user towards it, making clear how the object operates and how it should be interacted with.

Leap Motion Cat Explorer

The reactive handles in Cat Explorer respond to pinches and snap into the pinch point, guiding the user towards the correct action.

Provide clear signals about what parts of your object accept which type of interaction by making the most of natural associations with existing mechanisms and forms. This could be handles or switches, as well as XR patterns like near-hand reactivity, pinch-point snapping, etc.
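
A minimal sketch of pinch-point snapping is shown below. How the pinch position and pinch state are obtained depends on your hand-tracking provider; here they are passed in as plain values so the example stays self-contained.

```csharp
using UnityEngine;

// When a pinch is detected close to the handle, the handle eases toward
// the pinch point, confirming the interaction and guiding the user.
public class PinchSnapHandle : MonoBehaviour
{
    public float snapRadius = 0.08f;   // metres within which the handle reacts
    public float snapSpeed = 12f;      // how quickly the handle eases to the pinch point

    private Vector3 _restPosition;

    void Awake() => _restPosition = transform.position;

    // Call every frame with the current pinch state of the nearest hand.
    public void UpdatePinch(Vector3 pinchPosition, bool isPinching)
    {
        bool withinReach = Vector3.Distance(pinchPosition, _restPosition) < snapRadius;
        Vector3 target = (isPinching && withinReach) ? pinchPosition : _restPosition;

        transform.position = Vector3.Lerp(transform.position, target,
                                          Time.deltaTime * snapSpeed);
    }
}
```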

When affordances are clear and objects behave as expected, they feel natural and intuitive to users, and can require little or no instruction to use.


Case study: Designing a ball object to be picked up

While developing the Interaction Engine it became clear to us that everybody grabs objects differently – this is especially true in VR. Some people are very gentle and barely touch an object, while others will quickly close their whole hand into a fist inside the object.

In Ultraleap’s Weightless demo users can grab and throw small balls around the environment.

During development the team tried picking up smooth spheres and other shapes. Users had a hard time identifying how to hold these objects. Once they’d picked up an object with a grabbing interaction, many users closed their hands into fists. This made throwing more difficult as the object could become embedded in the hand’s colliders after release. It also led to accidental grabs.

Taking cues from the design of bowling balls and a baseball pitcher’s grip, the team added some indentations to the shapes and highlighted them with accent colours. In repeat testing, users tended to place their fingers in the indentations and keep the hand open, resulting in a smoother throw.

In this way, thinking like an industrial designer and adding subtle affordances to virtual objects can make a significant difference to user experience.

Graphic showing the correct way to interact with a red virtual ball in VR

Try this example yourself:

Try Weightless Demo


Object behaviour

Everything interactive should react to users

In XR, interactive objects should respond to any casual movement. Users don’t always know what to expect from the objects and environment around them. Dynamic feedback helps to build a mental model of how the virtual world works, and what each action achieves.

If your application does not include mid-air haptic feedback, it’s important to use other cues to show that interaction with a virtual object has taken place. Without consistent feedback users may struggle to know what can be interacted with, and how.

Building blocks in VR

Make sure interactive objects and elements respond to a user’s approach, and can be easily identified. You can do this by using a consistent visual style and consistent responses.

Virtual objects should also respond to being interacted with (picked up, activated, or otherwise) with clear visual and audio confirmations. This might include colour changes or small animations.
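
A simple sketch of this kind of approach-and-confirmation feedback is shown below. The hand position would be supplied by whatever hand-tracking rig you are using; the UpdateProximity and OnActivated hooks are hypothetical names, not part of any toolkit.

```csharp
using UnityEngine;

// The object brightens as a hand approaches and plays a confirmation
// sound when it is picked up or otherwise activated.
[RequireComponent(typeof(MeshRenderer), typeof(AudioSource))]
public class ApproachFeedback : MonoBehaviour
{
    public Color idleColor = Color.grey;
    public Color nearColor = Color.white;
    public float reactDistance = 0.25f;   // metres at which the object starts to respond
    public AudioClip activateClip;

    private Renderer _renderer;
    private AudioSource _audio;

    void Awake()
    {
        _renderer = GetComponent<Renderer>();
        _audio = GetComponent<AudioSource>();
    }

    // Call once per frame with the position of the nearest hand.
    public void UpdateProximity(Vector3 handPosition)
    {
        float closeness = 1f - Mathf.Clamp01(
            Vector3.Distance(handPosition, transform.position) / reactDistance);
        _renderer.material.color = Color.Lerp(idleColor, nearColor, closeness);
    }

    // Call when the object is grabbed, activated, or otherwise used.
    public void OnActivated()
    {
        if (activateClip != null) _audio.PlayOneShot(activateClip);
    }
}
```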

Interactive menu selection in XR

Example: Designing a button

  • Use shadow from the hand to indicate where the user’s hand is in relation to the button

  • Create a glow from the button that can be reflected on the hand to help the user understand the depth relationship

  • Ensure the button moves relative to the amount of pressure (z-press) from the user

  • Use sound to indicate when the button has been pressed (“click”)

  • Create specific and visually distinct button states (e.g. for hover, push down, selected); these cues are combined in the sketch below
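
The sketch below pulls several of these cues together. The press depth and hover flag would normally come from a physics-driven button (such as the Interaction Engine’s); they are plain parameters here so the example stays self-contained.

```csharp
using UnityEngine;

// Depth-driven "click", plus distinct idle/hover/pressed colours.
[RequireComponent(typeof(MeshRenderer), typeof(AudioSource))]
public class ButtonCues : MonoBehaviour
{
    public Color idleColor = Color.white;
    public Color hoverColor = Color.yellow;
    public Color pressedColor = Color.green;
    public float pressThreshold = 0.8f;   // fraction of full travel that counts as "pressed"
    public AudioClip clickClip;

    private Renderer _renderer;
    private AudioSource _audio;
    private bool _wasPressed;

    void Awake()
    {
        _renderer = GetComponent<Renderer>();
        _audio = GetComponent<AudioSource>();
    }

    // pressDepth: 0 = untouched, 1 = fully pressed; isHovered: a hand is nearby.
    public void UpdateCues(float pressDepth, bool isHovered)
    {
        bool isPressed = pressDepth >= pressThreshold;

        // Distinct visual states for idle, hover, and pressed.
        _renderer.material.color = isPressed ? pressedColor
                                 : isHovered ? hoverColor
                                 : idleColor;

        // Audible "click" on the frame the press threshold is crossed.
        if (isPressed && !_wasPressed && clickClip != null) _audio.PlayOneShot(clickClip);
        _wasPressed = isPressed;
    }
}
```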

Explore object behaviour with:

Blocks Demo

Cat Explorer Demo

