Designing Menus

Virtually wearable menus

Try these menus in your own projects with Ultraleap’s Unity toolkit:

Unity Modules

Hand UI - Interaction Engine Example Scene

Virtually wearable menus are a convenient way to access a small range of features quickly. Particularly suited to applications where users will be moving around the environment a lot, they can be accessed easily at any time while still being hidden during other actions.

Menus typically appear by a user’s hand and contain selectable UI elements. When you anchor buttons to the side of the hand or wrist, users can find the menu more easily because they have a better sense of where it sits in space relative to their own hand.

Virtually wearable menus can feel cumbersome if over-populated with UI elements or if the panel grows too large. In cases where a large feature set needs to be accessed, consider using a world-anchored menu instead.

Menus often open new objects and panels in the environment. To avoid occlusion issues, try not to spawn these items in the dead centre of the field of view, or directly behind users’ hands. Instead, offset them slightly to one side of the user’s gaze. This will ensure the best possible tracking. Also make sure virtually wearable menus are oriented comfortably.
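The suggested offset can be expressed as a small rotation of the spawn direction away from the gaze ray. A minimal sketch in Python (the 15° yaw and 0.6m distance are illustrative values, not Ultraleap recommendations):

```python
import math

def spawn_position(head_pos, gaze_dir, distance=0.6, yaw_offset_deg=15.0):
    """Spawn a panel along the gaze ray, rotated slightly to one side
    so it avoids the dead centre of the field of view."""
    yaw = math.radians(yaw_offset_deg)
    x, y, z = gaze_dir  # unit vector
    # Rotate the gaze direction around the vertical (y) axis.
    rx = x * math.cos(yaw) + z * math.sin(yaw)
    rz = -x * math.sin(yaw) + z * math.cos(yaw)
    return (head_pos[0] + rx * distance,
            head_pos[1] + y * distance,
            head_pos[2] + rz * distance)
```

With a zero offset the panel lands dead-centre on the gaze ray; any positive yaw pushes it to the user's side while keeping the same distance.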

Opening menus

For many applications, using a twist gesture to open a menu with one hand works well: it is distinctive and easy to remember, and users are unlikely to perform it unintentionally when doing other things. This gesture will need to be introduced to new users with a short tutorial.
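One common way to recognise the twist is to threshold the angle between the palm normal and world up, and require the pose to be held briefly so the menu doesn't open during incidental hand movements. A hedged Python sketch (the 30° threshold and 0.3s hold time are illustrative, not toolkit values):

```python
import math

class TwistDetector:
    """Reports True when the palm faces up (palm normal within
    `angle_deg` of world up) for at least `hold_s` seconds."""

    def __init__(self, angle_deg=30.0, hold_s=0.3):
        self.cos_threshold = math.cos(math.radians(angle_deg))
        self.hold_s = hold_s
        self.held = 0.0

    def update(self, palm_normal, dt):
        # palm_normal is a unit vector; world up is (0, 1, 0), so the
        # cosine of the angle to up is simply the y component.
        facing_up = palm_normal[1] >= self.cos_threshold
        self.held = self.held + dt if facing_up else 0.0
        return self.held >= self.hold_s  # True -> show the menu
```

The hold timer is what keeps the gesture distinctive: brief palm-up moments during other actions reset it before the menu appears.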

VR hands using a wrist menu

Another option is to open a menu with a button anchored near the hand – buttons placed directly on the hand can cause occlusion issues. Pushing this button with the other hand should open the menu.

VR hands opening wrist menu with a button

For best performance, components should be placed such that one hand does not occlude the other during use. In the example above, the button is placed to the side of the hand. Buttons can also be placed directly on the wrist or arm, as in the example below.

VR hands menu anchored to arm

Selecting menu items (Push)

Users activate features with a simple push gesture, pressing UI buttons fixed to the near side of the hand.

VR hands selecting colours in a virtual UI

Try this example yourself:

Try Paint Demo

Dragging menu items (Pinch + Release)

Users pinch the menu items and move them away from the hand before releasing to activate features at a chosen spot. This can be used to open UI panels in the surrounding environment.

The menu items should appear as small objects that look like they can be picked up (see affordances).

Once an object is dragged out of the menu, make sure its place on the hand shows a clear state change, so users can see at a glance which item has been removed.
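This flow can be modelled as a small state machine per item: pinched out of the menu, dragged, then placed on release. A minimal sketch in Python (the state names, `ghosted` flag, and activation hook are illustrative, not part of the Ultraleap toolkit):

```python
# Illustrative states for a pinch-and-release menu item.
IN_MENU, DRAGGING, PLACED = "in_menu", "dragging", "placed"

class MenuItem:
    def __init__(self):
        self.state = IN_MENU
        self.ghosted = False   # dim the hand slot while the item is out
        self.position = None

    def on_pinch(self):
        if self.state == IN_MENU:
            self.state = DRAGGING
            self.ghosted = True    # clear state change on the hand menu

    def on_release(self, position):
        if self.state == DRAGGING:
            self.state = PLACED    # activate the feature at this spot
            self.position = position
```

Setting the `ghosted` flag at pinch time, rather than at release, is what gives the at-a-glance feedback described above while the item is still in transit.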

VR hands selecting colours in a virtual UI

Detachable panels

Consider allowing menus to be picked up and placed in the environment when users require them for extended periods – keeping the hand twisted for more than a few seconds may become uncomfortable. This can be useful if users need access to a specialist toolbar for an extended task.

Users pinch or grab to select a menu panel before moving it and releasing, placing it somewhere convenient within comfortable reach.

Once placed, the detachable menu behaves the same as a world-anchored menu.

VR hands selecting detachable menus in a virtual UI

Returning detached panels to the hand

If the menu is anchored in the environment after previously being attached to the user’s hand, allow it to be grabbed and placed back on the hand. Alternatively, the empty spot on the hand can be tapped to quickly reattach it.

VR hands moving detachable menus in a virtual UI

World-anchored menus

World-anchored menus are either:

  • Centred on anchor objects in the environment that show/hide the menu once selected.

  • Detached from a hand menu and positioned as a floating UI panel.

In both cases, world-anchored menus can be larger than hand menus, which allows them to display more controls. This makes them more suitable for tasks that require frequent access to a large selection of options and settings, such as a set of design tools or large colour palettes.

VR hands selecting world-anchored menus in a virtual UI

Menu panels should be positioned around 60cm away by default. This ensures users can reach them without having to fully outstretch their arm.

Selecting world-anchored menu items (Push)

Users activate features by simply pressing buttons placed within the borders of the menu.

Buttons should react clearly to being pushed with distinct visual states and mechanical movement where possible.

Consider adding an audible ‘click’ to indicate when the button has been pushed.
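The press-and-re-arm behaviour behind such a button can be sketched as hysteresis on the fingertip's push depth: fire once when pushed past a click depth, and only re-arm once the finger backs off. A minimal Python illustration (the depth values are assumptions, not toolkit defaults):

```python
class PushButton:
    """Button with mechanical travel: fires once when pressed past
    `click_depth` metres, re-arms when released above `release_depth`."""

    def __init__(self, click_depth=0.01, release_depth=0.005):
        self.click_depth = click_depth
        self.release_depth = release_depth
        self.armed = True

    def update(self, fingertip_depth):
        # fingertip_depth: how far (m) the fingertip has pushed the face in.
        if self.armed and fingertip_depth >= self.click_depth:
            self.armed = False
            return True   # play the 'click' sound and fire the action here
        if not self.armed and fingertip_depth <= self.release_depth:
            self.armed = True
        return False
```

The gap between the click and release depths prevents the button from chattering when the fingertip hovers right at the activation point.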

VR hands selecting world-anchored menus in a virtual UI

See more of this example:

Try Particles Demo

Got an HTC Vive or Oculus Rift?

Try Button Builder Demo

Moving world-anchored menus

An anchor object can indicate that it is interactive by responding to an approaching hand with animation or a visual state change. It may also help to use an icon, as shown in the Particles demo below, to indicate interactivity. Make sure the anchor object is the right size and shape to be grabbed – it should clearly look like something you could grab by hand in the physical world (see affordances).

Users pinch or grab the anchor object, move, and release to reposition the menu in the environment. The anchor object must be clearly visible and within comfortable reach in order to do this. World-anchored menus should default to no more than 70–80cm away from users.
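Keeping a repositioned menu within reach can be done by clamping the release point back along the line from the head. A sketch in Python, using the 80cm figure above as the cap (the function and its parameters are illustrative):

```python
import math

def clamp_to_reach(head_pos, release_pos, max_dist=0.8):
    """If the menu is released beyond comfortable reach, pull it back
    along the line from the head to the release point."""
    delta = [r - h for r, h in zip(release_pos, head_pos)]
    length = math.sqrt(sum(c * c for c in delta))
    if length <= max_dist:
        return tuple(release_pos)  # already within reach, keep as-is
    scale = max_dist / length
    return tuple(h + c * scale for h, c in zip(head_pos, delta))
```

Applying this on release (rather than continuously during the drag) keeps the grab interaction feeling direct while still guaranteeing the anchor stays grabbable afterwards.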

VR hands selecting world-anchored menus in a virtual UI

