
Designing menus

Table of contents
  1. Virtually wearable menus
  2. Selecting menu items (Push)
  3. Dragging menu items (Pinch + Release)
  4. Sub-menus
  5. Detachable panels
  6. Returning detached panels to the hand
  7. World-anchored menus
  8. Selecting world-anchored menu items (Push)
  9. Moving world-anchored menus
  10. Dedicated gesture

Virtually wearable menus

Try these menus in your own projects with Ultraleap’s Unity toolkit:

Unity modules

Hand UI – Interaction Engine Example Scene


Virtually wearable menus are a convenient way to access a small range of features quickly. Particularly suited to applications where users will be moving around the environment a lot, they can be accessed easily at any time while still being hidden during other actions.

Virtually wearable menus typically appear by a user’s hand and contain selectable UI elements. Anchoring buttons to the hand makes them easier to find and reach, because users have a strong sense of where things are in space relative to their own hand.

Using a twist gesture to activate the menu works well: it is distinctive and easy to remember, and users are unlikely to perform it unintentionally while doing other things. Introduce the gesture to new users with a short tutorial.
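
As a starting point, the twist activation can be approximated by checking whether the palm has turned towards the user’s head and holding that pose briefly before showing the menu. The Unity C# sketch below is a minimal example; the palm transform, facing threshold, and dwell time are illustrative assumptions rather than toolkit values.

    using UnityEngine;

    // Toggles a hand menu when the palm is twisted to face the user's head.
    // The palm transform is assumed to be driven by the hand-tracking rig,
    // with its up axis pointing out of the palm.
    public class TwistMenuActivator : MonoBehaviour
    {
        [SerializeField] Transform palm;               // tracked palm
        [SerializeField] GameObject menuRoot;          // wearable menu to show/hide
        [SerializeField] float facingThreshold = 0.7f; // how directly the palm must face the head
        [SerializeField] float dwellTime = 0.3f;       // brief hold avoids accidental triggers

        float heldFor;

        void Update()
        {
            Transform head = Camera.main.transform;
            Vector3 palmToHead = (head.position - palm.position).normalized;
            bool facingHead = Vector3.Dot(palm.up, palmToHead) > facingThreshold;

            if (facingHead)
            {
                heldFor += Time.deltaTime;
                if (heldFor >= dwellTime && !menuRoot.activeSelf)
                    menuRoot.SetActive(true);          // open once the pose has been held
            }
            else
            {
                heldFor = 0f;
                if (menuRoot.activeSelf)
                    menuRoot.SetActive(false);         // hide again during other actions
            }
        }
    }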

Virtually wearable menus can feel cumbersome if they are overpopulated with UI elements or if the panel is too large. In cases where a large feature set needs to be accessed, consider using a world-anchored menu instead.

Menus often open new objects and panels in the environment. To avoid occlusion issues, try not to spawn these items in the dead centre of the field of view, or directly behind users’ hands. Instead, offset them slightly to one side of the user’s gaze. This will ensure the best possible tracking. Also make sure virtually wearable menus are oriented comfortably towards the user.
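
One way to apply this placement advice is to spawn panels a fixed distance ahead of the head, offset laterally from the gaze line, and rotated to face the user. This Unity C# sketch assumes the panel prefab’s forward axis points away from the viewer; the distances are illustrative defaults only.

    using UnityEngine;

    // Spawns a panel slightly to one side of the user's gaze so it does not sit
    // dead-centre in the field of view or directly behind the hands.
    public class PanelSpawner : MonoBehaviour
    {
        [SerializeField] GameObject panelPrefab;
        [SerializeField] float distance = 0.6f;        // metres in front of the head
        [SerializeField] float lateralOffset = 0.25f;  // metres to one side of the gaze line

        public GameObject SpawnPanel()
        {
            Transform head = Camera.main.transform;

            // Offset to one side of the gaze direction rather than the dead centre.
            Vector3 position = head.position
                             + head.forward * distance
                             + head.right * lateralOffset;

            // Orient the panel to face the user comfortably
            // (prefab's forward axis assumed to point away from the viewer).
            Quaternion rotation = Quaternion.LookRotation(position - head.position, Vector3.up);

            return Instantiate(panelPrefab, position, rotation);
        }
    }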


Selecting menu items (Push)

Users activate features with a push gesture, pressing UI buttons that are anchored to the nearest side of their hand.
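
A hand-anchored button can be sketched as a component that follows an anchor point on the tracked hand and fires when a fingertip enters its trigger collider. The ‘Fingertip’ tag, the anchor transform, and the offset below are assumptions for illustration; the fingertip collider also needs a kinematic Rigidbody for Unity trigger events to fire.

    using UnityEngine;
    using UnityEngine.Events;

    // A hand-anchored push button: it follows an anchor point on the side of the
    // hand and fires when a fingertip enters its trigger collider.
    [RequireComponent(typeof(Collider))]
    public class HandMenuButton : MonoBehaviour
    {
        [SerializeField] Transform handAnchor;   // point on the tracked hand
        [SerializeField] Vector3 localOffset = new Vector3(0.06f, 0f, 0f);
        public UnityEvent onPressed;             // hook up the feature to activate

        void LateUpdate()
        {
            // Keep the button fixed relative to the hand so it moves with the user.
            transform.position = handAnchor.TransformPoint(localOffset);
            transform.rotation = handAnchor.rotation;
        }

        void OnTriggerEnter(Collider other)
        {
            if (other.CompareTag("Fingertip"))
                onPressed.Invoke();
        }
    }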

VR hands selecting colours in a virtual UI

Try this example yourself:

Try Paint Demo


Dragging menu items (Pinch + Release)

Users pinch the menu items and move them away from the hand before releasing to activate features at a chosen spot. This can be used to open UI panels in the surrounding environment.

The menu items should appear as small objects that look like they can be picked up (see affordances).

Once an object has been dragged out of the menu, make sure the item on the hand shows a clear state change, so users can see at a glance which item has been removed.
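
The pinch-and-release flow could be structured along these lines: while pinched the item follows the pinch point, and on release it opens its panel at the drop position and greys out the slot on the hand. How the pinch itself is detected is left to your hand-tracking setup; BeginDrag and EndDrag are example hooks, and the tint is an arbitrary choice of state change.

    using UnityEngine;

    // Drag-out menu item: follows the pinch point while pinched, spawns its
    // panel where it is released, and greys itself out on the hand afterwards.
    public class DraggableMenuItem : MonoBehaviour
    {
        [SerializeField] GameObject panelPrefab;
        [SerializeField] Renderer itemRenderer;
        [SerializeField] Color removedTint = Color.gray;

        Transform pinchPoint;
        bool removed;

        public void BeginDrag(Transform pinch)   // call when a pinch starts on this item
        {
            if (!removed) pinchPoint = pinch;
        }

        public void EndDrag()                    // call when the pinch is released
        {
            if (pinchPoint == null) return;

            // Open the panel where the user let go, facing them.
            Vector3 away = pinchPoint.position - Camera.main.transform.position;
            Instantiate(panelPrefab, pinchPoint.position, Quaternion.LookRotation(away, Vector3.up));

            // Clear state change on the hand item so the removal is obvious at a glance.
            itemRenderer.material.color = removedTint;
            removed = true;
            pinchPoint = null;
        }

        void Update()
        {
            // While pinched, the item follows the pinch point away from the hand.
            if (pinchPoint != null) transform.position = pinchPoint.position;
        }
    }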

VR hands selecting colours in a virtual UI

Try this example yourself:

Try Paint Demo


Sub-menus

Virtually wearable menus can have sub-menus that allow you to organise many features into different categories, and access more features when needed.

Consider frequency of use and importance: make sure the most important features are quickest to access. Beyond two or three levels, sub-menus become cumbersome to use; in that case, consider a world-anchored menu instead.

Users can push buttons to select categories and reveal further sub-menus.
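
A minimal way to wire this up is a category component that shows its own sub-menu and closes its siblings when its button is pressed. The sketch assumes the button’s pressed event (however it is detected) is connected to Select().

    using UnityEngine;

    // Category button behaviour: selecting it reveals its sub-menu and closes
    // the other categories at the same level.
    public class MenuCategory : MonoBehaviour
    {
        [SerializeField] GameObject subMenu;        // items for this category
        [SerializeField] MenuCategory[] siblings;   // other categories at this level

        public void Select()    // wire the category button's pressed event to this
        {
            foreach (MenuCategory other in siblings)
                other.Close();
            subMenu.SetActive(true);
        }

        public void Close()
        {
            subMenu.SetActive(false);
        }
    }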

VR hands selecting sub-menus in a virtual UI


Detachable panels

Consider allowing menus to be picked up and placed in the environment when users require them for extended periods – keeping the hand twisted for more than a few seconds may become uncomfortable. This can be useful if users need access to a specialist toolbar for an extended task.

Users pinch or grab a menu panel, move it, and release to place it somewhere convenient within comfortable reach.

Once placed, the detachable menu behaves the same as a world-anchored menu.
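
One possible structure for a detachable panel: grabbing it unparents it from the hand, it follows the grab point while held, and releasing leaves it anchored in the world where it was dropped. BeginGrab and EndGrab are example hooks for whatever grab or pinch detection you use.

    using UnityEngine;

    // Detachable menu panel: grabbing it unparents it from the hand, and
    // releasing it leaves it anchored in the world where it was dropped.
    public class DetachablePanel : MonoBehaviour
    {
        Transform grabPoint;

        public void BeginGrab(Transform grab)   // call when a pinch/grab starts
        {
            transform.SetParent(null, true);    // detach from the hand, keep world pose
            grabPoint = grab;
        }

        public void EndGrab()                   // call when the pinch/grab is released
        {
            grabPoint = null;                   // stays wherever it was released

            // Keep it readable once placed: rotate to face the user.
            Vector3 away = transform.position - Camera.main.transform.position;
            transform.rotation = Quaternion.LookRotation(away, Vector3.up);
        }

        void Update()
        {
            if (grabPoint != null)
            {
                transform.position = grabPoint.position;   // follow the hand while held
                transform.rotation = grabPoint.rotation;
            }
        }
    }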

VR hands selecting detachable menus in a virtual UI


Returning detached panels to the hand

If the menu is anchored in the environment after previously being attached to the user’s hand, allow it to be grabbed and placed back on the hand. Alternatively, users can tap the empty spot on the hand to quickly reattach it.
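
The tap-to-reattach behaviour could be handled by a small component on the empty slot: when a fingertip enters the slot’s trigger, the detached panel is re-parented to the hand and its local pose reset. The ‘Fingertip’ tag is illustrative, and the same slot could equally accept a grabbed panel released nearby.

    using UnityEngine;

    // Empty slot left on the hand after a panel is detached. Tapping the slot
    // with a fingertip snaps the detached panel back onto the hand.
    // The slot collider should be a trigger.
    [RequireComponent(typeof(Collider))]
    public class HandMenuSlot : MonoBehaviour
    {
        [SerializeField] Transform detachedPanel;   // the panel currently placed in the world

        void OnTriggerEnter(Collider other)
        {
            if (detachedPanel == null || !other.CompareTag("Fingertip")) return;

            // Re-parent the panel to the slot and reset its local pose so it
            // sits back on the hand exactly where it started.
            detachedPanel.SetParent(transform, false);
            detachedPanel.localPosition = Vector3.zero;
            detachedPanel.localRotation = Quaternion.identity;
            detachedPanel = null;
        }
    }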

VR hands selecting detachable menus in a virtual UI


World-anchored menus

World-anchored menus are either:

  • Centred on anchor objects in the environment that show/hide the menu once selected
  • Detached from a hand menu and positioned as a floating UI panel

In both cases, world-anchored menus can be larger than hand menus, which allows them to display more controls. This makes them more suitable for tasks that require frequent access to a large selection of options and settings, such as a set of design tools or large colour palettes.
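
For the first case, a simple anchor-object component might toggle the menu on selection and open it next to the anchor, facing the user. Toggle() is an example hook to call from whatever selection interaction you use (push, pinch, and so on); the prefab’s forward axis is assumed to point away from the viewer.

    using UnityEngine;

    // Anchor object for a world-anchored menu: selecting it shows or hides the
    // menu, which opens next to the anchor and faces the user.
    public class MenuAnchor : MonoBehaviour
    {
        [SerializeField] GameObject menuRoot;   // the larger menu panel to show/hide

        public void Toggle()    // call from the anchor's selection interaction
        {
            bool show = !menuRoot.activeSelf;
            if (show)
            {
                menuRoot.transform.position = transform.position;
                Vector3 away = transform.position - Camera.main.transform.position;
                menuRoot.transform.rotation = Quaternion.LookRotation(away, Vector3.up);
            }
            menuRoot.SetActive(show);
        }
    }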

VR hands selecting world-anchored menus in a virtual UI

Note: There are similarities between world-anchored menus and UI panels. We recommend also reading the UI panels section for additional design tips and techniques.


Selecting world-anchored menu items (Push)

Users activate features by simply pressing buttons placed within the borders of the menu.

Buttons should react clearly to being pushed with distinct visual states and mechanical movement where possible.

Consider adding an audible ‘click’ to indicate when the button has been pushed.
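
Putting those three recommendations together, a push button might track how far a fingertip has travelled into its trigger, move its face by that amount, switch to a pressed colour, and play a click once a threshold is crossed. This Unity C# sketch assumes the button’s local +Z axis points out of the face towards the user and that the fingertip carries a tagged collider with a kinematic Rigidbody; the travel, threshold, and colours are illustrative.

    using UnityEngine;
    using UnityEngine.Events;

    // Push button with mechanical travel, a distinct pressed state and an
    // audible click when the press threshold is crossed.
    [RequireComponent(typeof(Collider), typeof(AudioSource))]
    public class PushButton : MonoBehaviour
    {
        [SerializeField] Transform buttonFace;       // visual part that travels
        [SerializeField] Renderer faceRenderer;
        [SerializeField] Color idleColour = Color.white;
        [SerializeField] Color pressedColour = Color.cyan;
        [SerializeField] float maxTravel = 0.01f;    // metres of mechanical movement
        [SerializeField] float pressThreshold = 0.007f;
        [SerializeField] AudioClip clickClip;
        public UnityEvent onPressed;                 // feature to activate

        float depth;
        bool pressed;

        void OnTriggerStay(Collider other)
        {
            if (!other.CompareTag("Fingertip")) return;

            // How far the fingertip has pushed in along the button's axis
            // (local +Z assumed to point out of the face towards the user).
            Vector3 local = transform.InverseTransformPoint(other.transform.position);
            depth = Mathf.Clamp(-local.z, 0f, maxTravel);
        }

        void OnTriggerExit(Collider other)
        {
            if (other.CompareTag("Fingertip")) depth = 0f;   // spring back
        }

        void Update()
        {
            // Mechanical movement: the face follows the press depth.
            buttonFace.localPosition = new Vector3(0f, 0f, -depth);

            bool nowPressed = depth >= pressThreshold;
            if (nowPressed && !pressed)
            {
                GetComponent<AudioSource>().PlayOneShot(clickClip);   // audible 'click'
                onPressed.Invoke();
            }
            pressed = nowPressed;

            // Distinct visual state while pressed.
            faceRenderer.material.color = pressed ? pressedColour : idleColour;
        }
    }

The onPressed event can then be wired to the feature itself in the Inspector, keeping the press mechanics separate from what the button does.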

VR hands selecting world-anchored menus in a virtual UI

See more of this example:

Try Particles Demo

Got an HTC Vive or Oculus Rift?

Try Button Builder Demo


Moving world-anchored menus

An anchor object can indicate that it is interactive by responding to an approaching hand with animation or a visual state change. It may also help to use an icon, as shown in the Particles demo below, to indicate interactivity. Make sure the anchor object is the right size and shape to be grabbed – it should clearly look like something you could grab by hand in the physical world (see affordances).

Users pinch or grab the anchor object, move it, and release to reposition the menu in the environment. The anchor object must be clearly visible and within comfortable reach in order to do this. World-anchored menus should default to no more than 70–80 cm away from users.
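
A grab handle for the menu could cover all three points: highlight as a hand approaches so it reads as grabbable, follow the grab point while held, and place the menu at a comfortable default distance to begin with. The tracked-hand transform, hover distance, colours, and BeginGrab/EndGrab hooks below are assumptions for illustration, not toolkit defaults.

    using UnityEngine;

    // Grab handle for repositioning a world-anchored menu. It highlights as a
    // hand approaches, follows the grab point while held, and the menu starts
    // at a comfortable default distance from the user.
    public class MenuGrabHandle : MonoBehaviour
    {
        [SerializeField] Transform menuRoot;             // the menu this handle moves
        [SerializeField] Transform trackedHand;          // e.g. palm transform from hand tracking
        [SerializeField] Renderer handleRenderer;
        [SerializeField] Color idleColour = Color.white;
        [SerializeField] Color hoverColour = Color.yellow;
        [SerializeField] float hoverDistance = 0.15f;    // react as the hand approaches
        [SerializeField] float defaultDistance = 0.75f;  // default placement within 70-80 cm

        Transform grabPoint;
        Vector3 grabOffset;

        void Start()
        {
            // Place the menu at a comfortable default distance in front of the user.
            Transform head = Camera.main.transform;
            menuRoot.position = head.position + head.forward * defaultDistance;
            menuRoot.rotation = Quaternion.LookRotation(head.forward, Vector3.up);
        }

        public void BeginGrab(Transform grab)   // call on pinch/grab start
        {
            grabPoint = grab;
            grabOffset = menuRoot.position - grab.position;   // keep menu pose relative to the hand
        }

        public void EndGrab()                   // call on release
        {
            grabPoint = null;
        }

        void Update()
        {
            // Respond to an approaching hand so the handle reads as grabbable.
            bool hovered = Vector3.Distance(trackedHand.position, transform.position) < hoverDistance;
            handleRenderer.material.color = (hovered || grabPoint != null) ? hoverColour : idleColour;

            // Reposition the whole menu via its handle while grabbed.
            if (grabPoint != null)
                menuRoot.position = grabPoint.position + grabOffset;
        }
    }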

VR hands selecting world-anchored menus in a virtual UI

See more of this example:

Try Particles Demo

Got an HTC Vive or Oculus Rift?

Try Button Builder Demo


Dedicated gesture

Users perform a distinctive, dedicated gesture or pose at any time to open a particular panel. This gesture should be introduced using a tutorial.

Dedicated gestures should be reserved for the most frequently used panel in the application, such as a main menu.

Typical gestures include:

  • Turning palm-up and pinching (sketched below)
  • Holding a number of fingers up in a pose
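
As an example of the first gesture, palm-up and pinch can be approximated by checking that the palm faces roughly upwards while the thumb and index tips come together, and toggling the main menu on the frame the pinch begins. The palm and fingertip transforms are assumed to come from your hand-tracking rig, and the thresholds are illustrative starting points.

    using UnityEngine;

    // Dedicated 'palm-up and pinch' gesture for opening the main menu.
    // The palm's up axis is assumed to point out of the palm.
    public class PalmUpPinchGesture : MonoBehaviour
    {
        [SerializeField] Transform palm;
        [SerializeField] Transform thumbTip;
        [SerializeField] Transform indexTip;
        [SerializeField] GameObject mainMenu;
        [SerializeField] float palmUpThreshold = 0.7f;   // how level the palm must be
        [SerializeField] float pinchDistance = 0.02f;    // metres between fingertips

        bool wasPinching;

        void Update()
        {
            bool palmUp = Vector3.Dot(palm.up, Vector3.up) > palmUpThreshold;
            bool pinching = Vector3.Distance(thumbTip.position, indexTip.position) < pinchDistance;

            // Toggle the menu on the frame the pinch starts while the palm faces up.
            if (palmUp && pinching && !wasPinching)
                mainMenu.SetActive(!mainMenu.activeSelf);

            wasPinching = pinching;
        }
    }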