Designing effective mid-air haptics

Here we present some considerations you should make when designing experiences with mid-air haptics.

1. Hand positioning

A key principle is to design the experience so that the user feels the haptic sensation as strongly as possible. Hand positioning is central to achieving this.

The interaction zone is the volume of space above the Ultrahaptics array in which the haptic sensation can be felt and in which the hand tracking device will track the hand. Ensure that the interactions and the haptic objects in the experience are within the interaction zone; anything outside it will feel weak or not be felt at all.

hand-positioning-1-768x311.png

Figure 1. The green diagram shows that the user’s hand and the haptic interactive elements of the experience must be within the 50cm interaction zone to feel the optimum strength of haptic sensations. The red diagram shows a poorly designed experience, where the interactive elements are outside the optimum interaction zone. Please note: interaction zone dimensions vary between array models.
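As a rough sketch of the rule above, an application can reject haptic placements that fall outside the interaction zone before playing them. The zone is modelled here as a simple box above the array; the dimensions are hypothetical, since the real zone shape and size vary between array models.

```python
# Illustrative sketch only: checking whether a point lies inside a
# simplified box-shaped interaction zone above the array.
# The half_width, half_depth, and max_height values are hypothetical.

def in_interaction_zone(x, y, z,
                        half_width=0.2, half_depth=0.2, max_height=0.5):
    """Return True if (x, y, z) in metres, with the array face at the
    origin and z pointing away from the array, is inside the zone."""
    return (abs(x) <= half_width and
            abs(y) <= half_depth and
            0.0 < z <= max_height)
```

For example, a haptic button placed 20cm above the centre of the array passes this check, while one 70cm up does not and should be moved into the zone.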

Mechanoreceptors on the surface of the palm respond to acoustic pressure emitted by the Ultrahaptics array. The angle at which the focused pressure interacts with the hand is therefore an important consideration. To ensure maximum haptic sensation, the experience design should encourage the user to have an open hand with their palm facing the array when inside the interaction zone.

For example, if the array is placed pointing upwards on a table, the palm faces down. Conversely, if the array is facing downwards, acoustic pressure is directed downwards and the user should place their hand with their palm facing up. Remember to consider this during both the interaction design and the physical design of your experience.
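The palm-orientation rule above can be quantified from hand-tracking data. A minimal sketch, assuming the tracker supplies a palm-normal vector, is to take the cosine of the angle between the palm normal and the direction from the palm back toward the array: values near 1 mean the palm is square-on to the array, and values at or below 0 mean it faces away.

```python
# Illustrative sketch only: scoring how well the palm faces the array.
# Assumes a hand-tracking device supplies the palm normal; the vectors
# and coordinate convention here are hypothetical.
import math

def palm_facing_factor(palm_normal, to_array):
    """Cosine of the angle between the palm normal and the direction
    from the palm to the array: 1.0 = square-on, <= 0.0 = facing away."""
    dot = sum(a * b for a, b in zip(palm_normal, to_array))
    mag = (math.sqrt(sum(a * a for a in palm_normal)) *
           math.sqrt(sum(b * b for b in to_array)))
    return dot / mag
```

With an array pointing upwards on a table, a downward-facing palm gives a factor of 1.0, and an upward-facing palm gives -1.0; an experience could prompt the user to reorient their hand when the factor drops too low.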

The same applies to gestures that close off the palm. Avoid grasping gestures, gestures that rotate the palm away from the array, or gestures that close the palm. For example, if the array is pointing up, the sensation of catching a falling object will not be felt. Consider changing the array position, or adding an additional array.

During the physical design of your interaction space, think about the hand position your user is required to be in. This is both for maximising the haptic sensation that the user can feel but also for the user’s comfort.

hand-positioning-ultrahaptics.png

Figure 2. Green images show the correct hand positioning within the interaction zone, with the palm open and facing the array. Red images show suboptimal hand positioning.

The array angle is related to the point above: if the array is placed at an angle or flat on the table, the user must hold their hand at a corresponding angle to receive the optimum haptic sensation. As shown in Figure 2, the interaction zone is related to the array angle.

2. Haptic Congruency

Ensure there is congruency between the haptic sensation, the visual and auditory cues, and the system status.

Designers should aim to use haptics that are congruent with what is being conveyed. Here, congruency means that two or more features correspond in parameters such as timing, shape, and size. For example, seeing a large sphere on the screen while feeling a small sphere on the hand is incongruent.

haptic-congruency.png

Figure 3. Congruency between haptic, auditory, and visual cues in the experience, conveyed in time.

Timing and shape are key parameters for ensuring consistency between the auditory, visual, and haptic experience. This helps to heighten the user’s feeling of presence: research has shown that congruent haptic cues enhance the level of presence users feel in virtual reality compared to incongruent tactile cues.

controlSystemStatus-3.gif

FF_FutureControls-1.gif

Figure 4. Example of congruency in a home control demo where the user moves their hand over the fast forward button and feels a point moving around their hand with congruent speed, size, and timing as the visuals.
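One simple way to keep size congruent, sketched below under assumed values, is to map the on-screen object size down to the hand and clamp it so the sensation never exceeds the usable palm area. The scale factor and palm limit here are hypothetical.

```python
# Illustrative sketch only: deriving a haptic circle radius congruent
# with a visual sphere. The scale factor and palm limit are hypothetical;
# the 3cm usable palm area is taken from the spatial summation guideline.

PALM_MAX_RADIUS_M = 0.015   # half of a 3cm palm-sized sensation

def congruent_radius(visual_radius_m, scale=0.1):
    """Scale a visual radius down to the hand, clamped to the palm."""
    return min(visual_radius_m * scale, PALM_MAX_RADIUS_M)
```

A 5cm on-screen sphere maps to a 5mm haptic circle, while a metre-wide sphere clamps to the palm-sized maximum; the relative sizes of two objects are preserved up to the clamp, which keeps the visual and haptic cues congruent.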

3. The strength of haptic sensation

As with other perceptual modalities, such as vision and hearing, the perceived strength of a stimulus depends on how strongly the sensory receptors are stimulated. This principle applies equally to haptic technology. Two guidelines that we have found useful are:

  1. Spatial Summation
    The size of the shape drawn on the palm affects the perceived strength. The larger the shape, up to the size of the palm and fingers (3cm x 3cm), the stronger it will feel.
  2. Dynamic Stimulation
    Moving the sensation around the hand stimulates more of the mechanoreceptors and therefore produces a stronger, more distinct sensation than keeping it stationary.
    As a corollary, avoid haptic fatigue, where regions of the hand “adapt” to a constant haptic sensation and stop perceiving it.
a) haptic-strength.png
b) line-sensation.png

Figure 5. Stimulating a larger surface area of the hand. a) When representing a line on the hand, two lines drawn next to each other will be perceived as stronger than one. b) The same approach as implemented in the Sensation Editor to provide a stronger line sensation.
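Dynamic stimulation is often achieved by sweeping the control point around a closed path rather than holding it still. A minimal sketch of generating such a path is below; the radius and sample count are hypothetical, chosen only to illustrate the idea.

```python
# Illustrative sketch only: sampling a circular control-point path so
# the sensation keeps moving over the palm (dynamic stimulation) rather
# than dwelling on one spot. Radius and sample count are hypothetical.
import math

def circle_path(radius_m, n_samples):
    """Sample n_samples (x, y) points around a circle, in metres."""
    return [(radius_m * math.cos(2 * math.pi * i / n_samples),
             radius_m * math.sin(2 * math.pi * i / n_samples))
            for i in range(n_samples)]
```

Replaying these samples in a loop keeps the mechanoreceptors under changing stimulation, which both strengthens the sensation and avoids the adaptation that causes haptic fatigue.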

4. Haptic switching duration

Pauses in sensations can be felt as much as the sensations themselves. Rather than switching directly from one haptic effect to another, use pauses or gaps to increase impact and to represent changes, for example when moving from one button to another. When switching between haptics, leave a delay of at least 0.2 seconds. This delay, known as the haptic switching duration, makes the change in haptics more discernible.

haptic-switching-duration-768x337.png

Figure 6. Haptic switching duration: Here, different sized circle sensations are separated by 0.2s.
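The guideline above amounts to inserting a fixed silent gap into the playback timeline between consecutive sensations. A minimal sketch, where sensations are just hypothetical (name, duration) pairs:

```python
# Illustrative sketch only: scheduling sensations with a 0.2 s gap
# (the haptic switching duration) between each one, so the change
# between haptics is discernible. Sensation names are hypothetical.

SWITCH_GAP_S = 0.2

def schedule(sensations):
    """Return (start_time, name) pairs, one per sensation, with the
    switching gap inserted after each sensation ends."""
    timeline, t = [], 0.0
    for name, duration in sensations:
        timeline.append((t, name))
        t += duration + SWITCH_GAP_S
    return timeline
```

Two one-second circle sensations would start at 0.0s and 1.2s respectively, with the 0.2s gap between them carrying the change.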

5. Noise reduction

Time Point Streaming gives the haptic designer up to four control points whose position and intensity can be updated up to 40,000 times per second. This allows for highly immersive and effective haptics but, because of how the Ultrahaptics array uses ultrasound, it can also produce audible artefacts. Audible noise occurs when ultrasound demodulates in air at frequencies we can hear. This can be caused by changing the state of control points too rapidly, such as by

  • Changing the location
  • Switching between different haptics
  • Switching the haptic on and off

To minimise audible artefacts, avoid sharp, rapid changes in position, intensity or direction of control points.

For the example above, ramp the intensity down during the haptic switching period. You can experiment with different ramping approaches, such as a linear or a cosine profile. We recommend 0.2 seconds as a haptic switching duration.

noise-reduction.png

Figure 7. Example of the intensity ramping down during the haptic switching duration to reduce audible noise emitted from the array. Duration of this time interval is 0.2 seconds.
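A cosine ramp like the one shown in Figure 7 can be sketched as a simple intensity envelope. This is an illustrative formula only, not the SDK's API; it starts and ends with zero slope, which avoids the abrupt step changes that demodulate as audible noise.

```python
# Illustrative sketch only: a cosine ramp-down of control-point
# intensity over the 0.2 s haptic switching duration. The zero slope
# at both ends avoids the sharp changes that cause audible artefacts.
import math

def cosine_ramp_down(t, ramp_duration=0.2):
    """Intensity factor in [0, 1] at time t seconds into the ramp."""
    if t >= ramp_duration:
        return 0.0
    return 0.5 * (1.0 + math.cos(math.pi * t / ramp_duration))
```

The factor starts at 1.0, passes 0.5 at the midpoint, and reaches 0.0 at the end of the switching period; multiplying the control point's intensity by it produces the smooth transition shown in the figure.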

Some additional tips for changing the properties of a sensation while keeping the noise to a minimum:

  • Make sure the control points are not too close together.
  • Avoid trying to create shapes with sharp angles, such as squares or triangles, by rounding off corners of the control point trajectory.
  • Avoid placing control points too close to the array surface or at the extremities of the interaction zone.
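The corner-rounding tip above can be sketched with a simple moving average over a closed control-point path. This is one illustrative smoothing approach among many; the square path used in the test is hypothetical.

```python
# Illustrative sketch only: rounding off the sharp corners of a closed
# control-point path (e.g. a square) with a moving average, removing
# the abrupt direction changes that can cause audible noise.

def smooth_closed_path(points, window=3):
    """Average each (x, y) point with its neighbours around a closed
    path; larger windows give more rounding."""
    n, half = len(points), window // 2
    return [
        (sum(points[(i + j) % n][0] for j in range(-half, half + 1)) / window,
         sum(points[(i + j) % n][1] for j in range(-half, half + 1)) / window)
        for i in range(n)
    ]
```

Applied to a square, each corner point is pulled inward toward its neighbours, so the trajectory turns gradually instead of changing direction instantaneously at each vertex.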

Note:

  1. Amplitude Modulation does not require this type of mitigation, since smooth transitions are designed into that mode.
  2. The Ultrahaptics STRATOS platform features a built-in smoothing filter that minimises extreme switching. This filter is active by default.

6. Haptic priority

When more than one haptic sensation might play back simultaneously, you must choose the most dominant one. This is similar to the visual hierarchy principle in visual design, where dominance conveys what is most critical about the experience.

For example, consider an experience that uses the hand to represent the position of a spaceship, with haptics for the presence of the spaceship and for lasers firing from it. When the spaceship is hit by enemy fire, the haptics switch to represent an explosion, an event which changes the haptic priority.

haptic-priority.png

Figure 8. Illustrating the haptic priority principle where the highest priority haptic will play.

Depending on events in the interaction, the priority will shift. In this example, an explosion always takes priority.
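The spaceship example can be sketched as a simple priority lookup: of the sensations requested in a given frame, only the highest-priority one plays. The sensation names and numeric priorities below are hypothetical.

```python
# Illustrative sketch only: selecting the single sensation to play when
# several are requested at once. Higher numbers win; the names and
# priority values are hypothetical, after the spaceship example.

PRIORITIES = {"presence": 0, "laser": 1, "explosion": 2}

def select_haptic(active):
    """Return the highest-priority sensation among the active ones,
    or None when no sensation is requested."""
    if not active:
        return None
    return max(active, key=lambda name: PRIORITIES[name])
```

While the ship simply exists, "presence" plays; firing promotes "laser"; and an explosion, being highest priority, displaces everything else for its duration.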

7. Static or moving

This question concerns the hand movements the user will make and the intention behind them. Ultimately, the question is whether you want your users to move their hands towards a ‘static’ haptic sensation, or whether the user’s hand will be tracked and the haptic sensation will follow it. For now, think of the interaction requirements behind this question: do you want the user to trigger something in a certain location?

static-movin.png

Figure 9. Illustrating the static and moving principle.

In Figure 9, example 1) shows three buttons fixed in space within the interaction zone; the user moves their hand between these buttons, but the haptic for each button stays fixed. In example 2), the button follows the hand: the user will feel a button wherever their hand is located within the interaction zone.
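The two modes in Figure 9 reduce to a single design decision about where the sensation is projected each frame. A minimal sketch, with hypothetical mode names and 3D positions:

```python
# Illustrative sketch only: the two targeting modes from Figure 9.
# In "static" mode the haptic stays at a fixed button position and the
# user moves to it; in "tracking" mode the haptic follows the tracked
# hand. Mode names and positions are hypothetical.

def haptic_target(mode, hand_pos, button_pos):
    """Choose where to project the sensation for the given mode."""
    if mode == "static":
        return button_pos   # example 1): user moves their hand to the button
    if mode == "tracking":
        return hand_pos     # example 2): the button follows the hand
    raise ValueError("mode must be 'static' or 'tracking'")
```

Static targets suit spatially meaningful controls such as fixed buttons; tracking targets suit feedback that should always be felt, regardless of where the hand is in the interaction zone.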

You can read more articles about designing with Ultrahaptics in our complete series on design.
