UCA Tutorial #11: HMI controls


Introduction

In this tutorial, we take the elements discussed in previous lessons and apply them to create a simple, touchless control interface – the kind of haptic interface likely to be found in the smart home of the future.

Our interface is a simple console that appears to float in mid-air and has a single button and a slider control. Each control becomes active when the hand enters its vicinity: the button toggles between its on and off states on each interaction; the slider sets a value between 0 and 1. The controls are connected by a top-level script so that the button changes the scene’s lighting and the slider controls the background music volume. Haptics are targeted to complement the state of each control – for example, when the button is pressed, clicked and released, or when the slider passes a notched marker point.

[Image: UCA HMI button and slider]

Tip

There are many ways to add haptics to an application. The ones we have used here are for demonstration purposes, but you are encouraged to experiment to find what’s best for your application. For more information, read our series on haptic sensation design.

Up and Running

  1. Download or clone the UnityExamples project from our Unity Examples GitHub repo.
  2. In Unity, open the UnityExamples project.
  3. From the Examples folder, open the ButtonAndSlider example scene.
  4. Add the UCA package and the Leap Motion assets as shown in previous tutorials, connect your hardware, and press play to interact with the scene.

For each control, you will experience a haptic that complements the action undertaken. The controls also have an audible cue that coincides with specific haptic events.

Button

You can read a detailed description of the design of the haptics for a mid-air push-button in our separate article on using the circle sensation.

To summarise, the button is designed to maintain its state – on or off – after each click.

  1. On first moving the hand over the button, a tactile cue conveys the presence of the control:
    • The haptic is implemented as a circle sensation, with a diameter matching the button.
    • It tracks the hand within the vertical volume of the button.
    • The button colour changes to give a visual indication of the interaction.
  2. The intensity of the circle increases as the hand presses down.
  3. Once the presser descends to a predefined point, a “click” is experienced, indicating that the button has been fully depressed.
    This affordance is implemented as an audible click with a brief haptic gap, before the sensation returns at maximum intensity.
  4. On releasing the button, a second click is felt to indicate that the switch has changed state. In our example, the background lighting changes.
  5. Once the button is released, the presence haptic’s intensity falls back to its initial value.
  6. The haptic cue stops once the hand leaves the vicinity.

[Image: UCA HMI button]

Slider

The slider models a physical fader similar to those found in lighting or audio mixing consoles, and tracks the fingers as they move back and forth along a rail.

  1. When the hand is placed in the slider’s vicinity, the slider’s colour changes and the presence haptic is felt at the finger position.
  2. The slider moves to the position of the fingers as they move along the indicated line of travel.
  3. As the value increases, the haptic changes size to convey the setting.
  4. The slider’s haptic tracks the finger in Unity’s XY plane even when it moves away from the control’s physical position, maintaining the interactive presence.
  5. As the slider crosses each marker point, the haptic briefly changes to a line running from fingertip to palm.
  6. Once the hand leaves the vicinity of the slider, the haptics are disabled.

You can see how the haptic feedback changes in the right-hand image below: haptic feedback indicates both presence and value, using a circle that is stretched wider as the volume increases. You can also see the notch points, indicated by the line haptic.

[Images: UCA HMI slider; UCA HMI slider with sensation]

Scene analysis and mechanics

Opening up the scene, we can see the contents of the Button and Slider console scene. We’ve already added the Ultrahaptics and Leap Motion prefabs. Much of the scene’s functionality is handled by the DemoController and Console game objects.

[Image: UCA button and slider scene hierarchy]

[Image: UCA button and slider scene]

The Console contains all the objects needed to construct and operate the button and slider.

  • The button is managed using the ButtonVicinity object. The ButtonPresser, a child of the ButtonVicinity, tracks the button’s travel, hosting the images and sensation source game objects associated with the button movement.
  • The SliderVicinity object manages the slider behaviour and mechanics. It has the Slider and Slider Sensation objects as children. The Slider object represents the moving component and contains graphics as well as a sensation source for the notches.
  • The top-level DemoController object uses the attached ConsoleController script to connect the state of the controls to an actual function – the button controls the background image colour, the slider controls the music volume. In addition, the script enables and disables the button and slider to ensure only one is functional at a time.

Button Vicinity Manager

Looking at the ButtonVicinity object in detail, you will see that it is a capsule collider with an attached script, ButtonVicinityManager.cs.

[Image: Button Vicinity inspector]

The ButtonVicinityManager manages the behaviour of the button by reacting to the trigger events raised when other colliders enter or leave the vicinity.

Tip

The Leap Motion hand model comprises a number of capsule objects, which makes it awkward to determine when the hand is actually touching the button. To simplify detection, we use the attached HandTrackedBlock: a simple box collider with dimensions similar to an average hand, which tracks the palm position.
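Conceptually, the proxy behaves like the sketch below: a trigger box collider that simply follows the palm each frame (the class name, palm reference and fields here are illustrative, not the exact script shipped with the example):

using UnityEngine;

// Sketch of a palm-tracking proxy: attach to a GameObject with a BoxCollider
// (isTrigger = true) sized roughly to an average hand.
public class HandTrackedBlockSketch : MonoBehaviour
{
    // Palm transform from the Leap Motion hand model, assigned in the inspector.
    [SerializeField] private Transform palm;

    void Update()
    {
        if (palm == null) return;

        // Follow the palm so that one simple box stands in for the
        // many capsule colliders of the hand model.
        transform.position = palm.position;
        transform.rotation = palm.rotation;
    }
}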

The Button Vicinity Manager models the button’s behaviour as a simple state model, shown below:

[Image: Button Vicinity Manager state machine]
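The states referenced in the snippets below correspond to an enum along these lines (inferred from the snippets in this tutorial; the actual declaration lives in ButtonVicinityManager.cs and may define more):

// States inferred from the snippets in this tutorial.
private enum SWITCH_POSITION_E
{
    DISABLED,   // switched off externally by the ConsoleController
    READY,      // waiting for the hand to enter the vicinity
    ACTUATING,  // hand on the button; presence haptic running
    CLICKED     // button fully depressed, waiting for release
}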

As you can see, the button is in the READY state until the hand enters the vicinity from above. It then moves to ACTUATING – in other words, you are now touching the virtual button – and the presence haptic is enabled. This is shown in the following snippet from the script’s Update function:

if (_switchPos == SWITCH_POSITION_E.READY)
{
    // Hand has entered the vicinity from above the button's top position
    if (_buttonPositionY > _buttonMaxYValue)
    {
        _hoverState.SetActive(true);
        _pressHaptic.Running = true;
        _switchPos = SWITCH_POSITION_E.ACTUATING;
    }
}

While actuating, pressing increases the intensity of the sensation. On passing a threshold, the press is complete and the button becomes CLICKED. The associated haptic and audio cues confirm this.

else if (_switchPos == SWITCH_POSITION_E.ACTUATING)
{
    // Update the pressing haptic's intensity from the normalised button height
    setPressHapticIntensity((_buttonPresser.transform.position.y - _buttonMinYValue) / _buttonTravelDistance);

    if (_buttonPositionY < _buttonMinYValue)
    {
        _switchPos = SWITCH_POSITION_E.CLICKED;
        pressComplete();
    }
}
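The argument passed to setPressHapticIntensity is the normalised button height (1 when fully up, 0 when fully pressed). A plausible sketch of the helper is shown below; how the value actually reaches the sensation depends on the UCA version, so the setter call and the intensity bounds here are assumptions:

// Sketch only: invert the normalised height so the sensation grows
// stronger as the press deepens.
private void setPressHapticIntensity(float normalisedHeight)
{
    float t = Mathf.Clamp01(normalisedHeight);

    // _minIntensity/_maxIntensity are hypothetical fields; the example
    // exposes its own values in the inspector.
    float intensity = Mathf.Lerp(_maxIntensity, _minIntensity, t);

    _pressHaptic.SetInput("intensity", intensity); // hypothetical setter
}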

The hand can continue moving down (the capsule collider extends below the travel of the visual component), but the click is only completed once the hand returns upwards past a second point. It is at this point that the button switches between on and off. The button state returns to ACTUATING, since the button can be pressed again and again.

else if (_switchPos == SWITCH_POSITION_E.CLICKED)
{
    if (_buttonPositionY >= _buttonClickUpPosition)
    {
        // Hand is moving back up out of the button.
        _switchPos = SWITCH_POSITION_E.ACTUATING;
        clickComplete();
    }
}

It is only when the hand leaves the vicinity that we return to the READY state. Note that, since the hand can leave the capsule collider by exiting to the side, we can return to READY directly from CLICKED – just as in real life we may accidentally touch a button without completing the operation.

private void OnTriggerExit(Collider other)
{
    if (_switchPos == SWITCH_POSITION_E.DISABLED)
        return;               

    if (other.name == "HandTrackedBlock")
    {
        // Leaving the vicinity, go back to ready state
        _switchPos = SWITCH_POSITION_E.READY;

        _tempObject = null;
        _hoverState.SetActive(false);
        _pressHaptic.Running = false;
    }      
}

The click sensation is achieved using a haptic gap. This is done by running the ButtonClickSensation with its intensity set to zero, using the RunForDuration function introduced in Tutorial 2 (the playback durations can be set in the inspector).
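Under those assumptions, the press-complete handler might look like the following sketch (the field names and the audio cue are illustrative; only RunForDuration is confirmed by this tutorial):

// Sketch: the ButtonClickSensation has zero intensity, so running it
// briefly is felt as a gap in the otherwise continuous press haptic.
private void pressComplete()
{
    _clickAudio.Play();                                    // audible click (illustrative AudioSource)
    _buttonClickSensation.RunForDuration(_clickDuration);  // haptic gap; duration set in the inspector
    setPressHapticIntensity(0.0f);                         // sensation returns at maximum intensity
}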

Slider components

The slider’s behaviour is managed by the SliderManager script attached to the SliderVicinity object. Like the button, it uses a simple state model to determine behaviour at each point, and uses the proxy HandTrackedBlock to trigger interaction with the slider vicinity. The slider tracks a defined finger component, set in the SliderVicinity inspector’s Tracked Finger Id field as middleFinger_intermediate_position, i.e. the middle of the middle finger. The Slider object represents the actual position of the slider, as well as holding a sensation source component that is triggered at notch points along the slider’s travel.

Open up the SliderManager script to see how it operates. Note that here, too, we use a state machine to determine the interaction behaviour at each stage:

[Image: Slider Manager state machine]

The slider has three states (as well as the DISABLED state, which is controlled externally): ENABLED, READY and TRACKING. The slider is in the READY state when the hand enters the vicinity cuboid. The presence haptic is enabled and the slider graphic is shown to give a visual cue.
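As with the button, the states correspond to an enum along these lines (inferred from the prose and snippets; the actual declaration lives in the SliderManager script):

// States inferred from this tutorial; the script may define more.
private enum SLIDER_STATE_E
{
    DISABLED,  // switched off externally by the ConsoleController
    ENABLED,   // active, but no hand in the vicinity
    READY,     // hand in the vicinity; presence haptic on
    TRACKING   // finger aligned with the travel and driving the slider
}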

Note that the vicinity of the slider extends below the surface of the console. This gives a generous tolerance for the placement of the hand – a useful property when designing any hand-tracked interaction.

The slider moves to the TRACKING state when the hand is aligned with the direction of the slider’s travel. Hand alignment is defined as being within 30˚ of the slider vicinity’s forward vector (the _handAlignmentAngleThreshold value). This ensures that the slider position is not changed by a hand arriving accidentally from another direction:

if (_sliderState == SLIDER_STATE_E.READY)
{
    _hoverState.SetActive(true);
    if (isHandAligned(_handAlignmentAngleThreshold))
    {
        _sliderState = SLIDER_STATE_E.TRACKING;                
    }
}
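The isHandAligned check itself is not shown in this tutorial; a minimal sketch, assuming the hand direction is taken from the proxy block’s forward vector, might be:

// Sketch: compare the hand's pointing direction with the slider's
// direction of travel (transform.forward on the SliderVicinity).
private bool isHandAligned(float thresholdDegrees)
{
    Vector3 handDirection = _handTrackedBlock.transform.forward; // hypothetical reference
    return Vector3.Angle(handDirection, transform.forward) <= thresholdDegrees;
}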

Once in the TRACKING state, if the tracked finger – middleFinger_intermediate_position – is within the bounds of the slider vicinity, the haptic is positioned on the fingertip. Once the tracked finger begins to move along the direction of the slider vicinity, the slider moves with it. This updates the position of the slider graphic, the haptics, and the slider’s value.

else if (_sliderState == SLIDER_STATE_E.TRACKING)
{
    if(isTrackedFingerInCollider(_trackedFingerId))
    {
        // Track the slider haptic to the middle finger
        Vector3 fingerPos = _autoMapper.GetValueForInputName(_trackedFingerId);
        _sensationTransform.position = new Vector3(fingerPos.x, fingerPos.y, _sensationTransform.position.z);

        if(HandAlongSliderDirection(fingerPos))
        {
            // Only update the slider position if the hand is moving along its motion of travel.
            float x = UpdateSliderPosition(fingerPos);

            UpdateSliderValue(x);
        }
        else
        {
            // Return to a ready state.
            _sliderState = SLIDER_STATE_E.READY;
        }
    }
}
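UpdateSliderPosition itself is not shown above. Here is a sketch under the assumption that the slider travels along Unity’s X axis (the rail bounds and field names are illustrative):

// Sketch: clamp the finger to the rail, move the slider graphic there,
// and return the normalised 0–1 value along the travel.
private float UpdateSliderPosition(Vector3 fingerPos)
{
    float x = Mathf.Clamp(fingerPos.x, _railMinX, _railMaxX); // _railMinX/_railMaxX are illustrative

    Vector3 p = _slider.transform.position;
    _slider.transform.position = new Vector3(x, p.y, p.z);

    return Mathf.InverseLerp(_railMinX, _railMaxX, x);
}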

As well as moving the slider’s circle sensation, we change its dimensions, making it larger as the value of the slider increases. This contributes to the affordance of the slider’s value through the haptic.
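One simple way to sketch this, assuming the circle sensation’s size follows the scale of its transform (how size is actually exposed depends on the sensation’s inputs):

// Sketch: stretch the circle haptic along the direction of travel as the
// slider value grows, so the sensation itself conveys the current setting.
private void UpdateHapticSize(float sliderValue01)
{
    const float minScale = 1.0f; // illustrative bounds
    const float maxScale = 2.0f;

    float s = Mathf.Lerp(minScale, maxScale, Mathf.Clamp01(sliderValue01));
    _sensationTransform.localScale = new Vector3(s, 1.0f, 1.0f);
}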

Tracking finger direction

A useful parameter when tracking hand gesture or position is velocity. When the slider is in the TRACKING state, we want to know that the slider is being moved. We define this by taking the velocity of the selected finger position and checking that its direction is within some range of the direction of the slider’s travel. The code below updates a smoothed displacement value using the previous and latest position information. It then finds the angle between this displacement and the slider vicinity’s forward vector. If this is within 20˚, we say the finger is acting on the slider.

private bool HandAlongSliderDirection(Vector3 fingerPos)
{
    float toleranceDegrees = 20.0f;
    float beta = 0.5f; // Smoothing coefficient

    // Change in position since last call
    Vector3 sliderHandDisplacement = fingerPos - _lastFingerPoint;
    _lastFingerPoint = fingerPos;

    // Smooth using an exponentially weighted (autoregressive) average
    _smoothedDisplacement = (1 - beta) * sliderHandDisplacement + beta * _smoothedDisplacement;

    // Measure the angle between the smoothed displacement and the
    // slider's direction of travel.
    float angle = Vector3.Angle(_smoothedDisplacement, transform.forward);

    // Within +/- 20 degrees?
    return angleWithinDegrees(angle, toleranceDegrees);
}

Although we ignore time and never calculate an absolute velocity, the direction of the smoothed displacement is good enough for our purposes.
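The angleWithinDegrees helper is not shown in this tutorial, but since Vector3.Angle returns an unsigned angle, a sketch reduces to a simple comparison. We also assume here that movement in the opposite direction along the rail should count, so the slider can be dragged both ways:

// Sketch: Vector3.Angle returns an unsigned angle in [0, 180], so a
// comparison covers the +/- tolerance; angles near 180 degrees are
// accepted so the slider can be moved in both directions (an assumption).
private bool angleWithinDegrees(float angle, float toleranceDegrees)
{
    return angle <= toleranceDegrees || angle >= 180.0f - toleranceDegrees;
}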

Adding haptic notches

To add the experience of moving a physical slider, we implement virtual “notches” at regular points along the slider’s travel. In our example, shown in the animation above, we trigger a notch to match the numbered markers along the side of the slider. This is done by activating a different haptic – in our example the trackedLine – together with an audio cue. The trackedLine runs from the centre of the palm to the tip of the middle finger and lasts only a tenth of a second:

private void UpdateSliderValue(float x)
{
    // Clamp between 0 and 1 and round to hundredths
    x = Mathf.Clamp(x, 0, 1.0f);
    x = Mathf.Round(x * 100.0f)/100.0f;
    
    _sliderValue = x * _sliderMaxValue;

    if (isNotch(x, _notchGradations, ref _lastNotch))
    {
        // play notch haptic and sound cue
        _notchSensation.RunForDuration(0.1f);
        _notchClick.Play();
    }
}
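The isNotch test is not shown in the tutorial. A sketch, assuming _notchGradations is the spacing between notches as a fraction of the 0–1 range (the signature follows the call site above; the internals are an assumption):

// Sketch: quantise the value to the nearest gradation and report true
// whenever that gradation changes, i.e. a notch has been crossed.
private bool isNotch(float x, float gradations, ref float lastNotch)
{
    // e.g. gradations = 0.1f gives notches at 0.0, 0.1, ..., 1.0
    float notch = Mathf.Round(x / gradations) * gradations;

    if (!Mathf.Approximately(notch, lastNotch))
    {
        lastNotch = notch;
        return true;
    }
    return false;
}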

The effect of this is to give a noticeable difference from the presence sensation, again giving affordance of movement and position.

Connecting elements

Both the button and the slider simply set a visible property – an on/off state and a value between 0 and 1, respectively – that is used by the top-level element of the scene. In our case, this is the ConsoleController script attached to the DemoController game object. The button controls the background image colour; the slider controls the background music volume.

void Update()
{
    UpdateControlStatus();

    // Slider updates audio volume
    if (_lastSliderVal != _sliderManager._sliderValue)
    {
        _musicSource.volume = _sliderManager._sliderValue;
        _lastSliderVal = _sliderManager._sliderValue;
    }

    // Button updates switch status
    if (_lastSwitchState != _buttonVicinityManager._switchState)
    {
        // Button has clicked
        if (_buttonVicinityManager._switchState == ButtonVicinityManager.SWITCH_STATE_E.ON)
        {
            backgroundImage.color = onColor;
        }
        else
        {
            backgroundImage.color = offColor;
        }
        _lastSwitchState = _buttonVicinityManager._switchState;
    }
}
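UpdateControlStatus is where the mutual exclusion described earlier happens. A sketch of the idea, with hypothetical properties and Enable/Disable methods standing in for whatever mechanism the example actually uses to set the DISABLED states:

// Sketch: keep only one control usable at a time by disabling the other
// while a hand is interacting (property and method names are illustrative).
private void UpdateControlStatus()
{
    bool buttonInUse = _buttonVicinityManager.IsHandInVicinity; // hypothetical property
    bool sliderInUse = _sliderManager.IsHandInVicinity;         // hypothetical property

    if (buttonInUse && !sliderInUse)
    {
        _sliderManager.Disable();          // e.g. sets SLIDER_STATE_E.DISABLED
    }
    else if (sliderInUse && !buttonInUse)
    {
        _buttonVicinityManager.Disable();  // e.g. sets SWITCH_POSITION_E.DISABLED
    }
    else if (!buttonInUse && !sliderInUse)
    {
        _sliderManager.Enable();
        _buttonVicinityManager.Enable();
    }
}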

You can see how it would be a simple task to add additional controls, customise the haptics and behaviour, and create a more complex control interface.

For more information read our series on designing with haptics.
