Previous tutorial: Sensation Animation and Sequencing
Next tutorial: Building an interactive poster
- Introduction
- Up and running
- Scene Elements
- Referencing a UCA data source in your script
- Making our sensation two-handed
- Forcefield Prefab
- Worked exercise – Audio Filter Slider
Introduction
In this lesson we look in detail at the “Forcefield” example and how we can use it to implement two-dimensional plane intersection. This simple application projects a fixed, vertical quad, representing a forcefield beam emanating from the array. As the hand passes through the beam, the sensation of the forcefield is felt. This example also has the ability to replicate the sensation on a second hand, allowing for two-handed interaction.
We also explain how to reference UCA data sources within your C# Unity scripts. At the end of the lesson, we bring it all together, working through an example to extend the forcefield beam to use as a simple control for an audio filter.
Up and running
Open the Forcefield example scene from the asset's Examples folder. Make sure you have first added the Leap Motion asset to your project and its prefab to the scene's UltrahapticsKit/TrackingOrigin object, then play the scene.
Move your hand through the blue forcefield: the sensation tracks the hand within the plane. You can see how the position of the sensation tracks the hand by enabling the Unity gizmos feature.
The sensation is created by projecting a line sensation across the hand as it intersects the plane. The line is fixed to the plane of the “forcefield”, giving the impression that you are feeling the plane itself. Place your other hand in the boundaries of the plane and feel the forcefield extend across it too.
Scene Elements
The Forcefield Unity scene is slightly more complex than those in previous examples: in addition to the usual functional elements – lighting, camera, background, UltrahapticsKit – it includes the ForcefieldBeam prefab that ships with the UCA.
ForcefieldBeam Prefab
The visible forcefield object is a box collider, set as a trigger, with a cube mesh renderer. It has the “Haptic Trigger Region” script used in previous example scenes to handle collisions and to start and stop haptics.
ForcefieldSensation
This object is a child of the ForcefieldBeam and contains our Sensation Source component. The Sensation Source is set to use the Forcefield sensation block, which is designed to calculate the line segment of intersection between the hand (modelled as a quadrilateral) and a boundary object (the imaginary forcefield beam!).
Changing the forcefield
The ForcefieldSensation object also has an attached script called ForcefieldBlockInputUpdater.cs. This has two fields that take the parent's transform and the Sensation Source. Its purpose is to update the Forcefield sensation's forcefieldCenter, forcefieldUp, and forcefieldRight inputs:
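In outline, the updater reads the plane's pose each frame and pushes it into the sensation's inputs. The following is a minimal sketch of that idea; the field names and the input-setting calls (a Sensation.Inputs collection keyed by input name) are assumptions for illustration, so check the ForcefieldBlockInputUpdater.cs shipped with the asset for the exact API.

using UnityEngine;
using UltrahapticsCoreAsset;
using Vector3 = UnityEngine.Vector3;
using Transform = UnityEngine.Transform;

// Sketch only: the real ForcefieldBlockInputUpdater.cs may differ.
public class ForcefieldBlockInputUpdater : MonoBehaviour
{
    public Transform forcefieldTransform;  // the parent ForcefieldBeam's transform
    public SensationSource sensation;      // the Sensation Source on this object

    void Update()
    {
        // Derive the plane's centre and axes from the transform each frame
        var center = forcefieldTransform.position;
        var up = forcefieldTransform.rotation * Vector3.up;
        var right = forcefieldTransform.rotation * Vector3.right;

        // Push the values into the Forcefield block's inputs
        // (assumed API; the actual calls and types may differ)
        sensation.Sensation.Inputs["forcefieldCenter"].Value = center;
        sensation.Sensation.Inputs["forcefieldUp"].Value = up;
        sensation.Sensation.Inputs["forcefieldRight"].Value = right;
    }
}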
This allows the sensation to track the ForcefieldPlane's position, tilt, and rotation correctly. You can therefore place the forcefield, or interactive plane, in any orientation independently of the array, while maintaining the haptic sensation on the plane. This is shown in the image on the right, where the plane is tilted back and moved towards the back of the array.
Plane Intersection Mechanics
The forcefield sensation is created by projecting the Line Path, introduced in Tutorial 6, across the part of the hand where it intersects the forcefield plane. The forcefield and hand are modelled by the Forcefield Quad and Palm Quad. The included QuadToQuadIntersection block takes care of the tricky vector mathematics of calculating the line's endpoints, so you never have to work it out yourself.
You can read full details of the QuadToQuadIntersection block’s inputs and outputs in the Block Manifest. We connect the inputs of the QuadToQuadIntersection to the Forcefield plane and Palm Quad (i.e. a notional plane attached to the hand) and the output to the LinePath block.
import QuadToQuadIntersection

quadToQuadIntersectionBlockInstance = createInstance("QuadToQuadIntersection", "QuadIntersectionInstance")
linePathInstance = createInstance("LinePath", "line")

connect(quadToQuadIntersectionBlockInstance.endpointA, linePathInstance.endpointA)
connect(quadToQuadIntersectionBlockInstance.endpointB, linePathInstance.endpointB)
The resulting output is then connected to the createSensationFromPath script function.
forcefieldBlock = sh.createSensationFromPath("ForcefieldLine",
    {
        ("palm_position", quadToQuadIntersectionBlockInstance.center0): (0.0, 0.0, 0.0),
        ("palm_scaled_direction", quadToQuadIntersectionBlockInstance.up0): (0.0, 0.0, 0.0),
        ("palm_scaled_transverse", quadToQuadIntersectionBlockInstance.right0): (0.0, 0.0, 0.0),
        ("forcefieldCenter", quadToQuadIntersectionBlockInstance.center1): (0.0, 0.1, 0.0),
        ("forcefieldUp", quadToQuadIntersectionBlockInstance.up1): (0.0, 0.1, 0.0),
        ("forcefieldRight", quadToQuadIntersectionBlockInstance.right1): (0.1, 0.0, 0.0)
    },
    output = linePathInstance.out,
    definedInVirtualSpace = True
)
The “palm_scaled_direction” and “palm_scaled_transverse” inputs are provided by the Leap Data Source, introduced in Tutorial 5. Using them means the length of the intersection line is properly scaled to the size of the hand.
And the block connectivity for the completed sensation:
Referencing a UCA data source in your script
You can access a UCA Data Source's inputs from within a C# script.
IAutoMapper is an interface that can be used to reference any UCA Data Source. Through it, the Leap Motion's named inputs – the values returned by the Leap Motion camera module – can be read from any C# script in your scene.
You can see how the Leap’s palm_position and palm_normal are used here:
Referencing Leap Motion Inputs
private IAutoMapper autoMapper_;

void Start()
{
    // Find the UCA Data Source (e.g. the Leap Data Source) in the scene
    autoMapper_ = FindObjectOfType<IAutoMapper>();
}

public Vector3 getPalmPosition()
{
    if (autoMapper_.HasValueForInputName("palm_position"))
    {
        var pos = autoMapper_.GetValueForInputName("palm_position");
        return pos;
    }
    return new Vector3(0, 0, 0);
}

public Quaternion getPalmRotation()
{
    if (autoMapper_.HasValueForInputName("palm_normal"))
    {
        var normal = autoMapper_.GetValueForInputName("palm_normal");
        var direction = autoMapper_.GetValueForInputName("palm_direction");
        var rot = Quaternion.LookRotation(direction, normal);
        return rot;
    }
    // Quaternion.identity is a valid "no rotation" fallback
    // (the all-zero quaternion is not a valid rotation)
    return Quaternion.identity;
}
Making our sensation two-handed
The forcefield.py sensation block has additional instructions that enable this sensation to operate with both hands. Looking in the folder of defined blocks – StreamingAssets/Python – you will find TwoHandedSensation.py. This script provides the succinctly named helper function makeSensationTwoHanded, which replicates the input sensation so that it renders onto a second hand, returning a two-handed instance of the sensation.
from TwoHandedSensation import *

# Making the Forcefield work for two hands
forcefieldInst = createInstance("ForcefieldLine", "Inst")
twoHandedForcefield = makeSensationTwoHanded(forcefieldInst, "Forcefield")
At the core of this function is a reference to the TwoHandsMux block. This block takes inputs from both hands and alternately sends each to the output at a regular interval. Using this, the makeSensationTwoHanded function alternates the sensation target between the left and right hand with a time period defined by handSwitchingPeriod, which has a default value of 0.01 seconds in the helper function.
To complete the implementation of the two-handed forcefield, we connect up all the inputs to the top-level block as shown:
unconnectedInputs = ["forcefieldCenter", "forcefieldUp", "forcefieldRight"]
defineInputs(twoHandedForcefield, *unconnectedInputs)
for input in unconnectedInputs:
    connect(getattr(twoHandedForcefield, input), getattr(forcefieldInst, input))

defineBlockInputDefaultValue(twoHandedForcefield.forcefieldCenter, (0, 0.1, 0))
defineBlockInputDefaultValue(twoHandedForcefield.forcefieldUp, (0, 0.1, 0))
defineBlockInputDefaultValue(twoHandedForcefield.forcefieldRight, (0.1, 0, 0))
defineBlockInputDefaultValue(twoHandedForcefield.drawFrequency, (100, 0, 0))
setMetaData(twoHandedForcefield.handSwitchingPeriod, "Input-Visibility", False)

defineOutputs(twoHandedForcefield, "out")
connect(getattr(forcefieldInst, "out"), getattr(twoHandedForcefield, "out"))
Forcefield Prefab
Within the Forcefield example project you will find the prefabs subfolder, containing the ForcefieldBeam prefab. Using it, you can add additional forcefield panels around your scene.
This technique can be used to add boundaries to an interface or to mark out additional areas of operation. We can also use the forcefield sensation to create a control.
Worked exercise – Audio Filter Slider
Once you’ve had a play around with the different settings on the forcefield, have a go at adding some fun functionality. Only a few small steps are needed to turn the forcefield into a simple slider control. As an example, let’s create an audio loop with a low pass filter control.
Adding the Audio Source
- Add an Audio Source game object to the project, with an Audio Low Pass Filter component.
- Find an audio sample that you wish to play, add it to your project, and drag and drop it onto the Audio Source's AudioClip field. We've used a free drum loop from one of the many online resources.
- Set the audio clip to loop and play on awake.
- Test your scene by running it. Experiment with the filter's cut-off frequency and resonance. (If you prefer to do this setup from a script, see the sketch below.)
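As an alternative to the Inspector steps above, the same audio setup can be done in code. This is a minimal sketch using only standard Unity components; the class name and the drumLoop field are our own choices:

using UnityEngine;

// Configure the looping audio source and low-pass filter from a script
// instead of the Inspector.
[RequireComponent(typeof(AudioSource), typeof(AudioLowPassFilter))]
public class AudioLoopSetup : MonoBehaviour
{
    public AudioClip drumLoop;  // assign your imported sample here

    void Start()
    {
        var source = GetComponent<AudioSource>();
        source.clip = drumLoop;
        source.loop = true;         // equivalent to ticking Loop
        source.playOnAwake = true;  // equivalent to ticking Play On Awake
        source.Play();

        // Start with the filter fully open
        GetComponent<AudioLowPassFilter>().cutoffFrequency = 22000.0f;
    }
}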
Creating the control
To create the slider indicator, add a cylinder game object to your scene. Change its transform so that the cylinder has a diameter of 1 cm and lies on its side (z-rotation = 90˚), with its position at the centre of the array (0, 0, 0).
Drag and drop the cylinder to make it a child of the ForcefieldBeam object.
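If you'd rather create the indicator from code, a sketch like the following, attached to the ForcefieldBeam, produces an equivalent object (the cylinder's length of 0.5 is an arbitrary illustrative choice):

using UnityEngine;

// Create the slider indicator as a child of the ForcefieldBeam at start-up.
public class SliderIndicatorSetup : MonoBehaviour
{
    void Start()
    {
        var cylinder = GameObject.CreatePrimitive(PrimitiveType.Cylinder);
        cylinder.name = "SliderIndicator";
        cylinder.transform.SetParent(transform, false);   // child of the beam
        cylinder.transform.localPosition = Vector3.zero;  // centred on the beam
        // A default cylinder has a diameter of 1 unit, so 0.01 gives 1 cm
        cylinder.transform.localScale = new Vector3(0.01f, 0.5f, 0.01f);
        cylinder.transform.localRotation = Quaternion.Euler(0f, 0f, 90f);  // on its side
    }
}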
Add a new script to the cylinder called AudioSliderControl.cs. It is good practice to move the new script from the root of the project folder to a local scripts folder.
In AudioSliderControl.cs, add a using directive for UltrahapticsCoreAsset:
using UltrahapticsCoreAsset;
using Vector3 = UnityEngine.Vector3;
using Transform = UnityEngine.Transform;
Note that the UCA defines its own Vector3 and Transform types; the aliases above make sure we use the UnityEngine versions.
Add public fields for the constraining transform and the AudioLowPassFilter:
public Transform _constrainingTransform;
public AudioLowPassFilter _filter;
In the Inspector, drag the ForcefieldBeam object onto the constraining transform field and the Audio Source object onto the filter field.
To reference the position of the palm, use the IAutoMapper object as shown in the section above, reading the palm position’s y value. This can then be used to update the cylinder transform, clamping it within the bounds of the constraining transform:
private float _minPos;
private float _maxPos;

void Update()
{
    // Define the control boundaries from the constraining transform.
    // These are fields rather than locals because updateValue reads them.
    _minPos = _constrainingTransform.position.y - (_constrainingTransform.localScale.y / 2);
    _maxPos = _constrainingTransform.position.y + (_constrainingTransform.localScale.y / 2);

    Vector3 palmPos = getPalmPosition();

    // Update the control indicator's position, clamped to the beam
    float y = Mathf.Clamp(palmPos.y, _minPos, _maxPos);
    transform.position = new Vector3(transform.position.x, y, transform.position.z);

    updateValue();
}
Controlling the filter
The updateValue function controls the filter cut-off. It uses a non-linear mapping from the slider position – a common trick in synthesisers and mixing consoles to get much more expression when changing the lowest frequencies.
private float _value;

public void updateValue()
{
    // Normalise the indicator's position to the range 0..1 within the beam
    _value = (transform.position.y - _minPos) / (_maxPos - _minPos);
    // Quartic mapping gives finer control at the low end of the range
    _filter.cutoffFrequency = 22e3f * Mathf.Pow(_value, 4);
}
The mapping used is shown below. The result is scaled to 22000 Hz, the highest cut-off frequency needed for our selected sample. For example, at the halfway position (_value = 0.5) the cut-off is 22000 × 0.5⁴ ≈ 1375 Hz, so most of the slider's travel is spent on the musically useful low frequencies.
To ensure the control doesn't move when the hand isn't interacting with the beam, add a public boolean to the script along with a setter function.
public bool _enableControl = false;

void Update()
{
    if (!_enableControl)
    {
        return;
    }
    // additional code...
}

public void updateEnableControl(bool b)
{
    _enableControl = b;
}
This can then be referenced in the HapticTriggerRegion component of ForcefieldBeam, as shown, so that entering the beam enables the control and leaving it disables it:
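If you would rather wire this up in code than in the Inspector, one alternative is a small companion script on the ForcefieldBeam that uses Unity's standard trigger callbacks. This is a sketch rather than the asset's own mechanism, and it assumes that only tracked-hand colliders reach the beam's trigger volume:

using UnityEngine;

// Toggle the slider whenever a collider enters or leaves the beam's trigger.
public class ForcefieldControlToggle : MonoBehaviour
{
    public AudioSliderControl sliderControl;  // drag the cylinder's script here

    void OnTriggerEnter(Collider other)
    {
        sliderControl.updateEnableControl(true);
    }

    void OnTriggerExit(Collider other)
    {
        sliderControl.updateEnableControl(false);
    }
}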
Now when you run the scene, assuming you have speakers or headphones connected, you should hear your sample loop. As you move your hand up and down through the forcefield, you should hear the characteristic whoosh as the filter opens and closes.
Next tutorial: Building an interactive poster