Previous Tutorial: The Forcefield Sensation
Next Tutorial: Polylines and 3D Shapes
Introduction
In this lesson we start to pull together the elements of the previous lessons to build an interactive poster, combining audio, visuals and mid-air haptic sensation.
By including haptics in interactive marketing campaigns, brands can significantly increase engagement, dwell time and recall rates, translating into greater return on marketing spend, higher revenues and stronger brand loyalty.
Using haptics alongside audio and image cues means we can deliver a multisensory, multimodal experience. Here, we’ll use the Ultrahaptics Poster template scene to add our own poster image and trigger haptics based on a simple interaction. To create the haptics, we use a modification of the Line Sensation, enhanced to simulate a “lightning” sensation.
Up and running
- Download or clone the UnityExamples Project found on our Unity Examples Github Repo
- In Unity, open the UnityExamples project
- From the Examples folder open the UltrahapticsPoster Unity example.
- Add the Ultrahaptics Core Asset to the Project hierarchy, and the Leap Motion assets and Hand Controller prefab to the TrackingOrigin object (see Quick Start Guide).
- Connect up your hardware, ensuring your Leap Motion is tracking and press play. You should see the game scene shown in the image above.
- When you place your hand over the array you will be able to interact with the globe, hearing the electrical sound effect and feeling the “Lightning” sensation.
- Enabling ‘gizmos’ will show the haptic control point’s path as you place your hand in the “Haptic Orb”.
Scene Analysis
Looking in detail at the Unity scene, we see that it is based around a Canvas (currently without an image), some text and a graphic representing an orb; the usual UltrahapticsKit prefab, camera and lighting; and a HapticOrbTriggerZone game object.
We also have some particle effects – LightningParticles and OrbParticles – to go with our haptics.
Unity provides excellent tutorials on setting up scenes with 2D backgrounds as well as using the “Particle System” component, so we won’t spend too long on these.
Instead, let’s look at what is happening from a haptics point of view. The mechanics of the scene live in the HapticOrbTriggerZone. This contains, among other things, a Sensation Source component, a Sphere Collider and an Audio Source component.
This uses the same mechanism introduced in Tutorial 3 – Triggering Sensations with Hands – with the Sensation Source triggered by collisions between the tracked hand and the collider, via the Haptic Trigger Region script component.
As well as triggering haptics, this also triggers the Audio Source’s “LightningMixDown” audio clip.
Haptic Sensation
The sensation we’re using for this poster is called Lightning. This uses a Line Path block with a random number generator to randomly assign its endpoints between the palm (or wrist) and one of the fingertips.
You will remember from Tutorials 5 and 6 that we created line sensations that tracked between the palm of the hand and the end of the middle finger. If you haven’t already, please check out Tutorial 5 and Tutorial 6 to understand how we connect blocks together to create new sensations.
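Before adding the randomisation, it helps to recall what the basic line sensation does: it scans a single haptic control point back and forth along a segment between two endpoints. The sketch below is a conceptual plain-Python illustration of that scanning behaviour, not the actual Sensation Core implementation; the function name, the triangle-wave phase and the example fingertip position are all illustrative assumptions.

```python
def point_on_line(endpoint_a, endpoint_b, t, scan_period=0.1):
    """Return the control-point position at time t, scanning back and
    forth along the segment endpoint_a -> endpoint_b once per scan_period.
    Conceptual sketch only - not the Sensation Core Library API."""
    # Triangle-wave phase in [0, 1]: 0 at endpoint_a, 1 at endpoint_b.
    phase = 1.0 - abs((2.0 * t / scan_period) % 2.0 - 1.0)
    return tuple(a + (b - a) * phase for a, b in zip(endpoint_a, endpoint_b))

palm = (0.0, 0.2, -0.06)      # palm offset, as in the Lightning listing below
fingertip = (0.0, 0.2, 0.06)  # illustrative fingertip position

# At t = 0 the control point sits at the palm end of the line.
print(point_on_line(palm, fingertip, 0.0))  # → (0.0, 0.2, -0.06)
```

Replacing `endpoint_b` each scan period with a different fingertip is exactly the job the random number generator and multiplexor take on below.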
Our Lightning sensation enhances the standard Line Sensation with the addition of a random number generator and multiplexor block. This randomly assigns the Line Path’s endpointB input to one of the five fingertip positions. We can find the Python script for the Lightning sensation in the StreamingAssets/Python folder. The diagram below shows the block connections, with the addition of Mux5 and RandomIntGenerator.
The RandomIntGenerator is initialised to generate numbers between 0 and 4, with its output used to control the Mux5 selector, allowing one of the auto-mapped finger positions defined in the Leap Data Source component to be selected as the line’s endpoint position. The RandomIntGenerator is a time-based block, so we can change the update rate of the finger selection, and therefore of the sensation, using the “scanPeriod” field in the Inspector.
Note: since RandomIntGenerator is a time-based block, we must connect the sensation’s special, hidden ‘t’ input to the block’s t input.
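The combined behaviour of the RandomIntGenerator and Mux5 can be sketched in plain Python: a random index in [0, 4] is drawn, held for one scanPeriod, then redrawn, and the index selects which fingertip feeds the line’s endpoint. This is a conceptual stand-in, assuming nothing about the real block implementation; the class and names are illustrative.

```python
import random

FINGERTIPS = ["index", "middle", "ring", "pinky", "thumb"]  # Mux5 inputs 0-4

class RandomIntSelector:
    """Conceptual stand-in for RandomIntGenerator driving a Mux5:
    emits a random index in [0, 4], held constant for scan_period seconds."""
    def __init__(self, scan_period=0.1, seed=None):
        self.scan_period = scan_period
        self.rng = random.Random(seed)
        self._last_step = None
        self._index = 0

    def select(self, t):
        step = int(t / self.scan_period)  # which scan interval t falls in
        if step != self._last_step:       # new interval -> draw a new index
            self._last_step = step
            self._index = self.rng.randint(0, 4)
        return self._index

selector = RandomIntSelector(scan_period=0.1, seed=42)
# Within a single scan period the same fingertip stays selected:
endpoint_b = FINGERTIPS[selector.select(0.00)]
assert FINGERTIPS[selector.select(0.05)] == endpoint_b
```

Holding the index steady for a whole scanPeriod, rather than redrawing every frame, is what gives the sensation its “jumping arc” character instead of a blur across all five fingers.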
The listing below shows how createSensationFromPath creates the sensation transform for us:
Lightning.py
lightningBlock = sh.createSensationFromPath("Lightning",
    {
        ("t", randomBlockInstance.t): (0, 0, 0),
        ("scanPeriod", randomBlockInstance.period): (0.1, 0, 0),
        ("indexFinger_distal_position", mux5Block.input0): (0, 0, 0),
        ("middleFinger_distal_position", mux5Block.input1): (0, 0, 0),
        ("ringFinger_distal_position", mux5Block.input2): (0, 0, 0),
        ("pinkyFinger_distal_position", mux5Block.input3): (0, 0, 0),
        ("thumb_distal_position", mux5Block.input4): (0, 0, 0),
        ("palm_position", linePathBlock.endpointA): (0, 0.2, -0.06)
    },
    output = linePathBlock.out,
    definedInVirtualSpace = True
)
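The structure of that mapping is worth unpacking: each dictionary key pairs an externally visible input name with the internal block input it drives, and each value is that input’s default (x, y, z). The sketch below mimics only that wiring structure in plain Python; the function and the string stand-ins for block inputs are illustrative, not the Sensation Core Library implementation.

```python
def wire_sensation(name, input_map, output):
    """Illustrative re-creation of the mapping structure passed to
    createSensationFromPath: keys pair an external input name with the
    block input it drives; values are per-input (x, y, z) defaults."""
    sensation = {"name": name, "inputs": {}, "output": output}
    for (input_name, block_input), default in input_map.items():
        sensation["inputs"][input_name] = {
            "drives": block_input,
            "default": default,
        }
    return sensation

# Two of the Lightning entries, with block inputs shown as strings here:
lightning = wire_sensation(
    "Lightning",
    {
        ("scanPeriod", "randomBlockInstance.period"): (0.1, 0, 0),
        ("palm_position", "linePathBlock.endpointA"): (0, 0.2, -0.06),
    },
    output="linePathBlock.out",
)
```

Because the hand positions are auto-mapped by the Leap Data Source component at runtime, the (0, 0, 0) defaults for the fingertip inputs are simply placeholders that get overwritten every frame.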
Make the poster your own!
Right now, our poster is pretty boring, so let’s add some background, change some effects and generally make this poster look like something you would want to play with.
First, let’s find a background. We could do a fantasy theme with a crystal ball or a sorcerer. Google lets us search for copyright-free images by selecting “Labelled for reuse with modification”, so try searching for “magic sorcerer”.
I’ve used an image of a wizard in a forest from a free image website and trimmed it to fit my canvas.
Save the image in a folder in your Unity Ultrahaptics Poster project hierarchy. To use the image in Unity, change its “Texture Type” to “Sprite (2D and UI)”.
Select the PosterImage game object and drag the imported image onto the Source Image field. You can resize the image to fit the canvas as you wish. I’ve set the anchor points to stretch in the x and y directions, resized the image to fill the canvas, then positioned it so that the “light ball” is slightly off-centre.
Position the Haptic Orb Trigger Zone, and hence the trigger position, so that it coincides with the sorcerer’s light ball. Since the image itself contains a circular light source, we can actually disable the “Mesh Renderer” of the Haptic Orb Trigger Zone.
To align the interaction zone with its on-screen position, you may wish to adjust the camera position, making sure the Haptic Orb Trigger Zone stays in the right place.
Finally, we can adjust the LightningParticles and OrbParticles transforms to coincide with the light ball too. For extra customisation, modify some colours and add your own text…