Previous tutorial: Sensation Core Library and Blocks
Next tutorial: Hand Tracking Helper Functions
- Introduction
- Adding hand tracking to our example
- The UCA Transformation Pipeline
- The hand tracking circle sensation
- Changing the emitter position in the Unity world
- Line sensation with tracking
- In the next tutorial…
Introduction
In the previous tutorial, we showed you how to create a basic “Circle Sensation” that was fixed above the array. We discussed the Sensation Core Library and the block graph model used to create a pipeline of sensation generation and manipulation functions. We used a transform with our sensation example to place it at a fixed point above the array. We finished by noting that our sensation offsets were in the Emitter Space but not in Unity Space.
In this tutorial, we show you how to add hand tracking to our simple circle example. We also introduce the concept of a UCA Data Source and how it can be used by the UCA’s Auto Mapper to connect inputs using keywords.
Finally, we show how to change our example to use a tracked line sensation and how to modify it to track between the palm and the tip of the middle finger.
This lesson is advanced. If you prefer, you can skip to the next tutorial, where we introduce helper functions that perform many of the operations discussed here, and refer back later on.
Adding hand tracking to our example
Open up the Simple Circle example scene from the previous lesson. We can add hand tracking by importing the Leap Motion® assets and adding the prefabs (see Lesson 3 for full instructions). Make sure that the Leap Motion hand controller prefab has the correct transform offsets for your device and that you have the “Leap Data Source” in your scene. This can be placed anywhere, but is typically a child of the UltrahapticsKit prefab in the TrackingOrigin object:
Test that the hand tracking works by connecting your Leap Motion® camera module and running the scene. You should see the hand models appear in your scene when you place them above the camera.
Leap Data Source
The Leap Data Source is one of a number of UCA “Data Source” components and is part of the UCA’s Auto Mapper mechanism. These provide an easy way to connect to the inputs of our Sensation Block using a find-by-name approach. You can find other UCA data sources in the UltrahapticsKit prefab (see below). All are documented in the UCA’s Block Manifest.
Unless you intend to change the gesture tracking device or its position relative to the array, you are unlikely to need to be concerned with the mechanics of the Auto Mapper at this stage.
The Leap Data Source gives our Sensation Source access to the Leap Motion data inputs by referring to special keywords or concatenated strings. When added to our block definition, these strings are automatically mapped to the Leap Motion® API through the Auto Mapper. The Block Manifest documentation gives the following description of the Leap Data Source:
Leap Data Source
This data source provides the required transform for going between Virtual Space and the tracked virtual object (the hand) in Virtual Space. This data source relies on the Leap Motion Core Asset for Unity and gets information from the Leap Hand Controller.
This is used for sensations which are to be mapped onto the hand.
The following keywords can be used to reference the palm position, direction, normal and wrist position:
- palm_position
- palm_direction
- palm_normal
- wrist_position
The individual finger and thumb bones are referenced by concatenating finger and bone names in the following format:
<fingerName>_<boneName>_position
fingerName is one of:
- thumb
- indexFinger
- middleFinger
- ringFinger
- pinkyFinger
boneName is one of the following bones (shown in the diagram below):
- metacarpal
- proximal
- intermediate
- distal
The diagram shows that there is no proximal phalanx for the thumb. However, this input is provided by Leap Motion’s API.
As an example, to get the coordinate information of the tip of the middle finger, we would create an input to our block called middleFinger_distal_position.
In our simpleCircle.py script, we can replace the offset input with palm_position and use it as the offset control. Go ahead and do this now, making sure you remove any remaining reference to offset. Enable the sensation visualizer (click the Gizmos button), then run the scene and note the behaviour of the sensation.
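If it helps, here is a rough sketch of what the modified script could look like. This is only a sketch: it assumes the previous lesson's simpleCircle.py used a ComposeTransform/TransformPath pair to apply the offset, so adjust the names to match your own file.
from pysensationcore import *

simpleCircleBlock = defineBlock("SimpleCircle")
# "palm_position" replaces the old "offset" input; the Auto Mapper connects
# it to the Leap Data Source by name.
defineInputs(simpleCircleBlock, "t", "palm_position")
defineOutputs(simpleCircleBlock, "out")

circlePathInstance = createInstance("CirclePath", "circlePathInstance")
renderPathInstance = createInstance("RenderPath", "renderPathInstance")
composeTransformInstance = createInstance("ComposeTransform", "composeTransformInstance")
transformPathInstance = createInstance("TransformPath", "transformPathInstance")

connect(simpleCircleBlock.t, renderPathInstance.t)
connect(Constant((0.02, 0, 0)), circlePathInstance.radius)
connect(Constant((125, 0, 0)), renderPathInstance.drawFrequency)

# Identity orientation, with the palm position as the translation
connect(Constant((1, 0, 0)), composeTransformInstance.x)
connect(Constant((0, 1, 0)), composeTransformInstance.y)
connect(Constant((0, 0, 1)), composeTransformInstance.z)
connect(simpleCircleBlock.palm_position, composeTransformInstance.o)
connect(composeTransformInstance.out, transformPathInstance.transform)

connect(circlePathInstance.out, transformPathInstance.path)
connect(transformPathInstance.out, renderPathInstance.path)
connect(renderPathInstance.out, simpleCircleBlock.out)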
If you completed the last tutorial, you won’t be surprised to see that, while the sensation moves, it does not track the hand. This is because our Leap Motion prefab supplies its data in Unity space. In the UCA, we refer to Unity space as Virtual Space.
The UCA Transformation Pipeline
To solve the problem of different coordinate spaces, we introduce the concept of a transformation pipeline.
The UCA transformation pipeline is a series of transforms that allow a sensation to be tracked to a virtual object, such as a camera-tracked hand, and translated to the correct position in the real world.
In the parlance of the UCA, the full transform pipeline can follow the path shown:
| Space | Description |
| --- | --- |
| Sensation Space | Coordinates from the point of view of sensations (2D convention) |
| Virtual Space | Coordinates in the Unity world* |
| Object in Virtual Space | Position and orientation of a virtual object, in most cases a component of a tracked hand |
| Emitter Space | Ultrahaptics array coordinates |
*Virtual Space applies to whichever development environment we work in; in Unreal®, for example, Virtual Space would refer to the Unreal coordinate space.
When using multiple arrays, emitter space and device space may not always agree. Refer to the documentation included with the Ultrahaptics SDK on Multiple Device Support.
UCA Data Sources
It is the UltrahapticsKit prefab’s Data Source components that define these spaces and provide data to Block inputs via the AutoMapper. These data sources are:
Emitter Data Source
This data source provides the required transform for going between Virtual Space and Virtual Emitter Space. This is used for sensations produced to be relative to a virtual emitter in Virtual Space. For example, when moving the virtual emitter closer to the virtual object which the sensation is mapped to, the sensation in Emitter Space will appear closer to the emitter. This is because the relative distance between the virtual emitter and the virtual object has reduced.
Sensation Space To Virtual Space
Transforms a point from Sensation Space to Virtual Space. Use this Block if you need your Sensation, designed in Sensation Space, to map to the hand in Virtual Space (provided by the tracking device). Note: This Block outputs a point in Virtual Space and is typically used in conjunction with a Block that transforms from Virtual to Emitter Space.
Virtual Space to Emitter Space
This data source provides the required transform for going between Virtual Space and Emitter Space.
Within each of these, you will find named keys that can be referenced in UCA blocks. With additional transforms, we can construct a pipeline that correctly places our circle sensation into both Emitter Space and Virtual Space. We will then feel and see our circle properly tracking our hand.
The hand tracking circle sensation
Let us now construct the block graph to use these data sources and create the transformation pipeline. The block graph will look like this, with keywords connected to the top-level input:
Each Data Source input is defined and connected for each transform. The code listing is shown below. While this looks complex, much of the code is boilerplate that implements our transformation pipeline.
TrackingCircle.py
from pysensationcore import *
trackingCircleBlock = defineBlock("TrackingCircle")
defineInputs(trackingCircleBlock,
"t",
"sensationXInVirtualSpace",
"sensationYInVirtualSpace",
"sensationZInVirtualSpace",
"sensationOriginInVirtualSpace",
"virtualObjectXInVirtualSpace",
"virtualObjectYInVirtualSpace",
"virtualObjectZInVirtualSpace",
"virtualObjectOriginInVirtualSpace",
"virtualXInEmitterSpace",
"virtualYInEmitterSpace",
"virtualZInEmitterSpace",
"virtualOriginInEmitterSpace")
defineOutputs(trackingCircleBlock, "out")
circlePathInstance = createInstance("CirclePath", "circlePathInstance")
renderPathInstance = createInstance("RenderPath", "renderPathInstance")
connect(trackingCircleBlock.t, renderPathInstance.t)
connect(Constant((0.02, 0, 0)), circlePathInstance.radius)
connect(Constant((125, 0, 0)), renderPathInstance.drawFrequency)
# Transform Sensation to Virtual space
composeSensn2VtlSpaceTform = createInstance("ComposeTransform", "ComposeSensn2VtlSpaceTform")
tformPathSensn2VtlSpace = createInstance("TransformPath", "TformPathSensn2VtlSpace")
connect(trackingCircleBlock.sensationXInVirtualSpace, composeSensn2VtlSpaceTform.x)
connect(trackingCircleBlock.sensationYInVirtualSpace, composeSensn2VtlSpaceTform.y)
connect(trackingCircleBlock.sensationZInVirtualSpace, composeSensn2VtlSpaceTform.z)
connect(trackingCircleBlock.sensationOriginInVirtualSpace, composeSensn2VtlSpaceTform.o)
connect(composeSensn2VtlSpaceTform.out, tformPathSensn2VtlSpace.transform)
# Transform to hand position
composeObjInVtlSpaceTform = createInstance("ComposeTransform", "ComposeObjInVtlSpaceTform")
tformPath2ObjectInVtlSpace = createInstance("TransformPath", "TformPath2ObjectInVtlSpace")
connect(trackingCircleBlock.virtualObjectXInVirtualSpace, composeObjInVtlSpaceTform.x)
connect(trackingCircleBlock.virtualObjectYInVirtualSpace, composeObjInVtlSpaceTform.y)
connect(trackingCircleBlock.virtualObjectZInVirtualSpace, composeObjInVtlSpaceTform.z)
connect(trackingCircleBlock.virtualObjectOriginInVirtualSpace, composeObjInVtlSpaceTform.o)
connect(composeObjInVtlSpaceTform.out, tformPath2ObjectInVtlSpace.transform)
# Transform Object in Virtual Space to Emitter space
composeObjInEmitSpaceTform = createInstance("ComposeTransform", "ComposeObjInEmitSpaceTform")
tformPathObjInEmitSpace = createInstance("TransformPath", "TformPathObjInEmitSpace")
connect(trackingCircleBlock.virtualXInEmitterSpace, composeObjInEmitSpaceTform.x)
connect(trackingCircleBlock.virtualYInEmitterSpace, composeObjInEmitSpaceTform.y)
connect(trackingCircleBlock.virtualZInEmitterSpace, composeObjInEmitSpaceTform.z)
connect(trackingCircleBlock.virtualOriginInEmitterSpace, composeObjInEmitSpaceTform.o)
connect(composeObjInEmitSpaceTform.out, tformPathObjInEmitSpace.transform)
# Connect up blocks
connect(circlePathInstance.out, tformPathSensn2VtlSpace.path)
connect(tformPathSensn2VtlSpace.out, tformPath2ObjectInVtlSpace.path)
connect(tformPath2ObjectInVtlSpace.out, tformPathObjInEmitSpace.path)
connect(tformPathObjInEmitSpace.out, renderPathInstance.path)
connect(renderPathInstance.out, trackingCircleBlock.out)
Copy the code above into a new file called TrackingCircle.py and save it to the same Python folder. Now, when we select the TrackingCircle block in the Sensation Block drop-down list and run the scene, the circle correctly tracks our hand. The plane of the circle tracks the plane of the hand and is centred on the palm. Note the values of the virtualObject inputs are updated as the hand moves in the scene:
Note: When running the Unity scene, collapse the Sensation Source component to optimise haptic output performance. Click the small triangle in the top-left corner of the component.
The series of inputs to the second transform – virtualObjectXInVirtualSpace, etc. – could be defined for any data source. Since the Leap Data Source maps these to the inputs from the Leap Motion® device, the “Virtual Object” is our hand!
If you look at the Leap Data Source script, you will see that it still allows us to use palm_position instead of virtualObjectOriginInVirtualSpace.
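If you prefer that more readable name, a minimal sketch of the substitution would be (the rest of the block is unchanged):
# In defineInputs, replace "virtualObjectOriginInVirtualSpace" with "palm_position", then:
connect(trackingCircleBlock.palm_position, composeObjInVtlSpaceTform.o)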
Changing the emitter position in the Unity world
In some cases, we may wish to change the location of the emitter relative to our world space. The example above makes the assumption that the emitter is centred at the origin (0,0,0). What if this isn’t the case, or if we wish to place the tracking camera in a different position to the array?
In the above example, if we set the UltrahapticsKit object transform to x = 0.1, i.e. 10 cm to the right of the origin, and run with our TrackingCircle sensation, we find that the circle no longer tracks to the hand. Enabling Gizmos shows that it is now floating 10 cm to the right of the hand.
An additional Emitter Data Source can be used to get the positional information of the array into our sensation block using the named keys:
- virtualEmitterXInVirtualSpace
- virtualEmitterYInVirtualSpace
- virtualEmitterZInVirtualSpace
- virtualEmitterOriginInVirtualSpace
To make use of the data source in our pipeline, we create these additional named inputs in our block and connect them to an inverse transform. An inverse transform can be created using the “ComposeInverseTransform” block and applied to the transform path in the same way:
# Add Inverse Transform
emitterInVSpaceTform = createInstance("ComposeInverseTransform", "EmitterInVSpaceTform")
tformPathVirtual2EmitterSpace = createInstance("TransformPath", "TformPathVirtual2Emitter")
connect(trackingCircleBlock.virtualEmitterXInVirtualSpace, emitterInVSpaceTform.x)
connect(trackingCircleBlock.virtualEmitterYInVirtualSpace, emitterInVSpaceTform.y)
connect(trackingCircleBlock.virtualEmitterZInVirtualSpace, emitterInVSpaceTform.z)
connect(trackingCircleBlock.virtualEmitterOriginInVirtualSpace, emitterInVSpaceTform.o)
connect(emitterInVSpaceTform.out, tformPathVirtual2EmitterSpace.transform)
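For the Auto Mapper to supply these values, the four virtualEmitter keywords must also appear in the block's input list. The extended defineInputs call looks like this:
defineInputs(trackingCircleBlock,
             "t",
             "sensationXInVirtualSpace",
             "sensationYInVirtualSpace",
             "sensationZInVirtualSpace",
             "sensationOriginInVirtualSpace",
             "virtualObjectXInVirtualSpace",
             "virtualObjectYInVirtualSpace",
             "virtualObjectZInVirtualSpace",
             "virtualObjectOriginInVirtualSpace",
             "virtualEmitterXInVirtualSpace",
             "virtualEmitterYInVirtualSpace",
             "virtualEmitterZInVirtualSpace",
             "virtualEmitterOriginInVirtualSpace",
             "virtualXInEmitterSpace",
             "virtualYInEmitterSpace",
             "virtualZInEmitterSpace",
             "virtualOriginInEmitterSpace")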
Finally, connect the inverse transform between the virtual object in virtual space and the final emitter space transforms as shown in the image below.
# Connect up blocks
connect(circlePathInstance.out, tformPathSensn2VtlSpace.path)
connect(tformPathSensn2VtlSpace.out, tformPath2ObjectInVtlSpace.path)
# Insert new block
connect(tformPath2ObjectInVtlSpace.out, tformPathVirtual2EmitterSpace.path)
connect(tformPathVirtual2EmitterSpace.out, tformPathObjInEmitSpace.path)
connect(tformPathObjInEmitSpace.out, renderPathInstance.path)
connect(renderPathInstance.out, trackingCircleBlock.out)
When you run the scene, you will now see that the gizmo correctly tracks the hand. Remember that the Tracking Origin should be a child of the UltrahapticsKit object, otherwise you will not feel the haptics.
You should now be able to see how both the array and tracking device can be positioned and orientated within Unity using the UCA’s block system.
Line sensation with tracking
Let us now change our sensation to use the LinePath block. Instead of setting radius, we control the LinePath’s endpoints. Try this now:
- Make a copy of your modified TrackingCircle.py script in StreamingAssets/Python called MyFingerTrackedLine.py
- Rename the top-level block to TrackingLine. Change all names from ‘circle’ to ‘line’ to avoid any confusion or errors.
- Change the CirclePath block to a LinePath. Again, rename instances to avoid any errors.
- Replace the CirclePath’s radius input with the LinePath’s endpointA and endpointB inputs, as shown:
connect(trackingLineBlock.t, renderPathInstance.t)
connect(Constant((-0.04, 0, 0)), linePathInstance.endpointA)
connect(Constant((0.04, 0, 0)), linePathInstance.endpointB)
connect(Constant((125, 0, 0)), renderPathInstance.drawFrequency)
- Save the script and select the new sensation from the Sensation Block drop-down box.
- Run the scene and set the Running status to true.
You should see the line track the hand, extending ±4 cm horizontally across the palm, as shown.
Note: By default, any block in the StreamingAssets/Python folder will be listed in the Sensation Source component’s Sensation Block drop-down box. Not all blocks in this folder produce sensations. To prevent these being listed, use the setMetaData instruction with the property “Sensation-Producing” set to False for your block instance, shown here for an instance of block x:
setMetaData("Sensation-Producing", x.out, False) |
Modifying our line sensation
What if we wish to track the sensation to specific fingers or change its behaviour?
As an example, imagine that we wish the line sensation to go from the palm of the hand to the tip of the middle finger. One simple solution would be to change the endpoint coordinates so that one is at the palm while the other is directly in front of it. This could be represented in Sensation Space by the endpoint coordinates (0,0,0) and (0,0.08,0) respectively. This seems like a reasonable solution until we bend our finger. You can try this by entering the coordinates in the Sensation Source inspector.
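If your endpoints are still wired to Constants in the script rather than exposed in the inspector, this naive version would look something like the following sketch:
connect(Constant((0, 0, 0)), linePathInstance.endpointA)
connect(Constant((0, 0.08, 0)), linePathInstance.endpointB)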
A better approach would be to set the endpoint coordinates using the coordinates given by the Leap Motion camera. Remember that using the UCA’s Leap Data Source we have access to any feature of the hand. We can therefore simply reference palm_position and middleFinger_distal_position directly as inputs to our block.
- In your script, add palm_position and middleFinger_distal_position as named inputs.
- Replace the Constant connections to endpointA and endpointB with connections to palm_position and middleFinger_distal_position (see the snippet after this list).
- Save the script, reload the Sensation Block (de-select and re-select) and run the scene.
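A sketch of those two changes (the rest of the copied script is unchanged for now):
# Add the Leap keywords to defineInputs alongside the existing inputs, e.g.
#   defineInputs(trackingLineBlock, "t", "palm_position",
#                "middleFinger_distal_position", <existing inputs>)
# Then drive the LinePath endpoints from the tracked positions instead of Constants:
connect(trackingLineBlock.palm_position, linePathInstance.endpointA)
connect(trackingLineBlock.middleFinger_distal_position, linePathInstance.endpointB)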
Hang on! This doesn’t work either… Our line hovers in front of the hand, rather than being attached to it. Why?
The palm_position and middleFinger_distal_position inputs set the line’s position. The LinePath output is connected to our first transform block, which moves it from Sensation Space to Virtual Space. But since we are defining the endpoints of the line using the Leap Data Source, the line is already located in Virtual Space and is already tracking the hand “object”. Therefore, we only need to transform the LinePath output from Virtual Space to Emitter Space.
Taking this into account, the revised block diagram now looks like this:
Go ahead and make the modifications shown in the diagram to your Python script. Once saved and reloaded, you should be able to run your scene and see the line correctly track between your palm and middle finger.
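If you would like a reference point, here is one possible sketch of the revised script after those modifications. Instance names are illustrative and may differ from your own, and it keeps the emitter-offset inverse transform from the previous section (if your emitter sits at the Unity origin, that transform is simply an identity). Note how only the Virtual Space to Emitter Space part of the pipeline remains:
from pysensationcore import *

trackingLineBlock = defineBlock("TrackingLine")
defineInputs(trackingLineBlock,
             "t",
             "palm_position",
             "middleFinger_distal_position",
             "virtualEmitterXInVirtualSpace",
             "virtualEmitterYInVirtualSpace",
             "virtualEmitterZInVirtualSpace",
             "virtualEmitterOriginInVirtualSpace",
             "virtualXInEmitterSpace",
             "virtualYInEmitterSpace",
             "virtualZInEmitterSpace",
             "virtualOriginInEmitterSpace")
defineOutputs(trackingLineBlock, "out")

linePathInstance = createInstance("LinePath", "linePathInstance")
renderPathInstance = createInstance("RenderPath", "renderPathInstance")

connect(trackingLineBlock.t, renderPathInstance.t)
connect(Constant((125, 0, 0)), renderPathInstance.drawFrequency)

# The endpoints come from the Leap Data Source, so the line is already in Virtual Space
connect(trackingLineBlock.palm_position, linePathInstance.endpointA)
connect(trackingLineBlock.middleFinger_distal_position, linePathInstance.endpointB)

# Inverse transform for the virtual emitter's position in Virtual Space
emitterInVSpaceTform = createInstance("ComposeInverseTransform", "EmitterInVSpaceTform")
tformPathVirtual2EmitterSpace = createInstance("TransformPath", "TformPathVirtual2Emitter")
connect(trackingLineBlock.virtualEmitterXInVirtualSpace, emitterInVSpaceTform.x)
connect(trackingLineBlock.virtualEmitterYInVirtualSpace, emitterInVSpaceTform.y)
connect(trackingLineBlock.virtualEmitterZInVirtualSpace, emitterInVSpaceTform.z)
connect(trackingLineBlock.virtualEmitterOriginInVirtualSpace, emitterInVSpaceTform.o)
connect(emitterInVSpaceTform.out, tformPathVirtual2EmitterSpace.transform)

# Transform from Virtual Space to Emitter Space
composeObjInEmitSpaceTform = createInstance("ComposeTransform", "ComposeObjInEmitSpaceTform")
tformPathObjInEmitSpace = createInstance("TransformPath", "TformPathObjInEmitSpace")
connect(trackingLineBlock.virtualXInEmitterSpace, composeObjInEmitSpaceTform.x)
connect(trackingLineBlock.virtualYInEmitterSpace, composeObjInEmitSpaceTform.y)
connect(trackingLineBlock.virtualZInEmitterSpace, composeObjInEmitSpaceTform.z)
connect(trackingLineBlock.virtualOriginInEmitterSpace, composeObjInEmitSpaceTform.o)
connect(composeObjInEmitSpaceTform.out, tformPathObjInEmitSpace.transform)

# Connect up blocks
connect(linePathInstance.out, tformPathVirtual2EmitterSpace.path)
connect(tformPathVirtual2EmitterSpace.out, tformPathObjInEmitSpace.path)
connect(tformPathObjInEmitSpace.out, renderPathInstance.path)
connect(renderPathInstance.out, trackingLineBlock.out)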
There are a few additional modifications that you can attempt if you wish:
- Connect the drawFrequency input to a top-level block input, making it visible in the inspector. Set a default value of 100 using defineBlockInputDefaultValue. Note that defineBlockInputDefaultValue takes a 3-element vector.
- Add a SetIntensity block between RenderPath and the top-level output. This will allow the output intensity to be set from the inspector. Use a default of 1.0 (see the sketch after the hints below).
Hints:
- See the Block Manifest for a full description of the SetIntensity Block.
- Use setMetaData(<block.input>, “Type”, “Scalar”) to present an input as a scalar.
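For reference, here is a minimal sketch of these two additions, using the block and instance names from the line example above. The SetIntensity input names used here (points and intensity) are assumptions and should be checked against the Block Manifest, as should the exact form of defineBlockInputDefaultValue.
# Add "drawFrequency" and "intensity" to the block's defineInputs list, then:

# Expose drawFrequency with a default of 100 (defaults are 3-element vectors)
# and remove the existing Constant((125, 0, 0)) connection to drawFrequency.
connect(trackingLineBlock.drawFrequency, renderPathInstance.drawFrequency)
defineBlockInputDefaultValue(trackingLineBlock.drawFrequency, (100, 0, 0))
setMetaData(trackingLineBlock.drawFrequency, "Type", "Scalar")

# Insert a SetIntensity block between RenderPath and the top-level output,
# replacing the final connect(renderPathInstance.out, trackingLineBlock.out).
setIntensityInstance = createInstance("SetIntensity", "setIntensityInstance")
connect(renderPathInstance.out, setIntensityInstance.points)           # input name assumed
connect(trackingLineBlock.intensity, setIntensityInstance.intensity)   # input name assumed
defineBlockInputDefaultValue(trackingLineBlock.intensity, (1, 0, 0))
setMetaData(trackingLineBlock.intensity, "Type", "Scalar")
connect(setIntensityInstance.out, trackingLineBlock.out)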
In the next tutorial…
In this tutorial, we’ve gone into a lot of detail on the architecture used to create and manipulate haptic sensations in Unity. In the next tutorial, we introduce helper functions that replace many of the blocks needed to set up the transformation pipeline and allow us to create more complex haptic behaviour. We also show how we can use Unity Transform components to track sensations.
Next tutorial: Hand Tracking Helper Functions