Previous tutorial: Hand Tracking and Transformations
Next tutorial: Sensation Animation and Sequencing
- Introduction
- UCA Tracked Sensations
- UCA “Helper” Functions
- Line Tracking with createSensationFromPath
- Using Unity Transforms with Sensations
- Coming up…
Introduction
In tutorials one, two and three, we used simple sensation blocks such as CircleSensation and LineSensation. These had Transform inputs that could be set in the Inspector or manipulated from a script. In tutorial three, an animated box collider was used to set the sensation's position; by triggering the sensation on interaction with the hand, we achieved a simple form of hand-tracked sensation.
In tutorials four and five, hand tracking was integrated into the sensation itself using the UCA's Auto Mapper mechanism, which fed hand-tracking data through a sequence of transform blocks to render a path and implement a transformation pipeline.
In this tutorial, we introduce helper functions that can be used to construct hand-tracked sensations, replacing much of the verbose "boilerplate" code we wrote in previous lessons.
UCA Tracked Sensations
The UCA package includes the PalmTrackedSensation scene in the Examples/PalmTrackedSensation folder. Open this and import the Leap Motion® asset and prefabs as instructed in the scene’s accompanying readme file and in tutorial 3.
Ensure your hardware is connected, with the Leap Motion service running, and click Play. To view the sensation path in the Game window, ensure gizmos are enabled. You will feel a pulsing, circling sensation on your palm that tracks as you move your hand above the array.
Palm Tracked Sensation example
The scene hierarchy is similar to that used in tutorial three, where we introduced the animated "Touch Block" example. There, we used the animated Haptic Cube game object to trigger the sensation; in this scene, the SensationTriggerBox box collider plays that role.
The SensationTriggerBox is a similar size to the Leap Motion’s interaction zone.
We also have the ExampleSceneLandscapeCanvas prefab, which forms the background for our scene.
The PalmTrackedPulsingCircle has a Hidden Inputs header. Expanding this shows the Auto Mapper inputs provided by the Data Sources. It is these that allow the sensation to track the hand in all directions and orientations.
UCA “Helper” Functions
The behaviour of the PalmTrackedPulsingCircle sensation is almost identical to that of our Simple Circle example from the last few lessons, but with the added pulsed-radius behaviour.
The first two lines import the Sensation Core Library and the "sensation_helpers" module:
from pysensationcore import *
import sensation_helpers as sh
We also import the TriangleWave block and create instances of CirclePath and TriangleWave. Connecting the TriangleWave output to the CirclePath’s radius input gives a pulsed circle:
import TriangleWave

pathInstance = createInstance("CirclePath", "CirclePathInstance")
triangleWaveBlockInstance = createInstance("TriangleWave", "triangleWave")
connect(triangleWaveBlockInstance.out, pathInstance.radius)
Instead of passing the output of the CirclePath generating block, pathInstance, to a chain of transform blocks, we use the createSensationFromPath function. This creates a new block with the given name. Its output behaviour is defined by setting output to the out port of pathInstance:
sh.createSensationFromPath("PalmTrackedPulsingCircle", { ("t", triangleWaveBlockInstance.t) : (0,0,0), ("Start Radius (m)", triangleWaveBlockInstance.minValue) : (0.01, 0, 0), ("End Radius (m)", triangleWaveBlockInstance.maxValue) : (0.05, 0, 0), ("Pulse Period (s)", triangleWaveBlockInstance.period) : (5.0, 0, 0), }, output = pathInstance.out, drawFrequency = 70, intensity = None )
Top-level inputs are defined in Python dictionary format. They are connected and initialised in a single statement:
("t", triangleWaveBlockInstance.t) : (0,0,0)
The helper function instantiates all the low-level blocks needed to create the UCA's transformation pipeline, as well as the path renderer and intensity control. It automatically connects all the transformation pipeline inputs, meaning we don't have to worry about data sources or the Auto Mapper. The unique behaviour of the sensation, in this case the pulsing circle, can then be created separately from the sensation's tracking mechanism.
createSensationFromPath
The createSensationFromPath function’s help text provides some additional information. Here is the input parameter list:
| Parameter | Description |
| --- | --- |
| sensationName | Name of the Block to create |
| inputs | Dictionary of inputs of the path-generating block: {("nameTopLevelInput", handleInnerBlockInput) : defaultValue} |
| output | Output of the path-generating block instance |
| drawFrequency | Number of times per second the path is rendered. Default = 100 |
| intensity | Sensation intensity, between 0 and 1, or None. Default = None (equivalent to 1) |
| definedInVirtualSpace | Set to True if the input is already in Virtual Space, bypassing the Sensation and Virtual Space transforms. Default = False. See the Line Tracking example below |
| renderMode | Defines the path renderer's behaviour |
The function returns a reference to the named block.
A sensation created with createSensationFromPath will, by default, track the palm correctly in any direction, provided the array is set up properly, that is, with the Leap Motion prefab and UltrahapticsKit prefab at the correct position and offset.
Line Tracking with createSensationFromPath
At the end of the last tutorial, we implemented a line sensation that tracked the palm and the middle finger. The createSensationFromPath function provides us with a very quick way of implementing this behaviour.
The Python listing below shows how:
# Tracked line between middle finger and palm
from pysensationcore import *
import sensation_helpers as sh

lineABBlock = createInstance("LinePath", "line")

sh.createSensationFromPath("MyFingerTrackedLine",
                           {
                               ("middleFinger_distal_position", lineABBlock.endpointA) : (0, 0, 0),
                               ("palm_position", lineABBlock.endpointB) : (0, 0, 0),
                           },
                           output = lineABBlock.out,
                           drawFrequency = 70,
                           definedInVirtualSpace = True)
Remember that LinePath has inputs for the coordinates of its endpoints: endpointA and endpointB. In the last tutorial, we connected these to the Auto Mapped palm_position and middleFinger_distal_position. These are now passed as inputs to the helper function.
In addition, the Leap Motion inputs are in Virtual Space (see tutorial five). When calling createSensationFromPath, we must set definedInVirtualSpace to True. The helper function will then do the rest by setting up the path renderer, intensity block, etc.
Copy this to a new script and use it in a Sensation Source to confirm its behaviour.
Fixed and tracked line sensations
Of course, we could connect our endpoint inputs to any other Leap Data Source. And by simply defining named inputs, such as "endpointA" and "endpointB", we can set the line endpoints directly from the component Inspector. The code below places the line at a fixed position 20 cm above the array.
MyFingerTrackedLine.py
sh.createSensationFromPath("MyFingerTrackedLine", { ("endpointA", lineABBlock.endpointA) : (-0.04, 0.2, 0), ("endpointB ", lineABBlock.endpointB) : (0.04, 0.2, 0), }, output = lineABBlock.out, drawFrequency = 70, definedInVirtualSpace = True )
Note that if we now set definedInVirtualSpace to False and give the endpoints a Y coordinate of 0, the line will be fixed to the palm of the hand, tracking across it just as in tutorial five.
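As a sketch, that palm-tracked variant would look like this (the block name here is illustrative):

# A line defined in Sensation Space (Y = 0): it tracks across the palm
sh.createSensationFromPath("MyPalmFixedLine",
                           {
                               ("endpointA", lineABBlock.endpointA) : (-0.04, 0, 0),
                               ("endpointB", lineABBlock.endpointB) : (0.04, 0, 0),
                           },
                           output = lineABBlock.out,
                           drawFrequency = 70,
                           definedInVirtualSpace = False)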
Using Unity Transforms with Sensations
In our earlier lessons (1, 2 and 3), we were able to set the position of the sensation using a Unity Transform component. Look now at the Python source for the UCA's included LineSensation. You will see that it uses the createSensationFromPath function and looks almost identical to our line sensation above. You will also notice that the endpointA and endpointB inputs are defined as "Point" types using the setMetaData instruction:
setMetaData(line.endpointA, "Type", "Point")
where line is the instance returned by createSensationFromPath. The Block Manifest defines a Point as
> Point: A 3-tuple of real-valued numbers. Represents a position in 3-dimensional space.
The result of this is that any input defined as a Point type appears in the Sensation Source component as a Unity Transform type. You will then be able to set the value based on any Transform in your Unity scene.
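Putting this together, a version of our fixed line whose endpoints can be set from the Inspector might look like the following sketch (block and variable names are illustrative):

lineABBlock = createInstance("LinePath", "line")

line = sh.createSensationFromPath("MyTransformableLine",
                                  {
                                      ("endpointA", lineABBlock.endpointA) : (-0.04, 0.2, 0),
                                      ("endpointB", lineABBlock.endpointB) : (0.04, 0.2, 0),
                                  },
                                  output = lineABBlock.out,
                                  drawFrequency = 70,
                                  definedInVirtualSpace = True)

# Expose both endpoints as Point inputs, so each appears as a Transform field
setMetaData(line.endpointA, "Type", "Point")
setMetaData(line.endpointB, "Type", "Point")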
Here, we've created an arbitrary game object called linePositionA and dropped it onto the Sensation Source's endpointA field. linePositionA's Transform is then used to set the endpointA input (the endpointB input has not yet been set).
Unless a Transform has been defined for an input, the default values (as defined in the script) will be used.
The “Allow-Transform” setting
Look now at the CircleSensation source script, CircleSensation.py. CircleSensation differs from LineSensation in that it is defined in Sensation Space, with a radius. To set its position with a Transform, use the directive:
setMetaData(circle, "Allow-Transform", True)
With a Transform input, we can set not only the position but the rotation and scale of the sensation too.
Note that by removing the directive or setting it to False, the sensation reverts to automatically tracking the palm of the hand in both position and orientation.
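For illustration, a minimal circle sensation using this directive might look like the following sketch (names are illustrative; CirclePath is the same path block used earlier in this tutorial):

from pysensationcore import *
import sensation_helpers as sh

circlePath = createInstance("CirclePath", "circlePath")

circle = sh.createSensationFromPath("MyTransformableCircle",
                                    {
                                        ("radius", circlePath.radius) : (0.02, 0, 0),
                                    },
                                    output = circlePath.out,
                                    drawFrequency = 70)

# Allow a Unity Transform to drive the sensation's position, rotation and scale
setMetaData(circle, "Allow-Transform", True)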
Coming up…
In the next tutorial, we will be looking at more of the features provided by Unity itself and how the Ultrahaptics Core Asset can be used with them.
Next tutorial: Sensation Animation and Sequencing