Previous tutorial: Triggering Sensations with Hands
Next tutorial: Hand Tracking and Transformations
- Introduction
- Sensations as a block process
- The Sensation Core Library
- Changing the Sensation position: Transforms
- Virtual Space and Emitter Space
- Sensation Core Python scripting API – pysensationcore
Introduction
In this tutorial, we show how a UCA “sensation” is constructed from blocks of functionality provided by the Sensation Core Library (SCL). We discuss how the UCA works with the Sensation Core Library and present a block graph model for sensation generation and manipulation. We will add a transform to show how we can control the position of the sensation.
Note: a collection of additional Sensation Block examples can be found in the UnityExamples project on GitHub.
Note: This lesson discusses UCA concepts that you may not find useful at this stage. If you prefer, you can skip to tutorial six, where we introduce helper functions that perform many of the operations discussed here, and refer back to this lesson later.
Sensations as a block process
When we add a Sensation Source to a Unity game object, we select a sensation from the Sensation Block drop-down box. Once selected, we see a list of inputs associated with that sensation in the Inspector. The available sensations are provided with the UCA as Python scripts, stored in the StreamingAssets/Python folder in the root of the asset. To create new Sensation Blocks, we add Python scripts to this folder. Unity automatically rescans this folder on any change (any error in a script will produce a Unity console error), and the Sensation Block drop-down box is re-populated with any additional sensations.
A Simple Circle Example
Create a new Python script called “SimpleCircle.py” and save it to the StreamingAssets/Python folder. Copy the listing below into the file:
from pysensationcore import *
# Define the top-level Block, with a time input and a single output
simpleCircleBlock = defineBlock("SimpleCircle")
defineInputs(simpleCircleBlock, "t")
defineOutputs(simpleCircleBlock, "out")
# Create instances of the built-in CirclePath and RenderPath Blocks
circlePathInstance = createInstance("CirclePath", "circlePathInstance")
renderPathInstance = createInstance("RenderPath", "renderPathInstance")
# Drive the renderer from the Block's time input
connect(simpleCircleBlock.t, renderPathInstance.t)
# A 2 cm radius circle, rendered 125 times per second
connect(Constant((0.02, 0, 0)), circlePathInstance.radius)
connect(Constant((125, 0, 0)), renderPathInstance.drawFrequency)
# Connect the circle path to the renderer, and the renderer to the Block output
connect(circlePathInstance.out, renderPathInstance.path)
connect(renderPathInstance.out, simpleCircleBlock.out)
Create or reuse an existing Hello (Sensation) World type scene with an UltrahapticsKit prefab and a Sensation Source component added to an empty game object. Save this as a new scene called Simple Circle. You should now be able to select SimpleCircle in your Sensation Block drop-down box.
Run the scene and enable gizmos. You will see that the circle is fixed to the surface of the array, but you will not feel any sensation.
Analysis
Let’s look in more detail at the construction of our script.
Import pysensationcore to use the Python scripting API (you will find a summary of the API at the end of this lesson). We define the top-level Block “SimpleCircle”, along with an input “t” and an output “out”. We then create instances of CirclePath and RenderPath and connect them together. The output of RenderPath is connected to the single output of SimpleCircle.
We use Constant inputs to set a circle radius of 2 cm and a drawFrequency of 125 Hz (note the double brackets: Constant takes a 3-vector, so scalar values are passed in the first component). You can see from the diagram that the RenderPath Block takes the path defined by the CirclePath and generates a single control point, rendering it to the emitter 125 times a second.
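Constants are fixed when the Block is defined. As a sketch of an alternative (using the same top-level-input pattern that the offset input uses later in this lesson), the radius could instead be exposed as a Block input so it can be edited in the Sensation Source inspector; the input name “radius” is our own choice:
# Sketch: expose the radius as a top-level input instead of a Constant
defineInputs(simpleCircleBlock, "t", "radius")  # replaces defineInputs(simpleCircleBlock, "t")
defineBlockInputDefaultValue(simpleCircleBlock.radius, (0.02, 0, 0))
connect(simpleCircleBlock.radius, circlePathInstance.radius)  # replaces the Constant connection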
The Sensation Core Library
From the example we see that Sensations are constructed using a block graph concept: basic blocks of functionality connected together, passing information around before being sent to the emitter. The block graph concept is similar to a graphics pipeline model, in which a 3D scene is rendered to a two-dimensional image. Block graph models are common representations for modelling signal processing systems such as graphics, audio and animation and can be found in many modern software tools.
In the example, a sensation is constructed by defining a path-producing Block (CirclePath) and passing its output to a renderer (RenderPath), which generates the control point and places it in emitter space. To manipulate the position of the sensation, a series of transforms is used.
The Sensation Source is a Unity component that maps onto the named Block, giving us access to its input and output properties.
The UCA is built on the Ultrahaptics Sensation Core Library (SCL). The SCL encapsulates sensation generation and manipulation, providing scripting APIs, pipeline management and the underlying Block functionality. It interfaces with the array via the Ultrahaptics SDK.
The SCL and UCA architecture is shown in the block diagram below.
Note that, while the SCL exposes a Python scripting API for creating our Sensation blocks, it also provides language bindings, allowing the SCL to be integrated into other platforms, or scripted outside of Unity.
The Block Manifest
The Block Manifest lists all the available blocks, including ones that are built into the SCL. Some of these, such as CircleSensation and LineSensation, are sensation generators. Others, such as RenderPath and CirclePath, add processing functionality. The Quick Start Guide contains a handy list of the sensation-generating blocks available with the UCA.
Note: You will always find the most up-to-date version of the Block Manifest file in the root of the UltrahapticsCoreAsset folder.
We can see descriptions of RenderPath and CirclePath:
RenderPath: Evaluates a path (e.g. LinePath) to produce control point positions. Note: the RenderPath Block does not produce control point output unless it receives a valid path-producing input to its path input.
CirclePath: Outputs a circular path, with a given radius, in the z=0 plane.
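To illustrate the split between path-producing and rendering Blocks, any path Block can feed RenderPath. Below is a hedged sketch of a line sensation built the same way as SimpleCircle; the LinePath input names endpointA and endpointB are our assumption, so check the Block Manifest for the exact names:
from pysensationcore import *
# Sketch: a 4 cm line on the array surface, built like SimpleCircle
simpleLineBlock = defineBlock("SimpleLine")
defineInputs(simpleLineBlock, "t")
defineOutputs(simpleLineBlock, "out")
linePathInstance = createInstance("LinePath", "linePathInstance")
renderPathInstance = createInstance("RenderPath", "renderPathInstance")
connect(simpleLineBlock.t, renderPathInstance.t)
# Endpoint input names are assumed; verify them in the Block Manifest
connect(Constant((-0.02, 0, 0)), linePathInstance.endpointA)
connect(Constant((0.02, 0, 0)), linePathInstance.endpointB)
connect(Constant((125, 0, 0)), renderPathInstance.drawFrequency)
connect(linePathInstance.out, renderPathInstance.path)
connect(renderPathInstance.out, simpleLineBlock.out)
Like the SimpleCircle, this line sits at the array surface until a transform moves it into the interaction zone.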
Changing the Sensation position: Transforms
The UCA uses the concept of Sensation Space to define the origin of a sensation; for the CirclePath, this is the centre of the circle. The Ultrahaptics array uses the z coordinate as height above the array; this coordinate system is referred to as Emitter Space.
Since the CirclePath outputs in Sensation Space, it is fixed to the origin of the array – the surface – and cannot be felt. To feel the sensation, we must locate it in the interaction zone above the array. In other words, we must move it into Emitter Space.
To do this, we must apply a transform to the output of the CirclePath that will place it above the array. Looking at the Block Manifest, we can see that a transform type is available, which is used to “…manipulate position, rotation [and] scale…”.
We also find the TransformPath block, which takes a transform input and applies it to a path:
TransformPath: Apply a transform to a path to generate a new path.
Inputs: path (the path to transform), transform (the transform to apply).
Output: out (the transformed path).
We can connect TransformPath between the CirclePath and the RenderPath. To provide the transform, we use ComposeTransform, which takes four inputs, one for each column of the transform matrix: the x, y and z basis vectors, plus the offset o.
ComposeTransform (Sensation-producing: NO): Compose a transform using the component vectors.
Inputs: x, y, z (the basis vectors of the transform), o (the offset vector).
Output: out (the composed transform).
You can see that the final input ‘o’ controls the offset. By connecting ‘o’ to a new, top-level input, we can reference it in the Sensation Source inspector and control the offset directly in Unity.
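The basis-vector inputs can also rotate the sensation. As an aside, here is a sketch of our own (assuming each of x, y and z is the image of the corresponding axis under the transform) of a 90-degree rotation about the x axis, which would stand the circle vertically; these connections would replace the identity connections used in the listing below:
# Sketch: rotate 90 degrees about the x axis by remapping the basis columns
connect(Constant((1, 0, 0)), transformInstance.x)   # x stays x
connect(Constant((0, 0, 1)), transformInstance.y)   # y maps to z
connect(Constant((0, -1, 0)), transformInstance.z)  # z maps to -y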
The block diagram for the complete system looks like this:
To avoid rotating or scaling the sensation, we set the x, y and z inputs to the identity basis vectors, as shown. The complete listing is given below. Copy and save this to your SimpleCircle.py file:
SimpleCircle.py with translation
from pysensationcore import *
simpleCircleBlock = defineBlock("SimpleCircle")
defineInputs(simpleCircleBlock, "t", "offset")
defineOutputs(simpleCircleBlock, "out")
# Set an initial offset position
defineBlockInputDefaultValue(simpleCircleBlock.offset, (0, 0, 0.2))
circlePathInstance = createInstance("CirclePath", "circlePathInstance")
transformPathInstance = createInstance("TransformPath", "transformPathInstance")
transformInstance = createInstance("ComposeTransform", "transformInstance")
renderPathInstance = createInstance("RenderPath", "renderPathInstance")
connect(simpleCircleBlock.t, renderPathInstance.t)
connect(Constant((0.02, 0, 0)), circlePathInstance.radius)
connect(Constant((125, 0, 0)), renderPathInstance.drawFrequency)
# Initialise the transform
connect(Constant((1, 0, 0)), transformInstance.x)
connect(Constant((0, 1, 0)), transformInstance.y)
connect(Constant((0, 0, 1)), transformInstance.z)
connect(simpleCircleBlock.offset, transformInstance.o)
# Connect up blocks
connect(circlePathInstance.out, transformPathInstance.path)
connect(transformInstance.out, transformPathInstance.transform)
connect(transformPathInstance.out, renderPathInstance.path)
connect(renderPathInstance.out, simpleCircleBlock.out)
Note: When modifying a block’s Python script, you may need to deselect and reselect it in the Sensation Block drop-down list to view your changes.
On reloading the sensation, we can now see the offset listed as an input, with a default value of z = 0.2.
Run the scene. You should now be able to feel the sensation centred 20 cm above the centre of the array. You can change the offset vector to move the sensation around in the space above the array.
Virtual Space and Emitter Space
When modifying the Sensation Source offset, notice that the ‘y’ and ‘z’ coordinates are swapped compared to Unity World Space. This is because our Simple Circle has its position defined with respect to the Ultrahaptics emitter.
The Simple Circle sensation is in Emitter Space.
The Unity world coordinate space is indicated by the axis indicator in the top-right of the scene pane. In 3D Unity World Space, y dictates height above or below the origin, while z is the distance along a line extending forward from the origin and to the horizon.
You can see this by selecting any game object in your scene and modifying the ‘y’ and ‘z’ values of its Transform position. We will refer to Unity Space or Unity World Space as Virtual Space.
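To make the axis swap concrete, here is a minimal sketch (our own helper, not part of the UCA or pysensationcore) of how a Virtual Space position maps to Emitter Space when the emitter sits unrotated at the Unity origin:
# Hypothetical helper: map a Unity World Space (Virtual Space) position to
# Emitter Space by swapping the y and z components. Assumes the emitter is
# at the Unity origin with no rotation.
def unity_to_emitter(position):
    x, y, z = position
    return (x, z, y)

# A point 20 cm above the array: Unity y = 0.2 becomes Emitter Space z = 0.2
print(unity_to_emitter((0.0, 0.2, 0.0)))  # (0.0, 0.0, 0.2)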
In the next tutorial, we continue to look at the concept of transforms and how we can account for Virtual Space. We will also add hand tracking and show how we can account for the hand in our scene. In a future tutorial, we introduce a useful helper function that automates the entire transformation pipeline.