Introduction
In this tutorial, we look at more complex ways of interacting with shapes using the UCA: polylines, extrusion, and projection.
We’ve already seen how the UCA’s block graph model provides primitive paths, such as circle and line, that can be used to create tracked, haptic sensations. Polyline6 is another path-generating block provided by the Sensation Core Library. It allows up to six points to be defined to create the haptic path.
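Conceptually, a polyline block turns an ordered list of vertices into a single continuous path that the focal point travels along. The sketch below illustrates that idea in plain Python; it is not the Sensation Core Library implementation, and all names in it are illustrative:

```python
# Illustrative sketch only -- not the Sensation Core Library API.
# A polyline path maps a parameter t in [0, 1] to a point on the
# piecewise-linear path through its vertices (up to six for Polyline6).

import math

def polyline_point(vertices, t):
    """Return the point at fraction t along the polyline's total length."""
    lengths = [math.dist(a, b) for a, b in zip(vertices, vertices[1:])]
    target = max(0.0, min(1.0, t)) * sum(lengths)
    for (a, b), seg in zip(zip(vertices, vertices[1:]), lengths):
        if target <= seg and seg > 0:
            f = target / seg
            # Linear interpolation within the current segment
            return tuple(pa + f * (pb - pa) for pa, pb in zip(a, b))
        target -= seg
    return vertices[-1]

# Example: an L-shaped path through three vertices
path = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
print(polyline_point(path, 0.25))  # halfway along the first segment: (0.5, 0.0)
```

Sweeping t from 0 to 1 repeatedly at the draw frequency is, in essence, how a path block traces a closed or open shape with the focal point.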
Open the UCA’s Polyline example scene and experiment with selecting from the shape palette. You can then drag each of the vertices to form your own shape. By selecting the PolylineSensation game object, you can see how the inputs for the Polyline6 Sensation Source are defined.
The UCA’s example Polyline scene does not feature hand tracking, but this functionality is available in the HandPolylineDesigner Unity project (within the Unity Examples GitHub repo).
To obtain this project:
- Download or clone the UnityExamples project from our Unity Examples GitHub repo
- In Unity, open the UnityExamples project
- From the Examples folder, open the HandPolylineDesigner Unity example scene
The Hand Polyline Designer allows you to:
- Drag vertices of the Polyline6 sensation to individual hand features (fingers, bones, palm, etc.)
- See the resulting path on the hand in real time
- Save the output to a new Python block script that can be referenced in a Sensation Source
Creating simple 3D shapes with the UCA
The UCA has two tools that can be used to create simple 3D shape outlines: the ProjectPathOntoPlane SCL block and the Sphere Sensation block (included in the Unity Examples GitHub repo).
Within your UnityExamples project, you will find the SimpleShapes example. Open the SimpleShapes example scene, adding the UCA and Leap Motion tracking assets as normal.
Run the scene and interact with each of the three shapes to feel the outline as you move your hands through each shape.
ProjectPathOntoPlane Sensation
ProjectPathOntoPlane takes direction, point, and normal vectors to define a plane, and projects the input path onto it. It can be combined with any of the path sources to project a shape onto a plane along a specified direction. Since the tracking data source provides enough information to define a plane, any two-dimensional shape can be projected onto the hand. In this way, we can define the outlines of simple 3D shapes, such as cylinders, planes, and cuboids.
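The underlying geometry is a ray–plane intersection applied to each point of the path. A minimal sketch of that calculation, with illustrative names rather than the SCL block itself:

```python
# Illustrative geometry sketch -- not the ProjectPathOntoPlane block itself.
# Project a point q along direction d onto the plane through p0 with normal n.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project_onto_plane(q, d, p0, n):
    """Return (point, valid). valid is False when d is parallel to the plane,
    mirroring the block's 'valid' output."""
    denom = dot(d, n)
    if abs(denom) < 1e-9:
        return q, False  # direction never meets the plane
    t = dot([p - a for p, a in zip(p0, q)], n) / denom
    return tuple(a + t * b for a, b in zip(q, d)), True

# Project straight down (-z) onto the plane z = 0.5
point, valid = project_onto_plane((0.2, 0.3, 2.0), (0, 0, -1), (0, 0, 0.5), (0, 0, 1))
print(point, valid)  # (0.2, 0.3, 0.5) True
```

Here the hand-tracking data supplies p0 and n (the palm position and normal), while the extrusion direction of the shape supplies d.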
Looking at the Sensation Sources for either the Cube or Cylinder game objects you will see the associated Sensation Blocks: ExtrudedPolyline6 and ExtrudedCircle respectively. Open the associated Python block script ExtrudedCircle.py to understand the flow of operations.
The Circle Path connects to a Transform Path block to convert from sensation space to virtual space, followed by the ProjectPathOntoPlane block. The output of this is sent to a helper function called createVirtualToPhysicalFocalPointPipeline, represented in the diagram below as the Pipeline Renderer block. The rendered output is sent to the output block only if the plane and projected path intersect, as indicated by the ‘valid’ output of the ProjectPathOntoPlane block.
The haptics can be viewed by enabling gizmos while running the scene. You will see how the circle and cube shapes deform as the hand’s orientation changes. Haptics are enabled using a trigger collider around each shape. The ProjectPathOntoPlane block assumes the extruded shape is infinite, so the haptic path actually extends beyond the ends of the cube and cylinder. You can reduce this effect by making the collider smaller than the shape you are interacting with.
Exploration
Note that both ExtrudedPolyline6 and ExtrudedCircle have a hidden input called extrusionDirection. This is a unit vector that defines the direction along which the shape is extruded. Let’s look at the cylinder in isolation. Disable both the sphere and cuboid and bring the cylinder to the centre of the scene (set the transform position’s x to 0).
Run the scene and manipulate the cylinder’s transform rotation; for example, begin by setting z to 30˚. You will see that the sensation tracks the shape’s new orientation.
This result is achieved by simply setting the “extrusionDirection” input to the shape’s “up” property in the ShapeInputUpdate.cs script:
Sensation.Inputs["extrusionDirection"].Value = transform.up;
You can now animate, move and modify the shape in your scene while maintaining the haptic interaction.
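You can verify the effect of that 30˚ z-rotation on the extrusion direction with a little vector maths. This is a standalone sketch of what Unity’s transform.up evaluates to after the rotation, not UCA code:

```python
# Standalone check: what does the "up" axis become after a z-rotation?
# (Mirrors Unity's transform.up for an object rotated about z; not UCA code.)

import math

def rotate_z(v, degrees):
    """Rotate a 3D vector about the z axis by the given angle in degrees."""
    r = math.radians(degrees)
    x, y, z = v
    return (x * math.cos(r) - y * math.sin(r),
            x * math.sin(r) + y * math.cos(r),
            z)

up = (0.0, 1.0, 0.0)       # the up axis of an unrotated object
print(rotate_z(up, 30))    # approximately (-0.5, 0.866, 0.0)
```

Feeding this rotated vector into the extrusionDirection input is exactly what keeps the projected outline aligned with the tilted cylinder.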
Sphere Sensation
The Sphere shape uses a more complex series of geometric operations to maintain a circle centred on the intersection of the sphere with the plane defined by the hand’s position and normal, as the hand moves through the sphere.
You can look at the Sphere.py script if you would like to understand how the geometry is tackled.
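The core calculation is the circle where a plane cuts a sphere: its centre is the sphere’s centre projected onto the plane, and its radius follows from Pythagoras. A hedged sketch of that geometry, with illustrative names rather than the Sphere.py implementation:

```python
# Illustrative geometry sketch -- not the Sphere.py implementation.
# Find the circle where the plane through p0 with unit normal n cuts a sphere.

import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def sphere_plane_circle(center, radius, p0, n):
    """Return (circle_center, circle_radius), or None if the plane
    misses the sphere."""
    # Signed distance from the sphere's centre to the plane
    d = dot([c - p for c, p in zip(center, p0)], n)
    if abs(d) > radius:
        return None
    circle_center = tuple(c - d * a for c, a in zip(center, n))
    circle_radius = math.sqrt(radius * radius - d * d)
    return circle_center, circle_radius

# Plane z = 3 cutting a sphere of radius 5 at the origin
print(sphere_plane_circle((0, 0, 0), 5.0, (0, 0, 3.0), (0, 0, 1.0)))
# A 3-4-5 triangle: circle centred at (0, 0, 3) with radius 4
```

As the hand moves through the sphere, the plane (and hence d) changes every frame, so the rendered circle grows and shrinks to hug the sphere’s surface.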
Suggested exercises
- Add additional shapes to the scene
- Overlay the primitive shapes to create more complex shapes.
- Turn your shapes into controls by reacting to hand position and controlling other elements in your scene. You can change the haptics to have different intensities or draw frequency to reinforce the control setting.