Leap Motion Orion SDK Walkthrough

Rushil Reddy
8/20/2016


Core Assets:

Inside the Core Assets, the Hand Pool contains the graphics hands and the physics hands. To add new hands to your scene, change the size of the Model Pool to 3, give the new group a name (instead of physics_hands), clear its slots, add your model prefabs under the same transform as the other hand models, and then drag them into the empty slots.
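If you would rather set a group up from script than in the inspector, here is a minimal sketch, assuming your Core Assets version exposes an AddNewGroup(name, left, right) helper on HandPool (check HandPool.cs; if it does not, configure the group in the inspector as described above). The field values and the group name are made up for illustration:

    using UnityEngine;
    using Leap.Unity;

    // Sketch: register a custom pair of hand models as a new Hand Pool
    // group at runtime. HandPool.AddNewGroup is assumed to exist in your
    // Core Assets version; the group name "my_hands" is arbitrary.
    public class RegisterCustomHands : MonoBehaviour {
      public HandPool handPool;      // the HandPool on the LeapHandController
      public IHandModel leftModel;   // your left-hand model instance
      public IHandModel rightModel;  // your right-hand model instance

      void Start() {
        handPool.AddNewGroup("my_hands", leftModel, rightModel);
      }
    }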

If you enable Can Duplicate, you’ll be able to have more than one of each hand (right or left).

If you turn gizmos on you can see the debug hand. Select the object in the hierarchy that holds all the models, then turn on gizmos to get a better sense of where the hand data from the Leap is mapping.

Physics hands are for calculating collisions; the standard physics hands in Core Assets are for simple physics collisions. Removing the “Hand Enable and Disable” component will freeze your hands in space when you lose tracking. Ideal behaviour would extend the enable/disable with a lerp so that the hands drop back into position, as sketched below.
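A minimal sketch of that idea, assuming the Core Assets HandTransitionBehavior base class with its HandReset()/HandFinish() overrides (check your version for the exact member names). Instead of deactivating the hand when tracking is lost, it tweens the hand back toward a rest pose:

    using UnityEngine;
    using System.Collections;
    using Leap.Unity;

    // Illustrative only: when tracking is lost, drop the hand back to a
    // stored rest pose instead of disabling or freezing it.
    public class HandDropToRest : HandTransitionBehavior {
      public Transform restPose;      // hypothetical empty marking the rest position
      public float lerpDuration = 0.5f;

      // Called by the base class when the hand regains tracking.
      protected override void HandReset() {
        StopAllCoroutines();          // cancel any in-progress drop
      }

      // Called by the base class when tracking is lost.
      protected override void HandFinish() {
        StopAllCoroutines();
        StartCoroutine(DropBack());
      }

      private IEnumerator DropBack() {
        Vector3 startPos = transform.position;
        Quaternion startRot = transform.rotation;
        for (float t = 0f; t < 1f; t += Time.deltaTime / lerpDuration) {
          transform.position = Vector3.Lerp(startPos, restPose.position, t);
          transform.rotation = Quaternion.Slerp(startRot, restPose.rotation, t);
          yield return null;
        }
      }
    }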

Hands Module:

Auto-rigging capability is huge for importing your own hand models for use with the Leap Motion. The Hands Module includes FBX examples of best-practice hands for the Leap, and the Hands folder contains some example scenes. If we look at Hands_Viewer_Demo in play mode, the arrow keys cycle through a collection of hands. On each hand model there is a RiggedHand class that gets positional information for the palm and all of the finger joints. To set up hands, you need to know the vector direction that each finger and hand is pointing; if you get the finger direction vectors reversed from what they should be, the hand will turn inside out. So these vectors in the RiggedFinger script are very important.
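As a rough sketch of why those vectors matter, assuming the modelFingerPointing and modelPalmFacing fields the rigged hand scripts show in the inspector: the tracked joint rotations are corrected by a reorientation quaternion built from those two vectors, so reversing modelFingerPointing flips every joint and turns the mesh inside out. Treat the details here as approximate:

    using UnityEngine;

    // Illustrative sketch of how the rigged hand scripts use the two
    // direction vectors set in the inspector.
    public class ReorientationExample : MonoBehaviour {
      public Vector3 modelFingerPointing = Vector3.forward; // axis the model's fingers point along
      public Vector3 modelPalmFacing = -Vector3.up;         // axis the model's palm faces

      // Correction applied to each tracked joint rotation so the Leap
      // data lines up with the model's own axes. A reversed
      // modelFingerPointing rotates every joint 180 degrees.
      public Quaternion Reorientation() {
        return Quaternion.Inverse(
            Quaternion.LookRotation(modelFingerPointing, -modelPalmFacing));
      }
    }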

In each finger script there is a Deform Position bool. It’s an advanced user setting used for building your own hands. Each of those hands is a variant of the IHandModel class; the RiggedHand script implements the HandModel class.
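For reference, a bare-bones custom hand looks roughly like this, assuming the HandModel/IHandModel contract from Core Assets (HandModelType, UpdateHand(), GetLeapHand()); the exact abstract members vary a little between versions:

    using UnityEngine;
    using Leap;
    using Leap.Unity;

    // Minimal sketch of a custom hand model: a graphics hand that just
    // follows the tracked palm. A real hand would drive fingers too.
    public class PalmFollowerHand : HandModel {
      public override ModelType HandModelType {
        get { return ModelType.Graphics; }  // graphics hand, not physics
      }

      // Called every frame while the hand is tracked.
      public override void UpdateHand() {
        Hand hand = GetLeapHand();
        if (hand != null) {
          transform.position = hand.PalmPosition.ToVector3();
        }
      }
    }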

In the scene Rigged_Hands_AutoRig_Example you can see an example of the auto-rigging process. The joint axes point down the length of the bone. Add the Auto Rig Hands component, click the Auto Rig button, and all of the slots should be filled in. Behind the scenes it is identifying the palm and the root of each finger, applying all of those scripts, and within that figuring out which transforms are finger joints. Figuring out which direction the palms face and which direction the fingers point is all done automatically now. On the LeapHandController we now have an entry for the graphics hands we auto-rigged, so it adds a group to the Hand Pool we talked about earlier. Iterating on hand models should take a lot less time with this tool! These hands are connected to a Mecanim rig, and the AutoRig script exposes all the values to you in the inspector. In the editor we want to be able to see the hands and verify how they look, so there is an editor-time pose: click “Set Editor Leap Pose” and your hands will appear in front of the camera. Use this to verify that the model is working correctly; if you set it back, it returns to the stored pose.
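To make the “figuring out which direction the fingers point” step concrete, here is an illustrative sketch (my own, not the module’s actual code) of the kind of guess auto-rigging has to make: take the knuckle-to-fingertip direction in the root bone’s local space and snap it to the nearest local axis.

    using UnityEngine;

    // Illustrative only: infer which local axis of a finger's root bone
    // points down the finger, given its joint transforms.
    public static class FingerAxisGuess {
      public static Vector3 GuessFingerPointing(Transform knuckle, Transform fingertip) {
        // Knuckle-to-fingertip direction, expressed in the knuckle's local space.
        Vector3 local = knuckle.InverseTransformDirection(
            (fingertip.position - knuckle.position).normalized);

        // Snap to the dominant axis: +/-X, +/-Y, or +/-Z.
        Vector3 abs = new Vector3(Mathf.Abs(local.x), Mathf.Abs(local.y), Mathf.Abs(local.z));
        if (abs.x >= abs.y && abs.x >= abs.z) return new Vector3(Mathf.Sign(local.x), 0f, 0f);
        if (abs.y >= abs.x && abs.y >= abs.z) return new Vector3(0f, Mathf.Sign(local.y), 0f);
        return new Vector3(0f, 0f, Mathf.Sign(local.z));
      }
    }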

Note: there are stretch IK settings in Mecanim that are exposed to script, which help with hand-reach problems. It’s one thing to get good deformations for rotations, but when you enable the Deform Position setting the hands will deform the mesh, so the skin weighting puts extra onus on matching proportions to get it right. Auto-rigging works for many high-quality characters on the Unity Asset Store (joint orientation must be correct).

UI-Input Module:

This hooks into Unity’s event system: there is a Leap input module that replaces the mouse input module, so you can create standard Unity GUIs and they should work with the hands. There are some prefabs that let you bring in a UI module pretty quickly. Having buttons in the UI that move and react to hands matters, because depth and perspective are a really important way to orient the user. That sort of affordance is a really important cognitive lift, so they’ve exposed the different layers of the buttons as separate objects. For example, in the CompressibleUI script we can bring in transforms (Layer Transforms) and then define the travel distance of the button. You can make your own buttons by dropping transforms in there.
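A rough sketch of the idea behind those layered buttons (my own illustration, not the module’s CompressibleUI source): each layer transform rests at its own height above the button base and sinks toward it when pressed.

    using UnityEngine;

    // Illustrative sketch of one layer of a compressible button: the
    // layer floats at restHeight and eases down to pressedHeight.
    public class SimpleCompressibleLayer : MonoBehaviour {
      public Transform layerTransform;   // the visual layer to move
      public float restHeight = 0.02f;   // height above the base, in meters
      public float pressedHeight = 0.002f;
      public float speed = 20f;

      private bool _pressed;
      public void SetPressed(bool pressed) { _pressed = pressed; }

      void Update() {
        float target = _pressed ? pressedHeight : restHeight;
        Vector3 p = layerTransform.localPosition;
        // Negative local Z floats the layer toward the viewer on a
        // world-space canvas facing the user.
        p.z = Mathf.Lerp(p.z, -target, Time.deltaTime * speed);
        layerTransform.localPosition = p;
      }
    }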

Detection Examples Module:

In the finger detection scene, we have finger detectors handling both individual finger detection and the thumbs-up. Turn on gizmos and hit play to get some interesting feedback. You can’t do this in the Game view; you must be in the Scene view. The cones are there to show whether you’re in the green and activating the pointer. There is a Finger Direction Detector that can be set to the thumb, and we can set which axis it checks relative to the world. Y relative to the world is a basic thumbs-up: detect that and use it to drive other scripts (as in the demo scene). The basic idea is that you can combine finger detectors to get access to relatively full hand gestures.
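As a sketch of that wiring, assuming the FingerDirectionDetector members from the detection module (FingerName, PointingDirection, PointingType, and the OnActivate/OnDeactivate events; names may differ slightly by version):

    using UnityEngine;
    using Leap;
    using Leap.Unity;

    // Configures a thumbs-up detector at runtime and reacts to it.
    public class ThumbsUpListener : MonoBehaviour {
      public FingerDirectionDetector detector;   // assign in the inspector

      void Start() {
        detector.FingerName = Finger.FingerType.TYPE_THUMB;   // watch the thumb
        detector.PointingDirection = Vector3.up;              // +Y...
        detector.PointingType = PointingType.RelativeToWorld; // ...in world space
        detector.OnActivate.AddListener(OnThumbsUp);
        detector.OnDeactivate.AddListener(OnThumbsUpEnded);
      }

      void OnThumbsUp()      { Debug.Log("Thumbs up detected"); }
      void OnThumbsUpEnded() { Debug.Log("Thumbs up ended"); }
    }

Combining this with an extended-finger detector (thumb extended, other fingers curled) through the module’s logic-gate detector is the combo idea: together they make the thumbs-up much more robust than direction alone.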