Unity FAQs

How can I make objects in Unity interactive?

To make an object interact with your hands, you’ll need an InteractionBehaviour script attached to it. Make sure the object you want to move does not have Is Kinematic enabled on its Rigidbody, since a kinematic Rigidbody cannot be moved by physics. Learn more at https://leapmotion.github.io/UnityModules.
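The setup above can be sketched in a small component. This is a hedged example: it assumes the Leap Motion Unity Modules are imported, and `MakeGrabbable` is an illustrative name, not part of the modules.

```csharp
using UnityEngine;
using Leap.Unity.Interaction;

// Illustrative sketch: configures an object so the Interaction Engine can move it.
public class MakeGrabbable : MonoBehaviour
{
    void Start()
    {
        // The object needs a Rigidbody, and it must NOT be kinematic,
        // otherwise the Interaction Engine cannot move it with your hands.
        var rb = GetComponent<Rigidbody>();
        if (rb == null) rb = gameObject.AddComponent<Rigidbody>();
        rb.isKinematic = false;

        // InteractionBehaviour is what makes the object respond to hands.
        if (GetComponent<InteractionBehaviour>() == null)
        {
            gameObject.AddComponent<InteractionBehaviour>();
        }
    }
}
```

In practice you would usually add these components in the Inspector rather than at runtime; the script just makes the required configuration explicit.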

How can I build a networked multiplayer game with Leap Motion?

This is not formally supported, but there is a community-created solution here: https://forums.leapmotion.com/t/tutorial-rigged-hands-across-a-network-using-unity-networking-unet/6629

What happened to Image Hands?

Image Hands are no longer supported. This is for a few reasons:

  1. To some extent, Image Hands were developed as a way to overcome the limitations of our V2 tracking, specifically with occluded fingers and complex hand poses. These issues were resolved with the Orion release in 2016, making the case for augmented-reality hand passthrough less acute.
  2. Image Hands required a lot of custom rendering work which often broke with new Unity releases. For example, there was a period for several months when Image Hands would not work because of abstractions made in the Unity VR plugin.
  3. For most XR applications we found that the infrared-based Image Hands were aesthetically oppressive. They often failed to gel with virtual objects, and artifacts around the edges of the rendering would interact in unpredictable ways with virtual content (e.g. shaders).

How can I change the grab sensitivity in Unity?

There are two places where you can play around with different variables:

https://github.com/leapmotion/UnityModules/blob/master/Assets/LeapMotion/Modules/InteractionEngine/Scripts/Internal/HeuristicGrabClassifier.cs#L33

https://github.com/leapmotion/UnityModules/blob/master/Assets/LeapMotion/Core/Scripts/Utils/HandUtils.cs#L292

How can I prevent the hands from clipping through surfaces in Unity?

Our Unity assets are designed with the understanding that your hands are an unstoppable force within a virtual environment. Early and repeated user testing has revealed that artificially restraining the motion of the hands seriously breaks immersion. It is possible to modify the assets so that this doesn’t occur, but it would be difficult and unsupported.

Where can I find Pinch Draw and Pinch Move?

These old examples are no longer supported, but can be found here: https://github.com/leapmotion/UnityModules/tree/master/Assets/LeapMotion/Legacy/DetectionExamples/Scripts

How can I record and play back in Unity?

There are many ways of accomplishing this. One worth looking at is the experimental recording and playback scripts in the Unity assets: https://github.com/leapmotion/UnityModules/tree/2856bbf4ff510f2d6f3f58170f2372f690fe8595/Assets/LeapMotion/Experimental/Playback/Scripts

How can I mirror hands?

This technique is often used for mirror box treatments for stroke rehabilitation and phantom limb sufferers. We've experimented with graphical mirrors that are like physical mirrors in that they reflect light and the world around you, but we've never mirrored the hand data directly. Part of the challenge is that when you mirror a hand, it turns from a left hand into a right hand, and our hand models get confused.

The most efficient strategy would be to modify how our "post-processing" feature works for hand data. Currently, it happens after hands have been assigned to a model they're going to drive, which means by this point, it's too late to change the chirality (left vs. right) of the hand. If this step were to happen directly after frame data is received, a post-process could then flip the chirality of the hand and mirror it on the user's local X axis, resulting in the intended mirroring behavior.
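To illustrate the geometry of that last step, here is a minimal sketch of mirroring a position/rotation pair across the local X axis. The class and method names are illustrative, not part of our modules; a real post-process would apply this to every joint in the frame data and also flip the hand's chirality flag, as described above.

```csharp
using UnityEngine;

// Illustrative only: mirrors a pose across the YZ plane (the user's local X axis).
public static class MirrorUtil
{
    public static Vector3 MirrorPosition(Vector3 p)
    {
        // Flip the x coordinate; y and z are unchanged.
        return new Vector3(-p.x, p.y, p.z);
    }

    public static Quaternion MirrorRotation(Quaternion q)
    {
        // Reflecting a rotation across the YZ plane negates the y and z
        // components of the quaternion while keeping x and w.
        return new Quaternion(q.x, -q.y, -q.z, q.w);
    }
}
```

Combined with the chirality flip, applying these two functions to every joint produces the mirror-image hand.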

Can I have the source code to Cat Explorer or Blocks?

We are not currently sharing the source code for these apps.
