Like the Asus Xtion Pro, the Leap Motion tracks parts of the human body using emitted and reflected infrared light. Available since July 2013, the Leap Motion is, at about 90 EUR, an inexpensive but limited input device optimized for tracking fingers and hands, as shown in the figures below.
The Leap Motion visualization API allows tracking of two hands and advanced gesture recognition
The Leap Motion device emits infrared light, which can be seen with an unfiltered camera
The main features are the simultaneous tracking of two hands with gesture recognition for all ten fingers. The device works reliably in daylight at distances between 10 cm and 1 m.
During my thesis, I tested the existing ROS driver, which currently supports only one hand and was not able to provide 3D point cloud data. In brief, the Leap Motion is unfortunately inappropriate for our project, as its only use would be unreliable robot control by hand gestures.
Today I got the chance to get my hands on a Leap Motion. As it uses depth information to track hands at short range and a ROS driver package exists for it, I hoped to get a 3D point cloud out of it. It costs about 80 € and could have been a cheap replacement for the [amazon &title=Xtion&text=Asus Xtion].
Unfortunately it's not possible (yet?) – here is a very nice post explaining why.
But it is fun anyway to get both hands tracked:
LeapMotion – Infrared 1
LeapMotion – Infrared 2
LeapMotion – two hands, 10 fingers
The ROS driver exposes only one hand to ROS – but we could do something like the following to control the amosero:
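A minimal sketch of what such a control node could look like. The topic name `leapmotion/data` and the message fields are assumptions based on the ROS Leap Motion driver; the mapping from palm position to a drive command is kept as a plain Python function so the idea can be followed (and tested) without ROS installed.

```python
# Sketch: map the single tracked hand's palm position to a drive command
# for the amosero. Leap coordinates are in millimetres; x is sideways,
# z points towards the user (so pushing the hand forward gives negative z).

def palm_to_cmd(palm_x, palm_z, dead_zone=30.0, gain=0.005):
    """Return (linear, angular) velocity from the palm position.

    A dead zone around the origin keeps the robot still while the
    hand just hovers above the device; outside it, forward/backward
    motion maps to linear speed and sideways motion to turning.
    """
    linear = -gain * palm_z if abs(palm_z) > dead_zone else 0.0
    angular = -gain * palm_x if abs(palm_x) > dead_zone else 0.0
    return linear, angular

# Hand pushed forward and to the right -> drive forward while turning:
print(palm_to_cmd(100.0, -200.0))  # (1.0, -0.5)

# In an actual ROS node this would sit in the subscriber callback,
# roughly (hypothetical message/topic names from the leap_motion package):
#   rospy.Subscriber('leapmotion/data', leapros, callback)
#   ... in callback: publish a geometry_msgs/Twist built from palm_to_cmd()
```

Whether this is usable in practice depends on how stable the driver's single-hand tracking is – in my tests it was too unreliable for anything beyond a demo.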
Later this would be a nice way to control a robot arm – but for now we set that nice little device aside, as there is a lot of other work to be done.