After having worked with Leap Motion for almost three months now, I can say that it does an impressive job tracking hands and fingers. The technology has advanced so far that such products may appear in commercial applications very soon. That said, when programming gestures more complex than the ones that come ready-made with the Leap SDK (swipes, air-taps and circles), it becomes very difficult to track them accurately. On the one hand, gestures naturally interfere with each other: forming the hand into a «thumbs-up», for example, will always resemble clenching the hand into a fist.
On the other hand, the Leap sensor sometimes misinterprets the position of individual fingers (e.g. the index finger is extended, but the Leap Motion reports otherwise), which makes a more or less reliable tracking even harder to achieve.
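To make the interference problem concrete, here is a toy sketch. This is not the Leap API: `classify` and the per-finger extension flags are hypothetical stand-ins for the kind of boolean "is this finger extended?" data the SDK reports. The point is that a «thumbs-up» and a fist differ by a single flag, so one misread finger flips the classification.

```python
# Toy gesture classifier over per-finger extension flags.
# NOTE: hypothetical sketch, not the Leap Motion SDK -- the real SDK
# reports similar per-finger data, but these names are made up.

def classify(extended):
    """Classify a hand pose given the set of extended finger names."""
    if extended == {"thumb"}:
        return "thumbs-up"   # only the thumb is extended
    if not extended:
        return "fist"        # all fingers curled
    return "unknown"

# A clean reading is recognized correctly:
print(classify({"thumb"}))   # thumbs-up

# But if the sensor misreads the thumb as curled, the very same
# physical gesture collapses into a fist -- one bit of difference:
print(classify(set()))       # fist
```

This is of course a caricature, but it shows why noisy finger data is so damaging: the closer two gestures are in pose space, the fewer sensor errors it takes to confuse them.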
But wouldn’t it be boring if everything ran perfectly smoothly?