3 Leap/3 Computer Setup

Image: sketch of the 3-Leap/3-computer setup

Because I can only use one Leap per computer, I need to set up a rather complex linkage: an iMac and two MacBooks, with one Leap Motion attached to each. It gets even more complex because different sounds need to be played. The iMac will show the concept video, so the video's audio will play through headphones attached to it. One MacBook will process the Leap input for the music player (radio), so the headphones attached to that MacBook will play the music itself.
This leaves the remaining MacBook to play the interface sounds through a speaker.
To play the interface sounds triggered by the Leaps connected to the other two computers, I will probably use shiftr to send play commands to the one computer that handles that playback.
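
As a sketch of how those play commands could travel, the snippet below uses shiftr's MQTT interface via the mqtt.js package. The broker credentials and the interface-sounds topic name are made up for illustration:

```javascript
// Sketch: routing interface-sound play commands over shiftr (MQTT) with
// the mqtt.js package. Credentials and the "interface-sounds" topic name
// are placeholders, not the real project values.
var mqtt = require('mqtt');

// On the two computers that only detect gestures:
var sender = mqtt.connect('mqtt://key:secret@broker.shiftr.io');

function triggerInterfaceSound(name) {
  sender.publish('interface-sounds', name); // e.g. triggerInterfaceSound('confirm')
}

// On the one MacBook wired to the speaker:
var receiver = mqtt.connect('mqtt://key:secret@broker.shiftr.io');
receiver.subscribe('interface-sounds');
receiver.on('message', function (topic, message) {
  // message is a Buffer; hand its content to the actual audio playback
  console.log('play interface sound:', message.toString());
});
```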

Problem with palmVelocity

Image: log of direction changes (palm coordinates as [x, y, z])

The Leap Motion SDK offers a property called hand.palmVelocity. Unfortunately, it does not seem to behave as I would expect when measured during fast movements. For a quick hand shake (indicating a cancel gesture), I may use the hand's palm position instead to get more reliable tracking. The picture above shows a log of direction changes based on the x-coordinate (coordinates given as [x, y, z]). Again, a good understanding of how the Leap registers hand movements is crucial for a successful gesture implementation.
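
As a rough sketch of that workaround, the snippet below (using the leapjs package) ignores hand.palmVelocity and instead derives the horizontal movement direction from successive hand.palmPosition samples, logging each direction change much like the log above. The jitter threshold is a guessed value:

```javascript
// Sketch: deriving x-direction changes from palmPosition instead of
// trusting palmVelocity. Assumes the leapjs package; the 3 mm jitter
// threshold is a guess, not something tuned for this project.
var Leap = require('leapjs');

var lastX = null;      // previous palm x-coordinate
var lastDirection = 0; // -1 = moving left, 1 = moving right, 0 = unknown

Leap.loop(function (frame) {
  if (frame.hands.length === 0) return;
  var palm = frame.hands[0].palmPosition; // [x, y, z] in millimeters

  if (lastX !== null) {
    var dx = palm[0] - lastX;
    if (Math.abs(dx) > 3) { // ignore small jitter between frames
      var direction = dx > 0 ? 1 : -1;
      if (lastDirection !== 0 && direction !== lastDirection) {
        // Several of these within a short time window would indicate
        // the quick hand shake used as a cancel gesture.
        console.log('direction change at', palm);
      }
      lastDirection = direction;
    }
  }
  lastX = palm[0];
});
```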

Finger Tip Recording

Image: logging the fingertip positions

Before defining rules for a new gesture detection, I often need to carefully observe the data recorded by the Leap Motion sensor. Here I'm trying to find out what it takes to define the well-known «OK-gesture». In particular, I need to screen the fingertip positions of the thumb and the index finger.
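
As a sketch of where that screening could lead, the snippet below (again leapjs) logs the tip positions of thumb and index finger and flags frames where the two nearly touch as a possible «OK-gesture». The 25 mm pinch threshold is an assumption to be tuned against the recorded data:

```javascript
// Sketch: screening thumb and index fingertip positions for the «OK-gesture».
// Assumes the leapjs package; the 25 mm pinch threshold is an assumption.
var Leap = require('leapjs');

function distance(a, b) {
  var dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

Leap.loop(function (frame) {
  frame.hands.forEach(function (hand) {
    var thumb = hand.fingers.find(function (f) { return f.type === 0; });
    var index = hand.fingers.find(function (f) { return f.type === 1; });
    if (!thumb || !index) return;

    var gap = distance(thumb.tipPosition, index.tipPosition);
    console.log('thumb', thumb.tipPosition, 'index', index.tipPosition,
                'gap', gap.toFixed(1), 'mm');

    if (gap < 25) {
      console.log('possible OK-gesture');
    }
  });
});
```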

Video Documentation: User Tests

Video: «User Tests» by Jones Merc on Vimeo

User tests showed that users mostly direct their gestures towards the object they want to control; a direct dialogue between user and object comes naturally.
The tests also yielded interesting insights into how users interact with the gesture lamp. All of them found one of the On/Off gestures fairly quickly, but when they came across the other one, it confused rather than helped them.
Another interesting remark was that when a user controls sound, the source is less evident than with light (a lamp) or wind (a fan). Sound and music surround us, so a gesture may not be directed as clearly at the object (the music player) itself.
The tests certainly make me rethink some interaction steps and will help me develop a smoother interaction flow.