Because I can only use one Leap Motion per computer, I need to set up a rather complex linkage. I will use an iMac and two MacBooks and attach one Leap Motion to each of them. It gets even more complex because different sounds need to be played. The iMac will show the concept video at the same time, so the video's sound will be played via headphones. One MacBook will process the Leap input associated with the music player (radio), so the headphones attached to that MacBook will play the music itself.
This leaves the remaining MacBook to play the interface sounds via a speaker.
To trigger the interface sounds for the Leaps connected to the other two computers, I will probably use shiftr.io to send play commands to the one computer that plays them back.
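A minimal sketch of how that command relay could look, assuming shiftr.io's MQTT interface and the mqtt.js client (the broker credentials, topic name, and sound files are placeholders of my own): the sound MacBook subscribes to a topic, and the other two machines publish to it.

```typescript
// Sketch: the sound MacBook subscribes to interface-sound commands
// that the other two computers publish via shiftr.io (MQTT).
// Broker credentials, topic, and sound file names are placeholders.
import * as mqtt from "mqtt";
import { exec } from "child_process";

const client = mqtt.connect("mqtt://key:secret@broker.shiftr.io", {
  clientId: "sound-macbook",
});

client.on("connect", () => {
  client.subscribe("interface-sounds"); // both senders publish here
});

client.on("message", (_topic, payload) => {
  // The payload names the sound to play, e.g. "confirm" or "cancel".
  const sound = payload.toString().replace(/[^a-z-]/g, ""); // sanitize
  exec(`afplay sounds/${sound}.aiff`); // afplay ships with macOS
});

// On one of the sending MacBooks, a play command is just a publish:
// client.publish("interface-sounds", "cancel");
```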
Problem with palmVelocity
The Leap Motion SDK offers hand.palmVelocity. Unfortunately, it does not behave the way I would expect when measured during fast movements. For a quick hand shake (indicating a cancel gesture) I may instead use the hand's palm position to get more reliable tracking. The picture above shows a log of direction changes along the x-coordinate ([x, y, z]). Once again, a good understanding of how the Leap registers hand movements is crucial for a successful gesture implementation.
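A sketch of that palm-position approach, using leapjs (the jitter threshold, time window, and reversal count are rough guesses that would need tuning): track the sign of the x-movement between frames and treat several quick direction reversals as a shake.

```typescript
// Sketch: detect a quick horizontal hand shake (cancel gesture) by
// counting direction reversals of the palm's x-position instead of
// relying on hand.palmVelocity. All thresholds are rough guesses.
const Leap = require("leapjs"); // leapjs ships without typings

let lastX: number | null = null;
let lastDir = 0; // -1 = moving left, 1 = moving right
let reversals: number[] = []; // timestamps of direction changes

Leap.loop({}, (frame: any) => {
  const hand = frame.hands[0];
  if (!hand) { lastX = null; lastDir = 0; return; }

  const x = hand.palmPosition[0]; // [x, y, z] in millimeters
  if (lastX !== null) {
    const dx = x - lastX;
    if (Math.abs(dx) > 2) { // ignore jitter below ~2 mm per frame
      const dir = dx > 0 ? 1 : -1;
      if (lastDir !== 0 && dir !== lastDir) {
        reversals.push(Date.now());
      }
      lastDir = dir;
    }
  }
  lastX = x;

  // Keep only reversals from the last 800 ms; 3+ of them = shake.
  reversals = reversals.filter(t => Date.now() - t < 800);
  if (reversals.length >= 3) {
    console.log("cancel gesture detected");
    reversals = [];
  }
});
```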
Finger Tip Recording
Before defining rules for a new gesture detection, I often need to carefully observe the data recorded by the Leap Motion sensor. Here I am trying to find out what it takes to define the well-known «OK» gesture. In particular, I need to screen the fingertip positions of the thumb and the index finger.
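A sketch of that kind of observation logging, again with leapjs (the 25 mm pinch threshold is just a starting guess): print the thumb and index fingertip positions and their distance each frame, since the «OK» gesture should show up as a small, stable thumb–index distance.

```typescript
// Sketch: log thumb and index fingertip positions to find a rule for
// the «OK» gesture. Each leapjs finger has a tipPosition [x, y, z].
const Leap = require("leapjs");

function distance(a: number[], b: number[]): number {
  return Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);
}

Leap.loop({}, (frame: any) => {
  const hand = frame.hands[0];
  if (!hand) return;

  const thumb = hand.fingers.find((f: any) => f.type === 0); // 0 = thumb
  const index = hand.fingers.find((f: any) => f.type === 1); // 1 = index
  if (!thumb || !index) return;

  const d = distance(thumb.tipPosition, index.tipPosition);
  console.log("thumb", thumb.tipPosition, "index", index.tipPosition,
    "distance", d.toFixed(1), "mm");

  // Working hypothesis: thumb and index tips nearly touch in the
  // OK gesture, so a small distance (< ~25 mm) could be the trigger.
  if (d < 25) console.log("possible OK gesture");
});
```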
Video Documentation: User Tests
(Video: «User Tests» by Jones Merc on Vimeo.)
User tests showed that users direct their gestures mostly towards the object they want to control. A direct dialogue between the user and the object is evident.
The tests also gave some interesting insights into how users interact with the gesture lamp. All of them found one of the on/off gestures fairly quickly, but those who came across the other one were puzzled; it confused them rather than helping.
Another interesting remark was that when a user controls sound, the source is less evident than with a light (lamp) or wind (fan) source. Sound surrounds us, so a gesture may not be directed as clearly at the object (the music player) itself.
The tests certainly make me rethink some interaction steps and help me develop a smoother interaction flow.
Setback: Leap upside down without good results
Unfortunately, a Leap Motion mounted upside down, hanging from a lamp and pointing downwards, does not achieve the same accuracy as one placed in the normal upright position. I therefore need to rethink the exhibition layout.
Relay connected to the web
To build a lamp prototype that is controllable via gestures, I wired a relay into a normal extension cable. The relay connects to the internet, and via shiftr.io I can send it commands from different sources.
The next step will be to use my Leap Motion to send those commands.
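A sketch of how that next step could look, combining leapjs with mqtt.js (the broker URL, the lamp/power topic, and the open-hand/fist check are all stand-ins of my own, not the final gestures): detect an on/off gesture and publish the command the relay listens for.

```typescript
// Sketch: publish lamp commands from Leap Motion input via shiftr.io.
// Broker URL, credentials, and topic are placeholders; the "gesture"
// here is a trivial stand-in (open hand = on, fist = off).
import * as mqtt from "mqtt";
const Leap = require("leapjs");

const client = mqtt.connect("mqtt://key:secret@broker.shiftr.io");

let lampOn = false;

Leap.loop({}, (frame: any) => {
  const hand = frame.hands[0];
  if (!hand) return;

  // Count extended fingers as a crude open-hand/fist check.
  const extended = hand.fingers.filter((f: any) => f.extended).length;
  const wantOn = extended >= 4;

  // Only publish when the state actually changes.
  if (wantOn !== lampOn) {
    lampOn = wantOn;
    client.publish("lamp/power", lampOn ? "on" : "off");
    console.log("sent", lampOn ? "on" : "off");
  }
});
```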