Increasing Complexity of Interaction Flow

interaction flow with increased complexity

Writing the interaction flow for the music player, which offers gestures for play, pause, track change and volume adjustment, is already a lot more complex than the one I wrote for a lamp with a simple on/off switch.
The «smarter» the device, the more complex the multilinear story flow becomes. Imagining this for a much more elaborate product seems almost crazy, or makes one think that artificial intelligence will definitely be a must if products are ever to appear truly «smart».
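
To make that growth in complexity a bit more tangible, here is a minimal sketch of how such a gesture-to-action flow might be written down. Everything in it — the gesture names, states and actions — is purely illustrative and not taken from any SDK; the point is simply that every combination of state and gesture is a decision that has to be made.

```typescript
// Hypothetical sketch: mapping gestures to music-player actions.
// Names are illustrative, not taken from any SDK.
type Gesture = 'air-tap' | 'swipe-left' | 'swipe-right' | 'circle' | 'fist';
type PlayerState = 'stopped' | 'playing' | 'paused';

interface Transition {
  next: PlayerState;
  action: string; // what the player (and the interface sound) should do
}

// Every state/gesture combination needs a decision —
// with 3 states and 5 gestures that is already 15 cases to think about.
const flow: Record<PlayerState, Partial<Record<Gesture, Transition>>> = {
  stopped: {
    'air-tap': { next: 'playing', action: 'start playback' },
  },
  playing: {
    'air-tap':     { next: 'paused',  action: 'pause playback' },
    'swipe-left':  { next: 'playing', action: 'previous track' },
    'swipe-right': { next: 'playing', action: 'next track' },
    'circle':      { next: 'playing', action: 'adjust volume' },
    'fist':        { next: 'stopped', action: 'stop playback' },
  },
  paused: {
    'air-tap': { next: 'playing', action: 'resume playback' },
    'fist':    { next: 'stopped', action: 'stop playback' },
  },
};
```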

Leap Motion Review

Leap Motion gesture tracking device (product shot)

After having worked with the Leap Motion for almost three months now, I can say that it does indeed do an impressive job of tracking hands and fingers. The technology seems to have advanced far enough that such products may appear in commercial applications very soon. I also have to say, though, that when programming gestures more complex than the ones that come ready-made with the Leap SDK (swipes, air-taps and circles), it becomes very difficult to track them accurately. On the one hand, gestures naturally interfere with each other (forming the hand into a «thumbs-up» gesture, for example, will always resemble clenching the hand into a fist).

On the other hand, the Leap sensor sometimes misinterprets the position of fingers (e.g. the index finger is extended, but the Leap Motion reports otherwise), which makes it even harder to achieve more or less reliable tracking.
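
For reference, this is roughly how I read out the ready-made gestures with the LeapJS bindings (shown here as a TypeScript sketch). The small counter at the end is my own workaround for the occasionally misreported «extended» state of the index finger, not something the SDK provides, and the frame threshold is just a guess.

```typescript
// Sketch using the LeapJS bindings (npm package "leapjs"); loosely typed on purpose.
declare function require(name: string): any;
const Leap = require('leapjs');

let indexExtendedFrames = 0; // simple smoothing counter (own assumption, not part of the SDK)

Leap.loop({ enableGestures: true }, (frame: any) => {
  // Ready-made gestures shipped with the SDK: swipe, keyTap/screenTap ("air-taps"), circle
  for (const gesture of frame.gestures) {
    if (gesture.state === 'stop') {
      console.log('built-in gesture recognized:', gesture.type);
    }
  }

  // Custom gestures need the raw hand/finger data — and that data is noisy.
  const hand = frame.hands[0];
  if (!hand) return;

  const index = hand.fingers.find((f: any) => f.type === 1); // 1 = index finger
  indexExtendedFrames = index && index.extended ? indexExtendedFrames + 1 : 0;

  // Only trust "index finger is extended" after a few consecutive frames,
  // to compensate for single misreported frames.
  if (indexExtendedFrames === 10) {
    console.log('index finger reliably extended');
  }
});
```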

But wouldn’t it be boring if everything just ran smoothly?

3 Leap/3 Computer Setup

3-Leap-Setup

Because I can only use one Leap per computer, I need to set up a rather complex linkage. I will use an iMac and two MacBooks and attach one Leap Motion to each of them. It gets even more complex because different sounds need to be played: the iMac will simultaneously show the concept video, so the video’s audio will be played via headphones attached to it. One MacBook will process the Leap input associated with the music player (radio), so the headphones attached to that MacBook will play the music itself.
This leaves the remaining MacBook to play the interface sounds via a speaker.
To play the interface sounds triggered by the Leaps connected to the other two computers, I will probably use shiftr to send play commands to the one computer that handles the sound playback.
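
A minimal sketch of how those play commands could travel over shiftr, using the MQTT.js client; the broker URL, the credentials and the topic name are placeholders, not the ones I will actually use.

```typescript
// Sketch of the shiftr link-up via MQTT.js ("npm install mqtt").
// Broker URL, credentials and topic are placeholders.
declare function require(name: string): any;
const mqtt = require('mqtt');

const client = mqtt.connect('mqtt://user:secret@broker.shiftr.io');

// On the two Leap machines: publish which interface sound should be played.
function requestInterfaceSound(name: string): void {
  client.publish('interface-sounds/play', name);
}

// On the one MacBook with the speaker: listen and trigger playback.
client.on('connect', () => {
  client.subscribe('interface-sounds/play');
});

client.on('message', (_topic: string, payload: any) => {
  const soundName = payload.toString();
  console.log('play interface sound:', soundName); // actual playback would happen here
});
```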

Sound Design for Interface

Acoustic feedback could enhance the gesture-based interface experience. With no haptic and only little visual feedback, an audible hint could help users understand what effect their actions will produce. Acoustic feedback could indicate whether the user is holding their hands in the right position and could also help them find the «correct» gesture.

A few attributes I am looking for in my interface sounds:
– pleasant
– a bit mechanical
– unobtrusive
– discreet
– confirmative
– not squeaky
– not too playful
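
As a rough idea of how a confirmation cue with these qualities could be generated directly in the browser, here is a small sketch using the Web Audio API — a short, soft blip as a stand-in for the actual interface sounds; pitch and timing are just assumptions.

```typescript
// Sketch: a short, unobtrusive confirmation blip via the Web Audio API (browser only).
const audioCtx = new AudioContext();

function playConfirmBlip(): void {
  const osc = audioCtx.createOscillator();
  const gain = audioCtx.createGain();

  osc.type = 'sine';
  osc.frequency.value = 880; // placeholder pitch — the real sounds would be designed samples

  // Quick fade-out so the blip stays discreet rather than squeaky.
  gain.gain.setValueAtTime(0.2, audioCtx.currentTime);
  gain.gain.exponentialRampToValueAtTime(0.001, audioCtx.currentTime + 0.15);

  osc.connect(gain).connect(audioCtx.destination);
  osc.start();
  osc.stop(audioCtx.currentTime + 0.15);
}
```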

An imaginable sound for switching on a lamp could maybe sound like this:

Problem with palmVelocity

log of direction change [x, y, z]

The Leap Motion SDK offers hand.palmVelocity. Unfortunately it does not seem to behave the way I would expect when measured during fast movements. For a quick hand shake (indicating a cancel gesture) I may therefore use the hand’s palm position instead to get more reliable tracking. The picture above shows a log of direction changes based on the x-coordinate ([x, y, z]). Again, a good understanding of how the Leap registers hand movements is crucial for a successful gesture implementation.
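
What I mean by falling back on the palm position, as a sketch: derive the horizontal movement from frame to frame myself and count direction changes along the x-axis. The jitter threshold, the number of direction changes and the time window are rough assumptions that would still need tuning against the logged data.

```typescript
// Sketch: detecting a quick "shake" (cancel gesture) from palm position instead of palmVelocity.
// Thresholds are rough assumptions and would need tuning against logged data.
declare function require(name: string): any;
const Leap = require('leapjs');

let lastX: number | null = null;
let lastDirection = 0;   // -1 = moving left, 1 = moving right
let directionChanges = 0;
let windowStart = 0;     // timestamp of the current observation window (µs)

Leap.loop((frame: any) => {
  const hand = frame.hands[0];
  if (!hand) { lastX = null; directionChanges = 0; return; }

  const x = hand.palmPosition[0]; // [x, y, z] in millimetres
  if (lastX !== null) {
    const delta = x - lastX;
    const direction = delta > 5 ? 1 : delta < -5 ? -1 : 0; // ignore tiny jitter

    if (direction !== 0 && lastDirection !== 0 && direction !== lastDirection) {
      if (directionChanges === 0) windowStart = frame.timestamp;
      directionChanges++;
    }
    if (direction !== 0) lastDirection = direction;
  }
  lastX = x;

  // Too slow — start counting again.
  if (directionChanges > 0 && frame.timestamp - windowStart > 500000) {
    directionChanges = 0;
  }

  // Three or more direction changes within half a second → treat it as a shake.
  if (directionChanges >= 3) {
    console.log('shake detected — cancel');
    directionChanges = 0;
  }
});
```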

Finger Tip Recording

logging_finger_position

Before defining rules for a new gesture detection, I often need to carefully observe the data recorded by the Leap Motion sensor. Here I’m trying to find out what it takes to define the well-known «OK» gesture. In particular, I need to examine the fingertip positions of the thumb and the index finger.
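
What I am looking for in those recordings, expressed as a sketch: if the thumb tip and the index fingertip come close together while the remaining fingers stay extended, that could count as the «OK» gesture. The 25 mm threshold is only a guess that would have to be checked against the logged tip positions.

```typescript
// Sketch: a first rule for the «OK» gesture based on logged fingertip positions.
// The distance threshold is a guess and needs to be checked against the recordings.
declare function require(name: string): any;
const Leap = require('leapjs');

function distance(a: number[], b: number[]): number {
  return Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);
}

Leap.loop((frame: any) => {
  const hand = frame.hands[0];
  if (!hand) return;

  const thumb = hand.fingers.find((f: any) => f.type === 0); // 0 = thumb
  const index = hand.fingers.find((f: any) => f.type === 1); // 1 = index finger
  if (!thumb || !index) return;

  const tipDistance = distance(thumb.tipPosition, index.tipPosition);
  console.log('thumb–index tip distance (mm):', tipDistance.toFixed(1));

  // Thumb and index tips close together, the remaining fingers extended → «OK».
  const othersExtended = hand.fingers
    .filter((f: any) => f.type > 1)
    .every((f: any) => f.extended);

  if (tipDistance < 25 && othersExtended) {
    console.log('«OK» gesture candidate');
  }
});
```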