Video Documentation: User Tests with Music Player

[Teaser image: user testing the music player]
Another set of user tests was conducted. This time the music player and the helping assistant were tested, and again a lot of valuable feedback came together. Here are a few of the points:

  • The sound design still needs some tweaking, because sometimes the sound was not interpreted as an interface sound but rather as a part of the music. A clear distinction would be desirable.
  • The choice of music still needs to be reconsidered. Electronic music is a bad choice for demonstration purposes because of its similarity to the interface sounds.
  • For some people the assistant should perhaps give more feedback, maybe even visual feedback.
  • The gesture recognition/tracking could be improved for a better experience. Sometimes, especially under bad lighting conditions, the lack of reliability can be very annoying.

User Tests with Radio Prototype

[Teaser image: user test with the radio prototype]

And at some point he (the programmed assistant) asks questions like «Did you know that other gestures exist?» That’s where I would like to answer, but no answer is expected by the machine. … That’s also confusing.

The first complete user test with the music player produced a lot of interesting feedback. It ranges from things that seem fairly easy to resolve, like the point above (the solution would simply be not to ask any questions the user cannot answer via gesture), over other simple statements («The music was too loud»), to more complex issues such as whether, and how much, visual feedback is required to create a pleasant user experience.

At the moment visual feedback is non-existent; it is substituted by acoustic feedback. Sounds for swiping, changing the volume and switching the player on and off are provided. Still, they are much more abstract, because the user first has to link each sound to a gesture or an action. Paired with the faulty behaviour of the Leap Motion tracking device, this leads to a lot of frustration. Some of it could perhaps be relieved by redesigning the assistant and its hints (maybe even adding warnings that the tracking is not 100% accurate).
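
To make this more concrete, here is a minimal sketch of how such gesture-to-sound feedback could be wired up, written in TypeScript for the browser with the leapjs library. The sound file names, the confidence threshold and the gesture-to-action mapping are assumptions for illustration, not the prototype's actual assets or logic.

```ts
// Sketch only: assumes the leapjs library is loaded via a <script> tag and that
// short interface sounds exist under the (hypothetical) file names below.
declare const Leap: any;

// One short cue per recognised gesture type.
const cues: { [type: string]: HTMLAudioElement } = {
  swipe:  new Audio("sounds/swipe.mp3"),   // changing the track
  circle: new Audio("sounds/volume.mp3"),  // changing the volume
  keyTap: new Audio("sounds/on-off.mp3"),  // switching on/off
};

Leap.loop({ enableGestures: true }, (frame: any) => {
  const hand = frame.hands[0];
  for (const gesture of frame.gestures) {
    // React only when a gesture completes, and skip low-confidence frames to
    // reduce the false triggers observed under bad lighting.
    if (gesture.state !== "stop" || !hand || hand.confidence < 0.4) continue;

    const cue = cues[gesture.type];
    if (cue) {
      cue.currentTime = 0; // restart so quick successive gestures still get a cue
      cue.play();
    }
  }
});
```

Firing the cue only when a gesture completes is a deliberate choice in this sketch: it avoids half-finished gestures producing sound, at the cost of slightly delayed feedback.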

Further user testing will give more insight into whether and how much the assistant should intervene.
A deeper analysis of the video recordings taken during the test will also help to improve the user experience.

Further notes:

  • Text display sometimes too fast
  • Sounds not distinguishable from music
  • Swipe is the clearest sound
  • Not clear why something was triggered
  • Inaccuracy (maybe the lighting was not ideal for the Leap's tracking)
  • The assistant mostly taught the gestures correctly; sometimes they would not trigger due to technical constraints
  • The on/off gesture was not found by chance (in contrast to the lamp, where almost all users found the exact same gesture to switch it on or off)

Sound Design for Interface

Acoustic feedback could enhance the gesture-based interface experience. Without haptic feedback and with only little visual feedback, an audible hint could help users understand what effect their actions will produce. Acoustic feedback could indicate whether the user is holding their hands in the right position and could also help them find the «correct» gesture.
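
As a rough illustration of the first idea, a sound confirming that the hands are in a usable position, the following sketch (again TypeScript with leapjs) plays a short «ready» cue once when a hand enters a sensible height range above the device. The height range and the sound file name are assumptions for illustration, not values from the prototype.

```ts
// Sketch only: plays a hypothetical "ready" cue when a hand is held at a
// sensible height above the Leap Motion, and stays silent otherwise.
declare const Leap: any;

const readyCue = new Audio("sounds/ready.mp3"); // assumed file name
let handWasReady = false;

Leap.loop((frame: any) => {
  const hand = frame.hands[0];
  // palmPosition is [x, y, z] in millimetres; y is the height above the device.
  const handIsReady = !!hand && hand.palmPosition[1] > 100 && hand.palmPosition[1] < 300;

  // Play the cue only on the transition into the "ready" zone, so it confirms
  // the position once instead of repeating on every frame.
  if (handIsReady && !handWasReady) {
    readyCue.currentTime = 0;
    readyCue.play();
  }
  handWasReady = handIsReady;
});
```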

A few attributes I am looking for in my interface sounds:
– pleasant
– a bit mechanical
– unobtrusive
– discreet
– confirmative
– not squeaky
– not too playful

An imaginable sound for switching on a lamp could sound like this: