User Tests with Radio Prototype


And at some point he (the programmed assistant) asks questions like «Did you know that other gestures exist?» That's where I would like to answer, but the machine doesn't expect any answer. … That's also confusing.

The first complete user test with the music player yielded a lot of interesting feedback. It ranged from issues that seem easy to resolve, like the one above (the obvious fix is to not ask any question the user cannot answer via gesture), over simple statements («The music was too loud»), to more complex questions such as whether, and how much, visual feedback is required for a pleasant user experience.
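
To make that fix a bit more concrete, here is a minimal sketch of how the assistant could filter its prompts. The prompt model, the gesture names and the `canAsk` helper are all hypothetical, not taken from the actual prototype:

```typescript
// Hypothetical prompt model: the assistant only asks questions the user
// can actually answer with a tracked gesture; everything else becomes a hint.
interface Prompt {
  text: string;
  expectsAnswer: boolean;      // does the prompt invite a reaction?
  answerGestures?: string[];   // gestures that would count as an answer
}

// Illustrative set of gestures the tracker can recognise.
const TRACKED_GESTURES = new Set(["swipe", "circle", "keyTap"]);

function canAsk(prompt: Prompt): boolean {
  if (!prompt.expectsAnswer) return true; // plain hints are always fine
  const answers = prompt.answerGestures ?? [];
  // A question is only allowed if at least one expected answer is trackable.
  return answers.some((g) => TRACKED_GESTURES.has(g));
}

// «Did you know that other gestures exist?» invites a verbal yes/no,
// which the machine cannot hear, so it would be filtered out.
const question: Prompt = {
  text: "Did you know that other gestures exist?",
  expectsAnswer: true,
  answerGestures: [], // no tracked gesture can express the answer
};
console.log(canAsk(question)); // false: rephrase it as a plain hint instead
```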

At the moment visual feedback is non-existent and substituted by acoustic feedback: sounds are provided for swiping, changing the volume, and switching the player on and off. Still, they are much more abstract, because the user first has to link each sound to a gesture or an action. Paired with the faulty behaviour of the Leap Motion tracking device, this leads to a lot of frustration. Some of it could perhaps be reduced by redesigning the assistant and its hints (maybe even adding warnings that the tracking is not 100% accurate).
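
As a sketch of the sound-per-gesture idea, the mapping below is written against the leapjs browser API (a global `Leap` object, gestures enabled via `enableGestures`). The sound files and the trigger-on-completion rule are my assumptions, not the prototype's actual code:

```typescript
// Sketch assumes the leapjs browser script provides a global `Leap` object.
declare const Leap: any;

// One clearly distinct cue per action, so the feedback stays
// distinguishable from the music itself (file names are placeholders).
const cues: Record<string, HTMLAudioElement> = {
  swipe: new Audio("sounds/swipe.mp3"),   // skip track
  circle: new Audio("sounds/volume.mp3"), // change volume
  keyTap: new Audio("sounds/toggle.mp3"), // switch on/off
};

Leap.loop({ enableGestures: true }, (frame: any) => {
  for (const gesture of frame.gestures) {
    // Only play the cue once a gesture completes; reacting to every
    // intermediate frame is what makes triggers feel inexplicable.
    if (gesture.state !== "stop") continue;
    cues[gesture.type]?.play();
  }
});
```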

Further user testing will give more insight into whether, and how much, the assistant should intervene.
A deeper analysis of the video recordings taken during the test will also help improve the user experience.

Further notes:

  • Text display sometimes too fast
  • Sounds not distinguishable from music
  • Swipe is the clearest sound
  • Not clear why something was triggered
  • Inaccuracy (maybe the lighting conditions were not ideal for the Leap's tracking)
  • The assistant mostly taught the gestures correctly; sometimes they would not trigger due to technical constraints
  • The On/Off gesture was not found by chance (in contrast with the lamp, where almost all users found the exact same gesture to switch it on or off)

Video Documentation: User Tests

Video Documentation: User Tests from Jones Merc on Vimeo.

User tests showed that users direct their gestures mostly towards the object they want to control; a direct dialogue between user and object emerges naturally.
The tests also gave some interesting insights into how users interact with the gesture lamp. All of them found one of the On/Off gestures pretty quickly, but were puzzled when they came across the other one, which confused rather than helped them.
Another interesting remark was that when a user controls sound, the source is less evident than a light (lamp) or wind (fan) source. Sound/music rather surrounds us, so a gesture may not be directed as clearly at the object (the music player) itself.
The tests certainly make me rethink some interaction steps and will help develop a smoother interaction flow.