First sketch of the smart radio as a gesture-controlled object. The object should be reminiscent of a radio to indicate its functions, but the controls happen via gestures. The dialogue through which a user gets to know the different interaction inputs is a key feature in the development of this object prototype.
Testing a dialogue between object and user
Video Documentation: Object – Wizard of Oz from Jones Merc on Vimeo.
A nonfunctional prototype (object) in a «Wizard of Oz» test setup. A nearby computer is used to type the text that is then displayed on the object’s screen. Without having to program a smart object with fully functioning gesture recognition, one can test different scenarios like this dialogue between user and object. The focus of the dialogue is how to slowly establish a gesture language, not by presenting it to the user up front, but by developing it in a dialogue between the two.
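A minimal sketch of how such a setup could be wired: the object runs a small script that displays whatever text arrives over a local network socket, while the wizard types lines on the nearby computer. The port number and the show_on_screen placeholder are assumptions for illustration, not the actual setup used in the test.

```python
# Minimal Wizard-of-Oz sketch: the wizard types lines on a nearby computer,
# the object simply shows whatever arrives. The port number and the way the
# text reaches the screen (here just printed) are assumptions for illustration.

import socket

# --- on the object: display whatever text arrives -------------------------
def run_object(port=5005):
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("0.0.0.0", port))
    server.listen(1)
    conn, _ = server.accept()
    with conn:
        for line in conn.makefile():
            show_on_screen(line.strip())

def show_on_screen(text):
    print(f"[object screen] {text}")   # placeholder for the real display

# --- on the wizard's computer: type lines and send them -------------------
def run_wizard(object_ip, port=5005):
    with socket.create_connection((object_ip, port)) as conn:
        while True:
            conn.sendall((input("wizard> ") + "\n").encode())
```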
What’s your Gesture?
Video Research: What’s your Gesture from Jones Merc on Vimeo.
To see whether gesture trends can be found, I asked multiple people to perform a gesture for each of eleven terms. They were asked to use only one hand and had no time to think about the gesture beforehand. For some terms the results were quite predictable and most of the gestures were alike; other terms provoked a wide diversity of gestures, some of them very creative.
Another finding was that the interface to be controlled with such gestures would directly influence the gesture itself. Many people asked what the object to be controlled would look like and said they might have come up with different gestures if they had seen the object or the interface.
To see the full-length gesture video, go here.
Use a Micro-Gesture to start an application’s feature
Video Sketch: Start Shazam’s Listening Feature via Gesture from Jones Merc on Vimeo.
What if one could define specific gestures to start specific features of an application?
This would be a diversification of the smartphone’s interface (see also: Standardised vs. Diversified Interaction), because one could skip all the buttons one would normally need to navigate into the app and to a particular action or feature.
In the video sketch I show how it could feel if the cupping-your-hand gesture initiated Shazam’s song-listening feature. Especially in that use case, one is glad if as little time as possible is needed to start the listening function (otherwise the song may already be over).
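A minimal sketch of what such a gesture-to-feature mapping might look like in code. The gesture names, the launch_action helper and the on_gesture callback are hypothetical, purely for illustration; no existing gesture-recognition API is implied.

```python
# Hypothetical mapping from recognised micro-gestures to app actions.
# Gesture names and launch_action() are illustrative assumptions.

GESTURE_ACTIONS = {
    "cup_hand": ("shazam", "start_listening"),   # cup your hand -> identify song
    "pinch_twist": ("camera", "open_viewfinder"),
    "flick_up": ("music", "next_track"),
}

def launch_action(app, feature):
    """Placeholder: ask the OS to open an app and trigger one of its features."""
    print(f"launching {app} -> {feature}")

def on_gesture(gesture_name):
    """Called by a (hypothetical) gesture recogniser whenever a gesture is detected."""
    action = GESTURE_ACTIONS.get(gesture_name)
    if action:
        launch_action(*action)

# Example: the recogniser reports a cupped hand.
on_gesture("cup_hand")
```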
Standardised vs. Diversified Interaction
Highly standardised multifunctional devices like smartphones or computers sometimes require quite tedious sequences of actions before you finally reach the action or control you want to execute. Because the device is multifunctional, touch buttons (in the case of a smartphone) have to tell it which application and which action should be executed, narrowing down the options step by step.
If such devices could also be controlled in a diversified way (in terms of interactions), for example via micro-gestures, which offer far more possibilities, one could skip many steps. One specific gesture could mean: go to app A, start feature B and choose option C.
Of course, further questions arise within that use case, for example what happens if a gesture is too close to an everyday gesture and starts a process unintentionally.
In that case a preceding gesture could solve the problem: just as saying «OK Google» initiates voice control in Google services, a very specific and unique gesture could activate gesture recognition.
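A rough sketch of such a two-stage recogniser, assuming a hypothetical wake gesture and a short window during which a command gesture is accepted; the gesture names and the timeout value are assumptions for illustration.

```python
# Two-stage gesture recogniser sketch: a unique "wake" gesture must precede
# any command gesture, so everyday hand movements are ignored.
# Gesture names and the timeout are assumptions for illustration.

import time

WAKE_GESTURE = "double_finger_snap"   # hypothetical, deliberately unusual gesture
WAKE_TIMEOUT = 3.0                    # seconds during which a command is accepted

class TwoStageRecognizer:
    def __init__(self):
        self.armed_until = 0.0

    def on_gesture(self, gesture_name):
        now = time.monotonic()
        if gesture_name == WAKE_GESTURE:
            self.armed_until = now + WAKE_TIMEOUT   # arm the recogniser
            return None
        if now < self.armed_until:
            self.armed_until = 0.0                  # consume the armed state
            return gesture_name                     # forward the command gesture
        return None                                 # ignore stray everyday gestures

recognizer = TwoStageRecognizer()
recognizer.on_gesture("cup_hand")            # ignored, recogniser not armed
recognizer.on_gesture("double_finger_snap")  # arms the recogniser
print(recognizer.on_gesture("cup_hand"))     # now forwarded -> "cup_hand"
```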
Gesture Camera
Video Sketch: Gesture Camera from Jones Merc on Vimeo.
An example of a micro-gesture application in the real world. Instead of using buttons, Joep Frens proposed rich interactions to operate a camera. I took the same context and applied possible gestures to this scenario.
Video Sketch: Movie Scrubbing
Video Sketch: Movie Scrubbing via Micro-Gesture from Jones Merc on Vimeo.
The video sketch attempts to convey the feeling of controlling a video via micro-gestures: grabbing the playhead and sliding it back and forth.
Such a scenario could be used in presentations, for example, where it is awkward to walk over to an attached computer and move the playhead with a mouse or trackpad. To further explore the possibilities offered by such an interaction, a prototype is indispensable.
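As a starting point for such a prototype, here is a rough sketch of how a grab-and-slide gesture could be mapped to a playhead position. The gesture events (grab, move with a horizontal hand position, release), the scrub range and the video length are assumptions for illustration; no specific tracking hardware is implied.

```python
# Sketch of mapping a "grab and slide" micro-gesture to a video playhead.
# SCRUB_RANGE and VIDEO_LENGTH are illustrative assumptions.

VIDEO_LENGTH = 120.0   # seconds
SCRUB_RANGE = 0.30     # metres of hand travel that covers the whole timeline

class Scrubber:
    def __init__(self, playhead=0.0):
        self.playhead = playhead
        self.grab_x = None
        self.grab_playhead = 0.0

    def on_grab(self, hand_x):
        """Remember where the hand grabbed and where the playhead was."""
        self.grab_x = hand_x
        self.grab_playhead = self.playhead

    def on_move(self, hand_x):
        """Translate horizontal hand displacement into a playhead offset."""
        if self.grab_x is None:
            return
        offset = (hand_x - self.grab_x) / SCRUB_RANGE * VIDEO_LENGTH
        self.playhead = min(max(self.grab_playhead + offset, 0.0), VIDEO_LENGTH)

    def on_release(self):
        self.grab_x = None

s = Scrubber(playhead=40.0)
s.on_grab(hand_x=0.50)
s.on_move(hand_x=0.56)       # slide the hand 6 cm to the right
print(round(s.playhead, 1))  # -> 64.0 seconds
```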