I really like Donald Norman’s essay »Natural User Interfaces Are Not Natural«, which he wrote for the ACM interactions magazine. In this essay, Norman shares his view on the role that natural user interfaces (for him represented, e.g., by »speech, gesture, or the tapping of the body’s electrical signals for “thought control,”«) will play in interaction with future computer systems.
I have the impression that marketing people love to proclaim the end of mouse and keyboard interfaces, which in their opinion (or at least in their words) will become obsolete, their place taken by natural user interfaces based on gestural interaction, as provided by systems like the Kinect or by multi-touch screens. I do not share this opinion. As I see it, (nearly) every interaction approach has its right to exist and is usually best suited to certain types of tasks. I doubt, for instance, that there will soon be a better way to enter text than a standard keyboard with haptic keys.
So, Norman provides one of the few pieces of literature that stays level-headed about the natural user interface hype and gives well-founded arguments about where he sees its potentials and pitfalls. I strongly recommend reading the whole essay. In the following, I quote some of the strong points his essay makes.
It is also unlikely that complex systems could be controlled solely by body gestures because the subtleties of action are too complex to be handled by actions–it is as if our spoken language consisted solely of verbs. We need ways of specifying scope, range, temporal order, and conditional dependencies. As a result, most complex systems for gesture also provide switches, hand-held devices, gloves, spoken command languages, or even good old-fashioned keyboards to add more specificity and precision to the commands.
…
Gesture and touch-based systems are already so well accepted that I continually see people making gestures to systems that do not understand them: tapping the screens of non-touch-sensitive displays, pinching and expanding the fingers or sliding the finger across the screen on systems that do not support these actions, and for that matter, waving hands in front of sinks that use old-fashioned handles, not infrared sensors, to dispense water.
…
All new technologies have their proper place. All new technologies will take a while for us to figure out the best manner of interaction as well as the standardization that removes one source of potential confusion. None of these systems is inherently more natural than the others. The mouse and keyboard are not natural. Speech utterances will have to be learned and gestures carefully developed and standardized through time. The standards don’t have to be the best of all possibilities. The keyboard has standardized upon variations of qwerty and azerty throughout the world even though neither is optimal–standards are more important than optimization. Are natural user interfaces natural? No. But they will be useful.