User Experience Design Principles of a Natural User Interface (NUI)

Dorothy Shamonsky

Dorothy Shamonsky, Ph.D., is a User Interface/User Experience (UI/UX) Designer at ICS. She has broad practical experience and theoretical knowledge in the field and works extensively on new touchscreen product development.

By Dorothy Shamonsky | Monday, November 24, 2014

In a previous blog post (Defining a Natural User Interface) I explained how finding a clear and concise definition of a Natural User Interface (NUI) was not easy. Finding a clear and concise list of user experience design principles for a NUI is even more challenging. An obvious reason for this is that NUIs encompass a broad range of possibilities, and it’s difficult to be general enough to cover the range and at the same time specific enough to be useful. After searching for lists of design principles that have already been published, I ended up making my own list. Here it is.

1. Choose an input and output modality that is appropriate to the context of use.

What sets NUIs apart from Command Line Interfaces (CLIs) and Graphical User Interfaces (GUIs) is that they are not restricted to the paradigm of a physical keyboard, mouse and display screen. Instead, the modality of interaction is determined by the context of use. For example, touch is appropriate for a mobile device because elegant mobility is achieved by not having excess parts (a pointing device such as a mouse or pen) and by maximizing the touchscreen rather than including a physical keyboard. Speech interaction is appropriate where hands and eyes are occupied with other tasks, such as driving.

2. Content is the interface.

Have the user interact directly with the content, as opposed to commands and controls that manipulate content. For example, to delete an item, a CLI would have a typed command and a GUI would have a menu command or require the user to drag the item to a trash can icon. Instead, a NUI could have a gesture to remove an item, such as crossing out (swiping) an item on a list or tossing (air gesture) an item away. A typical goal of a NUI visual interface, compared to a GUI, is to reduce or eliminate chrome — the visual controls — as much as possible.
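As a rough illustration of the swipe-to-delete pattern described above, the interaction can be sketched as a small gesture classifier that acts on the content itself, with no menu or trash-can control in between. All names and thresholds here are illustrative assumptions, not tied to any particular toolkit:

```javascript
// Minimal sketch of a swipe-to-delete gesture. The thresholds
// (80 px travel, 30 px vertical drift, 0.3 px/ms speed) are
// illustrative assumptions, not values from any specific framework.

// Classify a touch that started at (startX, startY) and ended at
// (endX, endY) over elapsedMs milliseconds.
function classifyGesture(startX, startY, endX, endY, elapsedMs) {
  const dx = endX - startX;
  const dy = endY - startY;
  const speed = Math.abs(dx) / elapsedMs; // pixels per millisecond

  // A fast, mostly horizontal motion reads as "swipe to delete";
  // anything short or slow is treated as a plain tap.
  if (Math.abs(dx) > 80 && Math.abs(dy) < 30 && speed > 0.3) {
    return "delete";
  }
  return "tap";
}

// The gesture removes the item directly from the list — the content
// is the interface, with no intermediate command or control.
function handleSwipe(list, index, gesture) {
  if (gesture === "delete") {
    return list.filter((_, i) => i !== index);
  }
  return list;
}
```

The point of the sketch is that the user's action and the content's change are one and the same motion; there is no separate "delete" control to find first.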

3. Leverage instinct or innate skills such as motor memory and a sense of 2-D and 3-D space.

Since humans have many innate physical and cognitive skills for interacting with physical reality, why not take advantage of those in experience design? CLIs and GUIs are restricted to the limited physical paradigm of keyboard and mouse and thus rely almost exclusively on eye-hand coordination. With gesture and speech modalities, innate human abilities such as remembering physical gestures and actions and orienting in 2-D and 3-D space can be utilized.

4. Leverage already learned behavior.

Our learned behavior spans all of our senses; it is not restricted to eye-hand coordination. Learned behavior also often engages multiple senses at once, without our being aware of it. In NUIs, where input and output modalities are expanded to capture more human capabilities, experience design can tap into a wider spectrum of learned behavior than GUIs do. For example, devices that capture air gestures can potentially use full-body actions, such as conducting an orchestra, to control the device.

5. Discoverable and easy with progressive complexity.

By their nature (natural interaction), NUI applications or devices should not require training to use, but instead be discoverable in the way that things in physical reality can be figured out. This is where using innate and learned behavior is valuable. It can also be supported by using only simple interaction patterns. To achieve complexity, those patterns are replicated rather than made more complex. For example, having the same simple interaction (go to the next level) to traverse many levels of data satisfies that requirement. The design of the user experience can also clue the user into what is available to them through hints added by deliberate, careful design. A subtle texture on the surface of an item (physical or virtual) can clue the user that the item can be moved or manipulated.
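The idea of replicating one simple pattern rather than adding new ones can be sketched as a single "open the next level" operation reused at every depth of a nested data set. The data and function names here are hypothetical, purely for illustration:

```javascript
// Sketch of one simple interaction ("open the next level") applied
// uniformly to nested data. All names are illustrative assumptions.

// A level is an object whose values are either nested levels or leaves.
const catalog = {
  Music: { Jazz: "jazz albums", Rock: "rock albums" },
  Films: { Drama: "drama titles" },
};

// The only interaction the user ever learns: pick a label, get the
// next level down.
function openNextLevel(level, label) {
  return level[label];
}

// Traversing any depth is just the same simple action repeated —
// complexity comes from replication, not from new patterns.
function drillDown(root, path) {
  return path.reduce(openNextLevel, root);
}
```

Because every level responds to the same action, a user who discovers the pattern once can navigate an arbitrarily deep structure without further training.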

6. Inviting and highly responsive.

We are used to highly responsive interaction with physical reality, especially with things that are alive. NUIs have a closer relationship with anthropomorphism than GUIs, which feel decidedly like machine interfaces. Things that are alive and ready to interact are more inviting. If a device appears to be ready at a moment's notice for our input — a smart thermostat, for example — it is more inviting to interact with than a device that needs to be turned on, woken up or otherwise roused from the off-state.

7. Pleasurable and enjoyable.

If we can avoid it, why would we ever want to waste our time with non-enjoyable interactions? Now that devices can be made more appealing, and there is market demand for them, enjoyable interaction is the new normal.

8. Personalized.

A better user experience is often a personalized one. A personalized device is automatically more valuable to us, requiring less effort on our part and wasting less of our time getting it to give us what we need. Devices with social intelligence feel more natural to us as humans, who have a highly developed social intelligence.

9. Intelligent.

In order to meet some of the above requirements, such as responsiveness and personalization, NUI user experiences need to have a level of intelligence.

10. Simple and Elegant.

If all of the above design principles are followed, the resulting user experience should be elegant. One common definition of elegant is simple: nothing superfluous. This is the larger goal of a NUI user experience.

