Virtual Tactile-ness: Creating Engaging User Experiences

Dorothy Shamonsky


Dorothy Shamonsky, Ph.D., is a User Interface/User Experience (UI/UX) designer at ICS. She has broad practical experience and theoretical knowledge in the field and works extensively on new touchscreen product development.

By Dorothy Shamonsky | Monday, July 18, 2016

When you walk on a beach, you may be tempted to reach down, pick up a smooth stone or a shiny shell and turn it in your fingers, feeling its weight and texture. If a friendly cat or dog walks close by, you may be tempted to reach out and stroke its fur. When humans are attracted to an object because of its color, shape, surface or texture – because of its visceral appeal – they are often compelled to reach out and touch it.

Can the same attraction happen in a virtual realm, in a user experience?

Visceral reactions are emotions that come from the gut. They are shortcuts in decision making that allow us to act quickly and with little cognitive effort. Instead of analyzing, we respond to sensory input immediately and emotionally. With gut feelings, the connection between sensory input and response is very direct: we experience, we react. Sometimes these decisive responses can be lifesaving, but mostly they allow us to navigate the world more efficiently and effectively. If you follow your gut, deciding what to wear, what coffee to order on the way to work or what path to take for a walk in the park requires next to no cognitive effort.

For something to be viscerally appealing it must act on our senses. The physical world provides us with non-stop sensory input, but the virtual realm is more limited. Mouse interaction is very dependent on visual input, supplemented by sound. But while imagery and sound provide rich sensory input, touch delivers a deeper level of appeal to our senses. So, how can designers leverage the power of our sense of touch to create viscerally appealing, and therefore compelling, interactions in the virtual world?

The answer requires we look at how our sense of touch works in the physical realm. Touch provides us with many details in terms of tactile feedback, whether we’re experiencing appealing things like shells and fur or repulsive things like slime. Feedback details include shape, color, surface, weight, movement, texture and temperature.

Because touch technology does not yet deliver tactile -- or haptic -- feedback, touch user experiences require something to replace the tactile feedback touchscreens lack. That feedback can be simulated with the visual, aural and algorithmic behavior of interface elements.

I call this virtual tactile-ness. Here are some examples.

On many mobile devices, scrolling components such as a contact list have a weight, and a bounce at the end, that make scrolling feel elastic, as though the list has mass and inertia. When I talk about virtual tactile-ness, I am referring to exactly that level of subtlety. It’s compelling and satisfying without getting in the way of the task of searching the list.
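To make the idea concrete, here is a rough sketch of the kind of math behind that elastic feel. The function name, damping constant and values below are illustrative, not taken from any platform's actual implementation; the general shape, though, is a classic diminishing-returns curve for overscroll.

```typescript
// Elastic "rubber band" overscroll: the further the user drags past the
// edge of a list, the less the content actually moves, so the list feels
// as though it resists with increasing force.
// `dimension` is the visible size of the scroll area; `damping` tunes how
// stiff the band feels. Both constants here are illustrative.
function rubberBand(overscroll: number, dimension: number, damping = 0.55): number {
  // Diminishing-returns curve: the output approaches `dimension`
  // asymptotically, no matter how far the raw overscroll grows.
  return (1 - 1 / ((overscroll * damping) / dimension + 1)) * dimension;
}

// A small drag moves roughly half the dragged distance; a huge drag is
// strongly damped and can never move past the edge of the view.
console.log(rubberBand(10, 400));  // small drag: only mildly resisted
console.log(rubberBand(400, 400)); // large drag: heavily resisted
```

Because the visible displacement never exceeds the view's size, the list appears anchored by an invisible spring, which is exactly the "mass" the user's brain fills in.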

Games can take virtual tactile-ness to another level. Candy Crush, which requires the user to swipe candy pieces to make matches, uses button elements that squish a bit when tapped. Animations and sounds support the feeling that the buttons are rubbery, and the bounce of the dialogs in response to touch gestures makes them feel like they have mass and inertia. So while the screen is not actually delivering haptic feedback, the user’s brain translates the visual and aural input into an imaginary haptic experience. In other words, the user feels like he or she has touched a tangible object.

Three key components must be developed to achieve virtual tactile-ness: visual, gesture and sound.

Visual: The visual representation of an interface element must be detailed enough to evoke a concept in the user’s mind and contain a convincing set of animations representing all the various states that the element can have.

Gesture: An interface element needs to respond to touch gestures in a way consistent with its visual representation. Even with no real haptic feedback, elements can feel heavy or light depending on how quickly they can be flicked, swiped or dragged: an element that moves slowly while being swiped feels heavy, while one that moves swiftly feels light. Related to weight is inertia, which can be simulated through acceleration and deceleration. An element that is slow to accelerate but picks up speed as it moves, then has a nice big bounce at the end, will feel like it has lots of inertia.
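One simple way to implement that perceived weight is to let the release velocity of a flick decay exponentially frame by frame, so the element coasts to a stop instead of halting dead. The sketch below is a minimal illustration with made-up constants; a real implementation would also reverse and damp the velocity at a boundary to produce the bounce described above.

```typescript
// Fake mass and inertia after a flick: velocity decays by a constant
// fraction each frame, producing the coasting deceleration that reads
// as "weight". The friction and threshold values are illustrative,
// not taken from any particular toolkit.
function simulateCoast(
  releaseVelocity: number, // px per frame when the finger lifts
  friction = 0.95,         // fraction of velocity kept each frame
  minVelocity = 0.5        // below this, the element has stopped
): number[] {
  const positions: number[] = [];
  let position = 0;
  let velocity = releaseVelocity;
  while (Math.abs(velocity) > minVelocity) {
    position += velocity;
    velocity *= friction; // exponential deceleration
    positions.push(position);
  }
  return positions;
}

// A fast flick coasts much farther than a slow one, which is what makes
// an element feel light under a quick gesture and heavy under a slow one.
const fastFlick = simulateCoast(40);
const slowFlick = simulateCoast(10);
console.log(fastFlick[fastFlick.length - 1]); // final resting offset
console.log(slowFlick[slowFlick.length - 1]);
```

Tuning the friction constant is effectively tuning the element's apparent mass: higher friction (closer to 1) makes it glide like something light on ice, lower friction makes it feel dense and reluctant.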

Sound: Sound effects reinforce what the visuals and gesture behavior are telling you and make the virtual tactile-ness feeling more powerful.

I’m not suggesting that virtual tactile-ness should literally imitate the physical realm. Rather, it’s a creative opportunity to develop visceral experiences by borrowing ideas from physical reality. Although touch interaction may seem a simple step beyond mouse interaction, that’s not the case. Touch presents a dramatically different -- heightened -- sensory experience for the user.

While the current generation of gesture-driven touch technology has limited capability -- it lets us use only a small portion of our vast sense of touch -- the technology will evolve until it offers a truly haptic experience. But even with today’s limited capabilities, it opens up new opportunities for designers to create compelling, multi-sensory experiences and to increase the visceral appeal of user experiences, which will feel more natural to users as a result. It is up to the designer to make effective use of virtual tactile-ness.


