A physical demonstration of a touch-enabled coverflow drag on an ICS kiosk.
With the popularity of smartphones and tablets, touch gesture expertise is mostly focused on small screens. Large touchscreens are much less common and present a unique set of problems that have not been as well explored. Recently at ICS, we’ve been experimenting with touchscreen devices of various sizes, and we’ve made some interesting observations about touch gestures on large screens.
We created a Qt/QML application at a native size of 1920 x 1400 pixels and tested it on a 22-inch touch monitor. The application contains images, movies, text, animation and sound. We then adapted and deployed the application to touchscreen devices with wildly varying screen sizes, from small (a 7-inch Nexus tablet) to large (a 55-inch ELO touch monitor). As a UX designer, I was concerned with how the touch gestures and touch target sizes would translate between screen sizes.
Predictably, getting the touch target sizes right turned out to be relatively easy. We scaled the touch target sizes proportionately with the screen size, but established minimum and maximum sizes, restricting them from getting too small or too large. We did the same for the button and icon images associated with them. Having a tap area on a large screen that is the size of two or three fingers felt right, because on a large screen your perceived sense of spatial accuracy is less precise than on a small screen.
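A minimal QML sketch of this clamped scaling approach (the property names and pixel values here are illustrative assumptions, not our production code):

```qml
import QtQuick 2.0

Rectangle {
    id: touchTarget

    // Assumed scale factor derived from the screen size;
    // 1.0 represents a baseline screen.
    property real screenScale: 1.0

    // Illustrative sizes in pixels: the target scales with the
    // screen, but is clamped so it never gets too small or too large.
    property real baseSize: 80
    property real minSize: 60
    property real maxSize: 240

    width: Math.max(minSize, Math.min(maxSize, baseSize * screenScale))
    height: width
}
```

The same binding pattern can drive the button or icon image sizes so artwork and tap areas stay in step.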
Our most surprising observation was that touch gestures themselves were not as comfortable to use at large screen sizes. Tapping was fine, but dragging, scrolling and pinching proved problematic. Gestures that are effortless on a small screen require a noticeable effort on a large screen. For example, dragging or scrolling across, in some cases, a few feet of screen requires more coordination than doing the same action in a hand-sized space. In addition, holding your arms up at screen level and moving them while coordinating your hands is actually tiring. Some pinches that could be accomplished easily with one hand on a small screen needed two hands on a large screen. Two-handed pinching feels like a satisfying action, but it also requires more effort than single-handed pinching. Also, using a touch keyboard on a large screen feels slow and tedious; thumb typing is not really possible. (Auto-complete is a must!)
In our application, we use a “coverflow” or carousel for the top-level menu pages. We found that the distance the carousel traveled on a flick, which is controlled by deceleration, needed to differ between sizes. The deceleration that worked at a smaller size was too high for a large screen (high deceleration means the flick stops quickly). When we kept the deceleration value that felt correct on the 22-inch screen, test users perceived the carousel as broken on the 55-inch screen. (Our carousel is a QML PathView object, which has a flickDeceleration property. On the 22-inch screen a setting of 35 worked well, while on the 55-inch screen a setting of 20 felt correct.)
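In QML terms, the tuning looks something like this sketch (the model, delegate and the largeScreen flag are placeholders I've assumed for illustration; only the deceleration values come from our testing):

```qml
import QtQuick 2.0

PathView {
    id: carousel
    model: menuModel        // placeholder menu model
    delegate: menuDelegate  // placeholder item delegate
    path: Path {}           // coverflow path omitted for brevity

    // Assumed flag; in practice this could be derived from the
    // physical screen size reported by the platform.
    property bool largeScreen: false

    // 35 felt right on the 22-inch monitor, while the 55-inch
    // screen needed a lower deceleration of 20 so flicks travel
    // farther before stopping.
    flickDeceleration: largeScreen ? 20 : 35
}
```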
Using touch to interact on a large screen is a very different user experience from using touch on a small screen. On a small screen you use only your hands, while on a large screen the interaction requires arm action. As soon as you start to use your arms, the interaction demands more physical energy and target accuracy becomes more difficult. It’s easy to make the leap that, at least for some types of applications that require arm action, free-form gestures may be more comfortable than touch. Stay tuned, as that is one of our next areas of exploration.