Make touchscreen interfaces easier to use with surface haptics technology

Article By : Craig Shultz

Software-defined surface haptics technology makes display devices with touch interfaces easier to use and more intuitive.

The mobile phone has been on a long journey of transformation. The first products, known as car phones, were utilitarian devices, a tool for voice communication (Figure 1). Today, the latest high-end smartphones are objects of desire as much as functional electronic devices. If the early mobile phones were tools that the user turned to only when necessary, the latest phones are as essential a part of the user’s life as clothing, food, and money.

Figure 1 The AEG Telecar CD 452 car phone from the 1980s was a utilitarian tool with little consumer appeal. Source: Christos Vittoratos, Wikimedia Commons

This transformation is, of course, partly a matter of functionality: today’s phones are cherished for what they enable the user to do and how easily they allow them to do it. The direct visual interface to the phone’s apps and functions via icons, sliders, and buttons makes interaction with the device feel natural and intuitive. The bright, colorful display, the array of touch-sensing icons, and the instant response to spoken commands all appeal to humans’ instinctive sensory mode of interaction with the world.

While the user’s senses of sight and hearing are richly stimulated by the smartphone UI, another sense, touch, is less well served. When a user touches the screen of a phone, tablet, self-service payment terminal, or other device, it is a tactile experience, but the feel is flat and uniform.

Watch a consumer browsing in a clothing or furniture store, and you will find them assessing the merchandise not only by how it looks but by feel—an instinctive response to the different textures and materials on offer. By contrast, the feel of a touchscreen is characterless, and induces no sense of tactile connection.

Today, a phone’s screen makes a powerful visual impact through its rich rendition of color and its pin-sharp resolution, but the tactile dimension of a user’s interaction with screen content is absent.

This is the big missed opportunity in the practice of user interaction design. That’s because technology that is ready for deployment today can provide an almost infinite palette of textures and surfaces rendered on the flat glass cover of a touchscreen display—as well as on the back and sides of the phone where the fingers grip—and on a wide range of other surfaces not tied to a display.

The touch interface can, for the first time, be made engaging without being distracting. It can make the use of the device easier and more intuitive, so that the user can operate it without looking directly at it. And it can convey a distinctive brand character. This is the domain of software-defined surface haptics.

Limitations of today’s touch effects

Manufacturers of consumer devices are already aware of users’ weak tactile engagement with today’s touchscreens. Increasingly intensive engineering effort is being ploughed into the implementation of traditional vibrotactile haptic technology in the latest high-end smartphones.

The problem is that vibrotactile haptics is inherently constrained by the limited range of effects that it can produce, the limited scope for control and localization of those effects, and the small display sizes in which those effects can be implemented. The very phenomenon of vibration itself fails to reflect people’s natural tactile response to the world. From birth, a human learns about the world through touch, but not because things vibrate. It’s because things have texture, weight, resistance to pressure, and other physical properties.

In fact, the base technology of motor-driven vibration dates back to 1928. Because vibrotactile haptics is an electromechanical technology, the range of effects it can produce is constrained to vibration alone. The frequency, temporal pattern, and amplitude of vibration can be modified, but in the end, the only sensation that can be produced is a type of global vibration.

Because vibrations spread easily through a device, attempts to confine the vibration to only a portion of a touchscreen have been ineffective. And since the range and location of tactile responses available through vibrotactile technology are so limited, its main use is as a feedback mechanism, to confirm a touch on the screen. But even this feedback is redundant, as the act of operating the touchscreen is inherently a tactile event. Broader forms of tactile engagement, such as tactile search or navigation, are beyond the limits of vibrotactile haptics’ capabilities.

In addition, the implementation of vibrotactile haptics becomes markedly more difficult as screen size increases: the larger the display, the larger the motor required to generate sufficient vibration.

Despite the inherent drawbacks of vibrotactile haptics, OEMs persist in attempts to enhance its operation in products such as phones and game controllers—an indication of the vital importance of touch in deepening the user’s engagement with electronics devices and making the user experience more exciting.

Now, however, software-defined surface haptic technology provides a different way to provide tactile engagement. It offers an almost infinite range of precisely rendered surface textures. It is easily integrated with touch-sensing user controls, audio, and graphics effects. And because it’s a solid-state technology, it can be implemented in a display screen of any size and shape, as well as on other surfaces such as a glass or plastic enclosure. In addition, it suffers from none of the mechanical problems associated with vibrotactile haptics, as it uses force, not vibration, to produce tactile sensation.

Surface haptics technology

Surface haptics technology works on the principle of electroadhesion. By controlling the voltage applied to a transparent indium tin oxide (ITO) matrix—the same material used in a touchscreen sensor laid on the surface of the display glass—the adhesion of the finger to the surface can be increased (Figure 2). This amplifies and modulates the naturally occurring friction force between the finger and smooth glass, leading to the feel of a range of textures.

Figure 2 The implementation of electroadhesion-based surface haptics in a standard display assembly requires only the addition of a patterned ITO layer on the cover glass. Source: Tanvas
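The friction modulation described above can be illustrated with a simplified parallel-plate model of electroadhesion. This is only a back-of-the-envelope sketch, not Tanvas’s actual model: the electrode area, skin-to-electrode gap, permittivity, and friction coefficient below are invented, round-number assumptions, and real finger-surface electrostatics is considerably more complex.

```python
# Simplified parallel-plate model of electroadhesion (illustrative only;
# all parameter values are assumed, not taken from any real device).
EPS0 = 8.854e-12  # vacuum permittivity, F/m


def electroadhesive_force(voltage, area=1e-4, gap=10e-6, eps_r=3.0):
    """Electrostatic attraction (N) between fingertip and electrode,
    modeled as a parallel-plate capacitor: pressure = eps0*eps_r*E^2/2."""
    e_field = voltage / gap                      # V/m across the gap
    pressure = 0.5 * EPS0 * eps_r * e_field**2   # Pa
    return pressure * area


def friction_force(normal_force, voltage, mu=0.5):
    """Total sliding friction: Coulomb friction acting on the sum of the
    finger's mechanical normal force and the electroadhesive attraction.
    Raising the voltage raises the friction the sliding finger feels."""
    return mu * (normal_force + electroadhesive_force(voltage))


# Sweep the applied voltage for a finger pressing with 0.5 N:
for v in (0, 50, 100, 200):
    print(f"{v:4d} V -> {friction_force(0.5, v):.3f} N of friction")
```

Because the attractive force scales with the square of the voltage, modulating the drive signal as the finger slides modulates the friction it feels, which is the basis for rendering texture.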

The principle of electroadhesion has been known for a long time. What’s new is the ability to embed the capability to produce textures on the surface of a standard display assembly and on a variety of other surfaces and materials. That enables product designers to create unique, custom texture effects in an intuitive software development environment.

This is the promise of surface haptics. The technology gives designers fine-grained control over the feel of the touch surfaces that it creates. The visual appearance and feel of a landing area on the screen, or of a virtual button, switch, slider, or dial, can be designed together on a single platform. On a flat plate of glass, the user’s finger can be made to feel the flip of a toggle switch, the click of a dial, and any range of textures from grainy to fine.

And the way these surface haptic effects are served to the user is highly dynamic and configurable. The effects make up a haptic landscape for the finger, which can vary with the speed, location, and direction of the finger’s movement. Different haptic effects can be served to different fingers or screen regions concurrently, and the entire landscape of effects can be changed dynamically as fast as the screen’s graphics content changes.
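The idea of a haptic landscape that responds to each finger independently can be sketched as follows. This is a hypothetical illustration, not the Tanvas API: the `Region` layout, the speed-scaling rule, and all numeric values are invented for the demo.

```python
# Hypothetical sketch: serving per-finger friction commands from a set of
# screen regions, varying with each finger's position and sliding speed.
from dataclasses import dataclass


@dataclass
class Touch:
    x: float; y: float    # position, pixels
    vx: float; vy: float  # velocity, pixels/s


@dataclass
class Region:
    x0: float; y0: float; x1: float; y1: float
    base_friction: float  # 0.0 (slick glass) .. 1.0 (maximum grip)

    def contains(self, t: Touch) -> bool:
        return self.x0 <= t.x < self.x1 and self.y0 <= t.y < self.y1


def friction_for(touch: Touch, regions: list[Region]) -> float:
    """Friction command for one finger: the containing region's base
    level, scaled down at high sliding speed so fast swipes feel lighter
    (an invented rule, purely for illustration)."""
    speed = (touch.vx**2 + touch.vy**2) ** 0.5
    for r in regions:
        if r.contains(touch):
            return r.base_friction / (1.0 + 0.001 * speed)
    return 0.0  # plain glass outside any defined region


regions = [Region(0, 0, 100, 100, 0.8),    # a "grainy" button
           Region(100, 0, 200, 100, 0.3)]  # a "smooth" slider track

# Two concurrent fingers: one resting on the button, one swiping the slider.
touches = [Touch(50, 50, 0, 0), Touch(150, 50, 500, 0)]
print([round(friction_for(t, regions), 3) for t in touches])  # [0.8, 0.2]
```

Because each finger is evaluated independently against the current region set, the whole landscape can be swapped out in a single frame, in step with the graphics.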

An integral element of making surface haptics a reality is a desktop development kit, which contains all the software, tools, and training needed to enable a user interaction designer to start creating surface haptic effects. Textures and effects are conceived as images that may be linked to display graphics (Figure 3). Software automatically converts the image created by the designer to the code executed in the surface touch controller, so the image is rendered as a texture on the screen surface as a finger passes over it.

Figure 3 Software-defined surface haptics are designed as images in the TanvasTouch development kit. Source: Tanvas
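The image-to-texture idea can be illustrated with a toy example: a grayscale “haptic image” is sampled at the finger’s position, and the pixel intensity becomes the friction level driven onto the surface at that instant. The grating pattern and the 0–255 intensity mapping below are invented for the demo; they are not the TanvasTouch format.

```python
# Illustrative sketch of mapping a designer's image to a rendered texture.


def make_grating(width, height, period=8):
    """A vertical-stripe haptic image: alternating high/low intensity
    columns, which a sliding finger perceives as a ridged texture."""
    return [[255 if (x // period) % 2 == 0 else 0 for x in range(width)]
            for y in range(height)]


def friction_at(image, x, y):
    """Sample the haptic image at integer finger coordinates and map
    0..255 pixel intensity to a 0.0..1.0 friction command."""
    h, w = len(image), len(image[0])
    if 0 <= x < w and 0 <= y < h:
        return image[y][x] / 255.0
    return 0.0  # off-image: plain glass


grating = make_grating(32, 32, period=4)

# A finger sweeping left to right crosses a high- then a low-friction
# band every 4 pixels, producing the feel of ridges:
print([friction_at(grating, x, 16) for x in range(8)])
```

In a real system the image would be rasterized into the touch controller’s memory and sampled at the sensor’s report rate, but the mapping from pixel intensity to friction command is the core of the workflow the article describes.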

Continue reading on EDN US: Multi-sensory surfaces


Craig Shultz is a senior electrical engineer and haptic designer at Tanvas, where he leads the Haptic Experience team.
