While augmented and virtual reality systems are fun, fascinating, and useful in and of themselves, they have also created renewed demand for innovation around environmental and touch sensing and actuation technologies that fall into the general category of tele-haptics. The benefits of this research will not only accrue to gamers and the general consumer, but also to surgeons performing delicate procedures remotely.

That said, good tele-haptics go well beyond force feedback or the tiny motors that generate the vibrations we feel on our phones. We’ll take a quick look at the biological underpinnings that complicate the issue, as well as an interesting, albeit far-fetched, Kickstarter solution that might, well, shock you.

In the context of AR or VR, we have plenty of ways to sense user movement and feed it into the system: motion sensors on the user, external cameras, or reflected light beams. For interfacing with and controlling a machine, we have connected gloves and even the relatively new Myo armband. However, when it comes to touching a virtual or remotely visualized object, and then getting accurate sensory feedback to the fingertips regarding its shape, texture, pressure response, and other characteristics, the fundamentals of how our fingers actually work need to be grasped. Much of the research on this has taken place for remote medical procedures, but the principles apply to AR and VR too.

When we use our fingers to explore an object’s stiffness, damping, texture, hysteresis, and other characteristics, we squeeze the surface and gather the associated data using two main categories of sensors. The first is kinesthetic, which refers to basic geometry and force data on the limbs (position and velocity of the joints, and actuation force, for example). The second is tactile, using both cutaneous and subcutaneous sensors. The cutaneous information is used to derive the pressure and indentation distribution and is mediated by mechanoreceptors innervating the dermis and epidermis of the fingerpads. For accurate feedback, clear discrimination of objects by their compliance needs to be communicated to the surface of the hand, and in particular to the fingertips.
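The two data categories above can be sketched as simple records. The field names and units here are invented for illustration; they are not from the research described.

```python
# Sketch of the two sensing categories: kinesthetic (limb geometry and
# force) versus tactile (skin-level pressure/indentation). Field names
# are hypothetical, chosen to mirror the examples in the text.
from dataclasses import dataclass


@dataclass
class KinestheticSample:
    joint_positions: list[float]   # joint angles, rad
    joint_velocities: list[float]  # rad/s
    actuation_forces: list[float]  # N


@dataclass
class TactileSample:
    pressure_map: list[list[float]]  # cutaneous pressure distribution
    indentation_mm: float            # subcutaneous indentation estimate
```

A remote haptic system would need to capture, transmit, and redisplay both record types to preserve discrimination.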

In medical systems, the remote haptic system (RHS) today comprises a telemanipulator to allow exploratory actions, and a haptic perceptual channel to return information to the operator. Both kinesthetic and tactile information need to be conveyed back and displayed. If the tactile information is missing, the psychological literature has established that haptic discrimination is dramatically reduced.

This makes sense, of course, as we need to be able to “feel” what we’re touching with fine resolution. But sensing this accurately is harder to achieve than you might think, and reproducing it at the fingertips is considerably more difficult, with limited success to date. That said, some interesting technology in the form of Ultrahaptics and a “shock suit” may inspire some ideas.

In the meantime, researchers have turned to leveraging a bit of psychology to close the gap between what is real and what can be accurately sensed. They conjectured that, “…a large part of the haptic information necessary to discriminate softness of objects by touch is contained in the law that relates resultant contact force to the overall area of contact, or in other terms in the rate by which the contact area spreads over the finger surface as the finger is increasingly pressed on the object.”

They call this relationship the Contact Area Spread Rate (CASR), and while not complete, this spread may be a good surrogate for the complete sense of touch, if we rely upon the human brain to interpret it effectively. The researchers then began to investigate the possibility that a simplified form of tactile data could convey enough information to allow “…satisfactory discrimination of softness, while allowing practical construction of devices for practical applications.”
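The CASR idea can be illustrated with a toy model: for the same increase in applied force, a compliant object spreads contact area over the fingertip faster than a stiff one. The power-law relation and coefficients below are invented for illustration and are not the researchers’ data.

```python
# Toy illustration of the Contact Area Spread Rate (CASR): softer
# objects spread contact area faster per unit force. The model and
# constants are hypothetical.

def contact_area(force_n, stiffness):
    """Contact area (mm^2) grows sublinearly with force, and grows
    faster for softer (lower-stiffness) objects."""
    return 100.0 * (force_n / stiffness) ** (2.0 / 3.0)

def casr(force_a, force_b, stiffness):
    """Average spread rate (mm^2 per newton) between two force levels."""
    delta_area = contact_area(force_b, stiffness) - contact_area(force_a, stiffness)
    return delta_area / (force_b - force_a)

soft = casr(0.5, 2.0, stiffness=5.0)    # compliant object
hard = casr(0.5, 2.0, stiffness=50.0)   # stiff object

# The softer object's area spreads faster -- the cue CASR exploits.
assert soft > hard
```

Discriminating softness then reduces to comparing these spread rates, a far simpler signal than a full spatial map of skin deformation.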

To prove out the conjecture, the team coated a thin film of piezoelectric (or piezoresistive; both work) material with metallic conductive layers on each side, and placed suitable instrumentation across the conductive layers to measure the voltage: a charge amplifier for the piezoelectric case, or a Wheatstone bridge and differential amplifier for the piezoresistive case (Figure 1).

Figure 1: Researchers showed that it’s possible to use relatively simple technology, in this case a layer of piezoelectric material coated on both sides with a metallic conductor, to acquire enough tactile information to control a haptic display or feedback mechanism.
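For the piezoresistive case, the readout can be sketched numerically: pressing on the film changes the sensing element’s resistance, which unbalances a Wheatstone bridge and produces a small differential voltage. The component values below are illustrative, not from the paper.

```python
# Sketch of reading a piezoresistive tactile element with a quarter
# Wheatstone bridge: one arm is the sensing film, the other three are
# fixed resistors. Values are illustrative.

V_EX = 5.0      # bridge excitation voltage (V)
R_NOM = 1000.0  # nominal resistance of each bridge arm (ohms)

def bridge_output(delta_r):
    """Differential output (V) when the sensing arm changes from
    R_NOM to R_NOM + delta_r under fingertip pressure."""
    r_sense = R_NOM + delta_r
    v_plus = V_EX * r_sense / (r_sense + R_NOM)  # sensing half-bridge
    v_minus = V_EX * 0.5                         # fixed reference half
    return v_plus - v_minus

# A 1% resistance change yields roughly V_EX/4 * (dR/R), about 12.4 mV,
# which the differential amplifier then boosts for sampling.
print(bridge_output(10.0))
```

The charge-amplifier path for the piezoelectric case serves the same purpose: converting a tiny mechanical disturbance into a measurable voltage.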

The signals to be measured comprise two time-varying analog signals (force and area of contact), which correlate with a time-varying spatial distribution of pressures that must be sampled in both time and space. After much direct and psychophysical experimentation with numerous subjects, the team was encouraged, having shown that the device could provide enough information to control the haptic display.
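The reduction from a spatial pressure sample to the two signals of interest can be sketched as follows. The grid values, taxel size, and contact threshold are synthetic, for illustration only.

```python
# At each time sample, reduce a spatial grid of pressure readings
# (taxels) to the two CASR signals: total force and contact area.
# All numbers here are synthetic.

CONTACT_THRESHOLD = 0.05  # pressure (N per taxel) counted as "in contact"
TAXEL_AREA_MM2 = 1.0      # area represented by one sensing element

def force_and_area(pressure_grid):
    """Reduce one spatial sample to (total force, contact area)."""
    force = sum(p for row in pressure_grid for p in row)
    area = TAXEL_AREA_MM2 * sum(
        1 for row in pressure_grid for p in row if p > CONTACT_THRESHOLD)
    return force, area

# Two time samples of a 3x3 taxel grid: pressing harder both raises
# the total force and spreads the contact area.
t0 = [[0.0, 0.1, 0.0],
      [0.1, 0.4, 0.1],
      [0.0, 0.1, 0.0]]
t1 = [[0.1, 0.3, 0.1],
      [0.3, 0.9, 0.3],
      [0.1, 0.3, 0.1]]

for grid in (t0, t1):
    print(force_and_area(grid))
```

Tracking how area grows with force across successive samples yields the CASR signal that drives the haptic display.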

While sensing tactile data is difficult, it’s even trickier to accurately present that data to the human hand with any useful level of resolution. Being able to do so has been a goal of a company called Ultrahaptics, which we’ve discussed in the past. They use focused ultrasound to present the sensation of buttons and knobs that aren’t there. Key target applications are medical, of course, as the fewer surfaces a doctor (or anyone else, for that matter) touches, the better the odds of escaping the hospital without contracting MRSA.

Ultrahaptics is well on its way and may also prove useful for VR applications, but it doesn’t match the Teslasuit for plain “cool” factor. Tesla Studios’ Teslasuit is a full-body haptic suit for a fuller, more interactive VR experience. The only catch is that it was to provide feedback using electrical signals to stimulate the muscles or skin surface. Essentially, it shocks you.

It started as a Kickstarter project and raised some money, but was cancelled in February. Maybe shocking users isn’t the best means of providing feedback. We’re definitely looking forward to more innovation here as VR and AR gather momentum.

This article first appeared on EDN.