Alternative touch interfaces – sensor fusion

Tuesday, September 21st, 2010 by Robert Cravotta

While trying to uncover and highlight different technologies that embedded developers can tap into to create innovative touch interfaces, Andrew commented on e-field technology and pointed to Freescale’s sensors. In exploring proximity sensing for touch applications, I realized that accelerometers represent yet another alternative sensing technology (versus capacitive touch) that can affect how a user interacts with a device. The most obvious examples are the growing number of smartphones and tablets that can detect their orientation relative to the ground and rotate the information they are displaying. This kind of sensing lets interface developers consider broader gestures that involve manipulating the device itself, such as shaking it, to signal a change in context.
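To make the orientation example concrete, here is a minimal sketch in C of how firmware might classify screen orientation from the gravity vector reported by a 3-axis accelerometer. The axis conventions and the classify() function are my own assumptions for illustration, not any particular vendor’s algorithm.

    #include <stdio.h>
    #include <math.h>

    typedef enum { PORTRAIT, PORTRAIT_INVERTED, LANDSCAPE_LEFT, LANDSCAPE_RIGHT } orientation_t;

    /* Pick the orientation whose axis carries most of the 1 g gravity
     * reading. Sign conventions depend on how the part is mounted. */
    orientation_t classify(double ax, double ay)
    {
        if (fabs(ax) > fabs(ay))
            return (ax > 0.0) ? LANDSCAPE_LEFT : LANDSCAPE_RIGHT;
        return (ay > 0.0) ? PORTRAIT : PORTRAIT_INVERTED;
    }

    int main(void)
    {
        /* Example reading (in g): gravity mostly along +y, device held upright. */
        printf("orientation = %d\n", classify(0.1, 0.95));  /* PORTRAIT */
        return 0;
    }

In practice, a dead band and a short debounce timer keep the display from flip-flopping when the device is held nearly flat.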

Wacom’s Bamboo Touch graphic tablet for consumers presents another example: e-field proximity sensing combined with capacitive touch sensing. In this case, the user can work on the sensing surface with an e-field-optimized stylus or directly with a finger. The tablet controller detects which type of sensing it should use without requiring the user to explicitly switch between the two technologies. This type of combined technology is finding its way into tablet computers.
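The arbitration between the two sensing paths can be as simple as a priority rule. The following C fragment is a hypothetical sketch (Wacom has not published its controller logic): stylus proximity takes precedence over finger contact, so a palm resting on the surface while writing does not generate strokes.

    #include <stdbool.h>
    #include <stdio.h>

    typedef enum { INPUT_NONE, INPUT_FINGER, INPUT_PEN } input_mode_t;

    /* Hypothetical priority rule: a stylus in range overrides touch. */
    input_mode_t arbitrate(bool pen_in_range, bool finger_contact)
    {
        if (pen_in_range)
            return INPUT_PEN;
        if (finger_contact)
            return INPUT_FINGER;
        return INPUT_NONE;
    }

    int main(void)
    {
        /* Palm on the surface while the pen hovers: pen still wins. */
        printf("mode = %d\n", arbitrate(true, true));  /* INPUT_PEN */
        return 0;
    }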

I predict the market will see more examples of end devices that seamlessly combine different types of sensing technologies in the same interface space. The different sensing modules working together will enable the device to infer more about the user’s intention, which will, in turn, enable the device to better learn and adapt to each user’s interface preferences. To accomplish this, devices will need even more “invisible” processing and database capabilities than their predecessors.
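As a simple illustration of that kind of fusion (my own example, not a shipping algorithm), a touch controller could consult the accelerometer before accepting a touch event, so that a device being jostled in a pocket or shaken does not register phantom taps. The threshold below is an assumed placeholder.

    #include <stdbool.h>
    #include <stdio.h>
    #include <math.h>

    #define MOTION_LIMIT_G 1.8  /* assumed: well above the steady 1 g of gravity */

    /* Fuse the two sensors: only accept a touch when the device is not
     * undergoing strong motion at the same time. */
    bool accept_touch(bool touch_down, double ax, double ay, double az)
    {
        double g = sqrt(ax * ax + ay * ay + az * az);
        return touch_down && (g < MOTION_LIMIT_G);
    }

    int main(void)
    {
        /* Touch reported while the device is being shaken at about 2.4 g. */
        printf("accepted = %d\n", accept_touch(true, 2.0, 1.0, 1.0));  /* 0 */
        return 0;
    }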

While not quite ready for production designs, the recent machine touch demonstrations from the Berkeley and Stanford research teams suggest that future devices might even be able to infer user intent from how the user is holding the device, including how firmly or lightly they are gripping or pressing on it. These demonstrations suggest that we will be able to make machines that can discern pressure differences with a sensitivity comparable to human touch. What is not yet clear is whether these technologies will be able to detect surface textures.
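To sketch what inferring intent from grip might look like in firmware (purely speculative; the research demonstrations have not published production interfaces), here is a C fragment that reduces an array of pressure readings to a coarse grip classification. The sensel count and thresholds are invented for illustration.

    #include <stdio.h>

    #define NUM_SENSELS 16

    typedef enum { GRIP_NONE, GRIP_LIGHT, GRIP_FIRM } grip_t;

    /* Average the pressure map and bin it; a real system would look at
     * the spatial pattern as well, not just the mean. */
    grip_t classify_grip(const double kpa[NUM_SENSELS])
    {
        double sum = 0.0;
        for (int i = 0; i < NUM_SENSELS; i++)
            sum += kpa[i];
        double mean = sum / NUM_SENSELS;

        if (mean < 1.0)  return GRIP_NONE;   /* placeholder thresholds */
        if (mean < 15.0) return GRIP_LIGHT;
        return GRIP_FIRM;
    }

    int main(void)
    {
        double light[NUM_SENSELS];
        for (int i = 0; i < NUM_SENSELS; i++)
            light[i] = 5.0;                          /* 5 kPa everywhere */
        printf("grip = %d\n", classify_grip(light)); /* GRIP_LIGHT */
        return 0;
    }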

By combining, or fusing, different sensing technologies, along with in-device databases, devices may be able to start recognizing real-world objects, similar to the Microsoft Surface. It is coming within our grasp for devices to recognize each other without requiring explicit electronic data streams to flow between them.

Do you know of other sensing technologies that developers can combine to enable smarter devices that learn how their user communicates, rather than requiring the user to learn how to communicate with the device?

One Response to “Alternative touch interfaces – sensor fusion”

  1. Ed says:

    Kionix is one of the inertial sensing companies doing some interesting work with new algorithms for user interfaces that take advantage of the accelerometers, gyroscopes, and compasses built into so many portable products these days. They were the first with DirectionalTap and Double Tap, and they have also been showing an advanced gesture-recognition capability they call Gesture Designer. Worth checking out.
