
Alternative Touch Interfaces

Tuesday, September 7th, 2010 by Robert Cravotta

Exploring the different development kits for touch interfaces provides a good example of what makes something an embedded system. To be clear, the human-machine interface between the end device and the user is not an embedded system; however, the underlying hardware and software can be. Let me explain. The user does not care how a device implements the touch interface – what matters to the user is what functions, such as multi-touch, the device supports, and what types of contexts and touch commands the device and applications can recognize and respond to.

[Figure: This programmable rocker switch includes a display that allows the system to dynamically change the context of the switch.]

So, while resistive and capacitive touch sensors are among the most common ways to implement a touch interface in consumer devices, they are not the only options. For example, NKK Switches offers programmable switches that integrate a push button or rocker switch with an LCD or OLED display. In addition to displaying icons and still images, some of these buttons can display a video stream. This allows the system to dynamically change the context of the button and communicate that context to the user in an intuitive fashion. I am in the process of setting up some time with these programmable switches for a future write-up.
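As a concrete illustration, here is a minimal C sketch of how firmware might bind a programmable switch’s displayed icon to its current meaning. The display_show_image() call and the context structure are hypothetical stand-ins rather than NKK’s actual interface; the point is simply that the icon and the press handler always change together.

    #include <stdint.h>
    #include <stddef.h>

    /* Hypothetical display call: a stand-in for whatever interface the
       real programmable switch exposes, stubbed here so the sketch compiles. */
    static void display_show_image(uint8_t image_id)
    {
        (void)image_id; /* would push the icon/image to the switch's LCD/OLED */
    }

    typedef void (*button_action_t)(void);

    /* A context pairs what the switch shows with what a press means. */
    typedef struct {
        uint8_t image_id;       /* icon shown on the switch face */
        button_action_t action; /* handler invoked on a press */
    } switch_context_t;

    static const switch_context_t *current_ctx = NULL;

    /* Re-skin the switch: the icon and the handler always change together,
       so the user always sees the switch's current meaning. */
    void switch_set_context(const switch_context_t *ctx)
    {
        current_ctx = ctx;
        display_show_image(ctx->image_id);
    }

    /* Called from the switch's press interrupt or polling loop. */
    void switch_on_press(void)
    {
        if (current_ctx != NULL && current_ctx->action != NULL)
            current_ctx->action();
    }

With this structure, moving the device from, say, a playback screen to a settings screen is a single call to switch_set_context(), and the user always sees what a press will do.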

Infrared sensing is another alternative for touch interfaces. The infrared proximity sensing offered by Silicon Labs and the infrared multi-touch sensing offered by Microsoft demonstrate the wide range of capabilities that infrared sensors can support at different price points.

Silicon Labs offers several kits that include infrared support. The FRONTPANEL2EK is a demo board that shows how to use capacitive and infrared proximity sensing in an application. The IRSLIDEREK is a demo board that shows how to use multiple infrared sensors together to detect not only the user’s presence, but also the location and motion of the user’s hand. These kits are fairly simple and straightforward demonstrations. The Si1120EK is an evaluation platform that allows a developer to explore infrared sensing in more depth, including advanced 3-axis touchless object proximity and motion sensing.
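As a rough sketch of how a slider like this can work, the C function below estimates a hand’s position along a small array of infrared sensors as an intensity-weighted centroid of the reflectance readings. The sensor count, threshold, and fixed-point scaling are assumptions for illustration, not values taken from the Silicon Labs kits.

    #include <stdint.h>

    #define NUM_SENSORS        3    /* small in-line sensor array, as on a slider board */
    #define PRESENCE_THRESHOLD 40   /* assumed total reflectance for "hand present" */

    /* Estimate the hand's position along the sensor array as an
       intensity-weighted centroid of the reflected-IR readings.
       Returns the position in hundredths of a sensor pitch
       (0 .. (NUM_SENSORS-1)*100), or -1 when nothing is close enough. */
    int32_t slider_position(const uint16_t reading[NUM_SENSORS])
    {
        uint32_t total = 0;
        uint32_t weighted = 0;

        for (int i = 0; i < NUM_SENSORS; i++) {
            total    += reading[i];
            weighted += (uint32_t)reading[i] * (uint32_t)(i * 100);
        }

        if (total < PRESENCE_THRESHOLD)
            return -1; /* no hand near the slider */

        return (int32_t)(weighted / total);
    }

Comparing successive position estimates then yields the direction and speed of the hand, which is all a simple swipe-gesture recognizer needs.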

Working with these kits has given me a greater appreciation of the possible uses for proximity sensing. For example, an end device could place itself into a deep sleep or low-power mode to minimize energy consumption. However, placing a system in its lowest power modes incurs a startup delay when reactivating it. A smart proximity sensor could give the system a few seconds of warning that a user might want to turn it on, so the device could speculatively wake and respond to the user more quickly. In this scenario, the proximity sensing would probably include some method to distinguish likely power-up requests from objects or people that merely pass near the device with no intent of powering it up.
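One simple way to encode that intent check is a dwell-time state machine: the device pre-warms only after the proximity reading stays above a threshold for a sustained interval, and it gives up and returns to deep sleep if no interaction follows. The sketch below assumes hypothetical proximity_read(), millis(), and power-management hooks, and the thresholds are illustrative.

    #include <stdbool.h>
    #include <stdint.h>

    /* Platform hooks: names are illustrative, not a real driver API. */
    extern uint16_t proximity_read(void);      /* larger = object closer */
    extern uint32_t millis(void);              /* monotonic time in ms */
    extern void system_prewarm(void);          /* begin speculative wake-up */
    extern void system_return_to_sleep(void);  /* re-enter deep sleep */

    #define NEAR_THRESHOLD  600U   /* assumed "user is reaching" reading */
    #define DWELL_MS        500U   /* must stay near this long to signal intent */
    #define GIVE_UP_MS     5000U   /* no interaction followed; the guess was wrong */

    typedef enum { SLEEPING, WATCHING, PREWARMED } wake_state_t;

    /* Call periodically (e.g., from a low-power timer tick). The dwell-time
       requirement is what filters out people merely walking past. */
    void wake_task(void)
    {
        static wake_state_t state = SLEEPING;
        static uint32_t t_near, t_warm;
        bool near = proximity_read() > NEAR_THRESHOLD;
        uint32_t now = millis();

        switch (state) {
        case SLEEPING:
            if (near) {
                t_near = now;     /* something approached; start watching */
                state = WATCHING;
            }
            break;
        case WATCHING:
            if (!near) {
                state = SLEEPING; /* a passer-by, not a user */
            } else if (now - t_near >= DWELL_MS) {
                system_prewarm(); /* sustained approach: wake speculatively */
                t_warm = now;
                state = PREWARMED;
            }
            break;
        case PREWARMED:
            if (now - t_warm >= GIVE_UP_MS) {
                system_return_to_sleep(); /* nobody pressed anything */
                state = SLEEPING;
            }
            break;
        }
    }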

Finally, Microsoft’s Surface product demonstrates the other end of touch sensing using an infrared camera system. In essence, the Surface is a true embedded vision system – an implementation detail that the end user does not need to know anything about. In the case of the Surface table, several infrared cameras view a diffusion surface. The diffusion surface has specific optical properties that allow the system software to identify when any object touches the surface of the display. This high-end approach provides a mechanism for the end user to interact with the system using real-world objects found in the environment rather than just special implements, such as a stylus with specific electrical characteristics.
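A hedged sketch of the underlying vision step: in a diffused-illumination setup, an object only appears as a sharp, bright region in the infrared image once it actually contacts the diffuser, so subtracting a stored background frame and thresholding separates real touches from objects merely hovering above the surface. The frame size and threshold below are assumptions, and a system like the Surface would follow this step with blob detection and tracking to turn the mask into touch points.

    #include <stdint.h>

    #define FRAME_W     320   /* assumed camera resolution */
    #define FRAME_H     240
    #define TOUCH_DELTA  50   /* assumed brightness rise at contact */

    /* Build a binary touch mask by subtracting a stored background frame
       from the live infrared frame and thresholding. Objects hovering
       above the diffuser stay dim and blurred, so only real contacts
       survive the threshold. */
    void touch_mask(const uint8_t frame[FRAME_H][FRAME_W],
                    const uint8_t background[FRAME_H][FRAME_W],
                    uint8_t mask[FRAME_H][FRAME_W])
    {
        for (int y = 0; y < FRAME_H; y++) {
            for (int x = 0; x < FRAME_W; x++) {
                int delta = (int)frame[y][x] - (int)background[y][x];
                mask[y][x] = (delta > TOUCH_DELTA) ? 1 : 0;
            }
        }
    }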

The point here is to recognize that there are many ways to implement touch interfaces, including sonic mechanisms. They may not all support touch in the same way, nor support a common minimum command set, but taken together they may enable smarter devices that can better predict the end user’s true expectations and prepare accordingly. What other examples of alternative touch sensing technologies are you aware of?