Voices of Industry Channel

It takes a community of embedded designers to develop the myriad applications that are newly available each day. This series gives voice to the members of the embedded design community as they share their experiences and knowledge in their expert application domains so that designers in other application domains can benefit from their lessons learned.

Get in Touch with Your Inner User Interface

Thursday, July 15th, 2010 by Ville-Veikko Helppi

Touchscreens have gone from fad to “must have” seemingly overnight. The rapid growth of touchscreen user interfaces in mobile phones, media players, navigation systems, point-of-sale, and various other devices has changed the landscape in a number of vertical markets. In fact, original design manufacturers (ODMs) see the touchscreen as a way to differentiate their devices and compete against one another in an ever-expanding marketplace. But ODMs take note – a touchscreen alone will not solve the problem of delivering a fantastic user experience. If the underlying user interface is not up to snuff, the most amazing whiz-bang touchscreen won’t save you.

Touchscreens have come a long way from the early ’90s applications where they were used in primitive sales kiosks and public information displays. These devices were not cutting-edge masterpieces, but they did help jump-start the industry and expose large audiences (and potential future users) to the possibilities of what this type of technology might offer. It wasn’t until a decade later that consumers saw the major introduction of touchscreens – and the reason for this was pretty simple: the hardware was just too big and too expensive. Touchscreens became more usable and more pervasive only after hardware sizes shrank significantly.

Today there are a host of options in touchscreen technology. These include resistive, projected-capacitive, surface-capacitive, surface acoustic wave, and infrared, to name a few. According to DisplaySearch, a display market research organization, resistive displays now occupy 50 percent of the market due to their cost-effectiveness, consistency, and durable performance, while projected-capacitive has 31 percent of the market. In total, more than 600 million touchscreens shipped in 2009. DisplaySearch also forecasts that projected-capacitive touchscreens will soon pass resistive screens as the number-one touchscreen technology (measured by revenue), driven in part by the Apple iPad’s use of projected-capacitive technology. And finally, according to Gartner, the projected-capacitive touchscreen segment is estimated to hit 1.3 billion units by 2012, which translates to a 44 percent compound annual growth rate. These estimates indicate serious growth potential in the touchscreen technology sector.

However, growth ultimately hinges on customer demand. Some device classes, such as safety- and mission-critical systems, still do not utilize the capabilities found in touchscreens. This is because with mission-critical systems, there is very little room for input mistakes made by the user. In many cases, touchscreens are considered a more fault-sensitive input method than old-fashioned button- and switch-based input mechanisms. For some companies, the concern is not faulty user input but cost: adding a $30 touchscreen is not an option when it adds no value that justifies a higher price point.

So what drives touchscreen adoption? Adoption is driven mainly by three factors:

  1. Lowering the cost of the hardware
  2. Testing and validating new types of touchscreen technologies in the consumer space, then pushing those technologies into other vertical markets
  3. Aesthetic and ease-of-use appeal – a sexier device gains more attention than its not-so-sexy non-touchscreen cousin

This is true regardless of the type of device, whether it’s a juice blender, glucose monitor, or infotainment system in that snazzy new BMW.

The second part in this four-part series explores the paradigm shift in user interfaces that touchscreens are causing.

Eating dog food? It’s all in the preparation.

Monday, June 28th, 2010 by Jason Williamson

Altia provides HMI (human machine interface) engineering tools to companies in industries like automotive, medical, and white goods. When you’re providing interface software, it makes sense to use your own tools for “real” work, just as your customers would. Not only do you prove you know your own product, but you get an invaluable “user’s perspective” into the workings of your software. You get the opportunity to see where your tools shine and where they are lacking, allowing your team to plan for new features to make them better. Through our own “dog fooding” experiences, we have developed some valuable guidelines that we believe make the process go more smoothly.

First, it is important to use only released versions of the product. It is tempting to pull the latest beta capabilities into a project, but this is a perilous course. There is a reason that feature hasn’t been released: it hasn’t been through the full test cycle. You cannot risk the project schedule or the quality of what is delivered. Producing quality on time is why you’ve been engaged in the first place. Another reason to stick with the released versions of your tools is that you should approach all of your consulting work with the idea that the customer will ultimately need to maintain the project. They need to know that the features and output used in the creation of the project are mature and trustworthy.

The next guideline addresses releases and your revision control system. A revision control system is the repository where all versions of product source code are stored. This often includes the “golden” release versions of the product as well as in-development “sandboxes.” We structure our revision control system such that release-worthy code for new features is kept in a nearly ready-to-release state as the next version of our product. That is, whole feature groups should be checked in together and tested to the extent that only running the overall test suite is needed to create a product. That way, if a new feature absolutely must be used in a project, you have a lower barrier to an interim release.
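The branch discipline described above can be sketched with plain git commands. This is a hypothetical miniature, not Altia’s actual repository: the branch name, tag names, and file contents are invented for illustration. The point it demonstrates is that a feature group lives in its own sandbox branch, lands on the release line as one complete, tested unit, and leaves the release line always one tag away from an interim release.

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git checkout -qb main
git config user.email "dev@example.com"
git config user.name "Dev"

# "Golden" release line: the last shipped version is tagged.
echo "1.0" > VERSION
git add VERSION
git commit -qm "Release 1.0"
git tag v1.0

# Sandbox: the whole feature group is developed on its own branch.
git checkout -qb feature/multitouch
echo "multitouch support" > feature.txt
git add feature.txt
git commit -qm "Add multitouch feature group"

# Merge to the release line only when the group is complete and tested;
# --no-ff keeps the feature group visible as a single merge commit.
git checkout -q main
git merge -q --no-ff -m "Merge tested feature group" feature/multitouch

# An interim release is now just a tag on the release line.
git tag v1.1-interim
git tag
```

With this shape, cherry-picking half-finished work into a release never arises: either the feature group’s merge commit is on the release line and fully tested, or it is still in its sandbox.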

Finally, it is very important to spend sufficient time architecting the project. When deadlines rapidly approach, it is tempting to take shortcuts to the end result. Since you know your software so well, you can convince yourself that these shortcuts will not be a detriment to the delivered product. However, this is almost always a shortsighted choice. When handing off the design to another person, especially a valued customer, a well-documented and rigorously followed architecture is paramount. Your customers need to own and usually extend this design. There should be no “duct tape” in it. Who would want to receive that call to explain a kludge four years after the project has been delivered?

I encourage you to have a hearty helping of your own dog food. Not only do you serve up a result that will please your customer, but you learn by experience where you can make your software stronger and more capable. By developing with current releases, by keeping new features tested and ready to go, and by taking appropriate measures to architect the project, you make the eating of your own dog food a gourmet experience — and keep your customers coming back for seconds.