Articles by Ville-Veikko Helppi

Ville-Veikko Helppi is a product marketing manager for the Mentor Embedded Division of Mentor Graphics. He holds a Master of Science in Electrical Engineering (Embedded Systems) and a Master of Science in Economics and Business Administration from the University of Oulu in Finland.

Touchscreen User Interface checklist: criteria for selection

Thursday, August 19th, 2010 by Ville-Veikko Helppi

Touchscreens place greater demands on UI (user interface) design and development methodologies. To select the right technology, designers should always consider the following important topics.

1) All-inclusive designer toolkit. As the touchscreen changes the UI paradigm, one of the most important aspects of the UI design is how quickly the designer can see the behavior of the UI under development. Ideally, this is achieved when the UI technology contains a design tool that allows the designer to immediately observe the behavior of the newly created UI and modify it easily before target deployment.

2) Creation of the “wow factor.” It is essential that UI technology enables developers and even end-users to easily create clever little “wow factors” on the touchscreen UI. These technologies, which allow the rapid creation and radical customization of the UI, have a significant impact on the overall user experience.

3) Controlling the BoM (Bill of Material). For UIs, everything is about the look and feel, ease of use, and how well the UI reveals the capabilities of the device. In some situations, adding a high-resolution screen with a low-end processor is all that’s required to deliver a compelling user experience. Equally important is how the selected UI technology reduces engineering costs related to UI work. Adopting a novel technology that separates software development from UI creation enables greater user experiences without raising the BoM.

4) Code-free customization. Ideally, all visual and interactive aspects of a UI should be configurable without recompiling the software. This can be achieved by providing mechanisms to describe the UI’s characteristics in a declarative way. Such a capability affords rapid customization without any changes to the underlying embedded code base (see the first sketch after this checklist).

5) Open standard multimedia support. In order to enable the rapid integration of any type of multimedia content into a product’s UI (regardless of the target hardware), some form of API standardization must be in place. The OpenMAX standard addresses this need by providing a framework for integrating multimedia software components from different sources, making it easier to exploit silicon-specific features, such as video acceleration (see the second sketch after this checklist).
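
To make item 4 concrete, here is a minimal sketch in C of what code-free customization can look like in practice: the visual properties live in a plain-text "skin" file that is parsed at start-up, so re-skinning the device means editing the file rather than recompiling the firmware. The file name, keys, and values are all hypothetical placeholders, not any particular vendor's format.

/*
 * Illustrative only: UI properties come from a text file, not from code.
 *
 * Hypothetical skin file (skin.cfg):
 *   button_color = 0x2A6EBB
 *   font_size    = 18
 *   title_text   = AquaWash 3000
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

struct ui_theme {
    unsigned long button_color;
    int           font_size;
    char          title_text[64];
};

/* Parse "key = value" lines; unknown keys are silently ignored. */
static int load_theme(const char *path, struct ui_theme *t)
{
    char line[128], key[32], value[64];
    FILE *fp = fopen(path, "r");
    if (!fp)
        return -1;

    while (fgets(line, sizeof(line), fp)) {
        if (sscanf(line, " %31[^= ] = %63[^\n]", key, value) != 2)
            continue;
        if (strcmp(key, "button_color") == 0)
            t->button_color = strtoul(value, NULL, 0);
        else if (strcmp(key, "font_size") == 0)
            t->font_size = atoi(value);
        else if (strcmp(key, "title_text") == 0)
            strncpy(t->title_text, value, sizeof(t->title_text) - 1);
    }
    fclose(fp);
    return 0;
}

int main(void)
{
    struct ui_theme theme = { 0x000000, 12, "Default" };

    /* Re-branding the device only requires shipping a new skin.cfg. */
    if (load_theme("skin.cfg", &theme) != 0)
        fprintf(stderr, "skin.cfg not found, using built-in defaults\n");

    printf("title='%s' color=0x%06lX font=%dpt\n",
           theme.title_text, theme.button_color, theme.font_size);
    return 0;
}

Item 5 mentions OpenMAX; the sketch below shows the general shape of bringing up a multimedia component through the OpenMAX IL core API. The core calls (OMX_Init, OMX_GetHandle, OMX_FreeHandle, OMX_Deinit) come from the published IL specification, but the component name is a vendor-specific placeholder, and a real integration would go on to configure ports and exchange buffers.

/* Minimal OpenMAX IL bring-up sketch; the component name is hypothetical. */
#include <stdio.h>
#include <OMX_Core.h>

static OMX_ERRORTYPE on_event(OMX_HANDLETYPE comp, OMX_PTR app,
                              OMX_EVENTTYPE event, OMX_U32 d1,
                              OMX_U32 d2, OMX_PTR event_data)
{
    printf("component event %d\n", (int)event);
    return OMX_ErrorNone;
}

static OMX_ERRORTYPE on_empty_done(OMX_HANDLETYPE comp, OMX_PTR app,
                                   OMX_BUFFERHEADERTYPE *buf)
{
    return OMX_ErrorNone;   /* input buffer consumed */
}

static OMX_ERRORTYPE on_fill_done(OMX_HANDLETYPE comp, OMX_PTR app,
                                  OMX_BUFFERHEADERTYPE *buf)
{
    return OMX_ErrorNone;   /* decoded output ready for the UI layer */
}

int main(void)
{
    OMX_CALLBACKTYPE cb = { on_event, on_empty_done, on_fill_done };
    OMX_HANDLETYPE decoder = NULL;

    if (OMX_Init() != OMX_ErrorNone)
        return 1;

    /* "OMX.vendor.video_decoder.avc" is a placeholder; the real name
     * comes from the silicon vendor's IL implementation. */
    if (OMX_GetHandle(&decoder, (OMX_STRING)"OMX.vendor.video_decoder.avc",
                      NULL, &cb) == OMX_ErrorNone) {
        /* ... set up ports, allocate buffers, start decoding ... */
        OMX_FreeHandle(decoder);
    }

    OMX_Deinit();
    return 0;
}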

Just recently, Apple replaced Microsoft as the world’s largest technology company. This is a good example of how a company that produces innovative, user-friendly products with compelling user interfaces can fuel the growth of technology into new areas. Remember, the key isn’t necessarily the touchscreen itself – but the user interfaces running on the touchscreen. Let’s see what the vertical markets can do to take the user interface and touchscreen technology to the next level!

Impacts of touchscreens for embedded software

Thursday, August 5th, 2010 by Ville-Veikko Helppi

No question, all layers of the embedded software are impacted when a touchscreen is used on a device. A serious challenge is finding space to visually express a company’s unique brand identity, as it is the software running on the processor that puts the pixels on the screen. From the software point of view, the touchscreen removes one abstraction level between the user and the software. For example, many devices have removed ‘OK’ buttons from dialogs because the user can simply tap anywhere on the dialog instead of hitting a dedicated button.

Actually, software plays an even more critical role as we move into a world where the controls on a device are virtual rather than physical. At the lowest level of software, the touchscreen driver provides a form of mouse emulation: a touch is essentially a mouse click on certain pixels. The difference is that a mouse driver reports its data as “relative” movement, while a touchscreen driver reports it as “absolute” coordinates. Writing the touchscreen driver is usually trivial, as this component only passes information from the physical screen to the higher levels of software. The only inputs the driver needs are a Boolean indicating whether the screen is touched and the x- and y-coordinates of the touch point.
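
As a rough illustration of how small that job is, here is a sketch of the reporting half of such a driver using the Linux input subsystem. The read_touch_sample() routine, the device name, and the 0 to 1023 resolution are all hypothetical stand-ins for the hardware-specific parts.

/* Sketch of a touchscreen driver reporting absolute coordinates. */
#include <linux/init.h>
#include <linux/input.h>
#include <linux/module.h>

static struct input_dev *ts_dev;

/* Hypothetical controller read: fills x/y and says whether the panel is touched. */
extern bool read_touch_sample(int *x, int *y);

/* Called from the controller's interrupt handler (not shown). */
static void ts_report(void)
{
    int x, y;
    bool touched = read_touch_sample(&x, &y);

    /* Absolute coordinates, unlike the relative deltas a mouse reports. */
    if (touched) {
        input_report_abs(ts_dev, ABS_X, x);
        input_report_abs(ts_dev, ABS_Y, y);
    }
    input_report_key(ts_dev, BTN_TOUCH, touched);
    input_sync(ts_dev);                 /* marks the event packet complete */
}

static int __init ts_init(void)
{
    ts_dev = input_allocate_device();
    if (!ts_dev)
        return -ENOMEM;

    ts_dev->name = "example-touchscreen";
    __set_bit(EV_KEY, ts_dev->evbit);
    __set_bit(EV_ABS, ts_dev->evbit);
    __set_bit(BTN_TOUCH, ts_dev->keybit);
    input_set_abs_params(ts_dev, ABS_X, 0, 1023, 0, 0);
    input_set_abs_params(ts_dev, ABS_Y, 0, 1023, 0, 0);

    return input_register_device(ts_dev);
}

static void __exit ts_exit(void)
{
    input_unregister_device(ts_dev);
}

module_init(ts_init);
module_exit(ts_exit);
MODULE_LICENSE("GPL");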

At the operating system level, a touchscreen user interface generates more frequent operating system events than a typical icon- or widget-based user interface. In addition to the touchscreen, a variety of sensors (e.g., accelerometers) may also feed stimuli to the operating system through their drivers. Generally, a standardized operating system gives confidence and consistency to device creation, but if it needs to be changed, the cost can be astronomical because the compatibility of every other component must be retested.
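
The sketch below shows what that event traffic looks like just above the driver, assuming a Linux-style evdev interface; the device node path is hypothetical and board-specific. Every finger movement produces a burst of ABS_X/ABS_Y events, which is why a touchscreen UI sees far more input activity than a button-based one.

/* Sketch: consuming touchscreen events from a Linux evdev node. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <linux/input.h>

int main(void)
{
    struct input_event ev;
    int fd = open("/dev/input/event0", O_RDONLY);   /* hypothetical node */
    if (fd < 0) {
        perror("open");
        return 1;
    }

    while (read(fd, &ev, sizeof(ev)) == sizeof(ev)) {
        if (ev.type == EV_ABS && ev.code == ABS_X)
            printf("x = %d\n", ev.value);
        else if (ev.type == EV_ABS && ev.code == ABS_Y)
            printf("y = %d\n", ev.value);
        else if (ev.type == EV_KEY && ev.code == BTN_TOUCH)
            printf("touch %s\n", ev.value ? "down" : "up");
    }

    close(fd);
    return 0;
}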

The next layer is where the middleware components of the operating system are found, or in this context, where the OpenGL/ES library operates. The various components within this library do different things, from processing raw data with mathematical algorithms and providing a set of APIs for drawing, to interfacing between software and hardware acceleration and providing services such as rendering and font engines. While this type of standardization is generally a good thing, in some cases it can lead to non-differentiation; in the worst case, it might even stifle the inspiration for an innovative user interface. Ideally, a standardized open library combined with rich, easily customizable user interface technology yields superb results.
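
For a feel of that drawing API, here is a minimal sketch using the OpenGL ES 1.1 fixed-function calls. It assumes the platform layer has already created an EGL context and made it current (that setup is not shown), and the function name and colours are purely illustrative.

/* Sketch: drawing a flat themed quad through standard OpenGL ES 1.1 calls. */
#include <GLES/gl.h>

void draw_button_background(void)
{
    /* A single flat quad standing in for a themed button background. */
    static const GLfloat quad[] = {
        -0.8f, -0.3f,
         0.8f, -0.3f,
        -0.8f,  0.3f,
         0.8f,  0.3f,
    };

    glClearColor(0.1f, 0.1f, 0.15f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    glColor4f(0.2f, 0.5f, 0.9f, 1.0f);          /* theme colour */
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, quad);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    glDisableClientState(GL_VERTEX_ARRAY);

    /* eglSwapBuffers() in the platform layer then puts the frame on screen. */
}

The same calls run on any GPU whose vendor ships a conforming ES driver, which is the portability argument made above; differentiation then has to come from the UI layer built on top.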

The application layer is the most visible part of the software and forms the user experience. It is here that developers must ask:

1) Should the application run in full-screen mode, or should it use widgets distributed around the screen?

2) What colors, themes, and templates best illustrate the behavior of the user interface?

3) How small or large should the user interface elements be?

4) In what ways will the user interface elements behave and interact?

5) How intuitive do I want to make this application?

Compelling UI design tools are essential for the rapid creation of user interfaces.

In the consumer space, there are increasingly more competitive brands with many of the same products and product attributes. Manufacturers are hard-pressed to find any key differentiator among this sea of “me too” offerings. One way to stand out is by delivering a rich UI experience via a touchscreen display.

We are starting to see this realization play out in all types of consumer goods, even in white goods as pedestrian as washing machines. Innovative display technologies are now replacing physical buttons and levers. Imagine a fairly standard washing machine with a state-of-the-art LCD panel. This would allow the user to easily browse and navigate all the functions on that washing machine – and perhaps learn a new feature or two. Once an attractive touchscreen display is in place, any customization can be accomplished simply by changing the software running on the display. Therefore, things like changing the branding or adding compelling video clips and company logos become much simpler because everything is driven by software. If the manufacturer uses the right technology, they may not even need to modify the software to change the user experience.

Driven by the mobile phone explosion, the price point of display technology has come down significantly. As a result, washing machine manufacturers can add more perceived value to their product without necessarily adding too much to the BoM (bill of materials). Before the machine leaves the factory, a display technology may increase the BoM by $30, but it could increase the MSRP by at least $100. No doubt, this can have a huge impact on the company’s bottom line. The result is a “win-win” for the manufacturer and for the consumer: the manufacturer can differentiate the product more easily and in a more cost-effective manner, while the consumer gets a product that is easier to use, with an enhanced UI.

The final part in this four-part series presents a checklist for touchscreen projects.

The User Interface Paradigm Shift

Thursday, July 22nd, 2010 by Ville-Veikko Helppi

Touchscreens are quickly changing the world around us. Tapping an image on a touchscreen requires much less thinking and relies far more on user intuition. Touchscreens are also said to be the fastest pointing method available, but that isn’t necessarily true – it all depends on how the user interface is structured. For example, most users accept a ten millisecond delay when scrolling with a cursor and mouse, but with touchscreens this same period of time feels much longer, so the user experience is perceived as less smooth. Also, multi-touch capabilities are not possible with mouse emulation, or at least not as intuitive as with a touchscreen. The industry has done a good job providing a screen pen or stylus to assist the user in selecting the right object on smaller screens, thus silencing the critics of touchscreens who say they are far from ideal as a precise pointing method.

The touchscreen has also changed the nature of UI (user interface) element transitions. The motion of different UI elements can make a real difference in device differentiation and, if implemented properly, tell a compelling story. Every UI element transition must have a purpose and context, as it usually reinforces the UI elements around it. Something as simple as a buffer effect can give a sense of weight to a UI element – and moving these types of elements without a touchscreen would feel awkward. The best user experience is achieved when UI element transitions are natural, are consistent with the other UI components (e.g., widgets, icons, menus), and deliver a solid, tangible feel to the UI. Also, 3D effects during the motion provide a far better user experience.

3D layouts enable more touchscreen-friendly user interfaces.

Recent studies of human behavior, along with documented consumer experiences, indicate that the gestures of modern touchscreens have expanded the ways users can control a device through its UI. As we have seen with the “iPhone phenomenon,” the multi-touch screen changes the reality behind the display, allowing new ways to control the device through hand-eye coordination (e.g., pinching, zooming, rotating). But it’s not just the iPhone that’s driving this change. We’re seeing other consumer products trending toward a simpler user experience and enhanced personal interaction. E-book readers are a perfect example: many of these devices have a touchscreen UI that the user interacts with directly, at an almost subconscious level. This shift has also shown that touchscreens can reduce the number of user inputs required for the basic functioning of a device.
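
The arithmetic behind pinch-zoom and two-finger rotate is simpler than it looks: given two touch points at the start of the gesture and two now, the zoom factor is the ratio of the distances and the rotation is the change in angle. The sketch below is illustrative only, with made-up sample coordinates; a production gesture recognizer would also filter noise and handle fingers lifting and landing.

/* Sketch: pinch-zoom and rotation from two pairs of touch points. */
#include <math.h>
#include <stdio.h>

struct touch_point { float x, y; };

static float distance(struct touch_point a, struct touch_point b)
{
    return hypotf(b.x - a.x, b.y - a.y);
}

static float angle(struct touch_point a, struct touch_point b)
{
    return atan2f(b.y - a.y, b.x - a.x);
}

int main(void)
{
    /* Hypothetical samples: fingers spread apart and twist slightly. */
    struct touch_point start[2] = { { 100, 200 }, { 200, 200 } };
    struct touch_point now[2]   = { {  80, 190 }, { 230, 230 } };

    float zoom   = distance(now[0], now[1]) / distance(start[0], start[1]);
    float rotate = angle(now[0], now[1]) - angle(start[0], start[1]);

    printf("zoom factor %.2f, rotation %.1f degrees\n",
           zoom, rotate * 180.0f / 3.14159265f);
    return 0;
}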

The third part in this four-part series explores the impact of touchscreens on embedded software.

Get in Touch with Your Inner User Interface

Thursday, July 15th, 2010 by Ville-Veikko Helppi

Touchscreens have gone from fad to “must have” seemingly overnight. The rapid growth of touchscreen user interfaces in mobile phones, media players, navigation systems, point-of-sale, and various other devices has changed the landscape in a number of vertical markets. In fact, original device manufacturers (ODMs) see the touchscreen as a way to differentiate their devices and compete against one another in an ever-expanding marketplace. But ODMs take note – a touchscreen alone will not solve the problem of delivering a fantastic user experience. If the underlying user interface is not up to snuff, the most amazing whiz-bang touchscreen won’t save you.

Touchscreens have come a long way from the early ’90s, when they were used in primitive sales kiosks and public information displays. These devices were not cutting-edge masterpieces, but they did help jump-start the industry and expose large audiences (and potential future users) to what this type of technology might offer. It wasn’t until a decade later that consumers saw the major introduction of touchscreens, and the reason for this was pretty simple: the hardware was just too big and too expensive. Touchscreens became more usable and more pervasive only after the hardware shrank significantly.

Today there are a host of options in touchscreen technology. These include resistive, projected-capacitive, surface-capacitive, surface acoustic wave, and infrared, to name a few. According to DisplaySearch, a display market research organization, resistive displays now occupy 50 percent of the market due to their cost-effectiveness, consistency, and durability, while projected-capacitive technology holds 31 percent. In total, more than 600 million touchscreens shipped in 2009. DisplaySearch also forecasts that projected-capacitive touchscreens will soon pass resistive screens as the number one touchscreen technology (measured by revenue) because the Apple iPad uses projected-capacitive technology. And finally, according to Gartner, the projected-capacitive touchscreen segment is estimated to hit 1.3 billion units by 2012, which equates to a 44 percent compound annual growth rate. These estimates indicate serious growth potential in the touchscreen technology sector.

However, growth ultimately hinges on customer demand. Some devices, such as safety- and mission-critical systems, still do not utilize the capabilities found in touchscreens. This is because mission-critical systems leave very little room for user input mistakes. In many cases, touchscreens are considered a more fault-sensitive input method than old-fashioned button- and switch-based input mechanisms. For some companies, the concern is not faulty user input but cost: adding a $30 touchscreen is not an option when it won’t add any value to the product’s price point.

So what drives touchscreen adoption? Adoption is mainly driven by:

  1. Lower hardware costs
  2. The testing and validation of new touchscreen technologies in the consumer space, which are then pushed into other vertical markets
  3. The aesthetic and ease-of-use appeal of a touchscreen – a sexier device gains more attention than its not-so-sexy non-touchscreen cousin

This is true regardless of the type of device, whether it’s a juice blender, glucose monitor, or infotainment system in that snazzy new BMW.

The second part in this four-part series explores the paradigm shift in user interfaces that touchscreens are causing.