User interface options continue to grow in richness of expression as the cost and energy requirements of sensors and compute processing continue to drop. The “paper” computing device is one such example, and it hints that touch interfaces may be only the beginning of where user interfaces are headed. Flexible display technologies like E-Ink’s have supported visions of paper computers and handheld computing devices for over a decade. A paper recently released by the Human Media Lab explores the opportunities and challenges of supporting user gestures that involve bending the device display, much as you would bend a piece of paper. A video of the flexible prototype paper phone provides a quick overview of the project.
The demonstration device is based on a flexible display prototype called paper phone (see Figure). The 3.7” flexible electrophoretic display is coupled with a layer of five Flexpoint 2” bidirectional bend sensors that are sampled at 20 Hz. An E-Ink Broadsheet AM 300 Kit with a Gumstix processor drives the display and is capable of completing a refresh in 780 ms for a typical full-screen greyscale image. The prototype is connected to a laptop computer that handles the sensor data processing, the bend gesture recognition, and the sending of images to the display, which supports testing the flexibility and mobility characteristics of the display itself.
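As a rough illustration of how such a sensing pipeline might be organized, the sketch below polls the five bend sensor values at roughly 20 Hz and hands each frame to a recognition callback, mirroring how the prototype forwards raw sensor data to the laptop for processing. The function and parameter names, and the placeholder sensor read, are illustrative assumptions rather than details from the paper.

```python
import time

SENSOR_COUNT = 5          # five Flexpoint bidirectional bend sensors
SAMPLE_RATE_HZ = 20       # sampling rate reported in the paper
SAMPLE_PERIOD = 1.0 / SAMPLE_RATE_HZ


def read_bend_sensors():
    """Placeholder for reading the five raw bend values.

    On the real prototype the embedded hardware samples the sensors and the
    laptop performs the downstream processing; here we simply return zeros.
    """
    return [0.0] * SENSOR_COUNT


def stream_samples(recognizer):
    """Sample the sensors at ~20 Hz and pass each frame to a recognizer."""
    while True:
        start = time.monotonic()
        frame = read_bend_sensors()
        recognizer(frame)  # e.g. bend gesture recognition on the laptop
        elapsed = time.monotonic() - start
        time.sleep(max(0.0, SAMPLE_PERIOD - elapsed))
```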
The paper outlines how the study extends prior work with bend gestures in two important ways: 1) the display provided direct visual feedback in response to the user’s bend commands, and 2) the flexible electronics of the bending layer itself provided physical feedback as the gestures were performed. The study involved two parts. The first part asked participants to define eight custom bend gesture pairs. Gestures were classified according to two characteristics: the location of the force exerted on the display and the polarity (direction) of that force. The configuration of the bend sensors supported recognizing folds or bends at the corners and along the center edge of the display, and the user’s folds could exert force forward or backward at each of these points. Gestures could consist of folding the display at a single location or at multiple locations, as sketched below. The paper acknowledges that other criteria could have been used, such as the amount of force in a fold, the number of repetitions of a fold, and the velocity of a fold, but these were not investigated in this study.
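The location-and-polarity scheme lends itself to a simple data model. The sketch below encodes a bend gesture as one or more (location, polarity) pairs; the specific location names and the example command mapping are illustrative assumptions, not the paper’s own vocabulary.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Tuple


class Location(Enum):
    TOP_CORNER = "top corner"
    BOTTOM_CORNER = "bottom corner"
    SIDE_CENTER = "center of the side edge"


class Polarity(Enum):
    FORWARD = +1   # fold toward the user
    BACKWARD = -1  # fold away from the user


@dataclass(frozen=True)
class Bend:
    location: Location
    polarity: Polarity


@dataclass(frozen=True)
class BendGesture:
    """One or more simultaneous bends at distinct locations."""
    bends: Tuple[Bend, ...]


# Hypothetical example: a single forward fold at the top corner mapped to "next page"
next_page = BendGesture((Bend(Location.TOP_CORNER, Polarity.FORWARD),))
```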
The second part of the study asked participants to use and evaluate the bend gestures they had developed in the context of complete tasks, such as operating a music player or completing a phone call. The tasks were: navigating among twelve application icons; navigating a contact list; playing, pausing, and selecting the previous or next song; navigating a book reader; and zooming in and out on a map. The study found strong agreement among participants on the folding locations, as well as on the polarity of the folds for actions with clear directionality, such as navigating left and right. The paper presents an analysis of the 87 total bend gestures that the ten participants created in building a bend gesture language (seven additional bends were created in the second part of the study), and it discusses shared preferences among the participants.
A second paper from the Human Media Lab presents “Snaplet,” a demonstration prototype of a bend-sensitive device that changes its function and context based on its shape. The video of Snaplet demonstrates the different contexts that the prototype can recognize and adjust to. Snaplet is similar to the paper phone prototype in that it uses bend sensors to classify the shape of the device. Rather than driving specific application commands with bends, deforming the shape of the device determines which applications the device will present to the user and what types of inputs it will accept. The prototype includes pressure sensors to detect touches, and it incorporates a flexible Wacom tablet to enable interaction with a stylus. Deforming the shape of the device is less dynamic than bending it (as in the first paper); rather, the static, unchanging nature of the deformation allows the device’s shape to define the context in which it will work.
When Snaplet is bent into a convex shape, such as a wristband around the user’s arm, it acts as a watch or media player. The user can place the curved display on their arm and hold it in place with Velcro. The video of Snaplet shows the user interacting with the device via a touch screen, with icons and application data displayed appropriately for viewing on the wrist. By holding the device flat in the palm of their hand, the user signals to the device that it should act as a PDA. In this context, the user can take notes or sketch with a Wacom stylus; this form factor is also good for reading a book. The user can signal the device to act as a cell phone by bending the edge of the display with their fingers and then placing the device to their ear.
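A toy sketch of this shape-to-context idea is below: averaged bend sensor readings are classified as convex, flat, or concave, and each shape selects an application context. The sign convention, threshold, and context labels are assumptions made for illustration only.

```python
from enum import Enum, auto


class Shape(Enum):
    CONVEX = auto()   # wrapped around the wrist
    FLAT = auto()     # held flat in the palm
    CONCAVE = auto()  # pinched toward the ear


# Illustrative mapping from the classified shape to an application context
CONTEXT_BY_SHAPE = {
    Shape.CONVEX: "watch / media player",
    Shape.FLAT: "PDA (stylus notes, book reader)",
    Shape.CONCAVE: "phone call",
}


def classify_shape(bend_values, threshold=0.3):
    """Classify averaged bend readings into one of three static shapes.

    Positive values are treated as convex curvature and negative values as
    concave; both the sign convention and the threshold are assumptions.
    """
    curvature = sum(bend_values) / len(bend_values)
    if curvature > threshold:
        return Shape.CONVEX
    if curvature < -threshold:
        return Shape.CONCAVE
    return Shape.FLAT
```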
Using static bends provides visual and tangible cues to the user about the device’s current context and application mode. Holding the device in a concave shape requires a continuous pinch from the user, and that sustained pinch provides haptic feedback signifying there is an ongoing call. When the user releases the pinch, the release of that haptic tension directly corresponds to dropping the call. This means that users can rely on the shape of the device to determine its operating context without visual feedback.
The paper phone and Snaplet prototypes are definitely not ready for production use, but they are good demonstration platforms for exploring how and when bend gestures and deformation of a device’s shape may be practical. Note that these demonstration platforms do not suggest replacing existing forms of user input, such as touch and stylus; rather, they demonstrate how bend gestures can augment those input forms and provide a more natural and richer communication path with electronic devices.