Reference Information
Title: Pen + Touch = New Tools
Names of authors: Ken Hinckley, Koji Yatani, Michel Pahud, Nicole Coddington, Jenny Rodenhouse, Andy Wilson, Hrvoje Benko, and Bill Buxton
Conference: UIST '10 Proceedings of the 23rd annual ACM symposium on User interface software and technology
Summary:
Prior to this research, we had multi-touch interfaces (usually built on capacitive touchscreens) and pen interfaces (usually built on resistive touchscreens). With the rise of tablet computers, however, devices needed an interface that lets users touch and gesture with their fingers while also using a pen for inking and note-taking. This paper presents research that makes this possible. The authors used a Microsoft Surface and an LED pen for their experiments, and they describe techniques for direct pen + touch input. The motivation came from observing people's manual behaviors with physical paper and notebooks. These observations serve as the foundation for a prototype Microsoft Surface application centered on note-taking and scrapbooking of materials. Based on the observations and explorations, the authors advocate a division of labor between pen and touch: the pen writes, touch manipulates, and the combination of pen + touch yields new tools. This articulates how the system interprets unimodal pen, unimodal touch, and multimodal pen + touch inputs, respectively. For example, the user can hold a photo and drag off with the pen to create and place a copy; hold a photo and cross it with a freeform pen stroke to slice it in two; or hold selected photos and tap one with the pen to staple them all together. Touch thus unifies object selection with mode switching of the pen, while the muscular tension of holding the touch serves as the glue that phrases all the inputs together into a unitary multimodal gesture. This helps the UI designer avoid encumbrances such as physical buttons, persistent modes, or widgets that detract from the user's focus on the workspace.
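The division of labor described above can be sketched as a small event dispatcher: a sustained touch pins an object and re-modes the pen, so the same pen stroke means inking on the bare page but becomes a tool when it targets a held object. This is only an illustrative sketch, not the authors' implementation; the `Canvas` class and its method names are hypothetical.

```python
# Minimal sketch (hypothetical, not the paper's code) of the paper's rule:
# the pen writes, touch manipulates, and pen + touch yields new tools.
from dataclasses import dataclass, field

@dataclass
class Canvas:
    held_objects: set = field(default_factory=set)  # objects pinned by a touch
    log: list = field(default_factory=list)         # record of interpreted actions

    def touch_hold(self, obj):
        # A sustained touch selects the object and switches the pen's mode.
        self.held_objects.add(obj)

    def touch_release(self, obj):
        # Releasing the touch ends the phrase and restores plain inking.
        self.held_objects.discard(obj)

    def touch_drag(self, obj, dx, dy):
        # Unimodal touch: direct manipulation (moving, panning), never inking.
        self.log.append(("move", obj, dx, dy))

    def pen_stroke(self, target, stroke):
        if target in self.held_objects:
            # Pen + touch: a held object turns the pen stroke into a tool,
            # e.g. dragging off a held photo creates and places a copy.
            self.log.append(("copy", target))
        else:
            # Unimodal pen: plain inking on the page.
            self.log.append(("ink", stroke))
```

The key design point the sketch captures is that no button or persistent mode is needed: the pen's meaning is determined entirely by whether a touch is holding the target at the moment the stroke occurs.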
Discussion:
Today, we are removing physical hardware buttons from our electronics and substituting touchscreens. People are rapidly switching to tablets, touch-based ebook readers, touch-based navigation systems, and so on. I think this research is revolutionary and provides a major breakthrough in the area of HCI. Some tablets like the iPad do not support pen interaction; instead, a keyboard is displayed on the screen and the respective keys are touched to enter information. However, many people aren't comfortable with this input method. Therefore, some newer tablets like the HP Slate and Asus Eee Slate support both multi-touch and a pen interface, making the device ideal for browsing information as well as for inking / note-taking.
Application:
Wacom Bamboo Pen & Touch