
Community Repository Search Results

resource research Media and Technology
This paper explores the interactive possibilities enabled when the barrel of a digital pen is augmented with a multi-touch sensor. We present a novel multi-touch pen (MTPen) prototype and discuss its alternate uses beyond those of a standard stylus, such as allowing new touch gestures to be performed using the index finger or thumb and detecting how users grip the device as a mechanism for mode switching. We also discuss the hardware and software implementation challenges in realizing our prototype, and showcase how one can combine different grips (tripod, relaxed tripod, sketch, wrap) and…
DATE:
TEAM MEMBERS: Jim Spadaccini, Hyunyoung Song, Hrvoje Benko, Francois Guimbretiere, Shahram Izadi, Xiang Cao, Ken Hinckley
resource research Media and Technology
Creating and editing large graphs and node-link diagrams are crucial activities in many application areas. For them, we consider multi-touch and pen input on interactive surfaces as very promising. This fundamental work presents a user study investigating how people edit node-link diagrams on an interactive tabletop. The study covers a set of basic operations, such as creating, moving, and deleting diagram elements. Participants were asked to perform spontaneous gestures for 14 given tasks. They could interact in three different ways: using one hand, both hands, as well as pen and hand…
DATE:
TEAM MEMBERS: Mathias Frisch, Jens Heydekorn, Raimund Dachselt
resource research Media and Technology
New mobile devices with large multi-touch displays, such as the iPad, have brought revolutionary changes to the ways users interact with computers. Instead of traditional input devices such as keyboards, touchpads and mice, multi-touch gestures are used as the primary means of interacting with mobile devices. Surprisingly, body-motion gestures are evolving to become a new, natural, and effective way for game players to interact with game consoles in a very similar fashion: in Kinect for Xbox 360, a controller-free gaming experience is made possible by using body-motion gestures to play games.
DATE:
TEAM MEMBERS: Yuan Feng, Zimu Liu, Baochun Li
resource research Media and Technology
This paper outlines research showing a surprising agreement between users in the guessability of multitouch gestures on tabletop surfaces. It also provides more evidence that crowdsourcing gesture mapping will lead to a more complete, intuitive gesture set and potential convergence toward a standard gesture library.
DATE:
TEAM MEMBERS: Jacob Wobbrock, Meredith Morris, Andrew Wilson
resource research Media and Technology
Zooming user interfaces are increasingly popular on mobile devices with touch screens. Swiping and pinching finger gestures anywhere on the screen manipulate the displayed portion of a page, and taps open objects within the page. This makes navigation easy but limits other manipulations of objects that would be supported naturally by the same gestures, notably cut and paste, multiple selection, and drag and drop. A popular device that suffers from this limitation is Apple’s iPhone. In this paper, we present Bezel Swipe, an interaction technique that supports multiple selection, cut, copy…
DATE:
TEAM MEMBERS: Volker Roth, Thea Turner
resource research Media and Technology
Most current multi-touch capable interactive user interfaces for tabletops are built from custom toolkits that are decoupled from, and on top of, the “Desktop” provided by the underlying Operating System. However, this approach requires that each individual touch system build its own suite of touch-capable custom applications (such as photo browsers), usually resulting in limited functionality. In this paper, we propose a software architecture for supporting and integrating multi-touch capability on existing desktop systems, where multi-touch and multiple single-pointer input can be used…
DATE:
TEAM MEMBERS: Kelvin Cheng, Benjamin Itzstein, Paul Sztajer, Markus Rittenbruch
resource research Media and Technology
It is increasingly common for software and hardware systems to support touch-based interaction. While the technology to support this interaction is still evolving, common protocols for providing consistent communication between hardware and software are available. However, this is not true for gesture recognition – the act of translating a series of strokes or touches into a system-recognizable event. Developers often end up writing code for this process from scratch due to the lack of higher-level frameworks for defining new gestures. Gesture recognition can contain a significant amount of…
DATE:
TEAM MEMBERS: Shahedul Huq Khandkar, Frank Maurer
resource research Media and Technology
Despite the considerable quantity of research directed towards multitouch technologies, a set of standardized UI components has not been developed. Menu systems provide a particular challenge, as traditional GUI menus require a level of pointing precision inappropriate for direct finger input. Marking menus are a promising alternative, but have yet to be investigated or adapted for use within multitouch systems. In this paper, we first investigate the human capabilities for performing directional chording gestures, to assess the feasibility of multitouch marking menus. Based on the positive…
DATE:
TEAM MEMBERS: Julian Lepinski, Tovi Grossman, George Fitzmaurice
resource research Media and Technology
Watching a long unedited video is usually a boring experience. In this paper we examine a particular subset of videos, tour videos, in which the video is captured by walking about with a running camera with the goal of conveying the essence of some place. We present a system that makes the process of sharing and watching a long tour video easier, less boring, and more informative.
DATE:
TEAM MEMBERS: Michael Cohen, Jue Wang, Suporn Pongnumkul
resource research Media and Technology
Multi-touch gestures have become popular on a wide range of touchscreen devices, but the programming of these gestures remains an art. It is time-consuming and error-prone for a developer to handle the complicated touch state transitions that result from multiple fingers and their simultaneous movements. In this paper, we present Gesture Coder, which, by learning from a few examples given by the developer, automatically generates code that recognizes multi-touch gestures, tracks their state changes, and invokes corresponding application actions. Developers can easily test the generated code in…
DATE:
TEAM MEMBERS: Hao Lü, Yang Li
resource research Media and Technology
Modern mobile phones can store a large amount of data, such as contacts, applications and music. However, it is difficult to access specific data items via existing mobile user interfaces. In this paper, we present Gesture Search, a tool that allows a user to quickly access various data items on a mobile phone by drawing gestures on its touch screen. Gesture Search contributes a unique way of combining gesture-based interaction and search for fast mobile data access. It also demonstrates a novel approach for coupling gestures with standard GUI interaction. A real-world deployment with mobile…
DATE:
TEAM MEMBERS: Yang Li
resource research Media and Technology
We are in the midst of an explosion of emerging human-computer interaction techniques that redefine our understanding of both computers and interaction. We propose the notion of Reality-Based Interaction (RBI) as a unifying concept that ties together a large subset of these emerging interaction styles. Based on this concept of RBI, we provide a framework that can be used to understand, compare, and relate current paths of recent HCI research as well as to analyze specific interaction designs. We believe that viewing interaction through the lens of RBI provides insights for design and uncovers…
DATE:
TEAM MEMBERS: Robert J.K. Jacob, Audrey Girouard, Leanne M. Hirshfield, Michael S. Horn, Orit Shaer, Erin Treacy Solovey, Jamie Zigelbaum