
Community Repository Search Results

resource research Media and Technology
Watching a long unedited video is usually a boring experience. In this paper we examine a particular subset of videos, tour videos, in which the video is captured by walking about with a running camera with the goal of conveying the essence of some place. We present a system that makes the process of sharing and watching a long tour video easier, less boring, and more informative.
TEAM MEMBERS: Michael Cohen Jue Wang Suporn Pongnumkul
resource research Media and Technology
Multi-touch gestures have become popular on a wide range of touchscreen devices, but the programming of these gestures remains an art. It is time-consuming and error-prone for a developer to handle the complicated touch state transitions that result from multiple fingers and their simultaneous movements. In this paper, we present Gesture Coder, which by learning from a few examples given by the developer automatically generates code that recognizes multi-touch gestures, tracks their state changes and invokes corresponding application actions. Developers can easily test the generated code in …
TEAM MEMBERS: Hao Lü Yang Li
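The touch state transitions the abstract mentions can be pictured as a small state machine. A minimal hand-written sketch of one such recognizer (illustrative only; the class name, states, and distance threshold below are invented, not Gesture Coder's actual generated code):

```python
# Sketch of a multi-touch state machine of the kind Gesture Coder
# generates automatically; states and transitions here are illustrative.
class PinchRecognizer:
    def __init__(self):
        self.touches = {}    # touch id -> (x, y)
        self.state = "idle"  # idle -> tracking -> pinching

    def on_down(self, tid, x, y):
        self.touches[tid] = (x, y)
        if len(self.touches) == 2:
            self.state = "tracking"
            self._start = self._spread()  # remember initial finger spread

    def on_move(self, tid, x, y):
        if tid in self.touches:
            self.touches[tid] = (x, y)
        if self.state in ("tracking", "pinching") and len(self.touches) == 2:
            # Enter the pinching state once the finger spread changes enough.
            if abs(self._spread() - self._start) > 10:
                self.state = "pinching"

    def on_up(self, tid):
        self.touches.pop(tid, None)
        if len(self.touches) < 2:
            self.state = "idle"

    def _spread(self):
        (x1, y1), (x2, y2) = self.touches.values()
        return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
```

Feeding a down/down/move event sequence drives the recognizer from idle through tracking into pinching, and lifting a finger resets it; hand-writing these transitions for every gesture is exactly the tedium the paper's code generation targets.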
resource research Media and Technology
Modern mobile phones can store a large amount of data, such as contacts, applications and music. However, it is difficult to access specific data items via existing mobile user interfaces. In this paper, we present Gesture Search, a tool that allows a user to quickly access various data items on a mobile phone by drawing gestures on its touch screen. Gesture Search contributes a unique way of combining gesture-based interaction and search for fast mobile data access. It also demonstrates a novel approach for coupling gestures with standard GUI interaction. A real world deployment with mobile …
TEAM MEMBERS: Yang Li
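As a rough illustration of the gesture-plus-search idea (a toy sketch, not Gesture Search's actual matching algorithm), imagine each drawn gesture being recognized as a letter, with the accumulated letters used as a prefix query over data items:

```python
# Toy sketch: drawn gestures arrive as recognized letters; items with a
# word starting with the accumulated letters are returned, mimicking how
# results narrow incrementally as more gestures are drawn.
def search(items, letters):
    query = "".join(letters).lower()
    return [item for item in items
            if any(word.startswith(query) for word in item.lower().split())]

contacts = ["Alice Chen", "Bob Smith", "Charlie Adams"]
print(search(contacts, ["a"]))       # ['Alice Chen', 'Charlie Adams']
print(search(contacts, ["a", "l"]))  # ['Alice Chen']
```

Drawing a second letter prunes the candidate set, which is the incremental-narrowing behavior the abstract describes.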
resource research Media and Technology
We are in the midst of an explosion of emerging human-computer interaction techniques that redefine our understanding of both computers and interaction. We propose the notion of Reality-Based Interaction (RBI) as a unifying concept that ties together a large subset of these emerging interaction styles. Based on this concept of RBI, we provide a framework that can be used to understand, compare, and relate current paths of recent HCI research as well as to analyze specific interaction designs. We believe that viewing interaction through the lens of RBI provides insights for design and uncovers …
TEAM MEMBERS: Robert J.K. Jacob Audrey Girouard Leanne M. Hirshfield Michael S. Horn Orit Shaer Erin Treacy Solovey Jamie Zigelbaum
resource research Media and Technology
This paper demonstrates a pressure-sensitive depth sorting technique that extends standard two-dimensional (2D) manipulation techniques, particularly those used with multi-touch or multi-point controls. It then analyzes the combination of this layering operation with a page-folding metaphor for more fluid interaction in applications requiring 2D sorting and layout.
TEAM MEMBERS: Philip L. Davidson Jefferson Y. Han
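One way to read the layering operation is that pressing harder sinks the touched object further down the z-order. A hedged sketch under that assumption (the function name and the pressure-to-depth mapping are invented for illustration):

```python
def reorder_by_pressure(layers, touched, pressure):
    """Sink `touched` deeper into `layers` (top-to-bottom) as pressure rises.

    `pressure` is assumed normalized to [0, 1]; 0 keeps the object on top
    and 1 pushes it to the bottom of the stack.
    """
    rest = [layer for layer in layers if layer != touched]
    # Map pressure to how many layers the touched object sinks below.
    depth = min(int(pressure * len(layers)), len(rest))
    rest.insert(depth, touched)
    return rest
```

With layers `["photo", "map", "note"]` and full pressure on the photo, the photo drops to the bottom of the stack; a light touch leaves it on top, which is the continuous depth control a 2D drag alone cannot express.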
resource research Media and Technology
In this paper, we propose Objects, Containers, Gestures, and Manipulations (OCGM, pronounced like Occam’s Razor) as universal foundational metaphors of Natural User Interfaces. We compare OCGM to existing paradigms using SRK behavior classification and early childhood cognitive development, and justify the “universal” and “foundational” descriptors based upon cognitive linguistics and universal grammar. If adopted, OCGM would significantly improve the conceptual understanding of NUIs by developers and designers and ultimately result in better NUI applications.
TEAM MEMBERS: Ron George Joshua Blake
resource research Media and Technology
We introduce our view of the relation between symbolic gestures and manipulations in multi-touch Natural User Interfaces (NUI). We identify manipulations, not gestures, as the key to truly natural interfaces. We therefore suggest that future NUI research should focus on designing visual workspaces and model-world interfaces that are especially appropriate for multi-touch manipulations.
TEAM MEMBERS: Hans-Christian Jetter Jens Gerken Harald Reiterer
resource research Media and Technology
Though many tabletop applications allow users to interact with the application using complex multi-touch gestures, automated tool support for testing such gestures is limited. As a result, gesture-based interactions with an application are often tested manually, which is an expensive and error-prone process. In this paper, we present TouchToolkit, a tool designed to help developers automate their testing of gestures by incorporating recorded gestures into unit tests. The design of TouchToolkit was informed by a small interview study conducted to explore the challenges software developers face …
TEAM MEMBERS: Shahedul Huq Khandkar S. M. Sohan Jonathan Sillito Frank Maurer
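The core idea, recorded gestures replayed inside unit tests, can be sketched as follows (the recognizer and the recorded trace are invented stand-ins, not TouchToolkit's API):

```python
import unittest

# A recorded touch-event trace stands in for a human performing the gesture;
# replaying it makes the test repeatable and cheap to run.
RECORDED_TAP = [("down", 10, 10), ("up", 11, 10)]

def recognize(events):
    # Toy recognizer: a down followed directly by an up near the same
    # x position counts as a tap.
    if (len(events) == 2 and events[0][0] == "down" and events[1][0] == "up"
            and abs(events[0][1] - events[1][1]) <= 5):
        return "tap"
    return None

class TestTapGesture(unittest.TestCase):
    def test_recorded_tap_is_recognized(self):
        # Replay the recording instead of performing the gesture by hand.
        self.assertEqual(recognize(RECORDED_TAP), "tap")
```

Replacing live finger input with a recorded trace is what turns manual gesture testing into an ordinary automated unit test.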
resource research Media and Technology
Proton is a novel framework that addresses both of these problems. Using Proton, the application developer declaratively specifies each gesture as a regular expression over a stream of touch events. Proton statically analyzes the set of gestures to report conflicts, and it automatically creates gesture recognizers for the entire set. To simplify the creation of complex multitouch gestures, Proton introduces gesture tablature, a graphical notation that concisely describes the sequencing of multiple interleaved touch actions over time.
TEAM MEMBERS: Jim Spadaccini Kenrick Kin Björn Hartmann Tony DeRose Maneesh Agrawala
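The regular-expression idea can be sketched in a few lines (the D/M/U symbol encoding below is a simplification that drops Proton's touch IDs and attributes):

```python
import re

# Encode each touch event as a symbol (D = touch-down, M = move,
# U = touch-up) and specify each gesture as a regex over the stream.
TAP   = re.compile(r"DM*U")      # down, any number of moves, up
SWIPE = re.compile(r"DM{3,}U")   # down, at least three moves, up

def classify(stream):
    # Check the more specific pattern first, since TAP subsumes SWIPE.
    if SWIPE.fullmatch(stream):
        return "swipe"
    if TAP.fullmatch(stream):
        return "tap"
    return None
```

Note that `TAP` also matches every stream `SWIPE` matches; Proton's static analysis is designed to surface exactly this kind of conflict, which this sketch resolves crudely by checking `SWIPE` first.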
resource research Media and Technology
This article introduces a new interaction model called Instrumental Interaction that extends and generalizes the principles of direct manipulation. It covers existing interaction styles, including traditional WIMP interfaces, as well as new interaction styles such as two-handed input and augmented reality. It defines a design space for new interaction techniques and a set of properties for comparing them.
TEAM MEMBERS: Michael Beaudouin-Lafon
resource research Media and Technology
Despite a long history of using participatory methods to enable public engagement with issues of societal importance, interactive displays have only recently been explored for this purpose. In this paper, we evaluate a tabletop game called Futura, which was designed to engage the public with issues of sustainability. Our design is grounded in prior research on public displays, serious games, and computer supported collaborative learning. We suggest that a role-based, persistent simulation style game implemented on a multi-touch tabletop affords unique opportunities for a walk-up-and-play style …
TEAM MEMBERS: Alissa N. Antle Joshua Tanenbaum Allen Bevans Katie Seaborn Sijie Wang
resource research Media and Technology
Modern smartphones contain sophisticated sensors to monitor three-dimensional movement of the device. These sensors permit devices to recognize motion gestures: deliberate movements of the device by end-users to invoke commands. However, little is known about best practices in motion gesture design for the mobile computing paradigm. To address this issue, we present the results of a guessability study that elicits end-user motion gestures to invoke commands on a smartphone device. We demonstrate that consensus exists among our participants on parameters of movement and on mappings of motion …
TEAM MEMBERS: Jim Spadaccini Jaime Ruiz Yang Li Edward Lank