Community Repository Search Results

resource research Media and Technology
Zooming user interfaces are increasingly popular on mobile devices with touch screens. Swiping and pinching finger gestures anywhere on the screen manipulate the displayed portion of a page, and taps open objects within the page. This makes navigation easy but limits other manipulations of objects that would be supported naturally by the same gestures, notably cut and paste, multiple selection, and drag and drop. A popular device that suffers from this limitation is Apple’s iPhone. In this paper, we present Bezel Swipe, an interaction technique that supports multiple selection, cut, copy
DATE:
TEAM MEMBERS: Volker Roth Thea Turner
resource evaluation
In recent years, a large amount of software for multitouch interfaces with various degrees of similarity has been written. In order to improve interoperability, we aim to identify the common traits of these systems and present a layered software architecture which abstracts these similarities by defining common interfaces between successive layers. This provides developers with a unified view of the various types of multitouch hardware. Moreover, the layered architecture allows easy integration of existing software, as several alternative implementations for each layer can co-exist. Finally
DATE:
TEAM MEMBERS: Florian Echtler Gudrun Klinker Jim Spadaccini
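The layered architecture described above can be sketched in a few lines: a hardware abstraction layer emits normalized touches through a common interface, and the layer above consumes them without knowing which back end produced them. The names below (`RawTouch`, `HardwareLayer`, `FakeCamera`) are illustrative assumptions, not the interfaces actually defined in the paper.

```python
from dataclasses import dataclass
from typing import List, Protocol

# Hypothetical layer interfaces; the paper's actual names and
# signatures may differ.

@dataclass
class RawTouch:
    """Normalized touch point emitted by the hardware abstraction layer."""
    touch_id: int
    x: float  # normalized to [0, 1]
    y: float

class HardwareLayer(Protocol):
    """Common interface every hardware back end must satisfy."""
    def poll(self) -> List[RawTouch]: ...

class InterpretationLayer:
    """Consumes normalized touches; any HardwareLayer implementation plugs in."""
    def __init__(self, hardware: HardwareLayer):
        self.hardware = hardware

    def frame(self) -> List[RawTouch]:
        return self.hardware.poll()

class FakeCamera:
    """Stand-in for one of several interchangeable hardware back ends."""
    def poll(self) -> List[RawTouch]:
        return [RawTouch(touch_id=0, x=0.5, y=0.5)]

layer = InterpretationLayer(FakeCamera())
print(len(layer.frame()))  # prints 1: one touch this frame
```

Because each layer only depends on the interface below it, alternative implementations of any layer can coexist, which is the interoperability point the abstract makes.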
resource evaluation
Recent advances in touch screen technology have increased the prevalence of touch screens and have prompted a wave of new touch screen-based devices. However, touch screens are still largely inaccessible to blind users, who must adopt error-prone compensatory strategies to use them or find accessible alternatives. This inaccessibility is due to interaction techniques that require the user to visually locate objects on the screen. To address this problem, we introduce Slide Rule, a set of audio-based multi-touch interaction techniques that enable blind users to access touch screen applications
DATE:
TEAM MEMBERS: Jim Spadaccini Jeffrey Bigham Jacob Wobbrock
resource research Media and Technology
Most current multi-touch capable interactive user interfaces for tabletops are built from custom toolkits that are decoupled from, and on top of, the “Desktop” provided by the underlying Operating System. However, this approach requires that each individual touch system build its own suite of touch capable custom applications (such as photo browsers), usually resulting in limited functionality. In this paper, we propose a software architecture for supporting and integrating multi-touch capability on existing desktop systems, where multi-touch and multiple single pointer input can be used
DATE:
TEAM MEMBERS: Kelvin Cheng Benjamin Itzstein Paul Sztajer Markus Rittenbruch
resource research Media and Technology
Despite the considerable quantity of research directed towards multitouch technologies, a set of standardized UI components has not been developed. Menu systems provide a particular challenge, as traditional GUI menus require a level of pointing precision inappropriate for direct finger input. Marking menus are a promising alternative, but have yet to be investigated or adapted for use within multitouch systems. In this paper, we first investigate the human capabilities for performing directional chording gestures, to assess the feasibility of multitouch marking menus. Based on the positive
DATE:
TEAM MEMBERS: Julian Lepinski Tovi Grossman George Fitzmaurice
resource research Media and Technology
Watching a long unedited video is usually a boring experience. In this paper we examine a particular subset of videos, tour videos, in which the video is captured by walking about with a running camera with the goal of conveying the essence of some place. We present a system that makes the process of sharing and watching a long tour video easier, less boring, and more informative.
DATE:
TEAM MEMBERS: Michael Cohen Jue Wang Suporn Pongnumkul
resource research Media and Technology
Multi-touch gestures have become popular on a wide range of touchscreen devices, but the programming of these gestures remains an art. It is time-consuming and error prone for a developer to handle the complicated touch state transitions that result from multiple fingers and their simultaneous movements. In this paper, we present Gesture Coder, which by learning from a few examples given by the developer automatically generates code that recognizes multi-touch gestures, tracks their state changes and invokes corresponding application actions. Developers can easily test the generated code in
DATE:
TEAM MEMBERS: Hao Lü Yang Li
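The touch state transitions the abstract calls time-consuming and error-prone can be illustrated with a small hand-written recognizer: track which fingers are down, whether they moved, and map the resulting state to a gesture name. This is a sketch of the bookkeeping Gesture Coder automates, not the code it actually generates; the event names and gesture rules here are invented for illustration.

```python
# Hand-written touch-state tracking of the kind Gesture Coder generates
# automatically from examples. Event tuples are ("down"|"move"|"up", touch_id).

def classify(events):
    """Track finger count and movement, then name the gesture."""
    down = set()      # touch ids currently on the screen
    max_down = 0      # most fingers down at any one time
    moved = False
    for kind, touch_id in events:
        if kind == "down":
            down.add(touch_id)
            max_down = max(max_down, len(down))
        elif kind == "move":
            moved = True
        elif kind == "up":
            down.discard(touch_id)
    if max_down == 2 and moved:
        return "two-finger-pan"
    if max_down == 1 and not moved:
        return "tap"
    return "unknown"

print(classify([("down", 1), ("up", 1)]))  # prints tap
```

Even this toy version shows why the transitions get complicated quickly: every new gesture multiplies the states that must be distinguished, which is exactly the burden learning from examples removes.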
resource research Media and Technology
Modern mobile phones can store a large amount of data, such as contacts, applications and music. However, it is difficult to access specific data items via existing mobile user interfaces. In this paper, we present Gesture Search, a tool that allows a user to quickly access various data items on a mobile phone by drawing gestures on its touch screen. Gesture Search contributes a unique way of combining gesture-based interaction and search for fast mobile data access. It also demonstrates a novel approach for coupling gestures with standard GUI interaction. A real world deployment with mobile
DATE:
TEAM MEMBERS: Yang Li
resource research Media and Technology
In this paper, we propose Objects, Containers, Gestures, and Manipulations (OCGM, pronounced like Occam’s Razor) as universal foundational metaphors of Natural User Interfaces. We compare OCGM to existing paradigms using SRK behavior classification and early childhood cognitive development, and justify the “universal” and “foundational” descriptors based upon cognitive linguistics and universal grammar. If adopted, OCGM would significantly improve the conceptual understanding of NUIs by developers and designers and ultimately result in better NUI applications.
DATE:
TEAM MEMBERS: Ron George Joshua Blake
resource research Media and Technology
We introduce our view of the relation between symbolic gestures and manipulations in multi-touch Natural User Interfaces (NUI). We identify manipulations, not gestures, as the key to truly natural interfaces. Therefore, we suggest that future NUI research should be more focused on designing visual workspaces and model-world interfaces that are especially appropriate for multi-touch manipulations.
DATE:
TEAM MEMBERS: Hans-Christian Jetter Jens Gerken Harald Reiterer
resource research Media and Technology
Proton is a novel framework that addresses both of these problems. Using Proton, the application developer declaratively specifies each gesture as a regular expression over a stream of touch events. Proton statically analyzes the set of gestures to report conflicts, and it automatically creates gesture recognizers for the entire set. To simplify the creation of complex multitouch gestures, Proton introduces gesture tablature, a graphical notation that concisely describes the sequencing of multiple interleaved touch actions over time.
DATE:
TEAM MEMBERS: Jim Spadaccini Kenrick Kin Björn Hartmann Tony DeRose Maneesh Agrawala
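The idea of declaring a gesture as a regular expression over a touch event stream can be sketched directly with Python's `re` module: encode each event as one symbol and match the whole stream. The encoding and gesture definitions below are illustrative assumptions, not Proton's actual declaration syntax.

```python
import re

# Illustrative encoding, not Proton's syntax: each touch event becomes
# one symbol — D(own), M(ove), U(p) — and a gesture is a regular
# expression over the resulting string.

GESTURES = {
    "tap": re.compile(r"DU"),      # touch down, then up, no movement
    "drag": re.compile(r"DM+U"),   # down, one or more moves, up
}

def recognize(events):
    """Match a list of event names against the declared gestures."""
    stream = "".join(e[0].upper() for e in events)  # "down" -> "D", etc.
    for name, pattern in GESTURES.items():
        if pattern.fullmatch(stream):
            return name
    return None

print(recognize(["down", "move", "move", "up"]))  # prints drag
```

Expressing gestures this way is what makes Proton's static conflict analysis possible: whether two gestures can match the same event prefix reduces to a question about the regular languages they denote.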
resource research
The NMC Horizon Report: 2011 Museum Edition is a co-production with the Marcus Institute for Digital Education in the Arts (MIDEA), and examines emerging technologies for their potential impact on and use in education and interpretation within the museum environment. The international composition of the advisory board reflects the care with which a global perspective for the report was assembled. While there are many local factors affecting the adoption and use of emerging technologies in museums, there are also issues that transcend regional boundaries and questions we all face. It was with
DATE:
TEAM MEMBERS: Larry Johnson Holly Witchey