In this paper, we propose Objects, Containers, Gestures, and Manipulations (OCGM, pronounced like Occam’s Razor) as universal foundational metaphors of Natural User Interfaces. We compare OCGM to existing paradigms using the skills-rules-knowledge (SRK) behavior classification and early childhood cognitive development, and justify the “universal” and “foundational” descriptors based on cognitive linguistics and universal grammar. If adopted, OCGM would significantly improve developers’ and designers’ conceptual understanding of NUIs and ultimately result in better NUI applications.
We introduce our view of the relation between symbolic gestures and manipulations in multi-touch Natural User Interfaces (NUI). We identify manipulations, not gestures, as the key to truly natural interfaces. We therefore suggest that future NUI research focus on designing visual workspaces and model-world interfaces that are especially appropriate for multi-touch manipulations.
TEAM MEMBERS:
Hans-Christian Jetter, Jens Gerken, Harald Reiterer
Though many tabletop applications allow users to interact with the application using complex multi-touch gestures, automated tool support for testing such gestures is limited. As a result, gesture-based interactions with an application are often tested manually, which is an expensive and error-prone process. In this paper, we present TouchToolkit, a tool designed to help developers automate their testing of gestures by incorporating recorded gestures into unit tests. The design of TouchToolkit was informed by a small interview study conducted to explore the challenges software developers face.
TEAM MEMBERS:
Shahedul Huq Khandkar, S. M. Sohan, Jonathan Sillito, Frank Maurer
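To make the record-and-replay idea concrete, here is a minimal, self-contained sketch of a gesture unit test in the spirit of TouchToolkit. The recognizer, the inlined “recording”, and all names are our own illustration, not the toolkit’s actual (.NET) API:

```python
import unittest

# A recorded gesture, inlined for the sketch (the real toolkit records
# frames from a device): two touches moving apart over three frames.
RECORDED_PINCH_OUT = [
    {1: (100, 200), 2: (110, 200)},
    {1: (80, 200),  2: (130, 200)},
    {1: (60, 200),  2: (150, 200)},
]

class ZoomRecognizer:
    """Toy recognizer under test: fires when two touches spread apart."""
    def __init__(self):
        self.zoom_detected = False
        self._last_span = None

    def on_frame(self, touches):
        if len(touches) != 2:
            return
        (x1, _), (x2, _) = touches.values()
        span = abs(x2 - x1)
        if self._last_span is not None and span > self._last_span:
            self.zoom_detected = True
        self._last_span = span

class TestZoomGesture(unittest.TestCase):
    def test_recorded_pinch_out_triggers_zoom(self):
        recognizer = ZoomRecognizer()
        for frame in RECORDED_PINCH_OUT:   # replay the recording
            recognizer.on_frame(frame)
        self.assertTrue(recognizer.zoom_detected)

if __name__ == "__main__":
    unittest.main()
```

The point is that once a gesture can be replayed deterministically, it can live inside an ordinary unit test and run on every build instead of being re-performed by hand.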
Proton is a novel framework that addresses two problems in multitouch gesture development: writing recognition code for custom gestures and managing conflicts among them. Using Proton, the application developer declaratively specifies each gesture as a regular expression over a stream of touch events. Proton statically analyzes the set of gestures to report conflicts, and it automatically creates gesture recognizers for the entire set. To simplify the creation of complex multitouch gestures, Proton introduces gesture tablature, a graphical notation that concisely describes the sequencing of multiple interleaved touch actions over time.
TEAM MEMBERS:
Jim Spadaccini, Kenrick Kin, Björn Hartmann, Tony DeRose, Maneesh Agrawala
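The core idea, a gesture as a regular expression over a tokenized touch-event stream, can be sketched in a few lines. This is an illustration of the concept only, not Proton’s actual notation or API (Proton’s expressions also attach hit targets to each symbol, which we omit here):

```python
import re

# Encode each touch event as a token "<action><touch-id>",
# with D = down, M = move, U = up, then describe a gesture
# as a regular expression over the token stream.
TRANSLATE = re.compile(r"^D1(M1)*U1$")                  # one-finger drag
SCALE = re.compile(r"^D1(M1)*D2(M1|M2)*U1(M2)*U2$")     # two-finger pinch

def tokenize(events):
    """Flatten (action, touch_id) pairs into one token string."""
    return "".join(f"{action}{tid}" for action, tid in events)

stream = tokenize([("D", 1), ("M", 1), ("M", 1), ("U", 1)])
for name, pattern in (("translate", TRANSLATE), ("scale", SCALE)):
    if pattern.match(stream):
        print("recognized:", name)                      # -> translate
```

Because gestures are ordinary regular expressions, conflicts can be detected statically, for example by checking whether one gesture’s language contains a prefix of another’s.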
The NMC Horizon Report: 2011 Museum Edition is a co-production with the Marcus Institute for Digital Education in the Arts (MIDEA) and examines emerging technologies for their potential impact on and use in education and interpretation within the museum environment. The international composition of the advisory board reflects the care with which a global perspective for the report was assembled. While there are many local factors affecting the adoption and use of emerging technologies in museums, there are also issues that transcend regional boundaries and questions we all face.
This article introduces a new interaction model called Instrumental Interaction that extends and generalizes the principles of direct manipulation. It covers existing interaction styles, including traditional WIMP interfaces, as well as new interaction styles such as two-handed input and augmented reality. It defines a design space for new interaction techniques and a set of properties for comparing them.
TEAM MEMBERS:
Michael Beaudouin-Lafon
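A minimal sketch of the model’s central idea: an instrument reifies an interaction technique as a first-class object that mediates between the user’s actions and the domain objects they affect. The class names below are ours, and the scrollbar merely echoes the article’s canonical example of an instrument with a high degree of indirection:

```python
# Hedged sketch of Instrumental Interaction, not the article's API.
class Document:
    def __init__(self):
        self.scroll_y = 0.0

class ScrollbarInstrument:
    """Indirect instrument: drags on the scrollbar thumb are
    translated into scrolling of the target document."""
    def __init__(self, target, track_height, doc_height):
        self.target = target
        self.ratio = doc_height / track_height

    def on_drag(self, dy_pixels):
        # Degree of indirection: the action happens on the instrument,
        # its effect on the domain object.
        self.target.scroll_y += dy_pixels * self.ratio

doc = Document()
scrollbar = ScrollbarInstrument(doc, track_height=100, doc_height=1000)
scrollbar.on_drag(10)        # dragging the thumb 10px scrolls 100px
print(doc.scroll_y)          # -> 100.0
```

Comparing instruments by properties such as this ratio of action to effect is exactly the kind of analysis the design space is meant to support.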
For document visualization, folding techniques provide a focus-plus-context approach with fairly high legibility on flat sections. To enable richer interaction, we explore the design space of multi-touch document folding. We discuss several design considerations for simple modeless gesturing and compatibility with standard Drag and Pinch gestures. We categorize gesture models along the characteristics Symmetric/Asymmetric and Serial/Parallel, which yields three gesture models. We built a prototype document workspace application that integrates folding and standard gestures.
TEAM MEMBERS:
Patrick Chiu, Chunyuan Liao, Francine Chen
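As a rough illustration of the two classification axes, the sketch below labels a two-touch input as Symmetric/Asymmetric (do both touches move?) and Serial/Parallel (do their lifetimes overlap?). The thresholds and data layout are our assumptions, not the paper’s implementation:

```python
import math

def classify(touch_a, touch_b, move_eps=5.0):
    """Each touch is a dict with 't_down', 't_up' and 'path' [(x, y), ...]."""
    def displacement(touch):
        (x0, y0), (x1, y1) = touch["path"][0], touch["path"][-1]
        return math.hypot(x1 - x0, y1 - y0)

    # Symmetric if both touches travel; asymmetric if one acts as an anchor.
    both_move = (displacement(touch_a) > move_eps and
                 displacement(touch_b) > move_eps)
    symmetry = "symmetric" if both_move else "asymmetric"

    # Parallel if the touches overlap in time; serial if one follows the other.
    overlap = (min(touch_a["t_up"], touch_b["t_up"]) >
               max(touch_a["t_down"], touch_b["t_down"]))
    timing = "parallel" if overlap else "serial"
    return symmetry, timing

finger = {"t_down": 0.0, "t_up": 1.0, "path": [(0, 0), (40, 0)]}
thumb  = {"t_down": 0.2, "t_up": 1.0, "path": [(100, 0), (101, 0)]}
print(classify(finger, thumb))   # -> ('asymmetric', 'parallel')
```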
Many tasks in graphical user interfaces require users to interact with elements at various levels of precision. We present FingerGlass, a bimanual technique designed to improve the precision of graphical tasks on multitouch screens. It enables users to quickly navigate to different locations and across multiple scales of a scene using a single hand. The other hand can simultaneously interact with objects in the scene. Unlike traditional pan-zoom interfaces, FingerGlass retains contextual information during the interaction. We evaluated our technique in the context of precise object selection.
TEAM MEMBERS:
Dominik Käser, Maneesh Agrawala, Mark Pauly
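The core mechanic can be sketched as a pair of coordinate transforms: the coarse hand spans a region of interest, and touches of the fine hand inside the magnified callout map back into scene space. The function names and the fixed zoom factor below are our illustration, not the paper’s implementation:

```python
def region_from_span(p1, p2):
    """Two coarse-hand touches define the region: center and radius."""
    cx, cy = (p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2
    r = ((p2[0] - p1[0]) ** 2 + (p2[1] - p1[1]) ** 2) ** 0.5 / 2
    return (cx, cy), r

def callout_to_scene(pt, region_center, callout_center, zoom):
    """Map a fine-hand touch in the magnified callout back to scene space."""
    dx, dy = pt[0] - callout_center[0], pt[1] - callout_center[1]
    return (region_center[0] + dx / zoom, region_center[1] + dy / zoom)

center, radius = region_from_span((200, 300), (260, 300))
scene_pt = callout_to_scene((520, 410), region_center=center,
                            callout_center=(500, 400), zoom=8.0)
print(center, radius, scene_pt)   # -> (230.0, 300.0) 30.0 (232.5, 301.25)
```

Dividing the fine hand’s motion by the zoom factor is what buys precision: an 8x callout turns a 20-pixel finger movement into a 2.5-pixel movement in the scene.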
Open Exhibits held a Design Summit bringing together 30 professionals from the field to help guide future development. The summit was convened March 9th to 11th, 2011, in Corrales, New Mexico, near the design studios of Ideum, the principal organization behind Open Exhibits. Attendees came from large and small science centers, planetariums, zoos, local museums, and several other open source software initiatives. They were educators, evaluators, designers, researchers, software engineers, and museum professionals. Participants engaged in a combination of short presentations and group discussions.
Despite a long history of using participatory methods to enable public engagement with issues of societal importance, interactive displays have only recently been explored for this purpose. In this paper, we evaluate a tabletop game called Futura, which was designed to engage the public with issues of sustainability. Our design is grounded in prior research on public displays, serious games, and computer-supported collaborative learning. We suggest that a role-based, persistent, simulation-style game implemented on a multi-touch tabletop affords unique opportunities for a walk-up-and-play style of interaction.
TEAM MEMBERS:
Alissa N. Antle, Joshua Tanenbaum, Allen Bevans, Katie Seaborn, Sijie Wang
Modern smartphones contain sophisticated sensors that monitor three-dimensional movement of the device. These sensors permit devices to recognize motion gestures: deliberate movements of the device by end-users to invoke commands. However, little is known about best practices in motion gesture design for the mobile computing paradigm. To address this issue, we present the results of a guessability study that elicits end-user motion gestures to invoke commands on a smartphone device. We demonstrate that consensus exists among our participants on parameters of movement and on mappings of motion gestures onto commands.
TEAM MEMBERS:
Jim Spadaccini, Jaime Ruiz, Yang Li, Edward Lank
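Guessability studies of this kind typically quantify consensus with an agreement score (Wobbrock et al.): for each referent (command), identical proposals are grouped, and the squared proportions of the groups are summed. The sketch below computes that score over made-up elicitation data; the paper’s actual data and groupings are not reproduced here:

```python
from collections import Counter

def agreement(proposals_by_referent):
    """proposals_by_referent: {referent: [gesture label per participant]}"""
    scores = []
    for proposals in proposals_by_referent.values():
        n = len(proposals)
        groups = Counter(proposals)      # identical proposals form a group
        scores.append(sum((size / n) ** 2 for size in groups.values()))
    return sum(scores) / len(scores)     # mean agreement over referents

data = {   # hypothetical example: 4 participants per referent
    "answer call": ["raise to ear", "raise to ear", "raise to ear", "shake"],
    "zoom in":     ["pull closer", "pull closer", "tilt forward", "shake"],
}
print(f"A = {agreement(data):.3f}")      # -> (0.625 + 0.375) / 2 = 0.500
```

A score of 1.0 means every participant proposed the same gesture for every command; values near 1/n mean no consensus at all.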
Assessing the Impact of a Visit to a Zoo or Aquarium: A Multi-institutional Research Project will create a functional taxonomy of zoo and aquarium visitors' entering knowledge, attitudes, and behaviors. This taxonomy, in conjunction with data about the specific experiences visitors have during their visit, will enable investigators to understand and predict the contribution of zoos and aquariums to the public understanding of animals and their conservation. The results will clarify the role of zoos and aquariums as centers of informal learning and point to ways to strengthen their educational impact. The AZA convened a national advisory committee that commissioned and completed a thorough review, confirming a critical need for more research, particularly research that asks broad questions, collects data systematically, and includes a sufficient number and variety of institutions to permit community-wide generalizations. Twelve AZA institutions of various sizes, geographic regions, and types will participate in the study. The net result will be a descriptive model of zoo and aquarium visitor learning experiences and a set of diagnostic tools to help zoo and aquarium staff understand and enhance the nature and extent of their public impact.