This project aims to create a language- and framework-independent gesture recognition toolkit that takes OSC messages formatted according to the TUIO specification as input and outputs recognized gestures via the OSC protocol. I will use the gesture recognition toolkit AMELiA to describe models specifically for the domain of multitouch gestures. This project will enable multitouch application developers to easily define a gesture and use it within their applications, creating more engaging experiences.
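The pipeline described above (TUIO-formatted OSC messages in, recognized gestures out) can be sketched as follows. This is a minimal illustration, not the actual toolkit: the GestureRecognizer class, the swipe threshold, and the simple lifted-cursor heuristic are assumptions made for the sketch. It handles the TUIO 1.1 "set" and "alive" message forms for the /tuio/2Dcur profile; a real implementation would receive these messages over UDP with an OSC library and send recognized gestures back out via OSC.

```python
class GestureRecognizer:
    """Tracks TUIO 2D cursors and emits a 'swipe' gesture when a cursor
    that travelled mostly horizontally is lifted. Illustrative only."""

    def __init__(self, swipe_threshold=0.3):
        self.paths = {}                   # session id -> list of (x, y) positions
        self.swipe_threshold = swipe_threshold
        self.gestures = []                # recognized gestures (would be sent via OSC)

    def handle(self, address, *args):
        """Handle one OSC message from a TUIO source."""
        if address != "/tuio/2Dcur":      # only the 2D cursor profile in this sketch
            return
        kind = args[0]
        if kind == "set":
            # "set" s x y X Y m -> append the new position to session s's path
            s, x, y = args[1], args[2], args[3]
            self.paths.setdefault(s, []).append((x, y))
        elif kind == "alive":
            # "alive" s1 s2 ... -> any tracked session absent from the list was lifted
            alive = set(args[1:])
            for s in list(self.paths):
                if s not in alive:
                    self._finish(s)
        # "fseq" frame-sequence messages are ignored in this sketch

    def _finish(self, s):
        """Classify the completed path of a lifted cursor."""
        path = self.paths.pop(s)
        dx = path[-1][0] - path[0][0]
        dy = path[-1][1] - path[0][1]
        if abs(dx) > self.swipe_threshold and abs(dx) > abs(dy):
            self.gestures.append(("swipe-right" if dx > 0 else "swipe-left", s))


rec = GestureRecognizer()
rec.handle("/tuio/2Dcur", "alive", 1)
for x in (0.1, 0.3, 0.6, 0.9):            # one cursor moving left-to-right
    rec.handle("/tuio/2Dcur", "set", 1, x, 0.5, 0.0, 0.0, 0.0)
rec.handle("/tuio/2Dcur", "alive")        # cursor no longer alive: it was lifted
print(rec.gestures)                       # -> [('swipe-right', 1)]
```

Because TUIO reports lifted cursors implicitly (by omitting their session ids from the next "alive" message), the recognizer classifies a path only once its cursor disappears from the alive set.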
Touch-sensitive devices are becoming more and more common. Many people use touch interaction, especially on handheld devices such as iPhones and other mobile phones. But do people really understand the different gestures, i.e., do they know which gesture is the correct one for the intended action, and do they know how to transfer the gestures to bigger devices and surfaces? This paper reports the results of usability tests carried out in a semi-public space to explore people's ability to find gestures to navigate on a virtual globe. The globe is presented on a multi-touch
DATE:
TEAM MEMBERS:
Jim Spadaccini, Markus Jokisch, Thomas Bartoschek, Angela Schwering
Museums are shifting from being object- and collection-centered towards a focus on space, affect, and audience by producing multi-dimensional, spatial, non-linear experiences. Interactivity is used unquestioningly to verify this shift. Through the findings of a case study of ‘High Arctic’, a temporary exhibition at the National Maritime Museum, the paper discusses how the museum interprets and practices the notion of interactivity. By examining the multiplicity of the museum, with the focus on process, the possibility of opening up and creating new models of experience can be evaluated.
For the past twenty years there has been a slow trickle of research disseminated through a variety of channels on the nature and use of computer interactives within museum and gallery environments. This research has yet to be consolidated into a robust and coherent evidence base for considering and understanding the continued investment in such interactives by institutions. Simultaneously, however, the technology has changed almost beyond recognition, from early kiosk-based computer exhibits featuring mostly film and audio content through to the newer generation of multi-touch interfaces being
This paper explores the interactive possibilities enabled when the barrel of a digital pen is augmented with a multi-touch sensor. We present a novel multi-touch pen (MTPen) prototype and discuss its alternate uses beyond those of a standard stylus, such as allowing new touch gestures to be performed using the index finger or thumb and detecting how users grip the device as a mechanism for mode switching. We also discuss the hardware and software implementation challenges in realizing our prototype, and showcase how one can combine different grips (tripod, relaxed tripod, sketch, wrap) and
DATE:
TEAM MEMBERS:
Jim Spadaccini, Hyunyoung Song, Hrvoje Benko, Francois Guimbretiere, Shahram Izadi, Xiang Cao, Ken Hinckley
Creating and editing large graphs and node-link diagrams are crucial activities in many application areas. For them, we consider multi-touch and pen input on interactive surfaces as very promising. This fundamental work presents a user study investigating how people edit node-link diagrams on an interactive tabletop. The study covers a set of basic operations, such as creating, moving, and deleting diagram elements. Participants were asked to perform spontaneous gestures for 14 given tasks. They could interact in three different ways: using one hand, both hands, as well as pen and hand
DATE:
TEAM MEMBERS:
Mathias Frisch, Jens Heydekorn, Raimund Dachselt
In this paper we describe two projects that utilize reality-based interaction to advance collaborative scientific inquiry and discovery. We discuss the relation between reality-based and embodied interaction, and present findings from an experimental study that illustrate benefits of reality-based tabletop interaction for collaborative inquiry-based learning.
New mobile devices with large multi-touch displays, such as the iPad, have brought revolutionary changes to ways users interact with computers. Instead of traditional input devices such as keyboards, touchpads and mice, multi-touch gestures are used as the primary means of interacting with mobile devices. Surprisingly, body-motion gestures are evolving to become a new, natural, and effective way for game players to interact with game consoles in a very similar fashion: in Kinect for Xbox 360, a controller-free gaming experience is made possible by using body-motion gestures to play games.
Most tabletop research presents findings from lab-based user studies focusing on specific interaction techniques. This means we still know little about how these new interfaces perform in real-life settings and how users appropriate them. This paper presents findings from a field study of an existing interactive table in a museum of natural history. Visitors were found to employ a wide variety of gestures for interacting; different interface elements invited different types of gesture. The analysis highlights challenges and design conflicts in the design of tabletop interfaces for public
Classic article from the Journal of Visitor Behavior (1994) which discusses different approaches to exhibit design. The author cites considerable historical research, including one of the earliest visitor studies from 1935 about how visitors engage with exhibits. Very thorough analysis and critique of quantitative and qualitative evaluation techniques and when to apply them during exhibit design. Useful for exhibit design teams and anyone involved with designing exhibits for museums and galleries.
Direct-touch interaction on mobile phones revolves around screens that compete for visual attention with users’ real-world tasks and activities. This paper investigates the impact of these situational impairments on touch-screen interaction. We probe several design factors for touch-screen gestures, under various levels of environmental demands on attention, in comparison to the status-quo approach of soft buttons. We find that in the presence of environmental distractions, gestures can offer significant performance gains and reduced attentional load, while performing as well as soft buttons
DATE:
TEAM MEMBERS:
Andrew Bragdon, Eugene Nelson, Yang Li, Ken Hinckley
Zooming user interfaces are increasingly popular on mobile devices with touch screens. Swiping and pinching finger gestures anywhere on the screen manipulate the displayed portion of a page, and taps open objects within the page. This makes navigation easy but limits other manipulations of objects that would be supported naturally by the same gestures, notably cut and paste, multiple selection, and drag and drop. A popular device that suffers from this limitation is Apple’s iPhone. In this paper, we present Bezel Swipe, an interaction technique that supports multiple selection, cut, copy