ISE professionals can use this study as a guide to understanding the uses of social networking sites (SNS). The author maintains that SNS provide a space that allows the public to become better acquainted with the work of scientists, fostering transparency and accountability, and that encourages the public to become active contributors to scientific research and debate.
In 2011, ORG received a National Science Foundation (NSF) grant to develop resources for science, technology, engineering, and mathematics (STEM) learning by redesigning and expanding the "Jonathan Bird's Blue World" website; adding components to enable teachers and students to search episodes for specific themes, locations, or scientific concepts; and enhancing the lesson plans to explicitly match the content standards for teaching science. One of the major grant objectives was to make the "Jonathan Bird's Blue World" website content widely accessible as an open source via an Internet
In an effort to prepare female high school students for a college curriculum and achieve gender parity in the engineering industry, WGBH has developed an initiative entitled Engineer Your Life (EYL). The initiative is targeted toward female high school students, career counselors/educators, and professional engineers. It is designed to: 1) increase these target audiences' understanding of engineering, 2) inspire young women to explore engineering as a career option, and 3) help adults encourage young women to investigate engineering opportunities. One component of this initiative involves
Knight Williams Research Communications (Knight Williams, Inc), an independent evaluation firm specializing in the development and evaluation of science education media, conducted the summative evaluation for Ice Stories. The evaluation focused on the extent to which the project achieved the goals described in the Exploratorium's grant to the National Science Foundation (NSF) Arctic Research and Education, Antarctic Coordination and Information program within the Division of Research on Learning in Formal and Informal Settings (DRL). The NSF DRL program provided funding for both the project
DATE:
TEAM MEMBERS:
Valerie Knight-Williams
Exploratorium
Divan Williams
Christina Meyers
Ora Grinberg
Tal Sraboyants
Eveen Chan
David Tower
Finger-based touch input has become a major interaction modality for mobile user interfaces. However, due to the low precision of finger input, small user interface components are often difficult to acquire and operate on a mobile device. It is even harder when the user is on the go and unable to pay close attention to the interface. In this paper, we present Gesture Avatar, a novel interaction technique that allows users to operate existing arbitrary user interfaces using gestures. It leverages the visibility of graphical user interfaces and the casual interaction of gestures. Gesture Avatar
Multi-touch technology provides a successful gesture-based human-computer interface. The contact and gesture recognition algorithms of this interface are based on full hand function and, therefore, are not accessible to many people with physical disability. In this paper, we design a set of command-like gestures for users with limited range and function in their digits and wrist. Trajectory and angle features are extracted from these gestures and passed to a recurrent neural network for recognition. Experiments are performed to test the feasibility of the gesture recognition system and determine
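The angle-feature extraction step mentioned in the abstract can be sketched as follows. This is an illustrative assumption about how turning angles might be computed from a touch trajectory, not the paper's actual implementation; the function name `angle_features` and the point format are hypothetical.

```python
import math

def angle_features(points):
    """Compute the turning-angle sequence of a 2-D gesture trajectory.

    points: list of (x, y) touch samples. Returns one angle (radians)
    per interior point, describing how sharply the stroke turns there.
    A sequence like this could feed a recurrent classifier.
    """
    angles = []
    for (x0, y0), (x1, y1), (x2, y2) in zip(points, points[1:], points[2:]):
        a1 = math.atan2(y1 - y0, x1 - x0)  # heading into the point
        a2 = math.atan2(y2 - y1, x2 - x1)  # heading out of the point
        # wrap the difference into (-pi, pi]
        d = (a2 - a1 + math.pi) % (2 * math.pi) - math.pi
        angles.append(d)
    return angles

# A straight stroke turns nowhere; an L-shaped stroke turns once.
straight = [(0, 0), (1, 0), (2, 0), (3, 0)]
l_shape = [(0, 0), (1, 0), (1, 1)]
print(angle_features(straight))  # near-zero turning angles
print(angle_features(l_shape))   # one ~90-degree turn
```

Turning angles are invariant to translation and rotation of the stroke, which is one reason angle features are a common choice for gesture recognizers.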
Creating multiple prototypes facilitates comparative reasoning, grounds team discussion, and enables situated exploration. However, current interface design tools focus on creating single artifacts. This paper introduces the Juxtapose code editor and runtime environment for designing multiple alternatives of both application logic and interface parameters. For rapidly comparing code alternatives, Juxtapose introduces selectively parallel source editing and execution.
DATE:
TEAM MEMBERS:
Björn Hartmann
Loren Yu
Abel Allison
Yeonsoo Yang
Scott R. Klemmer
This paper explores the interactive possibilities enabled when the barrel of a digital pen is augmented with a multi-touch sensor. We present a novel multi-touch pen (MTPen) prototype and discuss its alternate uses beyond those of a standard stylus, such as allowing new touch gestures to be performed using the index finger or thumb and detecting how users grip the device as a mechanism for mode switching. We also discuss the hardware and software implementation challenges in realizing our prototype, and showcase how one can combine different grips (tripod, relaxed tripod, sketch, wrap) and
DATE:
TEAM MEMBERS:
Jim Spadaccini
Hyunyoung Song
Hrvoje Benko
Francois Guimbretiere
Shahram Izadi
Xiang Cao
Ken Hinckley
Zooming user interfaces are increasingly popular on mobile devices with touch screens. Swiping and pinching finger gestures anywhere on the screen manipulate the displayed portion of a page, and taps open objects within the page. This makes navigation easy but limits other manipulations of objects that would be supported naturally by the same gestures, notably cut and paste, multiple selection, and drag and drop. A popular device that suffers from this limitation is Apple’s iPhone. In this paper, we present Bezel Swipe, an interaction technique that supports multiple selection, cut, copy
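The core routing decision behind a bezel-initiated gesture can be sketched as follows. This is a minimal illustration of the general idea (a touch that begins on a thin band along the screen edge is treated as a selection gesture rather than a pan), assuming a hypothetical `classify_touch` helper and pixel values, not the paper's actual implementation.

```python
BEZEL_WIDTH = 12  # px band along the screen edge that triggers the gesture (assumed value)

def classify_touch(start_x, start_y, screen_w, screen_h, bezel=BEZEL_WIDTH):
    """Decide whether a touch starting at (start_x, start_y) should be
    routed to normal pan/zoom handling or to a selection gesture.

    Only the start position matters: gestures that begin on the bezel
    band are reserved for selection, leaving mid-screen swipes and
    pinches free for navigation.
    """
    on_bezel = (
        start_x < bezel or start_x > screen_w - bezel or
        start_y < bezel or start_y > screen_h - bezel
    )
    return "selection" if on_bezel else "pan_zoom"

print(classify_touch(3, 200, 320, 480))    # starts on the left bezel
print(classify_touch(160, 240, 320, 480))  # starts mid-screen
```

Because the two modes are distinguished purely by where the stroke starts, the technique needs no extra buttons or timeouts and leaves the familiar pan/zoom gestures untouched.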
Most current multi-touch capable interactive user interfaces for tabletops are built from custom toolkits that are decoupled from, and sit on top of, the "Desktop" provided by the underlying operating system. However, this approach requires that each individual touch system build its own suite of touch-capable custom applications (such as photo browsers), usually resulting in limited functionality. In this paper, we propose a software architecture for supporting and integrating multi-touch capability on existing desktop systems, where multi-touch and multiple single-pointer input can be used
DATE:
TEAM MEMBERS:
Kelvin Cheng
Benjamin Itzstein
Paul Sztajer
Markus Rittenbruch
Despite the considerable quantity of research directed towards multitouch technologies, a set of standardized UI components has not been developed. Menu systems pose a particular challenge, as traditional GUI menus require a level of pointing precision inappropriate for direct finger input. Marking menus are a promising alternative, but have yet to be investigated or adapted for use within multitouch systems. In this paper, we first investigate the human capabilities for performing directional chording gestures, to assess the feasibility of multitouch marking menus. Based on the positive
DATE:
TEAM MEMBERS:
Julian Lepinski
Tovi Grossman
George Fitzmaurice
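The directional selection at the heart of a marking menu can be sketched as follows: a stroke vector is mapped to one of N equally spaced menu slots by its angle. This is a generic illustration of marking-menu direction binning, not the chording design studied in the paper; the function `menu_sector` and the y-up coordinate frame are assumptions.

```python
import math

def menu_sector(dx, dy, n_items=8):
    """Map a marking-menu stroke vector (dx, dy) to one of n_items slots.

    Sector 0 is centred on "east" (positive x) and sectors proceed
    counter-clockwise, each covering 360/n_items degrees. Assumes a
    y-up coordinate frame; screen coordinates (y-down) would flip dy.
    """
    angle = math.atan2(dy, dx) % (2 * math.pi)
    sector_width = 2 * math.pi / n_items
    # offset by half a sector so each slot is centred on its direction
    return int((angle + sector_width / 2) // sector_width) % n_items

print(menu_sector(10, 0))    # east  -> sector 0
print(menu_sector(0, 10))    # north -> sector 2
print(menu_sector(-10, 0))   # west  -> sector 4
```

Because only the stroke direction matters, selections can be made eyes-free once the layout is memorized, which is the property that makes marking menus attractive for imprecise finger input.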