This poster was presented at the 2010 Association of Science-Technology Centers Annual Conference. The Saint Louis Science Center is a partner in Washington University's Cognitive, Computational, and Systems Neuroscience interdisciplinary graduate program, which is funded through the NSF's Integrative Graduate Education and Research Traineeship (IGERT) program, the agency's flagship training program for PhD scientists and engineers.
This introduction presents the essays in the JCOM special issue on User-led and peer-to-peer science. It also draws a first map of the main problems we need to investigate when facing this new and emerging phenomenon. Web tools are enacting and facilitating new ways for lay people to interact with scientists or to cooperate with each other, but cultural and political changes are also at play. What happens to expertise, knowledge production, and the relations between scientific institutions and society when lay people or non-scientists go online and engage in scientific activities?
This article is a case study and rhetorical analysis of a specific scientific paper on a computer simulation in astrophysics, an advanced and often highly theoretical science. The findings reveal that rhetorical decisions play as important a role in creating a convincing simulation as sound evidence does. Rhetorical analysis was used to interpret the data gathered in this case study; it calls for close reading of primary materials to identify classical rhetorical figures and devices of argumentation and to explain how these devices factor into the production of scientific knowledge.
Digital information and communication technologies (ICTs) are novel tools that can be used to facilitate broader involvement of citizens in discussions about science. The same tools can also be used to reinforce the traditional top-down model of science communication. Empirical investigations of particular technologies can help us understand how these tools are used in the dissemination of information and knowledge, as well as stimulate a dialog about better models and practices of science communication. This study focuses on one of the ICTs that have already been adopted in science …
Responding to the expressed needs of the field, the U.S. Department of Education is building You for Youth (Y4Y), an online learning community whose modules will enhance the professional development of afterschool practitioners and program managers.
The article focuses on an educational program called Game Design Through Mentoring and Collaboration. The program is a partnership between McKinley Tech and George Mason University (GMU) in Fairfax, Virginia. Through this program, teachers ensure that students understand the pathways needed to participate in the science, technology, engineering, and math (STEM) enterprise. Kevin Clark is the principal investigator of the program.
This report is the National Education Technology Plan (NETP) submitted by the U.S. Department of Education (ED) to Congress. It presents five goals with recommendations for states, districts, the federal government, and other stakeholders. Each goal addresses one of the five essential components of learning powered by technology: Learning, Assessment, Teaching, Infrastructure, and Productivity. The plan also calls for "grand challenge" research and development initiatives to solve crucial long-term problems that the ED believes should be funded and coordinated at a national level.
DATE:
TEAM MEMBERS:
U.S. Department of Education, Daniel Atkins, John Bennett, John Seely Brown, Aneesh Chopra, Chris Dede, Barry Fishman, Louis Gomez, Margaret Honey, Yasmin Kafai, Maribeth Luftglass, Roy Pea, Jim Pellegrino, David Rose, Candace Thille, Brenda Williams
It is increasingly common for software and hardware systems to support touch-based interaction. While the technology to support this interaction is still evolving, common protocols for providing consistent communication between hardware and software are available. However, this is not true for gesture recognition – the act of translating a series of strokes or touches into a system-recognizable event. Developers often end up writing code for this process from scratch due to the lack of higher-level frameworks for defining new gestures. Gesture recognition can contain a significant amount of …
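To make the recognition step concrete, the following Python sketch shows the kind of ad hoc single-stroke classifier developers end up writing in the absence of a higher-level framework; the event names and the 30-pixel threshold are illustrative assumptions, not taken from the text above.

    def classify_stroke(points, min_distance=30):
        """Map a stroke (a list of (x, y) touch samples) to a named event."""
        if len(points) < 2:
            return "tap"
        dx = points[-1][0] - points[0][0]
        dy = points[-1][1] - points[0][1]
        if max(abs(dx), abs(dy)) < min_distance:
            return "tap"                                   # too short to count as a swipe
        if abs(dx) >= abs(dy):
            return "swipe_right" if dx > 0 else "swipe_left"
        return "swipe_down" if dy > 0 else "swipe_up"      # screen y grows downward

    # Example: a mostly horizontal stroke is reported as a right swipe.
    print(classify_stroke([(10, 10), (40, 12), (90, 15)]))  # -> swipe_right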
Despite the considerable quantity of research directed towards multitouch technologies, a set of standardized UI components has not been developed. Menu systems pose a particular challenge, as traditional GUI menus require a level of pointing precision inappropriate for direct finger input. Marking menus are a promising alternative but have yet to be investigated or adapted for use within multitouch systems. In this paper, we first investigate the human capabilities for performing directional chording gestures, to assess the feasibility of multitouch marking menus. Based on the positive …
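As a rough illustration of the directional-chording idea, the Python sketch below quantizes each finger's movement into one of eight directions and looks the resulting chord up in a menu table. The direction set, the bindings, and the function names are assumptions for illustration only, not the design evaluated in the paper.

    import math

    DIRECTIONS = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]

    def quantize(start, end):
        """Snap one finger's displacement to the nearest of eight compass directions."""
        angle = math.degrees(math.atan2(start[1] - end[1], end[0] - start[0]))
        return DIRECTIONS[round(angle / 45) % 8]

    # Hypothetical menu bindings, keyed by the sorted chord of finger directions.
    MENU = {("N",): "copy", ("S",): "paste", ("E", "N"): "undo"}

    def select(strokes):
        """strokes: per-finger (start, end) point pairs; returns the bound command, if any."""
        chord = tuple(sorted(quantize(s, e) for s, e in strokes))
        return MENU.get(chord)

    print(select([((0, 100), (0, 0))]))                      # one finger up    -> copy
    print(select([((0, 100), (0, 0)), ((0, 0), (100, 0))]))  # up + right chord -> undo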
DATE:
TEAM MEMBERS:
Julian Lepinski, Tovi Grossman, George Fitzmaurice
Modern mobile phones can store a large amount of data, such as contacts, applications and music. However, it is difficult to access specific data items via existing mobile user interfaces. In this paper, we present Gesture Search, a tool that allows a user to quickly access various data items on a mobile phone by drawing gestures on its touch screen. Gesture Search contributes a unique way of combining gesture-based interaction and search for fast mobile data access. It also demonstrates a novel approach for coupling gestures with standard GUI interaction. A real world deployment with mobile
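The search half of that combination can be sketched in Python as follows: characters recognized from drawn gestures incrementally narrow a list of data items. The matching rule, the item list, and the function names are illustrative assumptions and do not reflect Gesture Search's actual recognition or ranking.

    CONTACTS = ["Anne Marie", "Andrew", "Bob", "Maria"]   # hypothetical data items

    def filter_items(items, query):
        """Keep items with any word starting with the characters recognized so far."""
        q = "".join(query).lower()
        return [item for item in items
                if any(word.lower().startswith(q) for word in item.split())]

    query = []
    for char in ["a", "n", "d"]:        # letters recognized one drawn gesture at a time
        query.append(char)
        print("".join(query), "->", filter_items(CONTACTS, query))
    # a   -> ['Anne Marie', 'Andrew']
    # an  -> ['Anne Marie', 'Andrew']
    # and -> ['Andrew']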
In this paper, we propose Objects, Containers, Gestures, and Manipulations (OCGM, pronounced like Occam’s Razor) as universal foundational metaphors of Natural User Interfaces. We compare OCGM to existing paradigms using SRK behavior classification and early childhood cognitive development, and justify the “universal” and “foundational” descriptors based upon cognitive linguistics and universal grammar. If adopted, OCGM would significantly improve the conceptual understanding of NUIs by developers and designers and ultimately result in better NUI applications.
We introduce our view of the relation between symbolic gestures and manipulations in multi-touch Natural User Interfaces (NUI). We identify manipulations, not gestures, as the key to truly natural interfaces. We therefore suggest that future NUI research should focus more on designing visual workspaces and model-world interfaces that are especially well suited to multi-touch manipulations.
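One way to see the distinction is to contrast a recognized, symbolic "zoom" gesture with a continuous pinch manipulation in which an object's scale simply tracks the two touch points on every frame. The Python sketch below illustrates that contrast under assumed names and data; it is not code from the authors.

    import math

    def distance(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    def apply_pinch(scale, old_touches, new_touches):
        """Update an object's scale from the change in spread between two fingers."""
        return scale * distance(*new_touches) / distance(*old_touches)

    scale = 1.0
    # Two touch points per frame; the spread grows from 100 to 120 to 150 pixels.
    frames = [((0, 0), (100, 0)), ((0, 0), (120, 0)), ((0, 0), (150, 0))]
    for old, new in zip(frames, frames[1:]):
        scale = apply_pinch(scale, old, new)
        print(round(scale, 2))   # 1.2, then 1.5 -- the model world responds on every frame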
DATE:
TEAM MEMBERS:
Hans-Christian Jetter, Jens Gerken, Harald Reiterer