The Ice Worlds media project will inspire millions of children and adults to gain new knowledge about polar environments, the planet's climate, and humanity's place within Earth's complex systems, supporting an informed, STEM-literate citizenry. Featuring the NSF-funded THOR expedition to Thwaites Glacier, along with contributions from many NSF-supported researchers worldwide, Ice Worlds will share the importance of investments in STEM with audiences in giant screen theaters, on television, online, and in other informal settings. Primary project deliverables include a giant screen film, a filmmaking workshop for Native American middle school students that will result in a documentary, a climate storytelling professional development program for informal educators, and a knowledge-building summative evaluation. The project's largest target audience is middle school learners (ages 11-14); specific activities are designed for Native American youth and informal science practitioners. Innovative outreach will engage youth underserved in science, inspiring a new generation of scientists and investigative thinkers. The project's professional development programs will build the capacity of informal educators to engage communities and communicate science. The Ice Worlds project is a collaboration among media producers Giant Screen Films, Natural History New Zealand, PBS, and Academy Award-nominated film directors (Yes/No Productions). Additional collaborators include Northwestern University, the American Indian Science and Engineering Society, the Native American Journalists Association, a group of museum and science center partners, and a team of advisors including scientific and Indigenous experts associated with the NSF-funded Study of Environmental Arctic Change initiative.
The goals of the project are: 1) to increase public understanding of the processes and consequences of environmental change in polar ecosystems; 2) to explore the effectiveness of the giant screen format to impart knowledge and inspire motivation and caring for nature; 3) to improve middle schoolers' interest, confidence, and engagement in STEM topics and pursuits, both broadly and through a specific program for Native American youth; and 4) to build informal educators' capacity to share stories of climate change in their communities. The main evaluation questions are: 1) to what extent does the Ice Worlds film affect learning, engagement, and motivation around STEM pursuits and environmental problem solving; 2) what is the added value of companion media for youth's giant screen learning over the short and longer term; and 3) what are the impacts of the culturally based Native American youth workshops?
The evaluation work will involve a Native American youth advisory panel and a panel of science center practitioners in the giant screen film's development and evaluation process. Formative evaluation of the film will involve recruiting youth from diverse backgrounds, including representation of Native youth, to see the film in the giant screen theater of a partner site. Post-viewing surveys and group discussions will explore their experience of the film with respect to engagement, learning, evoking spatial presence, and motivational impact. A summative evaluation of the completed film will assess its immediate and longer-term impacts. Statistical analyses will be conducted on all quantitative data generated from the evaluation, including a comparison of pre- and post-viewing knowledge scores. An evaluation of the Tribal Youth Media program will include a significant period of formative evaluation and community engagement to align activities with the needs and interests of participating students. Culturally appropriate measures, qualitative methods, and frameworks will be used to assess the learning impacts. Data will be analyzed to determine the learning impacts of the workshop on youth participants as well as on mentors and other stakeholder participants. Evaluation of the community climate storytelling professional development component will include lessons learned and recommendations for implementation.
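As a rough illustration of the pre/post comparison mentioned above, the sketch below shows one way such quantitative data might be analyzed; the scores, sample size, and choice of a paired t-test are illustrative assumptions rather than the evaluators' actual protocol.

```python
# Hypothetical sketch of a pre/post knowledge-score comparison; the data and
# the paired t-test are assumptions for illustration, not the project's design.
import pandas as pd
from scipy import stats

# Each row: one participant's knowledge score before and after viewing.
scores = pd.DataFrame({
    "pre_knowledge":  [4, 6, 5, 7, 3, 6, 5, 4],
    "post_knowledge": [6, 7, 7, 8, 5, 8, 6, 6],
})

# Paired comparison of pre- and post-viewing knowledge scores.
t_stat, p_value = stats.ttest_rel(scores["post_knowledge"], scores["pre_knowledge"])
mean_gain = (scores["post_knowledge"] - scores["pre_knowledge"]).mean()

print(f"mean gain = {mean_gain:.2f}, t = {t_stat:.2f}, p = {p_value:.3f}")
```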
This award is funded in whole or in part under the American Rescue Plan Act of 2021 (Public Law 117-2).
The Accessible Oceans study will design auditory displays that support learning and understanding of ocean data in informal learning environments like museums, science centers, and aquariums. Most data presentations in these spaces use visual display techniques such as graphs, charts, and computer-generated visualizations, resulting in inequitable access for learners with vision impairment or other print-related disabilities. While music, sound effects, and environmental sounds are sometimes used, these audio methods are inadequate for conveying quantitative information. The project will use sonification (turning data into sound) to convey meaningful aspects of ocean science data to increase access to ocean data and ocean literacy. The project will advance knowledge on the design of auditory displays for all learners, with and without disabilities, as well as advance the use of technology for STEM formal and informal education. The study will include 425 participants but will reach tens of thousands through the development of education materials, public reporting, and social media. The study will partner with the Smithsonian National Museum of Natural History, Woods Hole Oceanographic Institution Ocean Discovery Center, the Georgia Aquarium, the Eugene Science Center, the Atlanta Center for the Visually Impaired, and Perkins School for the Blind.
The project will leverage existing educational ocean datasets from the NSF-funded Ocean Observatories Initiative to produce and evaluate the feasibility of using integrated auditory displays to communicate tiered learning objectives of oceanographic principles. Each integrated auditory display will consist of a data sonification and a context-setting audio introduction that helps ensure all users start with the same basic information about the phenomenon. The displays will be developed through a user-centered design process that will engage ocean science experts, visually impaired students and adults (and their teachers), and design-oriented undergraduate and graduate students. The project will support advocacy skills for inclusive design and will provide valuable training opportunities for graduate and undergraduate students in human-centered design and accessibility. The project will have foundational utility in auditory display, STEM education, human-computer interaction, and other disciplines, contributing new strategies for representing quantitative information that can be applied across STEM disciplines that use similar visual data displays. The project will generate publicly accessible resources to advance studies of inclusive approaches to motivating learners with and without disabilities to learn more about and consider careers in STEM.
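To make the idea of sonification concrete, the following sketch maps a hypothetical ocean time series onto pitch and renders it as audio. The dataset, frequency range, and note duration are assumptions for demonstration only, not the project's actual auditory display design, which will also pair each sonification with a spoken context-setting introduction.

```python
# Minimal sonification sketch: map a data series to pitch and write a WAV file.
# The temperature values, pitch range, and note length are illustrative.
import numpy as np
from scipy.io import wavfile

SAMPLE_RATE = 44100
NOTE_SECONDS = 0.25

# Hypothetical sea-surface temperature readings (degrees C) over time.
temperatures = np.array([10.2, 10.8, 11.5, 12.9, 14.3, 15.1, 14.6, 13.0, 11.7, 10.9])

# Linearly map the data range onto an audible pitch range (220-880 Hz).
lo, hi = temperatures.min(), temperatures.max()
frequencies = 220 + (temperatures - lo) / (hi - lo) * (880 - 220)

# Render each data point as a short sine tone and concatenate the tones.
t = np.linspace(0, NOTE_SECONDS, int(SAMPLE_RATE * NOTE_SECONDS), endpoint=False)
tones = [0.5 * np.sin(2 * np.pi * f * t) for f in frequencies]
signal = np.concatenate(tones)

wavfile.write("sst_sonification.wav", SAMPLE_RATE, signal.astype(np.float32))
```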
This Pilots and Feasibility Studies project is supported by the Advancing Informal STEM Learning program, which seeks to advance new approaches to, and evidence-based understanding of, the design and development of STEM learning in informal environments. This includes providing multiple pathways for broadening access to and engagement in STEM learning experiences, advancing innovative research on and assessment of STEM learning in informal environments, and developing understandings of deeper learning by participants.
This project investigates long-term human-robot interaction outside of controlled laboratory settings to better understand how the introduction of robots and the development of socially-aware behaviors work to transform the spaces of everyday life, including how spaces are planned and managed, used, and experienced. Focusing on tour-guiding robots in two museums, the research will produce nuanced insights into the challenges and opportunities that arise as social robots are integrated into new spaces, to better inform future design, planning, and decision-making. It brings together researchers from human geography, robotics, and art to think beyond disciplinary boundaries about the possible futures of human-robot co-existence, sociality, and collaboration. Broader impacts of the project will include increased accessibility and engagement at two partner museums, interdisciplinary research opportunities for both undergraduate and graduate students, a short video series about the current state of robotic technology to be offered as a free educational resource, and public art exhibitions reflecting on human-robot interactions. This project will be of interest to scholars of Science and Technology Studies, Human-Robot Interaction (HRI), and human geography, as well as museum administrators, educators, and the general public.
This interdisciplinary project brings together Science and Technology Studies, Human-Robot Interaction (HRI), and human geography to explore the production of social space through emerging forms of HRI. The project broadly asks: How does the deployment of social robots influence the production of social space, including the functions, meanings, practices, and experiences of particular spaces? The project is based on long-term ethnographic observation of the development and deployment of tour-guiding robots in an art museum and an earth science museum. A social roboticist will develop a socially-aware navigation system to add nuance to the robots' socio-spatial behavior. A digital artist will produce digital representations of the interactions that take place in the museum, using the robot's own sensor data and other forms of motion capture. A human geographer will conduct interviews with museum visitors and staff as well as ethnographic observation of the tour-guiding robots and of the roboticists as they develop the navigation system. Together, the team will produce an ethnographic analysis of the robots' roles in the organization of the museums, the everyday practices of museum staff and visitors, and the differential experiences of the museum space. The intellectual merits of the project consist of contributions at the intersections of STS, robotics, and human geography: examining the value of ethnographic research for HRI, the development of socially-aware navigation systems, the value of a socio-spatial analytic for understanding emerging forms of robotics, and the role of robots within evolving digital geographies.
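For readers unfamiliar with socially-aware navigation, the sketch below illustrates one common ingredient of such systems: a "personal space" penalty added to a planner's costmap around detected people so the robot keeps a comfortable distance. The grid size, penalty weight, and spread are illustrative assumptions and are not drawn from this project's implementation.

```python
# Schematic sketch of a social costmap layer; all parameters are assumptions.
import numpy as np

def social_costmap(shape, people, sigma=1.0, weight=100.0, resolution=0.1):
    """Return a grid of added traversal costs around detected people.

    shape      -- (rows, cols) of the costmap grid
    people     -- list of (x, y) positions in metres
    sigma      -- spread of the personal-space penalty in metres
    weight     -- peak penalty added directly on top of a person
    resolution -- metres per grid cell
    """
    rows, cols = shape
    ys, xs = np.mgrid[0:rows, 0:cols]
    xs = xs * resolution
    ys = ys * resolution

    cost = np.zeros(shape)
    for px, py in people:
        d2 = (xs - px) ** 2 + (ys - py) ** 2
        cost += weight * np.exp(-d2 / (2 * sigma ** 2))
    return cost

# Example: two visitors standing in a 10 m x 10 m gallery.
extra_cost = social_costmap((100, 100), people=[(3.0, 4.0), (7.5, 6.0)])
# A path planner (e.g., A*) would add extra_cost to its base obstacle costmap,
# steering the robot along routes that respect visitors' personal space.
```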
This project is jointly funded by the Science and Technology Studies program in SBE and Advancing Informal STEM Learning (AISL) Program in EHR.
As the digital revolution continues and our lives become increasingly governed by smart technologies, there is a rising need for reflection and critical debate about where we are, where we are headed, and where we want to be. Against this background, the paper suggests that one way to foster such discussion is by engaging with the world of fiction, with imaginative stories that explore the spaces, places, and politics of alternative realities. Hence, after a concise discussion of the concept of speculative fiction, we introduce the notion of datafictions as an umbrella term for speculative
Implementation of a permanent exhibition, online content, educational materials, and public programs exploring the history and cultural impact of video games.
Through the design, fabrication, and implementation of a 24,000-sq. ft. permanent, long-term gallery—tentatively entitled Digital Worlds—The Strong National Museum of Play will explore and share the history, influence, and experience of video games as they relate to culture, storytelling, human development, and the broader evolution of play. This gallery, the centerpiece of a transformational museum expansion, will include complementary and cohesive interactive exhibit spaces that showcase the history of video games through: (1) display of rare and unique historical artifacts; (2) use of multiple media formats that allow guests to discover the history of video games and their impact on society and culture; and (3) inclusion of one-of-a-kind interactive experiences that bring the history, art, and narrative structures of video games to life.
Virtual Reality (VR) shows promise to broaden participation in STEM by engaging learners in authentic but otherwise inaccessible learning experiences. The immersion in authentic learning environments that VR enables, along with social presence and learner agency, helps form memorable learning experiences. VR is emerging as a promising tool for children with autism. While there is wide variation in the way people with autism present, one common set of needs associated with autism that can be addressed with VR is sensory processing. This project will research and model how VR can be used to minimize barriers for learners with autism, while also incorporating complementary universal design for learning (UDL) principles to promote broad participation in STEM learning. As part of its overall strategy to enhance learning in informal environments, the Advancing Informal STEM Learning (AISL) program funds innovative research, approaches, and resources for use in a variety of settings. This project will build on a prototype VR simulation, Mission to Europa Prime, that transports learners to a space station for exploration on Jupiter's moon Europa, a strong candidate for future discovery of extraterrestrial life and a location no human can currently experience in person. The prototype simulation will be expanded to create a full, immersive STEM-based experience that will enable learners who often encounter cognitive, social, and emotional barriers to STEM learning in public spaces, particularly learners with autism, to fully engage in and benefit from this STEM-learning experience. The simulation will include a variety of STEM-learning puzzles, addressing science, mathematics, engineering, and computational thinking through authentic and interesting problem-solving tasks. The project team's learning designers and researchers will co-design puzzles and user interfaces with students at a post-secondary institute for learners with autism and other learning differences. The full VR STEM-learning simulation will be broadly disseminated to museums and other informal education programs, and distributed to other communities.
Project research is designed to advance knowledge about VR-based informal STEM learning and the affordances of VR to support learners with autism. To broaden STEM participation for all, the project brings together research at the intersection of STEM learning, cognitive and educational neuroscience, and the human-technology frontier. The simulation will be designed to give learners agency to adjust the STEM-learning VR experience to their unique sensory processing, attention, and social anxiety needs. A participatory design process will ensure the VR experience reduces barriers that currently exclude learners with autism and related conditions from many informal learning opportunities, broadening participation in informal STEM learning. Design research, usability, and efficacy studies will be conducted with teens and adults at the Pacific Science Center and the Boston Museum of Science, which serve audiences with autism along with the general public. Project research is grounded in prior NSF-funded research and leverages the team's expertise in STEM learning simulations, VR development, cognitive psychology, universal design, and informal science education, as well as the vital expertise of the end-user target audience, learners with autism. In addition to being shared at conferences, the research findings will be submitted for publication to peer-reviewed journals for researchers and to appropriate publications for VR developers and disseminators, museum programs, neurodiverse communities, and other potentially interested parties.
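As a purely hypothetical illustration of the learner agency described above, the following sketch shows how a VR simulation might expose adjustable sensory settings. The field names, ranges, and defaults are assumptions, not the project's actual design.

```python
# Illustrative sensory-settings profile a learner could adjust before or during
# a VR session; names and defaults are assumptions for demonstration.
from dataclasses import dataclass

@dataclass
class SensoryProfile:
    ambient_volume: float = 0.5      # 0.0 (muted) to 1.0 (full)
    visual_effects: float = 0.5      # intensity of particle and lighting effects
    motion_smoothing: bool = True    # reduce camera motion that can overwhelm
    social_npc_density: float = 0.3  # how many virtual characters are present
    captioning: bool = True          # on-screen text for all spoken audio

    def clamp(self):
        """Keep continuous settings within their valid 0-1 range."""
        self.ambient_volume = min(max(self.ambient_volume, 0.0), 1.0)
        self.visual_effects = min(max(self.visual_effects, 0.0), 1.0)
        self.social_npc_density = min(max(self.social_npc_density, 0.0), 1.0)

# A learner dials down sensory load before starting a puzzle.
profile = SensoryProfile(ambient_volume=0.2, visual_effects=0.1, social_npc_density=0.0)
profile.clamp()
```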
This Innovations in Development award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
DATE:
-
TEAM MEMBERS:
Teon Edwards, Jodi Asbell-Clarke, Jamie Larsen, Ibrahim Dahlstrom-Hakki
Sense-making with data through the process of visualization—recognizing and constructing meaning with these data—has been of interest to learning researchers for many years. Results of a variety of data visualization projects in museums and science centers suggest that visitors have a rudimentary understanding of and ability to interpret the data that appear in even simple data visualizations. This project supports the need for data visualization experiences to be appealing, accommodate short and long-term exploration, and address a range of visitors’ prior knowledge. Front-end evaluation
We developed a multi-touch interface for the citizen science video game Foldit, in which players manipulate 3D protein structures, and compared multi-touch and mouse interfaces in a 41-subject user study. We found that participants performed similarly in both interfaces and did not have an overall preference for either interface. However, results indicate that for tasks involving guided movement to dock protein parts, subjects using the multi-touch interface completed tasks more accurately with fewer moves, and reported higher attention and spatial presence. For tasks involving direct
DATE:
TEAM MEMBERS:
Thomas Muender, Sadaab Ali Gulani, Lauren Westendorf, Clarissa Verish, Rainer Malaka, Orit Shaer, Seth Cooper
This INSPIRE award is partially funded by the Cyber-Human Systems Program in the Division of Information and Intelligent Systems in the Directorate for Computer Science and Engineering, the Gravitational Physics Program in the Division of Physics in the Directorate for Mathematical and Physical Sciences, and the Office of Integrative Activities.
This innovative project will develop a citizen science system to support the Advanced Laser Interferometer Gravitational-Wave Observatory (aLIGO), the most complicated experiment ever undertaken in gravitational physics. Before the end of this decade, it will open up the window of gravitational wave observations on the Universe. However, the high detector sensitivity needed for astrophysical discoveries makes aLIGO very susceptible to non-cosmic artifacts and noise that must be identified and separated from cosmic signals. Teaching computers to identify and morphologically classify these artifacts in detector data is exceedingly difficult. Human eyesight is a proven tool for classification, but the aLIGO data streams from approximately 30,000 sensors and monitors easily overwhelm a single human. This research will address these problems by coupling human classification with a machine learning model that learns from the citizen scientists and also guides how information is provided to participants. A novel feature of this system will be its reliance on volunteers to discover new glitch classes, not just use existing ones. The project includes research on the human-centered computing aspects of this sociocomputational system, and thus can inspire future citizen science projects that do not merely exploit the labor of volunteers but engage them as partners in scientific discovery. The project will therefore have substantial educational benefits for the volunteers, who will gain a good understanding of how science works and will be part of the excitement of opening up a new window on the Universe.
This is an innovative, interdisciplinary collaboration between the existing LIGO project, at a time when it is being technically enhanced, and Zooniverse, which has fielded a workable crowdsourcing model currently involving over a million people on 30 projects. The work will help aLIGO quickly identify noise and artifacts in the science data stream, separating out legitimate astrophysical events and allowing those events to be distributed to other observatories for more detailed source identification and study. This project will also build and evaluate an interface between machine learning and human learning that will itself be an advance on current methods. It can be depicted as a loop: (1) by sifting through enormous amounts of aLIGO data, the citizen scientists will produce a robust "gold standard" glitch dataset that can be used to seed and train machine learning algorithms that will aid in the identification task; (2) the machine learning protocols that select and classify glitch events will be developed to maximize the potential of the citizen scientists by organizing and passing the data to them in more effective ways. The project will experiment with task design and workflow organization (leveraging previous Zooniverse experience) to build a system that takes advantage of the distinctive strengths of the machines (the ability to process large amounts of data systematically) and the humans (the ability to identify patterns and spot discrepancies), and then use the model to enable high-quality aLIGO detector characterization and gravitational wave searches.
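The human/machine loop described above can be sketched as a simple active-learning cycle, as in the illustrative Python below; the features, model choice, and review batch size are assumptions for demonstration, not the project's actual pipeline.

```python
# Sketch of a human-in-the-loop glitch classification cycle: seed a classifier
# with volunteer-labeled examples, then route the least certain unlabeled
# glitches back to volunteers. All data here are simulated placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical feature vectors extracted from glitch spectrogram snippets.
labeled_features = rng.normal(size=(200, 16))
labeled_classes = rng.integers(0, 4, size=200)   # four known glitch classes
unlabeled_features = rng.normal(size=(1000, 16))

# (1) Seed the machine learning model with citizen-scientist labels.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(labeled_features, labeled_classes)

# (2) Rank unlabeled glitches by the model's uncertainty and send the most
# ambiguous ones back to volunteers, who may also flag entirely new classes.
probabilities = model.predict_proba(unlabeled_features)
uncertainty = 1.0 - probabilities.max(axis=1)
to_review = np.argsort(uncertainty)[-20:]  # indices of the 20 least certain
print("glitches queued for volunteer review:", to_review)
```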
DATE:
-
TEAM MEMBERS:
Vassiliki Kalogera, Aggelos Katsaggelos, Kevin Crowston, Laura Trouille, Joshua Smith, Shane Larson, Laura Whyte
The rise of artificial intelligence has recently led to bots writing real news stories about sports, finance and politics. As yet, bots have not turned their attention to science, and some people still mistakenly think science is too complex for bots to write about. In fact, a small number of insiders are now applying AI algorithms to summarise scientific research papers and automatically turn them into simple press releases and news stories. Could the science beat be next in line for automation, potentially making many science reporters --- and even editors --- superfluous to science
Increasingly, scientists and their institutions are engaging with lay audiences via media. The emergence of social media has allowed scientists to engage with publics in novel ways. Social networking sites have fundamentally changed the modern media environment and, subsequently, media consumption habits. When asked where they primarily go to learn more about scientific issues, more than half of Americans point to the Internet. These online spaces offer many opportunities for scientists to play active roles in communicating and engaging directly with various publics. Additionally, the proposed research activities were inspired by a recent report by the National Academies of Sciences, Engineering, and Medicine that included a challenge to science communication researchers to determine better approaches for communicating science through social media platforms. Humor has been recommended as a method that scientists could use in communicating with publics; however, there is little empirical evidence that its use is effective. The researchers will explore the effectiveness of using humor for communicating about artificial intelligence, climate science and microbiomes.
The research questions are: How do lay audiences respond to messages about scientific issues on social media that use humor? What are scientists' views toward using humor in constructing social media messages? Can collaborations between science communication scholars and practitioners facilitate more effective practices? The research is grounded in the theory of planned behavior and in framing as a theory of media effects. A public survey experiment will collect and analyze data on Twitter messages with and without humor, the number of likes and retweets of each message, and their scientific content. Survey participants will be randomly assigned to one of twenty-four experimental conditions. The survey sample, matched to recent U.S. Census Bureau data, will be obtained from opt-in panels provided by Qualtrics, an online market research company. The second component of the research will quantify the attitudes of scientists toward using humor to communicate with publics on social media. Data will be collected from a random sample of scientists and graduate students at R1 universities nationwide. Data will be analyzed using descriptive statistics and regression modeling.
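As a hedged illustration of the planned regression modeling, the sketch below fits a simple model to fully simulated data; the outcome measure, condition coding, and covariates are assumptions, not the researchers' analysis plan.

```python
# Illustrative regression on simulated survey-experiment data; every variable
# name and effect size here is an assumption for demonstration purposes.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 600

data = pd.DataFrame({
    "humor": rng.choice(["none", "pun", "anthropomorphism"], size=n),
    "topic": rng.choice(["AI", "climate", "microbiomes"], size=n),
    "sci_literacy": rng.normal(0, 1, size=n),
})
# Simulated engagement outcome (e.g., likelihood of liking or retweeting).
data["engagement"] = (
    3.0
    + 0.4 * (data["humor"] != "none")
    + 0.2 * data["sci_literacy"]
    + rng.normal(0, 1, size=n)
)

# Does humor (relative to no humor) predict engagement, controlling for topic?
model = smf.ols(
    "engagement ~ C(humor, Treatment('none')) + C(topic) + sci_literacy",
    data=data,
).fit()
print(model.summary())
```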
The broader impacts of this project are twofold: findings from the research will be shared with science communication scholars and trainers, advancing knowledge and practice; and an infographic (a visual representation of the findings) will be distributed to practitioners who participate in research-practice partnerships. It will provide a set of easily referenced, evidence-based guidelines about the types of humor to which audiences respond positively on social media.
This project is funded by the National Science Foundation's (NSF's) Advancing Informal STEM Learning (AISL) program, which supports innovative research, approaches, and resources for use in a variety of learning settings.
DATE:
-
TEAM MEMBERS:
Sara Yeo, Leona Yi-Fan Su, Michael Cacciatore