This poster was presented at the 2016 Advancing Informal STEM Learning (AISL) PI Meeting held in Bethesda, MD on February 29-March 2. The overarching goal for the project (and the ongoing work of the Education Working Group of the Citizen Science Association (CSA)) was to expand and improve the use of citizen science in formal and informal STEM education in order to achieve the breadth of learning outcomes pursued in those settings.
This presentation from the 2016 NSF Advancing Informal STEM Learning (AISL) Principal Investigators' Meeting provides an overview of the AISL Online Project Monitoring System (OPMS), including a report-out of findings from data collected from projects funded between FY2006 and FY2014.
This special issue of Science Education & Civic Engagement contains articles on work occurring in a variety of informal STEM education settings, and is dedicated to the memory of former CAISE co-Principal Investigator and adviser Alan Friedman. It was provided in hard copy form to 2016 NSF AISL PI meeting participants.
These slides were presented at the NSF Advancing Informal STEM Learning (AISL) Principal Investigators' Meeting held in Bethesda, MD from February 29-March 2, 2016. The presentation describes NSF INCLUDES, a funding opportunity that leverages collective impact strategies to broaden participation in STEM.
This poster was presented at the 2016 Advancing Informal STEM Learning (AISL) PI Meeting held in Bethesda, MD on February 29-March 2. The project is producing twenty videos in which scientists and engineers present their research, closely aligned with one hundred scientific inquiry and engineering design-based experiments and lesson plans.
These slides provide an overview of current NSF funding opportunities, including Dear Colleague Letters and foundation-wide mechanisms. The presentation occurred as a technical assistance session at the 2016 NSF AISL PI Meeting.
This poster was presented at the 2016 Advancing Informal STEM Learning (AISL) PI Meeting held in Bethesda, MD on February 29-March 2. DEVISE was conceived to address the need for improved evaluation quality and capacity across the field of citizen science.
This poster was presented at the 2016 Advancing Informal STEM Learning (AISL) PI Meeting held in Bethesda, MD on February 29-March 2. The purpose of this research is to advance theoretical and practical understanding of how participation in citizen science fosters and/or supports lifelong science learning. We are specifically examining the relationship between engagement, science learning, and science identity.
This poster was presented at the 2016 Advancing Informal STEM Learning (AISL) PI Meeting held in Bethesda, MD on February 29-March 2. The project investigates how Co-generative Dialogue (cogen), a respectful conversation among students and scientists for improving teaching and learning, may produce more engaging and productive interactions and learning environments.
This poster was presented at the 2016 Advancing Informal STEM Learning (AISL) PI Meeting held in Bethesda, MD on February 29-March 2. This project's interdisciplinary team will carry out research and training that will identify ways for professionals in science, technology, engineering, and mathematics (STEM) to engage with public audiences that currently lack the community connections, resources, time, or know-how to gain access to science education and to scientists.
Even in the best-resourced science communication institutions, poor-quality evaluation methods are routinely employed. This leads to questionable data, specious conclusions, and stunted growth in the quality and effectiveness of science communication practice. Good impact evaluation requires upstream planning, clear objectives from practitioners, relevant research skills, and a commitment to improving practice based on evaluation evidence.
Access to high-quality evaluation results is essential for science communicators to identify negative patterns of audience response and improve outcomes. However, there are many good reasons why robust evaluation is not routinely conducted and linked to science communication practice. This essay begins by identifying some of the common challenges that explain this gap between evaluation evidence and practice. Automating evaluation processes through new technologies is then explicated as one solution to these challenges, capable of yielding accurate real-time results that can directly inform practice.