This poster was presented at the 2016 Advancing Informal STEM Learning (AISL) PI Meeting held in Bethesda, MD on February 29-March 2. The overarching goal for the project (and the ongoing work of the Education Working Group of the Citizen Science Association (CSA)) was to expand and improve the use of citizen science in formal and informal STEM education in pursuit of the breadth of learning outcomes sought in those settings.
The purpose of the Museum of Science and Industry’s new Teacher Professional Development Series (TPDS) is to improve students’ performance in science by enhancing their teachers’ science content knowledge, instructional strategies, and museum skills. By combining solid content, hands-on classroom activities, inquiry-based instruction, and tools for a successful Museum visit, the Museum seeks to assist 4th-8th grade teachers who want to help students explore basic science concepts in new and engaging ways.
The major goals for the overall Teacher Professional Development Series are as follows: (1
This presentation from the 2016 NSF Advancing Informal STEM Learning (AISL) Principal Investigators' Meeting presents an overview of the AISL Online Project Monitoring System (OPMS), including a report-out of findings from the data collected from projects funded between FY2006-FY2014.
This special issue of Science Education & Civic Engagement contains articles on work occurring in a variety of informal STEM education settings, and is dedicated to the memory of former CAISE co-Principal Investigator and adviser Alan Friedman. It was provided in hard copy form to 2016 NSF AISL PI meeting participants.
These slides about the NSF commitment to informal STEM education were presented during the keynote at the 2016 Advancing Informal STEM Learning (AISL) PI Meeting held in Bethesda, MD on February 29-March 2.
These slides were presented at the NSF Advancing Informal STEM Learning (AISL) Principal Investigators' Meeting held in Bethesda, MD from February 29-March 2, 2016. The presentation describes NSF INCLUDES, a funding opportunity that leverages collective impact strategies to broaden participation in STEM.
These slides provide an overview of current NSF funding opportunities, including Dear Colleague Letters and foundation-wide mechanisms. The presentation occurred as a technical assistance session at the 2016 NSF AISL PI Meeting.
This poster was presented at the 2016 Advancing Informal STEM Learning (AISL) PI Meeting held in Bethesda, MD on February 29-March 2. This project's interdisciplinary team will carry out research and training that will identify ways for professionals in science, technology, engineering, and mathematics (STEM) to engage with public audiences that currently lack the community connections, resources, time, or know-how to gain access to science education and to scientists.
Even in the best-resourced science communication institutions, poor quality evaluation methods are routinely employed. This leads to questionable data, specious conclusions and stunted growth in the quality and effectiveness of science communication practice. Good impact evaluation requires upstream planning, clear objectives from practitioners, relevant research skills and a commitment to improving practice based on evaluation evidence.
Access to high-quality evaluation results is essential for science communicators to identify negative patterns of audience response and improve outcomes. However, there are many good reasons why robust evaluation is not routinely conducted and linked to science communication practice. This essay begins by identifying some of the common challenges that explain this gap between evaluation evidence and practice. Automating evaluation processes through new technologies is then explicated as one solution to these challenges, capable of yielding accurate real-time results that can directly
King et al. [2015] argue that ‘emphasis on impact is obfuscating the valuable role of evaluation’ in informal science learning and public engagement (p. 1). The article touches on a number of important issues pertaining to the role of evaluation, informal learning, science communication and public engagement practice. In this critical response essay, I highlight the article’s tendency to construct a straw man version of ‘impact evaluation’ that is impossible to achieve, while exaggerating the value of simple forms of feedback-based evaluation exemplified in the article. I also identify a