Proposals to the AISL program are due November 4, 2015. Available below is a narrated PowerPoint presentation that addresses a number of elements of the solicitation to help prospective submitters prepare proposals.
This is the solicitation for proposals to the NSF Advancing Informal STEM Learning (AISL) program, which seeks to advance new approaches to and evidence-based understanding of the design and development of STEM learning in informal environments for public and professional audiences; provide multiple pathways for broadening access to and engagement in STEM learning experiences; advance innovative research on and assessment of STEM learning in informal environments; and develop understandings of deeper learning by participants.
This is the final report of the Open University’s RCUK-funded Public Engagement with Research Catalyst, ‘An open research university’, a project designed to create the conditions in which engaged research can flourish. The report describes an evidence-based strategy designed to embed engaged research within the University’s strategic planning for research and the operational practices of researchers. This programme of organisational change was informed by action research, working collaboratively with researchers at all levels across the institution to identify and implement strategies that
TEAM MEMBERS:
Richard Holliman, Anne Adams, Tim Blackman, Trevor Collins, Gareth Davies, Sally Dibb, Ann Grand, Richard Holti, Fiona McKerlie, Nick Mahony
This document contains a summary of notes from an Open Space session on Media, Technology, and Informal Learning from the 2014 AISL PI Meeting. It includes a list of active AISL projects related to media.
There are a number of places evaluators can share their reports with each other, such as the American Evaluation Association’s eLibrary, the website informalscience.org, and organizations’ own websites. Even though opportunities to share reports online are increasing, the evaluation field lacks guidance on what to include in evaluation reports meant for an evaluator audience. If the evaluation field wants to learn from evaluation reports posted to online repositories, how can evaluators help to ensure the reports they share are useful to this audience? This paper explores this question through
As a part of the strategy to reach the NASA Science Mission Directorate (SMD) Science Education and Public Outreach Forum Objective 1.2: Provide resources and opportunities to enable sharing of best practices relevant to SMD education and public outreach (E/PO), the Informal Education Working Group members designed a nationally-distributed online survey to answer the following questions: 1. How, when, where, and for how long do informal educators prefer to receive science, mathematics, engineering, and/or technology content professional development? 2. What are the professional development and
The Guide to Science Centers and Museums of Latin America and the Caribbean was launched on Monday, May 25, during the XIV Congress of RedPop 2015 in Medellin, Colombia. The guide describes, country by country, all 468 science museums and science centers found in the region. Each one-page entry includes an institution's name, contact information, hours of operation, and a brief summary of the organization. The guide is available in Spanish and Portuguese.
As Kathleen Enright rightly stated in the opening article of this issue: “Until all participants in the evaluation chain embrace and support a learning focus, I think our ability to use evaluation results to increase impact will be limited.” As grantmakers, we arguably play the most important role in ensuring that all stakeholders receive the support they need to be full participants, contributors, learners and beneficiaries of an evaluation process. For evaluation to be worth the money and time we spend on it, everyone needs to be a full participant — funders, grantee organizations and even
This annotated bibliography provides selected references regarding multiculturalism and cultural competence in evaluation. It contains a section listing more recent publications dating beyond 2007.
CDC provides its funded programs with a wide range of evaluation resources and guides. State health departments, tribal organizations, communities, and partners working in a variety of public health areas may also find these tools helpful. The resources provide guidance on evaluation approaches and methods, relevant examples, and additional resources. The guides are intended to aid in skill building on a wide range of evaluation topics. Practical Strategies for Culturally Competent Evaluation is designed to complement the other evaluation resources offered by the Division for Heart Disease and
TEAM MEMBERS:
Centers for Disease Control and Prevention (CDC), Derrick Gervin, Robin Kuwahara, Rashon Lane, Sarah Gill, Refilwe Moeti, Maureen Wilce
The role of informal science institutions in supporting science learning and engagement is becoming increasingly recognised. However, research in this area is published across a variety of journals and can be challenging for practitioners to access and apply. Indeed, it appears that the informal science sector lacks a process by which research can be usefully integrated into practice, and by which practice can inform research. In this paper, we argue that there is a need for research and practice to work together to produce practically relevant and academically credible research. We outline the
As providers of informal STEM education, including libraries, grapple with the issue of offering inclusive programs and meeting the needs of their specific communities, potential program facilitators seek knowledge and guidance to develop and deliver effective STEM programming for underserved populations. Key questions that might be asked include: Have best practices been identified for effectively engaging underserved audiences? What key strategies, if any, have emerged from previous informal science education efforts that can inform new program development? Over the past 10 to 20 years