2008 Informal Science Education PI Summit (July 25-26)
Increasing the Impact and Communicating the Value of Informal Science Education
There were 251 attendees, 176 of whom represented NSF ISE-funded projects. In addition to NSF, eight other federal agencies and organizations that support informal science education were also represented: CPB, DOE, IMLS, NASA, NIH, NOAA, NPS, and the Smithsonian.
The participants, projects represented, and agenda are included in the program.
Workshops
Representatives of VSA, NSF, and Westat delivered interactive workshops at the ISE PI Summit 2008. Workshop sessions were designed to increase participants' evaluation expertise, share knowledge about visitor studies and informal learning research, and provide technical assistance related to administering NSF-funded grants. The sessions were a mix of presentation, how-to, and hands-on formats.
Visitor Studies 101: Evaluating Impact
Presenter: Ellen Giusti, Independent Consultant
Your project is in a museum or science center, and you know how to create good programs and exhibits. But now NSF is asking you to evaluate the impact of those programs and exhibits. Where do you start? With Visitor Studies 101.
This workshop will introduce the four basic phases of evaluation: (1) front-end, used during design development; (2) formative, used during design implementation; (3) remedial or corrective, used after opening to identify and fix problems; and (4) summative, used to assess what the completed project achieved. Participants will learn how to establish a project's learning goals and then use the NSF report, Framework for Evaluating Impacts of Informal Science Education Projects, to measure what they have achieved.
This workshop will provide participants with an overview of exhibition and program evaluation, beginning with the differences between basic research and program evaluation. Attendees will learn that funders’ requirements are not the only or even the principal reason to conduct evaluation—rather, evaluation is the “right thing to do” for anyone concerned with successful outcomes in informal education. Attendees will come away with the basic concepts and tools needed to work with a professional evaluator. To be defined and discussed: evaluation terminology, quantitative vs. qualitative methods, and the pros and cons of various data collection and analysis approaches.
Visitor Studies 101 will have an informal seminar-type format rather than a lecture, with plenty of opportunity for questions and comments from the participants. Participants will take away a substantial handout developed by the presenter in collaboration with Smithsonian Institution colleagues Zahava Doering and Andrew Pekarik.
- Workshop Slides - Visitor Studies 101: Evaluating Impact and Understanding Audiences (pdf, 24 pp.)
- Workshop Handout - Formative Evaluation: Evaluation Goals & Protocol (pdf, 2 pp.)
- Workshop Handout - Formative Evaluation: Observation & Questions (pdf, 2 pp.)
- Workshop Handout - Front-End Evaluation: Interview Protocol (pdf, 2 pp.)
- Workshop Handout - Summative Evaluation: Interview Protocol (pdf, 2 pp.)
- Workshop Handout - Visitor Studies Resources & Evaluation Bibliography (pdf, 6 pp.)
- Workshop Handout - Hypothetical Logic Model (pdf, 1 p.)
- Workshop Handout - ISE Logic Model (pdf, 1 p.)
Beyond Counting Hits: Strategies for Evaluating ISE Websites
Presenters: Saul Rockman and Jennifer Borse, Rockman et al
People are clicking on your ISE website. What does that actually tell you about the impact your site is having on the public understanding of science and technology?
In this workshop, participants will be introduced to the range of issues in, and approaches to, evaluating ISE websites. The presenters will provide participants with a worksheet to help them define the outcomes they want from their websites and the kinds of data that can be used to capture impacts and outcomes. They seek to narrow expectations to those that are realistic and to encourage efforts that identify meaningful short- and long-term outcomes rather than merely counting hits, page views, stickiness, and other trivial data. For summative outcomes, the focus will be on what data can be captured on the websites, what can be linked to web interactivity, and what can lead to offsite actions that can be tied back to the site. Examples will include ISE websites for media and museums.
The workshop will introduce participants to a wide variety of evaluation tools and practices, including open source tools and proprietary ones. Among the topics discussed will be evaluation planning, user profiles, formative evaluation, beta tests, usability/navigation, universal design, web log analysis, assessments of learning, and off-site actions.
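For readers who want a concrete starting point, here is a minimal sketch of the kind of web log analysis the workshop contrasts with simple hit counting. It assumes a server log in Common Log Format; the file name access.log and the 30-minute session gap are illustrative assumptions, not taken from the workshop materials.

```python
# Minimal sketch: distinguish raw hits from unique visitors and
# sessions in a web server access log (Common Log Format assumed).
# The file name and the 30-minute session gap are illustrative.
import re
from collections import defaultdict
from datetime import datetime, timedelta

LOG_LINE = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+)')
SESSION_GAP = timedelta(minutes=30)

hits = 0
times_by_visitor = defaultdict(list)  # IP address -> request timestamps

with open("access.log") as log:  # hypothetical log file
    for line in log:
        match = LOG_LINE.match(line)
        if not match:
            continue  # skip malformed lines
        ip, stamp, method, path = match.groups()
        hits += 1
        when = datetime.strptime(stamp.split()[0], "%d/%b/%Y:%H:%M:%S")
        times_by_visitor[ip].append(when)

# A "session" here is a run of requests from one visitor with no
# gap longer than SESSION_GAP between consecutive requests.
sessions = 0
for times in times_by_visitor.values():
    times.sort()
    sessions += 1 + sum(
        1 for a, b in zip(times, times[1:]) if b - a > SESSION_GAP
    )

print(f"raw hits:        {hits}")
print(f"unique visitors: {len(times_by_visitor)} (by IP, a rough proxy)")
print(f"sessions:        {sessions}")
```

Even this rough pass separates one person reloading a page fifty times from fifty distinct visits, which is the difference between a raw hit count and a usage measure that can anchor real outcome questions.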
- Workshop Slides - Beyond counting hits: Strategies for evaluating ISE websites (pdf, 9 pp.)
Financial Practices and Reporting: Strategies and Tools to Support Good Business Practice
Presenters: Pamela Hawkins and Tarsha Johnson, NSF Division of Grants and Agreements and Carol Orlando and Beatriz Azor, NSF Cost Analysis and Audit Resolution Branch
Sound financial practices and reporting are crucial to effective grant management. This workshop, led by staff from NSF's Office of Budget, Finance, and Award Management (BFA) Division of Grants and Agreements (DGA) and Cost Analysis and Audit Resolution Branch (CAAR), focuses on NSF financial management requirements and on strategies and tools to help make the process effective and (relatively) painless. Examples of tools and good practices are included. Questions are welcome.
- Workshop Slides - Business Assistance (pdf, 29 pp.)
Methods and Instruments: How about Tools?
Presenter: Terrie Nolinske, TNI Consultants LLC
Evaluating the impact of informal science education projects isn't easy. Fortunately, there are a variety of methods project leaders can use to assess program efficacy.
This workshop will include an overview of methods used in data collection with an emphasis on in-person interviews, telephone interviews, focus groups, and questionnaires. Methods such as journals, portfolios, and observations with follow-up interviews may also be discussed. Participants will work in small groups to identify the advantages and disadvantages of each method of evaluation and will share their findings with the group at large. Participants will examine resources related to survey research and evaluation, including online sources.
The format for this workshop is interactive and hands-on. Lecturettes will alternate with discussions, case studies, and individual and group activities, such as reflections and problem solving. Participants are encouraged to bring examples of instruments they have used or are using. They are also encouraged to share their experiences and start to apply what they learn to their respective settings during the workshop. Participants will leave with a comprehensive set of handouts and resources.
- Workshop Handout - Survey Research and Evaluation Methods (pdf, 32 pp.)
- Workshop Article - Minimizing Error When Developing Questionnaires (pdf, 11 pp.)
Evaluation 101: Everything You Need to Know to Get Started
Presenters: Saul Rockman and Jennifer Borse, Rockman et al
You know almost exactly what you want to do to improve the public understanding of science and technology. But you don’t have much of an idea about how to start to evaluate your project, to improve its effectiveness, and then to prove its success. Evaluation 101 to the rescue. This workshop will begin with “Why do an evaluation?” and “What is an evaluation?” and quickly follow with “How would this work with a planetarium show, website, or television show?” We will help participants identify the products or processes in their ISE initiatives. The rationale will include interactive discussions of the value of improving the product, communicating its impact or value, responding to questions about the initiative, clarifying the content and presentations to better serve the needs of the audience, and building the next program or media product.
The workshop will be based on the content of EvaluationSpringboard.org, an existing, freely available, and accessible website. Topics include creating a logic model, formulating and prioritizing evaluation questions, human subjects and informed consent, identifying evaluation types, identifying evaluation methods, planning for and collecting data, analyzing and interpreting data, and reporting and using findings. The site's labs match the content covered in the Framework for Evaluating Impacts of Informal Science Education Projects.
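To make the first topic concrete, here is a minimal sketch, not drawn from the Summit materials, of a logic model expressed as plain data, pairing each intended impact with an indicator and an evidence source (echoing the "Intended impacts, indicators & evidence" handout below). The planetarium example and every name in it are hypothetical.

```python
# Minimal sketch of a logic model as plain data. The planetarium
# project and all field values below are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class Outcome:
    impact: str     # intended impact (e.g., awareness, knowledge)
    indicator: str  # observable change that would signal the impact
    evidence: str   # data source that captures the indicator

@dataclass
class LogicModel:
    inputs: list[str]      # resources going into the project
    activities: list[str]  # what the project does with them
    outputs: list[str]     # direct, countable products
    outcomes: list[Outcome] = field(default_factory=list)

show = LogicModel(
    inputs=["NSF award", "planetarium staff", "astronomy advisors"],
    activities=["produce a 25-minute planetarium show"],
    outputs=["public screenings", "educator guide"],
    outcomes=[
        Outcome(
            impact="Increased awareness of exoplanet research",
            indicator="Visitors can name one way astronomers detect exoplanets",
            evidence="Post-visit exit interviews",
        ),
    ],
)

for outcome in show.outcomes:
    print(outcome.impact)
    print(f"  indicator: {outcome.indicator}")
    print(f"  evidence:  {outcome.evidence}")
```

Writing the model down this way enforces the discipline the workshop describes: every claimed impact must come with an indicator you could observe and a data source you could actually collect.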
- Workshop Slides - Evaluation 101: Everything you need to know to get started evaluating informal science education media (pdf, 32 slides)
- Workshop Handout - Evaluation 101: Everything you need to know to get started evaluating informal science education media (pdf, 1 p.)
- Workshop Handout - Logic model for the ISE program (pdf, 2 pp.)
- Workshop Handout - Intended impacts, indicators & evidence (pdf, 1 p.)
- Workshop Handout - Alignment of research questions, constructs, and data sources for a youth media program (pdf, 1 p.)
- Workshop Handout - Types of survey questions (pdf, 4 pp.)
- EvaluationSpringboard.org is a website that builds evaluation knowledge and skills for those who want to undertake or commission evaluations in educational settings.
Ethical and Practical Solutions for Evaluation Studies
Presenter: Josh Gutwill, Exploratorium
With the line between research and evaluation blurring, more ISE projects are employing evaluation and must now meet formal requirements for protecting human subjects. What ethical and legal considerations do you need to take into account in order to evaluate the effectiveness of your work in informal science education? What steps should you take to deal with the federal government’s requirements? In this workshop, we will discuss a range of topics related to these requirements and how PIs can effectively address them.
By reviewing different evaluation scenarios, participants will consider the ethical tensions that commonly emerge in informal learning environments. For example, how can we adequately protect the privacy of visitors, participants, or viewers while still capturing the details of their interactions? How can we obtain informed consent to participate without disrupting the experience in the free-choice learning setting?
Participants will leave the workshop with an understanding of when and how to apply the federal guidelines for the protection of human subjects and how to work well with IRBs.
- Workshop Slides - Ethical and practical solutions for evaluation studies: Protecting human subjects (pdf, 26 pp.)
- Workshop Handout - Resources for complying with federal human subjects guidelines (pdf, 3 pp.)
- Workshop Handout - Is research involving human subjects? Four case studies (pdf, 2 pp.)
“Big Ideas” for ISE Projects
Presenter: Beverly Serrell, Serrell and Associates
You are an informal science education professional, and you have a proposal or a grant for an NSF-supported project. That’s fine, but what’s your Big Idea?
The concept of a “Big Idea” is widely applicable to many informal science education programs. Similar to a thesis statement, a big idea clearly delineates the content and subject of the program you are offering. Stated as a sentence, it gives the subject momentum and direction. The benefits of having a Big Idea are many: Visitors, viewers, or participants in an ISE program enjoy clearer messages and better program organization, and ISE professionals share a clearly articulated vision for the program. One of the most important functions of a Big Idea is what it tells you to leave out.
The workshop will include a presentation of the background and current issues, discussion of a handout of Big Idea exemplars, and plenty of time for Q&A.
Working with the ISE Project Monitoring System
Presenters: Gary Silverstein and John Wells, Westat
Staff from Westat will present preliminary findings from the pilot study of the ISE Project Monitoring System. During this session, they will also discuss the challenges that projects encountered, most notably delineating measurable impacts and indicators. Finally, they will solicit feedback from projects about additional assistance and guidance that would enhance projects' capacity to respond to individual items (or to the overall system).
- Primer for the Informal Science Education Online Project Monitoring System (pdf, 25 pp.)
- Evaluation and Monitoring in the ISE Program (pdf)
Framework for Evaluating Impacts of Informal Science Education Projects
Framework for Evaluating Impacts of Informal Science Education Projects was the centerpiece of the sessions on the second day of the ISE PI Summit.
Download the Framework for Evaluating Impacts of Informal Science Education Projects (pdf, 117 pp.)
- Introduction to the Framework (pdf, 10 pp.) - Alan Friedman
- Putting the Framework into Practice (pdf, 17 pp.) - Sue Allen, Gary Silverstein, and Lynn Dierking
- Evaluation and Monitoring in the ISE Program (pdf, 8 pp.) - Al DeSena and Gary Silverstein
Framework Breakout Sessions
The charge for the breakout was to:
- Share experiences, insights, concerns, and plans for using the Framework with each other
- Write comments and feedback for the authors of the Framework and for NSF, noting issues, queries, and suggestions
Each group was given these instructions (pdf, 1 p.). Notes are posted below for each breakout session.
Mass Media
- Mass Media Breakout Session Notes #1 (pdf, 2 pp.)
- Mass Media Breakout Session Notes #2 (pdf, 2 pp.)
- Mass Media Breakout Session Notes #3 (pdf, 1 p.)
Youth & Community Programs
- Youth & Community Breakout Session Notes (pdf, 4 pp.)
Learning Technologies
- ISE PI Summit 2008 Learning Technologies Breakout Session Notes (pdf, 5 pp.)
Collaborations
- ISE PI Summit 2008 Collaborations Breakout Session Notes (pdf, 4 pp.)
Multiple Deliverables
- ISE PI Summit 2008 Multiple Deliverables Breakout Session Notes (pdf, 5 pp.)