The aim of this study was to explore 22 Web site evaluation reports, or sections of larger evaluation reports centering on a Web site, to identify, define, and provide examples of the range of evaluation focus areas in order to inform the design of Web site evaluation studies. The sample included a group of reports contributed to the Informalscience.org online database. Prior to this study, staff members at the Science Museum of Minnesota organized and coded the database of evaluation reports as part of the Building Informal Science Education (BISE) project, funded by the National Science Foundation.
The EndNote library includes citations for all 520 reports that were coded as part of the Building Informal Science Education (BISE) project. PDF copies of each report are included with the citations. This file was exported from EndNote and can be imported into the Mendeley citation management software. Disclaimer: citations may need to be cleaned up after import, as the transfer from EndNote to Mendeley may not be clean.
The EndNote library includes citations for all 520 reports that were coded as part of the Building Informal Science Education (BISE) project. PDF copies of each report are included with the citations.
This worksheet helps you think through ways you might use the Building Informal Science Education (BISE) project’s resources to plan your own evaluation or learn about evaluation practices in the informal learning field.
This zip file includes the 520 reports that were downloaded from informalscience.org and coded as part of the Building Informal Science Education (BISE) project. Each of the reports is referred to by a project ID number that is used across all of the BISE resources.
This document provides examples of questions you can answer in NVivo by running matrix queries, running coding queries, and creating sets. It was created to help users navigate the NVivo Database as part of the Building Informal Science Education (BISE) project.
BISE’s NVivo database includes all of the coding applied by the BISE team based on the BISE Coding Framework. This includes codes that were applied to specific sections of a report (referred to as “nodes” in NVivo) and codes that were applied to an entire report (referred to as “attributes” in NVivo). For Mac or NVivo 9 versions, visit the VSA website at http://www.visitorstudies.org/bise.
This paper presents a conceptual framework for analyzing how researchers and district leaders perceive and navigate differences they encounter in the context of research-practice partnerships. Our framework contrasts with images of partnership work as facilitating the translation of research into practice. Instead, we argue that partnership activity is best viewed as a form of joint work requiring mutual engagement across multiple boundaries. Drawing on a cultural-historical account of learning across boundaries (Akkerman & Bakker, 2011) and evidence from a study of two long-term partnerships
Collaboration is a prerequisite for the sustainability of interagency programs, particularly those programs initially created with the support of time-limited grant-funding sources. From the perspective of evaluators, however, assessing collaboration among grant partners is often difficult. It is also challenging to present collaboration data to stakeholders in a meaningful way. In this article, the authors introduce the Levels of Collaboration Scale, which was developed from existing models and instruments. The authors extend prior work on measuring collaboration by exploring the
TEAM MEMBERS: Bruce Frey, Jill Lohmeier, Stephen Lee, Nona Tollefson
Addressing informal science learning for Indigenous communities raises a number of issues. What is “informal,” and how does this notion influence the everyday lives of Indigenous peoples? Can we separate the informal from the formal, and is the nexus of the two a productive place from which to explore, teach, and pursue science in Indigenous communities? This commissioned paper attempts to begin addressing these questions.
TEAM MEMBERS: Bryan McKinley Jones Brayboy, Angelina Castagno
Educational assessment systems are frequently challenged by divergent stakeholder needs. A major insight from experts who work on school assessment systems is the need to clearly articulate and evaluate assessment choices in relation to these distinct goals. The out-of-school STEM ecosystem faces similar challenges. This background paper presents ideas for new assessment methodologies that include biographical and narrative approaches, measures of sustained learning, and social network representations to complement more traditional approaches that capture average effects of a particular