Engagement is the cornerstone of learning in informal science education. During free-choice learning in museums and science centers, visitor engagement shapes how learners interact with exhibits, navigate through exhibit spaces, and form attitudes, interests, and understanding of science. Recent advances in multimodal learning analytics are creating novel opportunities for expanding the range and richness of measures of visitor engagement in free-choice settings. In particular, multimodal learning analytics offer significant potential for integrating multiple data sources to devise a composite picture of visitors' cognitive, affective, and behavioral engagement. The project will center on providing a rich empirical account of meaningful visitor engagement with interactive tabletop science exhibits among individual visitors and small groups, as well as uncovering broader tidal patterns in visitor engagement that unfold across exhibit spaces. A key objective of the project is creating models and practitioner-focused learning analytic tools that will inform the best practices of exhibit designers and museum educators. This project is funded by the Advancing Informal STEM Learning (AISL) program. As part of its overall strategy to enhance learning in informal environments, AISL funds research and innovative approaches and resources for use in a variety of settings.

The research team will conduct data-rich investigations of visitors' learning experiences with multimodal learning analytics that fuse the multichannel data streams produced by fully instrumented exhibit spaces with the data-driven modeling capabilities afforded by recent advances in machine learning and educational data mining. The team will conduct a series of visitor studies of naturalistic engagement in solo, dyad, and group interactions as visitors explore interactive tabletop science exhibits. The studies will use eye trackers to capture visitors' moment-to-moment attention; facial expression analysis and quantitative field observations to track visitors' emotional states; and trace logs generated by exhibit software, motion-tracking sensors, and coded video recordings to capture visitors' behavioral interactions. The studies will also use conversation recordings and pre-post assessment measures to capture visitors' science understanding and inquiry processes. With these multimodal data streams as training data, the research team will use probabilistic and neural machine learning techniques to devise learning analytic models of visitor engagement (a brief illustrative sketch appears below).

The project will be conducted through a partnership between North Carolina State University and the North Carolina Museum of Natural Sciences. The research team will 1) design a data-rich multimodal visitor study methodology, 2) create the Visitor Informatics Platform, a suite of open-source software tools for multimodal visitor analytics, and 3) launch the Multimodal Visitor Data Warehouse, a curated visitor experience data archive. Together, the multimodal visitor study methodology, the Visitor Informatics Platform, and the Multimodal Visitor Data Warehouse will enable researchers and practitioners in the informal science education community to apply multimodal learning analytics in their own informal learning environments.
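As a purely illustrative aid, the minimal Python sketch below shows one common way this kind of multimodal engagement modeling can be set up: time-aligned features from gaze, affect, and behavior streams are concatenated (early fusion) and used to train a simple probabilistic classifier of engagement. The feature names, the synthetic data, and the choice of logistic regression are assumptions made for the example only; the project's actual features, labels, and model architectures are not specified here.

```python
# Illustrative sketch (not the project's actual pipeline): early fusion of
# per-interval features from hypothetical multimodal streams, followed by a
# simple probabilistic classifier of visitor engagement.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical per-interval features, one row per short interval of a visit:
#   gaze: e.g., fixation count and mean fixation duration from an eye tracker
#   affect: e.g., facial-expression valence and arousal estimates
#   behavior: e.g., exhibit-log event rate and motion-sensor activity level
n_intervals = 200
gaze = rng.normal(size=(n_intervals, 2))
affect = rng.normal(size=(n_intervals, 2))
behavior = rng.normal(size=(n_intervals, 2))

# Early fusion: concatenate the time-aligned feature blocks into one matrix.
X = np.hstack([gaze, affect, behavior])

# Placeholder labels standing in for human-coded engagement ratings (1 = engaged).
y = rng.integers(0, 2, size=n_intervals)

# Logistic regression stands in for a probabilistic engagement model.
model = LogisticRegression(max_iter=1000)
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"Cross-validated AUC on synthetic data: {scores.mean():.2f}")
```

In practice, the same fused feature matrix could feed neural sequence models instead of a linear classifier; the sketch only illustrates the fusion-then-classify pattern, not a recommended configuration.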
It is anticipated that the project will advance the field of informal STEM learning by extending and enriching measures of meaningful visitor engagement, expanding the evidence base for visitor experience design principles, and providing learning analytic tools to support museum educators. By enhancing understanding of the cognitive, affective, and behavioral dynamics underlying visitor experiences in science museums, the project will position informal science educators to design learning experiences that are more effective and engaging. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.