Measuring Participation in Online Learning Environments
Overview
To measure participation in an informal, online learning environment, what evidence do we look for? Which environments do we include and exclude? Do we agree on what learning is? What counts as participation?
Findings from Research and Evaluation
From the project work on Facilitating Learning in Digital Environments, the following definition of online learning has been proposed (Grabill, 2014):
Learning is an ongoing process of change due to interactions between individuals and their social and physical environments. This ecological perspective emphasizes that the activity of learning is shaped by learners' interests and needs, their prior experiences, their companions, facilitators, and other distinctive features of their specific socio-cultural and physical environments. In our people-centered view of an online learning ecology, we see evidence of learning through the examination of an individual’s adaptations, which may be observed in many ways including their actions, language, emotions and practices. For our study of online learning environments, we are particularly concerned with facilitation, such as listening, questioning, involving, encouraging, and redirecting, which play an important role in creating strong learning environments.
In a broad view, online learning environments are defined as the use of technology to supplement, augment, or replace face-to-face instruction (Berge et al., 2000). Perhaps an informal approach to learning online requires that technology always and only augment the learner's life, both real and virtual (Barry, 2006)?
This may also mean that readily available metrics (subscribers, pageviews, bounce rates, friends, followers, video plays, favorites, retweets) do not paint a complete picture of informal participation. Participation may instead need to be measured by observing behaviors, just as we do on the floor of our institutions (Bell et al., 2009).
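To make that gap concrete, the sketch below (a minimal Python example over a hypothetical event log; the field names and sample events are invented for illustration) computes the readily available counts alongside a crude behavioral signal, such as which participants ask questions:

```python
# Hypothetical event log; real platforms expose similar records via their APIs.
events = [
    {"user": "ana",  "action": "view",    "text": None},
    {"user": "ana",  "action": "comment", "text": "Why do the moths prefer the light?"},
    {"user": "ben",  "action": "view",    "text": None},
    {"user": "ben",  "action": "like",    "text": None},
    {"user": "cara", "action": "comment", "text": "I tried this at home and saw the same thing."},
]

# Readily available metrics: raw activity counts.
pageviews = sum(1 for e in events if e["action"] == "view")
likes = sum(1 for e in events if e["action"] == "like")

# A crude behavioral signal: who contributes substantively, and who asks
# questions -- one observable marker of engagement that counts alone miss.
commenters = {e["user"] for e in events if e["action"] == "comment"}
questioners = {e["user"] for e in events if e["text"] and "?" in e["text"]}

print(f"pageviews={pageviews}, likes={likes}")
print(f"commenters={sorted(commenters)}, questioners={sorted(questioners)}")
```

Both figures come from the same log, but only the second begins to describe what participants are doing rather than how often they show up.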
Technologies explored by informal learning institutions range from websites, forums, social media, blogs, mobile apps, games, MOOCs, and wearable devices to technologies still undeveloped or undiscovered at the time of this writing.
For environments where participants generate text as part of participation (posts, updates, and comments), these words can be used as a dataset for observing participant behavior. Using discourse analysis, researchers have observed scientific argumentation, empathy, and identity building in online learning environments like Science Buzz and Experimonth (Grabill et al., 2009; Grabill, 2014).
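As a rough illustration of how such a text dataset might be prepared for analysis (a toy keyword-matching sketch, not the coding scheme used in the cited studies; a real discourse analysis would rely on a validated codebook and human coders):

```python
import re

# Hypothetical forum posts; in practice these would be exported from the
# environment under study, e.g., a discussion thread.
posts = [
    "I think the water level rises because the ice displaces it.",
    "We measured it three times and the level stayed the same.",
    "Great observation -- what happens if you try salt water?",
]

# Toy coding scheme: surface markers loosely associated with claims,
# evidence, and facilitation moves such as questioning and encouraging.
codes = {
    "claim":        r"\b(i think|because|therefore)\b",
    "evidence":     r"\b(measured|observed|data|results?|times)\b",
    "facilitation": r"\b(what happens|have you tried|great observation)\b",
}

for post in posts:
    matched = [code for code, pattern in codes.items()
               if re.search(pattern, post.lower())]
    print(f"{matched or ['uncoded']}: {post}")
```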
Some environments generate media as a result of participation (photographs, drawings, videos, audio, and computer-generated graphics, animations, and models). Researchers have studied evidence of science learning in children's drawings (Eberbach & Crowley, 2009; Fox, 2010), but this research has yet to cross over into online environments where those (or other) byproducts of creative work are uploaded as informal participation.
Some environments generate data as a result of participation. Through Public Participation in Scientific Research (PPSR) projects at the Cornell Lab of Ornithology, citizen science has been designed to help participants learn about the organisms they are observing (Bonney et al., 2009). Other data-based participatory projects, such as the use of quantified self tools for learning, have been studied to understand whether they can support reflective learning processes (Rivera-Pelayo et al., 2012).
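In the same spirit, participation data can double as material for reflection. The sketch below (invented observation records, not the Cornell Lab's actual tools or data model) summarizes one participant's citizen-science submissions in a way that could prompt them to notice what they are learning to see:

```python
from collections import Counter
from datetime import date

# Invented citizen-science observation records for a single participant.
observations = [
    {"species": "American Robin",  "when": date(2015, 3, 1)},
    {"species": "American Robin",  "when": date(2015, 3, 8)},
    {"species": "Dark-eyed Junco", "when": date(2015, 3, 8)},
    {"species": "Song Sparrow",    "when": date(2015, 3, 15)},
]

# A simple reflective summary in the spirit of quantified-self tools:
# surfacing sighting counts and first-seen dates invites the participant
# to reflect on how their observational range is growing.
species_counts = Counter(o["species"] for o in observations)
first_seen = {}
for o in sorted(observations, key=lambda o: o["when"]):
    first_seen.setdefault(o["species"], o["when"])

for species, count in species_counts.most_common():
    print(f"{species}: {count} sighting(s), first recorded {first_seen[species]}")
```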
Directions for Future Research
Research & Evaluation
Many of the strands of engagement defined in the Learning Science in Informal Environments publication (Bell et al., 2009) can be remapped to online behavior.
The Writing in Digital Environments group at Michigan State University has studied learning and facilitating learning in online environments using discourse analysis.
The Cornell Lab of Ornithology maintains a database of evaluation materials about PPSR projects.
Resources
Tools
- The Facilitation Toolbox presents a rubric for facilitating learning in online environments.
- The Citizen Science Toolkit provides a step-wise framework for building learning-oriented PPSR projects.
References
Barry, A. (2006). Creating a Virtuous Circle between a Museum's On-line and Physical Spaces. In J. Trant & D. Bearman (Eds.), Museums and the Web 2006: Proceedings. Toronto: Archives & Museum Informatics. Consulted 2008. http://www.archimuse.com/mw2006/papers/barry/barry.html
Bell, P., Lewenstein, B., Shouse, A. W., & Feder, M. A. (Eds.). (2009). Learning science in informal environments: People, places, and pursuits. Washington, DC: National Academies Press. Accessed from: http://informalscience.org/research/ic-000-000-002-024/LSIE
Berge, Z., Collins, M., & Dougherty, K. (2000). Design guidelines for web-based courses. In B. Abbey (Ed.), Instructional and cognitive impacts of web-based education (pp. 32–40). Hershey, PA: IDEA Group.
Bonney, R., Cooper, C. B., Dickinson, J., Kelling, S., Phillips, T., Rosenberg, K. V., & Shirk, J. (2009). Citizen science: A developing tool for expanding science knowledge and scientific literacy. BioScience, 59(11), 977–984.
Eberbach, C., & Crowley, K. (2009). From everyday to scientific observation: How children learn to observe the biologist's world. Review of Educational Research, 79(1), 39–68.
Fox, J. (2010). The Role of Drawing in Kindergarteners' Science Observations. International Art in Early Childhood Research Journal, 2(1). Accessed from: http://artinearlychildhood.org/artec/images/article/ARTEC_2010_Research_Journal_1_Article_5.pdf
Grabill, J. T., et al. (2009). Take Two: A Study of the Co-Creation of Knowledge on Museum 2.0 Sites. In J. Trant & D. Bearman (Eds.), Museums and the Web 2009: Proceedings. Toronto: Archives & Museum Informatics. Consulted March 17, 2015. http://www.archimuse.com/mw2009/papers/grabill/grabill.html
Rivera-Pelayo, V., Zacharias, V., Müller, L., & Braun, S. (2012). Applying Quantified Self Approaches to Support Reflective Learning. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge. New York, NY: Association for Computing Machinery. Accessed from: http://delivery.acm.org/10.1145/2340000/2330631/p111-rivera-pelayo.pdf