Handbook of Learning Analytics
First Edition

Chapter 13

Learning Analytics Implementation Design

Alyssa Friend Wise & Jovita Vytasek


Abstract

This chapter addresses the design of learning analytics implementations: the purposeful shaping of the human processes involved in taking up and using analytic tools, data, and reports as part of an educational endeavor. These design choices are distinct from, but equally important as, those made in creating the learning analytics systems themselves. The first part of the chapter reviews key challenges of interpretation and action in analytics use. The three principles of Coordination, Comparison, and Customization are then presented as guides for thinking about the design of learning analytics implementations. The remainder of the chapter reviews the existing research and theory base of learning analytics implementation design for instructors (related to the practices of learning design and orchestration) and students (as part of a reflective and self-regulated learning cycle). Implications for learning analytics designers and researchers and areas requiring further research are highlighted.






About this Chapter

Title
Learning Analytics Implementation Design

Book Title
Handbook of Learning Analytics

Pages
pp. 151–160

Copyright
2017

DOI
10.18608/hla17.013

ISBN
978-0-9952408-0-3

Publisher
Society for Learning Analytics Research

Authors
Alyssa Friend Wise1
Jovita Vytasek2

Author Affiliations
1. Learning Analytics Research Network, New York University, USA
2. Faculty of Education, Simon Fraser University, Canada

Editors
Charles Lang3
George Siemens4
Alyssa Wise1
Dragan Gašević5

Editor Affiliations
3. Teachers College, Columbia University, USA
4. LINK Research Lab, University of Texas at Arlington, USA
5. Schools of Education and Informatics, University of Edinburgh, UK


 