Course Design with Learning Analytics: A Case-Study in Using Data to Identify Problems and Measure Improvements

Presenters:

David Lindrum
Soomo Learning, Founder and Course Designer
david.lindrum@soomolearning.com

Patrick Duffy
Soomo Learning, Development Editor
patrick.duffy@soomolearning.com

Description:

In this case-study, we use learning analytics to identify a potential problem in instructional design and to evaluate the impact of a fix. For a key assignment in a Sociology course, learning analytics revealed that an unusual number of students were not clicking to open the assigned article. The assignment presented a half-dozen questions that students would need the article to answer, and points were awarded for completing them. However, because the questions served as low-stakes formative assessment, students were permitted to reset and answer them repeatedly. About a third of the students who answered the questions did so without ever clicking to open the article, evidently by re-answering until every question was correct. As a result, median scores for readers and non-readers alike were very high. The course designers theorized that these students believed it would be easier to answer the questions repeatedly until all were correct than to actually read the article.
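As a rough illustration of how such a pattern might be detected, the sketch below flags students who completed the question set without ever logging an article-open event. The event-log layout, file name, column names, and event labels are assumptions made for illustration only, not Soomo’s actual data model.

```python
# Hypothetical sketch: identify students who completed the questions
# without ever opening the article. Schema is assumed, not Soomo's.
import pandas as pd

events = pd.read_csv("events.csv")  # one row per logged student action

# Students with at least one article-open event vs. those who completed the questions
opened = set(events.loc[events["event_type"] == "article_opened", "student_id"])
completed = set(events.loc[events["event_type"] == "questions_completed", "student_id"])

# "Non-readers": completed the questions but never opened the article
non_readers = completed - opened
share = len(non_readers) / len(completed) if completed else 0.0
print(f"{share:.0%} of question-completers never opened the article")
```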

Our strategy to address this was to increase both the number and the difficulty of the questions, so that more students would find it easier to read the article than to “game” the system with frequent resets. However, we did not want to make the questions so difficult that engaged students saw substantially lower scores, nor the assignment so daunting that participation rates fell.

After deploying the new strategy, learning analytics enabled us to:
– know that this intervention resulted in 50% fewer non-readers (see the comparison sketch after this list)
– validate our understanding of the underlying problem
– evaluate the impact of this change on overall participation
– consider the impact of this change on already-engaged students
– devise additional steps to further increase student reading and completion
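A minimal sketch of the kind of before-and-after comparison referenced above is shown here; the enrollment, completion, and non-reader counts are placeholders chosen for illustration, not the actual course figures.

```python
# Hypothetical sketch of the before/after comparison; counts are placeholders.
def rates(enrolled: int, completed: int, non_readers: int) -> dict:
    """Participation and non-reader rates for one version of the assignment."""
    return {
        "participation": completed / enrolled,
        "non_reader_share": non_readers / completed,
    }

before = rates(enrolled=500, completed=450, non_readers=150)  # original assignment
after = rates(enrolled=500, completed=440, non_readers=75)    # revised assignment

for metric in before:
    change = (after[metric] - before[metric]) / before[metric]
    print(f"{metric}: {before[metric]:.1%} -> {after[metric]:.1%} ({change:+.0%})")
```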

This project also contributed to our ongoing effort to understand how to present data to faculty with sufficient clarity to allow them to enter into data-based discussions of activity trends, likely causes, potential solutions, and results.

This very specific finding is offered primarily as a case-study in the use of learning analytics to (1) find instructional problems and (2) assess the impact of changes to instructional strategy. The process can be generalized to any kind of instruction that produces learning analytics data.

Bios:

David has been designing instruction and measuring the results since 1994. Past presentations on learning analytics include papers and panels at the annual conferences of the Online Learning Consortium (2011), SEEDS (2015), Educational Data Mining (2015 and 2016), and SXSW (2016), among many others.

Patrick has a background in STEM education and, more specifically, Chemistry. His recent research includes a study of the effects of presentation modality and structural representation on cognitive load in undergraduate students. He has presented at the national meeting of the American Chemical Society (2015) and been published in the “Journal of Chemical Education” (2017).