Presentations

The Virtuous Loop of Learning Analytics & Academic Technology Innovation
John Whitmer (Blackboard Inc.)

Individual faculty and academic departments creating innovative educational practices are often starved for useful data and analysis to determine whether their innovations made a difference in improving student learning. Assessment efforts are often slow, costly, and limited to reductionist comparisons of outcomes that ignore the complex interaction between student intent, behavior, course design, and outcomes.

The adoption of academic technologies creates a rich data source that provides insights not previously possible in terms of the level of detail available, the speed at which it can be accessed, and the potential scalability of analysis. In this presentation, we’ll share findings from ongoing internal analytics at Blackboard that provide insight into the potential value and efficacy of the data these technologies generate.

The research presented will include findings from a project exploring the validity of automated rule-based triggers based on LMS use and grades. This research investigated a large data set of LMS activity (1.2M student course weeks, 34,519 courses, 788 institutions). Findings included a small overall effect size for the relationship between time spent in the LMS and student grade; however, a small set of courses showed a strong relationship that merits further research and consideration by academic institutions using an LMS and other academic technologies for learning.
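
To illustrate the kind of analysis involved, the sketch below computes a pooled correlation and per-course correlations between LMS time and final grade, and counts the courses where the relationship is unusually strong. It is a minimal illustration only, not Blackboard’s actual pipeline; the file name and column names (lms_activity.csv, lms_minutes, final_grade) are assumptions.

```python
# Minimal illustration only: pooled and per-course correlation between LMS
# time and final grade. File and column names are assumptions, not
# Blackboard's actual data model.
import pandas as pd

df = pd.read_csv("lms_activity.csv")  # columns: student_id, course_id, lms_minutes, final_grade

# Pooled correlation across all courses (analogous to a small overall effect size).
overall_r = df["lms_minutes"].corr(df["final_grade"])

# Per-course correlations, to surface the subset of courses where the
# relationship is much stronger than the pooled estimate suggests.
per_course = (
    df.groupby("course_id")
      .apply(lambda g: g["lms_minutes"].corr(g["final_grade"]))
      .dropna()
)

strong = per_course[per_course.abs() >= 0.5]
print(f"pooled r = {overall_r:.3f}")
print(f"{len(strong)} of {len(per_course)} courses have |r| >= 0.5")
```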

 

Predictive Modeling of Student Outcomes in Introductory History Courses 
Jeffrey R. Young (Georgia State University)

Investigators at Georgia State are engaged in a multi-year experiment on the impact of digital resources on student learning in introductory history classes taken by five thousand GSU students each academic year. As part of this larger project, data from high school GPAs, ACT/SAT scores, and GSU campus GPAs have been compiled for 2,500 students enrolled in 101 sections of the U.S. survey course (History 2110) in the Fall 2015 semester. These measures of students’ previous academic performance and level of readiness for coursework in higher education have been correlated with students’ date of registration and final grades for History 2110 with an eye toward examining whether:

  1. students with particular academic backgrounds are more or less likely to enroll in online course sections as opposed to hybrid or traditional sections
  2. students with particular academic backgrounds experience measurably different outcomes in History 2110
  3. students with particular academic backgrounds do better or worse than expected (given the aggregate statistical patterns) when enrolled in online, hybrid, or traditional sections
  4. students with particular academic backgrounds are, in online, hybrid, or traditional sections, more or less likely to have course outcomes that diverge significantly from the individual students’ campus GPAs
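
As one way the questions above might be operationalized, the sketch below regresses final grades on prior-preparation measures and section modality, with an interaction term to probe whether modality effects differ by academic background. It is illustrative only; the file and column names are hypothetical rather than GSU’s actual data.

```python
# Illustrative only; hypothetical file and column names, not GSU's data.
# Regress History 2110 outcomes on prior-preparation measures and modality,
# with an interaction term to ask whether modality effects differ by background.
import pandas as pd
import statsmodels.formula.api as smf

students = pd.read_csv("hist2110_fall2015.csv")
# Expected columns: final_grade (numeric), hs_gpa, sat_act_index, campus_gpa,
# registration_day (days before term start), modality (online/hybrid/traditional).

model = smf.ols(
    "final_grade ~ hs_gpa + sat_act_index + campus_gpa + registration_day"
    " + C(modality) + C(modality):hs_gpa",
    data=students,
).fit()
print(model.summary())

# Question 1 (self-selection into modalities) could be probed similarly with a
# multinomial model, e.g. smf.mnlogit("modality_code ~ hs_gpa + ...", data=...).
```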

 

Quantifying the Qualitative for Holistic Program Evaluation
L. Roxanne Russell (Emory University)
Vijaykant Nadadur (Stride.AI)
Bryan L. Dawson (University of North Georgia)

Building on research presented at SEEDS last year on informing course design with analytics and experiences, we will discuss subsequent developments of a program evaluation framework for blended and online learning that offers a method for gathering and triangulating multi-perspective data. By correlating student behaviors, satisfaction, and performance across indicators, it becomes possible to identify where design or user interventions could be implemented to improve learner experiences and outcomes. However, in blended and online learning environments, these indicators come in both quantitative and qualitative forms. For example, while user behavior data in an LMS and satisfaction data on Likert scales are readily quantifiable, student satisfaction surveys may also include open-ended comments, and performance feedback may include substantive reviews beyond numeric grades. This year, we will present approaches to expand the data analysis capacity of this program evaluation framework with the use of a powerful artificial intelligence platform that translates textual data into actionable insights: TexSIE, the Textual Sentiment Interpretation Engine. With this platform, we are able to transform qualitative data into quantitative data so that correlational analyses can be run across multiple forms of information about behavior, satisfaction, and performance.
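
TexSIE itself is a proprietary platform, so the sketch below uses an open-source sentiment analyzer (NLTK’s VADER) purely as a stand-in to show the general idea: open-ended comments become numeric scores that can enter the same correlational analyses as the other indicators. The file and column names are hypothetical.

```python
# Generic stand-in only: NOT the TexSIE platform or its API.
# NLTK's VADER analyzer turns open-ended comments into numeric sentiment
# scores that can be correlated with Likert and performance indicators.
# Requires: pip install nltk; nltk.download("vader_lexicon").
import pandas as pd
from nltk.sentiment import SentimentIntensityAnalyzer

responses = pd.read_csv("course_survey.csv")  # hypothetical: comment, likert_satisfaction, grade

sia = SentimentIntensityAnalyzer()
responses["comment_sentiment"] = responses["comment"].fillna("").map(
    lambda text: sia.polarity_scores(text)["compound"]  # -1 (negative) .. +1 (positive)
)

# With the qualitative text quantified, the same correlational analysis can be
# run across behavior, satisfaction, and performance indicators.
print(responses[["comment_sentiment", "likert_satisfaction", "grade"]].corr())
```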

 

MOOCs, Big Data, and Learning Analytics: A 21st Century Opportunity
Amanda Madden (Center for 21st Century Universities)
Rob Kadel (Center for 21st Century Universities)

Since offering our first MOOC in 2012 at Georgia Tech, our instructors, course development team, and researchers at C21U have learned a great deal about best practices for designing and facilitating MOOCs and disseminating research about them. Our team has discovered both opportunities and challenges, including data collection, management, and analysis (both quantitative and qualitative); resource allocation (time, funding, and personnel); and disseminating findings to both internal and external audiences. This presentation will discuss the research process and framework we have built over the past four years by briefly discussing several case studies and outlining how our preliminary research findings on MOOC demographics and learner behavior have shaped our research directions both online and offline and informed a framework for research and teaching.


 

 

University System of Georgia Student Success Analytics Project
Angela Bell (University System of Georgia)
David Tanner (Carl Vinson Institute of Government)

In June of 2014, the University System of Georgia and the Carl Vinson Institute of Government at the University of Georgia partnered to develop better ways to use the large amounts of student data submitted by institutions to inform policy, practice, and planning in the system. The project first integrated ten years of term-by-term student data into an analysis-ready longitudinal data set and geocoded students’ addresses to enable the addition of Census block data on community background variables. Analysts with backgrounds in data science, geographic information systems, social network analysis, and demography brought their perspectives to bear in developing a set of analytic tools that explore student pathways to and through higher education in Georgia. These tools range from visualizations of student transfer patterns and high-school-to-college feeder patterns, which can be used for presentation as well as for inquiry, to sophisticated machine learning models predicting student success that can target entering students most in need of intervention. In order to expand and equalize analytic capability across USG institutions, the project is providing analytic tools to campus personnel as well. This presentation will provide an overview of how the data set and analytic tools were developed, showcase several of the tools themselves, and explain how this project is “closing the loop” in using data and analytics to plan and make policy decisions, target students for intervention, and evaluate student success efforts.
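
As a rough illustration of the modeling step described above (not the actual USG/Carl Vinson pipeline), the sketch below joins student records to Census block-group variables and fits a simple model predicting a retention outcome, then suggests how low-probability students could be flagged. All file and column names are assumptions.

```python
# Illustrative sketch only (hypothetical columns, not the actual USG pipeline):
# join geocoded student records to Census block-group variables, then fit a
# simple model predicting a student-success outcome for entering students.
import pandas as pd
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

students = pd.read_csv("usg_longitudinal.csv")    # one row per entering student
census = pd.read_csv("census_block_groups.csv")   # community background variables

# Students are assumed to already carry a geocoded block-group FIPS code.
data = students.merge(census, on="block_group_fips", how="left")

features = ["hs_gpa", "credit_hours_attempted", "pell_eligible",
            "median_household_income", "pct_adults_with_degree"]
X_train, X_test, y_train, y_test = train_test_split(
    data[features], data["retained_year2"], test_size=0.2, random_state=0
)

model = HistGradientBoostingClassifier().fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Students with the lowest predicted probabilities could then be flagged for
# early intervention by campus staff.
```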

 

Using Advanced Analytics to Inform Course Scheduling and Improve Student Time to Completion
Sarah Collins (Ad Astra Information Systems)

College and university campuses own a wealth of academic and operational data that can be analyzed to inform course scheduling and resource alignment for student success. In this presentation we will discuss the use of historical, degree audit, advising/planning, and student data to forecast demand for courses and to align resources. In addition, you will be introduced to the Higher Education Scheduling Index, a framework for tracking course supply and demand and resource alignment. The HESI™ metrics provide context for comparing institutional performance to the industry and to a subset of like institutions. This framework can also be used to more effectively manage the highly decentralized model of course scheduling employed on campuses today.
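
The sketch below is a toy illustration of the supply-and-demand comparison described here, not Ad Astra’s HESI methodology: it projects per-course demand from recent enrollment history and compares it to planned capacity. The file and column names are hypothetical.

```python
# Toy sketch of the general idea only: not Ad Astra's HESI methodology.
# Project next term's seat demand per course from enrollment history and
# compare it with planned capacity to flag over- and under-offered courses.
import pandas as pd

history = pd.read_csv("enrollment_history.csv")   # hypothetical: course_id, term_index, seats_filled
offerings = pd.read_csv("planned_offerings.csv")  # hypothetical: course_id, seats_offered

# Simple trend-based forecast: average of the last three terms per course.
forecast = (
    history.sort_values("term_index")
           .groupby("course_id")["seats_filled"]
           .apply(lambda s: s.tail(3).mean())
           .rename("forecast_demand")
)

plan = offerings.set_index("course_id").join(forecast)
plan["demand_ratio"] = plan["forecast_demand"] / plan["seats_offered"]

print(plan.sort_values("demand_ratio", ascending=False).head(10))  # likely bottlenecks
print(plan[plan["demand_ratio"] < 0.6])                            # likely over-offered
```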

 

Creating a Culture of Analytics: Identifying advocates, overcoming objections, and avoiding technology fatigue during a high-stakes rollout at a large, decentralized university
Chris Hutt (Kennesaw State University)
Wendy Kallina (Kennesaw State University)

In 2014, Kennesaw State University committed to a predictive analytics platform. The pilot programs met with some success, but the demands of consolidation with Southern Polytechnic State University strained the resources needed for a campus-wide implementation. KSU is now preparing for a campus-wide launch in the Fall of 2016.

The technology is in place and the datasets have been verified; the task now is to build a culture of advising assessment and data-informed decision making. Some schools have completely centralized academic advising and hired dozens of new staff as part of a sweeping change. Few of us have that opportunity. So, given the environment of shared governance and a highly decentralized advising structure, how can we effect sweeping change in student success outcomes? What have been the unexpected challenges? And who have been our most important – and unexpected – allies to date?

Rather than share a list of our accomplishments, we will review the process that led us to this point and how other institutions might benefit from our progress as well as our missteps. This interactive presentation will provide participants with an overview of the challenges and opportunities we have faced thus far in the early stages of implementing a process to close the loop in student success, as well as insights to inform the decisions they make in building a culture of assessment on their home campuses.

 

Analytics & Action for Student-Centered LMS Support
Dana Smith Bryant (Emory University)

Student acceptance of learning management technologies is critical to academic success. Students rely on peers, faculty, and institutional IT support for learning management system help. Institutional support groups are recognized for delivering sound functional training for the campus learning management system, but specific learning interventions driven by analytics related to course performance are usually reserved for subject faculty and instructional designers. Institutional support groups, however, are uniquely poised to use student analytics data for ongoing and responsive intervention strategies regarding learning management system (LMS) adoption. This presentation discusses the role of student analytics within the pilot of the Canvas LMS at Emory University during the 2015-2016 academic year. The Teaching & Learning Technologies group paired Canvas usage analytics with student survey results to inform the second semester of the Canvas pilot with regard to student-specific training, outreach events, and communication strategy. Future action plans for utilizing student analytics within Emory’s LMS support team will also be presented.
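
As a minimal illustration of pairing usage analytics with survey results (not Emory’s actual reporting), the sketch below flags students who are both low-activity in Canvas and dissatisfied as candidates for targeted training or outreach. The schema is hypothetical.

```python
# Illustrative only (hypothetical schema, not Emory's actual reporting):
# pair per-student Canvas usage metrics with survey responses to target
# training and outreach at students who are both inactive and dissatisfied.
import pandas as pd

usage = pd.read_csv("canvas_usage.csv")    # student_id, weekly_logins, pages_viewed
survey = pd.read_csv("pilot_survey.csv")   # student_id, satisfaction_1to5, wants_training

merged = usage.merge(survey, on="student_id", how="inner")

low_use = merged["weekly_logins"] < merged["weekly_logins"].quantile(0.25)
dissatisfied = merged["satisfaction_1to5"] <= 2

outreach_list = merged[low_use & dissatisfied]
print(f"{len(outreach_list)} students flagged for targeted Canvas outreach")
```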