The field of analytics in higher education is relatively new and descriptions are often imprecise. Different types of analytics, with little in common, are regularly lumped together. At the 1st International Conference on Learning Analytics and Knowledge in 2011, analytics was defined broadly as the “measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.”
Analytics comes in many shapes and sizes, depending on:
- Type of data required
- Sources of the data captured
- Purpose or intended goal
We propose three categories of analytics in online higher education, each concerned with student performance:
- Institutional Analytics
- Engagement Analytics
- Learning Analytics
Identifying and defining the different categories is an important first step in helping educational professionals more quickly focus on the analytics that are of most value to their institutions.
Institutional analytics is primarily concerned with tracking learners through their educational lifecycle, from enrollment to graduation. The data collected focuses on information such as student profiles (age, address, ethnicity), course selections, pace of program completion, use of support services and graduation rates.
Data needed for institutional analytics is often readily available in colleges and universities within the institution’s registration system. The task, then, is to organize the data and identify the metrics that are most important to the institution. The value of the information can be multiplied by linking this data to other systems, such as enrollment and learning management systems, student support applications, and customer relationship management software. Analytics of this type is commonly the purview of the institution’s internal research and business analysts.
Institutions use this information in a variety of ways to:
- Align recruiting tactics with institutional aid
- Identify better recruiting practices to improve retention and completion rates
- Predict high-risk students earlier in order to provide more targeted support
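As a simple illustration of the predictive use above, the sketch below flags hypothetical students as high-risk based on institutional data such as pace of program completion. All field names, records, and thresholds here are illustrative assumptions, not drawn from any particular registration system.

```python
# Hypothetical illustration: flag at-risk students from institutional data.
# Field names and thresholds are assumptions, not from any real system.

def flag_high_risk(student, min_pace=0.5, max_missed_terms=1):
    """Return True if a student's record suggests elevated risk.

    pace: fraction of expected credits completed per term (1.0 = on schedule)
    missed_terms: terms with no course registrations since enrollment
    """
    return (student["pace"] < min_pace
            or student["missed_terms"] > max_missed_terms)

students = [
    {"id": "s1", "pace": 0.9, "missed_terms": 0},
    {"id": "s2", "pace": 0.4, "missed_terms": 0},   # slow completion pace
    {"id": "s3", "pace": 1.0, "missed_terms": 2},   # stopped registering
]

at_risk = [s["id"] for s in students if flag_high_risk(s)]
print(at_risk)  # ['s2', 's3']
```

In practice, a rule like this would be replaced by a model trained on historical retention data, but the idea is the same: surface students for targeted support earlier than grades alone would.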
External demand for this kind of information is growing as state and regulatory bodies seek to better monitor (and reward) certain types of institutional performance. Similarly, more institutions are distributing information to help students and parents make more informed decisions about programs and schools.
Unlike institutional analytics, engagement analytics track student activity within the course environment, which is typically the learning management system (LMS). The information generated can be of value to the institution, students and instructors. But most of the information is designed with the instructor in mind, in keeping with the overarching instructional model of higher education.
The type of information collected in engagement analytics often includes:
- Number of page views (per page)
- Contributions by students to discussion threads
- Which students (and what percentage of the total cohort) have completed the assignments
- Number of logins
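The raw material for metrics like these is typically an LMS event log. As a minimal sketch (the event schema here is an assumption; real LMS logs vary by platform), page views, logins, and discussion posts can be rolled up per student like this:

```python
from collections import defaultdict

# Hypothetical LMS event log; real schemas vary by platform.
events = [
    {"student": "alice", "type": "page_view", "page": "unit1"},
    {"student": "alice", "type": "page_view", "page": "unit2"},
    {"student": "alice", "type": "login"},
    {"student": "bob",   "type": "login"},
    {"student": "bob",   "type": "discussion_post", "thread": "week1"},
]

# Roll events up into per-student engagement counters.
metrics = defaultdict(lambda: defaultdict(int))
for e in events:
    metrics[e["student"]][e["type"]] += 1

print(dict(metrics["alice"]))  # {'page_view': 2, 'login': 1}
```

Counts of this kind are what ultimately feed the instructor-facing dashboards described below.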
This information, as with other types of analytics, is presented in a visual format, often as a dashboard, a format with its roots in business intelligence software. A well-constructed visual display of data makes interpreting course activity faster and simpler.
When used effectively, this information can help instructors and institutions identify students who may need additional support and encouragement, and help determine the most effective type of student support intervention.
However, engagement analytics do not necessarily measure learning, per se. What’s measured is student activity, which may or may not signal actual learning. For example, engagement analytics is often used to track student page views. The student’s presence on that particular page within the course site tells us that the student has been exposed to that part of the curriculum. But it doesn’t tell us whether the student understands the curriculum. In fact, it may be that the student inadvertently left the browser window open while searching the Internet.
Writer and researcher Stephen Downes, who specializes in online learning, describes the challenge of using engagement analytics this way:
“There are different tools for measuring learning engagement, and most of them are quantificational. The obvious ones [measure] page access, time-on-task, successful submission of question results – things like that. Those are suitable for a basic level of assessment. You can tell whether students are actually doing something. That’s important in certain circumstances. But to think that constitutes analytics in any meaningful sense would be a gross oversimplification.”
Learning analytics measure the student’s actual learning state: what students know, what they don’t know, and why. We propose that the category of learning analytics be reserved for analytics that actually measure changes in a student’s knowledge and skill level, with respect to specific curriculum. The insights generated from true learning analytics support optimization of learning through information, recommendation and personalization. Learning analytics are actionable.
Examples of the type of information that can be captured by learning analytics include:
- What aspects of the course did the student master?
- Which students are struggling, and with which concepts, topics and problems?
- What misconceptions about the curriculum are leading to poor performance?
- What topics require more attention or better presentation?
The data for learning analytics is captured through frequent formative and summative assessments. Based on the data generated from a student’s interaction with these assessments, it is now possible, as a result of extensive research at Carnegie Mellon University and elsewhere, to derive strikingly accurate measurements of student knowledge and skills. Learning theory offers explanations of the mechanisms of learning (e.g., the power law of learning, cognitive load, rate of learning, learning decay, etc.) and these cognitive factors can be incorporated into learning analytic models to measure, predict and respond to student performance in the online course.
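One of the mechanisms named above, the power law of learning (often called the power law of practice), can be stated compactly: error rate falls as a power function of the number of practice opportunities. A minimal sketch, with made-up parameter values chosen purely for illustration:

```python
def predicted_error_rate(n, a=0.5, b=0.4):
    """Power law of practice: error rate E(n) = a * n**(-b).

    n: number of practice opportunities (>= 1)
    a: error rate on the first attempt (illustrative value)
    b: learning-rate exponent (illustrative value)
    """
    return a * n ** (-b)

# Error rate drops steeply early, then flattens -- the classic power-law curve.
for n in (1, 2, 4, 8, 16):
    print(n, round(predicted_error_rate(n), 3))
```

In a real learning-analytics model, the parameters would be estimated from assessment data per student and per skill; the curve itself is what lets a system predict performance rather than merely report activity.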
The insights produced by learning analytics can be used to create dashboard-style reports of student performance or to modify a student’s experience in real-time, or both. Learning dashboards give learners, faculty and institutions a visual snapshot of each student’s performance, as it relates to specific learning objectives.
The information can also be used in real-time to continuously adapt the instructional activities (e.g., level of difficulty) presented to the learner to match individual needs. Automated recommendations help create a personal path of learning for each student. Practice is personalized so that students receive the right amount of practice, targeted at the right level, for the right topics. Instructors and mentors receive dashboards and alerts to guide more timely and effective interactions and interventions. Instructional design teams use the data to measure efficacy of course materials, and continuously improve the learning experience, saving time and resources.
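The adaptive loop described above can be sketched as follows: pick the practice item whose predicted success rate for this learner is closest to a target level. The mastery scale, the toy success model, and the 70% target are all illustrative assumptions, not any vendor's actual algorithm.

```python
def pick_next_item(items, mastery, target=0.7):
    """Choose the item whose predicted success probability is closest to target.

    items: list of (item_id, difficulty) with difficulty in [0, 1]
    mastery: learner's estimated skill in [0, 1]
    Predicted success uses a deliberately simple illustrative model:
    a 0.5 baseline shifted by (mastery - difficulty), clipped to [0, 1].
    """
    def predicted_success(difficulty):
        return min(1.0, max(0.0, 0.5 + mastery - difficulty))
    return min(items, key=lambda it: abs(predicted_success(it[1]) - target))

items = [("easy", 0.2), ("medium", 0.5), ("hard", 0.8)]
print(pick_next_item(items, mastery=0.4))  # ('easy', 0.2)
```

As the learner's estimated mastery rises, the same rule automatically selects harder items, which is the essence of targeting practice "at the right level."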
Each of the three types of analytics offers value. And there is inevitably some overlap between the approaches. But learning analytics is the only approach upon which educators can confidently determine the actual state of a student’s learning. It provides a true foundation for new opportunities to improve and optimize learning.
Collaboration, Analytics, and the LMS: A Conversation with Stephen Downes. Campus Technology. Retrieved February 6, 2014. http://campustechnology.com/newsletters/ctfocus/2010/10/collaboration_analytics_and-the-lms_a-conversation-with-stephen-downes.aspx
Dr. Keith Hampson is Managing Director, Client Innovations at Acrobatiq, a Carnegie Mellon University venture born out of CMU’s long history in cognitive science, human-computer interaction, and software engineering. In addition to adaptive “intelligent” courseware and learning analytics, Acrobatiq offers a range of consulting and professional development services for colleges and universities that increase the quality of their digital programs.