I’ve known Alfred Essa for a couple of years. From the start, Al struck me as one of those people who are truly focused on improving higher education. After a career at various organizations in the US, including MIT and Minnesota State Colleges and Universities, Essa now serves as the Director of Innovation and Analytics Strategy at Desire2Learn, the Canadian education technology company (which recently accepted an $80 million investment from OMERS).
KCH: I suspect there is still considerable confusion about what the term “analytics” means when applied to higher education. Can you provide your own high-level perspective?
AE: Analytics is sometimes characterized, misleadingly, as “data-driven decision making”. Why is this misleading? Because we use data all the time to make decisions. From a user perspective, analytics is fundamentally not about having more data. It’s about having access to quality data, relevant data, and contextual data. That’s a very hard problem, and it can’t be solved by merely building very large databases. When it comes to analytics at Desire2Learn our mantra is: “Less Data, More Insights.” Analytics is as much about design as it is about technology. On the technology side it is now possible to leverage powerful new techniques such as machine learning to uncover hidden patterns. But that’s only the first step. On the design side we refine, shape, and expose those patterns visually so that they can be interpreted by humans. On the user side analytics means looking at the world empirically and statistically, which for most of us is not intuitive and requires training.
KCH: Are there different degrees to which institutions can use analytics?
AE: It’s useful to think of analytics as having three levels or degrees of maturity. Analytics (Level I), which is the current state of most tools and also of organizational capability, is reporting. It is the realm of traditional business intelligence. What are student retention, achievement, and completion rates? How do they break down by demographics? What is the trend? Analytics I is retrospective. It’s data about the past. At best, it’s data about the present. It’s like a rear-view mirror in a car. You need it, but imagine having a car where you can see where you came from but not what is ahead of you. Analytics II is data about the future. Where will I end up? With Analytics II we begin to use advanced techniques such as predictive modeling to forecast the future. Analytics III is the holy grail. Anytime someone wants to go from point A to point B, there will be multiple routes. Analytics III is about finding the optimal path among many to reach one’s destination. The most competitive institutions will be those who are either at Analytics III or well on their way. The least competitive ones haven’t even begun.
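To make the jump from Analytics I (reporting) to Analytics II (prediction) concrete, here is a minimal sketch of a predictive model of the kind Essa describes. Everything in it is invented for illustration: the synthetic data, the assumption that weekly logins predict course completion, and the function names. A real institutional model would use many more features and a vetted toolchain; this just shows the shape of the idea, a tiny logistic regression fit by gradient descent.

```python
import math

# Hypothetical synthetic data: (average weekly logins, completed course? 1/0).
# Invented for illustration only -- not real student data.
students = [(1, 0), (2, 0), (3, 0), (4, 1), (5, 1),
            (6, 1), (7, 1), (2, 0), (5, 1), (3, 1)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Fit weight w and bias b by plain gradient descent on the log-loss.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(2000):
    gw = gb = 0.0
    for x, y in students:
        p = sigmoid(w * x + b)   # predicted completion probability
        gw += (p - y) * x        # gradient of log-loss w.r.t. w
        gb += (p - y)            # gradient of log-loss w.r.t. b
    w -= lr * gw / len(students)
    b -= lr * gb / len(students)

def completion_probability(logins):
    """Predicted probability that a student with this engagement completes."""
    return sigmoid(w * logins + b)

# A low-engagement student scores below 0.5; a high-engagement one above.
print(completion_probability(1), completion_probability(6))
```

Reporting (Analytics I) would stop at tabulating the `students` list; the fitted `completion_probability` function is the Analytics II step, a forecast that can flag at-risk students before the term ends. Analytics III would go further and search over possible interventions for the one that most improves that probability.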
KCH: Using data to plan and manage organizations is now fundamental in other sectors. What are the key obstacles to greater use of analytics in higher ed?
AE: There are two key obstacles. The first obstacle is technical. Under the hood, analytics is technically very complex. Most organizations lack the resources and talent to do it on their own. The second obstacle is dogmatic. The reigning dogma is that we know how students learn. The reality is that we know very little. Eric Mazur’s work at Harvard illustrates this point. The promise of analytics is that we can finally take an empirical approach to understanding the learning process and use evidence, rather than guesswork or speculation, as our guide for determining what works and what doesn’t. With analytics our understanding of learning is about to transition from alchemy to science.
KCH: Which segments of the higher education market have moved most quickly to implement analytics?
AE: The for-profit sector has taken an early lead, at least in recognizing that analytics can be a differentiator. Markets tend to concentrate the mind. The non-profits have taken longer to awaken from their dogmatic slumber. But they are waking up. In the long run, an advantage that non-profits have is the value they place on research and collaboration. This will be a critical driver for successful adoption. There is also the sideshow going on with MOOCs, which will be interesting to watch.
KCH: Faculty can be resistant to the idea of implementing analytics, as some interpret it as a tool for tracking their performance. How do you work past this issue?
AE: I remember a conversation several years ago with Mark Milliron on this topic. Mark is a pioneer in the field of analytics. His advice to me was that you have to flip the conversation with faculty. The goal of analytics should be to empower students and instructors. If faculty perceive that analytics is yet another regime by administrators to use data about them against them, then analytics will not succeed. I think of analytics as real-time instrumentation for pilots and air-traffic controllers. The student is the pilot on their learning journey. They need real-time instruments in the cockpit to help them navigate through various obstacles. Air-traffic controllers also need real-time instruments to make sure all planes in their charge are on course. The measures and instrumentation for evaluating pilots and air-traffic controllers are different, and not primary. The primary goal of analytics should be to empower, not evaluate. We often confuse the two.