This post is Part 2. Part 1 can be found here

The Growing Chasm in the Online Higher Education Market (2 of 2)

One of the key characteristics that distinguishes faster growing, more scalable, and increasingly high-quality online universities (described in “The Growing Chasm”) is the systematic use of knowledge about what works in online instruction and what doesn’t. This handful of US institutions tends to capture more data about student learning, learn from it, and act on it.

As simple as this process sounds, it’s difficult to implement in our traditional colleges and universities, where course design and development is typically a highly decentralized activity. Instruction is determined on a course-by-course basis, and there’s rarely a systematic, robust process in place for identifying and sharing knowledge about what drives student-learning outcomes most effectively. I recall, as a new faculty member, being surprised to learn that my colleagues had little to no knowledge of how others in the department ran their courses, for example.

This needn’t be the case, though. The required technology now exists, and if used creatively in conjunction with basic change-management practices, it can increase the volume and quality of information sharing, even within the most decentralized institutions.

Initiatives to drive instructional innovation are most likely to be led by central support units responsible for supporting online courses and programs. It’s in these departments that the pressure to ensure quality and bring about change is felt most acutely.

The core elements include:

  1. Get buy-in
  2. Define the current state of affairs
  3. Make new approaches clear and easy to adopt
  4. Distribute tools to measure student learning

A Few Details . . .

Pitch it. Put together a clear and compelling description of your plan. Use it to get feedback and solicit buy-in. (It won’t hurt if you get buy-in from people with influence, but passion goes a long way, too.)

Catalogue it. Take an inventory of the instructional practices currently in use. (Maintain full confidentiality of instructors.) Once it’s organized for simple review, share this inventory with all stakeholders. There will be some surprises. And given the current dearth of shared information, many educators will benefit from this tactic alone: it finally answers the question, “What are others doing?”

Package it. Ask a representative team to select 8-10 interesting instructional strategies that they believe would be of value to others within the institution. Showcase these examples through events, a dedicated website, external conferences, and any other means available. Be an insufferable promoter. Reconfigure these 8-10 instructional strategies so that they can be easily understood and copied. Vagueness, here, is your enemy. Be clear, simple and, above all, concrete. As Dan and Chip Heath suggest: if you want people to eat a healthier mix of foods, don’t tell them to “eat low-fat foods”, tell them to “buy skim milk.”

Assess it. Implement learning analytics that each instructor can use to measure student learning. If the goal is to have students learn more, and more quickly, then the analytics must actually measure student learning, not merely track student behaviour (an approach I call “engagement analytics”). The types of information collected by engagement analytics often include:

  • Number of page views (per page)
  • Contributions by students to discussion threads
  • Which students (and what percentage of the total cohort) have completed the assignments
  • Number of logins
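To make the distinction concrete, metrics like those above can be derived mechanically from a raw LMS activity log. The sketch below is a minimal illustration, not a real LMS integration: the event schema, field names, and sample data are all hypothetical.

```python
from collections import Counter

# Hypothetical LMS event log: (student_id, event_type, target).
# A real system would pull these from the LMS database or an export.
events = [
    ("s1", "page_view", "week1/intro"),
    ("s1", "page_view", "week1/intro"),
    ("s2", "page_view", "week1/reading"),
    ("s1", "discussion_post", "thread-3"),
    ("s1", "login", None),
    ("s2", "login", None),
    ("s1", "assignment_submit", "hw1"),
]

def engagement_summary(events, cohort_size):
    """Compute the four engagement metrics listed above from raw events."""
    views = Counter(t for _, e, t in events if e == "page_view")
    posts = Counter(s for s, e, _ in events if e == "discussion_post")
    logins = Counter(s for s, e, _ in events if e == "login")
    submitted = {s for s, e, t in events if e == "assignment_submit" and t == "hw1"}
    return {
        "views_per_page": dict(views),          # page views, per page
        "posts_per_student": dict(posts),       # discussion contributions
        "logins_per_student": dict(logins),     # number of logins
        "hw1_completion_rate": len(submitted) / cohort_size,  # % of cohort done
    }

summary = engagement_summary(events, cohort_size=2)
print(summary["hw1_completion_rate"])  # 0.5
```

Note that every number here describes activity, not understanding, which is exactly the limitation discussed next.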

Engagement analytics do not necessarily measure learning, per se. What’s measured is student activity, which may or may not signal actual learning. For example, engagement analytics are often used to track student page views. The student’s presence on a particular page within the course site tells us that the student has been exposed to that part of the curriculum. But it doesn’t tell us whether the student understands the curriculum. In fact, the student may simply have left the browser window open while searching the Internet.

Learning analytics, on the other hand, measure the student’s actual learning state: what students know, what they don’t know, and why. It’s this kind of information that’s needed if individual educators are going to imagine new and better ways to stimulate learning. See figure A. Questions that learning analytics can answer include:
  • What aspects of the course did the student master?
  • Which students are struggling, and with which concepts, topics and problems?
  • What misconceptions about the curriculum are leading to poor performance?
  • What topics require more attention or better presentation?
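The simplest route to this kind of data is to tag each assessment item with the concept it tests and then aggregate correctness per student per concept. The sketch below assumes that tagging has already been done; the response data, concept names, and mastery threshold are illustrative assumptions, not a prescribed method.

```python
# Hypothetical assessment data: each response links a student, a quiz item,
# the concept the item tests, and whether the answer was correct.
responses = [
    ("s1", "q1", "fractions", True),
    ("s1", "q2", "fractions", True),
    ("s1", "q3", "decimals", False),
    ("s2", "q1", "fractions", False),
    ("s2", "q2", "fractions", False),
    ("s2", "q3", "decimals", True),
]

def concept_mastery(responses, threshold=0.7):
    """Per-student, per-concept share of correct answers.

    Returns the mastery scores and a sorted list of (student, concept)
    pairs below the threshold -- i.e. who is struggling, and with what.
    """
    totals, correct = {}, {}
    for student, _, concept, ok in responses:
        key = (student, concept)
        totals[key] = totals.get(key, 0) + 1
        correct[key] = correct.get(key, 0) + (1 if ok else 0)
    scores = {k: correct[k] / totals[k] for k in totals}
    struggling = sorted(k for k, v in scores.items() if v < threshold)
    return scores, struggling

scores, struggling = concept_mastery(responses)
print(struggling)  # [('s1', 'decimals'), ('s2', 'fractions')]
```

Even this toy report answers the first two questions above directly: it shows which concepts each student has mastered and which students are struggling with which topics. Clusters of low scores on one concept are a prompt to look for a shared misconception or a weak presentation of that topic.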

Keith Hampson, PhD is the founder of digital / edu / strategy, a research and consulting service that helps colleges, universities and education businesses develop better strategies for maximizing value. 
