This post is Part 2. Part 1 can be found here.
The Growing Chasm in the Online Higher Education Market (2 of 2)
One of the key characteristics that distinguishes the faster-growing, more scalable, and increasingly high-quality online universities (described in “The Growing Chasm”) is the systematic use of knowledge about what works in online instruction and what doesn’t. This handful of US institutions tends to capture more data about student learning, learn from it, and act on it.
As simple as this process sounds, it’s difficult to implement in our traditional colleges and universities, where course design and development are typically very decentralized activities. Instruction is determined on a course-by-course basis, and there’s rarely a systematic, robust process in place for identifying and sharing knowledge about what drives student learning outcomes most effectively. I recall, as a new faculty member, being surprised to learn that my colleagues had little to no knowledge of how others in the department ran their courses.
This need not be the case, though. The required technology now exists, and if used creatively in conjunction with basic change-management practices, it can increase the volume and quality of information sharing, even within the most decentralized institutions.
Initiatives to drive instructional innovation are most likely to be led by the central support units responsible for supporting online courses and programs. It’s in these departments that the pressure to ensure quality and bring about change is felt most acutely.
The core elements include:
- Get buy-in
- Define the current state of affairs
- Make new approaches clear and easy to adopt
- Distribute tools to measure student learning
A Few Details . . .
Pitch it. Put together a clear and compelling description of your plan. Use it to get feedback and solicit buy-in. (It won’t hurt if you get buy-in from people with influence, but passion goes a long way, too.)
Catalogue it. Take an inventory of the instructional practices currently in use, maintaining full confidentiality for instructors. Once organized for simple review, share this inventory with all stakeholders. There will be some surprises. And given the current dearth of information about what others are doing, many educators will benefit from this tactic alone.
Package it. Ask a team of representatives from across the institution to select 8-10 interesting instructional strategies that they believe would be of value to others. Showcase these examples using events, a dedicated website, external conferences, and any other means available. Be an insufferable promoter. Reconfigure these 8-10 instructional strategies so that they can be easily understood and copied. Vagueness here is your enemy. Be clear, simple and, above all, concrete. As Dan and Chip Heath suggest: if you want people to eat a healthier mix of foods, don’t tell them to “eat low-fat foods”; tell them to “buy skim milk.”
Assess it. Implement learning analytics that each instructor can use to measure student learning. If the goal is to have students learn more, and learn it more quickly, then the analytics must actually measure student learning, not merely track student behaviour, which I call “engagement analytics”. The type of information collected in engagement analytics often includes:
- Number of page views (per page)
- Contributions by students to discussion threads
- Which students (and what percentage of the total cohort) have completed the assignments
- Number of logins
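As a concrete illustration, the metrics above can all be rolled up from a raw activity log with simple counting. The event names, log format, and sample data in this sketch are hypothetical, not taken from any particular LMS:

```python
from collections import Counter

# Hypothetical LMS activity log: (student_id, event_type, detail)
events = [
    ("s1", "page_view", "week1/intro"),
    ("s1", "page_view", "week1/intro"),
    ("s2", "page_view", "week1/reading"),
    ("s1", "discussion_post", "thread-3"),
    ("s1", "login", ""),
    ("s2", "login", ""),
    ("s2", "assignment_submit", "hw1"),
]
cohort = {"s1", "s2", "s3"}  # all enrolled students

# Number of page views, per page
page_views = Counter(d for _, t, d in events if t == "page_view")

# Contributions by students to discussion threads
posts = Counter(s for s, t, _ in events if t == "discussion_post")

# Which students completed assignment "hw1", and what share of the cohort
submitted = {s for s, t, d in events if t == "assignment_submit" and d == "hw1"}
completion_rate = len(submitted) / len(cohort)

# Number of logins, per student
logins = Counter(s for s, t, _ in events if t == "login")
```

Note that everything here is a count of activity; nothing in the log says whether any of it produced understanding, which is exactly the limitation discussed next.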
Engagement analytics do not necessarily measure learning, per se. What’s measured is student activity, which may or may not signal actual learning. For example, engagement analytics are often used to track student page views. A student’s presence on a particular page within the course site tells us that the student has been exposed to that part of the curriculum, but it doesn’t tell us whether the student understands it. In fact, the student may simply have left the browser window open while searching the Internet. Learning analytics, by contrast, should answer questions such as:
- What aspects of the course did the student master?
- Which students are struggling, and with which concepts, topics and problems?
- What misconceptions about the curriculum are leading to poor performance?
- What topics require more attention or better presentation?
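To make the contrast with engagement analytics concrete, here is a minimal sketch of how item-level assessment results could be rolled up into per-concept mastery scores that begin to answer the questions above. The sample data, concept labels, and 70% mastery threshold are illustrative assumptions, not a standard:

```python
from collections import defaultdict

# Hypothetical item-level quiz results: (student_id, concept, answered_correctly)
results = [
    ("s1", "recursion", True),
    ("s1", "recursion", True),
    ("s1", "pointers", False),
    ("s2", "recursion", False),
    ("s2", "pointers", False),
    ("s2", "pointers", True),
]

MASTERY_THRESHOLD = 0.7  # arbitrary cut-off chosen for this sketch

# Fraction of items each student answered correctly, per concept
scores = defaultdict(list)
for student, concept, correct in results:
    scores[(student, concept)].append(correct)
mastery = {key: sum(v) / len(v) for key, v in scores.items()}

# Which students are struggling, and with which concepts?
struggling = {key: s for key, s in mastery.items() if s < MASTERY_THRESHOLD}

# Which topics require more attention across the whole cohort?
by_concept = defaultdict(list)
for (student, concept), s in mastery.items():
    by_concept[concept].append(s)
weak_topics = {c: sum(v) / len(v) for c, v in by_concept.items()
               if sum(v) / len(v) < MASTERY_THRESHOLD}
```

Unlike page-view counts, these numbers are tied to what students got right and wrong, so an instructor can see not just who is active but who is struggling and on what.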
Let us know which of these tactics you’ve tried and which have served you well.