Was MITx a success? I’d like to add a few final thoughts about what MITx got right and what it missed to my earlier posts (see #1, #2, #3).

In terms of the basic traditional metric of course completion, the MITx experience was a success for me and over 7000 others. Combining my score on the final with the midterm, homeworks and labs, I earned a B. According to MITx, ~5% of the ‘registrants’ completed the course. Using more meaningful benchmarks, the completion rate was 10% of participants who looked at the first problem set, and 68% of those who earned at least one point on the midterm.

What are meaningful metrics for the success of a MOOC? Clearly, course completion numbers and rates shouldn’t be the only metrics, and they are probably not the most meaningful ones. How can we judge the value of these educational experiences and experiments? Ultimately, I found MITx to be a highly valuable experience, and mostly in ways that I wouldn’t have predicted. My initial interest was primarily in the course design and instructional technology, but the content of the course turned out to be far more compelling and engaging than I expected. Aside from specific content knowledge, I was also surprised at the extent to which my general appreciation for technology and engineering was enhanced.

Having invested myself personally in this question over the past six months, and having a strong professional interest as well, let me also consider it from a professional standpoint. My MITx student experience provided me with a different perspective on the merits of MOOCs and open education.

Given Duke’s partnership with Coursera, I will have the opportunity to investigate some of these questions here, and engagement across higher education with them will no doubt continue to increase in the coming months. While the activities of faculty and students in a MOOC are in many ways quite similar to those in traditional for-credit online courses, other aspects of this phenomenon are dramatically different and make direct comparisons likely inappropriate. Questions that should be addressed include:

  • What does the pipeline of students look like for a MOOC? Is it possible to establish benchmarks for engagement, retention and completion? What factors would influence these – discipline? length? pedagogy?
  • What student or faculty characteristics predict the success of a MOOC?
  • Is there a threshold number of successful students (varying by discipline or other factors) which justifies the effort of delivering a course like this?
  • How much does it cost, and what is the return on that investment (not merely in financial terms)?
  • What impact will MOOCs have on the faculty and campuses who sponsor them?
  • What happens when the same MOOC is offered again? Does the level of interest and investment increase or decrease? What about the level of quality?
  • What role can technology play in helping to identify true “students” versus lurkers? In promoting compliance with honor codes?

I also invite you to post your questions and thoughts about how the higher education community should judge the value of MOOCs for their institutions and the broader community of learners.

Overblown-Claims-of-Failure Watch: How Not to Gauge the Success of Online Courses
The Atlantic, Rebecca J. Rosen, July 22, 2012

Who takes MOOCs?
Inside Higher Ed, Steve Kolowich

What you need to know about MOOCs
The Chronicle of Higher Education – a summary of coverage over the past several years

Final thoughts on MITx

As for MITx in particular, I wanted to add a few suggestions to my previous ones about how this course or similar ones might be improved in future iterations.

Learners want to know where they stand. The clarity and transparency of grading and course expectations offered to students on the MITx platform was excellent. The incrementally built ‘Profile’ was well implemented from both a technical and an instructional design point of view. The obvious missing piece was context. Students clearly craved more information about their performance relative to other learners in the course. Although you could argue that MITx was simply taking a criterion-referenced rather than a norm-referenced approach, the number of grassroots student efforts using discussion board threads, polls, and even online surveys to generate this kind of data made it clear that students were eager to know how their performance stacked up. Simply posting the overall grade distribution on the midterm and final, for example, would have been one way to respond.

Fourteen weeks really is a long time for a MOOC. Maybe too long. For many students and faculty, whether at brick & mortar schools or online ones, a 14-week semester probably seems like a natural length of time for a course that is part of a degree program. For working professionals or other students taking just one stand-alone online course, the time commitment of fourteen straight weeks of assignments, due on time or not at all, was problematic. It’s not just me; there was quite a bit of discussion on the course site about the length, and I’m sure it was just as disheartening for the MITx staff as it was for me and other dedicated students to see quite a few students bail out after ten weeks because the course was just a few weeks too long. Shorter courses would likely appeal to a broader audience of learners. It’s no surprise to me now to see that Coursera and Udacity are offering many classes that are 8 weeks or less, although some have crept up toward three months. It appears that edX will retain the semester-length format in its future offerings. Oh, and Professor Agarwal? Spring break would have been nice (if for no other reason than to catch up!).

The community of learners (or at least parts of it) may be sustaining itself. The final exam ended on June 11. Since then, the MITx discussion board has remained active. Some of the most active students clearly aren’t intending for the experience to end. I’ve seen MITx spinoff coding projects to build new interactive educational content, collaborative Google Maps to visualize the locations of participants, students organizing study groups on follow-up content, and a variety of other interesting offshoots driven entirely by the course participants to maintain momentum and sustain the learning community.

As the edX team analyzes what has occurred in this pilot and (hopefully) shares those findings broadly, I suspect that although there will be a few metrics to track, the most interesting questions and outcomes will be qualitative:

  • How many MITx students will go on to enroll in (and complete) subsequent courses?
  • What impact will the MITx experience have on those who successfully completed the course but were not already enrolled in college (high school, hobbyists, working professionals)?
  • How will colleges (including MIT and others) respond to students asking for prior learning assessment after completing these types of courses?
  • What secondary impacts will occur through the re-use of MITx content by other educational institutions?

Yvonne Belanger

Prior to May of 2013, Yvonne led assessment and program evaluation for CIT and for university initiatives in which CIT takes a leading role. She also provided leadership to library assessment efforts.

