Last Sunday, a New York Times editorial educated us on “The Trouble with Online College.” The editorial focuses on the results of a longitudinal study by Columbia University’s Community College Research Center (CCRC). While the editors make some good points, they run into trouble when they stray from the facts.
The editorial states: “First, student attrition rates — around 90 percent for some huge online courses — appear to be a problem even in small-scale online courses when compared with traditional face-to-face classes.” This statement treats MOOCs and online for-credit courses as if they were the same thing with the same attrition rates. The 90% attrition rate for MOOCs is in line with reports that I have heard, but the CCRC study that is cited shows attrition rates closer to 10%, only slightly below that of face-to-face courses. That’s just plain disingenuous.
The Community College Research Center Study
While the Times’ editorial was a bit over-the-top, the CCRC research is a piece of work that we in the distance education community need to study closely and be ready to address. The study follows students at Washington’s community colleges over several years. Their research shows that:
“While all types of students in the study suffered decrements in performance in online courses, some struggled more than others to adapt: males, younger students, Black students, and students with lower grade point averages. In particular, students struggled in subject areas such as English and social science, which was due in part to negative peer effects in these online courses.”
Those of us in the distance education community should welcome such research. While we would like to see the same type of attention placed on the face-to-face courses, we should be eager to learn from research and continue to improve. And we agree that online education might not be for everyone, but we would like to hear face-to-face proponents admit that the traditional classroom experience is also not for everyone.
The report cites many findings that the detractors will use against distance education. We will need to be able to discern the helpful from the hyperbole.
I have read the whole report. While I have many thoughts, below are a few that I would like to share with you. I would really like to hear your take on the research in the comments below.
Students “Adapting” to Online Learning
I agree with how they have framed the issue in terms of how students have “adapted” to online learning. The vast majority of students spend the bulk of their education career without having taken an online course. It is an adjustment. It is not too surprising that we are still in a period in which institutions are still figuring out what works and students are figuring out how to succeed in an online course.
The biggest takeaway that I took from the research is the need to make sure that students are prepared to succeed in an online course. I disagree, however, with how the authors recommend addressing that issue. They are off-base in suggesting that students be “screened” out of an online class; they themselves ably cite the reasons that such screening would probably not work. Instead, I suggest that we rely on education. Colleges should pay more attention to assuring that students are prepared to take an online class. Some institutions require a short course to prepare students. I don’t know if any of Washington’s community colleges have such a requirement or even an optional service to help those new to online education. Focusing on those who are not ready for online education would also help to address the study’s concern about certain demographic groups falling behind: if intervention were based upon need, it could be applied regardless of group identity. Research on how students adapt when given the proper up-front help would be interesting.
The First Year Effect
It is very helpful that the study separates out the impact on students taking online courses in their first year at an institution. First-year students (as a whole) performed more poorly in the online environment, and many who did poorly in their first try at an online course did not enroll in such a course in subsequent years. This is good to know and lends more credence to the recommendation that early intervention is needed. Statistically, though, while the authors pulled out the first-year students for a separate analysis, they never pulled them out of the overall analysis. These poorly performing first-year students have to be pulling down the overall performance measures. What would the differences look like if you took out the “one and done” students? The performance differences would have to be much smaller.
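To see why leaving the “one and done” students in the pool matters, here is a quick back-of-the-envelope sketch in Python. All of the numbers are invented for illustration; they are not taken from the CCRC report.

```python
# Hypothetical illustration of how a poorly performing subgroup drags
# down a pooled average. None of these figures come from the CCRC study.
returning = {"n": 8000, "mean_gpa": 2.9}      # students who kept taking online courses
one_and_done = {"n": 2000, "mean_gpa": 2.1}   # first-year students who never returned
face_to_face_mean = 3.0                       # assumed comparison value

def pooled_mean(groups):
    """Enrollment-weighted mean GPA across subgroups."""
    total = sum(g["n"] for g in groups)
    return sum(g["n"] * g["mean_gpa"] for g in groups) / total

overall = pooled_mean([returning, one_and_done])

print(f"online vs. face-to-face gap, all students:      {face_to_face_mean - overall:.2f}")
print(f"online vs. face-to-face gap, returning only:    {face_to_face_mean - returning['mean_gpa']:.2f}")
```

With these made-up numbers, the apparent gap shrinks from 0.26 to 0.10 once the “one and done” group is excluded, which is the shape of the effect the paragraph above is describing.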
The Impact of Student Services is Missing
For further research, the impact of student services needs to be included. The assumption in the fourth recommendation is that the problem is with the “quality of all online courses taught at the college.” For the underprepared students that the study worries about most, student support services (advising, tutoring, library resource materials, study skills assistance, technical assistance) could be the differentiator. These services may be readily available on campus, but available only on a limited basis, or not at all, to online students. Those differences are not measured by the study. Overlooking the importance of quality student services is a mistake often made by those new to distance education.
Is “Statistically Significant” Actually Significant?
When people hear “statistically significant,” they often do not fully understand the implications of that term. With a large enough sample, almost any difference becomes statistically significant. So while some of the differences reported are “statistically significant,” they might not be “practically significant” enough to warrant large-scale interventions. What would it take to close the 1% gap in attrition rates cited for some groups?
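The sample-size point can be shown concretely with a standard two-proportion z-test. The attrition rates and sample sizes below are invented for illustration; the only thing this sketch demonstrates is that the identical one-percentage-point gap is "not significant" at one sample size and highly "significant" at another.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Z-statistic for a two-proportion test with a pooled standard error."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# The same 1-point gap in attrition (10% vs. 9%), at two sample sizes:
small = two_proportion_z(0.10, 500, 0.09, 500)        # hypothetical small study
large = two_proportion_z(0.10, 50_000, 0.09, 50_000)  # hypothetical large study

print(f"z with    500 per group: {small:.2f}")  # below the 1.96 cutoff
print(f"z with 50,000 per group: {large:.2f}")  # far above the 1.96 cutoff
```

The underlying difference is identical in both cases; only the sample size changed. That is why "statistically significant" on its own says nothing about whether a gap is large enough to justify an intervention.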
Will the Figlio Study Never Die?
The Figlio study cited on page four is not about online courses; it is about using video in a face-to-face course. When the Chronicle of Higher Education made a similar error in reporting the results of the study, it changed its headline after several people objected. The study is continually misused. Sorry…this is a pet peeve of mine.
The New York Times reaches an unsupported conclusion:
“The online revolution offers intriguing opportunities for broadening access to education. But, so far, the evidence shows that poorly designed courses can seriously shortchange the most vulnerable students.”
Is it really “poorly designed courses”? We need to continue to pay close attention to the quality of distance education, but even the CCRC research cites another big contributor to the differences: the students have not yet “adapted” to this type of learning. They are inexperienced with it. Additional contributing factors are the availability of student services and the ability of the faculty, many of whom are also new to the online environment.
In this blog posting, I leaned toward defending some distance education practices, because the New York Times editorial shows us that this report will be cited often, and often cited incorrectly. The CCRC research is an important advancement, offering new considerations about differences in performance by student characteristics and by academic subject area. We will need to review the report in more depth and discern what we can learn from it.
The discussion will go on. What’s your take on it?
Photo: Morgue File: http://www.morguefile.com/archive/display/53841