By Phil Hill and Russ Poulin, cross-posted to e-Literate blog.

Last week the National Center for Education Statistics (NCES) released a report analyzing the new IPEDS data on distance education. The report, titled Enrollment in Distance Education Courses, by State: Fall 2012, is a welcome addition for those interested in analyzing and understanding the state of distance education (mostly in an online format) in US higher education.

The 2012 Fall Enrollment component of the Integrated Postsecondary Education Data System (IPEDS) survey collected data for the first time on enrollment in courses in which instructional content was delivered exclusively through distance education, defined in IPEDS as “education that uses one or more technologies to deliver instruction to students who are separated from the instructor and to support regular and substantive interaction between the students and the instructor synchronously or asynchronously.” These Web Tables provide a current profile of enrollment in distance education courses across states and in various types of institutions. They are intended to serve as a useful baseline for tracking future trends, particularly as certain states and institutions focus on MOOCs and other distance education initiatives from a policy perspective.

We have previously done our own analysis of the new IPEDS data at both the e-Literate and WCET blogs. While the new report is commendable for improving access to this important dataset, we feel that the missing analysis and the potentially misleading introductory narrative take away from the report's value.

Value of Report

The real value of this report, in our opinion, is the breakdown of IPEDS data by variables such as state jurisdiction, control of institution, sector, and student level. Most people are not going to go to the trouble of generating custom tables, so including such data in a simple PDF report will go a long way towards improving access to this important data. As an example of the data provided, consider this excerpt of Table 3:

Sample of part of IPEDS table with state-by-state analyses of distance ed enrollments.

The value of the data tables and the improved access to this information are precisely why we are concerned about the introductory text of the report. These reports matter.

Need for Better Analysis and Context

We were hoping to see some highlights or observations in the report, but the authors decided to present the results as “Web Tables” without any interpretation. From one standpoint, this is commendable because NCES is playing an important role in providing the raw data for pundits like us to examine. It is also understandable: since this was the first IPEDS survey regarding distance education in many years, there truly were no baseline data for comparison. Even so, a few highlights of significant data points would have been helpful.

There is also a lack of caveats. The biggest one has to do with the state-by-state analyses. Enrollments are counted in the state where the institution is located, not where the student is located while taking the distance courses. Consider Arizona: the state has several institutions (Arizona State University, Grand Canyon University, Rio Salado College, and the University of Phoenix) that enroll large numbers of distance students who reside in other states. Those enrollments are all counted in Arizona, so the state-by-state comparisons have specific meanings that might not be apparent without some context.
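To make that caveat concrete, here is a minimal, hypothetical sketch; the enrollment figures are invented purely for illustration, not actual IPEDS numbers. It shows how attributing enrollments to the institution's home state, rather than to the student's, changes the state-level picture:

```python
# Hypothetical illustration of how IPEDS attributes distance enrollments
# to the institution's home state rather than to the state where the
# student actually resides. All numbers below are made up.
from collections import Counter

# Each record: (institution_state, student_state, enrollments) -- invented data.
records = [
    ("AZ", "AZ", 10_000),
    ("AZ", "CA", 25_000),
    ("AZ", "TX", 15_000),
    ("CA", "CA", 12_000),
]

by_institution_state = Counter()  # how IPEDS counts
by_student_state = Counter()      # where the students actually are

for inst_state, student_state, n in records:
    by_institution_state[inst_state] += n
    by_student_state[student_state] += n

print(dict(by_institution_state))  # {'AZ': 50000, 'CA': 12000}
print(dict(by_student_state))      # {'AZ': 10000, 'CA': 37000, 'TX': 15000}
```

In this made-up example, Arizona gets credit for all 50,000 enrollments at its institutions even though most of those students sit in other states, which is exactly the caveat a reader needs when comparing states in the report's tables.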

Even though there are no highlights, the first two paragraphs contain a (sometimes odd) collection of references to prior research. These citations raise the question of what the tables in this report have to say on the same points of analysis.

Postsecondary enrollment in distance education courses, particularly those offered online, has rapidly increased in recent years (Allen and Seaman 2013).

This description cites the long-running Babson Survey Research Group report by Allen and Seaman. Since the current IPEDS survey provides baseline data, there is no prior work against which to judge growth; therefore, this reference makes sense to include. It would have made sense, however, to provide some explanation of the key differences between the IPEDS and Babson data. For example, Phil described in e-Literate the major discrepancy in the number of students taking at least one online course – 7.1 million for Babson versus 5.5 million for IPEDS. Jeff Seaman, one of the two Babson authors, is also quoted in e-Literate on his interpretation of the differences. The NCES report would have done well to at least refer to these significant differences.

Traditionally, distance education offerings and enrollment levels have varied across different types of institutions. For example, researchers have found that undergraduate enrollment in at least one distance education course is most common at public 2-year institutions, while undergraduate enrollment in online degree programs was most common among students attending for-profit institutions.

This passage indirectly cites a previous NCES survey of students in 2007-08 that used a different methodology.

  • That survey found that enrollment in at least one distance education course was “most common” at public 2-year colleges, and the new data reaffirms that finding.
  • Enrollment in fully distance programs was “most common” among students attending for-profit institutions, and the new data reaffirms that finding as well. However, leaving the story there perpetuates the myth that “distance education” equals “for-profit education.” The new IPEDS data show (see Table 1 below, from a WCET post by Russ) that 35% of students enrolled exclusively at a distance attend for-profit institutions, and only 5% of those who enroll in some (not all) distance courses attend for-profits. People are often amazed at how large a portion of the distance education market is actually in the public sector (a short sketch of the percentage arithmetic follows the table below).

IPEDS table showing fully distance enrollments by sector: 47% at public colleges, 18% at non-profits, and 35% at for-profits.
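For readers who want to reproduce that kind of sector breakdown themselves, here is a small sketch of the arithmetic. The counts below are placeholders chosen only to illustrate the calculation (they round to the same shares as the table above), not the actual IPEDS figures, which are in the report's tables:

```python
# Sector shares for students enrolled exclusively in distance courses.
# The counts are placeholder values chosen to illustrate the arithmetic,
# not the actual IPEDS figures.
exclusive_distance = {
    "Public": 1_250_000,
    "Private nonprofit": 480_000,
    "Private for-profit": 930_000,
}

total = sum(exclusive_distance.values())
for sector, count in exclusive_distance.items():
    share = 100 * count / total
    print(f"{sector}: {share:.0f}% of exclusively-distance enrollments")
# Public: 47%, Private nonprofit: 18%, Private for-profit: 35%
```

The same divide-by-the-total step applies to any of the report's breakdowns, whether by state, sector, or student level.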

A 2003 study found that historically black colleges and universities (HBCUs) and tribal colleges and universities (TCUs) offered fewer distance education courses compared with other institutions, possibly due to their smaller average size (Government Accountability Office 2003).

What a difference a decade makes. Both types of institutions show few of their students enrolled completely at a distance, but they are now above the national average in terms of the percentage of students enrolled in some distance courses in Fall 2012.

Rapidly changing developments, including recent institutional and policy focus on massive open online courses (MOOCs) and other distance education innovations, have changed distance education offerings.

Only a small number of MOOCs offer instruction that would be included in this survey. We’re just hoping that the uninformed will not think that the hyperbolic MOOC numbers have been counted in this report. They have not.

Upcoming Findings on Missing IPEDS Data

We are doing some additional research, but it is worth noting that we have found some significant cases of undercounting in the IPEDS data. In short, there has been confusion over which students get counted in IPEDS reporting and which do not. We suspect that the undercounting, which is independent of distance education status, is in the hundreds of thousands. We will describe these findings in an upcoming article.

In summary, the new NCES report is most welcome, but we hope readers do not make incorrect assumptions based on the introductory text of the report.

Phil Hill
Mindwires.com
e-Literate blog

 

Russ Poulin
WCET – WICHE Cooperative for Educational Technologies

 

If you’re not already a member, come join us!

3 replies on “A Response to New NCES Report on Distance Education”

Thanks to Phil and Russ for their ongoing work on this important topic. I agree with your opinion that the real value of the NCES report is the breakdown of IPEDS data by different variables. It is interesting to look at these variables for useful and even startling data (e.g., why so few DL learners percentage-wise in CA or DE? Do DL learners really comprise almost half of all higher ed. students in AZ, and if so, why?).

It’s also important to find plausible explanations, if not exactly convergence, among different data sources. It seems as though the difference between the NCES and BSRG figures (7.1M vs. 5.5M) is explainable by undercounting (as you mention) + inclusion/exclusion of non-degree/certificate-seeking students + differences in DL course definitions (80% vs. 100%; I think I disagree with Jeff Seaman on this latter point). In fact, as noted in my 2012 book The Seven Futures of American Education (pp. ), the research firm Ambient Insight cites even larger numbers (12M DL learners in 2010) because their report included students from all postsecondary institutions that participated in Title IV federal student aid programs, including non-degree-granting institutions. The numbers from MOOCs and blended/hybrid courses will continue to blur the distinction between “distance” and “not distance” students.

In the end, however, we are better off having multiple numbers, because they will help compel us to look more deeply at the assumptions behind those numbers and to select different figures judiciously based on the assumptions needed at the moment. So long as this is done with reasonable transparency, this would be a good thing IMO. And just as the BSRG report data yielded useful patterns which informed subsequent practice, the IPEDS report can do the same in new and different ways as you both note…

