After billions of dollars spent on administrative computer systems and billions more invested in ed tech companies, the U.S. higher education system is woefully out of date and unable to cope with major education trends such as online and hybrid education, flexible terms, and the expansion of continuing and extended education. Based on an investigation of the recently released distance education data in IPEDS, the primary national education database maintained by the National Center for Education Statistics (NCES), we found significant confusion over basic definitions of terms, manual gathering of data outside the computer systems designed to collect it, and, due to confusion over which students to include in IPEDS data, the systematic non-reporting of large numbers of degree-seeking students.

In Fall 2012, the IPEDS (Integrated Postsecondary Education Data System) data collection included distance education for the first time, primarily covering online courses and programs. These data are important for policymakers and institutional enrollment management, as well as for the companies serving the higher education market.

We first noticed the discrepancies through feedback on analyses that we have both published on the e-Literate and WCET blogs. One of the most troubling calls came from a state university representative who said that the school had never reported any students who took credit-bearing courses through its self-supported, continuing education program. Since the school did not include those enrollments in reporting to the state, it did not report them to IPEDS. These were credits toward degrees and certificate programs offered by the university and therefore should have been included in IPEDS reporting, based on the following instructions:

[Photo: a calculator, focused on the Clear button. Caption: Calculating distance education enrollments hit some major snags.]

“Include all students enrolled for credit (courses or programs that can be applied towards the requirements for a postsecondary degree, diploma, certificate, or other formal award), regardless of whether or not they are seeking a degree or certificate.”

Unfortunately, the instructions also call out this confusing exclusion (one example out of four):

“Exclude students who are not enrolled for credit. For example, exclude: Students enrolled exclusively in Continuing Education Units (CEUs)”

How many schools have interpreted this continuing education exclusion to apply to all continuing education enrollments? As an initial check, we contacted several campuses in the California State University system and were told that all IPEDS reporting is handled at the system level. With the introduction of the Fall 2012 distance education questions, Cal State re-evaluated its reporting policy. A system spokesman explained:

“I’ve spoken with our analytic studies staff and they’ve indicated that the standard practice for data reporting has been to share only data for state-supported enrollments. We have not been asked by IPEDS to do otherwise so when we report distance learning data next spring, we plan on once again sharing only state-supported students.”

Within the Cal State system, this means that more than 50,000 students taking for-credit, self-support courses will not be reported. In fact, this student group has never been reported.

One reason for the confusion, as well as for the significance of this change, is that continuing education units have moved past their roots of offering CEUs and non-credit courses to the general public (hence the name continuing education) and taken up a new role of offering courses not funded by the state (hence self-support). Since these courses and programs are not state funded, they are not subject to the same oversight and restrictions as their state-funded equivalents, such as caps on tuition per credit hour.

This situation allows continuing education units in public schools to become laboratories and innovators in online education. The flip side is that, given the non-state-funded nature of these courses and programs, schools may not be reporting these for-credit enrollments through IPEDS, whether or not the students were in online courses. However, the new distance education questions may actually trigger changes in that practice.

Did Other Colleges Also Omit Students from Their IPEDS Report?

Given what was learned from the California State University System, we were interested in learning whether other colleges were having similar problems with reporting distance education enrollments to IPEDS. WCET conducted a non-scientific canvassing of colleges to get their feedback on what problems they may have encountered. Twenty-one institutions were selected by identifying colleges whose reported enrollment figures seemed incongruous with their size or distance education operations. See "Appendix A: Methodology" (link added after publishing) for more details.

From early August to mid-September, we sought answers regarding whether the colleges reported all for-credit distance education and online enrollments for Fall 2012.  If they did not, we asked about the size of the undercount and why some enrollments were not reported.

Typically, the response included some back-and-forth between the institutional research and distance education units at each college.  Through these conversations, we quickly realized that we should have asked a question about the U.S. Department of Education’s definition of “distance education.”   Institutions were very unclear about what activities to include or exclude in their counts.  Some used local definitions that varied from the federal expectations.  As a result, we asked that question as often as we could.

The Responses

Twenty institutions provided usable responses. We agreed to keep responses confidential. Table 1 provides a high-level summary of the responses to the following two questions:

  • Counts Correct? – Do the IPEDS data reported include all for-credit distance education and online enrollments for Fall 2012?
  • Problem with “Distance Education” Definition? – Although we did not specifically ask this question, several people volunteered that they had trouble applying the IPEDS definition.

Table 1: Counts for Institutional Responses

           Counts Correct?   Problem with “Distance Education” Definition?
  Yes            11                             3
  Maybe           5                             5
  No              4                            12

One institution declined to respond. Given that its website advertises many hundreds of online courses, the distance education counts it reported would lead us to believe that it either: a) under-reported, or b) averages one or two students per online class. The second scenario seems unlikely.

Of those that assured us they submitted correct distance education counts, some also reported having used their own definitions or processes for distance education. This would make their counts incomparable to those of the vast majority of other reporters.

Findings

This analysis found several issues that call into question the usability of IPEDS distance education enrollment counts and, more broadly and more disturbingly, IPEDS statistics in general.

There is a large undercount of distance education students

While only a few institutions reported an undercount, one was the California State University System and another was a large university system in another populous state. Since the same procedures were used throughout each system, a few hundred thousand students were not counted in just those two systems.

The California State University System has never reported students enrolled through its continuing education (self-support) units to IPEDS. A source of the problem may be the survey instructions. Respondents are asked to exclude: “Students enrolled exclusively in Continuing Education Units (CEUs).” The intent of this statement is to exclude those taking only non-credit courses. It is conceivable that some might misinterpret it to mean excluding everyone enrolled through the campus's continuing education division. What was supposed to be reported was the number of students taking for-credit courses, regardless of which college or institutional unit was responsible for offering the course.
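
The intended rule hinges on whether an enrollment is for credit, not on which unit offers it. Here is a minimal sketch of that logic in Python; the field names and records are hypothetical, purely for illustration:

    # Sketch of the intended IPEDS inclusion rule. Field names are hypothetical.
    def report_to_ipeds(enrollments):
        # Report the student if ANY enrollment is for credit, regardless of
        # which campus unit (state-supported or self-support) offers it.
        return any(e["for_credit"] for e in enrollments)

    # A student taking a for-credit course through a self-support continuing
    # education unit IS reported; a student taking only CEUs is not.
    student_a = [{"unit": "continuing_ed", "for_credit": True}]
    student_b = [{"unit": "continuing_ed", "for_credit": False}]  # CEUs only
    print(report_to_ipeds(student_a))  # True  -> include in IPEDS counts
    print(report_to_ipeds(student_b))  # False -> exclude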

In the other large system, out-of-state students are not reported because those enrollments do not receive funding from state coffers.

It is unclear what the total undercount would be if we knew the actual numbers across all institutions. Given that the reported total of “students enrolled exclusively in distance education courses” for Fall 2012 was 2,653,426, an undercount of a hundred thousand students from just these two systems would be roughly a 4% error. That percentage is attention-getting on its own.
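
For readers who want to check the arithmetic, here is a minimal sketch; the 100,000 figure is the rough, illustrative undercount discussed above, not a verified count:

    # Back-of-the-envelope check of the undercount percentage.
    # The undercount figure is illustrative, per the discussion above.
    reported_exclusive_de = 2_653_426  # Fall 2012 reported total
    assumed_undercount = 100_000       # rough estimate from two systems alone

    error = assumed_undercount / reported_exclusive_de
    print(f"Undercount relative to the reported total: {error:.1%}")  # ~3.8%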

The IPEDS methodology does not work for innovative programs…and this will only get worse

Because its courses use as many as 28 start dates, one institutional respondent estimated an approximately 40% undercount in its reported enrollments. A student completing a full complement of courses within a 15-week period might not be enrolled in all of those courses on the census date. With the increased use of competency-based programs, adaptive learning, and innovations still on the drawing board, it is conceivable that the census dates used by an institution (IPEDS gives some options) might not serve every type of educational offering.
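
To see why a single census date undercounts rolling enrollments, consider this minimal simulation. The course length, start dates, and census day are hypothetical, not drawn from the respondent's actual calendar:

    # Sketch: a census-day snapshot over rolling course start dates.
    # All of these numbers are hypothetical.
    CENSUS_DAY = 21       # snapshot taken on day 21 of the term
    COURSE_LENGTH = 28    # each course section runs 28 days
    TERM_LENGTH = 15 * 7  # a 15-week term, in days

    # Suppose a new section starts every week of the term.
    start_days = range(0, TERM_LENGTH, 7)

    # Only sections actually running on the census day are counted.
    active = [s for s in start_days if s <= CENSUS_DAY < s + COURSE_LENGTH]

    share = len(active) / len(start_days)
    print(f"Captured {len(active)} of {len(start_days)} sections "
          f"({share:.0%}); the rest are invisible to the census.")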

The definition of ‘distance education’ is causing confusion

It is impossible to get an accurate count of anything if there is not a clear understanding of what should or should not be included in the count.  The definition of a “distance education course” from the IPEDS Glossary is:

“A course in which the instructional content is delivered exclusively via distance education.  Requirements for coming to campus for orientation, testing, or academic support services do not exclude a course from being classified as distance education.”

Even with that definition, colleges faced problems with counting ‘blended’ or ‘hybrid’ courses. What percentage of a course needs to be offered at a distance for it to be counted in the federal report? Some colleges had their own standard (or one prescribed by the state), and the percentage required to label a course “distance education” varied greatly. One reported that it included all courses with more than 50% of the course offered at a distance.
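
A small sketch shows how much the count can swing on the threshold alone; the mix of courses below is invented purely for illustration:

    # Sketch: one course catalog counted under two different standards.
    # Each number is the (invented) fraction of instruction at a distance.
    online_fractions = [1.0, 1.0, 0.9, 0.8, 0.6, 0.5, 0.3, 0.0]

    # IPEDS-style standard: content delivered exclusively at a distance.
    ipeds_count = sum(1 for f in online_fractions if f == 1.0)

    # A common local standard: more than 50% at a distance.
    local_count = sum(1 for f in online_fractions if f > 0.5)

    print(f"IPEDS definition (exclusively online): {ipeds_count}")  # 2
    print(f"Local definition (>50% online):        {local_count}")  # 5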

To clarify the federal definition, one college called the IPEDS help desk. Even after escalating the issue to a second-line manager, the college was still unclear on exactly how to apply the definition.

The Online Learning Consortium is updating its distance education definitions. Its current work could inform IPEDS on possible definitions, but it probably contains too many categories for such widespread data gathering.

There is a large overcount of distance education students

Because many colleges used their own definitions, there is a massive overcount of distance education, at least relative to the current IPEDS definition. This raises the question: is the near-100% standard imposed by that definition useful in interpreting activity in this mode of instruction? Is it the correct standard, given that no one else seems to use it?

Addressing the anomalies makes IPEDS reporting burdensome, or the problems get ignored

In decentralized institutions, or in institutions with “self-support” units that operate independently from the rest of campus, data systems are often not connected. These institutions also face the task of simultaneously reconciling differing “distance education” definitions. One choice for institutional researchers is to knit together numbers from incompatible data systems, with differing definitions, often by hand. To their credit, institutional researchers overcome many such obstacles. Whether through misunderstanding the requirements or lacking the capacity to perform the work, some colleges did not tackle this burdensome task.
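
As a rough illustration of the knitting-together problem, here is a minimal sketch of reconciling per-student records from two unconnected systems; the IDs, records, and layout are hypothetical:

    # Sketch: reconciling records from two unconnected data systems.
    # IDs and course fractions are hypothetical; the point is deduplication
    # plus applying one shared "distance education" definition.
    from collections import defaultdict

    # student id -> online fraction of each course taken in that unit
    main_campus = {"S1": [0.0], "S2": [1.0], "S3": [0.6]}
    self_support = {"S3": [1.0], "S4": [1.0, 1.0], "S5": [0.5]}

    # Naive per-unit headcounts double-count students like S3.
    naive = len(main_campus) + len(self_support)  # 6

    merged = defaultdict(list)
    for system in (main_campus, self_support):
        for sid, fractions in system.items():
            merged[sid].extend(fractions)

    # IPEDS asks for students enrolled *exclusively* in DE courses.
    exclusive_de = sum(1 for courses in merged.values()
                       if all(f == 1.0 for f in courses))

    print(f"Naive: {naive}, deduplicated: {len(merged)}, "
          f"exclusively distance education: {exclusive_de}")  # 6, 5, 2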

Conclusions – We Don’t Know

While these analyses have shed light on the subject, we are still left with the feeling that we don't know what we don't know. That finding brings to mind former Secretary of Defense Donald Rumsfeld's famous rambling:

“There are known knowns. These are things we know that we know. We also know there are known unknowns. That is to say, there are things that we know we don’t know. But there are also unknown unknowns. There are ones we don’t know we don’t know.”

The net effect is not known

Some institutions reported accurately, some overcounted, some undercounted, and some did both at the same time. What should the actual count be?

We don’t know.

The 2012 numbers are not a credible baseline

The distance education field looked forward to the Fall 2012 enrollment statistics, with their new distance education numbers, as a welcome baseline for the size and growth of this mode of instruction. That is not possible, and the problems will persist in the Fall 2013 enrollment report when those numbers are released. These problems can be fixed, but it will take work. When can we get a credible baseline?

We don’t know.

A large number of students have not been included on ANY IPEDS survey, EVER.

A bigger issue for the U.S. Department of Education goes well beyond the laser-focused issue of distance education enrollments. Our findings indicate that there are hundreds of thousands of students who have never been reported on any IPEDS survey, ever. What is the impact on IPEDS? What is the impact on the states that systematically underreported large numbers of students?

We don’t know.

Who is at fault?

Everybody and nobody. IPEDS is faced with institutional practices that vary greatly and often change from year to year as innovations are introduced. Institutional researchers are faced with reporting requirements that vary by audience: state oversight agencies, IPEDS, accrediting agencies, external surveys and ranking services, and internal pressures from marketing and public relations staffs. They do the best they can in a difficult situation. Meanwhile, we are in an environment in which innovations may no longer fit into classic definitional measurement boxes.

What to expect?

In the end, this expansion of data from NCES through the IPEDS database is, in our opinion, a worthwhile effort, and it should lead to greater use of real data to support policy and market decisions. However, we recommend the following:

  • Reporting for Fall 2013 will include significant changes in methodology at participating institutions relative to Fall 2012. Assuming that we get improved definitions over time, there will also be changes in reporting methodology at least through Fall 2015. We therefore recommend that analysts and policy-makers not put too much credence in year-over-year changes for the first two or three years.
  • The most immediate improvement available is for NCES to clarify and gain broader consensus on the distance education definitions. This process should include working with accrediting agencies, whose own definitions influence school reporting, as well as leading colleges and universities with extensive online experience.

NOTE: The research on the IPEDS survey and this blog post are the result of an ongoing partnership between Phil Hill (e-Literate blog and co-founder of MindWires Consulting, @PhilonEdTech) and WCET. Throughout this year, we coordinated in analyzing and reporting on the IPEDS Fall Enrollment 2012 distance education enrollment data. As anomalies came to light, we also coordinated in examining them. We very much appreciate Phil and this partnership.

Many thanks go to Terri Taylor Straut, who performed the heavy lifting for WCET in surveying institutions and conducting follow-up calls with respondents. Her insightful questions and attention to detail were invaluable. And thank you to Cali Morrison, WCET, for her work in helping us get the word out about our findings.
Russell Poulin
Interim Co-Executive Director
WCET – WICHE Cooperative for Educational Technologies
rpoulin@wiche.edu
Twitter: @RussPoulin

Phil Hill
MindWires / e-Literate
phil@mindwires.com
Twitter: @philonedtech

Calculator Photo Credit: Alvimann
