State Authorization for Distance Ed Federal Regulation to be Implemented 07/01/2018

Frequently Asked Questions: Overview and Direction of the Regulations

The U.S. Department of Education is scheduled to implement state authorization for distance education regulations on July 1, 2018. There is still some uncertainty about whether the Department will implement the regulation and, if they do, what institutions need to do to comply. WCET and its State Authorization Network (SAN), the National Commission for State Authorization Reciprocity Agreements (NC-SARA), and the Distance Education Accrediting Commission (DEAC) have asked for clarification from the Department more than once and through different channels.

This Frequently Asked Questions Document provides an overview of the status and a few recommendations on how to proceed on the following topics:

    1. The Status of the Regulations – As Best We Know It.
    2. The Federal State Authorization Regulation – What are the Big Outstanding Questions? And What Should We Do?
      • Reciprocity.
      • Complaints.
      • Residence vs. Location.
      • Notifications.

More detailed information will be provided to State Authorization Network members.

1. Topic Area: The Status of the Regulation – As Best We Know It

Q: Will this regulation go into effect?

  • The Department and Congress have had numerous opportunities to kill this regulation. We thought they would have done so by now, if that was their intent. We have received mixed signals about the future of the regulation in recent weeks. On March 30, Politico published a convincing report that “state authorization” was one of a handful of regulations that could be “under review” soon. That would likely delay implementation and result in a rulemaking panel to rewrite the regulation. We think this is a likely outcome, but we also predicted they would have acted on it – one way or the other – by now.
  • To clarify, neither WCET nor WCET/SAN are recommending a review. We have long believed that complying with state laws and regulations is reasonable.
  • As an institution, it is important to remember our mantra for the last year, that: “A regulation is a regulation until it is not a regulation.” Until we are told otherwise, institutions should expect to comply.

Q: What about the bill in the House of Representatives, the PROSPER Act? Won’t that bill do away with this requirement?

  • The PROSPER Act will not pass, as is.
  • For a bill to go into effect, it must pass the House and the Senate and be signed by the President. The House bill is the product of one party. The PROSPER Act is a first attempt at reauthorizing the Higher Education Act, which covers the basic federal regulations for the higher education / federal government relationship. An update (or “reauthorization”) is long overdue. The Senate is working on a bipartisan version of the reauthorization that will likely differ greatly from the PROSPER Act’s provisions.
  • Since it is an election year and summer Congressional breaks are coming, many pundits believe that if reauthorization does not happen by June or, maybe, July, it is probably dead for this year. Such action appears unlikely, but this Congress has had a tendency to suddenly produce bills and hurry them through.
  • Our current thinking is that reauthorization probably will not happen this year, therefore Congress will probably not change the state authorization regulation.

Q: If Congress does cease requiring federal state authorization, can I stop worrying about state authorization and drop my SARA and WCET/SAN memberships?

  • No.
  • Regardless of what the federal government does, the states will still have their regulations and will expect you to comply.

2. Topic Area: The Federal State Authorization Regulation – What are the Big Outstanding Questions? And What Should We Do?

WCET, NC-SARA, and DEAC have submitted to the Department of Education a list of questions that would need to be answered and recommendations on changes that should be made. For some of these items, we have heard trusted experts provide opposing opinions on what colleges are supposed to do to comply with the regulatory language. IF the regulation goes into effect on July 1, below are some of the top issues that should have been addressed long before now…because compliance takes time:

2.a. Reciprocity – Will reciprocity be recognized by the Department of Education as a path to demonstrate compliance in a state?

  • When the regulation was originally released, it seemed to recognize only reciprocal agreements that did not prohibit states from enforcing their own laws. Essentially, that would have negated reciprocity, since the agreement would lack any enforcement capacity. Shortly after we released our analysis of the new regulation, Department of Education personnel told WCET and NC-SARA staff that we had misinterpreted the meaning of the regulatory language. Instead, they said that their intention was that a reciprocal agreement would not be recognized if there was a “conflict” between state law and reciprocity agreement requirements. Since states joining SARA agree to its provisions, any “conflict” is removed before a state joins. A letter from the Undersecretary of Postsecondary Education describing this clarification of the Department’s intent was sent to WCET and NC-SARA. While this letter does not carry the force of regulation, it is our understanding that Department personnel fully support NC-SARA and that, if the regulation goes into effect, this clarification will be codified. It is highly unlikely that the Department would undermine the will of 48 (soon 49) states, the U.S. Virgin Islands, and the District of Columbia.

What Should You Do? Don’t Panic. The Department is very supportive of reciprocity. We expect the clarification to be positive for SARA.

2.b. Complaints – Institutions are required to “document” the complaint processes for students in each state. What do we do about states (like California) that do not have a complaint process? Should we stop enrolling students from California?

  • If a state has joined SARA, then SARA has processes to handle complaints. California will not join SARA this year and might not next year. California has complaint and oversight processes for out-of-state for-profit institutions, but not for out-of-state public or non-profit institutions. They are aware of the issue, but it will take legislative action to fix it. Such action does not appear to be imminent. This could be a problem if you enroll students in California at a distance.
  • If you are at an institution that is not a SARA member, there are other states that do not have a complaint process for you. That could be a problem for enrolling students in those states, as well.

What Should You Do?

  • Some have recommended citing the California Attorney General’s office as the handler of complaints. Since that office has openly declined this responsibility, the action might not withstand a financial aid review by the Department of Education.
  • Both the Department and California officials are aware of the issue. It is our hope and belief that (if the regulation goes into effect) they will create some type of accommodation so as not to harm students. If they don’t, we should all scream loudly…very loudly.
  • We can’t give you an absolute answer on this one, as there are too many possible scenarios. We suggest that you:
    • Count how many distance students you have or will have in California.
    • Communicate with your institution’s legal counsel.
    • Determine the level of risk that your institution is willing to assume.
    • Act accordingly.
    • Be prepared to act.

2.c. Residence vs. Location – Since state authorization is based upon the location of the student, why does the regulation use “reside” or “state of residence” so often?

  • This is another one on which state authorization experts have differed in their advice. Some have suggested that institutions need to collect the student’s official state of residence in addition to location. That’s lots of extra work.
  • During the 2014 Negotiated Rulemaking, Marshall Hill (NC-SARA), Leah Matthews (DEAC), and Russ Poulin (WCET) all served as negotiators or alternates. We made sure that all references in the regulations being developed at that time were to location, not residence. It is too bad that our work was forgotten. We see this as an error in the current regulation and are urging the Department to correct it.

What Should You Do? 

  • Again, we cannot give you a definite answer, but we should have more clarity soon…we hope.
  • Assess the amount of work this would take.
  • Determine the level of risk that your institution is willing to assume.
  • Act accordingly.

2.d. Notifications – How do we implement the new required notifications?

  • Institutions are expected to make several notifications, either public (via websites) or individualized (communicated directly to the student). There are several details that need to be clarified, such as “what exactly is an adverse action,” “do SARA institutions report state refund policies,” and several questions about notifications for programs that lead to professional licensure or certification.


What Should You Do?

  • For programs that lead to professional licensure or certification, the requirements are close to those already required by SARA. Also, the lack of proper notifications for these programs has been the leading cause of student lawsuits against institutions, so protect yourself.
    • For SARA institutions, review SARA requirements, determine each program’s status in any state where you will enroll students at a distance in that program, comply with state regulations, and notify students of the program’s status in their state.
    • For institutions that are not a member of SARA, we still think that the threat of lawsuits makes this work worthwhile.
  • For the other notifications, if you have not implemented them yet, you may wish to wait for clarification. Again, determine the level of risk that your institution is willing to assume.

Finally…

To be clear, institutions should be preparing for the July 1, 2018 regulations, to the best of their ability, until we hear further information. To that end, we have attempted to summarize the regulation’s requirements into a checklist for you to download. WCET and WCET|SAN are committed to providing information as soon as it becomes available and to guiding institutions through whatever we learn in the upcoming months.

Cheryl Dowd
Director, State Authorization Network
WCET
cdowd@wiche.edu

 

 

Russell Poulin
Director, Policy & Analysis
WCET – The WICHE Cooperative for Educational Technologies
rpoulin@wiche.edu | @russpoulin

 

 



Learn about WCET Creative Commons 4.0 License

Count All Students! Outcome Measures for Community Colleges

Should we count all students when analyzing higher education, or only some of them? We think all students should be included…and community colleges are often misrepresented when they are not. In this third post in our series on the IPEDS Outcome Measures data (released by the U.S. Department of Education late in 2017), we turn our attention to community colleges.

The Outcome Measures statistic is a new, more comprehensive view of what happened to the students who attended an institution. It improves on the traditional Graduation Rate, which included only first-time, full-time students and left out new part-time and transfer-in students. There are many of those students at two-year institutions.

The first post in our series introduced the issue generally and the second examined the Outcome Measures results for institutions that serve a large number of students who take all of their courses at a distance. This post samples the results for the community colleges in one state.

Some overall observations:

  • Our friends from the WICHE Policy and Research unit found that 60% of all higher education students and 71% of community college students were not included in the Department’s Graduation Rate statistic that has been used for many years.
  • The Graduation Rate likely did not count 68.8% of community college students in Colorado. That’s almost 15,000 students. They are included in the new Outcome Measures.
  • The Graduation Rate likely counted only 26.8% of community college students in Colorado, as it focused only on those who received a degree or certificate. Adding the Outcome Measures “transfer-out” and “still-enrolled” categories accounts for another 35.6% (7,215) of the students who entered in 2008.

This is the first year for the data to be released. As with any new statistic, some institutions were more successful than others in gathering and reporting the data. This should improve in the coming years.

We will also provide you with our spreadsheet of data so that you can perform similar analyses for your institution.

The bottom line: Institutions should insist on broader use of the Outcome Measures data when reporting to legislators, state system offices, the press, and those pesky ratings services.

What Are the Outcome Measures?

For the old Graduation Rate measure, institutions were asked to identify a “cohort” of students who entered as first-time (they’ve not attended college before) and full-time (they are taking a full load of courses) in the fall of a given academic year. Institutions track and report what happened to those students after a set number of years.

The new Outcome Measures asks institutions to expand this tracking by adding three additional cohort categories of students:

  • First-time, full-time (similar to the Graduation Rate).
  • First-time, part-time.
  • Non-first-time (transfer-in), full-time.
  • Non-first-time (transfer-in), part-time.

Source: https://nces.ed.gov/ipeds/use-the-data/survey-components/outcome-measures

The first cohorts were compiled for the Fall 2008 academic term and the first dataset containing results was released last October. It is important to note that these data are for all students in a given cohort, not just distance education students.
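To make the four cohort buckets concrete, here is a minimal sketch in Python. The field names and logic are our own illustration, not the official IPEDS submission format:

```python
# Minimal sketch: assigning an entering student to one of the four
# Outcome Measures cohorts. Field names are hypothetical illustrations.

def om_cohort(first_time: bool, full_time: bool) -> str:
    """Return the Outcome Measures cohort label for an entering student."""
    history = "first-time" if first_time else "non-first-time (transfer-in)"
    attendance = "full-time" if full_time else "part-time"
    return f"{history}, {attendance}"

# A transfer-in student enrolled part-time in Fall 2008:
print(om_cohort(first_time=False, full_time=False))
# -> non-first-time (transfer-in), part-time
```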

Our Sample of Public Institutions: The Community Colleges of Colorado

WICHE’s Policy and Research Brief provided statewide and regional results for each of the western states. We wanted to see the impact on an institutional basis and gauge some variations among institutions in a similar setting.

Given that community college leaders have long complained that the Department of Education’s Graduation Rate does not provide an accurate profile of their students, we thought that we should look at that sector. We chose community colleges in Colorado for several reasons:

  • We both live in Colorado and are more familiar with the geography, if not the operational intricacies, of institutions in our home state.
  • Colorado includes a mix of geographic settings: urban, suburban, rural, and remote.
  • With fourteen community colleges, the sample was not so large as to be unwieldy.

Most Community College Students Are Not Included in the Graduation Rate

First, let’s look at the input into the Outcome Measures statistic. Which students are included?

Community colleges may be the group best served by the expanding view of the Outcome Measures statistic. The Graduation Rate (previously the only measure of what happened to those admitted to an institution) includes only students who were first-time (they are new to higher education) and full-time (they enrolled in a full load of courses). Especially for urban institutions with a focus on working adults and those returning to college, this leaves out the bulk of the students they serve.

This fact is starkly highlighted in the WICHE brief on the western states in which they found that only 29% of all community college students in the western states were first-time, full-time students. That means that 71% of community college students in the west were likely not included in the Graduation Rate counts. In our opinions, that is a serious undercount.

For Colorado’s community colleges, the percentage of students who were first-time, full-time is similar to the other western states at 31.2% of enrollments. Therefore, 68.8% of community college students in Colorado (almost 15,000) were likely not included in the Graduation Rate counts.

Looking at the numbers more closely, in Colorado there is a huge rural/urban divide in serving first-time, full-time students:

  • Rural – 57.7% were first-time, full-time.
  • Urban – 27.8% were first-time, full-time.

This is not surprising, as rural community colleges may tend to serve students in their immediate area. Their enrollments may reflect the desire of traditional age students to begin their college experience locally. It also could be a tribute to those colleges offering programs that are designed to lead to employment locally. On the other hand, urban institutions, due to the make-up of their local population, may recruit and serve more adult and returning students.

Why does this matter? If only the old Graduation Rate were used, the comparisons would not be equal. The Graduation Rate includes a much smaller percentage of students for urban institutions and is a poor representation of an institution’s overall population and activities.

Community Colleges Benefit from Including More Outcomes

In addition to granting degrees and certificates, the mission of many community colleges includes preparing students to transfer to other institutions. The Outcome Measure now accounts for one of the basic goals of these institutions. The low completion rates of community colleges have been cited as evidence of their ineffectiveness. Admittedly, some colleges deserve the criticism. By including counts for the students who transferred and those still enrolled, a more complete picture of what happened to the students who entered is obtained.

Below is a table of the results of the Outcome Measures data for students who entered each institution in 2008. The columns included are:

  • Urban or Rural – This is our own (Colorado-based and probably unscientific) view of the communities in which the institutions reside. Those labeled “rural” are in smaller towns in more remote settings. Admittedly, to someone from the east or west coast, Pueblo or Greeley may seem “rural,” but they are larger towns that also include a university.
  • FTFT Over 50% – A “yes” appears if “first-time, full-time” students represent more than 50% of the institution’s incoming freshman class cohort for the Outcome Measure. We found it interesting to note that four of the six rural community colleges served more traditional students. Additionally, a fifth institution was barely below the 50% mark. Urban institutions tend to have a much smaller percentage of “first-time, full-time” incoming students.
  • Completers – These students received a degree or certificate within eight years of entering the institution. This serves as a proxy (though might not be an exact match) for the students included in the Graduation Rate measure.
  • Completers + Transfer-Out + Still-Enrolled (CTS) – The percentages of those who transferred to another institution and those who are still enrolled at the institution after eight years are added to the “completers” percentage. As you may have guessed, there are precious few “still enrolled” students after eight years at community colleges. Some community college professionals have called similar statistics the “success” rate, but we avoided that value-laden term.
  • Difference – Subtract the “completers” column from the CTS column to obtain the difference. The reason for showing the difference is to highlight the percentage of students who would likely not have been counted in the Graduation Rate.

[Table: Outcome Measures results for Colorado community colleges, 2008 entering cohort]
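For readers who want to reproduce the last three columns from our spreadsheet, here is a minimal pandas sketch. The column names are hypothetical placeholders, so adapt them to the actual file layout:

```python
import pandas as pd

# Hypothetical column names; adjust to match the downloadable spreadsheet.
df = pd.DataFrame({
    "college": ["Example Community College"],
    "completers_pct": [26.8],     # degree/certificate within eight years
    "transfer_out_pct": [31.6],   # transferred to another institution
    "still_enrolled_pct": [2.8],  # still enrolled after eight years
})

# CTS = Completers + Transfer-Out + Still-Enrolled.
df["cts_pct"] = (
    df["completers_pct"] + df["transfer_out_pct"] + df["still_enrolled_pct"]
)

# Difference = students the Graduation Rate view would likely miss.
df["difference_pct"] = df["cts_pct"] - df["completers_pct"]

print(df)
```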

Our main observations from this table are:

  • The “completers” designation (which mimics the Graduation Rate) accounted for the outcomes of only 26.8% of Colorado’s community college enrollments. Adding the “transfer-out” and “still-enrolled” students accounts for another 35.6% (7,215) of the students who entered in 2008. Community colleges should get recognition for these students.
  • There are differences in the end results between rural and urban institutions. Rural institutions tend to have higher percentages of “completers.” That result may be partially due to the composition of their incoming student bodies having more “first-time, full-time” students.
  • There are still a large number of students in the “unknown” category. Some of this may be due to inadequate record keeping because this is a new statistic. Some of it may be due to the college’s inability to track some transfers. It is likely that many of the students dropped out. If (in a worst-case scenario) nearly 40% of students who entered community college in Colorado in 2008 have left higher education, that is a number that may give policymakers pause.
  • The “difference” column is more than 20% for every college except Pueblo Community College, which leads us to wonder if they had difficulty with identifying transfers or experienced other problems with implementing the reporting requirements in the first year of Outcome Measures data collection.

The Incoming Students: Composition of Fall 2008 Cohort

One of the benefits of Outcome Measures (OM) is to better understand the composition of the student cohorts at our institutions. While the IPEDS Graduation Rate includes only “first-time, full-time” students, Outcome Measures includes the four cohorts listed above. Let’s look at results for those institutions for each of the four cohorts…

First-time, Full-time Cohort

Northeastern Junior College has the largest “first-time, full-time” entering cohort of students. The College is located in Sterling, CO, which has historically been a farming community. Located along the interstate about 130 miles from Denver, it is closer to Wyoming and Nebraska than to the capital city.

Northeastern Junior College entering cohort: first-time, full-time 69.4%; first-time, part-time 18.1%; non-first-time, full-time 8.7%; non-first-time, part-time 3.9%.

First-time, Part-time Cohort

Morgan Community College has the largest “first-time, part-time” entering cohort of students. Located in Fort Morgan, Colorado, it is about 40 miles closer to Denver on the same interstate as Northeastern Junior College. It is interesting to note the differences in the entering student composition of these two rural institutions that are less than an hour from each other.

Morgan Community College entering cohort: first-time, full-time 29.8%; first-time, part-time 56.1%; non-first-time, full-time 5.7%; non-first-time, part-time 8.3%.

Non-First Time, Full-time Cohort

Arapahoe Community College has the largest “non-first-time, full-time” cohort of entering students. It is located in the southern suburbs of Denver. At 17.5%, the College has the highest percentage in this category, but the others are not far behind, with most colleges enrolling between 8% and 16% of their students in this cohort.

Arapahoe Community College entering cohort: first-time, full-time 21.6%; first-time, part-time 34.2%; non-first-time, full-time 17.5%; non-first-time, part-time 26.8%.

Non-First Time, Part-time Cohort

It is not surprising that the Community College of Denver, in the state’s largest city, leads with the most returning students who are studying only part-time.

Community College of Denver entering cohort: first-time, full-time 21.7%; first-time, part-time 36.5%; non-first-time, full-time 13.5%; non-first-time, part-time 28.3%.

Student Outcomes: What Happened to the Students?

In addition to providing observations about the 2008 student cohort make up, the IPEDS Outcome Measures data also provides insights regarding the disposition of the students over time. Institutions report student outcomes in one of four categories:

  • Completion (Received an Award).
  • Transfer Out.
  • Still Enrolled.
  • Unknown.

The data reported here are at eight years since matriculation (as of Fall 2016). In addition to the eight-year data, IPEDS also collects and reports outcomes at six years. We compared the six- and eight-year data and found little difference. For simplicity and consistency, we decided to use only the eight-year outcomes in this analysis. It is also noteworthy that eight years beyond the matriculation date is four times the typical completion time for an associate degree.

The Outcome Measures data for Colorado’s community colleges are reported below:

Colorado Two-Year Public Institutions

Completion 26.8%; transfer out 31.6%; still enrolled 2.8%; unknown 38.9%.

Completion

Continuing the tour of the state to the rural southeast, Otero Junior College is located in La Junta and has the highest completion rate. Three other institutions broke the 40% completion mark: Morgan Community College (43.4%), Trinidad State Junior College (40.8%), and Lamar Community College (40.3%). All of them are located in rural settings.

Otero Junior College: completion 47.0%; transfer out 24.1%; still enrolled 0.2%; unknown 28.7%.

Completion rates vary greatly and often need local context. For example, Red Rocks Community College (in a middle-class to upscale suburb west of Denver) has the lowest “completers” rate at 16.6%. Its “transfer-out” rate is 39.0%, reflecting a possible suburban trend to start locally and transfer.

Transfer Out

Colorado Northwestern Community College tops the list for “transfer out” colleges. It is located in remote Rangely, CO, near the Utah border, about 200 miles northwest of Denver on the other side of the mountains. On a good day, it is a four-hour drive. The lure to start locally and transfer may be based on geography. Other two-year institutions with high transfer rates include the Community College of Denver at 45.9% (an urban college) and Lamar Community College at 43.1% (a rural college).

Colorado Northwestern Community College: completion 22.2%; transfer out 48.6%; still enrolled 0.4%; unknown 28.8%.

Still Enrolled

Although the percentages are small across the board, Front Range Community College has the highest rate of students still enrolled at the institution without having completed a degree or certificate after eight years. Front Range serves Boulder, Fort Collins, and the northern suburbs of Denver. Other community colleges with notable still-enrolled rates include other suburban institutions: the Community College of Aurora (2.3%) and Red Rocks Community College (2.2%).

Front Range Community College: completion 32.4%; transfer out 30.1%; still enrolled 2.8%; unknown 34.6%.

Unknown

Pueblo Community College has the highest rate of “unknown” responses, which we suspect may be due to problems with gathering data for this statistic in its first year of existence. Our suspicion rests on the suspiciously low “transfer-out” rate that the college reports, especially since Colorado State University-Pueblo is only six miles away. Northeastern Junior College (16%) and Lamar Community College (15.9%) reported the lowest “unknown” rates.

Pueblo Community College: completion 30.0%; transfer out 3.7%; still enrolled 1.2%; unknown 65.0%.

Next Steps

We are still absorbing what this all means.

We definitely urge you to examine your institution’s Outcome Measures rates. Use our spreadsheet as a starting place. We should all demand that colleges, policymakers, the press, and online listing/ranking services use these data to better represent what might happen to a student who enrolls at your institution.

If we write more or have calls for action, we will let you know.

Meanwhile, what do you think? Let us know.


 

Russell Poulin
Director, Policy & Analysis
WCET – The WICHE Cooperative for Educational Technologies
rpoulin@wiche.edu | @russpoulin

 

 


 

Terri Taylor-Straut
Director of Actualization
ActionLeadershipGroup.com

 

 



Learn about WCET Creative Commons 4.0 License

 

Count All Students! Outcome Measures for Non-Traditional Institutions

Should we count all students when analyzing higher education, or only some of them? It’s not surprising that when you include all students, you get different results from that analysis than when you don’t. We think all students should be included.

As reported in our last blog post (“Count All Students! New Outcome Measures Now Include Non-Traditional Students”), the U.S. Department of Education released data for a new Outcome Measures statistic last year. We thought that the Outcome Measures would provide more comprehensive data regarding student outcomes than the traditional Graduation Rate measure, which includes only first-time, full-time students. We were not disappointed.

Some overall observations:

  • Our friends from the WICHE Policy and Research unit found that 60% of all higher education students and 71% of community college students were not included in the Department’s Graduation Rate statistic that has been used for many years.
  • This is the first year for the data to be released. As with any new statistic, some institutions were more successful than others in gathering and reporting the data. This should improve in the coming years.
  • Institutions with large non-traditional student enrollments (e.g., community colleges, online colleges, inner city universities, military-serving institutions) have long clamored for a statistic that better represents the populations they serve. The additions of “transferred out” and “still enrolled” are important new options for showing student progress at the institution. For the institutions analyzed:
    • For some, their Outcome Measures results are much better than what might be reported in their Graduation Rate.
    • For others…not so much. It will be interesting to see if better data collection improves their results in the coming years.
  • Based upon feedback from institutions to the Department, there are already some changes to data collections for future years.

In this post and the next, we will examine the Outcome Measures results for different sample sets of institutions. We will also provide you with our spreadsheet so that you can analyze the data and perform similar analyses for your institution.

What Are the Outcome Measures?

For the old Graduation Rate measure, institutions were asked to identify a “cohort” of students who entered as first-time (they’ve not attended college before) and full-time (they are taking a full load of courses) in the fall of a given academic year. Institutions track and report what happened to those students after a set number of years.

The new Outcome Measures asks institutions to expand this tracking by adding three additional cohort categories of students:

  • First-time, full-time (similar to the Graduation Rate).
  • First-time, part-time.
  • Non-first-time (transfer-in), full-time.
  • Non-first-time (transfer-in), part-time.


Source: https://nces.ed.gov/ipeds/use-the-data/survey-components/outcome-measures

The first cohorts were compiled for the Fall 2008 academic term and the first dataset containing results was released last October.

Top 15 Institutions Offering “Exclusively Distance Education Courses”

Examining the results more closely, we found that the data related to student matriculation and outcomes vary dramatically based on the mission of the institution. This is especially clear when we look at institutions that are known to serve large populations of students taking “exclusively distance education courses.”

With the U.S. Department of Education’s Integrated Postsecondary Education Data System (IPEDS) data, it is possible to identify the institutions with the largest distance education enrollments. Enrollment data are collected separately from the Outcome Measures, and we have reported on distance education enrollments several times over the past few years.

Using the 2016 IPEDS distance education enrollment data (the most recent available), we determined the top 15 institutions based on the reported number of students taking exclusively distance education courses. These institutions, in order of exclusive distance education enrollment (largest to smallest), are:

  1. University of Phoenix-Arizona.
  2. Western Governors University.
  3. Southern New Hampshire University.
  4. Liberty University.
  5. Grand Canyon University.
  6. Walden University.
  7. American Public University System.
  8. University of Maryland-University College.
  9. Excelsior College.
  10. Ashford University.
  11. Capella University.
  12. Kaplan University-Davenport Campus.
  13. Brigham Young University-Idaho.
  14. Arizona State University-Skysong. (Note: Arizona State University has recently begun reporting some of its enrollments under this name even though it is not a separately accredited institution. Since the Outcomes Measures data that we analyzed reports students first enrolled in Fall 2008, there is no data to report for Skysong students.)
  15. Colorado Technical University-Colorado Springs.

That list includes eight for-profit, five non-profit, and two public institutions. Why this group? We were searching for a group of non-traditional institutions, and using a published enrollment ranking seemed better than introducing our own bias by hand-picking a set of institutions. In our next blog post, we will examine public community colleges and universities in one state.

We compiled a complete Excel file of the 15 institutions and their reported data for each cohort. In the analysis below we provide graphic examples of the highs and/or lows for the category in question.
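As a rough illustration of how a ranking like the one above could be derived, here is a hedged pandas sketch. The file name and the columns "institution" and "exclusively_de_enrollment" are placeholders for whatever your IPEDS extract uses, not official IPEDS variable names:

```python
import pandas as pd

# Placeholder file and column names; substitute your own IPEDS extract.
enrollment = pd.read_csv("ipeds_2016_distance_ed.csv")

# Rank institutions by students taking exclusively distance education courses.
top15 = (
    enrollment.sort_values("exclusively_de_enrollment", ascending=False)
    .head(15)
    .loc[:, ["institution", "exclusively_de_enrollment"]]
)

print(top15.to_string(index=False))
```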

The Incoming Students: Composition of Fall 2008 Cohort

One of the benefits of Outcome Measures (OM) is to better understand the composition of the student cohorts at our institutions. While the IPEDS Graduation Rate includes only first-time, full-time students, Outcome Measures includes the four cohorts listed above. Let’s look at results for those institutions for each of the four cohorts…

First-time, full-time cohort

Predictably, the reported number of students in each of these four categories aligns with the mission of the organization. For example, the largest shares of first-time, full-time students among the 15 organizations are at BYU-Idaho with 76.7% and Southern New Hampshire University with 67.3%. These institutions both have large on-campus student populations in addition to their online presence. The Outcome Measures represent overall institutional enrollments; therefore, these data combine on-campus and online enrollment.


Institutions long focused on serving students at a distance reported few, if any, first-time, full-time students. In fact, of the 14 institutions that reported OM data, half (50.0%) reported that under 5% of the 2008 cohort were first-time, full-time students. For those institutions, the Graduation Rate is calculated on less than 5% of their overall student population, which is why the Outcome Measures will be a better barometer of student progress for them.

First-time, part-time cohort

Institutions that are known to serve part-time students reported the largest numbers of students in the first-time, part-time cohort. Walden University led with 56.6% of the cohort, followed by American Public University System with 38.8% of reported students being first-time, part-time in 2008.


As with first-time, full-time, several institutions reported few or no first-time, part-time enrollments.

Non-first-time (transfer-in), full-time cohort

Institutions that focus on serving full-time students, with a promise of accelerated degree completion through credit for life experience or competency-based education, led with reported enrollment in the returning, full-time category for the 2008 cohort. Western Governors University (WGU) reported 93% of their students in this category; Ashford University reported 84.8%.


Non-first-time (transfer-in), part-time cohort

Excelsior College reported all (100%) of their enrollments as returning, part-time, followed by Capella University reporting 80.8% of their 2008 enrollment in that category. Both institutions have a mission focused on serving adult learners, many of whom cannot attend college full-time. As previously noted, institutions that are designed to serve full-time students reported very small enrollments in this category.


Student Outcomes: What Happened to the Students?

In addition to providing observations about the 2008 student cohort make up, the Outcome Measures data also provides insights regarding the disposition of the students over time. Institutions report student outcomes in one of four categories:

  • Completion (Received an Award).
  • Transfer Out.
  • Still Enrolled.
  • Unknown.

The data reported here are at eight years since matriculation (as of Fall 2016). In addition to the eight-year data, IPEDS also collects and reports outcomes at six years. We compared the six- and eight-year data and found little difference. For simplicity and consistency, we decided to use only the eight-year outcomes in this analysis.
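A sketch of that six-year versus eight-year comparison, assuming the two outcome tables have been exported with matching column names (the file and column names here are hypothetical):

```python
import pandas as pd

# Hypothetical extracts: one row per institution with outcome percentages.
six_year = pd.read_csv("om_outcomes_6yr.csv")    # institution, completion_pct, ...
eight_year = pd.read_csv("om_outcomes_8yr.csv")  # same layout at eight years

merged = six_year.merge(eight_year, on="institution", suffixes=("_6yr", "_8yr"))

# Small shifts would justify reporting only the eight-year outcomes.
merged["completion_shift"] = (
    merged["completion_pct_8yr"] - merged["completion_pct_6yr"]
)

print(merged[["institution", "completion_shift"]].describe())
```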

As noted by our WICHE Policy colleagues in their recent OM report that examined outcomes for students in the WICHE states:

“The credential rates from Outcome Measures data and graduation rates from Graduation Rate data are not synonymous and generally should not be interchanged. There are important distinctions as discussed in this brief. Users should clearly distinguish between OM and GR results.”

That report is a worthy read. With this caveat as background, let’s look at the outcomes.

Completion

Completion indicates that the student received a certificate or degree within eight years of entering the institution. Of the top 15 institutions in 2016, based on exclusively distance education enrollment, BYU-Idaho led in completions with 59.4%; Southern New Hampshire University reported 55.7% completion. It is important to remember that the enrollment reported was total enrollment, not just distance education students. On this measure the more traditional institutions did better than their less traditional counterparts.


Institutions that reported the lowest completion rates include Kaplan University-Davenport Campus with 20.6% and Capella University with 19.5% completion at eight years. Both institutions also had unusually high percentages of students in the “unknown” category.

Transfer Out

We know that a primary mission for many institutions of higher education is the successful transfer of students to other institutions. Of the reporting institutions in this analysis, Capella University led with 42.5% of students reported as transferred out, and BYU-Idaho reported the second highest transfer rate, at 38.9%.


Still Enrolled

Eight years beyond the date of matriculation is 200% of the timeframe expected to complete a bachelor’s degree; therefore, we expected this category of enrollment to be the smallest reported. Institutions that are focused on serving part-time students would logically have the highest rates of students reported as still enrolled at eight years. These students are working toward their degrees, often a few courses at a time. Of the reporting institutions, American Public University System (which has a large number of students in the military) led with 19.5%. Excelsior College ranks a distant second with 6.7% of reported enrollment reported as still enrolled.


Unknown Outcomes – Those with High Percentages

Student enrollments that did not result in one of the prior dispositions are reported as “unknown.” These are troubling outcomes, as it is the responsibility of institutions of higher education to track student progress.

There are three institutions in this sample with over 70% of their student outcomes reported as unknown: Kaplan University-Davenport (79.2%), University of Maryland-University College (72.2%), and University of Phoenix-Arizona campus (71.0%).


It is unclear whether the “unknown” classification is due to students having dropped out, to inadequate classification of students in this first OM data collection, or to some other reason. It will be interesting to watch the results in the upcoming years.

Unknown Outcomes – Those with Low Percentages

Among the 14 institutions we examined, those that reported few students with “unknown” outcomes include BYU-Idaho reporting 0% (that is remarkably low) and Southern New Hampshire University reporting 12.6% of students with “unknown” outcomes.


Outcome Measures Creates a New View of Positive Student Outcomes

As we previously stated, the above institutions also have a sizeable on-campus contingent. What about institutions that focus completely on non-traditional students? For technical reasons, the exact percentage for the Graduation Rate might be slightly different from the completion percentage for the Outcome Measures statistic. For purposes of these analyses, we will treat them as close to each other.

When we had only the Graduation Rate, the implication was that those who did not graduate were drop-outs or that the institution had somehow failed the student. By adding the additional outcome categories, we can see that an institution might not graduate a student, but that the student is still well served. The two institutions listed below show a large percentage of students who transferred out or are still enrolled. While eight years is a long time to be still enrolled, adults with work and families may be able to take only a few classes at a time.

For Excelsior College, examining completions alone would show success for fewer than half of their enrollees. With the OM, fewer than one-in-five are in the “unknown” category.


A Caveat about First-Year Data Collections

We know from our early work with IPEDS distance education data that the first year of reporting may yield problems, resulting in high numbers of enrollments categorized as “unknown.” We have also noted that the proportion of students reported as “unknown” tends to decrease as institutions become familiar with the new reporting and adjust their systems to better capture the required data.

However, having such large proportions of students reported with an “unknown” disposition is nevertheless concerning. The cost of higher education is high, both to students and to our society. When institutions accept students and their tuition and fees, they should be able to report what happened to the student.

The Next Post – Community Colleges and Colleges/Universities in Colorado


In our next blog post, we will look more closely at public institutions in Colorado, as a sampling of Outcome Measures in a single state. We will also urge you to conduct your own analyses.

These data are a gift…and the result of hard work by all the reporting institutions. Let’s use them.

~Russ and Terri


 

Russell Poulin
Director, Policy & Analysis
WCET – The WICHE Cooperative for Educational Technologies
rpoulin@wiche.edu | @russpoulin

 

 


 

Terri Taylor-Straut
Director of Actualization
ActionLeadershipGroup.com

 

 

 



Learn about WCET Creative Commons 4.0 License

 

Count All Students! New Outcome Measures Now Include Non-Traditional Students

There is a new improvement to the U.S. Department of Education’s Graduation Rate statistic. And we should all be using it.

Institutions with large non-traditional student enrollments (e.g.: community colleges, online colleges, inner city universities, military-serving institutions) have not been well-represented by the Department’s Graduation Rate statistic. Few of their students are included in the results because only first-time, full-time enrollees are included in the process used to calculate the Graduation Rate.

Since 2008, the Integrated Postsecondary Education Data System (IPEDS) has collected data on a new metric to solve that problem – the Outcome Measures. Last year they released the first results, but there has been less news about it than we hoped.

Institutions that serve a large number of non-traditional students need to be using Outcome Measures. Today the Western Interstate Commission for Higher Education (WICHE) Policy and Research unit released a great analysis of this new statistic, calculating the impact across the western states. This blog post is the first in a series in which we provide observations and suggestions on how you can use Outcome Measures to better understand the outcomes of students at your own institution.

What Are the Outcome Measures?

For the Graduation Rate measure, which until recently had been the only available measure, institutions are asked to identify a “cohort” of students who enter as first-time (they’ve not attended college before) and full-time (they are taking a full load of courses) in the fall of a given academic year. Institutions track and report what happened to those students after a set number of years depending on the type of degree they are pursuing.

The new Outcome Measures asks institutions to expand this tracking by adding three additional cohort categories of students:

  • First-time, full-time (similar to the Graduation Rate)
  • First-time, part-time
  • Non-first-time (transfer-in), full-time
  • Non-first-time (transfer-in), part-time


Source: https://nces.ed.gov/ipeds/use-the-data/survey-components/outcome-measures

The first cohorts were compiled for the Fall 2008 academic term and the first dataset containing results was released last October. Inside Higher Ed reported on the release and Robert Kelchen (higher education policy and data expert) provided some initial analysis.

Important Findings from the WICHE Policy Work

We started playing with these analyses a few months ago and learned that our friends down the hall in WICHE Policy and Research were doing the same thing. We were able to share what we learned and discovered that we were doing complementary analyses. Peace Bransberger and Colleen Falkenstern of WICHE Policy created a wonderful WICHE Insights brief, focused on state and regional analyses (only in the Western states) of the new data. Meanwhile, we at WCET, also a unit of WICHE, are focusing on the impact at the institutional level and how institutional personnel might use and display the data.

Many More Students are Counted

The WICHE Policy brief (released today) produced great findings for the WICHE region:

  • Outcome Measures data provide information for about 60% more 2008-09 undergraduates in the WICHE region than covered by the IPEDS Graduation Rate data. This breaks down as:
    • 39% more students at four-year institutions.
    • 71% more students at two-year institutions.

Wow! Sixty percent of all western students were not included in the Graduation Rate. That strikes us as a large percentage. Institutions were not recognized for their results…good or bad.
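For readers who want to check the arithmetic for their own campus, here is a tiny sketch of the “more students covered” calculation. The counts are made up for illustration, not WICHE’s data:

```python
# Made-up counts for one hypothetical institution (not WICHE's data).
gr_cohort = 1_000  # first-time, full-time students in the Graduation Rate
om_cohort = 1_600  # all four cohorts counted by the Outcome Measures

pct_more = (om_cohort - gr_cohort) / gr_cohort * 100
print(f"Outcome Measures covers {pct_more:.0f}% more students")  # -> 60% more
```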

Surprising Results on Completion Rates for the Newly Counted Students

In looking at the graduation results for “non-first-time” (transfer-in) students, there were some surprising results in calculations across the WICHE states:

“And overall, non-first-time undergraduates completed a credential within six years at higher rates than first-time students. The rate (72 percent) at which non-first-time, full-time students completed a credential within six years exceeded that rate (59 percent) for their first-time peers by 13 percentage points. And the rate for non-first-time, part-time students (52 percent) was 28 percentage points higher than for their first-time peers (24 percent).”

Students in the “non-first-time” (or transfer-in) group completed a credential within six years at a higher rate than the more traditional first-time, full-time students. But, before now they were not even counted.

Peace and Colleen also provide a great caveat for us when talking about this data:

“There are important distinctions between the credential rate from Outcome Measures data and the Graduation Rate data…Users should clearly distinguish between these data as they are not synonymous.”

Great work by Peace and Colleen. We recommend you read their brief, even if you are not from the West.

Why Should I Care?

This is a great advance for institutions that serve non-traditional students, as many of their students may be enrolled for only a few courses and/or may have attended another college or university. Many students simply were not included in the Graduation Rate.

To show the impact, let’s look at some concrete examples of institutions with a focus on non-traditional students. From the Outcome Measures that were released last Fall, here are the percentages of students from the Fall 2008 cohort who were listed as first-time, full-time:

  • A few Colorado universities:
    • Metropolitan State University of Denver – 41%
    • University of Colorado, Colorado Springs – 56%
    • University of Colorado, Denver/Anschutz Medical Campus – 42%
  • A few Colorado community colleges:
    • Community College of Denver – 22%
    • Front Range Community College – 27%
    • Pueblo Community College – 27%
  • A few institutions with large online enrollments:
    • Capella University – 0.9%
    • University of Maryland-University College – 2.5%
    • Western Governors University – 7.0%

Remember that these were the only students counted in the previous IPEDS Graduation Rate calculations. Therefore, their graduation results were based on a fraction of the students attending the institution. As an analogy…if we declared the Super Bowl winner before halftime, the Atlanta Falcons would own the 2017 trophy instead of the New England Patriots.

A Real Example of the Problem

The press, policymakers, and college ranking services like to use the IPEDS Graduation Rate as a simple, independent indicator of the results of students attending college. For example, here is an alarming warning about University of Maryland-University College on an Education Trust website:

[Screenshot: Education Trust warning about University of Maryland-University College]

If you were a potential student or a parent of a student who is shopping for a college, what would you think?

The site provides no context that this warning is based on only 2.5% of the students in the University’s freshman class. If you go to the graduation rates tab, you find a “NOTE:” that attempts to caution about low numbers of “full-time freshmen students,” but it would take a very exceptional student or parent to make any sense of that caution.

[Screenshot: the University of Maryland-University College information page]

What is a student to think? Or a lawmaker? Or anyone else?

To make matters worse, the instructions are incomplete. It took us awhile to find the “Retention and Progression Rates” tab mentioned in the “NOTE.” We discovered (through trial and error) that you need to first select the “Similar Colleges” tab; you can then find the “Retention and Progression Rates” tab. There you find that there were 107 full-time students in the 2008 cohort. So, the warning is based upon 107 students out of the 35,154 total enrollment reported on the initial page for the institution. The “30-student caution” that they give is simply too low for larger institutions.

About Education Trust…they do good work and have the correct goals in mind, proclaiming themselves: “Fierce advocates for the high academic achievement of all students – particularly those of color or living in poverty.” Kudos. Keep it up. This is just one place in which you can better serve those students. And it is not your fault that better data were not available previously.

And What About the Student Achievement Measure?

There is a similar, independent effort to attack the problem of Graduation Rate undercounts. The Student Achievement Measure (SAM) “provides a comprehensive picture of student progress on their path to earning a college degree or certificate. As compared to the limitations of typical graduation rate measures, SAM reports more outcomes for more students.”

SAM does wonderful work and we support what they do. According to their website (as of this posting) they have 632 participating institutions. While there may be some limitations or growing pains with the IPEDS Outcome Measures, the great advantage is that all institutions offering federal financial aid are required to participate in the IPEDS surveys.

The Bottom Line…and Upcoming Posts

The IPEDS Graduation Rate simply does not represent the reality of what happens at institutions serving a significant number of non-traditional students. IPEDS has given us a good new statistic, but we’ve noticed little uptake in using it.

We need to change this. We need to get involved!

In future blog posts, we will look at some outcomes for a few community colleges, universities, and non-profits. Some institutions show better results with the Outcome Measures, others do not. We’ll also suggest a start for you to calculate and display the data for your institution. We thank Peace and Colleen from WICHE Policy for their great help with our thinking on data displays.

Meanwhile, some questions for you…

  • Is your institution well-represented by the Graduation Rates and/or Outcome Measures data?
  • How can we get more people across your institution to know about and use the Outcome Measures?
  • What else should we ask about the Outcome Measures?

Let’s promote better data.

 

Russell Poulin
Director, Policy & Analysis
WCET – The WICHE Cooperative for Educational Technologies
rpoulin@wiche.edu | @russpoulin

 

 

Terri Taylor-Straut
Director of Actualization
ActionLeadershipGroup.com

 

 

 



Learn about WCET Creative Commons 4.0 License

 

Rigor, Meet Reality

How do you define academic rigor? I know when I was completing my undergraduate and graduate coursework, I could tell the difference between a rigorous course and one that would be a little less time consuming. I also understood, especially in graduate school, that the more rigorous a course was, the more I “got out of it.” Is there a way to capture the difference so instructors can ensure the best educational experiences for their students?

To help us do just that, today we’re excited to welcome Andria Schwegler from Texas A&M University. Andria is here to discuss her search for the definition of academic rigor, gathered through discussions with her colleagues and presentations at Quality Matters conferences.

Enjoy the read and enjoy your day,

Lindsey Downs, WCET


Rigor is touted by many educators as a desirable quality in learning experiences, and many institutional mission statements promise rigorous academic experiences for students. Unfortunately, a definition of rigor has been elusive. How do institutions leverage evidence to support their claim of rigorous experiences?

The Task to Operationally Define Academic Rigor

In search of a definition of academic rigor, I sought out the types of evidence that faculty members at my institution use to document rigor in their academic programs and courses. The search was prompted by an invitation to serve as a panel discussant representing a faculty member’s perspective in one of two special sessions “Quality Online Education: What’s Rigor Got to Do with It?” at the Quality Matters Connect Conference in 2017. The purpose of the sessions was to open dialogue across stakeholders in higher education regarding the role of rigor in traditional and alternative learning experiences. As a faculty member, I could articulate rigor in my own courses and program, but because defining rigor had not been a topic of discussion across departments, I sought to learn what my colleagues considered evidence of academic rigor.


Photo from #WOCinTech Chat

Operational Definitions of Rigor Offered by Faculty

Several of my colleagues accepted my invitation to discuss the issue, and they provided a variety of measurable examples to demonstrate rigor in their courses and programs. Rigorous learning experiences they described included:

  • audio and video recorded counseling sessions with clients that were subsequently critiqued in class,
  • problems identified by students in current or former employment contexts that were brought to class and addressed by applying course content to the cases, and
  • quantitative research projects requiring application of completed coursework to collecting and interpreting original datasets.

Distilling a broader summary from course-specific assignments and program-specific assessments revealed that the most frequently cited evidence to support claims of rigor were student-created artifacts. These artifacts resulted from articulated program- and course-learning outcomes that specified higher-level cognitive processing. Learning outcomes as static statements were not considered evidence of rigor in themselves; they were prerequisites for learning experiences that could be considered rigorous (or not) depending on how they were implemented.

Implementing these activities included a high degree of faculty support and interaction as students created artifacts that integrated program- or course-specific content across time to demonstrate learning. The activities in which students engaged were leveled to align with the program or course and were consistent with those they would perform in their future careers (i.e., authentic assessment; see Mueller, 2016). Existing definitions of rigor include students’ perceptions of challenge (“Rigor,” 2014), and the evidence of rigor faculty members provided highlighted the intersection of students’ active engagement with curriculum relevant to their goals and interaction with the instructor. These conditions are consistent with flow, which is characterized by concentration, interest, and enjoyment that facilitate peak performance when engaged in challenging activities (Shernoff, Csikszentmihalyi, Schneider, & Shernoff, 2003).

Translating Rigor into Student Assessments

Creating meaningful assessments of learning outcomes that integrate academic content across time highlights the importance of planning learning experiences, not only in a course but also in a program. Faculty colleagues explicitly warned against considering evidence of rigor in a course outside of the context of the program the course supports. Single courses do not prepare students for professional careers; programs do. It was argued that faculty members must plan collaboratively beyond the courses they teach to design program-level strategies to demonstrate rigor.


Planning at the program level allows faculty members to make decisions regarding articulating curriculum in courses, sequencing coursework, transferring coursework, creating program goals, and assessing and implementing program revisions. Given these responsibilities, instead of being viewed as an infringement on their academic freedom (see Cain, 2014), faculty members indicated that collaborative planning and curriculum design were essential in setting the conditions for creating assessments demonstrating rigor.

Though specific operational definitions of rigor provided by colleagues were as diverse as the content they taught, the underlying elements of their examples were similar and evoked notions of active student engagement in meaningful tasks consistent with a “vigorous educative curriculum” (Wraga, 2010, p. 6).

Student Activities During Lecture

Stepping back from the examples of student work that faculty members offered as evidence of rigor, I reflected on student activities that were missing from our conversations. None of my colleagues indicated that attending class, listening to lecture, and taking notes were evidence of rigor. Though lecture is “the method most widely used in universities throughout the world” (Svinicki & McKeachie, 2011, p. 55), student activities associated with it never entered our conversations.

In fact, one colleague’s example of rigorous classroom discussion directly contradicted the lecture approach. She explained that during discussions, she tells her students not to believe a word she says, though she was quick to add that she does not mislead students. Her approach puts the burden of obtaining support for discussion claims on students, who cannot passively rely on the teacher as an authority. Instead, students are held accountable for substantiating the claims they make. This technique offers more evidence of rigor than simply receiving content via lecture.


Student Grades

None of my colleagues suggested that students’ grades were evidence of rigor.

One noted that some students may not meet high standards, leading them to fail a course or program. But these failures in demonstrating performance were viewed as unfortunate consequences of rigor, not evidence to document its existence. This sentiment was complemented by another colleague’s comment that providing support (e.g., remedial instruction, additional resources) to students was not a threat to the rigor of a course or program. Helping students meet high standards and improve performance was evidence of rigor, whereas failing grades earned because students found the content difficult were not.

Teaching Evaluations and Mode of Delivery

None of my colleagues suggested that students’ evaluations of a course or an assignment were evidence of rigor. When documenting rigor, faculty members offered students’ performance on critical, discipline-specific tasks, not their opinions of the activities. Supporting this observation, Duncan, Range, and Hvidston (2013) found no correlation between students’ perceptions of rigor and self-rated learning in online courses. Further, definitions of rigor provided by graduate students in their study were strikingly similar to the definitions provided by my colleagues (e.g., “challenge and build upon existing knowledge…practical application and the interaction of theory, concept, and practice…must be ‘value-added’” p. 22). Finally, none of my colleagues indicated that mode of delivery (e.g., face-to-face, blended, online) was related to rigor, an observation also supported by Duncan et al. (2013).

Defining Academic Rigor: A Research-Based Perspective


Thanks to my colleagues, I arrived at the Quality Matters Connect conference with 17 single-spaced pages of notes documenting an understanding of rigor. Though the presentation barely scratched the surface of the content, I was optimistic that we were assembling information to facilitate multiple operational definitions of rigor that could be used flexibly to meet assessment needs. This optimism contributed to my surprise when, during small group discussion among session attendees, the claim was made that academic rigor has too many interpretations and cannot be defined.

I cannot support this claim because most variables addressing human behavior have multiple ways they can be operationally defined, and converging evidence across diverse assessments increases our understanding of a given variable. From this perspective, a single, narrow definition of rigor is neither required nor desirable. A research-based perspective allows for multiple operational definitions and makes salient the value of assessment data that may be underutilized when it informs only a single program’s continuous improvement plans. As Hutchings, Huber, and Ciccone (2011) argue, faculty members engage in valuable work when they apply research methods to examine student learning and share results with others. When assessment and improvement plans are elevated to the level of research, the information can be shared to inform others’ plans and peer reviewed to further improve and expand the process.

Articulating and sharing ways to observe and measure rigor can provide educators and administrators a selection of techniques that can be shaped to meet their needs. Engaging in an explicit examination of this issue across institutions, colleges, programs, and courses facilitates the identification of effective techniques to provide evidence of rigor to support the promises made to our stakeholders.

Andria Schwegler
Associate Professor, Counseling and Psychology
Texas A&M University – Central Texas


References

Cain, T. R. (2014, November). Assessment and academic freedom: In concert, not conflict. (Occasional Paper No. 22). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment. Retrieved from http://www.learningoutcomesassessment.org/documents/OP2211-17-14.pdf

Duncan, H. E., Range, B., & Hvidston, D. (2013). Exploring student perceptions of rigor online: Toward a definition of rigorous learning. Journal on Excellence in College Teaching, 24(4), 5-28.

Hutchings, P., Huber, M. T., & Ciccone, A. (2011). The scholarship of teaching and learning reconsidered: Institutional integration and impact. San Francisco, CA: Jossey-Bass.

Mueller, J. (2016). Authentic assessment toolbox. Retrieved from http://jfmueller.faculty.noctrl.edu/toolbox/whatisit.htm

Rigor. (2014, December 29). In The Glossary of Education Reform. Retrieved from https://www.edglossary.org/rigor/

Shernoff, D. J., Csikszentmihalyi, M., Schneider, B., & Shernoff, E. S. (2003). Student engagement in high school classrooms from the perspective of flow theory. School Psychology Quarterly, 18(2), 158-176.

Svinicki, M., & McKeachie, W. J. (2011). How to make lectures more effective. McKeachie’s teaching tips: Strategies, research, and theory for college and university teachers (13th ed., pp. 55-71). Belmont, CA: Wadsworth.

Wraga, W. G. (2010, May). What’s the problem with a “rigorous academic curriculum”? Paper presented at the meeting of the Society of Professors of Education/American Educational Research Association, Denver, Colorado. Retrieved from https://eric.ed.gov/?id=ED509394

Learn about WCET Creative Commons 4.0 License

 

Teaching with Wikipedia: A High Impact Open Educational Practice

This week’s Frontiers Blog is written by one of my OER heroes, Dr. TJ Bliss. Like so many others working to reduce costs and increase open access to education, I was inspired, mentored, and empowered by TJ. In his previous position as the OER Program Officer at the William and Flora Hewlett Foundation, TJ granted $45 million to over 30 organizations around the world actively working to advance the field of open education. Now, as the Director of Development and Strategy at Wiki Education, TJ invites higher education to contribute to and access open knowledge in the world’s largest encyclopedia. Read more about TJ Bliss and his inspiring work. Thank you to TJ for writing for WCET Frontiers.

We hope you enjoy today’s post!

– Tanya Spilovoy, Director, Open Policy, WCET


Wikipedia is arguably one of the most successfully crowd-sourced digital projects to date. By nature of its open-source and collaborative design, the knowledge represented on the site is consistently improved, built upon, and updated. While Wikipedia still suffers from systemic biases and content gaps, it is designed to include as many voices as possible in the articulation of its information. It is also designed to be accessible to as many people as possible, and there is no doubt that it has tremendous reach: almost 500 million unique users visit the site every month.

Few readers of Wikipedia fully understand its mechanisms. Where does that information come from? Is it trustworthy? One might be aware that content on Wikipedia is written entirely by volunteers around the world. This fact inspires uncertainty or skepticism in many. If just anyone can change Wikipedia, won’t there be inaccuracies? Won’t people potentially abuse that power?

In the age of fake news, providing opportunities for people to learn how to identify misinformation online is increasingly important. And students are particularly vulnerable. Researchers at the Stanford Graduate School of Education published a report in 2016 that found that students have difficulty identifying credible sources online, distinguishing advertisements from news articles, and understanding where information they encounter comes from.

So how do we equip students with the skills they need to understand their ever-changing digital landscapes? We can use open educational practice to teach them how to evaluate and improve the informational resources that hundreds of millions of people rely on every day. In particular, we can teach them to edit Wikipedia articles as part of their regular coursework.

A student using the Wiki Education Dashboard.

Successfully Using Wikipedia in the Classroom

Supporting higher education instructors who teach with Wikipedia is Wiki Education’s specialty. We offer free resources for instructors to do this, which take the form of instructional design consultation, assignment management software, tutorials about how to contribute to Wikipedia, and print resources about editing in particular disciplines. These tools all come together in our Dashboard, a free and open-source web application that we consistently improve. In fact, we’ve built a volunteer development community around that continual improvement, and we engage a number of newcomers in this tech development.

The Dashboard is where instructors create and manage their Wikipedia assignment using our templates. They’re able to refine the assignment to meet their needs and can closely track student progress as they contribute course content to Wikipedia articles. The Dashboard is also where students take tutorials about how to edit and track their work and that of their classmates. Since 2010, Wiki Education has supported more than 43,000 students at over 400 universities and colleges throughout North America. These students have improved or created 60,000 Wikipedia articles so far and have added 40.3 million words to the site.

Increases in Learning

The open educational practice of assigning students to edit Wikipedia teaches information literacy and engages students in new ways. They walk away with the ability to apply a critical lens to information they encounter outside of the classroom. Eighty-seven percent of instructors who taught with Wikipedia in the Fall 2017 term agree that a Wikipedia assignment is more effective in teaching information literacy than a traditional assignment.

When students edit Wikipedia, they engage in the technological mechanisms of an online resource they rely on (i.e. they learn how to edit and respond to Wikipedia’s community of fellow editors); they learn to distinguish primary and secondary sources in accordance with Wikipedia’s encyclopedic standards; they recognize the difference between encyclopedic and argumentative writing styles; and they understand how to identify where information comes from and whether or not it’s accurate.

One student who completed a Wikipedia assignment in Fall 2017 explained what they learned from the process:

“What I realize now is that Wikipedia is in a constant state of research and conversations are happening so things get updated to reflect that knowledge. That is not to say that things do not need to be checked, but it is always good when research is progressive. I really learned the ins and outs of editing and all the things that come with it. Another valuable thing I learned that I will use forever is how to assess an article. I had never done anything like this in a previous class, and I can say that I learned things that I will hold with me forever. It is so important that instead of just disregarding [Wikipedia] as a whole, we should be working on strengthening it.”

Students Become Content Creators

When students engage with Wikipedia in the classroom, they come to understand what knowledge is not adequately represented on the site. And they take an active role in the solution, becoming creators of knowledge, rather than merely consumers.

The public-facing nature of a Wikipedia classroom assignment also carries a lot of weight for students. Participating in scholarship for mass consumption not only keeps students accountable, but inspires a sense of confidence in them, as well. Editing Wikipedia uniquely positions students as knowledge producers, inspiring them to have a real command over course material and an active approach to expressing what they’ve learned. Held accountable by their instructor, other students, other Wikipedia editors, and Wikipedia’s global readership, students are more motivated to produce high quality work. Students are also motivated by the autonomy that the assignment can inspire, and feel a personal investment in better representing a topic of their choice on the world’s most popular online encyclopedia.

Students are Engaged and Giving Back

Again, with the rise of fake news, engaging students in questions about knowledge production and dissemination is critical. Information literacy is not only a beneficial academic skill, but a critical life skill. We want students to emerge from higher education more confident and better informed to participate as citizens.

Katie Webber, a Rice University student in our program, proposed editing Wikipedia as a civic duty:

“I call my senators, I vote, I donate to the ACLU, and now, I edit Wikipedia,” she wrote in a reflective blog post.

Wiki Education student participants at UC Berkeley.

When students understand themselves as knowledge producers, they feel a responsibility to a learning community larger than the classroom. Wikipedia’s self-described purpose is to provide free access to the sum of human knowledge to the most people possible. People in higher education are beginning to recognize the importance of digital writing and public scholarship. York University, for example, recently awarded a prestigious faculty award to a student for the Wikipedia article he wrote about the digital divide in Canada.

“I learned and gained more from working with Wikipedia than I have from almost any other assignment I have completed,” the student reflected. “[I learned] how to interact with Wikipedia’s collaborative social network, adapt to a work environment that isn’t a traditional word processor, and practice a style of writing which isn’t common among university assignments. These are all things that I would not have experienced if I had been working on something more traditional, yet I believe having less traditional experiences like these is also an important part of growing academically.”

Jon Sufrin, coordinator of the faculty-wide competition at York, also spoke to the value of academic engagement with a medium like Wikipedia.

“It’s pretty clear that digital writing is going to be in demand in the future, and this kind of writing takes a specific set of skills to do well,” he said. “You have to be able to sort through all the available sources, have skills at hyperlinking, and understand how to make use of the web as a dynamic medium. Digital writing isn’t just screen prose, it’s interactive prose. All of these skills are in addition to actually being able to write something.”

Improving Wikipedia as a course assignment situates student learning within broader conversations about access to information and the mechanisms at work in the production of knowledge. And because students have access to institutional resources that are usually restricted behind paywalls, they are in a great position to contribute academic knowledge to the public commons for the benefit of learners everywhere.

With further developed skills in digital literacy, information literacy, critical research, and collaboration, as well as an increased sense of motivation in their work, students emerge from the experience of editing Wikipedia better equipped for the digital and informational landscapes of the future.

To find out more about how you can incorporate an assignment like this in your own course, visit teach.wikiedu.org.

TJ Bliss
Director of Development and Strategy
Wiki Education

Learn about WCET Creative Commons 4.0 License

 

OLC and WCET Need Feedback on Accessibility

Even a gazillion dollar industry like the National Football League can be blind to accessibility needs, at times. Perhaps, even, color blind. If you have not seen it, be sure to look at this video of the game in which they had the two teams wear special jerseys all of one color…

While we don’t have a gazillion dollar budget, WCET and the Online Learning Consortium (OLC) are partnering to help our members address accessibility issues…and we need your input on which activities will best help you.

We Don’t Know What We Don’t Know

While we in the educational technology and distance education world can also fall short at times, it is not for a lack of trying. Our friends at the Instructional Technology Council (ITC) Network conduct an annual survey of their (mostly) community college membership. Below, I chart the results of one of their questions, which was featured in a recent article about the “confidence” in institutional accessibility compliance.

Chart showing degrees of confidence in accessibility compliance.

The “Completely” or “Mostly” compliant options were chosen by 73% of respondents in 2008. That number fell to only 33% in the most recent survey.

I’ve used this graphic in discussions and presentations over the last year and asked “why?” Are we getting worse at this? The answer is almost uniformly that we did not know all we needed to know back in 2008. And, even today, there is more to learn.

OLC and WCET Partnering to Help Our Members

After a year of searching for an issue on which WCET and the Online Learning Consortium (OLC) could partner to help both memberships, accessibility was chosen. As stated by Kathleen Ives, OLC’s Chief Executive Officer and Executive Director:

“Advancements in technologies intending to make academic life easier present challenges for students who have disabilities. The need for more accessible services and devices proves to be an educational imperative so they can be used by all students of all abilities.”

Our partnership kicked off this week with a joint webinar that explained some of the basic issues. You are encouraged to view the archive. Moving forward, watch for more webinars, papers, blogs, and conference sessions on these issues.

BUT, We Need Your Help!

One of our first efforts was to create a survey that seeks to gain data on the following issues:

  • What are the attitudes of various people on campus toward accessibility?
  • Are there structures in place on campus to support accessibility?
  • How can we help?

We are hearing that there is still lots of confusion about what is required and what should be done regardless of requirements. Based on the survey responses, OLC and WCET will plan professional development offerings to help meet the needs that you identify.

The survey is focused on institution-based personnel. If you work at an institution, please take the survey: https://www.surveymonkey.com/r/WJFWVY5

Watch for more information in the future.

Thank you,

Russ

Russell Poulin
Director, Policy & Analysis
WCET – The WICHE Cooperative for Educational Technologies
rpoulin@wiche.edu | @russpoulin

Learn about WCET Creative Commons 4.0 License

 

Higher Education Act – Innovations, Definitions, and State Authorization

When passed in 1965, the Higher Education Act (HEA) was intended “to strengthen the educational resources of our colleges and universities and to provide financial assistance for students in postsecondary and higher education.” Updated or “reauthorized” several times since then, the Act has historically housed most of the federal resources and regulations for higher education in the United States. Remember that the states have authority over higher education in their jurisdictions, but the lure of financial aid for students, research funds, and other federal levers gives the federal government considerable sway over institutional activity.

Reauthorization is overdue. The House of Representatives has weighed in with its PROSPER Act, which was a purely one-party production. The Senate has been holding hearings and may soon deliver its own, more bipartisan, version in the coming weeks.

What would we like to see in HEA?

Richard Nelson, President of Nicolet College, eloquently presented his ideas in last week’s post, in which he sought to encourage innovation and protect aid from fraudulent uses. He also issued a challenge to WCET and reminded us of our first responsibility:
“Once again, I suggest that WCET is well-positioned to do this work and hold firmly to the belief that students come first. Always.”

Meanwhile, the Policy unit of WICHE (our parent organization) invited WCET to contribute to a “Statement of Principles and Positions” regarding the HEA. In this post, I’ll share with you some thoughts in response to President Nelson and what was included in those “Principles and Positions” that were recently approved by the WICHE Commission. I’ll focus on two issues:

  • Innovation vs. Protecting Students: Stop trying to define the moving target that each innovation represents. Protecting students is a worthy and necessary goal, but we can do better.
  • State Authorization for Out-of-State Activities: Recognize state responsibilities and reciprocity agreements.

The Balance Between Encouraging Innovation and Protecting Students

There is a constant tension between: a) the introduction of new modes of instruction and b) the need to assure that students are not harmed and federal aid funds are used properly. These two desires should not be at odds. Innovators want to move forward unfettered by rules while consumer protection professionals seek tight controls based upon a history of malfeasance by a select few.

President Nelson states the challenge very well:
“If well-crafted regulation can reduce the risks associated with innovation and help overcome resistance to change…How do we find the balance between making room for new models without throwing the door open to the unscrupulous opportunists who will exploit every perceived regulatory loophole?”


In the WICHE “Statement of Principles and Positions”, we suggest a long- and short-term strategy to find this balance.

But First…the Problem with Definitions

The one constant in life is change. Let’s accept that.

Congress, the Department of Education, and the rest of us are currently mired in efforts to try to define “distance education,” “correspondence education,” “regular and substantive interaction,” and “competency-based education.” One staffer from the House Education and Workforce Committee told a group of accreditation leaders that one reason they left the “distance education” definition out of the PROSPER Act is that they could not agree on one.

Let’s stop it.

The House and Senate could better spend their time preparing for future (currently unimagined) innovations, rather than pursuing granular definitions that are instantly outdated. For example, the current federal definition allows your use of “video cassettes, DVDs, and CD-ROMs.” Ask your kids if they know what those are. Some of you reading this might not know what they are.

The principles document acknowledges three basic tenets in making our recommendations:

  • “Policy formation lags innovation, and it always will.
  • “Change is inevitable, and new innovations that are not now envisioned are on the horizon.
  • “Students must be protected, and federal financial aid should not be used for non-productive or fraudulent purposes.”

Long-term: Create a Flexible Measure for Innovations Not Based on Specific Definitions

Given the difficulty of a major policy change, we suggest a long-term strategy that will take time and discussion. But, we are certainly open to a quicker, innovative solution. Our recommendations:

  • “Create a commission to develop a new process and set of regulations to handle innovations. Rather than waiting for years after an innovation has already become mainstream, adopt new processes that allow aid to be used for emerging innovations with clear safeguards.
  • “As a model for regulating innovative modes of instruction, consider a modified version of the medical model for approving drugs and treatments…”

Also, we want to shift the discussion from input-based measures (e.g., “regular and substantive interaction”) to outcome-based measures (e.g., “last day of attendance,” student progression).

Short-term: Maintain Definitions Until the Process Described Above is Ready

While waiting for the longer-term solution, we can live with definitions a little while longer. We recommended:

  • That the Department maintain the current “distance education” definition, but I would like to amend that. Since we wrote that recommendation, the presidents of seven large, innovative colleges (all of them WCET members) wrote a letter with their own suggestions for definitions. A copy of the letter appears at the end of this post. Their recommendations should be seriously considered.
  • Adding a definition of “competency-based education.” Unfortunately, the seven presidents were unsuccessful in creating a definition for that mode of instruction. A definition is needed to provide those students with aid and it would be wise to consult the Competency-Based Education Network for their input on a definition.
  • Finally, replacing the “regular and substantive interaction” definition. The Department of Education has been lax in providing actionable guidance on how to comply. See the analysis by Van Davis (now an independent consultant) and me of the many documents that must be consulted to determine how to comply. It is also a criterion based on inputs. The issue should be “DO students learn,” not “HOW students learn.” Let’s focus on outcomes…even in the short term. We can create outcomes-based safeguards that will protect both students and aid expenditures.

State Authorization for Out-of-State Activities

The Department of Education released a new federal regulation for state authorization of distance education that is set to go into effect on July 1, 2018. Watch for more from us on this issue in the coming weeks. Meanwhile, the House’s PROSPER Act suggests completely doing away with this regulation. Of course, they think that this will relieve institutions of seeking authorization in each state, but state regulations remain in place regardless of federal rules.

Some basic tenets underlying our recommendation:

  • States are charged with overseeing higher education activities within their borders.
  • This oversight provides necessary student protections.
  • Oversight is for ANY activity in the state, not just distance education.
  • An interstate reciprocity agreement is one means for an institution to obtain approval in a state.

Our recommendations are simple:

  • “…to better protect students, WICHE supports a requirement that postsecondary institution comply with authorization regulations for each state in which it serves students for eligibility to disburse federal financial aid.”
  • “…the U.S. Department of Education should recognize interstate reciprocity agreements as an acceptable method for an institution to obtain that authorization.”

In Conclusion…

There are many more issues that we must watch and comment upon. I highly recommend that you let your Representative or Senator know what you think. As an individual you can express your opinion. Volume counts.

And…you might have an extra loud voice if you are in one of the twenty-four states with a member on the Senate Health, Education, Labor, and Pensions Committee. They’re working now. Make your voice heard.

Many feel that reauthorization will not pass this year. That would break the record for longest time between such actions. We still need to pay attention and participate. Sometimes language introduced now survives until the final product. Let’s make sure it is good language.

Thank you,

Russ

Russell Poulin
Director, Policy & Analysis
WCET – The WICHE Cooperative for Educational Technologies
rpoulin@wiche.edu | @russpoulin


= = = LETTER FROM THE SEVEN PRESIDENTS = = =

February 20, 2018

The Honorable Lamar Alexander
United States Senate
Chair, U.S. Senate Committee on Health, Education, Labor & Pensions
455 Dirksen Senate Office Building
Washington, DC 20510

The Honorable Patty Murray
United States Senate
Ranking Member, U.S. Senate Committee on Health, Education, Labor & Pensions
154 Russell Senate Office Building
Washington, DC 20510

Dear Senator Alexander and Senator Murray,

We, the undersigned presidents of private not-for-profit and public colleges and universities across the nation, write to you to respectfully offer guidance as you revisit the distance education requirements within the Higher Education Act.

Over the past decade, higher education institutions of all sizes and status have embraced distance education/online learning as an innovative long-term strategy to meet student needs, deliver a more flexible, cost-effective form of academic instruction, and advance the Completion Agenda. Today, distance education has become a ubiquitous component of contemporary higher education with more than six million students taking at least one course at a distance.

Unfortunately, with the Higher Education Act last updated in 2008, federal regulations have been slow to keep up with advancing technology and innovations within the online learning sector. As a result, distance education is staring down an uncertain future – its long-term viability threatened by obsolete standards regarding “regular and substantive interaction” within federal statute. To create an environment open to sensible experimentation and that fosters innovations in products, programs, and services, Congress must revise the Higher Education Act accordingly.

Therefore, we are heartened by your desire to work in a bipartisan manner to draft a Senate reauthorization bill and your expressed openness to guidance on these provisions.

Republicans and Democrats share a commitment to increasing access, equity, affordability, and accountability, if by different means. We are confident the following revisions, which resulted from a collaborative effort of the undersigned to balance the needs of public and private institutions with adequate student and taxpayer protections, meet these obligations.

We encourage replacing the current definition of Distance Education with the following:

Distance Education: Except as otherwise provided, the term “distance education” means education that provides students who are not in a physical classroom with substantive interaction, including, but not limited to, instruction, assessment, mentoring/advising, and learning support, enabled by ready access to qualified faculty, monitoring of student progress, and active intervention.

We recommend the following revision to the definition of Correspondence Course:

Correspondence Course: (1) A course provided by an institution under which the institution provides instructional materials, by mail or electronic transmission, including examinations on the materials, to students who are separated from the instructor. Interaction between the instructor and student and academic support, including access to qualified faculty, is limited, is not regular and substantive, and is primarily initiated by the student. Correspondence courses are typically self-paced and self-taught.

(2) If a course is part correspondence and part residential training, the Secretary considers the course to be a correspondence course.

(3) A correspondence course is not distance education.

Additionally, to enhance the ability of students to make informed decisions when choosing an educational provider, we recommend standardizing outcomes data at the institutional level and for each program of study:

Transparency: Make transparent outcomes data available by program at the undergraduate and master’s level for the following: (1) 100% and 150% graduation rates. (2) One-year retention rate. (3) Average annual cost for full-time attendance, broken out by tuition, fees, and living costs. (4) Federal student debt from tuition and fees.

The Higher Education Act serves as the single most important federal means to increase the capacity of low- and middle-income individuals to finance a post-secondary education. The Senate stands on the cusp of a historic opportunity. As you begin drafting the reauthorization bill, we strongly encourage you to consider these new and revised definitions.

Please contact us with any questions.

Respectfully submitted,

Dr. James N. Baldwin, President, Excelsior College

Dr. Michael M. Crow, President, Arizona State University

Dr. Sue Ellspermann, President, Ivy Tech Community College

Ed Klonoski, President, Charter Oak State College

Dr. Paul J. LeBlanc, President, Southern New Hampshire University

Scott D. Pulsipher, President, Western Governors University

Dr. Becky Takeda-Tinker, President, Colorado State University – Global Campus

Learn about WCET Creative Commons 4.0 License

 

What Should Reauthorization Be Like?

In February 2018, we had a question submitted through our WCETDiscuss email list about the Reauthorization of the Higher Education Act. This prompted a discussion of how the reauthorization would or should impact distance education. The original question asked about academic integrity, specifically student identity authentication. Russ Poulin, our Director of Policy and Analysis, answered this question (see his adapted answer below).

Good question. You are probably looking at the PROSPER Act, which was the product of the House of Representatives. That bill was completely the product of one party and, while parts of it may survive, the resulting Higher Education Act will probably look much different.

I received inquiries from others who may be unclear that there already is a federal regulation regarding distance education and academic integrity. It is Chapter 34, 602.17(g), which places on accrediting agencies the responsibility to monitor institutions…

… I think the Senate (which is currently working on its version of reauthorization) will include language regarding student identity for distance education.

So…we can recommend it not be included, but we better be ready with alternatives. Our mere suggestion that reauthorization not address student identity will just be further proof to the naysayers that more regulation is needed.

If a regulation were to exist, what could we live with? Personally, I’d like to see the responsibility stay with the accrediting agencies. If anything needs to be beefed up, it would be the gathering of evidence that such methods work. Like many of you who wrote previously, given the ubiquity of technology and cheating, I don’t see why it should be limited to distance education.

Richard Nelson, the President of Nicolet College, provided a wonderful response to Russ’ question about “What Reauthorization Should Look Like.” Upon our request, he adapted his WCETDiscuss response for today’s post. Thank you President Nelson for your thoughtful response and your challenge to WCET!

Enjoy the read and enjoy your day,

Lindsey Downs, WCET


What Would Reauthorization Be Like?

What would we like to see in a reauthorized Higher Education Act (HEA)? At first, visions of unprecedented innovation powered by boundless creativity and liberated from burdensome regulation come to mind. Then reality sets in and I grudgingly recognize the inevitability of regulation. However, if we’re diligent and a little bit lucky, sound regulation of limited scope may actually help the cause by mitigating some of the widespread skepticism in DC and most state capitals of anything other than counting butt-in-seat hours.

That skepticism isn’t surprising if you suspect, as I do, that a huge majority of members and staffers in all branches of government went directly from good high schools to very traditional and very enjoyable four-year residential college experiences. It’s what they know. It’s what “college” means to them. That’s neither good nor bad, it just is. Recognizing the frame of reference different people bring to the table can be immensely helpful as we seek to influence their perspectives.

Things Have Changed

Fortunately, we have many influential innovators among us, and we have much more evidence than we did at the last HEA reauthorization that online learning (or technology-enabled learning in general) is at least as effective as face-to-face instruction. Other things have changed too:

  • Affordability is much more important than it was prior to the great recession.
  • Students are more sophisticated and have more choices.
  • Although higher education is late to the party compared to other economic sectors, the transfer of power from provider to consumer has arrived.
  • Instead of high unemployment, potentially crippling labor shortages challenge entire industries.
  • Elected officials and employers are rapidly realizing that the projected demand for well-prepared workers simply cannot be met, unless up-skilling the existing workforce is a big part of the solution.
  • And the only way the vast majority of currently employed adults can re-engage in higher education, no matter how much they might want to, is through programs with enough flexibility to accommodate real life.

What Can Higher Education Do?

When we consider affordability and flexibility, it’s no wonder that short-term credentials and competency-based programs are quickly gaining favor with students and employers. Their full potential will only be realized when they’re embraced by our regulators and accreditors, but it’s not an easy ask. They aren’t known for risk-tolerance, nor should they be. Identifying the risks of innovation and taking effective steps to mitigate them is our job, not theirs, and I suggest that organizations like WCET have the right people with the right motivation to do this work exceptionally well.

The risks of innovation come in two flavors: the risk of failure faced by the innovators and the risk of market disruption feared by the traditionalists.

In reality, however, defenders of the status quo will typically resist change by amplifying the risks of innovation, so it turns out that the two flavors aren’t really so different. Either way, reducing the risk of failure to a level that our regulators and accreditors can tolerate is a prerequisite to overcoming resistance to change.

This brings us back to the discussion at hand. Certainly, we all agree that students need to represent themselves truthfully, whether we’re talking about what they know or who they are. Accordingly, let’s focus first on finding ways to provide strong assurances of student identity and of academic honesty. In so doing, we will eliminate perceived weaknesses in these areas as a justification for opposing change.

A Challenge for WCET

If well-crafted regulation can reduce the risks associated with innovation and help overcome resistance to change, the less thoughtful sort can just as easily drown the emerging seedlings of progress. How do we find the balance between making room for new models without throwing the door open to the unscrupulous opportunists who will exploit every perceived regulatory loophole? How do we encourage the kind of community-based collaboration that can move us all forward without spawning yet another regulation-fueled cottage industry of companies and organizations eager to monetize their own purported compliance solutions? Once again, I suggest that WCET is well-positioned to do this work and hold firmly to the belief that students come first. Always.

For me, I’d be thrilled to see the reauthorized HEA treat distance education not as an afterthought, but as our best bet to bring the benefits of higher education – enhanced social well-being and economic vitality – to more people in more places than ever before. And if we can avoid another “regular and substantive” fiasco while we’re at it, I’d take that as a win too.

Richard Nelson
President
Nicolet College

Learn about WCET Creative Commons 4.0 License

 

Distance Education Enrollment Growth—Major Differences Persist Among Sectors

The Integrated Postsecondary Education Data System (IPEDS) recently released its 2016 distance education data. These data report distance education course enrollment in the United States.

Today, we welcome Terri Taylor-Straut, Senior Research Analyst for WCET, to WCET Frontiers. Terri joins us to review some of the trends in the recently released information and to provide some intriguing conclusions that can be drawn based on several years of analysis on these data points. Thank you Terri for today’s post!

Enjoy the read and enjoy your day,

~Lindsey Downs, WCET


WCET has been analyzing and reporting on the Integrated Postsecondary Education Data System (IPEDS) data on distance education course enrollment since the data became available for the Fall 2012 term. These data are reported annually to the U.S. Department of Education’s National Center for Education Statistics (NCES) as part of the IPEDS Fall Enrollment reporting. Based on the Fall 2013 data, WCET was the first organization to report that there are significant differences in distance education course enrollment trends across higher education sectors.

With the 2016 IPEDS distance education data now available, we have four years of sector data and many of the trends we first identified in 2013 have continued. Looking more closely at the sector trends illuminates some changes that might be missed by looking solely at consolidated distance education data.

Total Higher Education Enrollment for Fall 2016

Public institutions of higher education continue to educate nearly three-fourths (73.0% in 2016) of all enrolled students, regardless of mode of delivery. Private non-profits reported 20.9% of 2016 enrollment; private for-profits reported just 6.1% of enrolled students. Any discussion of sectors should be grounded in an understanding of the relative size of the sectors. Publics remain by far the largest sector, so small changes in the public sector impact the whole data set.

Chart showing 2016 total sector enrollment: 1,218,646 private, for-profit; 4,199,850 private, non-profit; 14,664,481 public.

Total Reported Higher Education Enrollment: 2016 Total Sector Enrollment

Sector                 2016 Enrollment   % of Total Enrollment
Public                 14,664,481        73.0%
Private, Non-Profit    4,199,850         20.9%
Private, For-Profit    1,218,646         6.1%
Total                  20,082,977
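
To make the share arithmetic explicit, here is a minimal sketch in Python (our own illustration, not an official IPEDS or NCES script) that recomputes each sector’s percentage of total Fall 2016 enrollment from the counts in the table above:

    # Recompute each sector's share of total Fall 2016 enrollment.
    # Counts are hard-coded from the table above for illustration.
    sectors = {
        "Public": 14_664_481,
        "Private, Non-Profit": 4_199_850,
        "Private, For-Profit": 1_218_646,
    }

    total = sum(sectors.values())  # 20,082,977

    for name, count in sectors.items():
        print(f"{name}: {count / total:.1%}")
    # Public: 73.0%
    # Private, Non-Profit: 20.9%
    # Private, For-Profit: 6.1%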

Significant Variation in Results Over Time by Sector

Overall higher education enrollment has declined by 4.0%, or 845,466 students, over the four-year period.

Public institutions reported a 2% decline over the period. Private for-profit institutions saw enrollment decline by 34.4%. Private non-profit institutions bucked the overall trend and increased total enrollment by 2.3% over the four-year period. The significant enrollment decline reported by for-profit institutions may be partly attributable to the negative attention those institutions received from the Department of Education and the media during the period.

Total Reported Higher Education Enrollment: Sector Data Trends 2012 to 2016

Sector                 2012         2016         % Change
Public                 14,966,033   14,664,481   -2.0%
Private, Non-Profit    4,105,872    4,199,850    2.3%
Private, For-Profit    1,856,538    1,218,646    -34.4%
Total                  20,928,443   20,082,977   -4.0%
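
The percent-change column follows the same pattern: (2016 − 2012) ÷ 2012. A minimal sketch, again with the published totals hard-coded for illustration:

    # Percent change in total enrollment by sector, 2012 to 2016,
    # computed as (new - old) / old. Values are from the table above.
    enrollment = {
        "Public": (14_966_033, 14_664_481),
        "Private, Non-Profit": (4_105_872, 4_199_850),
        "Private, For-Profit": (1_856_538, 1_218_646),
    }

    for sector, (y2012, y2016) in enrollment.items():
        print(f"{sector}: {(y2016 - y2012) / y2012:+.1%}")
    # Public: -2.0%
    # Private, Non-Profit: +2.3%
    # Private, For-Profit: -34.4%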

Definitions of Distance Education Enrollment

IPEDS reporting requires institutions to report two categories of distance education enrollment: “exclusively enrolled in distance education courses” and “enrolled in some but not all distance education courses.” In addition, WCET and others have continued to combine these two categories to match the historic Babson Survey Research Group (BSRG) category “enrolled in at least one online course.” Additional information about the methodology used is covered in the methodology section below. Sector enrollment in each of these three categories is reported for 2012 and 2016; a quick sketch of the category combination follows.
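
Because IPEDS reports only the two base categories, the derived “at least one” figure is their simple sum. A minimal sketch, using the Fall 2016 sector counts reported in the tables below:

    # Combine the two IPEDS-reported categories into the derived
    # "at least one distance education course" category (Fall 2016).
    exclusively_de = {
        "Public": 1_545_708,
        "Private, Non-Profit": 733_007,
        "Private, For-Profit": 702_139,
    }
    some_but_not_all_de = {
        "Public": 2_833_810,
        "Private, Non-Profit": 394_668,
        "Private, For-Profit": 125_181,
    }

    at_least_one_de = {
        sector: exclusively_de[sector] + some_but_not_all_de[sector]
        for sector in exclusively_de
    }
    print(at_least_one_de)
    # {'Public': 4379518, 'Private, Non-Profit': 1127675, 'Private, For-Profit': 827320}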

Chart showing students enrolled exclusively in distance education courses by sector: public 52%, private non-profit 25%, private for-profit 23%.

Exclusively Distant Students are Growing for Public and Non-profit Sectors

Enrollment growth in “exclusively distance education” courses is significant at 13.2% in the four-year period, particularly considering that overall enrollment declined 4% in the same period. This is the category of distance education where we see the largest variation among the sectors over the four-year time frame.

The non-profit sector reported a whopping 54.7% growth in exclusively distance education enrollments in the period. Public institutions reported significant growth as well at 25.5%, while for-profit institutions reported a decline of 24.3%. Some non-profits experienced rapid growth during the period.

 

Students Enrolled Exclusively in Distance Education Courses: Sector Data Trends 2012 to 2016

Sector                 2012        2016        % Change
Public                 1,231,816   1,545,708   25.5%
Private, Non-Profit    473,800     733,007     54.7%
Private, For-Profit    927,899     702,139     -24.3%
Total                  2,633,515   2,980,854   13.2%

Students Taking Some Distance Courses at Publics Grow by Larger Headcount But Smaller Percentage

The four-year trend results for students enrolled in “some but not all” distance education courses are less dramatic than those reported for “exclusively in distance education courses” but they reveal similar trends. Non-profits again lead with 35.7% growth in these enrollments; public institutions reported nearly a 20% (19.7%) increase. The only sector to report declining enrollment over the period is for-profits with a 6.8% decline.

In analyzing these data, it is interesting to note the differences in the base enrollment numbers for each sector. The four-year growth in public students taking “some but not all” of their courses at a distance was 467,135. This comes close to equaling the total number of students enrolled in some distance courses (519,849) for the other two sectors. The percentage growth for non-profit institutions was much higher, but it started with a much lower enrollment base.

Students Enrolled in “Some But Not All” Distance Education Courses: Sector Data Trends 2012 to 2016

Sector                 2012        2016        % Change
Public                 2,366,675   2,833,810   19.7%
Private, Non-Profit    290,897     394,668     35.7%
Private, For-Profit    134,319     125,181     -6.8%
Total                  2,791,891   3,353,659   20.1%

For “At Least One” Distance Education Course, Non-Profits Overtake For-Profits as Second Biggest Sector

Since this category is simply the combination of the two categories of distance education enrollment that IPEDS requires, it is not surprising that the trends remain consistent, with non-profits reporting large gains at 47.5% over the four-year period and publics reporting healthy growth at 21.7%. The private for-profit sector is the only one to report an enrollment decline, at 22.1% between 2012 and 2016.

Distance education enrollment growth continues for the public sector. It is instructive to note that in 2012 for-profit institutions enrolled about 300,000 more distance students than the non-profits. In 2016, the non-profits are the second biggest sector and enroll about 300,000 more students taking distance courses than does the for-profit sector.

Students Enrolled in “At Least One” Distance Education Course: Sector Data Trends 2012 to 2016

Sector                 2012        2016        % Change
Public                 3,598,491   4,379,518   21.7%
Private, Non-Profit    764,697     1,127,675   47.5%
Private, For-Profit    1,062,218   827,320     -22.1%
Total                  5,425,406   6,334,513   16.8%

Conclusions

As we have previously noted, it is not our intention to place value judgments on the different sectors, but rather to continue to chip away at common myths about distance education enrollments by sector. With four years of distance education data now available, the trends are clearer. This information informs the marketplace as well as those responsible for regulatory oversight.

One trend that has never wavered is the fact that public institutions continue to educate the vast majority of students, both on campus and by distance education courses.

Methodology

WCET has worked with other professional organizations with an interest in the IPEDS distance education data since the 2012 data became available. These organizations include e-Literate (particularly Phil Hill) and the Babson Survey Research Group (BSRG). While these organizations have worked closely together, and at times shared IPEDS distance education data sets, slight differences in the data have been reported from year to year. The purpose of this blog is to illuminate trends in the distance education data across sectors and over the four-year period for which IPEDS data are available. Comparing the combined categories to the historic data reported prior to 2012 by BSRG allows us to approximate the measures used by the prior BSRG surveys. Phil Hill has done a fine job of illuminating the differences in the data and definitions used over time. We remain grateful for his continued work in this area.

A reader raised the question last year about the impact of the transition of for-profits to non-profit status. In response, Phil Hill analyzed the data to show the impact to be negligible. As those transitions happen in the future, those changes will need to be part of any sector-based analysis.

Readers may also be interested in the prior WCET blog posts and reports focused on the IPEDS distance education data.

Terri Taylor-Straut
Senior Research Analyst
WCET – WICHE Cooperative for Educational Technologies

Learn about WCET Creative Commons 4.0 License