Federal Regulations Groundhog Day

Breaking news! The 2016 Federal regulations for State Authorization of Distance Education have been delayed. Today we are joined by Cheryl Dowd, Director of WCET’s State Authorization Network (SAN), to discuss what we do know about the delay and provide further information. Thank you Cheryl for today’s post!

Enjoy the read and enjoy your day,

– Lindsey Downs, WCET


It’s Like Groundhog Day

The 2016 Federal regulations for State Authorization of Distance Education, initially set to be effective July 1, 2018, have been delayed for two years. The Department proposes another round of negotiated rulemaking to amend both the regulations governing the legal authorization of institutions by states and the regulations for state authorization of distance education.

Bill Murray in the 1993 movie “Groundhog Day” (Photo: Columbia Pictures)

Here we go again! Déjà vu!

Does living through the Federal regulation of State Authorization of Distance Education feel like the movie Groundhog Day to all of you? Phil Connors, the hapless weatherman played by Bill Murray, re-lived the same day over and over again! So, I “googled” the movie to see how long Bill Murray was caught in the loop. The movie’s director, Harold Ramis, said that Bill Murray’s character was trapped for 10 years! We are all Bill Murray, or if you are a fan of the Broadway musical version, like me, we are Andy Karl.

Let’s Start at the Very Beginning

Our story begins about nine years ago. The Department of Education periodically gathers committees of experts to negotiate the details of specific regulations. As the result of a “negotiated rulemaking” panel working on new rules related to the administration of title IV funds, the Department of Education released new rules back in June 2010, which included a proposed Federal regulation requiring institutions to be authorized by their home states. BUT, the proposed regulation did not contain the language for state authorization of distance education. The additional language (600.9(c)), regarding distance education, was released in the final regulation in October 2010. As a result, many institutions found that they had to scramble to be in compliance with various states’ laws and regulations for the out-of-state activities of their institutions. States became more aware of their role in oversight of the activities of out-of-state institutions that occur within their borders. The story goes on…

History of State Authorization

  • 1791 – States have the authority to regulate educational activities in their states (U.S. Constitution, Amendment X).
  • 2009 – A negotiated rulemaking committee considered specific mention of distance education in state authorization regulations but did not include it.
  • June 2010 – The Department released the proposed Federal regulations for State Authorization for public comment, minus the language about distance education.
  • October 2010 – The Department released the final 2010 Federal regulations for State Authorization, including language about distance education.
  • 2011 – The Department issued several Dear Colleague Letters to clarify the regulations and help with implementation.
  • 2011 – The WCET/State Authorization Network (SAN) was created.
  • 2011 – The U.S. District Court struck down the distance education portion of the regulation on procedural grounds: the public was not able to comment, since the distance education language was not included in the June 2010 proposed regulation.
  • 2012 – The U.S. Court of Appeals upheld the District Court’s ruling to vacate the distance education portion of the regulation. As a result, there was no enforceable Federal regulation!
  • 2014 – The State Authorization Reciprocity Agreements (SARA) welcomed its first state, Indiana.
  • 2014 – A negotiated rulemaking committee did not reach consensus on a Federal regulation for State Authorization of Distance Education.
  • 2016 – In July, the Department released proposed Federal regulations for State Authorization of Distance Education for public comment.
  • 2016 – In December, the Department released the final 2016 Federal regulations for State Authorization of Postsecondary Distance Education and Foreign Locations. Effective date: July 1, 2018.
  • 2017 – The new Administration included state authorization as a target of deregulation.
  • 2018 – In May, the Department announced a two-year delay of regulation enforcement and a proposed plan for a negotiated rulemaking committee to amend the Federal regulations.

What about 2019-2020? What will we see? Negotiated rulemaking? Consensus? No consensus? A new regulation? Do you have a scorecard? Can we break out of the loop?

We are awaiting an announcement from the Department, which may provide more details about the delay and the proposed negotiated rulemaking. However, the May 9, 2018, announcement from the Office of Management and Budget (OMB) made the Department’s intention clear.

So, What Does This Actually Mean?

It means that institutions remain legally obligated to comply with states’ laws and regulations for their out-of-state activities; what has been delayed is the effective date of the Federal regulation that ties that state regulatory obligation to participation in title IV HEA programs.

We cannot stress enough that the institution is still under a regulatory obligation to the states in which the institution enrolls students, offers services, or participates in activities. The compliance obligation may be met by individual state compliance or through participation in the State Authorization Reciprocity Agreements (SARA) for SARA participating institutions, as provided in the SARA Manual.

What Do Institutions Need to Do Now?

  1. Know where your institution participates in out-of-state activities (online courses, experiential learning, marketing, recruiting, out-of-state faculty teaching online, face-to-face classes, brick-and-mortar locations, servers, etc.).
  2. Be compliant with the state laws and regulations of the states where the activities occur.
    • SARA participation provides uniform compliance for SARA participating institutions for many out-of-state activities in SARA participating states.
    • Activities outside of SARA or activities by institutions not participating in SARA may require compliance through the individual states.
  3. SARA participating institutions must follow the requirements acknowledged in the initial SARA application, renewal application, and SARA Manual. (The alternative is individual state applications, fees, reporting, and renewals to obtain individual state compliance.)
  4. For courses and programs leading to professional licensure, notify current and prospective students whether the course or program meets licensure board prerequisites in the state where the student is participating in the course or program.
    • Federal Regulations for Misrepresentation: 34 CFR 668.71 and 34 CFR 668.72.
    • SARA notification requirements: SARA Manual Section 5.2.
    • Liability mitigation/avoidance to the institution.
    • Moral obligation to the student by the institution.
  5. Stay tuned to WCET and SAN for the latest information about the next steps in rule making by the Department!

Being stuck in the Federal regulation loop, wondering whether title IV funds will or will not be tied to state authorization for out-of-state activities of the institution, is a challenge to explain at our institutions. However, we do have a message to share with our institutional leadership: the foundation of regulatory compliance for out-of-state activities of the institution is the state. Breaking the Federal loop is irrelevant to that message. Institutions cannot choose whether they wish to follow the laws and regulations of the states where they participate in activity; they must be compliant in the states where their activities occur. So, the message is simple: keep focused on state compliance, including state licensure boards, for the courses and programs offered out of state.

Meanwhile, unlike Bill Murray being caught in a continuous loop of 6 more weeks of winter, we are heading into summer and we will wait for the Department to give us our next dose of déjà vu!

– Cheryl Dowd, WCET

Cheryl Dowd
Director, State Authorization Network, WCET
cdowd@wiche.edu

Learn about WCET Creative Commons 4.0 License

Shifting Campus Culture through Mentoring

Today’s post is an important example of how a campus culture can impact student success. WCET is happy to share this post from Sarah Torres Lugo, Research Assistant with NCHEMS and the Foundation for Student Success. Sarah is here to discuss a Foundation for Student Success project that connects model (mentor) institutions with other institutions (mentees) that may need a campus culture shift to improve equity and equality of education on their campuses. The group is excited to share what they’ve learned from these mentor/mentee relationships.

Thank you, Sarah, for writing this great post for us today. We’re looking forward to following this project in the future!

~Lindsey Downs, WCET


What are the key levers for shifting campus cultures to eliminate the equity gap in postsecondary education? Can this culture shift result in an increase in overall student success? Through our continuing partnership with our mentor and mentee institutions, we at the Foundation for Student Success are working towards identifying and sharing critical elements of effective student success movements.

Foundation for Student Success Mentoring Project

Founded in 2016, the Foundation for Student Success (FSS), housed at the National Center for Higher Education Management Systems (NCHEMS) in Colorado, launched its first project in the Fall of 2016. The project began by identifying a small group of community colleges and public universities whose student success rates were higher than their input variables, such as Pell eligibility and high school grades of their students, predicted. The leaders at these institutions were interviewed and seven institutions were selected as mentors. Mentor leads were then identified by each mentor campus. FSS mentors include three Provosts, two Vice Presidents for Student Affairs, a Dean of Student Success, a Dean of Institutional Equity and Inclusion, an Executive Director of Academic Success and Equity Initiatives, a Vice Chancellor for Academic Programs and Services, an Assistant Provost, and a Dean of Student Development. The selected institutions have at least 25 percent American Indian, Black, and/or Latino students. The mentor institutions were then matched with three demographically similar mentee institutions—we refer to the group of one mentor institution and three mentee institutions as a pod. The mentee institutions are in very different places on their journey to cultivate the campus culture that best supports student success and this variation reflects campus realities across the nation.

The following map indicates the geographic distribution of the mentors (darker shade) and mentees (lighter shade).

[Map: Mentor states: RI, NC, FL, TX, CA. Mentee states: WA, NV, UT, AZ, NM, CO, OK, IL, MI, OH, KY, GA, VA, HY, MA.]

Once we matched mentor and mentee institutions, each of the seven mentors invited teams from their three mentee institutions to the mentor institution campus. The leads at each of the seven mentor institutions were in charge of creating the agenda and arranging the meeting so that the mentee teams would have an opportunity to hear from and speak with those that have been most involved in the mentor institution’s efforts to improve the way they serve students. All mentor campus visits took place between March 22, 2017 and April 28, 2017.

Sharing What We’re Learning

We have since facilitated and tracked interactions among the pods of institutions as they work together to reach their self-defined goals of changing their campus cultures in order to reduce equity gaps and increase overall student success. As a means to share lessons from mentor and mentee campuses, FSS is featuring a 2018 webinar series titled “Engaging in Tough Conversations Toward Equitable Student Success”. The series kicked off with a webinar about the realities of shifting student demographics. Two mentor and two mentee institutions shared their institutions’ journeys toward an understanding of the shifting student demographics on their own campuses, why these shifts matter, what steps they have taken and are planning to take in pursuit of equitable educational outcomes for their students, and what strategies have and have not worked. While institutions participating in the FSS project are learning from one another, we are also learning a tremendous amount. One of the aspects we are learning about, and are codifying, is the critical role data plays in cultivating a campus culture that promotes equitable educational outcomes.

San Jacinto College

During the “Shifting Student Demographics Matter— How to Start the Campus-Wide Conversation” webinar, we heard from Van Wigginton, San Jacinto College Provost, and Shelley Rinehart, Dean of Student Development at San Jacinto’s Central Campus, about the transformative change their institution experienced when the institution’s focus on enrollment data was shifted to a commitment to student success and completion that involved every employee. That change in focus necessitated the disaggregation of data and has brought the institution into the space of using predictive analytics to identify and eliminate barriers to student success. Without doubt, underlying this shift in focus and commitment is the recognition that there is a shared responsibility for student success between students and all institutional actors.

Community College of Aurora

We heard Quill Phillips, Special Assistant to the President for Inclusive Excellence at the Community College of Aurora (CCA), describe CCA’s work toward shifting the campus culture using an inclusive excellence framework and equity mindset. Quill spoke about the fascinating journey CCA embarked on in 2013 with the Center for Urban Education at the University of Southern California. CCA’s math department piloted the Equity in Excellence action research process, during which the department looked at its student success data disaggregated by race/ethnicity and gender. This process enabled the math department to identify disparities, raise awareness of achievement gaps, and motivate individuals and units to seek strategies and change practices in order to better serve their students. Core to the process is the idea that the solution to the problem “lies within the institution—in its culture and in the beliefs and values that influence the expectations and practices of individuals” (Massey et al. 2005, p.40). We have similarly heard from all of our mentors, repeatedly, that there is no silver bullet or secret sauce behind the reduction of equity gaps.

With its efforts to create a sustainable structure supported by various intentional steps, CCA demonstrates an understanding that no single change in practice or tool will bring about the desired outcomes. Such steps include intensifying the Center for Urban Education work through the Equity in Instruction Leadership Academy. Through the cohort-based academy, full-time faculty are taken through the entire Center for Urban Education protocol, which includes delving into the data of their own course with the largest performance gaps and then looking into areas where improvements can be made, such as the syllabus and the instructor’s implicit biases that may impede a student’s learning process. Given the positive results that CCA has had with the Equity in Instruction Leadership Academy and the common exclusion of adjunct faculty from professional development programs, it is encouraging to hear that CCA will be extending participation to adjunct instructors next semester.

University of South Florida

Paul Dosal, University of South Florida’s (USF) Vice President for Student Affairs and Student Success, shared the story of their student success path, which took off in 2009. To address the student success challenges that USF faced, the Student Success Task Force was launched. This task force was intentionally composed of 100 individuals representing all units in order to ensure university-wide contributions to the student success movement. The task force was divided into eight core groups, and in 2010 it produced a lengthy, prioritized set of actionable recommendations that have served as a blueprint for USF’s student success movement. One of the three fundamental reforms proposed was building USF’s research capacity to support student success initiatives. In expanding that research capacity, USF identified a need for a guiding body to use data and determine and implement needed policy and process changes. As a result, USF formed the Persistence Committee to focus attention on the performance of all students. USF uses predictive analytics to identify struggling students and those who could benefit from correctly timed interventions. The committee is composed of approximately two dozen staff from across the institution who work together to adjust practices in order to eliminate barriers to student success, which are identified through USF’s continuously improved, and truly remarkable, analytics-driven case management system.

Southern Connecticut State University

Southern Connecticut State University’s (SCSU) Associate Vice President for Enrollment Management discussed SCSU’s collection of data through focus groups with students, faculty, and other stakeholder groups, as well as through the President’s Commission on Social Justice Survey, to broaden their work to include more intentional strategies to improve campus culture and to prepare for continuing changes in student demographics. Some of the insights from the data-gathering process were surprising to some and served as the impetus for developing intentional strategies to address well-established cultural norms that are not reflective of SCSU’s commitment to social justice. Efforts to expand access to and usage of student success data across campus are underway, and data are being used to foster strategic partnerships with high school and community college partners. SCSU will host a Regional Summit in the fall of 2018, bringing together high school partners, community college partners, school districts, government officials, business leaders, community leaders, Department of Labor representatives, and experts on racial issues to develop an action strategy plan to support more students of color in the community who aspire to earn a Bachelor’s degree. Underlying this convening is an understanding that the degree attainment agenda is a joint imperative to ensure the vitality of their citizens and their economy. The use of credible data will certainly help create the impetus needed to develop the action strategy plan.

The Future

NCHEMS staff will continue to facilitate and track interactions among the pods of institutions as they work together to reach their self-defined goals towards reducing equity gaps on their campus. We will continue to share lessons being learned through three more webinars, summaries of interviews with mentee institution leaders, and white papers.

Learn more about FSS and check out our Mentor Case Studies, information about the webinar series, a video recording of the webinar discussed above, a link to register for the May 23, 2018 “Who Owns Student Success on Your Campus?” webinar, and more.

Sarah Torres Lugo
Research Assistant, NCHEMS
Foundation for Student Success

 

 

References

Bauman, G.L., Tomas Bustillos, L., Bensimon, E.M., Brown, M.C., & Bartee, R.D. (2005). Achieving Equitable Educational Outcomes with All Students: The Institution’s Roles and Responsibilities. Association of American Colleges and Universities. https://www.aacu.org/sites/default/files/files/mei/bauman_et_al.pdf

Sandoval-Lucero, E., White, T. D., Haynes, D. E., Phillips, Q., Brame, J. D., & Sturtevant James, K. A. (2017). Engaging inclusive excellence: Creating a college with an equity mindset. In L. Leavitt, S. Wisdom, & K. Leavitt (Eds.), Cultural Awareness and Competency Development in Higher Education (pp. 40-60). Hershey, PA: IGI Global.

Learn about WCET Creative Commons 4.0 License

 

Rio Salado: Innovation Pushes the Boundaries of Tradition

Today here on WCET Frontiers we are happy to welcome Stacey VanderHeiden Güney, the Director of the Digital Learning Solution Network. Stacey is here to discuss a recent study of higher education institutions implementing digital learning and follow-up conversations with one of the institutions included in the study.

Thank you to Stacey for today’s post and a special thanks to the representatives from Rio Salado College, who held a wonderful conversation with us in preparation for this post.

Enjoy the read and enjoy your day,

– Lindsey Downs, WCET


Recently, Rio Salado College was one of six institutions that participated in a Bill & Melinda Gates Foundation-funded study entitled “Making Digital Learning Work: Success Strategies from Six Leading Universities and Community Colleges.” The study “identified an initial list of approximately 50 candidate institutions cited as exemplars in the implementation of digital learning.” (Please note this is the only place in the report where the word exemplar is used.) There were some additional criteria in terms of size, scale, target population, and graduation rates. Rio Salado College (Rio) was chosen because of its unique model of serving its students.

A recent article on the e-Literate blog questioned Rio’s track record in terms of aggregate academic student outcomes and how “appropriate it is to include them as an exemplar in such a case study-based report.” It concluded that “at best, this is a school with mixed results that should not simply be labeled a success without caveats or explanations” and questioned “whether it is appropriate to hold up a school with some of the lowest student outcomes measures in the country as an exemplar.”

Innovation is sometimes difficult to measure by traditional standards. In the case of Rio, the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) Graduation Rate data represent only 0.4% of the college’s fall semester student population. With that in mind, we wanted to take the opportunity to speak with members of Rio Salado College so they could provide some context. As Paul Harvey used to say, “the rest of the story” may be useful in understanding the unique and innovative model that Rio has created.

With the exception of this introduction and the Summary section, the following sections are responses to questions in an interview with Rio personnel as well as information that was provided to me via email. Participating in the interview were:

  • Angela Felix, Faculty Chair for Languages, Faculty Senate President.
  • Janelle Elias, Dean of Institutional Effectiveness and Innovation.
  • Zach Lewis, Analyst in Office of Institutional Research.
  • Kate Smith, Vice President of Academic Affairs.

WCET is pleased to give Rio Salado College this forum to explain their model and their outcomes.

What makes Rio unique? (What does Rio do and how do they do it?)

Our mission

Part of the uniqueness stems from the origin of the college. We were created to be the college without walls, bringing education to both the underserved and the unserved. The history of Rio is one of pushing the boundaries of traditional education and traditional metrics. It’s incumbent upon Rio to be ready to answer scrutiny because it’s a different model.

Rio was built to reach populations that not all institutions can reach. We work hard to make education accessible to all students: adults, military, incarcerated, high school, business, international, and more. We’ve grown throughout the years, from providing education in storefronts in Arizona to being an early provider of online education. We have developed unique systems for how we deliver education to meet student needs.

Our master course model

We use a “one course, many sections” model that allows us to have quality assurance and provides us with data to inform what we do to better educate our students. Content experts work to provide standardized coursework with depth and focus on student learning and engagement as the number one priority. We are able to engage in different ways with students because our models allow for more time to do so.

Our rolling schedule with 48 start dates

A flexible schedule is key to the uniqueness of the school and is a fundamental part of the model. We do not see this type of model in public community college environments. Students enroll in a similar section but can choose from more than 48 start dates a year. In a recent focus group (conducted by a consultant), students said that because of the one-on-one focus from instructors, they feel connected, not isolated, despite the different start dates.

Our proprietary LMS

We also have our own proprietary Learning Management System (LMS). Many faculty, who were previously face-to-face teachers (and skeptical of teaching online), felt more connection to their students in this model (even when compared to their face-to-face experiences). There are personalized options for calendaring and personalized outreach.

Describe the students that Rio serves

Rio tends to serve non-traditional student populations. Students served by the unique role of Rio Salado College include active military, veterans and their families, adult re-entry students, high school students, incarcerated re-entry students, international students, lifelong learners, transfer students, university students needing additional coursework, and business, community and government partners. The median age is 29. Rio is successful at serving a traditionally underserved or UNSERVED population of students.

With that in mind, do you think that traditional metrics (such as retention and graduation rates) are the best measures of success?

Rio values traditional metrics. These hold us accountable and are used across the industry. We use those metrics to reflect on how Rio compares to others in education. The issue is not with the metrics themselves, but with the populations that are included in the methodologies. Specifically, with IPEDS, the cohort is so restrictive, and our student population is so diverse, that it does not capture a representative sample of students. The students included in the IPEDS metric make up less than 1% of our student population. Highlighting this lack of representation, only 113 students (0.4% of the fall semester student population) are measured by the IPEDS Graduation Rate metric. As such, normal fluctuations of even a few students can have a seemingly large impact on our IPEDS Graduation Rate percentage. Figure 1 presents a breakdown of the student populations served by Rio Salado.

Figure 1. Student population served vs. student population measured by IPEDS: total unduplicated headcount 52,881; unduplicated credit-seeking headcount 46,497; part-time 41,339; program-seeking 11,266; full-time 5,158; IPEDS cohort measured 113.

Specifically, IPEDS data are not reflective of Rio Salado College’s flexible schedule. When IPEDS data are captured on the 45th day of the fall semester, many Rio Salado students have not yet enrolled because the institution starts new courses nearly every Monday and fall enrollment continues with start dates through the first week in December. A more accurate count of students actually enrolled for the fall semester would be accomplished by capturing enrollment data from the August through December start dates. IPEDS does not allow for this rolling calendar to be factored into its calculation. This single nuance in the institution’s calendar makes it difficult for Rio Salado College to identify peer institutions for benchmarking purposes.

The broadened IPEDS cohorts (including first-time full-time, first-time part-time, non-first-time full-time, and non-first-time part-time) are more inclusive of Rio Salado College’s student population. However, because the data collection is based on the fall census date (45th-day snapshot), the metric still represents only about 43% of the college’s student population.
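To make the coverage arithmetic concrete, here is a minimal sketch in Python using the headcounts from Figure 1. The implied fall-semester headcount is an inference from the 0.4% figure quoted above, not a number reported by the college.

```python
# Headcounts from Figure 1 (Rio Salado, as reported in this post).
populations = {
    "Total unduplicated headcount": 52_881,
    "Credit-seeking": 46_497,
    "Part-time": 41_339,
    "Program-seeking": 11_266,
    "Full-time": 5_158,
    "IPEDS graduation-rate cohort": 113,
}

annual_total = populations["Total unduplicated headcount"]
for label, count in populations.items():
    print(f"{label}: {count:,} ({count / annual_total:.1%} of annual headcount)")

# The post says the 113-student IPEDS cohort is 0.4% of the *fall semester*
# population, which implies a fall headcount of roughly 113 / 0.004.
print(f"Implied fall headcount: {113 / 0.004:,.0f}")  # ~28,250
```

Measured against the annual unduplicated headcount, the IPEDS cohort covers only about 0.2% of students, which is why, as noted above, a fluctuation of even a few students can visibly move the reported graduation rate.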

Rio is passionate about this known discrepancy between what IPEDS measures, who Rio serves, and how it serves students with a flexible curriculum model, and therefore works to measure student intention to determine how well it is meeting its mission.

We are heavily engaged in understanding our students’ intentions very clearly, so we can offer targeted support to help them meet their specific needs. Success means something very different to each student, based on their intent. For example, we have many students who are at a local university and register at Rio to take only two courses from us. We need to know what they want to do and then have a metric that allows us to assess whether they were successful. Helping these students be successful allows Rio to be a very important support for our community.

In FY 2016-17, 76% of students enrolled at the institution self-reported an intention other than completing an associate degree or certificate. Furthermore, 23% of non-degree-seeking students at the institution reported themselves to be high school students enrolled in a dual enrollment program, 33% (11,628) indicated earning transferable credit as the key reason to enroll at Rio Salado College, and 6% of students reported their intention as completing a certificate.

What is Rio’s academic model and how does it differ from other institutions?

Each course is designed by a subject matter expert who meets the rigorous hiring qualifications established by the Higher Learning Commission, Rio Salado College’s accreditation body. Course lesson pages integrate reading assignments, carefully curated multimedia resources, and often interactive assignments. In other words, the lesson pages are designed to be both standardized and yet also interactive, providing a substantive learning experience for the student.

All teaching faculty are hired with the primary purpose of facilitating student learning and student engagement. Because of the “one course, many sections” model, primary content and assessments have already been developed, which enables the faculty to focus on supporting the learning needs of each individual student and providing content expertise to expand the learning opportunities and engagement. The faculty member assigned to teach a particular section is responsible for creating content for the “From Your Instructor” section to further expand upon and deepen the lesson content by offering helpful tips or insights, asking thought-provoking questions, posting discipline-specific current events, or providing additional resources.

Faculty are expected to provide timely, substantive feedback on all assignments. Students are required to submit weekly assignments. Faculty feedback must be given within 48 hours of submission and must identify areas of strength and weakness in order to help guide the student to further learning. This feedback is provided in the form of annotated grading or summary feedback on each assignment, and rubric dimensions are to receive specific mention. Our learning management system provides automated alerts, calling the faculty member’s attention to students who are late with assignments or who have been inactive in the course for a prolonged period, so that the faculty can conduct targeted outreach and offer assistance to the student. Departments also require weekly roster management, a procedure by which faculty review each student’s progress and reach out to students who are struggling.
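Rio’s LMS is proprietary and its internals are not public, so the following Python sketch is only an illustration of the kind of rule such automated alerts encode; the thresholds and field names here are hypothetical.

```python
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical thresholds for this sketch; Rio's actual values are not public.
INACTIVITY_LIMIT = timedelta(days=7)
LATE_LIMIT = timedelta(days=2)

def needs_outreach(last_activity: datetime,
                   assignment_due: datetime,
                   submitted_at: Optional[datetime],
                   now: datetime) -> bool:
    """Flag a student for faculty outreach when they have been inactive for a
    prolonged period or an assignment is past due with no submission."""
    inactive = (now - last_activity) > INACTIVITY_LIMIT
    late = submitted_at is None and (now - assignment_due) > LATE_LIMIT
    return inactive or late
```

A rule like this only surfaces candidates; per the description above, the targeted outreach itself is still done by the faculty member.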

The implementation of our asynchronous, “one course, many sections” model not only results in faculty supporting each individual student along his or her path, but also allows the institution to assess student learning across all sections of a course and implement instructional interventions at scale. This approach allows us to align content with assessment to ensure students achieve course competencies and program goals. At a program/college level, we can tag assessments at interdisciplinary levels to see how students are performing on college-wide learning outcomes. That’s very unique. We can see how students are performing on a wide variety of skills across the entire college. We can compare “apples to apples” because the assessment is the same across all sections. We recently received the Excellence in Assessment award from the National Institute for Learning Outcomes Assessment.
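Because the assessment is identical across sections, outcome-level reporting reduces to grouping scores by tag. The record layout below is an assumption made for illustration; Rio’s actual data model is not described in this post.

```python
from collections import defaultdict

# Toy records: (section, college-wide outcome tag, score in percent).
records = [
    ("ENG101-01", "written-communication", 88),
    ("ENG101-02", "written-communication", 91),
    ("BIO100-01", "quantitative-reasoning", 76),
    ("MAT120-03", "quantitative-reasoning", 81),
]

by_outcome = defaultdict(list)
for _section, tag, score in records:
    by_outcome[tag].append(score)

# Identical assessments across sections make these averages an
# "apples to apples" view of college-wide learning outcomes.
for tag, scores in sorted(by_outcome.items()):
    print(f"{tag}: {sum(scores) / len(scores):.1f}% (n={len(scores)})")
```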

Rio also has the ability to drill down and capture performance metrics across disaggregated student populations in pursuit of closing identified achievement gaps. The online learning space—or indeed, any other learning space—requires continuous review and scrutiny of one’s practices to increase student learning. Rio’s commitment to relentless improvement reflects our awareness of the dynamic nature of education.

Student-to-student interaction varies by discipline. In English, students complete peer-to-peer editing. In other courses they have group projects. In some courses, students converse using video conferencing. It may not always be synchronous – it could be a recording that another student responds to. There is not always a discussion board; faculty use a variety of methods that are specific to the content area. In a recent focus group, students felt discussion boards were not necessarily helpful in courses. Rather, we focus on the tools necessary to effect learning. Some students may use discussion boards effectively, but other classes may need a different tool (discipline-specific, driven by learning outcomes). We try to tailor the tools we use to our students.

What are outcomes for your students?

We are looking at metrics that help us showcase success as it corresponds to our students’ intent. Because IPEDS is so unrepresentative, Rio Salado College favors the Voluntary Framework of Accountability (VFA) published by the American Association of Community Colleges (AACC). It provides the college with the best tool for benchmarking against peer institutions, even if those institutions are not online and do not follow a rolling calendar. The benchmarking cohort for Rio Salado College includes 210 community colleges (noted as “peer institutions” below) that participate in the VFA across the country. Through the VFA metrics (2016 outcomes of the 2014 cohort), Rio Salado College has learned:

  • Course Completion
    • 1st-semester course success rate is higher than peer institutions: Rio 94%; Peer Institutions 84% (first-semester credential-seeking population only).
    • After two years, full-time students at Rio complete credits at a higher rate than peer institutions: Rio 79%; Peer Institutions 74%.
  • Degree / Certification Completion
    • The 2-year graduation rate is higher than peer institutions: Rio 13%; Peer Institutions 10%.
    • The credential-seeking cohort far outperformed peer institutions in completion: Rio 42%; Peer Institutions 16%.
  • Helping Underserved Populations
    • Rio helps more developmental education students (not college-ready) complete certificates or degrees in two years than its peer institutions: Rio 11%; Peer Institutions 7%.
    • Rio students in the following demographic groups complete certificates or degrees at the two-year mark at a higher rate than at peer institutions (all rates are higher, with greater variance, when looking at the credential-seeking cohort only):
      • American Indian/Alaskan students: Rio 15%; Peer Institutions 7%.
      • Black students: Rio 11%; Peer Institutions 6%.
      • Hispanic students: Rio 20%; Peer Institutions 8%.
      • White students: Rio 12%; Peer Institutions 12%.
    • While eliminating equity gaps is of utmost importance, it is important to note that two-year transfer rates for minority students track well above peer institutions:
      • American Indian/Alaskan students: Rio 25.4%; Peer Institutions 12.8%.
      • Black students: Rio 28.0%; Peer Institutions 20.3%.
      • Hispanic students: Rio 28.1%; Peer Institutions 13.4%.
      • White students: Rio 32.3%; Peer Institutions 15.3%.

What areas does Rio excel at?

  • Being a certificate-granting institution that supports students heading into the workforce (3,620 certificates awarded last year). Rio helps students achieve a career while they are working.
  • Rio has a robust model with our business partnerships, supporting employers with education, training, and professional development.
  • Rio has increasing numbers of high school students participating in dual enrollment (over 7,000 students!). This year, 103 high school students will participate in commencement at Rio before they have graduated from high school!
  • Transfer rates are quite high.
  • Students are doing quite well after their transfer, particularly to Arizona State University.

What areas remain opportunities for growth?

  • Capturing student intention and measuring intention-to-completion. This is already part of our strategic planning work.
  • Continue the equity and assessment work.
  • Help community colleges tell their stories. Showcase value of community colleges, especially at the federal level.
  • Guided pathways work – particularly for part-time students. We need to know students’ plans when they first start, so we can tailor their academic and student affairs experiences to ensure we are providing the resources students need throughout their educational journey.
  • Explore emerging technologies. For example, using virtual reality glasses and how that can impact learning.

Summary

In summary, Rio Salado serves a diverse and non-traditional student population, while delivering exceptional educational experiences via distance education, as well as other innovative instructional modalities. Although these factors would traditionally contribute to lower success measures at other institutions, Rio’s outcomes are on par with or better than its peers while delivering instruction at a lower cost per student.

Rio Salado College continues its work to demonstrate a commitment to persistence and completion, and the college welcomes others to reach out and learn more about its mission, its students, its challenges, and its solutions. We at the Digital Learning Solution Network and WCET encourage other institutions to push beyond the constraints of traditional outcomes and explore innovative ways to serve their students.

WCET is pleased to be the Backbone Organization supporting the Digital Learning Solution Network (DLSN). The DLSN was created to design a network-centric approach to strengthen digital learning in postsecondary institutions with a particular focus on improved outcomes for vulnerable populations (it is funded through the Bill and Melinda Gates Foundation). The DLSN seeks to foster discussions around innovation.

Stacey VanderHeiden Güney
Director, Digital Learning Solution Network
sguney@wiche.edu

Learn about WCET Creative Commons 4.0 License

 

Opening a New Path to Success – A Journey with Open Textbooks

Z Degrees (Zero-Textbook-Cost Degrees) are what many consider the holy grail of Open Educational Resources (OER) accomplishments. Today’s guest blogger, Tanya Grosz, Ph.D., Dean of Graduate, Online & Adult Learning, led the open initiative at the University of Northwestern St. Paul—the first institution in Minnesota to create a Z Degree. Tanya and I met at an Open Textbook Network meeting in 2014, and we were happy to discover our similarities—English faculty, online learning geeks, interested in student achievement, and we have the same first name! I’m so proud of her leadership and success. Read all about Dr. Grosz’s journey from adopting her first open textbook to achieving a Z Degree through more than 50 open textbook adoptions. Thank you Tanya for today’s post!

-Tanya Spilovoy, Director of Open Policy, WCET


It Started with a Question

Back in 2011, as a keynote speaker for the MN eLearning Summit, Dr. Cable Green, Director of Open Education at Creative Commons (the organization behind the legal licenses used for open educational resources), asked a question that stopped me in my tracks: “How are your students supposed to learn with books they can’t afford and are not buying?” As a long-time English teacher who was frequently frustrated about needing to buy yet another updated edition of a pricey literature anthology that had undergone only minor changes, the question resonated deeply with me. Green went on to offer a different way forward: learning resources licensed with a Creative Commons license, which enables them to be reused, revised, remixed, redistributed, and retained (the “Five R’s of Open,” according to Lumen Learning’s David Wiley).

Shortly after hearing Green’s compelling question, I was introduced to Dr. David Ernst, the founder of the Open Textbook Network and the Open Textbook Library and the Chief Information Officer for the College of Education and Human Development at the University of Minnesota, Twin Cities. He was working under a grant from the Hewlett Foundation to promote the adoption of open textbooks through facilitated faculty workshops. When I asked if he would come to the University of Northwestern – St. Paul, a faith-based liberal arts university not far from where Ernst worked, he agreed. Northwestern soon became one of eight founding members of the Open Textbook Network, an alliance of now over 600 campuses dedicated to promoting access, affordability, and student success through the use of open textbooks.

A Shocking Financial Landscape

The financial landscape for students is fairly shocking: the average student owes approximately $30,000 in student loans. Funding for higher education keeps declining while tuition costs keep rising, and textbook prices have risen at four times the rate of inflation. Mark Perry, Finance and Business Economics professor at the University of Michigan-Flint, suggests that college textbook prices have risen over 945% since 1978.
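As a quick arithmetic aside, a 945% increase means prices have been multiplied by roughly 10.45. The 1978 starting price below is purely illustrative, not a figure from Perry’s analysis.

```python
increase = 9.45                # a 945% increase over the 1978 price
multiplier = 1 + increase      # prices multiplied by ~10.45
price_1978 = 25.00             # hypothetical 1978 textbook price, for illustration

print(f"Multiplier since 1978: {multiplier:.2f}x")
print(f"A ${price_1978:.0f} textbook in 1978 would cost about "
      f"${price_1978 * multiplier:.0f} today")  # ~$261
```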

Sara Goldrick-Rab, in her book Paying the Price: College Costs, Financial Aid, and the Betrayal of the American Dream, provides an insightful critique of federal financial aid and the problems inherent in the outdated formula that determines institutional allocations. She demonstrates that the federal needs analysis which governs financial aid eligibility is hopelessly outdated because the financial situations of low-income students and their families are not accurately represented. These students often provide support to their families instead of the other way around. Furthermore, while tuition and fees are labeled “direct educational expenses,” everything else, such as food, rent, gas money, and textbooks, is labeled “indirect” and “noneducational.” Goldrick-Rab suggests that “Paying tuition allows students to go to class, but they will fail if they have no books, no pencils, no gas money to get to school, and no food in their stomachs.”

The Exciting OER Solution

These and other financial realities were presented to University of Northwestern – St. Paul administration and faculty with one exciting solution: Open Educational Resources (OER). OER include any type of educational materials that are in the public domain or released with an open license. The nature of these open materials means that anyone can legally and freely copy, use, adapt, and re-share them. Open textbooks are a specific type of OER—they are textbooks licensed under an open copyright license and made available online to be freely used by students, teachers, and members of the public. Many open textbooks are distributed in print, e-book, or audio formats that may be downloaded or purchased at little or no cost.

The Open Textbook Network (OTN) came to our campus twice and facilitated faculty workshops that provided an overview of the obstacles that high textbook prices present for students and how open textbooks offer a desirable alternative. They offered a $200 stipend for qualified faculty to review an open textbook in the Open Textbook Library. As the dean over online learning and our instructional design team, I was able to allocate instructional design support for faculty interested in adopting or adapting an open textbook or curating open educational resources. If faculty adopted OER, our instructional designers would support them from a technological standpoint, freeing the faculty member to be the Subject Matter Expert (SME) without having to worry about the technology behind adopting or adapting. When a fellow dean and Chemistry professor announced our first open textbook adoption in fall 2015, the class spontaneously applauded after he told them there would be zero textbook costs for their Chemistry book that semester.

Northwestern’s Three-Pronged Approach

Northwestern’s open initiative wasn’t centered around open textbooks only. We developed a three-pronged strategy: first, we had to raise awareness about student debt and the current financial realities. Second, we wanted to partner well with our rock-star librarians to ensure that we were fully utilizing already-purchased library materials when designing and revising courses. Finally, we wanted faculty to adopt (and adapt) open textbooks. The three-pronged approach made our open initiative more inclusive, and as a result, it gained traction fairly quickly. The support of our instructional design team, I later found out, is fairly unique. I believe it’s one of the reasons that our open initiative has been so successful; faculty members are more likely to adopt and adapt open textbooks when they have the technology support to do so. Also, as the instructional design team grew in their knowledge of OER and where to find them, and as I garnered valuable resources from the Open Textbook Network, we began reaching out more strategically and proactively to professors about available open textbooks when a course was due to be revised, when new courses were being written, or when particularly interesting open textbooks became available.

In direct response to our growing open textbook initiative, our online course launch meetings evolved. When facilitating new course development, we now have the SME (professor), an instructional designer, and a librarian present to ensure that we are choosing the best possible resources for the course. Our faculty are finding that open textbooks truly are removing barriers to learning, and they love the fact that the open textbook is available to students immediately: students don’t have to wait for their paycheck or for financial aid to hit their accounts to access their texts. Our Online Learning Office (which houses our awesome instructional design team) has become the “home” of our open initiative, and they are building a web presence to support open adoptions.

Adding the Student Voice

We knew that we were lacking student voices within our open initiative. Taking a cue from the Open Textbook Network, we created a video of our students talking about how high textbook prices had impacted them, which then panned to the Chemistry students talking about what they thought of their open Chemistry book. This spring, we added a video of faculty who have adopted open textbooks speaking about their perceptions of quality and student impact. Having students and faculty speak about open textbooks themselves is powerful, and it’s helpful to keep reminding our community about our open initiative. Around the same time, the Student PIRGs came out with a compelling report about rising textbook costs, how those costs are negatively impacting students, and how open textbooks provide a solution, Covering the Cost: Why We Can No Longer Afford to Ignore High Textbook Prices, which I shared widely across our campus.

Minnesota’s Z Degree

Our Zero-Textbook-Cost Degree, or Z degree, the first Z degree in Minnesota, arose quite naturally out of our growing open initiative; we learned about Tidewater Community College’s Z degree and asked, “Why couldn’t we create a Z degree?” We chose our adult undergraduate Business Management degree, had a supportive program manager collaborate with our SMEs and instructional designers, and then worked one by one through the core courses in the program to ensure there were high-quality OER available that met the objectives of each course; when there weren’t, we looked to our library for resource support. Our open initiative has made the course design process more iterative, and I love the fact that our designers can help our professors “chunk out” resources throughout the course, placing digital chapters where they will be read within the course. The idea of openness has permeated our thinking about new projects; sharing has become our new normal. President of University of Northwestern – St. Paul Dr. Alan Cureton says, “Through the open textbook initiative, Dr. Grosz and her team have introduced an exciting innovation that benefits our students by making education more accessible and affordable. I’m proud of our Z degree, our many open textbook adoptions, and our faculty for embracing this significant educational movement toward openness.”

Concerns Surrounding Open Initiatives

Everything wasn’t always smooth sailing; certainly, there were skeptics. The two biggest concerns voiced by faculty were 1) the quality of open textbooks and 2) the curtailing of academic freedom.

The Open Textbook Network is clear in its coaching that it’s not my job to speak to the quality of open textbooks. Instead, I leave that to the SMEs who review them. I pointed skeptical faculty to the Open Textbook Library to review quality for themselves and shared a compelling study demonstrating that students in more than half of the courses using open textbooks did better according to at least one academic measure, and students in 93% of these courses did at least as well by all of the measures. Studies measuring the academic impact of open resources are proliferating, and they all point to the fact that students using open textbooks do as well as or better than their peers using traditional textbooks.

Regarding academic freedom, I was able to speak as a faculty member and long-time teacher myself: I found it incredibly empowering that open textbooks allow me to actually augment and adapt content, thereby introducing a continuous improvement loop that certainly does not exist with traditional textbooks. In addition, it is ultimately the decision of the faculty member and academic department as to what course materials are chosen. Open textbooks provide faculty another choice, so as opposed to limiting academic freedom, I argue that they expand and augment it. As with any other disruptive innovation, I have welcomed the faculty who are eager to embrace open textbooks and then encouraged them to become champions of open, celebrating victories such as our 50th open textbook adoption during Open Education Week.

In addition to ensuring that I listened well to faculty concerns, I faced a campus bookstore that was understandably skeptical of the open textbook initiative, concerned about the potential loss of revenue. In retrospect, I wish that I had partnered with the bookstore prior to making any faculty presentations to help ensure understanding and support early on. David Wiley writes that campus bookstores actually don’t make very much on textbook sales, suggesting that they consider adding print-on-demand centers. At one OTN Summit, someone suggested open proponents invite their campus bookstore to the Textbook Affordability Conference. On some campuses, bookstores have partnered effectively with open proponents, and that remains my hope as well. Regular communication and sharing of positive student impact should help us make progress towards achieving that end. Regarding impact, since 2015, our open initiative has saved students $275,000, and if we maintain our current course of average adoptions per semester, we will have saved students approximately $742,000 through 2023.

[Graph: total student savings, from $20,000 in fall 2015 to $275,000 by spring 2018.]
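As a back-of-the-envelope check on that projection, here is a minimal sketch; the two-semesters-per-year assumption and the constant per-semester rate are inferred from the figures above, not taken from Northwestern’s adoption records.

```python
# Reported: $275,000 saved over the six semesters from fall 2015
# through spring 2018 (per the savings graph above).
saved_so_far = 275_000
semesters_elapsed = 6                      # assumption: two semesters per year

rate = saved_so_far / semesters_elapsed    # ~$45,800 per semester

# Fall 2018 through spring 2023 adds ten more semesters at the same rate.
future_semesters = 10
projected_total = saved_so_far + rate * future_semesters

print(f"Average savings per semester: ${rate:,.0f}")
print(f"Projected total through 2023: ${projected_total:,.0f}")  # ~$733,000
```

That simple constant-rate extrapolation lands within roughly 1% of the approximately $742,000 figure cited above, so the projection is consistent with maintaining the current average pace of adoptions.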

Sharing Success

The impact has been far more than just financial, however; I found that I had opportunities to talk about open textbooks and our success openly, such as on a local television morning show, at various presentations to local university libraries, through presentations with Dave Ernst at EDUCAUSE Learning Initiative’s Annual Meeting and the Higher Learning Commission Annual Conference, among others. I embraced and continue to say yes to these opportunities, not because they are in my job description or because I have time on my hands, but because I believe that open is a social justice issue. According to Jhangiani and DeRosa, “When faculty use OER, we aren’t just saving a student money on textbooks: we are directly impacting that student’s ability to enroll in, persist through, and successfully complete a course.” And my interest in open has blossomed into an interest in the myriad other obstacles facing college students, from childcare costs to food insecurity. As educators, we want to make a difference, and OER have provided me an opportunity to do just that—to tell other professors, librarians, and administrators that open increases access to learning. According to Dave Ernst, creator of the Open Textbook Network, “The University of Northwestern St. Paul’s open textbook program is a great example of the impact an institution can make when they are persistent, strategic, and supportive of their faculty.”

Here’s to The Future

Where we go from here is an interesting question. Certainly, we want more Z degrees. And I’m planning to co-fund the writing and publication of our first open textbook with our library during the 2018-2019 academic year. But even more compelling, I believe, is the idea of moving toward open pedagogy so that our students can be empowered as participants in the construction of knowledge along with their professors. Becoming open content producers and curators themselves will engage our students even more fully in an active learning environment re-envisioned to be innately more impactful. DeRosa and Robison suggest that, “When we think about OER as something we do rather than something we find/adopt/acquire, we begin to tap their full potential for learning.” Slowly but surely, we are embracing an educational future that reduces barriers and increases access to learners; with OER, we are opening up new pathways to success.

Tanya Grosz
Dean, College of Graduate, Online & Adult Learning
Assistant Professor of English
University of Northwestern – St. Paul

Learn about WCET Creative Commons 4.0 License

 

WCET Summit: Ensuring Ethical and Equitable Access in Digital Learning

In June 2017, I had the outstanding opportunity to attend my first WCET Leadership Summit. Last year’s event focused on the essential institutional capacities needed to encourage and lead innovation. I specifically remember feeling so invigorated by every session I attended and I truly enjoyed the hallway conversations and the incredibly active social media backchannel. I can’t believe it’s been almost a year!

If you’re experiencing a huge case of FOMO (or are just being super nostalgic like me), I’ve got great news for you! #WCETSummit is almost here again! This time we’re meeting in a NEW location (Newport Beach, CA!) and we’re chatting about several new topics. Luckily, Megan Raymond, WCET’s Assistant Director of Programs and Sponsorship, is here to tell us all about this year’s Leadership Summit. Thanks for the great post today, Megan!

I do hope you’ll join us in these important conversations! There’s still time (until May 4) to register at the early bird rate! See you in sunny California in a few months!

~Lindsey Downs, WCET


WCET’s Summits are a unique blend of expert panels and interactive group discussions around key topics identified by WCET’s leadership committees to be timely and important to higher education institutions using technology to enhance teaching and learning. Previous Summit topics include 21st-century credentials, adaptive learning, data analytics, and developing institutional capacity to lead innovation.

ad for wcet summit : reads WCET Leadership Summit Ensuring Ethical and Equitable Access in Digital Learning, June 5-6, 2018 Newport Beach, CA

The 2018 Summit in Newport Beach, CA on June 5-6 will weave some of these threads as well as new ones into a facilitated discourse surrounding these three key topics:

  1. Equity as a demonstrated priority for the institutions’ students, faculty, and staff.
  2. Accessibility as the lens through which the institution examines its resources, policies, services, and infrastructure.
  3. Data and evidence-based decision making for student success and ethical questions underlying analytics engines and EdTech products.

These topics, and the discussions that frame them, will help institutions consider their approaches and emerging strategies to ensure ethical and equitable access in digital learning. The primary goals of the Summit are to create a venue for honest and open conversation, learn about exemplar successes and disappointing failures, and prepare institutions to serve all students effectively, ethically, and equitably. The list of speakers participating in the Summit showcases a range of institutions and experiences relative to equity, accessibility, and ethical use of data analytics. Their expertise and passion for student success will challenge and inspire attendees to bring promising practices back to their institutions.

View the preliminary program for Summit speakers and topics, including:

  • Inclusion in Higher Education: Beyond a Promise to Action.
  • Ethical and Effective Uses of Student Data (by the Institution, the Faculty, and the Students).
  • Moving Towards a Campus Climate of Universal Access for All.
  • Building a Campus Culture for Inclusion.
  • Freshman Class of 2022. What Will Your Institution Do to Best Serve These Students? How Do Your Faculty Utilize All of the Digital Learning Resources that May Improve Student Engagement and Learning?

palm trees framed in front of a blue sky with light fluffy clouds

The Summit is open to all; the diversity of institutions, job titles, and experience enhances the discussions and information exchange. Space is limited to 150 participants to foster communication and collaboration. Be sure to register before the early-bird deadline of May 4th. Full details are available on the website, or contact Megan Raymond (mraymond@wiche.edu) for more information.

Help us jump-start the conversation!

What are you doing to drive goals of equity, accessibility, and data and evidence-based decision making at your institution? Let us know in the comments below or tweet your thoughts @wcet_info with #WCETSummit!

-Megan

Megan Raymond headshot

 

Megan Raymond
Assistant Director, Programs and Sponsorship
WCET – WICHE Cooperative for Educational Technologies
mraymond@wiche.edu | @meraymond

 

 


CC Logo

Learn about WCET Creative Commons 4.0 License

 

State Authorization for Distance Ed Federal Regulation to be Implemented 07/01/2018

Frequently Asked Questions: Overview and Direction of the Regulations

The U.S. Department of Education is scheduled to implement state authorization for distance education regulations on July 1, 2018. There is still some uncertainty about whether the Department will implement the regulation and, if it does, what institutions need to do to comply. WCET and its State Authorization Network (SAN), the National Commission for State Authorization Reciprocity Agreements (NC-SARA), and the Distance Education Accrediting Commission (DEAC) have asked the Department for clarification more than once and through different channels.

This Frequently Asked Questions Document provides an overview of the status and a few recommendations on how to proceed on the following topics:

    1. The Status of the Regulations – As Best We Know It.
    2. The Federal State Authorization Regulation – What are the Big Outstanding Questions? And What Should We Do?
      • Reciprocity.
      • Complaints.
      • Residence vs. Location.
      • Notifications.

More detailed information will be provided to State Authorization Network members.

1. Topic Area: The Status of the Regulation – As Best We Know It

Q: Will this regulation go into effect?

  • The Department and Congress have had numerous opportunities to kill this regulation. We thought they would have done so by now, if that was their intent. We have received mixed signals about the future of the regulation in recent weeks. On March 30, Politico had a convincing report that “state authorization” was one of a handful of regulations that could be “under review” soon. That would likely delay implementation and result in a rulemaking panel to rewrite the regulation. We think this is a likely outcome, but we also predicted they would have acted on it – one way or the other – by now.
  • To clarify, neither WCET nor WCET/SAN are recommending a review. We have long believed that complying with state laws and regulations is reasonable.
  • As an institution, it is important to remember our mantra for the last year: “A regulation is a regulation until it is not a regulation.” Until we are told otherwise, institutions should expect to comply.

Q: What about the bill in the House of Representatives, the PROSPER Act? Won’t that bill do away with this requirement?

  • The PROSPER Act will not pass, as is.
  • For a bill to go into effect, it must pass the House, the Senate, and be signed by the President. The House Bill is the product of one party. The PROSPER Act is a first attempt at reauthorizing the Higher Education Act, which covers the basic federal regulations for the higher education / federal government relationship. An update (or “reauthorization”) is long overdue. The Senate is working on a bipartisan version of the reauthorization of the Higher Education Act that will likely differ greatly from the PROSPER’s provisions.
  • Since it is an election year and summer Congressional breaks are coming, many pundits believe that if reauthorization does not happen by June or, maybe, July, it is probably dead for this year. Such action appears unlikely, but this Congress has had a tendency to suddenly produce bills and hurry them through.
  • Our current thinking is that reauthorization probably will not happen this year, therefore Congress will probably not change the state authorization regulation.

Q: If Congress does cease requiring federal state authorization, can I stop worrying about state authorization and drop my SARA and WCET/SAN memberships?

  • No.
  • Regardless of what the federal government does, the states will still have their regulations and will expect you to comply.

2. Topic Area: The Federal State Authorization Regulation – What are the Big Outstanding Questions? And What Should We Do?

WCET, NC-SARA, and DEAC have submitted to the Department of Education a list of questions that would need to be answered and recommendations on changes that should be made. For some of these items, we have heard trusted experts provide opposing opinions on what colleges are supposed to do to comply with the regulatory language. IF the regulation goes into effect on July 1, below are some of the top issues that should have been addressed long before now…because compliance takes time:

2.a. Reciprocity – Will reciprocity be recognized by the Department of Education as a path to demonstrate compliance in a state?

  • When the regulation was originally released, it seemed to recognize only those reciprocal agreements that did not prohibit states from enforcing their own laws. Essentially, that would have negated reciprocity, since the agreement would lack any enforcement capacity. Shortly after we released our analysis of the new regulation, Department of Education personnel told WCET and NC-SARA staff that we had misinterpreted the meaning of the regulatory language. Instead, they said that their intention was that a reciprocal agreement would not be recognized if there was a “conflict” between state law and reciprocity agreement requirements. Since a state joining SARA agrees to its provisions, any “conflict” is removed before the state joins. A letter from the Undersecretary of Postsecondary Education describing this clarification of the Department’s intent was sent to WCET and NC-SARA. While this letter does not hold the force of rule, it is our understanding that Department personnel fully support NC-SARA and, if the regulation goes into effect, that this clarification will be codified. It is highly unlikely that the Department would undermine the will of 48 (soon 49) states, the U.S. Virgin Islands, and the District of Columbia.

What Should You Do? Don’t Panic. The Department is very supportive of reciprocity. We expect clarification to be positive for SARA.

2.b. Complaints – Institutions are required to “document” the complaint processes for students in each state. What do we do about states (like California) that do not have a complaint process? Should we stop enrolling students from California?

  • If a state has joined SARA, then SARA has processes to handle complaints. California will not join SARA this year and might not next year. California has complaint and oversight processes for out-of-state for-profit institutions, but not out-of-state publics or non-profit institutions. They are aware of the issue, but it will take legislative action to fix it. Such action does not appear to be imminent. This could be a problem if you enroll students in California at a distance.
  • If you are at an institution that is not a SARA member, there are other states that do not have a complaint process for you. That could be a problem for enrolling students in those states, as well.

What Should You Do?

  • Some have recommended citing the California Attorney General’s office as the handler of complaints. Since that office has openly declined this responsibility, that approach might not withstand a financial aid review by the Department of Education.
  • Both the Department and California officials are aware of the issue. It is our hope and belief that (if the regulation goes into effect) they will create some type of accommodation so as not to harm students. If they don’t, we should all scream loudly…very loudly.
  • We can’t give you an absolute answer on this one, as there are too many possible scenarios. We suggest that you:
    • Count how many distance education students you have or will have in California.
    • Communicate with your institution’s legal counsel.
    • Determine the level of risk that your institution is willing to assume.
    • Act accordingly.
    • Be prepared to act.

2.c. Residence vs. Location – Since state authorization is based upon the location of the student, why does the regulation use “reside” or “state of residence” so often?

  • This is another one on which state authorization experts have differed in their advice. Some have suggested that institutions need to collect the student’s official state of residence in addition to location. That’s lots of extra work.
  • During the 2014 Negotiated Rulemaking, Marshall Hill (NC-SARA), Leah Matthews (DEAC), and Russ Poulin (WCET) all served as negotiators or alternates. We made sure that all references in the regulations being developed at that time referred to location and not residence. It is too bad that our work was forgotten. We see this as an error in the current regulation and are urging the Department to correct it.

What Should You Do? 

  • Again, we cannot give you a definite answer, but we should have more clarity soon…we hope.
  • Assess the amount of work this would take.
  • Determine the level of risk that your institution is willing to assume.
  • Act accordingly.

2.d. Notifications – How do we implement the new required notifications?

  • Institutions are expected to make several notifications, either public (via websites) or individualized (direct communication with the student). There are several details that need to be clarified, such as “what exactly is an adverse action,” “do SARA institutions report state refund policies,” and several questions about notifications for programs that lead to professional licensure or certification.

photo of a person looking confused and shrugging

What Should You Do?

  • For programs that lead to professional licensure or certification, the requirements are close to those already required by SARA. Also, the lack of proper notifications on these programs has been the leading cause of student lawsuits against institutions, so protect yourself.
    • For SARA institutions: review SARA requirements, determine your program’s applicability in any state where you will enroll students at a distance for that program, comply with state regulations, and notify students of your status for that program in their state.
    • For institutions that are not members of SARA, we still think that the threat of lawsuits makes this work worthwhile.
  • For the other notifications, if you have not implemented them yet, you may wish to wait for clarification. Again, determine the level of risk that your institution is willing to assume.

Finally…

To be clear, institutions should be preparing for the July 1, 2018 regulations, to the best of their ability, until we hear further information. To that end, we have attempted to summarize the aspects of the regulation into a checklist of requirements for you to download. WCET and WCET|SAN are committed to providing information as soon as it becomes available and to guiding institutions through whatever we learn in the upcoming months.

Cheryl Dowd
Cheryl Dowd
Director, State Authorization Network
WCET
cdowd@wiche.edu

 

 

Photo of Russ Poulin
Russell Poulin
Director, Policy & Analysis
WCET – The WICHE Cooperative for Educational Technologies
rpoulin@wiche.edu | @russpoulin

 

 


CC Logo

Learn about WCET Creative Commons 4.0 License

Count All Students! Outcome Measures for Community Colleges

Should we count all students when analyzing higher education, or only some of them? We think all students should be included…and community colleges are often misrepresented when they are not. In this third post in our series on the IPEDS Outcome Measures data (released by the U.S. Department of Education late in 2017), we turn our attention to community colleges.

The Outcome Measures statistic is a new, more comprehensive view of what happened to the students who attended an institution. It improves on the traditional Graduation Rate, which included only first-time, full-time students and left out entering part-time and transfer-in students. There are many of those students at two-year institutions.

The first post in our series introduced the issue generally and the second examined the Outcome Measures results for institutions that serve a large number of students who take all of their courses at a distance. This post samples the results for the community colleges in one state.

Some overall observations:

  • Our friends from the WICHE Policy and Research unit found that 60% of all higher education students and 71% of community college students were not included in the Department’s Graduation Rate statistic that has been used for many years.
  • The Graduation Rate likely did not count 68.8% of community college students in Colorado. That’s almost 15,000 students. They are included in the new Outcome Measures.
  • The Graduation Rate likely counted only 26.8% of community college students in Colorado, as it focused only on those who received a degree or certificate. Adding the Outcome Measures “transfer-out” and “still-enrolled” categories accounts for another 35.6% (7,215) of the students who entered in 2008.

This is the first year for the data to be released. As with any new statistic, some institutions were more successful than others in gathering and reporting the data. This should improve in the coming years.

We will also provide you with our spreadsheet of data so that you can analyze the data and perform similar analyses for your institution.

The bottom line: Institutions should insist on broader use of the Outcome Measures data when reporting to legislators, state system offices, the press, and those pesky ratings services.

What Are the Outcome Measures?

For the old Graduation Rate measure, institutions were asked to identify a “cohort” of students who entered as first-time (they’ve not attended college before) and full-time (they are taking a full load of courses) in the fall of a given academic year. Institutions track and report what happened to those students after a set number of years.

The new Outcome Measures asks institutions to expand this tracking by adding three additional cohort categories of students (see the short sketch after this list):

  • First-time, full-time (similar to the Graduation Rate).
  • First-time, part-time.
  • Non-first-time (transfer-in), full-time.
  • Non-first-time (transfer-in), part-time.
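
If it helps to see the scheme concretely, here is a minimal sketch in Python of how the two yes/no attributes (first-time? full-time?) combine into the four cohorts. The function and argument names are our own illustration, not IPEDS variable names.

    # Minimal sketch: the four Outcome Measures cohorts are just the
    # combinations of two yes/no attributes. Names are illustrative only.
    def om_cohort(first_time: bool, full_time: bool) -> str:
        """Assign an entering student to one of the four OM cohorts."""
        timing = "First-time" if first_time else "Non-first-time (transfer-in)"
        load = "full-time" if full_time else "part-time"
        return f"{timing}, {load}"

    # Example: a transfer student enrolling with a partial course load.
    print(om_cohort(first_time=False, full_time=False))
    # -> Non-first-time (transfer-in), part-time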

Source: https://nces.ed.gov/ipeds/use-the-data/survey-components/outcome-measures

The first cohorts were compiled for the Fall 2008 academic term and the first dataset containing results was released last October. It is important to note that these data are for all students in a given cohort, not just distance education students.

Our Sample of Public Institutions: The Community Colleges of Colorado

WICHE’s Policy and Research Brief provided statewide and regional results for each of the western states. We wanted to see the impact on an institutional basis and gauge some variations among institutions in a similar setting.

Given that community college leaders have long complained about the Department of Education’s Graduation Rate not providing an accurate profile of their students, we thought that we should look at that sector. We chose community colleges in Colorado for several reasons:

  • We both live in Colorado and are more familiar with the geography, if not the operational intricacies, of institutions in our home state.
  • Colorado includes a mix of geographic settings: urban, suburban, rural, and remote.
  • With fourteen community colleges, the sample was not so large as to be unwieldy.

Most Community College Students Are Not Included in the Graduation Rate

First, let’s look at the input into the Outcome Measures statistic. Which students are included?

Community colleges may be the group best served by the expanded view of the Outcome Measures statistic. The Graduation Rate (previously the only measure of what happened to those admitted to an institution) includes only students who were first-time (they are new to higher education) and full-time (they enrolled in a full load of courses). Especially for urban institutions with a focus on working adults and those returning to college, this leaves out the bulk of the students they serve.

This fact is starkly highlighted in the WICHE brief on the western states, which found that only 29% of all community college students in the western states were first-time, full-time students. That means that 71% of community college students in the West were likely not included in the Graduation Rate counts. In our opinion, that is a serious undercount.

For Colorado’s community colleges, the percentage of students who were first-time, full-time is similar to the other western states at 31.2% of enrollments. Therefore, 68.8% of community college students in Colorado (almost 15,000) were likely not included in the Graduation Rate counts.

Looking at the numbers more closely, in Colorado there is a huge rural/urban divide in serving first-time, full-time students:

  • Rural – 57.7% were first-time, full-time.
  • Urban – 27.8% were first-time, full-time.

This is not surprising, as rural community colleges may tend to serve students in their immediate area. Their enrollments may reflect the desire of traditional-age students to begin their college experience locally. It could also be a testament to those colleges offering programs that are designed to lead to local employment. On the other hand, urban institutions, given the make-up of their local populations, may recruit and serve more adult and returning students.

Why does this matter? If only the old Graduation Rate were used, the comparisons would not be equal. The Graduation Rate includes a much smaller percentage of students for urban institutions and is a poor representation of the institution’s overall population and activities.

Community Colleges Benefit from Including More Outcomes

In addition to granting degrees and certificates, the mission of many community colleges includes preparing students to transfer to other institutions. The Outcome Measures now account for this basic goal of these institutions. The low completion rates of community colleges have been cited as evidence of their ineffectiveness. Admittedly, some colleges deserve the criticism. By including counts of the students who transferred and those still enrolled, we obtain a more complete picture of what happened to the students who entered.

Below is a table of the results of the Outcome Measures data for students who entered each institution in 2008. The columns included are:

  • Urban or Rural – This is our own (Colorado-based and probably unscientific) view of the communities in which the institutions reside. Those labeled “rural” are in smaller towns in more remote settings. Admittedly, to someone from the east or west coast, Pueblo or Greeley may seem “rural,” but they are larger towns that also include a university.
  • FTFT Over 50% – A “yes” appears if “first-time, full-time” students represent more than 50% of the institution’s incoming freshman class cohort for the Outcome Measure. We found it interesting to note that four of the six rural community colleges served more traditional students. Additionally, a fifth institution was barely below the 50% mark. Urban institutions tend to have a much smaller percentage of “first-time, full-time” incoming students.
  • Completers – These students received a degree or certificate within eight years of entering the institution. This serves as a proxy (though it might not be an exact match) for the students included in the Graduation Rate measure.
  • Completers + Transfer-Out + Still-Enrolled (CTS) – The percentages of those who transferred to another institution and those who are still enrolled at the institution after eight years are added to the “completers” percentage. As you may have guessed, there are precious few “still enrolled” students after eight years at community colleges. Some community college professionals have called similar statistics the “success” rate, but we avoided that value-laden term.
  • Difference – Subtract the “completers” column from the CTS column to obtain the difference (see the short sketch after this list). The difference highlights the percentage of students who would likely not have been counted in the Graduation Rate.
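
For those following along in the spreadsheet, the derived columns are simple arithmetic. Below is a rough pandas sketch; the file name and column headers are assumptions for illustration and may not match the actual file, so adjust them to what you download.

    # Rough sketch of reproducing the table's derived columns with pandas.
    # The file name and column headers are illustrative assumptions.
    import pandas as pd

    df = pd.read_excel("co_community_college_om.xlsx")  # hypothetical file name

    # Completers + Transfer-Out + Still-Enrolled, as percentages of the cohort.
    df["CTS"] = df["Completers"] + df["Transfer-Out"] + df["Still-Enrolled"]

    # The share of students who would likely not have been counted
    # in the Graduation Rate.
    df["Difference"] = df["CTS"] - df["Completers"]

    print(df[["Institution", "Completers", "CTS", "Difference"]])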

Table: Outcome Measures results for Colorado community colleges, 2008 entering cohort

Our main observations from this table are:

  • The “completers” designation (which mimics the Graduation Rate) accounted for the ultimate fate of only 26.8% of Colorado’s community college enrollments. Adding the “transfer-out” and “still-enrolled” students accounts for another 35.6% (7,215) of the students who entered in 2008. Community colleges should get recognition for these students.
  • There are differences in the end results between rural and urban institutions. Rural institutions tend to have higher percentages of “completers.” That result may be partially due to the composition of their incoming student bodies having more “first-time, full-time” students.
  • There are still a large number of students in the “unknown” category. Some of this may be due to inadequate record keeping because this is a new statistic. Some of it may be due to the colleges’ inability to track some transfers. It is likely that many of these students dropped out. If (in a worst-case scenario) nearly 40% of students who entered community college in Colorado in 2008 have left higher education, that is a number that may give policymakers pause.
  • The “difference” column is more than 20% for every college except Pueblo Community College, which leads us to wonder if they had difficulty with identifying transfers or experienced other problems with implementing the reporting requirements in the first year of Outcome Measures data collection.

The Incoming Students: Composition of Fall 2008 Cohort

One of the benefits of Outcome Measures (OM) is to better understand the composition of the student cohorts at our institutions. While the IPEDS Graduation Rate includes only “first-time, full-time” students, Outcome Measures includes the four cohorts listed above. Let’s look at results for those institutions for each of the four cohorts…

First-time, Full-time Cohort

Northeastern Junior College has the largest “first-time, full-time” entering cohort of students. The College is located in Sterling, CO, which has historically been a farming community. Located along the Interstate about 130 miles from Denver, it is closer to Wyoming and Nebraska than to the capital city.

Chart: first-time full-time 69.4%, first-time part-time 18.1%, non-first-time full-time 8.7%, non-first-time part-time 3.9%

First-time, Part-time Cohort

Morgan Community College has the largest “first-time, part-time” entering cohort of students. Located in Fort Morgan, Colorado, it is about 40 miles closer to Denver on the same interstate as Northeastern Junior College. It is interesting to note the differences in the entering student composition of these two rural institutions that are less than an hour from each other.

Chart: first-time full-time 29.8%, first-time part-time 56.1%, non-first-time full-time 5.7%, non-first-time part-time 8.3%

Non-First Time, Full-time Cohort

Arapahoe Community College has the largest “non-first-time, full-time” cohort of entering students. It is located in the southern suburbs of Denver. At 17.5%, the College has the highest percentage in this category, but the others are not far behind, with most colleges enrolling between 8% and 16% of their students in this cohort.

Chart: first-time full-time 21.6%, first-time part-time 34.2%, non-first-time full-time 17.5%, non-first-time part-time 26.8%

Non-First Time, Part-time Cohort

It is not surprising that the Community College of Denver, in the state’s largest city, leads with the most returning students who are studying only part-time.

Chart: first-time full-time 21.7%, first-time part-time 36.5%, non-first-time full-time 13.5%, non-first-time part-time 28.3%

Student Outcomes: What Happened to the Students?

In addition to providing observations about the 2008 student cohort make up, the IPEDS Outcome Measures data also provides insights regarding the disposition of the students over time. Institutions report student outcomes in one of four categories:

  • Completion (Received an Award).
  • Transfer Out.
  • Still Enrolled.
  • Unknown.

The data reported here are at eight years since matriculation (as of Fall 2016). In addition to eight-year data, IPEDS also collects and reports outcomes at six years. We compared the six- and eight-year data and found little difference. For simplicity and consistency, we decided to use only the eight-year outcomes in this analysis. It is also noteworthy that eight years beyond the matriculation date is four times the typical completion time for an Associate degree.

The Outcome Measures data for Colorado’s community colleges are reported below:

Colorado Two-Year Public Institutions

Chart: completion 26.8%, transfer out 31.6%, still enrolled 2.8%, unknown 38.9%

Completion

Continuing the tour of the state to the rural southeast, Otero Junior College, located in La Junta, has the highest completion rate. Three other institutions broke the 40% completion mark: Morgan Community College (43.4%), Trinidad State Junior College (40.8%), and Lamar Community College (40.3%). All of them are located in rural settings.

Chart: completion 47.0%, transfer out 24.1%, still enrolled 0.2%, unknown 28.7%

Completion rates vary greatly and often need local context. For example, Red Rocks Community College (in a middle-class to upscale suburb west of Denver) has the lowest “completers” rate at 16.6%. Its “transfer-out” rate is 39.0%, reflecting a possible suburban trend to start locally and transfer.

Transfer Out

Colorado Northwestern Community College tops the list for “transfer out” colleges. It is located in remote Rangely, CO, near the Utah border, about 200 miles northwest of Denver on the other side of the mountains. On a good day, it is a four-hour drive. The lure to start locally and transfer may be based on geography. Other two-year institutions with high transfer rates include the Community College of Denver at 45.9% (an urban college) and Lamar Community College at 43.1% (a rural college).

Chart: completion 22.2%, transfer out 48.6%, still enrolled 0.4%, unknown 28.8%

Still Enrolled

Although the percentages are small across the board, Front Range Community College has the highest rate of students still enrolled at the institution without having completed a degree or certificate after eight years. Front Range serves Boulder, Fort Collins, and the northern suburbs of Denver. Other community colleges with notable still-enrolled rates include other suburban institutions: the Community College of Aurora (2.3%) and Red Rocks Community College (2.2%).

Chart: completion 32.4%, transfer out 30.1%, still enrolled 2.8%, unknown 34.6%

Unknown

Pueblo Community College has the highest rate of “unknown” responses, which we suspect may be due to problems with gathering data for this statistic in its first year of existence. We suspect that because of the suspiciously low “transfer-out” rate they report, especially since Colorado State University-Pueblo is only six miles away. Northeastern Junior College (16%) and Lamar Community College (15.9%) reported the lowest “unknown” rates.

Chart: completion 30.0%, transfer out 3.7%, still enrolled 1.2%, unknown 65.0%

Next Steps

We are still absorbing what this all means.

We definitely urge you to examine your institution’s Outcome Measures rates. Use our spreadsheet as a starting place. We should all demand that colleges, policymakers, the press, and online listing/ranking services use these data to better represent what might happen to a student who enrolls at your institution.

If we write more or have calls for action, we will let you know.

Meanwhile, what do you think? Let us know.

Photo of Russ Poulin

 

Russell Poulin
Director, Policy & Analysis
WCET – The WICHE Cooperative for Educational Technologies
rpoulin@wiche.edu | @russpoulin

 

 

Terri Straut

 

Terri Taylor-Straut
Director of Actualization
ActionLeadershipGroup.com

 

 


CC Logo

Learn about WCET Creative Commons 4.0 License

 

Count All Students! Outcome Measures for Non-Traditional Institutions

Should we count all students when analyzing higher education, or only some of them? It’s not surprising that when you include all students, you get different results from that analysis than when you don’t. We think all students should be included.

As reported in our last blog post (“Count All Students! New Outcome Measures Now Include Non-Traditional Students”), the U.S. Department of Education released data for a new Outcome Measures statistic last year. We thought that the Outcome Measures would provide more comprehensive data regarding student outcomes than the traditional Graduation Rate measure, which includes only first-time, full-time students. We were not disappointed.

Some overall observations:

  • Our friends from the WICHE Policy and Research unit found that 60% of all higher education students and 71% of community college students were not included in the Department’s Graduation Rate statistic that has been used for many years.
  • This is the first year for the data to be released. As with any new statistic, some institutions were more successful than others in gathering and reporting the data. This should improve in the coming years.
  • Institutions with large non-traditional student enrollments (e.g., community colleges, online colleges, inner city universities, military-serving institutions) have long clamored for a statistic that better represented the populations they serve. The “transferred out” and “still enrolled” categories are important new options for showing students’ progress at the institution. For the institutions analyzed:
    • For some, their Outcome Measures results are much better than what might be reported in their Graduation Rate.
    • For others…not so much. It will be interesting to see if better data collection improves their results in the coming years.
  • Based upon feedback from institutions to the Department, there are already some changes to data collections for future years.

In this post and the next, we will examine the Outcome Measures results for different sample sets of institutions. We will also provide you with our spreadsheet so that you can analyze the data and perform similar analyses for your institution.

What Are the Outcome Measures?

For the old Graduation Rate measure, institutions were asked to identify a “cohort” of students who entered as first-time (they’ve not attended college before) and full-time (they are taking a full load of courses) in the fall of a given academic year. Institutions track and report what happened to those students after a set number of years.

The new Outcome Measures asks institutions to expand this tracking by adding three additional cohort categories of students:

  • First-time, full-time (similar to the Graduation Rate).
  • First-time, part-time.
  • Non-first-time (transfer-in), full-time.
  • Non-first-time (transfer-in), part-time.

graph showing the breakdown of the four Outcome Measures cohorts

Source: https://nces.ed.gov/ipeds/use-the-data/survey-components/outcome-measures

The first cohorts were compiled for the Fall 2008 academic term and the first dataset containing results was released last October.

Top 15 Institutions Offering “Exclusively Distance Education Courses”

In examining the results more closely, the data related to student matriculation and outcomes vary dramatically, based on the mission of the institution. This is especially clear when we look at institutions that are known to serve large populations of students taking “exclusively distance education courses.”

With the U.S. Department of Education’s Integrated Postsecondary Education Data System (IPEDS) data, it is possible to identify the institutions with the largest distance education enrollments. Enrollment data are collected separately from the Outcome Measures, and we have reported on distance education enrollments several times over the past few years.

Using the 2016 IPEDS distance education enrollment data (the most recent available), we determined the top 15 institutions, based on reported number of enrollments taking exclusively distance education courses. These institutions, in order of exclusive distance education enrollment (largest to smallest), include:

  1. University of Phoenix-Arizona.
  2. Western Governors University.
  3. Southern New Hampshire University.
  4. Liberty University.
  5. Grand Canyon University.
  6. Walden University.
  7. American Public University System.
  8. University of Maryland-University College.
  9. Excelsior College.
  10. Ashford University.
  11. Capella University.
  12. Kaplan University-Davenport Campus.
  13. Brigham Young University-Idaho.
  14. Arizona State University-Skysong. (Note: Arizona State University has recently begun reporting some of its enrollments under this name even though it is not a separately accredited institution. Since the Outcomes Measures data that we analyzed reports students first enrolled in Fall 2008, there is no data to report for Skysong students.)
  15. Colorado Technical University-Colorado Springs.

That list includes eight for-profit, five non-profit, and two public institutions. Why this group? We were searching for a group of non-traditional institutions and this seemed better than any bias resulting from picking a set of institutions on our own. In our next blog post, we will examine public community colleges and universities in one state.

We compiled a complete Excel file of the 15 institutions and their reported data for each cohort. In the analysis below, we provide graphic examples of the highs and/or lows for the category in question; a short sketch of this kind of analysis follows.
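
As one example of the kind of analysis that file supports, here is a short pandas sketch that converts raw cohort counts into the composition percentages discussed below. The file name and column headers are our own illustrative assumptions, not the actual headers in the file.

    # Sketch: turn raw 2008 cohort counts into composition percentages.
    # File name and column headers are illustrative assumptions.
    import pandas as pd

    COHORTS = [
        "First-time full-time",
        "First-time part-time",
        "Non-first-time full-time",
        "Non-first-time part-time",
    ]

    df = pd.read_excel("om_top15_distance_ed.xlsx")  # hypothetical file name
    total = df[COHORTS].sum(axis=1)  # size of each institution's 2008 cohort

    # Express each cohort as a share of the institution's entering class.
    for c in COHORTS:
        df[c + " %"] = (df[c] / total * 100).round(1)

    print(df.set_index("Institution")[[c + " %" for c in COHORTS]])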

The Incoming Students: Composition of Fall 2008 Cohort

One of the benefits of Outcome Measures (OM) is to better understand the composition of the student cohorts at our institutions. While the IPEDS Graduation Rate includes only first-time, full-time students, Outcome Measures includes the four cohorts listed above. Let’s look at results for those institutions for each of the four cohorts…

First-time, full-time cohort

Predictably, the reported number of students in each of these four categories is aligned with the mission of the organization. For example, the largest shares of first-time, full-time students among the 15 organizations belong to BYU-Idaho with 76.7% and Southern New Hampshire University with 67.3%. These institutions both have large on-campus student populations in addition to their online presence. The Outcome Measures represent overall institutional enrollments; therefore, these data combine on-campus and online enrollment.

Graph showing BYU Idaho and Southern New Hampshire University

Institutions long focused on serving students at a distance reported few, if any, first-time, full-time students. In fact, of the 14 institutions that reported OM data, half (50.0%) reported that under 5% of the 2008 cohort were first-time, full-time students. For those institutions, their Graduation Rate is calculated on less than 5% of their overall student population, which is why the Outcome Measures will be a better barometer of student progress for them.

First-time, part-time cohort

Institutions that are known to serve part-time students reported the largest numbers of students in the first-time, part-time cohort. Walden University led with 56.6% of the cohort, followed by American Public University System with 38.8% of reported students being first-time, part-time in 2008.

Graph: first-time, part-time cohort shares for Walden University and American Public University System

As with first-time, full-time, several institutions reported few or no first-time, part-time enrollments.

Non-first-time (transfer-in), full-time cohort

Institutions that focus on serving full-time students, with a promise of accelerated degree completion through credit for life experience or competency-based education, led with reported enrollment in the returning, full-time category for the 2008 cohort. Western Governors University (WGU) reported 93% of their students in this category; Ashford University reported 84.8%.

Graph: non-first-time, full-time cohort shares for Western Governors University and Ashford University

Non-first-time (transfer-in), part-time cohort

Excelsior College reported all (100%) of their enrollments as returning, part-time, followed by Capella University reporting 80.8% of their 2008 enrollment in that category. Both institutions have a mission focused on serving adult learners, many of whom cannot attend college full-time. As previously noted, institutions that are designed to serve full-time students reported very small enrollments in this category.

Graph: non-first-time, part-time cohort shares for Excelsior College and Capella University

Student Outcomes: What Happened to the Students?

In addition to providing observations about the 2008 student cohort make up, the Outcome Measures data also provides insights regarding the disposition of the students over time. Institutions report student outcomes in one of four categories:

  • Completion (Received an Award).
  • Transfer Out.
  • Still Enrolled.
  • Unknown.

The data reported here are at eight years since matriculation (as of Fall 2016). In addition to eight-year data, IPEDS also collects and reports outcomes at six years. We compared the six- and eight-year data and found little difference. For simplicity and consistency, we decided to use only the eight-year outcomes in this analysis.

As noted by our WICHE Policy colleagues in their recent OM report that examined outcomes for students in the WICHE states:

“The credential rates from Outcome Measures data and graduation rates from Graduation Rate data are not synonymous and generally should not be interchanged. There are important distinctions as discussed in this brief. Users should clearly distinguish between OM and GR results.”

That report is a worthy read. With this caveat as background, let’s look at the outcomes.

Completion

Completion indicates that the student received a certificate or degree within eight years of entering the institution. Of the top 15 institutions in 2016, based on exclusively distance education enrollment, BYU-Idaho led in completions with 59.4%; Southern New Hampshire University reported 55.7% completion. It is important to remember that the enrollment reported was total enrollment, not just distance education students. On this measure the more traditional institutions did better than their less traditional counterparts.

Graph: completion rates for BYU-Idaho and Southern New Hampshire University

Institutions that reported the lowest completion rates include Kaplan University-Davenport Campus with 20.6% and Capella University with 19.5% completion at eight years. Both institutions also had unusually high percentages of students in the “unknown” category.

Transfer Out

We know that a primary mission for many institutions of higher education is the successful transfer of students to other institutions. Of the reporting institutions in this analysis, Capella University led with 42.5% of students reported as transferred out, and BYU-Idaho reported the second highest transfer rate, at 38.9%.

Graph: transfer-out rates for Capella University and BYU-Idaho

Still Enrolled

Eight years beyond the date of matriculation is 200% of the timeframe expected to complete a Bachelor’s degree; therefore, we expected this category of enrollment to be the smallest reported. However, institutions that focus on serving part-time students would logically have the highest rates of students reported as still enrolled at eight years. These students are working toward their degrees, often a few courses at a time. Of the reporting institutions, American Public University System (which has a large number of students in the military) led with 19.5%. Excelsior College ranks a distant second with 6.7% of its reported enrollment still enrolled.

Graph: still-enrolled rates for American Public University System and Excelsior College

Unknown Outcomes – Those with High Percentages

Student enrollments that did not result in one of the prior dispositions are reported as “unknown”. These are troubling outcomes, as it is the responsibility of institutions of higher education to track student progress.

There are three institutions in this sample with over 70% of their student outcomes reported as unknown: Kaplan University-Davenport (79.2%), University of Maryland-University College (72.2%), and University of Phoenix-Arizona campus (71.0%).

Graph: “unknown” outcome rates for Kaplan University-Davenport, University of Maryland-University College, and University of Phoenix-Arizona

It is unclear whether the “unknown” classification is due to students having dropped out, to inadequate classification of students in this first OM data collection, or to some other reason. It will be interesting to watch the results in the upcoming years.

Unknown Outcomes – Those with Low Percentages

Among the 14 institutions we examined, those that reported few students with “unknown” outcomes include BYU-Idaho reporting 0% (that is remarkably low) and Southern New Hampshire University reporting 12.6% of students with “unknown” outcomes.

Graph: “unknown” outcome rates for BYU-Idaho and Southern New Hampshire University

Outcome Measures Creates a New View of Positive Student Outcomes

As we previously stated, the above institutions also have a sizeable on-campus contingent. What about institutions that focus completely on non-traditional students? For technical reasons, the exact percentage for the Graduation Rate might be slightly different than the completion percentage for the Outcome Measures statistic. For purposes of these analyses, we will imagine that they are close to each other.

When we had only the Graduation Rate, the implication was that those who did not graduate were drop-outs or that the institution had somehow failed the student. By adding the additional outcomes categories, we can see that an institution might not graduate a student, but that the student is still well served. The two institutions listed below show a large percentage of students who transferred out or are still enrolled. While eight years is a long time to be still enrolled, adults with work and family obligations may be able to take only a few classes at a time.

For Excelsior College, examining completions alone would show success for fewer than half of their enrollees. With the OM, fewer than one-in-five are in the “unknown” category.

Graph: outcome distribution for Excelsior College


A Caveat about First-Year Data Collections

We know from our early work with IPEDS distance education data that the first year of reporting may yield problems, resulting in high numbers of enrollments categorized as “unknown.” We have also noted that the proportion of students reported as “unknown” tends to decrease as institutions become familiar with the new reporting and are able to adjust their systems to better capture the required data.

However, having such large proportions of students reported with an “unknown” disposition is nevertheless concerning. The cost of higher education is high, both to students and to our society. When institutions accept students and their tuition and fees, they should be able to report what happened to the student.

The Next Post – Community Colleges and Colleges/Universities in Colorado


In our next blog post, we will look more closely at public institutions in Colorado, as a sampling of Outcome Measures in a single state. We will also urge you to conduct your own analyses.

These data are a gift…and the result of hard work by all the reporting institutions. Let’s use them.

~Russ and Terri

Photo of Russ Poulin

 

Russell Poulin
Director, Policy & Analysis
WCET – The WICHE Cooperative for Educational Technologies
rpoulin@wiche.edu | @russpoulin

 

 

Terri Straut

 

Terri Taylor-Straut
Director of Actualization
ActionLeadershipGroup.com

 

 

 


CC Logo

Learn about WCET Creative Commons 4.0 License

 

Count All Students! New Outcome Measures Now Include Non-Traditional Students

There is a new improvement to the U.S. Department of Education’s Graduation Rate statistic. And we should all be using it.

Institutions with large non-traditional student enrollments (e.g., community colleges, online colleges, inner city universities, military-serving institutions) have not been well-represented by the Department’s Graduation Rate statistic. Few of their students are included in the results because only first-time, full-time enrollees are included in the process used to calculate the Graduation Rate.

Quote box: The new Department of Education’s Outcome Measures includes about 60% more students in the WICHE region than does its traditional Graduation Rate.

Since 2008, the Integrated Postsecondary Education Data System (IPEDS) has collected data on a new metric to solve that problem – the Outcome Measures. Last year they released the first results, but there has been less news about it than we hoped.

Institutions that serve a large number of non-traditional students need to be using Outcome Measures. Today, the Western Interstate Commission for Higher Education (WICHE) Policy and Research unit released a great analysis of this new statistic, calculating the impact across the western states. This blog post is the first in a series where we provide observations and suggestions on how you can use Outcome Measures to better understand the outcomes of students at your own institution.

What Are the Outcome Measures?

For the Graduation Rate measure, which until recently had been the only available measure, institutions are asked to identify a “cohort” of students who enter as first-time (they’ve not attended college before) and full-time (they are taking a full load of courses) in the fall of a given academic year. Institutions track and report what happened to those students after a set number of years depending on the type of degree they are pursuing.

The new Outcome Measures asks institutions to expand this tracking by adding three additional cohort categories of students:

  • First-time, full-time (similar to the Graduation Rate)
  • First-time, part-time
  • Non-first-time (transfer-in), full-time
  • Non-first-time (transfer-in), part-time

graph showing the breakdown of the four Outcome Measures cohorts

Source: https://nces.ed.gov/ipeds/use-the-data/survey-components/outcome-measures

The first cohorts were compiled for the Fall 2008 academic term and the first dataset containing results was released last October. Inside Higher Ed reported on the release and Robert Kelchen (higher education policy and data expert) provided some initial analysis.

Important Findings from the WICHE Policy Work

We started playing with these analyses a few months ago and learned that our friends down the hall in WICHE Policy and Research were doing the same thing. We were able to share what we learned and discovered that we were doing complementary analyses. Peace Bransberger and Colleen Falkenstern of WICHE Policy created a wonderful WICHE Insights brief, focused on state and regional analyses (only in the Western states) of the new data. Meanwhile, we at WCET, also a unit of WICHE, are focusing on the impact at the institutional level and how institutional personnel might use and display the data.

Many More Students are Counted

The WICHE Policy brief (released today) produced great findings for the WICHE region:

  • Outcome Measures data provide information for about 60% more 2008-09 undergraduates in the WICHE region than covered by the IPEDS Graduation Rate data. This breaks down as:
    • 39% more students at four-year institutions.
    • 71% more students at two-year institutions.

Wow! Sixty percent of all western students were not included in the Graduation Rate. That strikes us as a large percentage. Institutions were not recognized for their results…good or bad.

Surprising Results on Completion Rates for the Newly Counted Students

In looking at the graduation results for “non-first-time” (transfer-in) students, there were some surprising results in calculations across the WICHE states:

“And overall, non-first-time undergraduates completed a credential within six years at higher rates than first-time students. The rate (72 percent) at which non-first-time, full-time students completed a credential within six years exceeded that rate (59 percent) for their first-time peers by 13 percentage points. And the rate for non-first-time, part-time students (52 percent) was 28 percentage points higher than for their first-time peers (24 percent).”

Students in the “non-first-time” (or transfer-in) group completed a credential within six years at a higher rate than the more traditional first-time, full-time students. But, before now they were not even counted.

Peace and Colleen also provide a great caveat for us when talking about this data:

“There are important distinctions between the credential rate from Outcome Measures data and the Graduation Rate data…Users should clearly distinguish between these data as they are not synonymous.”

Great work by Peace and Colleen. We recommend you read their brief, even if you are not from the West.

Why Should I Care?

This is a great advance for institutions that serve non-traditional students, as many of their students may be enrolled for only a few courses and/or may have attended another college or university. Many students simply were not included in the Graduation Rate.

To show the impact, let’s look at some concrete examples of institutions with a focus on non-traditional students. From the Outcome Measures that were released last fall, here are the percentages of students who were listed as first-time, full-time in their Fall 2008 cohorts:

  • A few Colorado universities:
    • Metropolitan State University of Denver – 41%
    • University of Colorado, Colorado Springs – 56%
    • University of Colorado, Denver/Anschutz Medical Campus – 42%
  • A few Colorado community colleges:
    • Community College of Denver – 22%
    • Front Range Community College – 27%
    • Pueblo Community College – 27%
  • A few institutions with large online enrollments:
    • Capella University – 0.9%
    • University of Maryland-University College – 2.5%
    • Western Governors University – 7.0%

Remember that these were the only students counted in the previous IPEDS Graduation Rate calculations. Therefore, their graduation results were based on a fraction of the students attending the institution. As an analogy…if we declared the Super Bowl winner before halftime, the Atlanta Falcons would own the 2017 trophy instead of the New England Patriots.

A Real Example of the Problem

The press, policymakers, and college ranking services like to use the IPEDS Graduation Rate as a simple, independent indicator of the results of students attending college. For example, here is an alarming warning about University of Maryland-University College on an Education Trust website:

screen shot of the Education Trust page for University of Maryland-University College

If you were a potential student or a parent of a student who is shopping for a college, what would you think?

The site provides no context that this warning is based on only 2.5% of the students in the University’s freshman class. If you go to the graduation rates tab, you find a “NOTE:” that attempts to caution about low numbers of “full-time freshmen students,” but it would take a very exceptional student or parent to make any sense of that caution.

2nd screen shot of the university of maryland information page

What is a student to think? Or a lawmaker? Or anyone else?

To make matters worse, the instructions are incomplete. It took us a while to find the “Retention and Progression Rates” tab mentioned in the “Note.” We discovered (through trial and error) that you need to first select the “Similar Colleges” tab; you can then find the “Retention and Progression Rates” tab. There you find that there were 107 full-time students in the 2008 cohort. So, their warning is based upon 107 students out of the 35,154 total enrollment reported on the initial page for the institution. The “30-student caution” that they give is simply too low for larger institutions.

About Education Trust…they do good work and have the correct goals in mind as they proclaim themselves as: “Fierce advocates for the high academic achievement of all students – particularly those of color or living in poverty.” Kudos. Keep it up. This is just one place in which you can better serve those students. And it is not your fault that better data were not available previously.

And What About the Student Achievement Measure?

There is a similar, independent effort to attack the problem with Graduation Rate undercounts. The Student Achievement Measure (SAM) “provides a comprehensive picture of student progress on their path to earning a college degree or certificate. As compared to the limitations of typical graduation rate measures, SAM reports more outcomes for more students.”

SAM does wonderful work and we support what they do. According to their website (as of this posting) they have 632 participating institutions. While there may be some limitations or growing pains with the IPEDS Outcome Measures, the great advantage is that all institutions offering federal financial aid are required to participate in the IPEDS surveys.

The Bottom Line…and Upcoming Posts

The IPEDS Graduation Rate simply does not represent the reality of what happens at institutions serving a significant number of non-traditional students. IPEDS has given us a good new statistic, but we’ve noticed little uptake in using it.

We need to change this. We need to get involved!

In future blog posts, we will look at some outcomes for a few community colleges, universities, and non-profits. Some institutions show better results with the Outcome Measures, others do not. We’ll also suggest a start for you to calculate and display the data for your institution. We thank Peace and Colleen from WICHE Policy for their great help with our thinking on data displays.
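As a small head start on that, here is one way you might lay the cohorts side by side (a hypothetical sketch in Python; the cohort shares below are placeholders, so substitute your institution’s own IPEDS figures):

```python
# Placeholder data: the share of an entering class falling into each IPEDS
# Outcome Measures cohort. The old Graduation Rate counted only the first.
cohorts = {
    "Full-time, first-time (old Graduation Rate)": 0.41,
    "Part-time, first-time": 0.12,
    "Full-time, transfer-in": 0.28,
    "Part-time, transfer-in": 0.19,
}

# A quick text display: one bar per cohort, scaled to 40 characters wide.
for label, share in cohorts.items():
    print(f"{label:44s} {share:4.0%} {'#' * round(share * 40)}")
```

Even a crude display like this makes the headline point visible at a glance: here the traditional rate would speak for only 41% of the entering students.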

Meanwhile, some questions for you…

  • Is your institution well-represented by the Graduation Rates and/or Outcome Measures data?
  • How can we get more people across your institution to know about and use the Outcome Measures?
  • What else should we ask about the Outcome Measures?

Let’s promote better data.

Photo of Russ Poulin
Russell Poulin
Director, Policy & Analysis
WCET – The WICHE Cooperative for Educational Technologies
rpoulin@wiche.edu | @russpoulin

Photo of Terri Taylor-Straut
Terri Taylor-Straut
Director of Actualization
ActionLeadershipGroup.com


CC Logo

Learn about WCET Creative Commons 4.0 License

Rigor, Meet Reality

How do you define academic rigor? I know when I was completing my undergraduate and graduate coursework, I could tell the difference between a rigorous course and one that would be a little less time consuming. I also understood, especially in graduate school, that the more rigorous a course was, the more I “got out of it.” Is there a way to capture the difference so instructors can ensure the best educational experiences for their students?

To help us do just that, today we’re excited to welcome Andria Schwegler from Texas A&M University – Central Texas. Andria is here to discuss her search for a definition of academic rigor, drawing on discussions with her colleagues and presentations at Quality Matters conferences.

Enjoy the read and enjoy your day,

Lindsey Downs, WCET


Rigor is touted by many educators as a desirable quality in learning experiences, and many institutional mission statements promise rigorous academic experiences for students. Unfortunately, a definition of rigor has been elusive. How do institutions leverage evidence to support their claim of rigorous experiences?

The Task to Operationally Define Academic Rigor

In search of a definition of academic rigor, I sought out the types of evidence that faculty members at my institution use to document rigor in their academic programs and courses. The search was prompted by an invitation to serve as a panel discussant representing a faculty member’s perspective in one of two special sessions “Quality Online Education: What’s Rigor Got to Do with It?” at the Quality Matters Connect Conference in 2017. The purpose of the sessions was to open dialogue across stakeholders in higher education regarding the role of rigor in traditional and alternative learning experiences. As a faculty member, I could articulate rigor in my own courses and program, but because defining rigor had not been a topic of discussion across departments, I sought to learn what my colleagues considered evidence of academic rigor.

Group of instructors

Photo from #WOCinTech Chat

Operational Definitions of Rigor Offered by Faculty

Several of my colleagues accepted my invitation to discuss the issue, and they provided a variety of measurable examples to demonstrate rigor in their courses and programs. Rigorous learning experiences they described included:

  • audio and video recorded counseling sessions with clients that were subsequently critiqued in class,
  • problems identified by students in current or former employment contexts that were brought to class and addressed by applying course content to the cases, and
  • quantitative research projects requiring application of completed coursework to collecting and interpreting original datasets.

Distilling a broader summary from course-specific assignments and program-specific assessments revealed that the most frequently cited evidence to support claims of rigor was student-created artifacts. These artifacts resulted from articulated program- and course-learning outcomes that specified higher-level cognitive processing. Learning outcomes as static statements were not considered evidence of rigor in themselves; they were prerequisites for learning experiences that could be considered rigorous (or not) depending on how they were implemented.

Implementing these activities included a high degree of faculty support and interaction as students created artifacts that integrated program- or course-specific content across time to demonstrate learning. The activities in which students engaged were leveled to align with the program or course and were consistent with those they would perform in their future careers (i.e., authentic assessment; see Mueller, 2016). Existing definitions of rigor include students’ perceptions of challenge (“Rigor,” 2014), and the evidence of rigor faculty members provided highlighted the intersection of students’ active engagement with curriculum relevant to their goals and interaction with the instructor. These conditions are consistent with flow, which is characterized by concentration, interest, and enjoyment that facilitate peak performance when engaged in challenging activities (Shernoff, Csikszentmihalyi, Schneider, & Shernoff, 2003).

Translating Rigor into Student Assessments

Creating meaningful assessments of learning outcomes that integrate academic content across time highlights the importance of planning learning experiences, not only in a course but also in a program. Faculty colleagues explicitly warned against considering evidence of rigor in a course outside of the context of the program the course supports. Single courses do not prepare students for professional careers; programs do. It was argued that faculty members must plan collaboratively beyond the courses they teach to design program-level strategies to demonstrate rigor.

text box which reads: …Faculty members must plan collaboratively beyond the courses they teach to design program-level strategies to demonstrate rigor.

Planning at the program level allows faculty members to make decisions regarding articulating curriculum in courses, sequencing coursework, transferring coursework, creating program goals, and assessing and implementing program revisions. Given these responsibilities, instead of being viewed as an infringement on their academic freedom (see Cain, 2014), faculty members indicated that collaborative planning and curriculum design were essential in setting the conditions for creating assessments demonstrating rigor.

Though specific operational definitions of rigor provided by colleagues were as diverse as the content they taught, the underlying elements of their examples were similar and evoked notions of active student engagement in meaningful tasks consistent with a “vigorous educative curriculum” (Wraga, 2010, p. 6).

Student Activities During Lecture

Stepping back from the examples of student work that faculty members offered as evidence of rigor, I reflected on student activities that were missing from our conversations. None of my colleagues indicated that attending class, listening to lecture, and taking notes were evidence of rigor. Though lecture is “the method most widely used in universities throughout the world” (Svinicki & McKeachie, 2011, p. 55), student activities associated with it never entered our conversations.

In fact, one colleague’s example of rigorous classroom discussion directly contradicted the lecture approach. She explained that during discussions, she tells her students not to believe a word she says, though she was quick to add that she does not mislead students. Her approach puts the burden of obtaining support for discussion claims on students, who cannot passively rely on the teacher as an authority. Instead, students are held accountable for substantiating the claims they make. This technique offers more evidence of rigor than simply receiving content via lecture.

text box reads: None of my colleagues suggested that students’ grades were evidence of rigor.

Student Grades

None of my colleagues suggested that students’ grades were evidence of rigor.

One noted that some students may not meet high standards, leading them to fail a course or program. But these failures in demonstrating performance were viewed as unfortunate consequences of rigor, not evidence to document its existence. This sentiment was complemented by another colleague’s comment that providing support (e.g., remedial instruction, additional resources) to students was not a threat to the rigor of a course or program. Helping students meet high standards and improve performance was evidence of rigor, whereas failing grades given because students found the content difficult were not.

Teaching Evaluations and Mode of Delivery

None of my colleagues suggested that students’ evaluations of a course or an assignment were evidence of rigor. When documenting rigor, faculty members offered students’ performance on critical, discipline-specific tasks, not their opinions of the activities. Supporting this observation, Duncan, Range, and Hvidston (2013) found no correlation between students’ perceptions of rigor and self-rated learning in online courses. Further, definitions of rigor provided by graduate students in their study were strikingly similar to the definitions provided by my colleagues (e.g., “challenge and build upon existing knowledge…practical application and the interaction of theory, concept, and practice…must be ‘value-added’” p. 22). Finally, none of my colleagues indicated that mode of delivery (e.g., face-to-face, blended, online) was related to rigor, an observation also supported by Duncan et al. (2013).

Defining Academic Rigor: A Research-Based Perspective

textbox which reads: None of my colleagues indicated that mode of delivery (e.g., face-to-face, blended, online) was related to rigor.

Thanks to my colleagues, I arrived at the Quality Matters Connect conference with 17 single-spaced pages of notes documenting an understanding of rigor. Though the presentation barely scratched the surface of the content, I was optimistic that we were assembling information to facilitate multiple operational definitions of rigor that could be used flexibly to meet assessment needs. This optimism contributed to my surprise when, during small group discussion among session attendees, the claim was made that academic rigor has too many interpretations and cannot be defined.

I cannot support this claim because most variables addressing human behavior have multiple ways they can be operationally defined, and converging evidence across diverse assessments increases our understanding of a given variable. From this perspective, a single, narrow definition of rigor is neither required nor desirable. A research-based perspective allows for multiple operational definitions and makes salient the value of assessment data that may be underutilized when it informs only a single program’s continuous improvement plans. As Hutchings, Huber, and Ciccone (2011) argue, faculty members engage in valuable work when they apply research methods to examine student learning and share results with others. When assessment and improvement plans are elevated to the level of research, the information can be shared to inform others’ plans and peer reviewed to further improve and expand the process.

Articulating and sharing ways to observe and measure rigor can provide educators and administrators a selection of techniques that can be shaped to meet their needs. Engaging in an explicit examination of this issue across institutions, colleges, programs, and courses facilitates the identification of effective techniques to provide evidence of rigor to support the promises made to our stakeholders.

Photo of Andria Schwegler

Andria Schwegler
Associate Professor, Counseling and Psychology
Texas A&M University – Central Texas


References

Cain, T. R. (2014, November). Assessment and academic freedom: In concert, not conflict. (Occasional Paper No. 22). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment. Retrieved from http://www.learningoutcomesassessment.org/documents/OP2211-17-14.pdf

Duncan, H. E., Range, B., & Hvidston, D. (2013). Exploring student perceptions of rigor online: Toward a definition of rigorous learning. Journal on Excellence in College Teaching, 24(4), 5-28.

Hutchings, P., Huber, M. T., & Ciccone, A. (2011). The scholarship of teaching and learning reconsidered: Institutional integration and impact. San Francisco, CA: Jossey-Bass.

Mueller, J. (2016). Authentic assessment toolbox. Retrieved from http://jfmueller.faculty.noctrl.edu/toolbox/whatisit.htm

Rigor. (2014, December 29). In The Glossary of Education Reform. Retrieved from https://www.edglossary.org/rigor/

Shernoff, D. J., Csikszentmihalyi, M., Schneider, B., & Shernoff, E. S. (2003). Student engagement in high school classrooms from the perspective of flow theory. School Psychology Quarterly, 18(2), 158-176.

Svinicki, M., & McKeachie, W. J. (2011). How to make lectures more effective. In McKeachie’s teaching tips: Strategies, research, and theory for college and university teachers (13th ed., pp. 55-71). Belmont, CA: Wadsworth.

Wraga, W. G. (2010, May). What’s the problem with a “rigorous academic curriculum”? Paper presented at the meeting of the Society of Professors of Education/American Educational Research Association, Denver, Colorado. Retrieved from https://eric.ed.gov/?id=ED509394


CC Logo

Learn about WCET Creative Commons 4.0 License