Who Should Take College-Level Courses?

Impact Findings From an Evaluation of a Multiple Measures Assessment Strategy


By Elisabeth A. Barnett, Elizabeth Kopko, Dan Cullinan, Clive Belfield

While many incoming community college students and broad-access four-year college students are referred to remedial programs in math or English based solely on standardized placement test scores, a growing number of colleges have begun to use additional measures to assess the academic preparedness of entering students. Alongside major reforms in the structure of remedial (or developmental) education coursework, this trend toward multiple measures assessment is informed by two strands of research: one suggests that many students traditionally assigned to prerequisite remediation would fare better by enrolling directly in college-level courses, and the other suggests that alternative measures of student skills and performance, in particular the high school grade point average (GPA), may be useful in assessing college readiness.

The Center for the Analysis of Postsecondary Readiness (CAPR), a partnership of the Community College Research Center and MDRC, recently completed a random assignment study of a multiple measures placement system that uses data analytics. The aim was to learn whether this alternative system yields placement determinations that lead to better student outcomes than a system based on test scores alone. Seven community colleges in the State University of New York (SUNY) system participated in the study. The alternative placement system we evaluated uses data on prior students to weight multiple measures — including placement test scores, high school GPAs, and other measures — in predictive algorithms developed at each college that are then used to place incoming students into remedial or college-level courses. Nearly 13,000 incoming students who arrived at these colleges in the fall 2016, spring 2017, and fall 2017 terms were randomly assigned to be placed using either the status quo placement system (the business-as-usual group) or the alternative placement system (the program group). The three cohorts of students were tracked through the fall 2018 term, resulting in the collection of three to five semesters of outcomes data, depending on the cohort. We also conducted research on the implementation of the alternative placement system at each college as well as a cost and cost-effectiveness analysis.
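The weighting described above can be illustrated with a minimal sketch. The report notes that each college developed its own predictive algorithm from data on prior students; the logistic form, measures, coefficients, and cutoff below are hypothetical inventions for illustration, not any college's actual model or values.

```python
import math

def predicted_success(test_score, hs_gpa):
    """Combine multiple measures into a predicted probability of passing a
    college-level course. Logistic form with made-up coefficients; in
    practice each college fit its own model to prior-student data."""
    z = -4.0 + 0.02 * test_score + 1.1 * hs_gpa
    return 1 / (1 + math.exp(-z))

def place(test_score, hs_gpa, cutoff=0.5):
    """Place the student into the college-level course when predicted
    success clears the (hypothetical) cutoff; otherwise remedial."""
    p = predicted_success(test_score, hs_gpa)
    return "college-level" if p >= cutoff else "remedial"
```

Under such a rule, a student with a modest test score but a strong high school GPA can clear the cutoff and be "bumped up" relative to a test-score-only placement, which is the mechanism behind the placement changes reported below.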

Findings from the implementation and cost components of the study show that:

  • Implementation of the multiple measures, data analytics placement system was complex but successfully achieved by all the participating colleges.
  • Because alternative placement resulted in many fewer enrollments in remedial courses, the total cost of using the multiple measures system was $280 less per student than using the business-as-usual system.
  • Under the alternative system, students enrolled in 0.798 fewer credits within three terms, saving each student an average of $160 in tuition and fees.

Impact findings from the evaluation of student outcomes show that:

  • Many program group students were placed differently than they would have been under the status quo system. In math, 16 percent of program group students were “bumped up” to a college-level course; 10 percent were “bumped down” to a remedial course. In English, 44 percent were bumped up and 7 percent were bumped down.
  • In math, in comparison to business-as-usual group students, program group students had modestly higher rates of placement into, enrollment in, and completion (with grade C or higher) of a college-level math course in the first term, but the enrollment and completion gains faded in the second term and disappeared by the third.
  • In English, program group students had higher rates of placement into, enrollment in, and completion of a college-level English course across all semesters studied. While gains declined over time, through the third term, program group students were still 5.3 percentage points more likely to enroll in and 2.9 percentage points more likely to complete a college-level English course (with grade C or higher).
  • Program group students earned slightly more credits than business-as-usual group students in the first and second terms, but the gain became insignificant in the third term. No impacts were found on student persistence or associate degree attainment.
  • All gender, Pell recipient status, and race/ethnicity subpopulations considered (with the exception of men in math) had higher rates of placement into college-level courses under the alternative system. In English, these placements translated into course completion rates over three terms that were 4.6, 4.5, 3.0, and 7.1 percentage points higher than those of their business-as-usual subgroup peers for women, Pell recipients, non-Pell recipients, and Black students, respectively.
  • Program group students who were bumped up into college-level courses from what their business-as-usual placements would have been were 8–10 percentage points more likely to complete a college-level math or English course within three terms. Program group students who were bumped down into developmental courses were 8–10 percentage points less likely to complete a college-level math or English course within three terms.

This study provides evidence that the use of a multiple measures, data analytics placement system contributes to better outcomes for students, including those from all the demographic groups analyzed. Yet, the (relatively few) students who were bumped down into developmental courses through the alternative system fared worse, on average, than they would have under business-as-usual placement. This suggests that colleges should consider establishing placement procedures that allow more incoming students to enroll in college-level courses.

Document Details

Publication Type
Report
Date
October 2020
Barnett, Elisabeth, Elizabeth Kopko, Dan Cullinan, and Clive Belfield. 2020. Who Should Take College-Level Courses? Impact Findings From an Evaluation of a Multiple Measures Assessment Strategy. New York: MDRC.