Measure for Measure
How Innovations in Course Placement are Helping Community College Students Succeed
In recognition of our 50th anniversary, we are highlighting examples of where the work of MDRC and its partners is making a difference.
Stephen Burke meets with dozens of students every semester, but he still vividly recalls a conversation he had with one student a few years ago.
Burke, a professor of English at Rockland Community College in Suffern, New York, had noticed that “Troy” struggled to stay awake in class. Beyond just nodding off, he would often slump forward in his seat, falling fast asleep for entire lectures. “What’s going on?” Burke finally asked Troy one day after class. “I’m feeling bad that I keep putting you to sleep. Is there something I can do to be more engaging?”
Troy apologized profusely. He and his girlfriend were expecting their first child, he explained, and he was working round-the-clock to support his growing family—a full-time job and a nighttime gig driving airport shuttles. Much as he enjoyed Burke’s class, he was exhausted.
Burke understood. He recognized that Troy—like many Rockland students—was barely scraping by, hoping a community college degree would open doors to a better life. “Keep coming to class, even if you fall asleep,” Burke told him. “We'll talk before or after class [to make sure you’re keeping up with the material].” With Burke’s help, Troy passed the class, bringing him one step closer to earning his degree.
For Burke, Troy’s story underscored a larger point: the importance of letting Rockland students take college-level courses, as opposed to remedial courses, whenever possible. Students like Troy had limited bandwidth, not to mention limited funds to pay for college. If Burke’s class had not brought Troy closer to completing his degree, perhaps he would have given up on the class, or given up on college altogether—something that happened all too often to community college students across the U.S., especially those stuck in remedial courses.
This insight led Burke and his colleagues at Rockland to embrace a practice called multiple measures assessments (MMA). MMA empowers community colleges to use evidence to place students into college-level, credit-bearing courses, recognizing that placement tests alone are imperfect predictors of readiness. As its name suggests, MMA uses multiple measures—such as high school grade point average (GPA), time since graduating high school, and courses completed during high school—to identify students who are likely to pass a college-level course despite scoring poorly on a placement test.
The success of MMA is a quintessential case of turning evidence into impact. By first building evidence that MMA works, then empowering more community college systems in more states to adopt the practice, MDRC and its partners have come together to help community colleges better serve students from disadvantaged backgrounds. In this sense, MMA builds on a long history of identifying promising levers for change—and then putting evidence into practice to pull those levers more effectively.
An Underappreciated Path Out of Poverty
Community colleges have long offered a path out of poverty for first-generation students and students from low-income backgrounds, though this was not always reflected in the attention they drew from researchers and policymakers. The 1970s witnessed a handful of scattered, foundation-funded initiatives to strengthen the antipoverty role of these institutions, such as helping community college students transition to four-year colleges. But it was not until the early 2000s that social policy researchers began rigorous testing of focused interventions designed to help decrease attrition rates among community college students and improve their prospects for long-term success. MDRC led this charge with its Opening Doors demonstration project, which tested innovations in curriculum and instruction, financial aid, and student services to see if they improved the odds of student success.
“[W]e developed a broader postsecondary education agenda, of which community colleges were the primary focus.”
Tom Brock, Community College Research Center
According to Tom Brock, Director of the Community College Research Center at Teachers College, Columbia University (and an MDRC alumnus), Opening Doors grew out of MDRC’s earlier work on welfare programs. Many welfare recipients, MDRC staff noticed, were trying to improve their lives through community college. “We quickly realized that the challenges that current or former welfare recipients had…completing these programs at community colleges were really the same kinds of challenges [facing many other] students at community colleges,” Brock said. “It was through that growing awareness, and through a lot of conversations at MDRC and with external experts…that we developed a broader postsecondary education agenda, of which community colleges were the primary focus.”
The success of Opening Doors laid the groundwork for new efforts to help disadvantaged community college students succeed. First there was Achieving the Dream: a wide-ranging collaboration of colleges, funders, sector-wide associations, and research organizations that focused on evidence-building, improvements in practice at participating colleges, public policy advocacy, and external outreach. (Achieving the Dream later spun off as an independent nonprofit.) Next came the renowned Accelerated Study in Associate Programs (ASAP), designed and implemented by the City University of New York (CUNY). Launched in 2007, ASAP supports students with wraparound support services ranging from assistance in purchasing textbooks and MetroCards to personalized tutoring and career advice. MDRC was a leading player in these developments, both as a founding member of Achieving the Dream and as a longtime partner with CUNY to evaluate and help replicate its ASAP model. And as MDRC’s research validated these and other interventions, they gained traction with funders and colleges.
Progress, in short, was underway.
Lurking in the background, though, was the problem of stubbornly high attrition rates among community college students. This could not be explained away by students transferring to other institutions, as longitudinal research by the U.S. Department of Education had already shown. One of the leading culprits causing attrition, it became increasingly clear, was remedial courses, which sapped the energies of hard-working students and made them more likely to drop out before earning the degree they longed to obtain. But the prospect of addressing this problem through changes in course placement, as opposed to supports to help students currently enrolled in remedial courses, had not yet become a focus for researchers.
That began to change in 2014, when Judith Scott-Clayton, a researcher at Columbia University’s Teachers College (and MDRC alumna), published a seminal paper on the limitations of placement tests at community colleges. By reviewing the high school transcripts of students whose placement scores had landed them in remedial courses, she was able to show that many such students would likely have succeeded in college-level courses. Her paper suggested that the use of alternative placement measures—such as high school GPAs—could address this flaw, ensuring that fewer students were held back from graduating by being unnecessarily stuck in remedial courses. MMA was born.
MDRC, then, did not invent MMA. It did, however, recognize the promise of a good idea early on, resolving to evaluate it so that—if it did work—it could be improved and scaled to help more people. So MDRC joined forces with the Community College Research Center (CCRC), with funding from the federal Institute of Education Sciences and Ascendium Education Group, to pilot MMA at community colleges in New York, Wisconsin, and Minnesota.
Testing, Testing
The first MMA pilot, launched in 2016, saw MDRC and CCRC partner with seven colleges in the State University of New York (SUNY) system, including Rockland Community College, where Stephen Burke teaches. Another overlapping MMA pilot took place at 10 colleges in Minnesota and Wisconsin. Both pilots were administered through the Center for the Analysis of Postsecondary Readiness, a partnership of CCRC and MDRC.
The pilots tested MMA in different ways. At SUNY, administrators used an algorithm created by CCRC and operationalized by MDRC staff to determine when students should be placed in college-level courses despite low placement test scores. In Minnesota and Wisconsin, schools used simpler decision rules whereby students with test scores and GPAs in certain ranges were automatically “bumped up” to college-level courses. While both pilots focused on high school GPA, they also examined other alternative measures, such as time since high school graduation and a “noncognitive assessment” that measured traits like motivation and study skills. Students were randomly placed into courses the traditional way or via MMA so that researchers could gauge the practice’s results.
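The decision-rule approach used in Minnesota and Wisconsin can be illustrated with a short sketch. Everything here—the function name, the test cutoff, and the GPA floor—is a hypothetical illustration, not the actual rule or thresholds used in the pilots:

```python
# Illustrative sketch of a simple MMA "decision rule" placement, loosely
# modeled on the bump-up logic described above. The cutoff values and
# names below are assumptions for demonstration only.

def place_student(test_score, hs_gpa, test_cutoff=70, gpa_floor=2.6):
    """Return 'college-level' or 'remedial' for a single student.

    Traditional placement looks only at test_score; the MMA rule also
    "bumps up" students whose high school GPA clears gpa_floor.
    """
    if test_score >= test_cutoff:
        return "college-level"
    if hs_gpa >= gpa_floor:
        # Bumped up via the alternative measure despite a low test score
        return "college-level"
    return "remedial"

# A student below the test cutoff but with a strong GPA is placed into
# the college-level course under MMA:
print(place_student(test_score=62, hs_gpa=3.1))  # college-level
print(place_student(test_score=62, hs_gpa=2.0))  # remedial
```

The point of the sketch is how little machinery a decision rule requires compared with a predictive algorithm, which is part of why the simpler approach appealed to colleges with limited staff capacity.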
Outcomes from both pilots were encouraging, if occasionally surprising. Students “bumped up” in English and math were more likely to complete college-level courses than students in the control group. In particular, they were likelier to complete “gatekeeper” college-level courses—such as introductory 101 courses required for their majors—than students placed in courses the traditional way. On the other hand, students “bumped down” via MMA—for instance, because they did well on the placement test but had low high school GPAs—fared worse in their enrollment and completion outcomes than the control group, suggesting that under-placement was a bigger problem than over-placement. MMA, in other words, should focus on students who could succeed in more challenging courses, rather than students who were placed into courses that might be too challenging for them.
Interestingly, the SUNY colleges that used the algorithm did not experience better results than the Wisconsin and Minnesota schools that used the simpler decision rules. And while high school GPA proved highly effective as an alternative measure, the noncognitive assessment was a weaker predictor of success. (Korinne Cikanek, a psychology professor and administrator who led the pilot’s implementation at Normandale Community College in Minnesota, suggested that this type of assessment—which is more complicated and involves more variables—could still prove effective if tried again in other contexts.)
Perhaps most counterintuitive was the use of time since high school graduation as a placement measure: students further removed from high school fared better when bumped into college-level courses than more recent graduates. This suggests that the advantages of greater maturity can outweigh any deficits in knowledge or skills from time spent out of school.
“It is better to let a student struggle a little and succeed, or be more likely to succeed, than to put them in a [remedial course].”
Dan Cullinan, MDRC
Why does MMA work?
MDRC staff believe it is not just a matter of flaws in standardized testing or the fact that some students perform badly on such tests despite doing well in school. It also has to do with students rising to the challenge when they are placed in college-level courses—perhaps because the chance to earn credits towards graduation incentivizes success—whereas remedial courses risk sapping hope and promoting burnout. “It is better to let a student struggle a little and succeed, or be more likely to succeed, than to put them in a [remedial course],” argues MDRC’s Dan Cullinan. Taking a remedial course means students “have to burn a whole semester. They have to spend their money and time, and then they have to come back, persist, enroll in the right class, and then finally get to do what they want. Some of them don’t get through those barriers.”
A Wider Lens
As MDRC has learned from decades of experience, it is not enough to tout promising results from a pilot. Impact means helping partners and policymakers—in this case, individual colleges and state-level systems—understand why they should adopt the practice, and then supporting them through implementation.
It helps that MMA holds intrinsic appeal for administrators. Not only does it increase the odds of student success; it is also an inexpensive, “very light-touch intervention,” as Cullinan put it, which makes it well-suited to colleges without the staff or financial resources for something more comprehensive. Indeed, MMA makes it optional to administer and track placement tests—which proved a useful feature during the COVID-19 pandemic, when in-person testing became all but impossible.
As part of its efforts to scale MMA in the last few years, MDRC has empowered advocates in academia and government not just with data, but also with the step-by-step planning they need to make the switch. For instance, MDRC worked with its partners from the MMA pilots to create a toolkit for faculty and administrators to understand the evidence behind MMA, make the case for its use at their college, and design their own MMA system.
MDRC is also working with partners, especially at the state level, to overcome local hurdles. For instance, Mason Campbell and Tracy Harrell of the Arkansas Division of Higher Education were excited to expand the use of MMA at Arkansas’s community colleges, especially after seeing the results from New York, Wisconsin, and Minnesota. Yet they met with a common objection from colleagues throughout the state: Arkansas is different. How can we be sure that the data from these other states mean MMA will work here? So MDRC helped them gather, analyze, and present data from Arkansas showing that MMA did, in fact, work there, too.
MDRC, Campbell explained, “was able to [help us] say, ‘It does not matter what school district this student came from. Their GPA is still a better indicator of success than their ACT score,’” which community colleges in Arkansas had been using as a placement test. “It’s been huge for us.”
To be sure, challenges remain, some of which suggest where efforts to refine the use of MMA should focus next. Many educators believe that COVID-era high school GPAs are less reliable, which has prompted colleges to accelerate their shift to a corequisite model, a trend already underway before the pandemic, in which students may take college-level courses while simultaneously enrolled in a corresponding remedial course. Evidence suggests this approach can be effective, with students more likely to persevere in remedial courses when they are able to take college-level courses at the same time. But colleges need more research and support to optimize the corequisite model and calibrate their use of MMA accordingly.
“We need a wider lens for thinking about students and how we can help them.”
Stephen Burke, Rockland Community College
Even so, MMA’s impact so far gives supporters much to celebrate. For Stephen Burke, it is a matter of equity—giving a fair shot to people like Troy, the student who had trouble staying awake in his class. “The old placement models using standardized testing…really support people who come from the right zip codes and go to the right schools,” he said. “They’re not any smarter than students [from other backgrounds] who are capable of success, but because of who their family is, where they live, and what schools they went to, they’re going to have a better chance of success [using] these kinds of traditional models. We need a wider lens for thinking about students and how we can help them.”