Can The Demise of the COMPASS Placement Exam Lead to Improved Student Success at California Community Colleges? A Look at Some Relevant Research and Recent Developments

February 2016
William Silver, Professor of English, Evergreen Valley College—San Jose, California

(Note: The following article is not an official statement of the Academic Senate for California Community Colleges. The article is intended to engender discussion and consideration by local colleges, and each college is encouraged to conduct its own research regarding issues such as student placement and curriculum redesign.)

ACT will be taking the COMPASS test off the market in the fall of 2016. As a replacement, California’s Common Assessment Initiative (CAI) will complete development of its statewide online placement exams (CCCAssess) so they are available for use just as the COMPASS is discontinued—in time for spring 2017 registration. The CAI team is also working on a companion project to develop multiple measures using the Cal-PASS Plus data system of high school transcripts and performance. To better understand how changes in student placement could affect student success, one should be familiar with some relevant research.

The relevant research began with Clifford Adelman’s seminal research at the U.S. Department of Education in 2005 and Thomas Bailey, Dong Wook Jeong, and Sung-Woo Cho’s 2008 study of low student completion rates in developmental education courses, published by the Community College Research Center at Columbia University. Writing about a national sample of Achieving the Dream community college students, Bailey noted that “between 33 and 46 percent of students, depending on the subject area, referred to developmental education actually complete their entire developmental sequence.” He also pointed out that most of these students exited their required developmental course sequence because they did not enroll in an initial or subsequent course, not because they failed or withdrew from a course in which they were enrolled.

“between 33 and 46 percent of students … actually complete their entire developmental sequence.”

Building on this work, research by Moore and Shulock at CSU Sacramento, published in 2010, looked at enrollment patterns of California community college students. Their research found that students who reach certain progress milestones have much better rates of completing certificates and degrees or transferring. For example, if students pass college English within their first two years of study, their success rate rises from 20% to 50%. If they accumulate at least 20 units of credit in their first year of study, their success rate rises from 21% to 59%. Students gain “momentum” by reaching these progress milestones, the authors pointed out, leading to the term Momentum Points used in the CCC Student Success Scorecard. These research studies have helped inform a growing movement in California and across the country aimed at improving student success. Efforts at colleges and universities to accelerate students’ progress through developmental education, using “stretch” courses or compressed course sequences, illustrate this growing movement.

 “if they pass college English within their first two years of study, their success rate rises from 20% to 50%.”

Many community colleges offer separate, parallel developmental reading and writing courses, some beginning three levels below college English, or six levels below for ESL students. Developmental math courses typically begin three levels below college math. Depending on their placement test score, students may be required to complete several lengthy developmental course sequences before they can take transfer-level gatekeeper courses or take courses that fulfill their educational and career goals. To help accelerate students’ progress, some colleges have folded reading instruction into writing or content courses, thereby reducing the number of required courses.

The more time students take to complete their preparatory coursework, the greater the chances that family and financial obligations will interfere. A U.S. Department of Education study from 2007-2008 noted that six in ten community college students worked more than 20 hours per week, and over 25% worked 35 hours per week or more. When combined with family and financial obligations, developmental skills course requirements can make it difficult for students to reach the success milestones that Moore and Shulock described.

The starting point and duration of a student’s developmental program are decided in large part by a placement test—most often either the COMPASS or ACCUPLACER tests. Relying heavily on a single measure to determine placement, even a test instrument that has been used for many years and is approved by the state, is a doubtful practice, especially when placement is so important to a student’s potential for success and completion. In fact, recent studies have found serious problems with the accuracy of both tests. Belfield and Crosta’s 2012 study confirmed the results of prior research, noting that “the tests do not have much explanatory power across a range of measures of performance including college GPA, credit accumulation, and success in gatekeeper English and math classes.”

The COMPASS and ACCUPLACER “tests do not have much explanatory power across a range of measures of performance including college GPA, credit accumulation, and success in gatekeeper English and math classes.”

A closer look at course-level cut-off scores highlights the problem. Belfield and Crosta found “severe” error rates using cut-off scores in a large statewide community college system: approximately 30% of students in English were “misaligned,” with somewhat lower misalignment in math. Judith Scott-Clayton found similar error rates in her 2012 study of placement exams: in math, 75% of the placement errors put students in a course that was too low, and in English, 85% of misplaced students were placed too low.

In another example, at one typical California community college, 20% of all the COMPASS scores the college reported for 2011 were just one to three points below a cut-off score. Academic counselors sometimes override the test score, placing an individual student higher when other relevant information about the student supports that determination, although this practice is sometimes cited as a challenge to discipline faculty expertise and judgment. Still, more students in this group of 20% of incoming test-takers, or in the 30% with misaligned test scores, might reach progress milestones and achieve success if they started their English or math course sequence one level higher than their placement score prescribed.

20% of all the COMPASS scores reported by one typical college were just one to three points below a cut-off score.

Title 5 of the California Code of Regulations requires that “When using an English, mathematics, or ESL assessment for placement, it must be used with one or more other measures to comprise multiple measures.” The need for multiple measures makes good educational sense. Students can be out of practice when they take the placement test, harried by registration obligations, or unaware of the importance of the test. For these reasons, colleges are increasingly helping students to prepare for the placement test or letting them re-take it if needed.

In addition, colleges are increasingly experimenting with the use of a student’s high school GPA to predict future academic performance. Belfield and Crosta showed that, “Alone, HS GPA was a better predictor of college performance than all other measures put together.” The authors also pointed out that the number of foundation math and English courses taken in high school correlated with college performance. The value of high school GPA can also be seen in the number of colleges and universities making SAT scores optional and using high school GPA instead for admission purposes. The National Association for College Admission Counseling reported negligible differences in reliability between admissions decisions based on high school GPA and those based on SAT scores.

“Alone, HS GPA was a better predictor of college performance than all other measures put together.”

The Research & Planning Group for California Community Colleges, working in partnership with the CAI team, has proposed using GPA data in conjunction with the forthcoming CCCAssess placement exams, encouraging colleges to place students according to whichever indicator is higher: high school GPA or CAI exam. Support for this approach can be found in research done by Scott-Clayton in 2012, also at Columbia University’s research center. Working with a sample of 42,000 students from an urban community college system, she found that, “allowing students to test out of remediation based on the best of either their placement scores or high school achievement could substantially lower remediation rates (by 8 percentage points in math and 12 percentage points in English) without compromising success rates in college-level coursework.”

The availability of high school GPA data relies on widespread participation in Cal-PASS Plus.  Even with such participation, high school data is not available for some students who enter community colleges. The absence of high school grade records is a practical problem that all community colleges would have to face.

Taken as a whole, the research suggests that significant numbers of students are not correctly placed when entering community colleges, and that some students could begin their developmental course sequence at a higher level.

Taken as a whole, the research suggests that significant numbers of students are not correctly placed when entering community colleges because of placement tests and that some students could begin their developmental course sequence at a higher level. However, the prevailing paradigm over the last few decades has been to increase the number of developmental courses offered, with the intention of giving underprepared students the fullest possible preparation. Is it possible that the proverbial pendulum has swung too far in that direction? Should it reverse direction? Such a reversal could result from the impending changes in placement at California community colleges. Any community college that finds the relevant research compelling and follows the CAI recommendation is likely to see some of its students placed higher and moving more quickly through required developmental courses, reaching progress milestones sooner. This could have a salutary impact on student success, especially if appropriate academic support services are available to students.

What should California community colleges do to anticipate changes in placement testing? And how might colleges develop a more robust use of multiple measures, possibly including high school GPA, to place students accurately? We would be well advised to answer those questions sometime during the 2015-2016 academic year.

The articles published in the Rostrum do not necessarily represent the adopted positions of the academic senate. For adopted positions and recommendations, please browse this website.