Another Way to Look at Learning Outcomes
At our Fall 2002 Plenary Session the Academic Senate once again expressed through its resolutions strong objection to the new standards adopted by the Accrediting Commission for Community and Junior Colleges. Resolution 2.01 F02, asserting that the Commission has cited no evidence demonstrating that current measures of student learning are inadequate, urges community college faculty to refrain from developing outcome measures simply to satisfy the Commission's dictates. Resolution 2.03 F02 asks faculty to document the costs of gathering measurable student learning outcomes (MSLOs) that satisfy the Commission's new standards, costs that are expected to be a considerable drain on all of our colleges at this time of fiscal downturn. Resolution 2.06 F02 urges local senates to recommend that "scarce college resources" be used for professional development instead of for setting up means to satisfy the blanket use of MSLOs required by the new standards. And Resolution 2.10 F02 calls for faculty resistance to the imposition of MSLOs on faculty and, in particular, to their use in faculty evaluation.
Yet it is important that we recognize that student learning outcomes are not in themselves the target of Senate opposition. Faculty highly value the appropriate use of data tracking the success of their students, and they use those data to evaluate their efforts and improve programs. (In fact, had the new standards included the phrase "where faculty have determined it to be appropriate" in most of the places where they call for the use of MSLOs, they would probably have received far less contumely.) All responsible teachers use measures that reflect student learning. The question is not whether we value and agree to use MSLOs, but rather who determines the MSLOs and who decides how and when we use them.
The new standards suggest that MSLOs be used to evaluate every major activity, a stance that goes far beyond what even the advocates of MSLOs would agree is reasonable. The standards suggest that everything important that happens at a college can be reflected in objective measures. They leave no room for transmitting the values that we find most important to a liberal education, such as the development of curiosity, respect for other cultures, independent thinking, and an appreciation of scholarship (see the AAUP paper "Mandated Outcome Measures"). Instead this new approach asks only for evidence that our students have mastered skills, like solving an equation for two unknowns or identifying a gerund, which they will soon forget when those skills go unused. Attending to MSLOs to the exclusion of all else thus forces us to ignore the unmeasurable qualities of the educational process.
On the other hand, we certainly do need to know the level of skills mastery our students have attained before we encourage a basic skills writing student to attempt freshman composition. We must have measures in place that assure students who enroll in a college algebra class that they have a reasonable expectation of succeeding. It is at the basic skills level that our need for valid and reliable student learning measures is most critical to the success of our students, because these courses are designed primarily to develop the basic literacy and numeracy skills necessary for students to succeed in college-level course work.
Happily we can report that the development and use of data to increase student success has been growing. We have evidence that faculty-driven assessment has been used to improve basic skills programs at a number of California community colleges. Chaffey College's Basic Skills Transformation Project stands as a model of how well-designed assessment can improve an instructional program. The project was designed to increase the rates of success of the seventy percent of Chaffey's freshmen who have been assessed as under-prepared for college work (according to the Academic Senate's Basic Skills Survey completed in 2002, the majority of entering freshmen at most California community colleges require some basic skills instruction). The project is a college-wide effort that includes reorganizing programs and services, restructuring curricula, reforming student assessment and placement, expanding academic support services by creating three College Success Centers and four additional multidisciplinary centers, and introducing innovative teaching practices in classes. From the outset, research has been an integral part of this project. Faculty and administration wanted to know how many students were affected; to what degree the project improved rates of retention, course success, and persistence; how different demographic groups responded to the project, including historically under-represented students, students with limited English proficiency, and students of all ages; and whether the project helped students who were not under-prepared.
The Chaffey Project is now in its third year (it was planned to be fully implemented in five), and the data have been extremely useful in indicating successes. Of course, data cannot show causal relationships with unquestionable reliability. Human behavior is far too complex to be measured by any tests, let alone those that we can afford. But when faculty use student learning outcome data carefully, those data can certainly provide an important indication of the health of instructional and student service programs.
The Chaffey College Basic Skills Transformation Project will be presented at the 2003 Spring Plenary Session on May 1, 2003. Please visit our program online for exact time and location.
The articles published in the Rostrum do not necessarily represent the adopted positions of the Academic Senate. For adopted positions and recommendations, please browse this website.