What the Heck is Basic Skills Coding About, Anyway? Or Recoding Basic Skills Courses to Track and Improve Student Success
The Basic Skills Initiative (BSI) has awakened an understanding of, and a motivation to examine, data in order to determine the effectiveness of our basic skills efforts and to guide improvements in student success based upon evidence. The Legislature requires annual publication of these data on student success and progression in the Accountability Reporting for Community Colleges (ARCC), so reporting is mandatory. Beyond simple accountability, however, we have found the information valuable locally. Recently, along with granting funding for the new Basic Skills Initiative, the Legislature required an additional supplemental report on specific basic skills metrics.
Tracking student success and progress depends upon codes assigned to courses to indicate how many levels below transfer each course lies; this coding element is called CB21.
However, upon reviewing the coding of course levels below transfer, faculty and researchers discovered that courses were frequently coded incorrectly, producing erroneous information about student progression. Some institutions with average basic skills success rates showed abysmal student progress to the next course, a pattern particularly noticeable in ESL course sequences. In fact, the Chancellor's Office has known about this problematic coding for the last decade. The core problem was a disconnect between the curriculum being taught and the people coding these classes in each college's MIS (Management Information Systems). And now the Academic Senate and the Basic Skills Initiative have come to the rescue!
We embarked on a project to provide information about the curriculum content at each level of basic skills courses in order to help colleges code their courses more accurately, thereby producing more valid data. How did we do this? The Senate gathered faculty in the disciplines of English, reading, mathematics and ESL to talk about the credit courses below transfer. The result was a set of rubrics that generally define the skills at each of these course levels across the state. In addition, guidelines were developed to explain how these rubrics are designed to be used and how they are NOT designed to be used. The guidelines are included on the next page.
On October 16 and 17, 2008, a group of 140 intrepid faculty from 56 California Community Colleges gathered to learn about the collection of basic skills data and the MIS coding. Patrick Perry, the Vice Chancellor of TRIS (Technology, Research and Information Systems), and Carole Bogue-Feinour, the Vice Chancellor of Academic Affairs, explained the difficulties with these codes and the impact of the inaccurate data on the colleges.
The faculty were also provided background information collected through research by discipline experts about discipline-specific content. Those discipline experts reviewed the ICAS (Intersegmental Committee of Academic Senates) competencies and the IMPAC (Intersegmental Major Preparation Articulation Curriculum) documents in order to determine the entry and college level skills already defined and agreed upon across California's public colleges. In addition, existing standards for California were reviewed, such as CATESOL's (California Teachers of English to Speakers of Other Languages) California Pathways document, California Department of Education standards, CMC3 (California Mathematics Council, Community Colleges) and AMATYC (American Mathematical Association of Two-Year Colleges) mathematics standards, and others. A nationwide scan was conducted to look for course descriptors, exit competencies, or standards, and professional organizations were queried for help, particularly where no existing standards or descriptions were available. Finally, a recent Academic Senate survey was used to determine the most common number of course levels below transfer in each discipline statewide.
Armed with all this important information, those attending were then divided into groups based upon their teaching expertise and experience in English, mathematics, reading or ESL. For many attending, it was their first opportunity to talk with discipline faculty from across the entire state! Each group first determined the common number of course levels below transfer within its discipline. Existing CB21 coding allows for only three described levels below transfer; anything lower falls into a nondescript fourth category. Each discipline independently determined whether this number was appropriate and, if not, what needed to change. Then, with great care and deliberation, faculty discussed the skills in each level. Here is what happened:
- English faculty described three levels below English 1A (Freshman Composition) and worked diligently to describe a fourth level, but were unsure of its usefulness and content. They created a rubric based upon the major skills, or exit competencies, common to these course levels, and decided to write the rubric contents in outcomes language to indicate that these skills are what a student can do at the end of each level.
- Reading faculty described four levels below transfer, with distinct skills and philosophies built into each level of their rubric. Because most nationwide research on reading is described by grade level, reading faculty initially created descriptions with grade equivalencies, but were not committed to keeping these in the final rubric.
- Mathematics faculty described a four-level rubric beginning with basic mathematics and going up to Intermediate Algebra. Although these courses were previously fairly well-defined in CB21, faculty found the discussion about the skills and how they related to each course very helpful. The mathematics rubric still needs input as to the location of non-algebra courses such as geometry.
- ESL faculty decided to use English 1A or transferable ESL courses as the description of the transferable level. However, because ESL encompasses multiple distinct skill areas, they developed three rubrics in line with the CATESOL methodology: writing; reading; and speaking and listening. The ESL writing rubric is in draft form and ready for comment from other faculty, whereas the ESL reading and ESL speaking and listening rubrics are still being finalized.
The ESL faculty felt that they needed six levels to accommodate the progress of students in California credit ESL courses. The Senate survey revealed much greater variety in the number of levels below transfer for ESL than for the other disciplines: some colleges had as few as two or three levels, while others had as many as nine. However, six levels appeared to be the most common and the most easily defined. This will require major changes in the coding metrics, because it goes beyond the present design, which allows for only four levels. ESL data are also among the most inconsistent, and faculty made strong arguments that these levels are needed, given our population of students, if we want to measure progress accurately.
At the end of the meeting, many of the faculty reviewed the MIS data coding for their own college's basic skills courses; the majority reported that the coding was incorrect for their institution. In conjunction with the rubrics, faculty knowledgeable about curricular levels will be trained as local resources to guide discussion and facilitate recoding based on the curriculum.
The rubrics created over those two days in October were developed as DRAFTS and are meant to be discussed throughout the state over the next six months. In addition to gathering responses from discipline faculty, the Senate will ask for direct input from professional organizations in each discipline. We will also seek feedback on the guidelines explaining how to use, and how NOT to use, the rubrics. The background information, DRAFT rubrics, guidelines and current CB21 coding for colleges can be found at http://www.cccbsi.org/bsi-rubric-information
So what is next? As stated earlier, the Senate will ask discipline experts to review the rubrics and submit any comments. We will also submit the rubrics for comment to the professional groups CATESOL, ECCTYC (English Council of California Two-Year Colleges), CRLA (College Reading and Learning Association), and CMC3 (California Mathematics Council, Community Colleges). The Senate voted to support this process at its Fall Plenary Session with resolution 9.02 F08. Further discussions will occur throughout the winter and spring, and when the Academic Senate meets again at the 2009 Spring Plenary, we will seek to adopt the final rubrics. If the delegates choose to adopt them, then the Senate will teach faculty how to use the rubrics to advise recoding of their basic skills courses using this faculty-designed protocol. This recoding will use the rubrics as a guide, but local colleges will code their courses as they feel is best for their institution. In addition, because CB21 is a separate code, the level coding will not influence whether a course is classified as basic skills, degree applicable, or transferable for elective credit. We hope to finalize this training in how to use the rubrics at the Curriculum Institute in July 2009.
What will happen then? We will actually get data on how our students are progressing and where we may need to help them, and we will all understand basic skills progression better ourselves. The Legislature and our institutions will have accurate data, and all of us can work together to help our students succeed!
Guidelines or Philosophy for the Use of the CB21 Rubrics
These DRAFT rubrics were the result of collegial input from 140 faculty in mathematics, English, ESL and reading from across the state. The rubrics were created with the understanding that they would be vetted throughout the disciplines and discussed with the professional organizations associated with each discipline through April 2009. Once fully vetted, the rubrics will be considered for adoption at the ASCCC Spring Plenary Session.
The rubrics describe coding for basic skills levels. They DO NOT prescribe or standardize curriculum. They are not a comprehensive description of curricular activity in those courses, but rather describe a universal core of skills and abilities that the faculty could agree should be present at the end of each of those levels.
The level descriptions ARE NOT comprehensive. Individual colleges develop many other outcomes or skills in these courses that are not necessarily represented statewide and are therefore not included in the rubrics.
The rubrics DO NOT dictate anything regarding the classification of the course as to transferability, degree applicability or even coding as a basic skills course or not.
The rubrics ARE NOT the final authority. They are a referential guide representing what we have determined is common practice statewide; they do NOT dictate any course's assignment to any particular level. Coding of the course levels IS a local decision.
There is no obligation to use the CB21 coding as indicated in the rubric; it is merely a guide or reference indicating agreement among colleges in the state regarding a core commonality. Each college may code its basic skills courses appropriately to fit its student population, curriculum and program descriptions. If a basic skills course looks like level 2 on the rubric, but the college decides to code the course at level 1, level 3 or any other level, it may do so. This is a local decision.
Faculty, as discipline experts on their student audiences, will continue to develop and determine what they teach, retaining curricular and program primacy.
This process is not designed as an obstacle to curricular or programmatic development. It WAS developed as a data coding activity to improve the data reported to the Legislature concerning student success and improvement in basic skills.
When the process is completed, a protocol will be developed for recoding the basic skills levels. It will include local discipline faculty working collaboratively with the person who codes MIS curriculum elements at their college.
The articles published in the Rostrum do not necessarily represent the adopted positions of the academic senate. For adopted positions and recommendations, please browse this website.