What Does Proficiency Look Like on the ACCJC Rubric?
Since 2007, the Accrediting Commission for Community and Junior Colleges (ACCJC) has clearly stated its expectation that colleges be at the "Sustainable Continuous Quality Improvement" level for program review and planning on the rubrics that ACCJC has provided. (See http://www.sdmesa.edu/instruction/accreditation/pdf/2-ACCJC-Memo-Barbara... for the rubrics.) Moreover, ACCJC has told colleges that they must be at the "Proficiency" level for student learning outcomes (SLOs) by 2012. Some colleges that have recently hosted site visits or completed their midterm reports have been asked to document how they will reach proficiency in their SLOs by 2012.
What does "Proficiency" look like on the ACCJC rubric? There are eight characteristics (the following are quoted directly from the rubric):
- Student learning outcomes and authentic assessment are in place for courses, programs and degrees.
- Results of assessment are being used for improvement and further alignment of institution-wide practices.
- There is widespread institutional dialogue about the results.
- Decision-making includes dialogue on the results of assessment and is purposefully directed toward improving student learning.
- Appropriate resources continue to be allocated and fine-tuned.
- Comprehensive assessment reports exist and are completed on a regular basis.
- Course student learning outcomes are aligned with degree student learning outcomes.
- Students demonstrate awareness of goals and purposes of courses and programs in which they are enrolled.
In examining each of these characteristics, where would you place your college? What would your college need to do in order to achieve each of them? How do these characteristics interact with the rubrics for program review and planning?
At the SLO Institute held in July 2009, one general session attempted to address these questions. The purpose of the session was to examine what being at the "Proficiency" level of the SLO rubric would look like. Earlier that day, the overwhelming majority of attendees agreed that their colleges had reached the "Development" level on the SLO rubric.
As attendees sat at tables, each table was given a single element from the "Proficiency" level of the rubric. The attendees were then asked to assess where they thought their colleges were in meeting that particular element. What they discovered has significant implications for all colleges as they move forward with SLOs and assessment. Several participants discovered that they could be at the "Development" level for one element of the rubric yet at the "Sustainable Continuous Quality Improvement" level for another. For example, several colleges specifically link SLOs to their program reviews, which is at the "Sustainable Continuous Quality Improvement" level. Yet many of these same colleges still have faculty and staff fully engaged in SLO development, which is at the "Development" level.
The other significant discovery at the SLO Institute provides some understanding of why program review and planning processes rank so high among the reasons for colleges being placed on sanction. Most colleges currently facing sanctions have been cited for not being at the "Sustainable Continuous Quality Improvement" level of the "Rubric for Evaluating Institutional Effectiveness-Part I: Program Review" or the "Rubric for Evaluating Institutional Effectiveness-Part II: Planning." A third issue, governance, is a factor as well.
When examining the rubrics at the "Sustainable Continuous Quality Improvement" level, which colleges are currently expected to have reached for program review and planning, it becomes clear that unless colleges have moved toward the "Proficiency" level with their SLOs and assessment processes, the three rubrics have difficulty working in an integrated fashion. Because so many colleges are at the "Development" stage on the SLO rubric, their ability to be at the "Sustainable Continuous Quality Improvement" level in program review and its link to planning could be hampered. For example, at the "Proficiency" level for SLOs, two key elements of the rubric are "Student learning outcomes and authentic assessment are in place for courses, programs and degrees" and "Results of assessment are being used for improvement and further alignment of institution-wide practices." Moreover, at the "Sustainable Continuous Quality Improvement" level for SLOs, one key element is that "Learning outcomes are specifically linked to program reviews." Only if a college has achieved these three elements is it likely to have a "consistent and continuous commitment to improving student learning; and educational effectiveness is a demonstrable priority in all planning structures and processes" (taken from the "Sustainable Continuous Quality Improvement" level of the planning rubric) or "program review processes [that] are ongoing, systematic and used to assess and improve student learning and achievement" (taken from the "Sustainable Continuous Quality Improvement" level of the program review rubric).
As faculty at local colleges tackle the work of becoming proficient in their SLOs and assessment, applying the rubrics to their own processes will highlight the areas in which more work is needed as well as those in which they do well. By breaking down the rubrics for all three areas, faculty and colleges will have a clearer idea of what they need to do in order to meet the proficiency requirement by 2012. And perhaps along the way, the number of colleges receiving sanctions for program review and planning might begin to decline.
The articles published in the Rostrum do not necessarily represent the adopted positions of the academic senate. For adopted positions and recommendations, please browse this website.