Metric Fatigue: Reforming Metrics to Facilitate Meaningful Institutional Dialog

April 2018
John Stanskas, ASCCC Vice President

In the ever-expanding desire for data-driven discussion and accountability, every new initiative tied to funding has produced another set of metrics to measure our colleges’ effectiveness.  The Institutional Effectiveness Partnership Initiative (IEPI) work group on indicators counted 86 distinct metrics used throughout our system as required by the Strong Workforce Program, Student Equity, the Student Success and Support Program (SSSP), Basic Skills, Chancellor’s Office accountability measures and system goals, and IEPI indicators, each of which can be disaggregated by equity measures.  While it is useful to have a variety of data cataloged and accessible to inform college discussions, it is unreasonable to expect 86 different measurements to effectively drive meaningful institutional dialog in strategic planning and improvement.

In December, the IEPI Indicators workgroup recommended to the Chancellor’s Office that simplifying the metrics provided to colleges is an important reform if the desire is to use metrics to facilitate local goal-setting and improvement.  The recommendation is not to delete data but to organize it and identify just a handful of meaningful areas for colleges to evaluate.  A small number of metrics can drive significant institutional dialog and planning.

The Chancellor’s Office responded to IEPI’s request by forming the Metrics Simplification Workgroup headed by Vice Chancellor Omid Pourzanjani with representation from consultative bodies including the Academic Senate.  The Workgroup agreed to the following set of values:

  • Metrics should shift the emphasis from recording activities to highlighting student journeys, from recruitment to completion.
  • Metrics should incentivize behavior that leads to desired student outcomes, with the goal of identifying the highest-leverage data points that will foster student progress.
  • Metrics should be chosen based on system goals, including the Vision for Success, equity, and Guided Pathways, and not on what has been tracked historically, such as academic divisions or funding sources.
  • The metrics should be limited in number to promote clarity of focus and should replace existing dashboards and the Student Success Scorecard.
  • Metrics should be based on data points that come from statewide data systems, such as the Management Information Systems (MIS), rather than being reported by colleges using supplemental systems.

For example, one data element might be a measure of student engagement, defined as the proportion of students who participated in one or more comprehensive support services (Extended Opportunity Programs and Services, Umoja, Disabled Student Programs and Services, MESA, etc.) offered by the college in a given year.  When the college evaluates such a measure during institutional planning, it may choose to set a goal for maintaining or improving the proportion of students participating in these programs.  To inform such a decision, the underlying data for each support program would be available in a multidimensional data tool to guide institutional dialog about where such improvement might occur.

In addition, equity and institutional conversations around equity need to be infused throughout any college planning dialog, not confined to one committee or one report.  It is challenging to make progress on addressing equity gaps unless every metric that drives institutional planning conversations can also be displayed by student population.  For example, when evaluating our measure of student engagement in the context of institutional planning and goal-setting, the measured percentage should be able to pivot into an array of data sorted by student population.  This may better guide dialog about who is underserved by the existing structures of the college and where resources and innovation may need to be directed.

The Metrics Simplification Workgroup is expected to complete its recommendations by May 2018.  There are interactive webinars and conversations scheduled for various constituent groups throughout this term, with the final webinar scheduled for April 30. 

The articles published in the Rostrum do not necessarily represent the adopted positions of the Academic Senate. For adopted positions and recommendations, please browse this website.