Academic prioritization at CU Boulder involves two steps: (1) a categorization of each academic unit based on quantitative data and metrics; and (2) analysis of that categorization by the faculty committee tasked with program review.

Step 1: Quantitative Categorization

In 2013-2014, the Academic Affairs Budget Advisory Committee (AABAC) received a charge to develop a simple and understandable academic prioritization construct that:

  1. Uses quantitative data and metrics to shed light on programmatic efficiencies, space use, academic reputation, and teaching effectiveness on a unit-by-unit basis;
  2. Provides insights into how each academic unit contributes to the strategic goals of the campus;
  3. May be refreshed on an annual basis;
  4. Augments rather than replaces existing periodic Academic Review and Planning Advisory Committee (ARPAC) review of academic programs.

AABAC developed a prioritization tool that examines ten program characteristics for each degree-granting academic unit. Each of these ten characteristics is itself a combination of several measures: for instance, expenditures relative to student credit hours taught, or research funding relative to space allocation. A separate scale is applied to each of the ten characteristics, which are grouped into four clusters: resource efficiency, degree production, scholarly accomplishments, and undergraduate teaching effectiveness. Scores within each cluster are averaged to reach a final prioritization score. In addition, the uniqueness of a program’s degree offerings is examined, although this factor does not enter directly into the final academic prioritization score.
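
As a rough illustration of this cluster-and-average structure, the sketch below computes a final score by averaging hypothetical characteristic scores within each cluster and then averaging the four cluster scores. The characteristic names, their assignment to clusters, the 1-5 scale, and the equal weighting of clusters are all illustrative assumptions, not the tool's actual measures or weights.

  from statistics import mean

  # Hypothetical assignment of ten characteristics to the four clusters; the
  # actual tool's characteristics, measures, and scales differ.
  CLUSTERS = {
      "resource efficiency": ["characteristic_1", "characteristic_2", "characteristic_3"],
      "degree production": ["characteristic_4", "characteristic_5"],
      "scholarly accomplishments": ["characteristic_6", "characteristic_7"],
      "undergraduate teaching effectiveness": ["characteristic_8", "characteristic_9", "characteristic_10"],
  }

  def prioritization_score(unit_scores):
      """Average characteristic scores within each cluster, then average the
      four cluster scores into a single final score (illustrative only)."""
      cluster_scores = {
          cluster: mean(unit_scores[c] for c in characteristics)
          for cluster, characteristics in CLUSTERS.items()
      }
      cluster_scores["final"] = mean(cluster_scores[cluster] for cluster in CLUSTERS)
      return cluster_scores

  # Example: one unit's scaled scores (here assumed to be on a 1-5 scale).
  example_unit = {f"characteristic_{i}": s
                  for i, s in enumerate([3.2, 4.1, 2.8, 3.9, 4.5, 3.0, 3.7, 4.2, 3.5, 3.8], start=1)}
  print(prioritization_score(example_unit))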

Between the last academic prioritization exercise (2014) and the current one (2018), the academic prioritization tool was refined to better reflect the standing of CU Boulder units relative to cognate units at peer institutions. In particular, the data used for scholarly accomplishment were narrowed so that CU Boulder faculty were compared only to faculty at AAU peer institutions. This change makes the data more reflective of CU Boulder’s expectation that its faculty achieve prominence in scholarship and creative work equal to that of faculty at the top institutions in the United States. Specifically, (1) faculty in PhD-granting units were compared only to those in cognate PhD-granting units at AAU institutions, not to those at all PhD-granting institutions; (2) faculty in CU Boulder units that do not grant the PhD were compared to faculty in PhD-granting units at peer institutions (since we expect our faculty to be competitive with AAU peer faculty in scholarly accomplishment even if their unit does not offer the PhD); and (3) faculty in departments that cover more than one field (for example, Mathematics and Applied Mathematics) were disaggregated so that faculty in each field were compared to faculty in the same field at peer institutions.
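
As a hypothetical sketch of how such comparison groups might be assembled (the data model, field names, and function below are illustrative assumptions, not the tool's actual implementation), faculty in each field would be compared only against cognate, PhD-granting units at AAU institutions, with a multi-field department split so that each field gets its own peer pool:

  from dataclasses import dataclass

  @dataclass
  class PeerUnit:
      institution: str
      field: str
      is_aau: bool
      grants_phd: bool

  def comparison_pools(fields, peer_units):
      """For each field, keep only cognate PhD-granting units at AAU
      institutions; a multi-field department is compared field by field
      against its own pool (illustrative only)."""
      return {
          field: [u for u in peer_units if u.field == field and u.is_aau and u.grants_phd]
          for field in fields
      }

  # A department covering two fields is disaggregated into per-field pools.
  peer_units = [
      PeerUnit("Peer University A", "Mathematics", is_aau=True, grants_phd=True),
      PeerUnit("Peer University B", "Applied Mathematics", is_aau=True, grants_phd=True),
      PeerUnit("Peer University C", "Mathematics", is_aau=False, grants_phd=True),
  ]
  print(comparison_pools(["Mathematics", "Applied Mathematics"], peer_units))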

Further details may be found in the Components of the Academic Prioritization Tool.

Step 2: Analysis

Because the complexity and character of academic programs on the Boulder campus are highly variable, a prioritization based on applying a homogeneous set of metrics to a very heterogeneous set of programs should not be read at a superficial level (e.g., reduced to a single score or metric). Rather, the value of the prioritization process lies in the in-depth analysis of categorical metrics, which can be used to better understand how a specific academic unit contributes to the mission of the CU Boulder campus. CU Boulder has made this analysis part of the periodic review of academic programs. Thus, the next step in academic prioritization is a qualitative review by the Academic Review and Planning Advisory Committee (ARPAC). In analyzing the results obtained by the academic prioritization tool, ARPAC considers recent program reviews of each unit, changes each unit has made in response to ARPAC’s recommendations, and how each unit’s performance has changed over time.

The information on this site reflects both the results obtained via the academic prioritization tool and ARPAC’s analysis of the prioritization results.