Tuesday, January 21, 2020

Back to the future: Alberta introduces performance-based funding


Earlier this week, Alberta announced it would be implementing a new funding model. This model links up to 40% of institutional funding to performance on various targets. The Minister claims “[t]his is a new and completely transformative funding model.” This claim both is and isn’t true.

It isn’t entirely true: Alberta (and other jurisdictions) experimented with performance-based funding in the late 1990s, and Ontario is currently returning to the approach. Alberta eventually abandoned performance-based funding, at least in part, because it was administratively burdensome and because institutions could do little to improve their performance where they fell short of targets.

The Minister’s statement is true in that the size and punitive approach of the new model have not been seen before in Alberta. The 1990s model provided a small amount of additional funding if performance targets were met. The current model places a significant amount of existing funding (up to 40% by 2023) in jeopardy if targets are not met.

The exact institutional performance metrics have not been decided (or so we’re told). But a list of possible measures was included:
  • graduate employment rate
  • median graduate income
  • graduate skills and competencies
  • work-integrated learning opportunities
  • administrative expense ratio
  • sponsored research revenue
  • enrolment (including potential targets for domestic students, international students and under-represented learners)
These metrics will, according to the Minister, “help ensure students are set up for success by encouraging institutions to produce job-ready graduates.”

It is hard to assess an incomplete proposal, but here are a few thoughts:

Performance measures are conceptual technologies, in that they shape what issues we think about and how we think about them. For example, assessing the employment rate and income of graduates emphasizes the labour-market training role performed by post-secondary institutions. Not assessing other roles (e.g., developing an informed and critical populace) obscures these other roles.

Continuing with this example, measuring graduates’ employment rates and incomes also suggests (however implicitly) that institutions have some control over these outcomes. Certainly institutions can (to some minor degree) influence these outcomes based upon which programs are offered and the career services institutions provide. But graduates’ behaviour and general economic conditions are likely to be far more important determinants of these outcomes.

These more important contributors to labour-market outcomes are entirely outside of institutional control. Yet, bizarrely, the government is suggesting making institutions accountable for outcomes they have little control over.

And institutions that fail to meet targets will be financially penalized (i.e., have less money with which to improve performance). Absent radical internal re-allocations (which will be difficult since grants will also shrink over time), these institutions will be unable to take meaningful steps to improve performance which, it must be remembered, is largely driven by factors outside of institutions’ control anyhow.

So where does such a system take us? Well, that depends on the details (that we don’t have).

One possibility is that performance targets are set such that institutions are already meeting them (or can easily meet them). That doesn’t seem in keeping with the UCP’s narrative that the public sector is inefficient. Further, it would mean the new performance metric system is just a bunch of additional government red tape that serves no real purpose (except maybe to make institutions more easily controlled by government).

A second possibility is that institutions re-allocate funding internally to ensure they meet challenging targets. Since funding is tight, this might include dumping programs that drag down institutional averages on performance metrics and expanding programs that push up averages. The exact outcomes are hard to predict and may run counter to government priorities. For example, during a downturn in oil and construction, the graduate employment measure may pressure institutions to curtail trades programs. I mention that example only because the government is constantly messaging the importance of getting more youth involved in the trades.

Another possibility is institutional gaming (which is often a response to performance measures). Institutions will, to the degree possible, prioritize measures that they do well on while de-emphasizing measures that they do poorly on. Institutions may also play some accounting games. For example, the ratio of administrative to academic expenses can be changed by recoding certain salaries and expenses as academic. This gaming and the broader enterprise of collecting and analyzing this data will, ironically, consume resources that could be better spent on educating students.
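To make the recoding example concrete, here is a minimal sketch (with entirely invented dollar figures for a hypothetical institution) of how reclassifying mixed-duty salaries from administrative to academic lowers the administrative expense ratio without any actual change in spending:

```python
def admin_expense_ratio(admin: float, academic: float) -> float:
    """Administrative expenses as a share of total expenses."""
    return admin / (admin + academic)

# Invented figures ($ millions) for a hypothetical institution.
admin, academic = 120.0, 480.0
before = admin_expense_ratio(admin, academic)  # 120 / 600 = 0.20

# Recode $30M of mixed-duty salaries from administrative to academic.
# Total spending is unchanged; only the labels move.
recode = 30.0
after = admin_expense_ratio(admin - recode, academic + recode)  # 90 / 600 = 0.15

print(f"before: {before:.0%}, after: {after:.0%}")  # before: 20%, after: 15%
```

The ratio improves by a quarter purely through reclassification, which is why expense-ratio metrics are so vulnerable to this kind of accounting game.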

A fourth possibility is that performance-based funding is designed to cause financially precarious institutions to fail. In the short term, institutions seeing operating grants decline would likely seek wage-rollbacks and layoffs. This government-induced financial crisis would reduce public-sector expenditures (seemingly a key government priority). In the longer-term, an inability to balance budgets would provide a pretext for closing or amalgamating institutions (I’m thinking particularly of rural colleges, which are electorally popular in Tory ridings but of unclear financial viability).

It will be interesting to see which performance measures Alberta and individual institutions come up with and how institutions’ behaviour changes in response to them.

-- Bob Barnetson
