Don’t Dismiss Performance-Based Funding Research

Performance-based funding (PBF), in which at least a small portion of state higher education appropriations are tied to outcomes, is a hot political topic in many states. According to the National Conference of State Legislatures and work by Janice Friedel and others, 22 states have PBF in place, seven more are transitioning to PBF, and ten more have discussed a switch.

The theory of PBF is simple: if colleges are incentivized to focus on improving student retention and graduation rates, they will redirect effort and funds from other areas to do so. PBF should work if two conditions hold:

(1) Colleges must currently be using their resources in ways that do not strongly correlate with student success, a point of contention for many institutions. If colleges are already operating in a way that maximizes student success, then PBF will not have an impact. PBF could even have negative effects if colleges end up using resources less effectively than they currently do.

(2) The expected funding tied to performance must be larger than the expected cost of changing institutional practices. Most state PBF systems currently tie only small amounts of state appropriations to outcomes, which can leave the cost of making changes larger than the benefits. Colleges also need to be convinced that PBF systems will be around for the long run, rather than only until the next governor ends the plan or a state budget crisis cuts any funds for PBF. Otherwise, they may choose to wait out the current PBF system and make no changes. Research by Kevin Dougherty and colleagues through the Community College Research Center highlights the unstable nature of many PBF systems.
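
To put that calculus in rough terms (my own illustrative notation, not drawn from any of the work cited here), a college would only respond to PBF if

$p \times B > C$

where $B$ is the funding tied to performance, $C$ is the cost of reorganizing institutional practices, and $p$ is the college’s perceived probability that the PBF system will survive long enough to pay out. Small performance allocations shrink $B$, and political or budgetary instability shrinks $p$; either can push the expected benefit below the cost of responding.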

Given these conditions, the expected impacts of state PBF plans on student outcomes may not be positive. A recent WISCAPE policy brief by David Tandberg, an assistant professor at Florida State University, and Nicholas Hillman, an assistant professor at the University of Wisconsin-Madison, examines whether PBF plans appear to affect the number of associate’s and bachelor’s degrees awarded by institutions in affected states. Their primary findings are that although some states had statistically significant gains in degrees awarded (four at the four-year level and four at the two-year level), other states had significant declines (four at the four-year level and five at the two-year level). Moreover, PBF was most effective in inducing additional degree completions in states with long-running programs.

The consensus in the research community is that more work needs to be done to understand the effects of state performance-based funding policies on student outcomes. PBF policies differ considerably by state, and it is too early to evaluate their impact in states that have only recently adopted them.

For these reasons, I was particularly interested to read the Inside Higher Ed piece by Nancy Shulock and Martha Snyder entitled “Don’t Dismiss Performance Funding,” written in response to Tandberg and Hillman’s policy brief. Shulock and Snyder are well-known figures in the policy community and work for organizations with significant PBF experience. However, their one-sided look at the research and cavalier assumptions about the authors’ motives troubled me enough that I felt compelled to write this response.

First, ad hominem attacks on the motives of well-respected researchers should never be part of a published piece, regardless of the audience. Shulock and Snyder’s reference to the authors’ “surprising lack of curiosity about their own findings” is both an unfair personal attack and untrue. Tandberg and Hillman not only discuss the eight states with positive impacts; they also discuss the nine states with negative impacts and the larger number of states with no statistically significant effects. Yet Shulock and Snyder do not bother to mention the states with negative effects in their piece.

Shulock and Snyder are quite willing to attack Tandberg and Hillman for a perceived lack of complexity in their statistical model, particularly their lack of controls for “realism and complexities.” In the academic community, criticisms like this are usually followed by suggestions for how to improve the model given the available data. Shulock and Snyder offer none.

It is also unusual to see a short policy brief receive this much criticism, particularly when the findings are largely null, the methodology is not under serious question, and the authors are assistant professors. As a new assistant professor myself, I hope this sort of criticism does not deter untenured faculty and graduate students from pursuing research in policy-relevant fields.

I teach higher education finance to graduate students, and one of the topics this semester was performance-based funding and accountability policy. If Shulock and Snyder submitted their essay for my class, I would ask for a series of revisions before the end of the semester. They need to provide empirical evidence in support of their position and to describe the work done by Tandberg and Hillman accurately. Tandberg and Hillman deserve to have their research fairly characterized in the public sphere.

Author: Robert

I am an assistant professor at the University of Tennessee, Knoxville, who studies higher education finance, accountability policies and practices, and student financial aid. All opinions expressed here are my own.
