Some Updates on the State Performance Funding Data Project

Last December, I publicly announced a new project with Justin Ortagus of the University of Florida and Kelly Rosinger of Pennsylvania State University that would collect data on the details of states' performance-based funding (PBF) systems. We have spent the last nine months diving even deeper into policy documents and obscure corners of the Internet, as well as talking with state higher education officials, to build our dataset. Now is a good chance to come up for air for a few minutes and provide an update on the project and where it is heading.

First, I'm happy to share that data collection is moving along pretty well. We gave a presentation at the State Higher Education Executive Officers Association's annual policy conference in Boston in early August and made some great connections there with people from additional states. We are getting close to having a solid first draft of a 20-plus-year dataset on state-level policies, and we are working hard to build institution-level datasets for each state. As we discuss in the slide deck, our painstaking data collection process is leading us to question some of the prior typologies of performance funding systems. We will have more to share on that in the coming months, but going back to get data on early PBF systems is quite illuminating.

Second, our initial announcement about the project included a one-year, $204,528 grant from the William T. Grant Foundation to fund our data collection efforts. We recently received $373,590 in funding from Arnold Ventures and the Joyce Foundation to extend the project through mid-2021. This will allow us to build a project website, analyze the data, and disseminate results to policymakers and the public.

Finally, we have learned an incredible amount about data collection over the last couple of years working together as a team. (And I couldn't ask for better colleagues!) One thing we learned is that there is little guidance for researchers on how to collect the types of detailed data needed to provide useful information to the field. We decided to write up a how-to guide on data collection and analyses, and I'm pleased to share our new article on the topic in AERA Open. In this article (which is fully open access), we share some tips and tricks for collecting data (the Wayback Machine might as well be a member of our research team at this point), as well as how to conduct difference-in-differences analyses with continuous treatment variables. Hopefully, this article will encourage other researchers to launch similar data collection efforts while helping them avoid some of the missteps that we made early in our project.
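For readers less familiar with that approach, here is a stylized sketch of what a difference-in-differences model with a continuous treatment looks like (a generic specification for illustration, not the exact model from the article):

$$Y_{it} = \beta \, \text{Dose}_{it} + \alpha_i + \tau_t + \varepsilon_{it}$$

Here $Y_{it}$ is an outcome for institution $i$ in year $t$, $\text{Dose}_{it}$ is the continuous treatment (for example, the share of an institution's state funding tied to performance), and $\alpha_i$ and $\tau_t$ are institution and year fixed effects. Instead of comparing a treated group to an untreated group, $\beta$ is identified from changes in treatment intensity within institutions over time.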

Stay tuned for future updates on our project, as we will have some exciting new research to share throughout the next few years!

Three New Articles on Performance-Based Funding Policies

As an academic, few things make me happier than reading cutting-edge research conducted by talented scholars. So I was thrilled to see three new articles on a topic near and dear to my heart—performance-based funding (PBF) in higher education—come out in top-tier journals. In this post, I briefly summarize the three articles and look at where the body of research is heading.

Nathan Favero (American University) and Amanda Rutherford (Indiana University). "Will the Tide Lift All Boats? Examining the Equity Effects of Performance Funding Policies in U.S. Higher Education." Research in Higher Education.

In this article, the authors look at state PBF policies (divided into earlier 1.0 policies and later 2.0 policies) to examine whether PBF affects four-year colleges within a state differently. They found evidence that the wave of 2.0 policies may negatively affect less-selective and less-resourced public universities, while 1.0 policies affected colleges in relatively similar ways. In a useful Twitter thread (another reason why all policy-relevant researchers should be on Twitter!), Nathan discusses the implications for equity.

Lori Prince Hagood (University System of Georgia). “The Financial Benefits and Burdens of Performance Funding in Higher Education.” Educational Evaluation and Policy Analysis.

Lori's article digs into the extent to which PBF policies affect per-student state appropriations at four-year colleges, defining PBF as whether a state had any funded policy in a given year. The first item worth noting from the paper is that per-student funding in PBF states has traditionally been lower than in non-PBF states. This may change going forward, as states with more generous funding (such as California) are now adopting PBF policies. Lori's main finding is that selective and research universities tend to see increased state funding following the implementation of PBF, while less-selective institutions see decreased funding, raising concerns about equity.

As an aside, I had the pleasure of discussing an earlier version of this paper at the 2017 Association for the Study of Higher Education conference (although I had forgotten about that until Lori sent me a nice note when the article came out). I wrote in my comments at that time: “I think it has potential to go to a good journal with a modest amount of additional work.” I’m not often right, but I’m glad I was in this case!

Denisa Gándara (Southern Methodist University). “Does Evidence Matter? An Analysis of Evidence Use in Performance-Funding Policy Design.” The Review of Higher Education.

Denisa’s article is a wonderful read alongside the other two because it does not use difference-in-differences techniques to look at quantitative effects of PBF. Instead, she digs into how the legislative sausage of a PBF policy is actually made by studying the policy processes in Colorado (which adopted PBF across two-year and four-year colleges) and Texas (which never adopted PBF in the four-year sector). Her interviews reveal that PBF models in other states and national advocacy groups such as Complete College America and HCM Strategists were far more influential than lowly academic researchers.

In a Twitter thread about her new article, Denisa highlighted one statement from the paper in particular, on the limited influence of academic research in PBF policy design.

As a fellow researcher who also talks with policymakers on a regular basis, I have quite a few thoughts on that statement. Policymakers (including in blue states) are increasingly hesitant to give colleges more money without tying a portion of those funds to student outcomes, and other ways of funding colleges raise equity concerns of their own. So expect PBF to expand over the next several years.

Does this mean that academic research on PBF is irrelevant? I don’t think so. Advocacy organizations are at least partially influenced by academic research; for example, see how the research on equity metrics in PBF policies has shaped their work. It is the job of researchers to keep raising critical questions about the design of PBF policies, and it is also our job to conduct more nuanced analyses that dive into the details of how policies are constructed. That is why my new project with Kelly Rosinger of Penn State and Justin Ortagus of the University of Florida to collect these details over time excites me so much—it is what the field needs to keep building upon great studies such as the ones highlighted here.

Announcing a New Data Collection Project on State Performance-Based Funding Policies

Performance-based funding (PBF) policies in higher education, in which states fund colleges in part based on student outcomes instead of enrollment measures or historical tradition, have spread rapidly across states in recent years. This push for greater accountability has resulted in more than half of all states currently using PBF to fund at least some colleges, with deep-blue California joining a diverse group of states by developing a PBF policy for its community colleges.

Academic researchers have flocked to the topic of PBF over the last decade and have produced dozens of studies looking at the effects of PBF both at the national level and for individual states. In general, this research has found modest effects of PBF, with some differences across states, sectors, and how long the policies have been in place. There have also been concerns about the potential unintended consequences of PBF on access for low-income and minority students, although new policies that provide bonuses to colleges that graduate historically underrepresented students show promise in mitigating these issues.

In spite of the intense research and policy interest in PBF, relatively little is known about what is actually in these policies. States vary considerably in how much money is tied to student outcomes, which outcomes (such as retention and degree completion) are incentivized, and whether there are bonuses for serving low-income, minority, first-generation, rural, adult, or veteran students. Some states also give bonuses for STEM graduates, which is even more important to understand given this week’s landmark paper by Kevin Stange and colleagues documenting differences in the cost of providing an education across disciplines.

Most research has relied on binary indicators of whether a state has a PBF policy or an incentive to encourage equity, with some studies trying to get at the importance of the strength of PBF policies by looking at individual states. But researchers and advocacy organizations cannot even agree on whether certain states had PBF policies in certain years, and no research has tried to fully catalog the different strengths of policies (“dosage”) across states over time.

Because collecting high-quality data on the nuances of PBF policies is a time-consuming endeavor, I was just about ready to walk away from studying PBF given my available resources. But last fall at the Association for the Study of Higher Education conference, two wonderful colleagues approached me with an idea to go out and collect the data. After a year of working with Justin Ortagus of the University of Florida and Kelly Rosinger of Pennsylvania State University—two tremendous assistant professors of higher education—we are pleased to announce that we have received a $204,528 grant from the William T. Grant Foundation to build a 20-year dataset containing detailed information about the characteristics of PBF policies and how much money is at stake.

Our dataset, which will eventually be made available to the public, will help us answer a range of policy-relevant questions about PBF. Some particularly important questions are whether dosage matters for student outcomes, whether different types of equity provisions are effective in reducing educational inequality, and whether colleges respond to PBF policies differently based on what share of their funding comes from the state. We are still seeking funding to do these analyses over the next several years, so we would love to talk with interested foundations about the next phases of our work.

To close, one thing that I tell often-skeptical audiences of institutional leaders and fellow faculty members is that PBF policies are not going away anytime soon and that many state policymakers will not give additional funding to higher education without at least a portion being directly tied to student outcomes. These policies are also rapidly changing, in part driven by some of the research over the last decade that was not as positive toward many early PBF systems. This dataset will allow us to examine which types of PBF systems can improve outcomes across all students, thus helping states improve their current PBF systems.

New Research on Equity Provisions in State Performance Funding Policies

Previous versions of state performance-based funding (PBF) policies were frequently criticized for encouraging colleges to simply become more selective in order to get more state funding (see a good summary of the research here). This raises potential equity concerns, as lower-income, first-generation, adult, and racial/ethnic minority students often need additional supports to succeed in college compared to their more advantaged peers.

With the support of foundations and advocacy organizations, the most recent wave of state PBF policies has often included provisions that encourage colleges to enroll traditionally underrepresented students. For example, Indiana now gives $6,000 to a college if a low-income student completes a bachelor’s degree; while this is far less than the $23,000 that the college gets if a student completes their degree in four years, it still provides an incentive for colleges to change their recruitment and admissions practices. Today, at least sixteen states provide incentives for colleges to serve underrepresented students.

Given the growth of these equity provisions, it is not surprising that researchers are now turning their attention to these policies. Denisa Gándara of SMU and Amanda Rutherford of Indiana University published a great article in Research in Higher Education last fall looking at the effects of these provisions among four-year colleges. They found that the policies were at least somewhat effective in encouraging colleges to enroll more racial/ethnic minority and lower-income students.

As occasionally happens in the research world, multiple research teams were studying the same topic at the same time. My article was accepted at The Journal of Higher Education a few days before theirs was released. My article is now available online (the pre-publication version is here), and my findings are generally similar: PBF policies with equity provisions can, at the very least, help reduce incentives for colleges to enroll fewer at-risk students.

The biggest contribution of my work is how I define the comparison group in my analyses. The treatment group is easy to define (colleges that are subject to a PBF policy with equity provisions), but comparison groups often lump together colleges that face PBF without equity provisions and colleges that are not subject to PBF at all. By dividing those two types of colleges into separate comparison groups, I can dig deeper into how the provisions of performance funding policies affect colleges. And I did find some differences in the results across the two comparison groups, which highlights the importance of constructing them carefully.
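As a stylized illustration of that grouping logic (with hypothetical column names, not the actual analysis files), the setup looks something like this:

```python
import pandas as pd

# Hypothetical institution-year panel; column names are illustrative only.
panel = pd.DataFrame({
    "unitid":     [1001, 1001, 1002, 1002, 1003, 1003],
    "year":       [2014, 2015, 2014, 2015, 2014, 2015],
    "pbf":        [1, 1, 1, 1, 0, 0],   # subject to any PBF policy
    "equity_pbf": [1, 1, 0, 0, 0, 0],   # PBF policy includes equity provisions
})

# Treatment group: colleges facing PBF with equity provisions.
treated = panel[panel["equity_pbf"] == 1]

# Two separate comparison groups rather than one pooled group:
comparison_pbf_only = panel[(panel["pbf"] == 1) & (panel["equity_pbf"] == 0)]
comparison_no_pbf = panel[panel["pbf"] == 0]
```

Running the analysis separately against each comparison group makes it possible to distinguish the effect of adding equity provisions to an existing PBF system from the effect of facing PBF at all.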

Much more work still needs to be done to understand the implications of these new equity provisions. In particular, more details are needed about which components are in a state’s PBF system, and qualitative work is sorely needed to help researchers and policymakers understand how colleges respond to the nuances of different states’ policies. Given the growing group of scholars doing research in this area, I am confident that the state of PBF research will continue to improve over the next few years.

(Still) Don’t Dismiss Performance Funding Research

I like the idea of funding public colleges and universities based in part on their former students' outcomes—and I'm far from the only one. Something in the ballpark of three dozen states have adopted some sort of performance-based funding (PBF) system, with more states currently discussing the idea. Given that many states fund colleges based on a combination of enrollment levels and historical allocations that can be woefully out of date, tying some funding to outcomes has an intuitive appeal.

However, as a researcher of accountability policies in higher education, I am concerned that some colleges may be responding to PBF in unintended ways. At this point, as I briefly summarized in a recent piece at The Conversation, there is evidence that PBF may adversely affect access to college for moderately prepared students as well as the types of postsecondary credentials awarded. My newest contribution is a recently published article in the Journal of Education Finance finding that both two-year and four-year colleges subject to PBF saw less Pell Grant revenue than colleges not subject to PBF.

Since that article finished the peer review and copy editing processes and was posted online two weeks ago, I've been expecting a response from one of the largest organizations advocating for PBF. HCM Strategists, a DC-based advocacy group that is quite effective in lobbying and policy development, has traditionally been a strong supporter of PBF. (Disclaimer: I've received funding from them for a project on a different topic in the past.) In 2013, an HCM director responded to a high-quality paper by David Tandberg and Nick Hillman (later published in JEF) with an Inside Higher Ed piece called "Don't Dismiss Performance Funding" that labeled the research "flawed" and "simplistic," neither of which is particularly true. I responded with a blog post called "Don't Dismiss Performance Funding Research," taking issue with their critique.

Today, HCM director Martha Snyder has a much more nuanced IHE essay on Luke's and my work entitled "Jumping to Conclusions," saying that our work should not be used "to draw any meaningful conclusions" about PBF. Snyder discusses what she perceives as some of the limitations of our work. The most notable one is that multiple types of PBF policies are lumped together in the analyses. That is necessary due to data limitations: there is no comprehensive archive of the nuances of PBF plans prior to the early 2010s. However, general trends in PBF policies across states are partially captured by the year fixed effects in the regression, a standard practice in panel analyses.
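To make that concrete, here is a generic sketch of the kind of two-way fixed effects specification involved (an illustration of the general approach, not our exact model):

$$Y_{it} = \beta \, \text{PBF}_{it} + \alpha_i + \tau_t + \varepsilon_{it}$$

The year fixed effects $\tau_t$ absorb anything that changes over time for all institutions at once, such as nationwide shifts in the policy environment or federal aid, so $\beta$ is identified from within-institution changes relative to those common trends.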

Snyder also suggests that some states have been encouraging students to enroll in community colleges, which is definitely the case (although somewhat less so prior to 2012-13, the last year of our analysis given the pace at which new data become available). If this shift were driving our results, it would explain decreases in per-FTE Pell revenue at four-year colleges, but it should also increase Pell revenues at two-year colleges. Instead, we saw nearly identical negative point estimates in both sectors, which raises further cause for concern. (Could students be shifting to for-profit colleges instead? I can't really tell with federal data, but a state-level analysis here would be great.)

I appreciate HCM's work in helping states implement more modern funding programs, but it is imperative that influential policy organizations work with the research community before drawing any meaningful conclusions about the potential unintended consequences of PBF—especially as the stakes become higher for students and colleges alike. The small but growing body of literature on colleges' responses to PBF suggests that collaboration among interested parties would be far more productive than attempting to dismiss findings from peer-reviewed research suggesting that caution may be in order. I'm happy to do what I can to summarize the literature on unintended consequences while working to move policy discussions about future versions of PBF forward.

Fewer Poor Students Are Being Enrolled in State Universities – Here's Why

This post initially appeared at The Conversation, and is co-authored with my Seton Hall colleague Luke Stedrak.

States have traditionally provided funding for public colleges and universities based on a combination of the number of students enrolled and how much money they were allocated previously.

But, in the face of increasingly tight budgets and pressures to demonstrate their effectiveness to legislators, more and more states are tying at least some higher education funding to student outcomes.

As of 2015, 32 states have implemented funding systems that tie money in part to student performance at some or all of their colleges. In such states, a portion of state funding is based on metrics such as the number of completed courses or the number of graduates.

Research shows that performance-based funding (PBF) has not moved the needle on degree completions in any substantial way. Our research focuses on the unintended consequences of such funding policies – whether colleges have responded to funding incentives in ways that could hurt disadvantaged students.

We find evidence that these systems may be reducing access for low-income students at public colleges.

Just a popular political strategy?

What is performance-based funding (PBF)? And does it improve college completion rates?

Performance funding, the idea of tying funding to outcomes instead of enrollment, was first adopted in Tennessee in 1979. It spread across the country in waves in the 1990s and 2000s, with some states dropping and adding programs as state budget conditions and political winds changed. In this decade, several states have implemented systems tying most or all of state funding to outcomes.

By basing funding on outcomes such as course completions and the number of degrees awarded, PBF has become a politically popular strategy to improve student outcomes. It has received strong support from the Bill & Melinda Gates Foundation and the Lumina Foundation – two big players in the higher education landscape.

However, the best available evidence suggests that PBF systems generally do not move the needle on degree completions in any substantial way.

For example, a study of Washington state's PBF program by Nick Hillman of Wisconsin, David Tandberg of Florida State and Alisa Hicklin Fryar of the University of Oklahoma showed no effects on associate degree completion at two-year colleges. The study found positive effects on short-term certificates in technical fields, but those credentials are not as valuable in the labor market.

When Tandberg and Hillman conducted a nationwide study, they found no effect overall of PBF programs on degree completions at two-year and four-year colleges.

However, the small number of PBF programs that had been in effect for at least seven years (giving colleges plenty of time to change their practices in response) did appear to increase the number of bachelor’s degrees awarded by a few percentage points.

More selectivity and lower standards

While there is no significant evidence of positive impacts, there have been many unintended consequences of these policies.

There is a growing body of evidence, for example, that shows that colleges may be trying to change both their student body and their academic standards in order to meet the state’s performance goals as well as their own priorities.

A research team at Teachers College that interviewed administrators in three states with "high-stakes" PBF systems (Indiana, Ohio and Tennessee) found that colleges facing PBF were both becoming more selective in accepting students and lowering academic standards for current students in an effort to graduate more of them.

A new study by Mark Umbricht and Frank Fernandez at Penn State and Justin Ortagus at the University of Florida used data on incoming students to show that Indiana colleges increased selectivity in response to PBF.

They estimated that Indiana colleges lowered admissions rates by nearly 10 percent and increased ACT scores by nearly a full point compared to similar colleges in other states.

In our research, published recently in the Journal of Education Finance, we examined whether public two-year and four-year colleges nationwide changed how they either received or spent money in response to performance funding systems.

We found that colleges generally did not change spending on instruction or research, but they did see significantly less revenue from federal Pell Grants that are primarily given to students with family incomes below US$60,000 per year, suggesting fewer low-income students enrolled. We estimated a statistically significant decline in Pell revenue of about 2 percent at both two-year and four-year colleges.

We also found that four-year colleges offered more institutional grant aid, potentially in the form of merit-based scholarships to attract higher-income students with a greater likelihood of success.

Implications for policy

Although research suggests that performance funding systems have not been particularly effective in increasing the number of degrees that public colleges grant, the fact is that PBF is being adopted in more states. For example, five more states have adopted PBF since 2014, with additional states debating whether to adopt plans of their own.

We believe this trend is unlikely to go away anytime soon.

And many states’ existing funding systems are highly inequitable. They favor research universities over less-selective colleges, even though less-selective colleges enroll the lion’s share of low-income students.

States should consider placing provisions in both their enrollment-based and performance-based funding systems to encourage colleges to continue enrolling an economically diverse student body.

Several states, such as Arkansas, Ohio and Florida, provide additional incentives for graduating Pell Grant recipients. But states need to ensure that these additional funds are sufficient to encourage colleges to enroll academically qualified students from low-income families as well.

To do this, states would need to take three concrete steps. First, states should provide incentives for colleges not to raise admissions standards beyond what is needed to succeed in coursework. Second, they could provide additional funds for graduating students who require a modest amount of remedial coursework (courses that build the skills of less-prepared students before they take college-level classes).

And finally, it is important that state policymakers and college leaders have honest conversations about the goals of PBF systems and what colleges need to improve their performance. This could help reduce the unintended outcomes.

Don't Dismiss Performance Funding Research

Performance-based funding (PBF), in which at least a small portion of state higher education appropriations are tied to outcomes, is a hot political topic in many states. According to the National Conference of State Legislatures and work by Janice Friedel and others, 22 states have PBF in place, seven more are transitioning to PBF, and ten more have discussed a switch.

The theory of PBF is simple: if colleges are incentivized to focus on improving student retention and graduation rates, they will redirect effort and funds from other areas to do so. PBF should work if two conditions hold:

(1) Colleges must currently be using their resources in ways that do not strongly correlate with student success, a point of contention with many institutions. If colleges are already operating in a way that maximizes student success, then PBF will not have an impact. PBF could also have negative effects if colleges end up using resources less effectively than they currently are.

(2) The expected funding tied to performance must be larger than the expected cost of changing institutional practices. Most state PBF systems currently tie small amounts of state appropriations to outcomes, which could mean the benefits of responding are smaller than the costs of making changes. Colleges also need to be convinced that PBF systems will be around for the long run instead of until the next governor ends the plan or a state budget crisis cuts any funds for PBF. Otherwise, they may choose to wait out the current PBF system and not make any changes. Research by Kevin Dougherty and colleagues through the Community College Research Center highlights the unstable nature of many PBF systems.
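To make the second condition concrete with purely hypothetical numbers: suppose a college receives $50 million in state appropriations, of which 5 percent ($2.5 million) is tied to performance, and suppose meeting the performance targets would require roughly $4 million in new advising and program costs. Even before accounting for the chance that the system is repealed, the expected benefit falls short of the cost:

$$\underbrace{p \times \$2.5\text{M}}_{\text{expected funding gained}} \;<\; \underbrace{\$4\text{M}}_{\text{cost of changing practices}}, \qquad 0 \le p \le 1$$

where $p$ is the probability that the college both hits the targets and sees the system survive long enough to pay out. Under these illustrative numbers, a rational college would wait the policy out.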

For these reasons, the expected impacts of state PBF plans on student outcomes may not be positive. A recent WISCAPE policy brief by David Tandberg, an assistant professor at Florida State University, and Nicholas Hillman, an assistant professor at the University of Wisconsin-Madison, examines whether PBF plans appear to affect the number of associate's and bachelor's degrees awarded by institutions in affected states. Their primary findings are that although some states had statistically significant gains in degrees awarded (four at the four-year level and four at the two-year level), other states had significant declines (four at the four-year level and five at the two-year level). Moreover, PBF was most effective in inducing additional degree completions in states with long-running programs.

The general consensus in the research community is that more work needs to be done to understand the effects of state performance-based funding policies on student outcomes. PBF policies differ considerably by state, and it is too early to evaluate the impact of policies on states that have recently adopted the systems.

For these reasons, I was particularly excited to read the Inside Higher Ed piece by Nancy Shulock and Martha Snyder entitled, “Don’t Dismiss Performance Funding,” in response to Tandberg and Hillman’s policy brief. Shulock and Snyder are well-known individuals in the policy community and work for groups with significant PBF experience. However, their one-sided look at the research and cavalier assumptions about the authors’ motives upset me to the point that writing this response became necessary.

First of all, ad hominem attacks about the motives of well-respected researchers should never be a part of a published piece, regardless of the audience. Shulock and Snyder’s reference to the authors’ “surprising lack of curiosity about their own findings” is both an unfair personal attack and untrue. Tandberg and Hillman not only talk about the eight states with some positive impacts, they also discuss the nine states with negative impacts and a larger number of states with no statistically significant effects. Yet Shulock and Snyder do not bother mentioning the states with negative effects in their piece.

Shulock and Snyder are quite willing to attack Tandberg and Hillman for a perceived lack of complexity in their statistical model, particularly regarding their lack of controls for “realism and complexities.” In the academic community, criticisms like this are usually followed up with suggestions on how to improve the model given available data. Yet they fail to do so.

It is also unusual to see a short policy brief like this receive such a great degree of criticism, particularly when the findings are null, the methodology is not under serious question, and the authors are assistant professors. As a new assistant professor myself, I hope that this sort of criticism does not deter untenured faculty and graduate students from pursuing research in policy-relevant fields.

I teach higher education finance to graduate students, and one of the topics this semester was performance-based funding and accountability policy. If Shulock and Snyder submitted their essay for my class, I would ask for a series of revisions before the end of the semester. They need to provide empirical evidence in support of their position and to accurately describe the work done by Tandberg and Hillman, who deserve to have their research fairly characterized in the public sphere.

New Recommendations for Performance-Based Funding in Wisconsin

Performance-based funding for Wisconsin’s technical colleges is at the forefront of Governor Walker’s higher education budget for the next biennium. In previous blog posts (here, here, and here), I have briefly discussed some of the pros and cons of moving to a performance-based funding model for a diverse group of postsecondary institutions.

This week, Nick Hillman, Sara Goldrick-Rab, and I released a policy brief with recommendations for performance-based funding in Wisconsin through WISCAPE. In the brief, we discuss how performance-based funding has operated in other states, as well as recommendations for how to operate PBF in Wisconsin. Our key points are the following:

(1) Performance-based funding seeks to switch the focus from enrollment to completion.

(2) Successful performance-based funding starts small and is developed via collaboration.

(3) Colleges with different missions should have different performance metrics.

(4) Multiple measures of success are necessary to reduce the possibility of perverse incentives.

Wisconsin's proposal appears to meet some of these key points, but some concerns do remain. My primary concern is the speed with which funding will shift to performance—from 10% in 2014-15 to 100% by 2019-20. This may not give colleges enough time to change their practices, so the timeline should be revisited as needed.

Technical Colleges Debate Tying Funding to Job Placement

In advance of Wisconsin Governor Scott Walker’s budget address tomorrow evening, last week’s release of plans to tie state funding for technical colleges to performance measures has generated a great deal of discussion. One of the most discussed portions of his plan (press release here) is his proposal to tie funding to job placement rates, particularly in high-demand fields. Most colleges seem to support the idea of getting better data on job placement rates, but using that measure in an accountability system has sparked controversy.

Madison Area Technical College came out last week in opposition to the Governor’s proposal, as covered by a recent article in the Capital Times. The article mentions comments by provost Terry Webb that job placement rates are partially influenced by factors outside the college’s control, such as job availability, location, and individual preferences. These concerns are certainly real, especially given the difficulty of tracking students who may leave the state in search of a great job opportunity.

However, Gateway Technical College came out in support of funding based on job placement rates, according to an article in the Racine Journal Times (hat tip to Noel Radomski for the link). Gateway president Bryan Albrecht supports the plan on account of the college’s high job placement rates among graduates (85%, among those who responded to a job placement survey with a 78% response rate, although only 55% were employed in their field of study). The college seems confident in its ability to change programs as needed in order to keep up with labor market demands, even in the face of a difficult economy in southeast Wisconsin.

The differing reactions of these two technical colleges show the difficulty of developing a performance-based funding system that works for all stakeholders. Madison College, along with three other technical colleges in the state, has liberal arts transfer programs with University of Wisconsin System institutions. These students may graduate with an associate's degree and not immediately enter the labor market, or they may transfer before completing the degree. The funding system, which will be jointly developed by the Wisconsin Technical College System and the state's powerful Department of Administration, should keep those programs in mind so as not to unfairly penalize colleges with dual vocational/transfer missions.

More on Wisconsin’s Workforce Development Proposal

Today, Wisconsin Governor Scott Walker released more information about his proposal to improve the state’s workforce development system through an additional $100 million in state appropriations. These proposals have the potential to affect the priorities of Wisconsin institutions of higher education, particularly the Wisconsin Technical College System. While most of the key points of the proposal are directly from his special workforce development commission’s report last August (see my analyses here and here), the additional details provided in this press release provide more concrete information about the Governor’s soon-to-be-released budget proposal.

Three items in Gov. Walker’s proposal are in legislation separate from the state budget: workforce training grants to a mix of colleges, businesses and economic development organizations, a new Office of Skills Development to administer the grants, and a labor market information system designed to help link students and workers to available jobs and track labor market trends. The labor market information system has the potential to provide high school and college students with information that can help them decide their course of study, but getting the information to students in a timely manner may be difficult. It can be a useful tool for high school juniors who want to figure out a possible career, but it may be four or five years before the student is ready to go into the workforce. A lot can happen in that period of time. In any case, these records should be linked to K-12 and higher education datasets so the effectiveness of the new system can be evaluated.

The big change in higher education policy comes from the proposed shift to performance-based funding (PBF) in the Wisconsin Technical College System. Under PBF, colleges are funded based on outcomes (such as graduation and job placement rates) instead of based on enrollment or other historical factors. This plan starts with 10% of base funding being used for PBF in 2014-15, rising to 100% by 2020. Although other states have similar plans to completely shift to PBF, I am skeptical that a majority of funding will ever be tied to performance for political reasons. (Note that if Gov. Walker serves a second term and declines to run for a third, he would leave office in January of 2019—before this takes effect.)

Few details are currently available about the proposed funding formula for WTCS, as it will be developed by WTCS and the state Department of Administration. But the press release does note that the formula will prioritize job placement and enrollment in high-demand programs, something which is likely to be opposed by WTCS campuses with strong university transfer programs (such as Madison Area Technical College). These concerns will likely be kept in mind as a PBF system is developed.

Finally, the press release calls for the development of a common core of 30 credits (approximately ten courses) that will be fully transferable across the UW System, WTCS, and participating Wisconsin private colleges. This will likely be opposed by a number of UW System universities as a loss of autonomy and a perceived lowering of academic standards. I would expect the common core to be mandated, but some colleges will attempt to deny full transferability of certain courses; for example, a college algebra class at a technical college might be classified as an elective math credit at a UW System university instead of as college algebra.

Governor Walker’s budget address will take place on February 20, and I will have a complete analysis of his higher education programs later this week. More details may be released before that time, such as in this unusual Sunday press release.