Blog (Kelchen on Education)

Another Acceptance Angst Article

Having spent three years in a college admissions office, I know this is the time of year when some students find out whether they were accepted to the college(s) of their dreams. I am particularly annoyed by the New York Times’s “The Choice” blog, which is clearly aimed at students with academic credentials suitable for Ivy League institutions. My annoyance rises because the blog focuses its attention on such a small number of institutions, ones that are academically out of reach of nearly all students and perceived to be financially out of reach of almost everyone (although this is not the case).

The Choice annually follows a small group of students who apply to many of these highly selective institutions and are shocked when they receive a rejection letter. While I am glad that the blog now includes more students from geographically and economically varied backgrounds, most of the bloggers’ stories are still enough to cause angst for many well-prepared students. Take, for example, Leobardo Espinoza, Jr., from Topeka, Kansas. His most recent post was full of angst about being rejected by Washington University in St. Louis, one of the most selective colleges in the Midwest. Thankfully, he eventually realized that he had already been accepted by American, Amherst, and Bowdoin, as well as Kansas and Wichita State. But I am concerned that many readers will take the wrong impression from his post.

I am glad that the blog is finally featuring students who apply to at least a few local options. If students have a choice, I strongly recommend avoiding as much debt as possible along the road to a bachelor’s degree by staying in-state or attending private colleges with generous financial aid. Yes, getting rejected by one prestigious college stings, and it makes for great reading among the NYT’s elite readership. But it’s not the end of the world, and I think Mr. Espinoza has realized that in spite of the title of the article.

Improving Net Price Data Reporting

As the sticker price of attending colleges and universities has steadily increased over the past decade, researchers and policymakers have begun to focus on the actual price that students and their families face. The federal government collects a measure of the net price of attendance in its IPEDS database, which is calculated as the total cost of attendance (tuition, fees, room and board, and other expenses) less any grant aid received. (More information can be found on the IPEDS website.) I have used the net price measure in my prior work, including the Washington Monthly rankings and my previous post on the Net Price Madness tournament. However, the data do have substantial limitations—some of which could be easily addressed in the data collection process.
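
To make the definition concrete, here is a minimal sketch of the calculation in Python. The dollar figures are hypothetical and chosen only for illustration; they are not from any IPEDS report.

```python
# Net price = total cost of attendance minus all grant aid received.
# The figures below are hypothetical, not from any IPEDS report.
tuition_and_fees = 10_000
room_and_board = 9_000
other_expenses = 5_000          # books, supplies, transportation, etc.
grant_aid = 9_500               # federal + state + institutional grants

cost_of_attendance = tuition_and_fees + room_and_board + other_expenses
net_price = cost_of_attendance - grant_aid
print(f"Net price: ${net_price:,}")   # Net price: $14,500
```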

There are two different net price measures currently available in the IPEDS dataset—one for all students receiving grant aid (federal, state, and/or institutional) and one for students receiving any federal financial aid (grants, loans, or work-study). The average net price is available for the first measure, while the second measure breaks down the net price by family income (but does not report an average net price). For public institutions, both of these measures only include first-time, full-time, degree-seeking students paying in-state tuition, which can substantially limit the generalizability of the results.

Here, I use my current institution (the University of Wisconsin-Madison) as an example. The starting sample for IPEDS is the 3,487 first-time, full-time, degree-seeking freshmen who are in-state students. Of those students, net price by family income is calculated for the 1,983 students receiving Title IV aid. (This suggests that just over half of in-state Madison freshmen file the FAFSA.) Here are the net price and number of students by income group:

$0-30k: $6,363 (n=212)
$30-48k: $10,098 (n=232)
$48-75k: $15,286 (n=406)
$75-110k: $19,482 (n=542)
$110k+: $20,442 (n=591)

The average net price is calculated for a slightly different group of students—those who received grant aid from any source (n=1,858). The average net price is $14,940, which is lower than the average net price faced by students who file the FAFSA ($16,409), as the latter measure also includes students who receive no grant aid at all. However, the latter number is not reported in the main IPEDS dataset and can only be calculated by digging into the institutional reports.
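
As a rough check on that figure, the $16,409 average for Title IV aid recipients can be recovered as the enrollment-weighted average of the five income-group net prices listed above. Here is a minimal sketch of that calculation, using only the numbers already quoted in this post:

```python
# Net price and headcount by income group for in-state UW-Madison freshmen
# receiving Title IV aid (figures quoted in the post above).
groups = {
    "$0-30k":   (6_363, 212),
    "$30-48k":  (10_098, 232),
    "$48-75k":  (15_286, 406),
    "$75-110k": (19_482, 542),
    "$110k+":   (20_442, 591),
}

total_students = sum(n for _, n in groups.values())             # 1,983
weighted_sum = sum(price * n for price, n in groups.values())
average_net_price = weighted_sum / total_students

print(f"Students: {total_students:,}")                   # Students: 1,983
print(f"Average net price: ${average_net_price:,.0f}")   # roughly $16,409
```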

I would encourage IPEDS to add the average net price for all FAFSA filers to the dataset, as that better reflects what students from financially modest backgrounds will pay. Additionally, to account for the relatively small number of students who may have a family income of less than $30,000 and to tie into policy discussions, I would like to see the average net price for all Pell Grant recipients. These changes can easily be made given current data collection procedures and would provide more useful data to stakeholders.

The 2013 Net Price Madness Tournament

Millions and millions of Americans will be sitting on the couch over the next several weeks watching the NCAA college basketball tournaments—and I’ll be keeping an eye on my Wisconsin Badgers as the men’s team makes its way through the tournament. Those of us in the higher education community have made a variety of brackets highlighting different aspects of the participating institutions (see Inside Higher Ed’s brackets for the men’s and women’s tournaments, which use the Academic Progress Rate for student-athletes, and one from The Awl based on tuition, with higher tuition resulting in advancement).

I take a different look at advancing colleges through the tournament—based on having the lowest net price of attendance. Net price is calculated as the total cost of attendance (tuition and fees, room and board, books, and a living allowance) less any grant aid received, among students receiving any grant aid. I use IPEDS data from 2010-11 for this analysis, and I also show results if the analysis is limited to students with family incomes below $30,000 per year (most of whom will have an expected family contribution of zero). Data for the 2013 Net Price Madness Tournament are below:

[Bracket images for the Midwest, West, South, and East regions]

SOURCE: IPEDS.

Overall Net Price

Round of 16

Midwest: North Carolina A&T ($6,147) vs. New Mexico State ($8,492), Middle Tennessee State ($9,148) vs. Albany ($12,697)

West: Wichita State ($8,079) vs. Ole Miss ($12,516), New Mexico ($10,272) vs. Iowa State ($13,554)

South: North Carolina ($11,028) vs. South Dakota State ($12,815), Northwestern State ($7,939) vs. San Diego State ($8,527)

East: North Carolina State ($9,847) vs. UNLV ($9,943), Davidson ($23,623) vs. Illinois ($15,610)

Final Four

North Carolina A&T ($6,147) vs. Wichita State ($8,079)

Northwestern State ($7,939) vs. North Carolina State ($9,847)

WINNER: North Carolina A&T (59% Pell, 41% graduation rate)

Net Price (household income below $30k)

Round of 16

Midwest: North Carolina A&T ($4,774) vs. New Mexico State ($5,966), Michigan State ($5,569) vs. Duke ($8,049)

West: Southern University ($8,752) vs. Wisconsin ($6,363), Harvard ($1,297) vs. Iowa State ($8,636)

South: North Carolina ($4,101) vs. Michigan ($4,778), Florida ($3,778) vs. San Diego State ($3,454)

East: Indiana ($3,919) vs. UNLV ($6,412), Davidson ($7,165) vs. Illinois ($7,432)

Final Four

North Carolina A&T ($4,774) vs. Harvard ($1,297)

San Diego State ($3,454) vs. Indiana ($3,919)

WINNER: Harvard (11% Pell, 97% graduation rate)

Depending on which version of net price is used, the results do change substantially. Some colleges dramatically lower their net price of attendance for the neediest students, while others keep theirs more constant in spite of Pell Grant funds being available. Harvard’s victory on the lowest-income measure does ring somewhat hollow, as its percentage of students receiving Pell Grants (11%) tied with Villanova for the lowest in the tournament.
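
For readers who want to check the picks (or build their own), here is a minimal sketch of the selection rule: in each matchup, the college with the lower net price advances. The dictionary below contains only the four Final Four entries from the overall bracket; a full bracket would pull every team’s net price from the IPEDS file.

```python
# Advance the college with the lower net price in each matchup.
# Net prices below are the Final Four figures quoted in this post.
net_price = {
    "North Carolina A&T": 6_147,
    "Wichita State": 8_079,
    "Northwestern State": 7_939,
    "North Carolina State": 9_847,
}

def advance(team_a: str, team_b: str) -> str:
    """Return the team with the lower net price (the 'winner')."""
    return min(team_a, team_b, key=net_price.get)

semifinal_1 = advance("North Carolina A&T", "Wichita State")
semifinal_2 = advance("Northwestern State", "North Carolina State")
champion = advance(semifinal_1, semifinal_2)
print(champion)  # North Carolina A&T
```

Swapping in the under-$30,000 net prices for the second bracket’s Final Four reproduces the Harvard pick in the same way.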

Thanks for reading this post, and feel free to use these picks if you choose to fill out a bracket for the real tournament. Do keep in mind that low net prices and basketball prowess may not exactly be correlated!

The Benefits of Biennial Budgets

The federal government has had a substantial problem with its budgeting process over the past several years, with funding being provided by a series of continuing resolutions outside the annual process for more than three years. With bipartisan frustration over this process growing, a group of centrist Senators, led by Jeanne Shaheen (D-NH) and Johnny Isakson (R-GA), has proposed a switch from annual to biennial budgets. The proposal was introduced in the previous Congress and was not seriously discussed, but it is likely to be considered this time around given the interest of Senate Majority Leader Harry Reid (D-NV).

Biennial budgets are not uncommon at the state level. A 2011 report from the National Conference of State Legislatures shows that 19 states have biennial budgets, including Ohio, Texas, and Wisconsin. Only four of these states have legislatures that meet just once every two years, meaning that 15 states have actively chosen the biennial path.

Biennial budgeting allows more time for debate and discussion of tricky matters, but the budgets often have to be adjusted because of states’ balanced budget requirements. (Budget repair bills are well-known here in Wisconsin.) The lack of such a requirement at the federal level makes biennial budgeting even more feasible. While I am a staunch supporter of a balanced budget, I recognize that a small error in economic growth or demographic assumptions can result in a slightly unbalanced budget over a two-year period. As long as the assumptions are reasonable, I’m fine with a small error that can be addressed in the future.

Requiring a budget every two years instead of every year can help provide more stability in federal education funding, particularly regarding policies and levels of student financial aid and education research. This stability has the potential to have positive impacts independent of the actual funding levels. For example, if the exact dollar amount of the maximum Pell Grant is known, a push should be made to communicate that level to students who are likely to qualify upon entering college. Providing earlier information about financial aid could induce the marginal student to enroll in college, and perhaps even to take an additional high school course that would lower the likelihood of remediation. This push toward earlier notification of financial aid is consistent with other parts of my research agenda, and it would have the added benefit (in my view) of allowing Pell Grant funding to remain flexible as needed in the future.

A biennial budget process could also make student loan interest rates more predictable. Under current law, interest rates on subsidized undergraduate Stafford loans are set to double (from 3.4% to 6.8%) on July 1. (This is a budgetary matter because the interest rate determines the level of profit or loss for the federal government.) While I am a strong supporter of plans to tie student loan interest rates to market conditions (such as the rate paid on Treasury bills plus 3%), biennial budgeting would at least keep interest rates from facing a cliff every single year.

Biennial budgeting has the potential to result in more stability in education funding, as well as budgets that are well-discussed and passed under regular order. For those reasons, I am supportive of moving from annual to biennial budgets. I would love to hear your thoughts on this proposal in the comments!

The Higher Learning Commission’s Accreditation Gamble

Accrediting bodies play an important role in judging the quality (or at least the competency) of American colleges and universities. There are six regional accreditors that cover the majority of non-profit, non-religious postsecondary institutions, including the powerful Higher Learning Commission in the Midwest. The HLC recently informed Apollo Group, the owner of the University of Phoenix, that it may be placed on probation due to concerns about administrative and governance structures.

Part of Phoenix’s accreditation troubles may be due to a philosophical shift at the HLC that emphasizes the public purposes of higher education. As noted in an Inside Higher Ed article on the topic, HLC president Sylvia Manning stated the commission’s priority that education serve as a public good. The new accrediting criteria include the following statement:

“The institution’s educational responsibilities take primacy over other purposes, such as generating financial returns for investors, contributing to a related or parent organization, or supporting external interests.”

This shift occurs in the midst of questions about the purposes of the current accreditation structure. While colleges must be accredited in order for their students to receive federal financial aid dollars, the federal government currently has no direct involvement in the accreditation structure. Accrediting bodies also focus on degree programs instead of individual courses, a practice that has also been questioned.

Given the current decentralized structure of accreditation, Phoenix could easily move to another of the main regional nonprofit accrediting bodies—or it could go through a body focusing on private colleges and universities. The latter would likely be easier for Phoenix, as it would answer to more like-minded critics. While these bodies are viewed as less prestigious than the HLC, it is an open question whether students care about the accrediting body—as long as they can receive financial aid.

The Higher Learning Commission is taking a gamble with its move toward placing Phoenix on probation, based partially on the new criteria. It needs to consider carefully whether it is better to retain oversight of one of the nation’s largest and most powerful postsecondary institutions or to steer that institution toward a friendlier accrediting body. Traditional accrediting bodies should also consider the possibility that the federal government will get into the accreditation business if for-profits leave groups like the HLC. If the HLC chooses to focus on Phoenix’s control instead of its academic competency, it could set off a chain reaction that ends with accreditors being replaced by federal oversight.

College Reputation Rankings Go Global

College rankings are not a phenomenon limited to the United States. Shanghai Jiao Tong University has ranked research universities for the past decade, and the well-known Times Higher Education rankings have been around for several years. While the Shanghai rankings tend to focus on metrics such as citations and research funding, THE has compiled a reputational ranking of universities around the world. Reputational rankings are already a concern in U.S.-only rankings, but extending them to a global scale makes little sense to me.

Thomson Reuters (the group behind the THE rankings) makes a great fuss about the sound methodology of the reputational rankings, which, to their credit, they acknowledge are a subjective measure. They collected 16,639 responses from academics around the world, with some demographic information available here. But they fail to provide any information about the sampling frame, a devastating omission. The researchers behind the rankings do note that the initial sample was constructed to be broadly representative of global academics, but we know nothing about the response rate or whether the final sample was representative. In my mind, that omission disqualifies the rankings from further consideration. But I’ll push on and analyze the content of the reputational rankings.

The reputational rankings are a combination of separate ratings for teaching and research quality. I don’t have serious concerns about the research component of the rankings, as the survey asks about the research quality of institutions within the respondent’s own discipline. Researchers who stay on top of their field should be able to reasonably identify universities with top research departments. I have much less confidence in the teaching portion of the rankings, as someone needs to observe classes in a given department to have any idea of teaching effectiveness. Yet I would be surprised if the teaching and research evaluations were not strongly correlated.

The University of Wisconsin-Madison ranks 30th on the global reputation scale, with a slightly higher score for research than for teaching. (And according to the map, the university has been relocated to the greater Marshfield area.) That has not stopped Kris Olds, a UW-Madison faculty member, from leveling a devastating critique of the idea of global rankings—or the UW-Madison press office from putting out a favorable release on the news.

I have mixed emotions on this particular set of rankings; the research measure is probably capturing research productivity well, but the teaching measure is likely lousy. However, without more information about the response rate to the THE survey, I cannot view these rankings as being valid.

Tying FAFSA Data to IPEDS: The Need for “Medium Data”

It is safe to say that I’m a fan of data in higher education. Students and their families, states, and the federal government spend a massive amount of money on higher education, yet we have relatively little data on outcomes other than graduation rates and student loan default rates for a small subset of students—those who started as first-time, full-time students. The federal government currently operates on what I call a “little data” model, with some rough institutional-level measures available through IPEDS. Some of these measures are also available through a slightly more student-friendly portal in the College Navigator website.

As is often the case, some states are light years ahead of the federal government regarding data collection and availability. Florida, Texas, and Ohio are often recognized as leaders in higher education data availability, both in collecting (deidentified) student-level data and in tying together K-12, higher education, and workforce outcomes. The Spellings Commission did call for a national student-level dataset in 2006, but Congress explicitly denied the Department of Education this authority in the reauthorization of the Higher Education Act. Although there are sporadic movements toward “big data” at the national level, making this policy shift will require Congressional support and a substantial amount of resources.

Although I am willing to direct resources to a much more robust data system (after all, how can we determine funding priorities if we know so little about student outcomes?), a “medium data” approach could easily be enacted by using data sources already collected by colleges or the federal government. I spent a fair amount of this morning trying to find a fairly simple piece of data—the percentage of students at given colleges whose parent(s) did not complete college. The topic of first-generation students is important in policy circles, yet we have no systematic data on how large this group of students is at most colleges.

FAFSA data could be used to expand the number of IPEDS measures to include such topics as the following, in addition to first-generation status:

(1) The percentage of students who file the FAFSA

(2) Average/median family income

(3) Percentage of students with a zero expected family contribution (EFC)

(4) Information on means-tested benefit receipt (such as food stamps or TANF)

(5) Marital status

Of course, these measures would only include students who file the FAFSA—which would exclude many students who would not qualify for need-based aid, as well as some students who are unable to navigate the complicated form. But these measures would provide a better picture of institutional diversity beyond racial/ethnic diversity and the percentage of students receiving Pell Grants, and they could be incorporated into IPEDS at a fairly low cost. Adding these FAFSA measures would help move IPEDS from “little data” to “medium data” and provide more useful measures to higher education stakeholders.
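
To give a sense of how cheaply these measures could be generated, here is a minimal sketch of the aggregation a college (or the Department of Education) might run over FAFSA-derived records. The record layout, field names, and values are entirely hypothetical; they are not the actual FAFSA file format.

```python
# Hypothetical, simplified FAFSA-derived records for one institution.
# Field names and values are illustrative only; this is not the real FAFSA layout.
filers = [
    {"family_income": 28_000, "efc": 0,      "parent_completed_college": False, "receives_benefits": True},
    {"family_income": 52_000, "efc": 4_200,  "parent_completed_college": True,  "receives_benefits": False},
    {"family_income": 95_000, "efc": 18_500, "parent_completed_college": True,  "receives_benefits": False},
]
total_freshmen = 10  # hypothetical size of the entering class

# Institution-level measures along the lines proposed above.
pct_filing = len(filers) / total_freshmen
avg_income = sum(r["family_income"] for r in filers) / len(filers)
pct_zero_efc = sum(r["efc"] == 0 for r in filers) / len(filers)
pct_first_gen = sum(not r["parent_completed_college"] for r in filers) / len(filers)
pct_benefits = sum(r["receives_benefits"] for r in filers) / len(filers)

print(f"{pct_filing:.0%} file the FAFSA; mean family income ${avg_income:,.0f}; "
      f"{pct_zero_efc:.0%} zero EFC; {pct_first_gen:.0%} first-generation; "
      f"{pct_benefits:.0%} receive means-tested benefits")
```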

New Recommendations for Performance-Based Funding in Wisconsin

Performance-based funding for Wisconsin’s technical colleges is at the forefront of Governor Walker’s higher education budget for the next biennium. In previous blog posts (here, here, and here), I have briefly discussed some of the pros and cons of moving to a performance-based funding model for a diverse group of postsecondary institutions.

This week, Nick Hillman, Sara Goldrick-Rab, and I released a policy brief through WISCAPE with recommendations for performance-based funding in Wisconsin. In the brief, we discuss how performance-based funding has operated in other states and offer recommendations for how it should operate in Wisconsin. Our key points are the following:

(1) Performance-based funding seeks to switch the focus from enrollment to completion.

(2) Successful performance-based funding starts small and is developed via collaboration.

(3) Colleges with different missions should have different performance metrics.

(4) Multiple measures of success are necessary to reduce the possibility of perverse incentives.

Wisconsin’s proposal appears to meet some of these key points, but some concerns do remain. My primary concern is the speed with which funding will shift to performance—from 10% in 2014-15 to 100% by 2019-20. This may not be enough time for colleges to change their practices, so the timeline should be adjusted as needed.

On MOOCs and Money

Massive open online courses (MOOCs) have become the newest fashionable trend in higher education. These courses, which are open to anyone and can cover anything from songwriting to combinatorial game theory (which sounds both fun and exceedingly challenging), have begun to gain recognition among policymakers and the public alike. The American Council on Education announced that five MOOCs from Coursera would be recommended for college credit, hastening the movement toward the technology.

The University of Wisconsin-Madison announced last night that its faculty would offer four MOOCs in conjunction with Coursera as part of the university’s Educational Innovations program. While the MOOCs (in video games and learning, higher education globalization, evolution, and economics/finance) will not be offered for credit at this point, the potential certainly exists for credit in the future. These courses could become part of the Flexible Option program through the University of Wisconsin System, which has gained additional support in Governor Walker’s new budget.

With all of the potential promise of MOOCs to help students get access to higher education, there are still many concerns to be addressed. Coursera recently had to call off a MOOC on the fundamentals of online education, as the technology wasn’t ready for 40,000 students. Issues of access are also important with MOOCs, as they appeal to students who prefer more independent learning and are ready to handle that sort of delivery format. We also know little about whether MOOCs are effective in promoting student learning, whether compared to an in-person class or even to nothing at all.

From a university’s perspective, MOOCs present both promises and pitfalls. If a university can develop a successful MOOC (in the sense of gaining public support), it is a potential way to increase funding through either state appropriations (as could be the case in Wisconsin) or donations from satisfied students. If a few entry-level courses (such as UC-Irvine’s offerings in algebra and pre-calculus) were offered for credit, they could serve as a way for students to select colleges or familiarize themselves with higher-level coursework.

Current university students (and faculty) should be concerned about where the faculty time for developing these MOOCs comes from. Many faculty at research universities (the ones who are likely to develop MOOCs) are teaching one or two courses per semester in an environment where teaching is valued less than research. These courses would have to be developed on a professor’s own time or count as part of the service component—it is essential that teaching loads not be reduced for developing MOOCs unless the university is somehow compensated. An option for compensation is to have foundations help fund initial course development in the form of faculty buyouts.

I am glad that the University of Wisconsin-Madison is starting small with MOOCs, as these courses have the potential to improve student learning only on the margin at this point in time. If a college can develop a cost-effective MOOC for an entry-level class, I’ll be a lot more optimistic. But for right now, it’s a neat way to see new research in specialized fields, and it could certainly be a way for advanced undergraduate students to take an “elective” course in their field of study. I am waiting for more research before fully jumping on the MOOC bandwagon.

Technical Colleges Debate Tying Funding to Job Placement

In advance of Wisconsin Governor Scott Walker’s budget address tomorrow evening, last week’s release of plans to tie state funding for technical colleges to performance measures has generated a great deal of discussion. One of the most discussed portions of his plan (press release here) is his proposal to tie funding to job placement rates, particularly in high-demand fields. Most colleges seem to support the idea of getting better data on job placement rates, but using that measure in an accountability system has sparked controversy.

Madison Area Technical College came out last week in opposition to the Governor’s proposal, as covered by a recent article in the Capital Times. The article mentions comments by provost Terry Webb that job placement rates are partially influenced by factors outside the college’s control, such as job availability, location, and individual preferences. These concerns are certainly real, especially given the difficulty of tracking students who may leave the state in search of a great job opportunity.

However, Gateway Technical College came out in support of funding based on job placement rates, according to an article in the Racine Journal Times (hat tip to Noel Radomski for the link). Gateway president Bryan Albrecht supports the plan on account of the college’s high job placement rates among graduates (85%, among those who responded to a job placement survey with a 78% response rate, although only 55% were employed in their field of study). The college seems confident in its ability to change programs as needed in order to keep up with labor market demands, even in the face of a difficult economy in southeast Wisconsin.

The differing reactions of these two technical colleges show the difficulty of developing a performance-based funding system that works for all stakeholders. Madison College, along with three other technical colleges in the state, has liberal arts transfer programs with University of Wisconsin System institutions. These students may graduate with an associate’s degree and not immediately enter the labor market, or they may successfully transfer before completing the degree. The funding system, which will be jointly developed by the Wisconsin Technical College System and the state’s powerful Department of Administration, should keep those programs in mind so as not to unfairly penalize colleges with dual vocational/transfer missions.