A Look at Admissions Rates by Gender

The newest round of data on American colleges and universities from the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) was released last week. The data—on admissions, financial aid, student charges, and graduation rates—came only a few days later than in past years despite the longest government shutdown on record this fall and greatly diminished capacity at ED, but the two releases in recent months have had some uncharacteristic issues. The release earlier this fall was marred by coding problems, and the most recent release initially omitted some data files and has issues that, in my view, make the new cost of attendance survey unusable for vocational institutions.

The Trump administration is keenly interested in the admissions survey and has proposed a massive expansion that would retroactively collect large amounts of data by race/gender and test scores going back to 2019-20, with the collection happening in spring 2026. (James Murphy has covered the regulatory burden angle incredibly well. Check out his work.) This goes well beyond a previously scheduled set of changes for the fall 2025 data collection, which will collect admissions data by race and gender but not retroactively.

I took a quick look at fall 2024 admissions data to get a sense of a key policy debate—admissions rates by gender—and to show some of the concerns with drawing policy conclusions from institution-level IPEDS data. A full spreadsheet can be downloaded here.

In aggregate, there is little evidence that men and women are admitted to selective colleges at different rates. For the 298 institutions with acceptance rates below 50%, women were admitted at the median institution at a rate 1.3 percentage points higher than men. The gap fell to 0.08 percentage points for the 100 institutions accepting fewer than 25% of applicants, and men were favored by 0.04 percentage points at the 31 institutions accepting fewer than 10%.
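
For anyone who wants to reproduce these tier medians, a minimal pandas sketch is below. It assumes an IPEDS admissions extract with applicant and admit counts by gender; the column names follow my recollection of the ADM survey dictionary and the file name is hypothetical, so check both against the actual release.

```python
# A minimal sketch, assuming IPEDS ADM-style columns for applicants and
# admits by gender (verify names against the current data dictionary).
import pandas as pd

adm = pd.read_csv("adm2024.csv")  # hypothetical extract of fall 2024 admissions

adm["rate_all"] = (adm["ADMSSNM"] + adm["ADMSSNW"]) / (adm["APPLCNM"] + adm["APPLCNW"])
adm["gap"] = adm["ADMSSNW"] / adm["APPLCNW"] - adm["ADMSSNM"] / adm["APPLCNM"]

for label, cutoff in [("<50%", 0.50), ("<25%", 0.25), ("<10%", 0.10)]:
    tier = adm[adm["rate_all"] < cutoff]
    print(label, len(tier), round(100 * tier["gap"].median(), 2))  # median gap in points
```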

But there are some interesting items at the institution level. Circle in the Square Theatre School (which I have never heard of, but it seems like a fascinating piece of geometry) admitted 5.01% of women and 1.26% of men. Caltech, MIT, and UCLA admitted higher shares of women, while Chicago, Brown, and Swarthmore admitted higher shares of men. Meanwhile, my university admitted men and women at nearly identical 46% rates…but women were 60% of applicants. Take a spin through to see what you think.

Just cutting the data by gender brings down sample sizes quite a bit, so race/gender admissions rates are going to be noisy at many institutions. For example, the Juilliard School accepted men at a higher rate than women (10.6% compared to 7.9%), but it received only 2,020 total applications. If a racial group represents only a small percentage of applicants (White students are the largest share of current students at just 29%), then a few applications could move percentages quite a bit. Adding test scores or high school GPA to the mix (as the Trump administration proposes) will make the data far too volatile for high-stakes accountability, but that appears to be the future as federal investigations are likely to be linked to changes in a small number of student applications or admissions.
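
A quick back-of-the-envelope calculation shows how fast small cells get noisy. The numbers here are hypothetical, not any institution's actual subgroup counts:

```python
# With a subgroup of 150 applicants, moving just three admits shifts the
# admission rate by two full percentage points.
admits, applicants = 15, 150
print(f"{admits / applicants:.1%}")        # 10.0%
print(f"{(admits + 3) / applicants:.1%}")  # 12.0%
```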

On a final note, I wish you a wonderful end to 2025. Check out my interview with the fabulous Alex Usher of Higher Education Strategy Associates for my top ten events in American higher education this year. And for those working on college campuses, take the time to learn how both faculty and staff schedules work once classes end. Many faculty are off contract, while staff may be trying to take time off before vacation days expire.

See you in 2026!

College Prices Have Not Risen Dramatically in the Last Decade—But Will That Change?

Higher education is facing a crisis of confidence among the general public, and much of that is driven by concerns regarding affordability. For example, about 80 percent of Democrats and Republicans alike think that colleges do not sufficiently prioritize affordability.

But while college is still expensive, the narrative that colleges and universities are increasing their tuition willy-nilly has not been true for a decade. See this chart from the College Board’s helpful Trends in College Pricing report, which has data through the 2025-26 academic year—two years ahead of U.S. Department of Education data. Tuition increases have been at or below inflation for the last decade, breaking a decades-long trend of increases well above the rate of inflation.

This is why I was disappointed to see a recent NPR piece that focused on how college costs (which should be prices—grrrr) have doubled in inflation-adjusted dollars over the last three decades. (The headline’s claim of doubling over 20 years is inaccurate, but the body of the piece correctly says 30 years.) Yes, listed tuition and fees doubled between 1995 and 2015, but they have not budged in the last decade. And if grant aid is taken into account, much of higher education is close to pre-Great Recession affordability. That is in spite of operating costs (as proxied by the Higher Education Price Index) rising faster than inflation over much of the period, driven by benefits and maintenance costs.

This more nuanced narrative (affordability is still a concern, but the situation has actually improved in public higher education) is important to communicate to the general public. The NPR piece was spot-on in 2015, but less so in 2025.

With all that being said, colleges are under more pressure to generate revenue than at any point in recent years. State funding has been an unsung hero for the last decade, and that is likely to take a hit as budgets get tight. With all of the pressures coming out of Washington, institutions are likely to turn to larger tuition increases if at all possible. Public institutions are frequently constrained by state-level tuition controls, which are present in about 30 states. But unconstrained public institutions and private institutions may try to get more revenue out of tuition, breaking a promising trend that few people outside of higher education even knew was in progress.

Examining the Debt and Earnings of “Professional” Programs

Negotiated rulemaking, in which the federal government convenes representatives of affected parties before implementing major policy changes, is one of the wonkier topics in higher education. (I cannot recommend enough Rebecca Natow’s book on the topic.) Negotiated rulemaking has been in the news quite a bit lately as the Department of Education works to implement changes to federal student loan borrowing limits passed in this summer’s budget reconciliation law.

Since 2006, students attending graduate and professional programs have been able to borrow up to the cost of attendance. But the reconciliation law limited graduate programs to $100,000 and professional programs to $200,000, setting off negotiations on which programs counted as “professional” (and thus received higher loan limits). The Department of Education started with ten programs and the list eventually went to eleven with the addition of clinical psychology.

In this short post, I take a look at the debt and earnings of these programs that meet ED’s definition of “professional,” along with a few other programs that could be considered professional but were not.

Data and Methods

I used program-level College Scorecard data, focusing on debt data from 2019 and five-year earnings data from 2020. (These are the most recent data points available, as the Scorecard has not been meaningfully updated during the second Trump administration.) Five-year earnings get students in health fields beyond medical residencies. I pulled all doctoral/first professional fields from the data by four-digit Classification of Instructional Programs codes, as well as master’s degrees in theology to meet the listed criteria.
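
A sketch of that extraction step is below. The file name matches the Scorecard's field-of-study download, but the credential codes and column names are from memory (I believe CREDLEV 6 and 7 cover doctoral and first-professional degrees, 5 covers master's, and CIP 39 is theology), so treat them as assumptions to verify against the data dictionary.

```python
# A sketch of pulling doctoral/first-professional programs plus theology
# master's degrees from the program-level Scorecard file. Column names and
# credential codes are assumptions; check the dictionary before relying on them.
import pandas as pd

fos = pd.read_csv("Most-Recent-Cohorts-Field-of-Study.csv", low_memory=False)

doctoral_prof = fos[fos["CREDLEV"].isin([6, 7])]
theology_ma = fos[(fos["CREDLEV"] == 5) & fos["CIPCODE"].astype(str).str.startswith("39")]

sample = pd.concat([doctoral_prof, theology_ma])
# Debt and earnings fields are suppressed for small cells, so coerce to numeric.
sample["debt"] = pd.to_numeric(sample["DEBT_ALL_STGP_EVAL_MDN"], errors="coerce")
sample["earn5"] = pd.to_numeric(sample["EARN_MDN_5YR"], errors="coerce")  # hypothetical name
```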

Nine of the eleven programs had enough graduates with debt and earnings to report data; osteopathic medicine and podiatry did not. There were five other fields of study with at least 14 programs reporting data: education, educational administration, rehabilitation, nursing, and business administration. All of these clearly prepare people for employment in a profession, but are not currently recognized as “professional.”

Key Takeaways

Below is a summary table of debt and earnings for professional programs, including the number of programs above the $100,000 (graduate) and $200,000 (professional) thresholds. Dentistry, pharmacy, and medicine have a sizable share of programs above the $100,000 threshold, while law (the largest field) has only four of 195 programs over $200,000. Theology is the only one of the nine “professional” programs with sufficient data that has higher five-year earnings than debt, suggesting that students in other programs may have a hard time accessing the private market to fill the gap between $200,000 and the full cost of attendance.
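
The counts in a table like the one described can be tallied with a quick groupby, continuing the hypothetical sketch from the methods section above (column names remain assumptions):

```python
# Count programs over each new borrowing cap by field, using the 'sample'
# frame from the earlier sketch.
summary = (
    sample.dropna(subset=["debt"])
    .groupby("CIPDESC")
    .agg(programs=("debt", "size"),
         over_100k=("debt", lambda s: int((s > 100_000).sum())),
         over_200k=("debt", lambda s: int((s > 200_000).sum())))
    .sort_values("programs", ascending=False)
)
print(summary)
```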

On the other hand, four of the five programs not included as “professional” have higher earnings than debt, with nursing and educational administration being the only programs with sufficient data that had debt levels below 60% of earnings. More than one-third of rehabilitation programs had debt over the new $100,000 cap, while few programs in other fields had that high of a debt level. (Education looks pretty good now, doesn’t it?)

I expect the debate over what counts as “professional” to end up in courts and to possibly make its way into a future budget reconciliation bill (about the only way Congress passes legislation at this point). Until then, I will be hoping for newer and more granular data about affected programs.

New Research on the Prevalence and Effects of Differential Tuition Policies

I am thrilled to share a new open-access article in AERA Open that I wrote on the topic of differential tuition policies at public universities. Differential tuition, in which students pay higher charges for fields of study that are more expensive to operate and/or are in high demand among students, has anecdotally become more popular in recent years. Yet the only published research on the effects of differential tuition (a great study that motivated my work) focused on public research universities that adopted it by the 2007-08 academic year.

I decided to slowly chip away at collecting data on the presence of differential tuition in business, engineering, and nursing programs between the 2003-04 and 2022-23 academic years. It took me more than three months to compile a dataset that you can download here, and then several additional months to do data checks and write the paper (with the help of a new research assistant who debuted during the project and alternated between sleeping and data entry).

Notably, nearly half of all public universities—and just over half of all research universities—adopted differential tuition by the 2022-23 academic year. While I did not have the resources to collect data on the amount of the differential (funders, reach out if you’re interested in supporting an extension of this work!), differentials ranged from a few dollars per credit hour to several thousand dollars per year.

I then examined whether the adoption of differential tuition increased the number of bachelor’s degrees awarded in business, engineering, or nursing. In general, there were no effects on business or nursing and some modest increases in the number of engineering degrees. However, any benefits of expanded access largely accrued to White students.

Check out the full article and let me know what you think. I am certainly open to extending this work, so any suggestions would be greatly appreciated.

Post-Pandemic Trends in Student Fees at Public Universities

Amid everything going on in the world of higher education right now, it is easy to forget that this is the time of year that students and families are trying to figure out whether they can afford to attend college. This is when I typically get a bunch of questions from journalists across the country about the extent to which college is affordable, and I do my best to provide helpful information.

I have written about student fees at public universities in the past, and the topic has been the source of several questions in the last few weeks. In response, I updated my work to examine the most up-to-date data available on trends in student fees at public universities, including a look at what has happened since the pandemic. I did this using data on 499 public universities from the 2011-12 through the 2023-24 academic years, excluding about two dozen institutions (including all of Massachusetts) that “reset” fees by shifting most of their fees into tuition at some point during the period. The data for this analysis can be downloaded here.

Overall, tuition went up by 37% between fall 2011 and fall 2023, while fees went up by 40%. Both increases are slightly higher than the 35% overall rate of inflation (as measured by the Consumer Price Index) over the same period. This also marks a change from the 2000s, when fees went up substantially more than tuition. The growth in tuition and fees slowed considerably around the pandemic, reflecting an increase in the number of tuition freezes during the period and difficulties increasing fees while many students were studying remotely.
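
To put those nominal increases in real terms, here is the arithmetic using the figures above:

```python
# Real (inflation-adjusted) growth implied by the figures above.
cpi_growth = 1.35
print(f"fees:    {(1.40 / cpi_growth - 1):.1%}")  # roughly +3.7% after inflation
print(f"tuition: {(1.37 / cpi_growth - 1):.1%}")  # roughly +1.5% after inflation
```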

But in fall 2023, fees increased more quickly than tuition. The question is whether that becomes a trend—and whether it can be studied using federal data. It would be possible to continue this analysis by collecting data from institutional websites or fact books, but doing so would require a number of assumptions about how to handle the complicated structures of differential tuition by field of study and number of credits taken.

Which Colleges Always Lose Money?

It is safe to say that there is a lot of concern right now about the financial viability of higher education. And while I think fewer colleges are going to close than pundits predict (and check out my recent NBER working paper on factors associated with college closures), it is still going to be a bumpy ride as colleges try to cut costs after efforts to increase revenue are unsuccessful.

By far the most popular piece on my blog in 2024 (representing nearly one-fourth of all traffic to my website) was a fairly quick look at which private colleges consistently lost money over the last decade. Now that a new year of data on institutional finances (through Fiscal Year 2023) has come out through the Integrated Postsecondary Education Data System, I am revisiting that analysis and including public universities as well.

I looked at the operating margins (revenues minus expenses) of private nonprofit colleges and public universities for the past ten years (Fiscal Years 2014 through 2023). This analysis included 938 private and 525 public institutions in the 50 states and Washington, DC, and excluded colleges with any missing data, two-year institutions, and special-focus institutions based on the most recent Carnegie classifications.
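
The loss-count calculation itself is simple once a finance panel is assembled; the sketch below assumes hypothetical column names, since the actual revenue and expense variables differ between the GASB (public) and FASB (private) reporting forms.

```python
# Count years with operating losses per institution, FY2014-FY2023,
# from a pre-built panel with hypothetical column names.
import pandas as pd

fin = pd.read_csv("ipeds_finance_panel.csv")  # UNITID, fiscal_year, total_rev, total_exp

fin["loss"] = fin["total_rev"] < fin["total_exp"]
loss_years = (
    fin[fin["fiscal_year"].between(2014, 2023)]
    .groupby("UNITID")["loss"]
    .sum()
)
print((loss_years == 0).sum())   # never posted a loss
print((loss_years >= 8).sum())   # lost money in eight or more years
```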

You can download the dataset here, with highlighted private colleges having closed since IPEDS data were collected.

The first takeaway is that the share of private colleges with losses varies much more than the share of public institutions, and this is driven by a combination of investment returns at private institutions and the backstop that state funding provides for public universities. More than four in ten private colleges posted a loss in Fiscal Year 2023—twice the rate of public universities. Since the beginning of the pandemic, the percentage of public universities with revenues failing to match expenditures has been cut in half. Federal covid relief funds are now gone, however, and state budgets look wobbly.

The two figures below show the number of years in the last decade that private colleges and public universities posted losses. Most private colleges saw surpluses more often than deficits, with only 14 percent of institutions losing money in more than five years. Seventy-one private colleges never posted a loss during this period, and they are generally less-selective institutions such as Miles College in Alabama, Dordt University in Iowa, and the University of Northwestern in Minnesota (the better-known Northwestern in Illinois posted losses in three years when the stock market went down). A few better-known private universities that managed to stay in the black every year included Southern Methodist University, Liberty University, Southern New Hampshire University, and the University of Pennsylvania.

On the other hand, 27 colleges posted losses in eight or more years. Notably, five of these colleges (bolded) have closed or announced closures in the last year or so, and another one (Bacone College in Oklahoma) is not currently offering classes. While some institutions can withstand consistent losses through one-time donations or activities that are not well captured on balance sheets, it is difficult for most colleges. Take, for example, Judson University in Illinois, which has lost money in eight of the last ten years. Its IRS Form 990 filings show that net assets have declined from more than $44 million in the early 2010s to just under $27 million today—not a good trend.

Name                                           State  Losses
Polytechnic University of Puerto Rico-Orlando  FL     10
Roberts Wesleyan University                    NY     10
Trinity International University-Florida       FL     9
Cambridge College                              MA     9
Fontbonne University                           MO     9
Bethany College                                WV     9
Golden Gate University                         CA     8
Pacific Union College                          CA     8
Polytechnic University of Puerto Rico-Miami    FL     8
Hawaii Pacific University                      HI     8
Judson University                              IL     8
Southwestern College                           KS     8
Webster University                             MO     8
University of Providence                       MT     8
Drew University                                NJ     8
Elmira College                                 NY     8
Hilbert College                                NY     8
St. Francis College                            NY     8
The College of Saint Rose                      NY     8
Yeshiva University                             NY     8
Antioch College                                OH     8
Lourdes University                             OH     8
Bacone College (on hiatus)                     OK     8
Warner Pacific University                      OR     8
Cabrini University                             PA     8
University of Valley Forge                     PA     8
Waynesburg University                          PA     8

While a larger share of public universities than private colleges never posted a loss, public universities were also more likely (16 percent) to lose money in at least five of the last ten years. In general, most flagship public universities did exceedingly well and many never lost money. But 22 institutions lost money in eight out of ten years, with 15 of them located in New York. It is indeed a tough time for many regional public universities, even though they are at very low risk of closure.

Name                                                      State  Losses
University of New Hampshire at Manchester                 NH     10
SUNY College of Environmental Science and Forestry        NY     10
SUNY College of Technology at Delhi                       NY     10
SUNY at Fredonia                                          NY     10
SUNY at Purchase College                                  NY     10
Rutgers University-Camden                                 NJ     9
SUNY Buffalo State University                             NY     9
SUNY College at Geneseo                                   NY     9
SUNY College at Potsdam                                   NY     9
SUNY College of Agriculture and Technology at Cobleskill  NY     9
SUNY Maritime College                                     NY     9
SUNY Old Westbury                                         NY     9
University of Hawaii-West Oahu                            HI     8
Northern Illinois University                              IL     8
University of Illinois Springfield                        IL     8
Northern Kentucky University                              KY     8
CUNY Graduate School and University Center                NY     8
College of Staten Island CUNY                             NY     8
SUNY Brockport                                            NY     8
SUNY College of Technology at Canton                      NY     8
State University of New York at Oswego                    NY     8
Shippensburg University of Pennsylvania                   PA     8

In addition to new finance data, there are also new data on fall enrollments and staffing levels. I encourage researchers, policymakers, and practitioners to take a look through the data to learn more about the current (well, as current as possible given data lags) state of higher education.

Documenting the Growth of Responsibility Center Management Budget Models in Public Higher Education

As most of higher education is concerned about its financial position, a growing number of colleges are trying to encourage academic units to generate additional revenues and cut back on expenses. One popular way of doing this is through responsibility center management (RCM) budget models, which base a portion of a unit’s budget on its ability to effectively generate and use resources.[1]

Both universities that I have worked at (Seton Hall and Tennessee) have adopted variations of RCM budget models, and there is a lot of interest—primarily at research universities—in pursuing RCM. Having been through RCM, I am quite interested in the downstream implications of RCM on how leaders of institutions and units behave. There are a couple of good scholarly articles about the effects of RCM that I use when I teach higher education finance, but they are based on a small number of fairly early adopters and the findings are mixed.

One of my current research projects is examining the growth of master’s degree programs (see our recent policy brief), and I have a strong suspicion that institutions adopting RCM budget models are more likely to launch new programs as units try to gain additional revenue. My sense is that there have been a lot of recent adopters, but the best information out there about who has adopted RCM comes from slides or information provided by consulting firms (which often are no longer under contract by the time the model is supposed to be fully implemented). This led me to spot-check a few institutions commonly listed on those charts, and some of them appear to have either never gotten past the planning stage or quietly moved to another budget model.

My outstanding research assistant Faith Barrett and I went through documents from 535 public universities (documents from private colleges are rarely available) to collect information on whether they had announced a move to RCM, actually implemented it, and/or abandoned RCM to return to a centralized budget model.[2] The below figure summarizes the number of public universities that had active, implemented RCM budget models for each year between 1988 and 2023.[3]

There has been a clear and steady uptick in the number of public universities with active RCM models, reaching 68 by 2023. Most of this increase has happened since 2013, when just 25 universities used RCM. Only seven universities that fully implemented RCM later abandoned the model based on publicly available documents (including Central Michigan, Ohio, Texas Tech, Illinois-Chicago, Oregon, and South Dakota), although quite a few colleges have backed off how much money flows through RCM.
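
For anyone curious how the active counts in the figure are built from adoption and abandonment years, here is a small sketch against a hypothetical version of our dataset:

```python
# Count active, fully implemented RCM models per year from a hand-collected
# file with hypothetical columns for implementation and abandonment years.
import pandas as pd

rcm = pd.read_csv("rcm_adoption.csv")  # institution, year_implemented, year_abandoned

for year in (2013, 2023):
    active = ((rcm["year_implemented"] <= year)
              & (rcm["year_abandoned"].fillna(9999) > year)).sum()
    print(year, active)  # 25 and 68 per the figure described above
```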

Additionally, a number of universities publicly announced plans to move to RCM before apparently abandoning them before implementation. Some examples include Missouri, Nebraska, and Wayne State. This is notable because these are often included on consultants’ slide decks as successful moves to RCM.

Here is the list of universities that had fully implemented RCM by fall 2023. If you see any omissions or errors, please let me know!

Name                                                   State
Auburn University                                      AL
University of Alabama at Birmingham                    AL
University of Arizona                                  AZ
University of California-Davis                         CA
University of California-Los Angeles                   CA
University of California-Riverside                     CA
University of Colorado Boulder                         CO
University of Colorado Denver/Anschutz Medical Campus  CO
University of Delaware                                 DE
University of Central Florida                          FL
University of Florida                                  FL
Georgia Institute of Technology-Main Campus            GA
Iowa State University                                  IA
University of Iowa                                     IA
Boise State University                                 ID
Idaho State University                                 ID
University of Idaho                                    ID
University of Illinois Chicago                         IL
University of Illinois Urbana-Champaign                IL
Ball State University                                  IN
Indiana University-Bloomington                         IN
Indiana University-Purdue University-Indianapolis      IN
Kansas State University                                KS
University of Kansas                                   KS
Northern Kentucky University                           KY
Western Kentucky University                            KY
University of Baltimore                                MD
University of Michigan-Ann Arbor                       MI
University of Michigan-Dearborn                        MI
Western Michigan University                            MI
University of Minnesota-Twin Cities                    MN
University of Missouri-Kansas City                     MO
The University of Montana                              MT
North Dakota State University-Main Campus              ND
University of North Dakota                             ND
University of New Hampshire-Main Campus                NH
Rutgers University-Camden                              NJ
Rutgers University-New Brunswick                       NJ
Rutgers University-Newark                              NJ
University of New Mexico-Main Campus                   NM
Kent State University at Kent                          OH
Miami University-Hamilton                              OH
Miami University-Middletown                            OH
Miami University-Oxford                                OH
Ohio State University-Main Campus                      OH
University of Cincinnati-Main Campus                   OH
Oregon State University                                OR
Southern Oregon University                             OR
Pennsylvania State University-Main Campus              PA
Temple University                                      PA
University of Pittsburgh-Pittsburgh Campus             PA
College of Charleston                                  SC
University of South Carolina-Columbia                  SC
East Tennessee State University                        TN
Tennessee Technological University                     TN
The University of Tennessee-Knoxville                  TN
University of Memphis                                  TN
The University of Texas at Arlington                   TX
The University of Texas at San Antonio                 TX
University of Utah                                     UT
George Mason University                                VA
University of Virginia-Main Campus                     VA
Virginia Commonwealth University                       VA
University of Vermont                                  VT
Central Washington University                          WA
University of Washington-Bothell Campus                WA
University of Washington-Seattle Campus                WA
University of Wisconsin-Madison                        WI

[1] This is also called responsibility centered management, and I cannot for the life of me figure out which one is preferred. To-may-to, to-mah-to…

[2] RCM can be designed with various levels of centralization. Pay attention to the effective tax rates that units pay to central administration—they say a lot about the incentives given to units.

[3] This excludes so-called “shadow years” in which the model was used for planning purposes but the existing budget model was used to allocate resources.

How Many Colleges Really Close Each Year?

College closures are getting quite a bit of attention right now—and for good reason. When a college closes suddenly, students are much less likely to complete their studies and employees have a difficult time finding comparable jobs. And the uptick in the number of college closures in the last year or two has been obvious to nearly everyone in higher education.

But how many colleges really close each year? The Wall Street Journal recently led off a story with a statement that more than 500 four-year private nonprofit colleges closed in the last decade, and Inside Higher Ed covered a National Center for Education Statistics report that highlighted that nearly 100 colleges closed in the 2023-24 academic year.

While I greatly appreciate Higher Ed Dive’s running list of college closures, the two more definitive sources for college closures are the U.S. Department of Education’s Integrated Postsecondary Education Data System (IPEDS) and Postsecondary Education Participants System (PEPS). However, both of these sources need to be interpreted with caution due to how data are reported. I discuss my recommendations for using IPEDS and PEPS below; see this spreadsheet for the data that I refer to throughout the piece.

IPEDS

IPEDS has a rough measure of college closures in its Directory Information data collection, which is updated annually. The relevant variable is whether an institution is active in the current year, with the possible answers of “yes” or “no: closed, combined, or out-of-scope.” The challenge here is that a substantial number of colleges included under “no” fall into the combined and out-of-scope categories without any disruption to students. Combined institutions can be the result of one college acquiring another, but a combination can also be an administrative consolidation for reporting purposes that does not change anything for students. An out-of-scope college may choose to opt out of federal Title IV financial aid programs while remaining open; this is primarily the case among for-profit colleges.
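
Pulling the apparent exits is a one-line filter on the directory file. The sketch below uses CYACTIVE as the active-in-current-year flag, which is my recollection of the variable name and worth confirming in the HD data dictionary.

```python
# Flag institutions that left the IPEDS universe, using the directory file's
# active-in-current-year variable (name assumed; verify against the dictionary).
import pandas as pd

hd = pd.read_csv("hd2023.csv", encoding="latin-1")
exits = hd[hd["CYACTIVE"] != 1]  # "no: closed, combined, or out-of-scope"
print(len(exits))
print(exits[["UNITID", "INSTNM", "STABBR"]].head())
```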

In the 2023-24 academic year, I count 70 primarily postsecondary institutions (IPEDS also includes some vocational training programs that are mainly secondary schools, but I exclude them here) that left the IPEDS universe. A complete list of these apparent closures can be found in the first tab of this spreadsheet.

Some of the institutions are clearly closures (such as Finlandia University in Michigan and Medaille University in New York), but others are administrative consolidations. For example, 11 of the 70 “closures” are community colleges in Connecticut, which recently moved to a single-institution model with 12 branch campuses (no campuses were closed).

Similar examples of administrative consolidations are visible at public colleges in Tennessee and Vermont. The private nonprofit closures are primarily freestanding institutions, while quite a few for-profit college closures include multiple branches closing at the same time. Overall, it appears that roughly 40 unique institutions (excluding branches) actually closed in 2023-24, and they are divided between for-profit and nonprofit colleges.

On the other hand, 56 colleges joined the IPEDS universe in the 2023-24 academic year (the second tab of the spreadsheet). They are primarily small, vocationally focused for-profit colleges that frequently cycle in and out of operation. But I did notice Northeastern University Oakland on the list of newly opened colleges using the same IPEDS UnitID as Mills College. It is relatively uncommon for private nonprofit colleges to open and receive federal financial aid, but it occasionally happens in special-focus fields such as health sciences and technical education.

PEPS

If you have read to this point in a pretty technical blog post, you have some pep in your step. Federal Student Aid updates the PEPS data page weekly with a list of every college that closes. That sounds great—until you go and download the spreadsheet. This behemoth (accessible by downloading the closed school search file) includes more than 20,000 college closures since the mid-1980s.

That number of closures seems a bit high since there are only about 6,000 colleges receiving federal financial aid in the United States at this point. The reason is that PEPS tracks the closure of every single physical campus location in the United States, as well as foreign locations of American-based or Title IV-eligible institutions. Let’s look at 2024 PEPS data, which I included in the third tab of the spreadsheet.  

PEPS lists 81 closures in 2024, with the most recent closure being on August 9. But as the below picture shows, many of these closures are of small branch campuses. For example, Saint Louis University closed small sites in Jefferson City, Dallas, and Houston. Johns Hopkins University closed seven sites in Texas, for crying out loud! Other closures are real and meaningful, such as Goddard College and UW-Oshkosh’s Fond du Lac campus (which was a freestanding institution until several years ago).

The way to identify a main campus closure is through the Office of Postsecondary Education ID (OPEID) number. If the number starts with a zero and ends in 00, that is a main campus. Any OPEID that ends in something other than 00 (or starts with a number other than zero) is a branch campus. Many of these branches enroll a small number of students and come and go regularly, although a few are more notable. Thirty-two of the 81 closures to this point in 2024 have been main campuses, while the rest were branch campuses.
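
That rule is easy to automate. Here is a small helper, assuming OPEIDs are stored as eight-character strings (spreadsheets often mangle them by dropping leading zeros):

```python
# Classify an OPEID as a main campus using the rule described above.
def is_main_campus(opeid: str) -> bool:
    opeid = opeid.strip().zfill(8)  # restore leading zeros lost in spreadsheets
    return opeid.startswith("0") and opeid.endswith("00")

print(is_main_campus("00123400"))  # True: hypothetical main campus
print(is_main_campus("00123401"))  # False: a branch of the same institution
```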

So how many colleges really close each year? It depends a lot on what you consider to be a college. But I would contend that quite a few of the colleges listed in federal data as closing either did not actually close or served a minuscule number of students at offsite locations. This does not make the closure of main campuses less concerning, but it’s important to have a clear sense of the numbers before engaging in policy discussions.

More Research on Heightened Cash Monitoring

As the academic summer quickly wraps up (nine-month faculty contracts at Tennessee begin on August 1), I am working on wrapping up some research projects while also simultaneously preparing for new ones. One of the projects that is near completion (thanks to Arnold Ventures for their support of this work) is examining the prevalence and implications of the federal government’s heightened cash monitoring (HCM) policy in higher education.

In the spring, I shared my first paper on the topic, which examined whether HCM placement was associated with changes to institutional financial patterns or student outcomes. We found generally null results, which matches much of the broader literature on higher education accountability policies that are not directly tied to the loss of federal financial aid. In this post, I am sharing two more new papers.

The first paper descriptively examines trends in HCM status over time, the interaction with other federal accountability policies, and whether colleges placed on HCM tend to close. There are two levels of HCM: HCM1 requires additional oversight, while the more severe HCM2 requires colleges to pay out money to students before being reimbursed by Federal Student Aid. As shown below, there was a spike in usage of HCM2 status around 2015, which was also the first year that HCM1 data were made publicly available by the Department of Education.

Colleges end up on HCM1 and HCM2 for much different reasons. The less severe HCM1 is dominated by colleges with low financial responsibility scores, while more serious accreditation and administrative capacity concerns are key reasons for HCM2 placement. Additionally, colleges on HCM2 tend to close at higher rates than colleges on HCM1.

The second paper builds on the other research from this project to examine whether student enrollment patterns are affected by signals of institutional distress. The motivation for this work is that in an era of heightened concerns about the stability of colleges, students may seek to enroll elsewhere if a college they are attending (or considering attending) displays warning signs. On the other hand, colleges may redouble their recruitment efforts to try to dig themselves out of the financial hole.

We examined these questions using two different accountability thresholds. The first was to compare colleges on HCM2 to colleges with a failing financial responsibility score, as HCM2 is a much smaller list of colleges and comes with restrictions on institutional operations. The second was to compare colleges that just failed the financial responsibility metric to colleges that were in an oversight zone that allowed them to avoid being placed on HCM1 if they posted a letter of credit with the Department of Education. As the below figure shows, there is not a huge jump in the number of colleges that barely avoided failing (the left line)—and that allows for the use of a regression discontinuity design.
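
For readers unfamiliar with the approach, here is a stylized sketch of the regression discontinuity logic on simulated data, with a null effect built in to echo our findings. The 1.0 cutoff reflects the financial responsibility composite score rules, but everything else is illustrative rather than our actual estimation code.

```python
# Stylized RD on simulated data: colleges with composite scores below 1.0
# fail, while those in the 1.0-1.4 zone can post a letter of credit instead.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
score = rng.uniform(0.5, 1.4, 2_000)                  # running variable near the cutoff
failed = (score < 1.0).astype(float)                  # treatment indicator
enroll = 500 + 20 * score + rng.normal(0, 25, 2_000)  # no true effect of failing

centered = score - 1.0
X = sm.add_constant(np.column_stack([failed, centered, failed * centered]))
print(sm.OLS(enroll, X).fit().params[1])              # discontinuity estimate, ~0
```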

After several different analyses, the key takeaway is that students did not respond to bad news about their college’s situation by changing their enrollment patterns. If anything, enrollment may have increased slightly in some cases following placement on HCM2 or receiving a failing financial responsibility score (such as in the below figure). This finding would be consistent with students never hearing about this news or simply not having other feasible options of where to attend. I really wonder if this changes in the future as more attention is being paid to the struggles of small private colleges in particular.

I would love your feedback on these papers, as well as potential journals to explore. Thanks for reading!

New Research on Heightened Cash Monitoring

I have spent most of the last year digging into the topic of heightened cash monitoring (HCM), perhaps the federal government’s most important tool in its higher education accountability toolbox at this time. HCM places colleges’ federal financial aid disbursements under additional scrutiny in order to protect taxpayer dollars. There are two levels of scrutiny: HCM1 requires additional oversight, while the more severe HCM2 requires colleges to pay out money to students before being reimbursed by Federal Student Aid.

This seems like an obscure topic, but it affects a substantial portion of American higher education. In 2023, 493 colleges were on HCM1 and 78 colleges were on HCM2—together representing about 10% of all colleges receiving federal financial aid. And in the mid-2010s, more than 1,000 colleges were on HCM1 or HCM2 at one time.[1]

Thanks to the generous support of Arnold Ventures, my graduate research assistant Holly Evans and I dove into whether colleges responded to being placed on the more severe HCM2 status by changing their financial priorities, closing, or influencing student debt and graduation outcomes. We compared colleges placed on HCM2 to colleges that were not on HCM2, but had failed the federal financial responsibility metric (and thus also had issues identified by the federal government). Using three analytic approaches, we generally found no relationships between HCM2 status and these outcomes. It was a lot of work for no clear findings, but that is pretty typical when studying institutional responses to government policies.

Here is a copy of our working paper, which I am posting here in the hope of receiving feedback. I am particularly interested in thoughts about the analytic strategy, interpreting results, and potential journals to send this paper to. Stay tuned for more work from this project!


[1] HCM1 data were first made public in 2015 following news coverage from Inside Higher Ed, while retroactive HCM2 data were also released in 2015 with the unveiling of the modern College Scorecard.