How Acela Corridor Educational Norms Look to an Outsider

Education policy discussions in the United States tend to be dominated by people living in the Acela Corridor—the densely-populated, highly-educated, and high-income portion of the United States that is served by Amtrak’s fast train from Boston to Washington, DC. Since moving from the Midwest to New Jersey four years ago to start on the tenure track at Seton Hall, I have probably logged 50 trips to Washington via Amtrak’s slower (and less-expensive) Northeast Regional train. (It sure beats driving, and Amtrak’s quiet car is a delight!)

Many of the suburban communities in northern New Jersey have median household incomes of well over $100,000 per year, which is roughly the top 20% of American families. The top 20% is notable because that is the cutoff that the Brookings Institution’s Richard Reeves uses in his new book, Dream Hoarders, to highlight how upper-income individuals have taken steps to make sure their children have every opportunity possible—typically at the expense of other families. The sheer concentration of high-income families within much of the Acela Corridor has created a powerful set of social norms regarding education that can leave outsiders flabbergasted.

Yet in spite of having two parents with bachelor's degrees, holding a PhD in education, and being one half of a two-income professional household, I find myself confused by a number of practices that are at least somewhat common in the Acela Corridor but not in other parts of the country. This was highlighted by a piece in Sunday's New York Times on affirmative action. The reporter spoke with two students at private boarding schools in New Jersey, of which there are apparently a fair number. My first reaction, as a small-town Midwesterner, was a little different from what many of my peers would think.

Here are some other things that have surprised me in my interactions with higher-income families in the Acela Corridor:

  • K-12 school choice debates. Unlike some people in the education world, I don’t have any general philosophical objections to charter schools. But in order for school choice to work (barring online options), there needs to be a certain population density. This is fine in urban and suburban areas, but not so great in rural areas where one high school may draw from an entire county. A number of Republican senators from rural states have raised concerns about school choice as a solution for this reason.
  • SAT/ACT test preparation. I attended a small-town public high school with about 200 students in my graduating class. The focus there was to get students to take the ACT (the dominant test in America as a whole, with the coasts being the exception), while also encouraging students to take the PLAN and PSAT examinations. But I never saw a sign advertising ACT prep services, nor was I even aware that test prep was a thing people did. (I took the practice ACT that came with the exam the night before the test—that was it.) In the Northeast, there seem to be more signs on the side of the road advertising test prep than any other product or service.
  • The college admissions process. Going to a four-year college is the expectation for higher-income families in the Acela Corridor, and families treat the college choice process as being incredibly important. Using private college counselors to help manage the process, which often includes applying to ten or more colleges, is not uncommon. A high percentage of students also leave the state for college, which is quite expensive. (In New Jersey, about 37% of high school graduates head to other states to attend college.) Meanwhile, in much of the country, the goal is to get students to attend college at all rather than to get students to attend a slightly more prestigious institution. I can think of just one of my high school classmates who went out of state, and a large percentage of the class did not attend college immediately after high school.
  • Private tutoring while in college. I supplemented my income in graduate school by tutoring students in economics, typically charging between $25 and $40 per hour to meet with one or two students to help them prepare for exams. (I paid for an engagement ring using tutoring income!) I was never aware of anyone paying for private tutoring when I was an undergraduate at Truman State University, but this was a common practice at the University of Wisconsin-Madison. Nearly all of these students came from the suburbs of New York City or Washington, DC and were used to receiving private tutoring throughout their education. I got very few tutoring requests from in-state students, but they were typically paying for their own college (and thus got a substantial discount from my normal rates).

I worry about education policy discussions being dominated by the Acela Corridor regulars because their experiences are so different from how most Americans experience both K-12 and higher education. If education committee staffers, academic researchers, and think tankers all share similar backgrounds, the resulting policy decisions may not reflect the needs of rural and urban lower-income individuals. It is important to seek out people from other walks of life to make sure policies are best for all Americans.

ACT Scores Fell Last Year. Relax!

As a shareholder of the Green Bay Packers, I keep an eye on what Butte Community College’s most famous student-athlete has to say. Packers quarterback Aaron Rodgers famously told fans in “Packer-land” in 2014 to “R-E-L-A-X” after the team got off to an uncharacteristically slow 1-2 start. Fans relaxed after the team went 11-2 the rest of the way in the regular season as Rodgers played like his regular self.

In the education policy niche of the world, few things get people more upset than declining standardized test scores. Last year, I wrote about the fuss over declining SAT scores—and how at least part of that decline is due to more students taking the test rather than the American education system failing young adults. Now it's ACT's turn to release its newest scores—and my message again is R-E-L-A-X.

Between 2015 and 2016, average ACT scores declined from 21.0 to 20.8 nationwide, the lowest score in at least five years. But as the now-dominant test in the United States (much to the surprise of many folks who grew up on a coast where the SAT is still common), the percentage of students taking the ACT rose from 52% in 2012 to 59% in 2015 and 64% this year. This sharp increase in ACT takers is in large part due to more states requiring all students to take the ACT as a graduation requirement. In 2016, all graduating high school seniors took the ACT in 18 states, up from 13 states in 2015.

The five states that required all students to take the ACT for the first time in 2016 all saw large decreases in their average scores, as shown below. Wisconsin, Missouri, and Minnesota all had about 75% of their students taking the ACT in 2015 and had drops of about 1.5-1.7 points when all students took the test, with South Carolina having a drop of 1.9 points as the last 38% of students took the test. Nevada had a decline of 3.3 points in 2016, but the percentage of students taking the ACT more than doubled.

State            Pct tested (2016)   Avg score (2016)   Pct tested (2015)   Avg score (2015)
Nevada                  100               17.7                 40                21.0
South Carolina          100               18.5                 62                20.4
Wisconsin               100               20.5                 73                22.2
Missouri                100               20.2                 77                21.7
Minnesota               100               21.1                 78                22.7
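If we assume the students who were already taking the ACT performed about the same as before, these state figures let us back out the implied average score of the newly required test-takers. A quick sketch of that decomposition (the `implied_new_avg` helper is my own construction, not anything published by ACT):

```python
# Back out the implied average ACT score of the students who only took the
# test once it became mandatory, assuming returning takers' scores held steady.
# If pct_new of students now average avg_new overall, and the original pct_old
# averaged avg_old, the newly added (pct_new - pct_old) share must average:
def implied_new_avg(pct_new, avg_new, pct_old, avg_old):
    return (pct_new * avg_new - pct_old * avg_old) / (pct_new - pct_old)

# (pct tested 2016, avg 2016, pct tested 2015, avg 2015)
states = {
    "Nevada":         (100, 17.7, 40, 21.0),
    "South Carolina": (100, 18.5, 62, 20.4),
    "Wisconsin":      (100, 20.5, 73, 22.2),
    "Missouri":       (100, 20.2, 77, 21.7),
    "Minnesota":      (100, 21.1, 78, 22.7),
}

for name, args in states.items():
    print(f"{name}: implied average for new test-takers = {implied_new_avg(*args):.1f}")
```

In every state, the implied average for the newly tested students lands around 15 to 16—consistent with the point that the statewide declines reflect who is taking the test, not worse performance by the students who were taking it all along.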


Among the other 45 states that had very small changes in ACT participation rates, the average change in scores at the state level (not weighted for size) was effectively zero. So R-E-L-A-X about test score declines when they are due to more students taking the test (some of whom won’t be going to college, anyway) instead of collegegoing students suddenly performing worse.

Will Colleges Send Out Financial Aid Packages Earlier Next Year?

I’m looking forward to college students being able to submit the Free Application for Federal Student Aid (FAFSA) three months earlier next year. Instead of being able to submit starting January 1 for the 2017-18 academic year, students will be able to submit beginning October 1—giving students an additional three months to complete the form thanks to using ‘prior prior year’ (PPY) income and asset data. This means that students can get an estimate of their eligibility for federal grants and loans as soon as late fall, which has the potential to help inform the college choice process.

But there is no guarantee that students will get their final financial aid package from the college any earlier as a result of prior prior year. Recognizing this, Undersecretary of Education Ted Mitchell recently sent a letter to college presidents asking colleges to send out their aid packages earlier in order for students to fully benefit from PPY. Will colleges follow suit? I expect that some will, but the colleges with the greatest ability to offer institutional grant aid probably won’t. Below, I explain why.

The types of colleges that can easily respond to PPY by getting aid packages out earlier are those institutions with rolling admissions deadlines. (Essentially, it’s first-come, first-served among students who meet whatever admissions criteria are present—less-selective four-year and virtually all two-year colleges operate in this manner.) Some of these colleges already offer their own grant aid upon admission, but these colleges tend to have less grant aid to offer on account of relatively low sticker prices and fewer institutional resources. Additionally, these colleges often take applications well into the spring and summer—after students can already file the FAFSA under current rules.

It is less likely that the relatively small number of highly-selective colleges that get a disproportionate amount of media coverage will respond to PPY by getting financial aid offers out any earlier. For example, the Ivy League institutions didn't even release their admissions notifications for students applying through the regular route until the last day of March, which gives students plenty of time to complete the FAFSA under current rules. Moving up the notification date to January is definitely feasible under PPY, but it requires students to apply earlier—and thus take tests like the ACT or SAT earlier. All students are supposed to commit to one college by May 1, giving students one month under current rules to compare aid packages and make a decision. Colleges may oppose extending this decision period because it would give students more time to compare offers and potentially request more money.

I suspect the Department of Education sent their letter to colleges in an effort to get the admissions notification dates at selective colleges moved up, but this goes against the incentives in place at some colleges to reduce the comparison shopping period. Prior prior year still allows students to get their federal aid eligibility earlier, which is a good thing. But for quite a few students, they won’t get their complete financial aid package any earlier.

It’s National College Decision Day. So What?

May 1 is known as National College Decision Day, as it is often the deadline for students to make deposits to attend the college of their choice. Both local and national media love to highlight students who attend selective institutions, making it seem like May 1 applies to many students who are holding offers from multiple institutions. It’s also spawned a Twitter hashtag of #DecisionDay, which is worth a look. But in reality, the May 1 deadline doesn’t apply to that many students. Below are some reasons why.

(1) In the community college and less-selective four-year sectors, many students apply for admission well after May 1. For example, the University of Missouri-St. Louis, which admitted about two-thirds of applicants for the fall 2012 semester, does not have a firm cutoff date for admission. For-profit institutions often have rolling admissions, meaning that the May 1 deadline really applies only to more selective public and nonprofit colleges.

(2) The decision day only really matters to students who applied and are admitted to multiple colleges. Given that most students stay close to home to attend college (as illustrated in these great charts by data wizard Jon Boeckenstedt) and don’t apply to more than three or four colleges, students may not even wait until the last minute to make their decision. I only applied to two colleges and made my decision in October (thank you, rolling admissions!), so I submitted my deposit well before the May 1 deadline.

(3) Just because a student submits a deposit doesn’t necessarily mean he or she will actually enroll in the fall. Some students submit deposits to multiple institutions, as the cost is often relatively small (as examples, Montclair State requires a $525 deposit and Seton Hall requires $625). Submitting multiple deposits is highly unethical according to admissions professionals, as they want certainty in the sizes of their incoming classes. But anecdotal conversations with enrollment management professionals reveal a rising rate of (suspected) multiple deposits, even though colleges may be able to rescind admissions offers under these circumstances.

At less-selective institutions with May 1 deposit deadlines, “summer melt,” in which students intend to go to college but fail to enroll anywhere in the fall, can be a concern. Researchers have estimated the rate of summer melt at between 10 and 40 percent, although the number is likely on the lower end for the types of colleges with May 1 deposit deadlines. This is a factor that colleges may be able to mitigate with good outreach programs and summer interventions.

So pardon my lack of excitement for National Decision Day, as it doesn't really affect that many students. If the goal is to support students whose success in college is far from guaranteed, let's focus on the students who apply after May 1, and on making sure they actually show up in the fall.

Should College Admissions be Randomized?

Sixty-nine percent of students who apply to Stanford University with perfect SAT scores are rejected. Let that sink in for a minute…getting a perfect SAT score is far from easy. In 2013, the College Board reported that only 494 students out of over 1.6 million test-takers got a 2400. Stanford enrolled roughly 1,700 students in its first-year class in 2012, so the vast majority could not have had perfect SAT scores. Indeed, according to federal IPEDS data, the 25th and 75th percentiles of SAT scores for the fall 2012 incoming class were 2080 and 2350, respectively. But all of those scores are pretty darned high.
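A quick back-of-the-envelope calculation using the numbers above makes the point concrete. (This deliberately makes the generous assumption that every perfect scorer in the country applied to Stanford and faced the 69% rejection rate.)

```python
perfect_scorers = 494   # 2400s nationwide in 2013, per the College Board
rejection_rate = 0.69   # share of perfect scorers Stanford rejects
class_size = 1700       # approximate Stanford first-year class, 2012

# Even if every perfect scorer applied, at most this many could enroll:
max_admitted = round(perfect_scorers * (1 - rejection_rate))
share_of_class = max_admitted / class_size
print(max_admitted, f"{share_of_class:.0%}")  # prints: 153 9%
```

At most about one in eleven seats in the class could possibly go to a perfect scorer, even under the most extreme assumptions.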

It is abundantly clear that elite institutions like Stanford can pick and choose from students with impeccable academic qualifications. The piece from the Stanford alumni magazine that noted the 69% rejection rate for perfect SAT scorers also noted the difficulty of shaping a freshman class from the embarrassment of riches. All students Stanford considers are likely to graduate from that institution—or any other college.

Given that admissions already seem to be somewhat random, some have suggested that elite colleges actually randomize their admissions processes by selecting students at random conditional on meeting certain criteria. While the current approach provides certain benefits to colleges (most notably allowing colleges to shape certain types of diversity and guaranteeing spots to children of wealthy alumni), randomizing admissions could drastically cut the cost of running an admissions office and reduce the ability of students and their families to complain about the outcome. (“Sorry, folks…you called heads and it came up tails.”)
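The proposal amounts to a simple lottery over the qualified pool. A minimal sketch of how such a process could work—the GPA and SAT thresholds and the applicant records below are hypothetical, not drawn from any college's actual criteria:

```python
import random

def lottery_admit(applicants, n_seats, min_gpa=3.7, min_sat=2250, seed=2016):
    """Admit n_seats applicants uniformly at random from those meeting the
    minimum criteria; everyone below the bar is rejected outright."""
    qualified = [a for a in applicants if a["gpa"] >= min_gpa and a["sat"] >= min_sat]
    rng = random.Random(seed)  # a fixed seed would make the lottery auditable
    return rng.sample(qualified, min(n_seats, len(qualified)))

# Hypothetical applicant pool (SAT on the 2400 scale used in the post):
pool = [
    {"name": "A", "gpa": 3.9, "sat": 2300},
    {"name": "B", "gpa": 3.2, "sat": 2350},  # below the GPA bar
    {"name": "C", "gpa": 3.8, "sat": 2280},
    {"name": "D", "gpa": 4.0, "sat": 2100},  # below the SAT bar
    {"name": "E", "gpa": 3.7, "sat": 2250},
]
admitted = lottery_admit(pool, n_seats=2)
print([a["name"] for a in admitted])
```

Everyone admitted is drawn only from the qualified applicants (A, C, and E here); among them, the coin flip does the rest.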

As a researcher, I would love to see a college commit to randomizing most or all of its admissions process over a period of several years. The outcomes of these randomly accepted students could then be compared both to the students who were qualified but randomly rejected and to the outcomes of previous classes of students. My sense is that the randomly accepted students would be roughly as successful as those admitted under regular procedures in prior years.

Would any colleges like to volunteer a few incoming classes?

Another Acceptance Angst Article

Having spent three years in a college admissions office, I know this is the time of year in which some students find out whether they were accepted to the college(s) of their dreams. I am particularly annoyed by the New York Times’s “The Choice” blog, which is clearly aimed toward students with academic credentials suitable for Ivy League institutions. My annoyance rises because this blog focuses its attention on such a small number of institutions which are academically out of reach of nearly all students and perceived to be financially out of reach of almost everyone (although this is not the case).

The Choice annually follows a small group of students who apply to many of these highly selective institutions and who are often shocked when they receive a rejection letter. While I am glad that the blog now includes more students from geographically and economically varied backgrounds, most of the bloggers' stories are still sufficient to cause angst among many well-prepared students. Take, for example, Leobardo Espinoza, Jr., from Topeka, Kansas. His most recent post was full of angst about getting rejected by Washington University in St. Louis, one of the most selective colleges in the Midwest. Thankfully, he eventually realized that he had already been accepted by American, Amherst, and Bowdoin, as well as Kansas and Wichita State. But I am concerned that many readers will get the wrong impression from his post.

I am glad that the blog is finally featuring students who apply to at least a few local options. If students have a choice, I strongly recommend avoiding as much debt as possible along the road to a bachelor’s degree by staying in-state or attending private colleges with generous financial aid options. Yes, getting rejected by one prestigious college stings and it makes for great reading among the NYT’s elite readership. But it’s not the end of the world, and I think that Mr. Espinoza has realized that in spite of the title of the article.

Making the College Scorecard More Student Friendly

The Obama Administration and the U.S. Department of Education have spent a great deal of time and effort developing a simple one-page “college scorecard.” The goal of this scorecard is to provide information about the cost of attending college, average graduation rates, and student debt. The Department of Education has also launched a College Affordability and Transparency Center, which seeks to highlight colleges with unusually high or low costs to students.

Although I have no doubt that the Administration shares my goal of facilitating the availability of useful information to prospective students and their families, I doubt the current measures are having any effect. The college scorecard is difficult to understand, with technical language that is second nature to higher education professionals but is completely foreign to many prospective students. Because of this, I was happy to see a new report from the Center for American Progress, a liberal think tank, suggesting improvements to the measures. (As a side note, liberal and conservative think tanks work together quite a bit on issues of higher education. Transparency and information provision are nearly universal principles, and partisan concerns such as state-level teachers’ unions and charter schools just aren’t as present in higher ed.)

The authors of the report took the federal government’s scorecard and their own version to groups of high school students, where they tested the two versions and suggested improvements. The key points aren’t terribly surprising—focusing on a few important measures with simple language is critical—but it appears that the Department of Education has not yet done adequate testing of their measure. I am also not surprised that students prefer to see four-year graduation rates instead of six-year rates, as everyone thinks they will graduate on time—even though we know that is far from the case.

The changes to the college scorecard are generally a good idea, but I remain concerned about students’ ability to access the information. Even if the scorecard is required to be posted on a college website (like certain outcome measures currently are), it does not mean that it will be easy to access. For example, the graduation rate for first-time, full-time students who received a Pell Grant during their first year of college must be posted on the college’s website, but actually finding this information is difficult. I hope outside groups (such as CAP) will continue to publicize the information, as greater use of the data is the best way to influence colleges’ behavior.

Am I On the Wrong Job Market?

In light of being on the academic job market this year, I was amused to get the following mailing from the local branch of Globe University. (Even though the mailing was addressed to me, it was also addressed to “Or Current Resident.”)

[Image: Globe University mailing]

The message is quite simple: graduates of this university get jobs. The mailing advertises that 100% of graduates with a bachelor’s degree in business administration get employment, although the fine print does mention that “employment is not guaranteed.” However, not much can be said about the graduation rates of students attending any of the Globe campuses, both because very few students attending Globe are first-time, full-time students (which are the only students counted in federal graduation rate calculations) and because many campuses (including Madison) have not been open long enough to have a graduation rate cohort.

To get an idea of graduation rates, I looked at the oldest Globe campus, in Brooklyn Center, MN. The reported graduation rate is 23%, with an overall career placement rate of 72%. Meanwhile, tuition is over $5,000 per quarter before mandatory course fees. I’m not saying that Globe University is a bad bet for students, but some students are likely to benefit more by attending the local technical college.

Moral of the story: Don’t believe colleges which imply that everyone gets a job. This isn’t even true at the most prestigious colleges, let alone for relatively unknown for-profit institutions. (Now, I do hope that my UW-Madison PhD helps me get a great job!)

Not Every College is Elite

Like many happenings in American society, the perception of the college selection process is driven by the most elite people and institutions. There are plenty of stories out there about how students apply to more than ten colleges, yet are lucky to get accepted to only their “safety school.” (For example, look at this blog from the newspaper of America’s elite.) But many prospective students and their families do not realize that the majority of four-year colleges are not highly selective and will admit most high school graduates.

An article in today’s Inside Higher Ed (titled “The (Needless?) Frenzy”) highlights the results of a national survey conducted by the National Association for College Admission Counseling (membership required to see the full report). The results of the survey show that the average four-year institution admits about two-thirds of its applicants, with little difference between public and private colleges. I examined federal IPEDS data for 2010-11 admit rates for the 1569 colleges in this year’s Washington Monthly college rankings and also found that the average college admitted 65% of applicants. The below graph shows the distribution of admission rates:

[Figure: distribution of 2010-11 admission rates across the 1,569 ranked colleges]

Not every college is elite enough to be able to reject most of its applicants. Although students tend to apply to colleges that should give them at least a chance of admission, it is worth noting that top-rated colleges in the Washington Monthly rankings admit more students than the average. For example, my alma mater, Truman State University, admits 74% of its applicants while ranking sixth in the master’s university rankings. Truman is certainly selective (with a median ACT of 28), but it is far from elite. Prospective students and their families need to keep in mind that there are very good schools out there that are not absurdly selective, and policymakers should focus their efforts on helping students succeed at these institutions.

Sticker Shock in Choosing Colleges: What Can Be Done?

Very few items are priced in the same manner as a college education. While the price of some items, such as cars and houses, can be negotiated downward from a posted (sticker) price, the actual price and the sticker price are usually in the same ballpark. In higher education, however, the difference between the sticker price and the actual price paid can be enormous. This has posed a substantial problem for students and their families, especially those with less knowledge of the collegegoing and financial aid processes.

Until recently, students had to apply for financial aid to get an idea of how much college would actually cost them. The latest iteration of the Higher Education Opportunity Act, signed in 2008, required that institutions place a net price calculator on their website by last October. This calculator uses basic financial information such as income, household size, and dependency status to estimate a student’s expected family contribution (EFC), which would then give students an idea of their grant aid.
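The arithmetic these calculators perform is conceptually simple even though the federal EFC formula itself is not. A heavily simplified sketch—the `grant_coverage` parameter, representing the share of demonstrated need a college meets with grant aid, is my own illustrative simplification, not part of any federal formula:

```python
def estimated_net_price(cost_of_attendance, efc, grant_coverage=0.75):
    """Toy net price estimate: demonstrated need is cost minus the expected
    family contribution (EFC), and the college covers some fraction of that
    need with grant aid. Loans and work-study are ignored here."""
    need = max(0, cost_of_attendance - efc)
    grant = grant_coverage * need
    return cost_of_attendance - grant

# A high-sticker-price college meeting full need can be cheaper than it looks:
print(estimated_net_price(55000, 8000, grant_coverage=1.0))  # net price equals EFC
print(estimated_net_price(55000, 8000, grant_coverage=0.5))  # far pricier in practice
```

Under full-need coverage the family pays only its EFC regardless of the sticker price, which is exactly why ruling out expensive, well-endowed colleges on sticker price alone can be a mistake.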

The need for more transparent information on the actual cost of college is shown by a recently released poll conducted by the College Board and Art & Science Group, LLC. These groups polled a nonrandom sample of SAT test-takers applying to mainly selective four-year colleges and universities in late 2011 and early 2012 and found that nearly 60% of low- and middle-income families ruled out colleges solely because of the sticker price. This is in spite of generous need-based financial aid programs at some expensive, well-endowed colleges.

Given that the survey was conducted right as net price calculators became mandatory, it is likely the case that more students are aware of these tools by now. But it is unlikely that net price calculators have been used as much as possible, especially by first-generation students. To make the net price more apparent, the Department of Education has put forth a proposed “Shopping Sheet” that can be easily compared across colleges. This proposal has advocates in Washington, but there are reasonable concerns that a one-size-fits-all model may not benefit all colleges.

As an economist, I hope that better information can help students and their families make good decisions about whether to go to college and where to attend. However, I am also hesitant to believe that requiring uniform information across colleges will result in something useful.