Should Payscale’s Earnings Data Be Trusted?

Despite the large amount of money spent on higher education, prospective students, their families, and the public have historically known very little about the earnings of students who attend college. This has started to change in recent years, as a few states (such as Virginia) have begun to publish earnings data for their graduates who stayed in state, and the federal government publishes earnings data for certain programs through gainful employment rules. But this leaves out many public and private nonprofit institutions, and complete data are not available without a student unit record system.

As is often the case, the private sector steps in to try to fill the gap. Payscale.com has collected self-reported earnings data by college and major among a large number of bachelor’s degree recipients (those with a higher degree are excluded—the full methodology is here). Their 2014 “return on investment” report ranked colleges based on the best and worst dollar returns, with Harvey Mudd College at the top with a $1.1 million return over 20 years and Shaw University at the bottom with a return of negative $121,000.

Payscale's data are self-reported earnings from individuals who happened to visit Payscale's website and were willing to provide estimates of their annual earnings. My strong suspicion is that self-reported earnings from these individuals are substantially higher than those of the average bachelor's degree recipient, and the estimates are often based on a relatively small number of students. For example, the estimates for my alma mater, Truman State University, are based on 251 graduates for a college that graduates about 1,000 students per year. Because many Truman students go on to earn advanced degrees, probably about 500 students per year would qualify for the Payscale sample. Yet only 102 students provided data within five years of graduation, roughly four percent of the graduates who did not pursue further degrees.
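The coverage arithmetic above can be checked with a quick back-of-the-envelope calculation. All of the inputs are the rough figures quoted in the paragraph, not official counts:

```python
# Rough sample-coverage estimate for Truman State in the Payscale data.
# All inputs are the approximate figures from the paragraph above,
# not official institutional data.

grads_per_year = 1000            # approximate bachelor's degrees per year
no_advanced_degree_share = 0.5   # rough share who do not earn a higher degree
years_in_window = 5              # "within five years of graduation" window
payscale_respondents = 102       # self-reports within that window

eligible_pool = grads_per_year * no_advanced_degree_share * years_in_window
coverage = payscale_respondents / eligible_pool

print(f"Eligible pool: {eligible_pool:.0f} graduates")
print(f"Sample coverage: {coverage:.1%}")   # roughly 4 percent
```

With a sample that small and self-selected, even modest reporting bias can move the institutional averages substantially.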

But is it still worth considering? Yes and no. I don't put a lot of stock in the absolute earnings listed, since they're likely biased upward and based on relatively few respondents. Additionally, there is no adjustment for cost of living, which flatters colleges in expensive urban areas. But the relative positions of institutions with similar focuses in similar parts of the country are probably somewhat close to what complete data would show. If the self-reporting bias is similar across institutions, then controlling for cost of living and the composition of graduates could yield useful information.

I hope that Payscale will produce a version of its ROI estimates that takes cost of living into account, and will explore whether its data are reasonably representative of a particular college's bachelor's degree recipients. Although I commend the company for providing a useful service, I still recommend taking the dollar values of the ROI estimates with a shaker of salt.

Author: Robert

I am a professor at the University of Tennessee, Knoxville, who studies higher education finance, accountability policies and practices, and student financial aid. All opinions expressed here are my own.

13 thoughts on “Should Payscale’s Earnings Data Be Trusted?”

  1. “those with a higher degree are excluded”

    I have been thinking about this issue for some time now and its applicability to long-term wage outcomes, and I have decided it may be wrong. It makes sense on the surface, but it ignores the fact that many individuals have to accumulate some kind of ongoing learning to stay up to date, or even licensed. Often this takes the form of continuing education credit, but it may be degree-eligible credit. Since these activities cannot be excluded, even in fields like engineering and education where we know continuing education is required, I see no reason to exclude higher-level degrees simply because we can identify them more readily. It all serves the same purpose.

    1. Agreed. I think that including at least master’s degrees as a separate category would be useful. However, including graduate degrees gets (deeper) into the issue of multiple institutional contributions–something which is probably easier to avoid without a large sample size.

      1. Right. Which is also why we report each degree level separately. And our entering-cohort-based outcomes are limited to associate or bachelor's degrees, as appropriate.

  2. How do you account for the differences in the earning ability of majors? If college A graduates mostly engineers and college B graduates mostly elementary education majors, how would you guard against concluding that college A itself has something to do with the difference?

    1. That’s one of the biggest concerns in using wage data in any accountability system. If I were using wage data in rankings, I would control for the mix of graduates to help compare more similar schools. IPEDS has data on degrees granted by major in a given year. If the desired outcome measure is earnings 5 years after graduation, use the mix of majors 5 years ago to help adjust for differences in missions and degree offerings.

      (I would also use wage data for dropouts and transfers instead of just graduates, but that's another argument for another day. ;)
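The adjustment described in the reply above can be sketched as a simple reweighting: compare each college's observed average wage to the wage its major mix alone would predict. All figures below are made-up illustrations, and `national_wage_by_major` is a hypothetical input, not an actual IPEDS or wage-data field:

```python
# Sketch of a major-mix adjustment: the expected earnings implied by a
# college's degree composition, using hypothetical national averages by
# major. All figures are illustrative, not real IPEDS or wage data.

national_wage_by_major = {      # hypothetical national medians 5 years out
    "engineering": 75000,
    "business": 55000,
    "elementary education": 40000,
}

def expected_wage(major_mix):
    """Weighted-average wage implied by a college's major mix alone."""
    return sum(share * national_wage_by_major[major]
               for major, share in major_mix.items())

# College A graduates mostly engineers; College B mostly educators.
college_a = {"engineering": 0.7, "business": 0.2, "elementary education": 0.1}
college_b = {"engineering": 0.1, "business": 0.2, "elementary education": 0.7}

for name, mix, observed in [("A", college_a, 70000), ("B", college_b, 47000)]:
    benchmark = expected_wage(mix)
    print(f"College {name}: observed {observed}, mix-adjusted benchmark "
          f"{benchmark:.0f}, difference {observed - benchmark:+.0f}")
```

The point of the comparison is that the gap between observed and benchmark wages, not the raw average, is the number that says something about the college itself rather than its mix of programs.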

  3. Thanks for pointing out these rather significant flaws in Payscale's approach. In addition to the cost-of-living issue, I think the removal of anyone with a post-graduate degree is a huge flaw in the data. AAC&U's recent report on liberal arts graduates and their long-term employment outcomes shows that about 40% of those with a humanities or social science BA also hold a graduate degree. That's a huge number of graduates!

    1. I’m an economics and finance double major from an AAC&U institution (Truman State) with a PhD, so I’m also not in their data.

      But, with that being noted, it’s okay if Payscale excludes people with a graduate degree if there is a high correlation between earnings with a bachelor’s degree and earnings with a graduate degree. It’s unclear whether that is the case, though.

      1. Yep, that’s my point. This is why I think that, regardless of how carefully they present the data or the various correlations you describe, they are misleading to your average prospective student in two ways. They may discourage someone from getting a humanities or social science undergraduate degree even though many very successful, highly-paid people (often the leaders in their fields!) begin in exactly that educational space. However, these kinds of rankings miss an important “message” for liberal arts students–grad school matters a lot in terms of long-term earnings. They need to be planning for how they will finance grad or professional school. We, in higher ed, are doing a terrible job of putting these kinds of data in context for students and helping them plan both educationally and financially for their entire educational pathway and the transition from college to career.

