Just like the leaves starting to change colors (in spite of the miserable 93-degree heat outside my New Jersey office window) and students returning to school are clear signs of fall, another indicator of the change in seasons is the proliferation of college rankings that get released in late August and early September. The Washington Monthly college rankings that I compile were released the week before Labor Day, and MONEY and The Wall Street Journal have also released their rankings recently. U.S. News & World Report caps off rankings season by unveiling their undergraduate rankings later this month.
People quibble with the methodology of these rankings all the time (I get e-mails by the dozens about the Washington Monthly rankings, and we’re not the 800-pound gorilla of the industry). Yet these rankings are at least based on data that can be defended to some extent, and their methodologies are generally transparent. Even rankings of party schools, such as this Princeton Review list, have a methodology section that does not seem patently absurd.
But since America loves college rankings—and colleges love touting rankings they do well in and grumbling about the rest of them—a number of dubious college rankings have developed over the years. I was forwarded a press release about one particular set of rankings that immediately set my BS detectors into overdrive. This press release was about a ranking of the top 20 fastest online doctoral programs, and here is a link to the rankings that will not boost their search engine results.
First, let’s take a walk through the methods section. There are three red flags that immediately stand out:
(1) The writing resembles a “word salad” and clearly was never edited by anyone. Reputable rankings sites use copy editors to help methodologists communicate with the public.
(2) College Navigator is a good data source for undergraduates, but does not contain any information on graduate programs (which they are trying to rank) other than the number of graduates.
(3) Reputable rankings will publish their full methodology, even if certain data elements are proprietary and cannot be shared. And trust me—nobody wants to duplicate this set of rankings!
As an example of what these rankings look like, here is a screenshot of how Seton Hall’s online EdD in higher education is presented. Again, let’s walk through the issues.
(1) There are typos galore in their description of the university. This is not a good sign.
(2) Acceptance/retention rate data are for undergraduate students, not for a doctoral program. The only way they could get these data is by contacting programs, which costs money and runs into logistical problems.
(3) Seton Hall is accredited by Middle States, not the Higher Learning Commission. (Thanks to Sam Michalowski for bringing this to my attention via Twitter.)
(4) In a slightly important point, Seton Hall does not offer an online EdD in higher education. Given that I teach in the higher education graduate programs and am featured on the webpage for the in-person EdD program, I’m pretty confident in this statement.
For any higher education professionals who are reading this post, I have a few recommendations. First, be skeptical of any rankings that come from sources that you are not familiar with, and triple that skepticism for any program-level rankings. (Ranking programs is generally much harder due to a lack of available data.) Second, look through the methodology with the help of institutional research staff members and/or higher education faculty members. Does it pass the smell test? And finally, keep in mind that many rankings websites can only turn a profit by getting colleges to highlight their rankings, thus driving clicks to these sites. If colleges were more cautious about posting dubious rankings, it would shut down some of these websites while also avoiding embarrassment when someone finds out that a college fell for what is essentially a ruse.
This is great advice, but in my experience the people who most frequently cite or refer to these dubious (or outright fraudulent) rankings have little or no interest in whether the rankings have any rigor. The college and university public relations employees who cite these rankings, often using the exact same language as the press release promoting them, seem interested in printing any positive information they can obtain about their client. Outside of colleges and universities, some reporters appear eager for any information they can easily cast into a story or blog post, so they exhibit the same level of credulity as our colleagues in public relations. The only other group of people who seem to promote these rankings are the people who create them, and of course they have a vested interest in promoting their websites and publications as cheaply, quickly, and easily as possible. In fact, it has been my observation that many rankings and ranking websites exist entirely to (a) display ads to visitors and (b) sell leads to (less-than-reputable) colleges and universities desperate for students.
(If it hasn’t been written yet, there is an article to be written about the many fly-by-night rankings websites that have sprung up over the past couple of years. They tend to have obvious domain names and use publicly available data, such as IPEDS and College Scorecard data, to slop together tables and lists that give the website the thinnest veneer of credibility so they can sell ads and leads. Many of them seem to be focused on specific disciplines and programs. I also suspect, but haven’t done any work to prove, that many of these websites are owned by the same companies or individuals. It can’t be a coincidence that so many of them look the same, with the same layout and language!)
I agree completely. There is definitely a market for a news article, or even a scholarly article, on the rise of dubious rankings. At some point, an investigation is going to embarrass a number of colleges for publicizing rankings that aren’t at all based on their programs’ actual characteristics.
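To make that point concrete, here is a minimal sketch of how a "ranking" can be thrown together from public data in a few lines of code. It assumes a locally downloaded College Scorecard institution-level CSV; the file name and the weighting formula are hypothetical, and the column names (INSTNM, ADM_RATE, C150_4) should be checked against the current Scorecard data dictionary before anyone relies on them.

```python
# A sketch of how a "ranking" gets slopped together from public data.
# Hypothetical: the local file name and the weighting formula are invented
# for illustration; verify column names against the Scorecard data dictionary.
import pandas as pd

SCORECARD_CSV = "Most-Recent-Cohorts-Institution.csv"  # assumed local download

# INSTNM = institution name, ADM_RATE = admission rate,
# C150_4 = completion rate within 150% of normal time at 4-year institutions
cols = ["INSTNM", "ADM_RATE", "C150_4"]
df = pd.read_csv(SCORECARD_CSV, usecols=cols, low_memory=False)

# Scorecard files mix numbers with flags like "PrivacySuppressed",
# so coerce the numeric columns and drop rows that will not parse.
for col in ["ADM_RATE", "C150_4"]:
    df[col] = pd.to_numeric(df[col], errors="coerce")
df = df.dropna()

# An arbitrary, unexplained weighting: exactly the kind of "methodology"
# these sites rely on, not a defensible one.
df["score"] = 0.5 * df["C150_4"] + 0.5 * (1 - df["ADM_RATE"])

top_20 = df.sort_values("score", ascending=False).head(20)
print(top_20[["INSTNM", "score"]].round(3).to_string(index=False))
```

The point is not that any particular weighting is right or wrong; it is that a weighted sort of whatever columns happen to be public is roughly all the "methodology" many of these sites appear to have.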