By Rajkamal Rao
|Image Credit: Rao Advisors, from covers of various publications|
Ranking anything in the US is big business. Cars, hospitals, colleges, phone companies, home builders and consumer products are all ranked by various media outlets and customer satisfaction companies.
Consumers crave rankings because they make it easy to sift good products and services from bad without doing any product research themselves. The organizations being ranked covet these lists and go to extraordinary lengths to appear high in them each year.
When it comes to college rankings, our position is rather radical. We strongly believe that for most students applying to U.S. colleges and graduate schools, rankings don't matter much at all. There are far better ways to choose colleges than school rankings, because there are just too many problems with the current commercial rankings.
Problem #1: Does the methodology make sense?

About half a dozen outfits publish college rankings, and each uses its own methodology to arrive at a rank. U.S. News, the largest and most popular college ranking organization, says that it gathers and weights data from each college on some 15 indicators of academic excellence, such as:
Graduation and retention rates (22.5 percent)
Undergraduate academic reputation (22.5 percent)
Faculty resources, such as salary and class size (20 percent)
Student selectivity (12.5 percent)
The weights reflect U.S. News' judgment about how much each measure matters. As anyone who has played around with an Excel sheet knows, if you change the weights, the rankings change too. So the first step in accepting the U.S. News rankings as the Holy Gospel is agreeing that its indicators and weights are just as meaningful to you as they are to U.S. News.
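The weight-sensitivity point can be sketched with a toy calculation. The school names, indicators and figures below are all invented for illustration; they are not U.S. News data:

```python
# Two hypothetical schools scored on three made-up indicators
# (all figures invented for demonstration).
schools = {
    "School A": {"graduation": 0.95, "reputation": 0.70, "selectivity": 0.90},
    "School B": {"graduation": 0.80, "reputation": 0.95, "selectivity": 0.75},
}

def rank(weights):
    """Return school names ordered by weighted composite score, best first."""
    scores = {
        name: sum(weights[k] * value for k, value in indicators.items())
        for name, indicators in schools.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

# Emphasize graduation outcomes: School A comes out on top.
print(rank({"graduation": 0.6, "reputation": 0.2, "selectivity": 0.2}))

# Emphasize reputation instead: School B now ranks first.
print(rank({"graduation": 0.2, "reputation": 0.6, "selectivity": 0.2}))
```

Same data, different weights, opposite order: the "rank" is an artifact of whoever chose the weights.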
For most people, this is a problem. Most students go to college or graduate school not only for the experience of learning and exploration, but also to find better employment after graduation. Yet the U.S. News ranking methodology does not consider how employers rank the school! Nor does it include graduate placement statistics: details about how many, and what kinds of, jobs students got after graduation. If the most famous ranking outfit does not think graduate outcomes are important, something is wrong.
Problem #2: Ranking is not the same as reputation

We do not need any publication to tell us that elite schools such as the Ivies, Stanford, MIT, Caltech, Berkeley, Duke, Rice and Carnegie Mellon are outstanding institutions of learning. We already know this. This kind of reputation is earned over decades (and in the case of the Ivy League, generations) of hard work and accomplishment. In a sense, reputed schools are famous for being famous.
Rankings, however, ebb and flow with the tide. It is nearly impossible to take an institution, with millions of complex interactions over the course of a year among its management, faculty, students, parents, employers and trustees, and reduce them all to a single number.
There is also the question of how valid the underlying data for all these interactions is. Most ranking outfits rely on the schools themselves to provide information, such as admissions figures, financial resources, graduation rates and alumni giving, because no transparent, central hub of data exists. This creates an inherent conflict of interest: a school that reports truthfully to an external organization could end up ranked lower. Should the school be truthful, or massage its data a bit so that it ends up higher? And when perfect data is not available, every ranking organization makes assumptions to compute scores. Are all those assumptions valid?
The Obama White House criticized this approach in a September 2015 fact sheet as old and static, not consistent with what families and students need. “The old way of assessing college choices relied on static ratings lists compiled by someone who was deciding what value to place on different factors”. [Emphasis ours].
Problem #3: Commercial school rankings are, well, commercial

A key issue with commercial school rankings is exactly that: they are produced by commercial, for-profit companies, which love the status quo. The ultimate goal of these outfits is to sell their rankings or build a brand around them. In 2007, the U.S. News site was regularly getting about half a million hits a month; within three days of the rankings' release, traffic went up to 10 million page views, a twenty-fold increase. In 2010, the company announced plans to walk away from magazine publishing altogether, focusing instead on its rankings business.
Another problem with rankings is that students are placed in the uncomfortable, counter-intuitive position of choosing a ranking system before choosing a school. Each organization uses its own method, so which ranking system is best for you? If the school you like is ranked high on a few lists but lower on others, what should you do?
The Obama administration set out to correct these flaws. Rather than rely on surveys and snapshots of data as the ranking outfits do, it proposed to use real data to rate the quality of colleges. Every student who takes out a student loan is recorded in the Department of Education's database. If a student transfers to another school, this too is reported to the government. Every student who graduates and begins a career has to file a W-4 withholding form, so the IRS knows where that student went to work and how much she is making. And if a student fails to make loan payments over a consistent period, the government knows that as well, because the IRS has the power to divert tax refunds toward unpaid loan amounts.
Problem #4: The establishment likes the status quo

With advances in data science and computing power, the government has the ability to tie all of these disparate pieces of real information into a comprehensive ranking system based on actual data and not subject to commercial interests. In 2013, President Obama announced that all 7,000 of the nation’s colleges would be ranked by the government. As the New York Times reported, the aim was to “publicly shame low-rated schools that saddle students with high debt and poor earning potential.”
But the plan ran into fierce opposition. “Critics, including many of the presidents at elite private colleges, lobbied furiously against the idea of a government rating system, saying it could force schools to prioritize money making majors like accounting over those like English, history or philosophy.”
This type of thinking is at odds with the outcome-based selection approach we have advocated for years. Students who are truly passionate about subjects like English, history or philosophy may still choose careers in those fields, but that should not stop them from knowing how much they are likely to earn after graduation, or whether the return on their college investment is likely to be poor.
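The return-on-investment idea can be made concrete with a back-of-the-envelope sketch. The formula, salaries and costs below are hypothetical assumptions for illustration, not figures from the College Scorecard or Payscale:

```python
# Illustrative sketch: a crude return-on-college-investment comparison.
# All figures are invented; real inputs would come from sources such as
# the College Scorecard or Payscale.
def college_roi(total_cost, expected_salary, baseline_salary=35000, years=20):
    """Earnings premium over a no-degree baseline across `years`,
    net of the total cost of attendance, expressed as a multiple of cost."""
    premium = (expected_salary - baseline_salary) * years
    return (premium - total_cost) / total_cost

# A pricey degree with a modest salary outcome...
print(college_roi(total_cost=250000, expected_salary=48000))

# ...can trail a much cheaper degree with a stronger salary outcome.
print(college_roi(total_cost=80000, expected_salary=65000))
```

Under these toy assumptions the expensive program barely breaks even while the cheaper one returns several times its cost, which is exactly the kind of comparison commercial rankings never surface.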
The Obama White House succumbed to this pressure from the entrenched establishment and when the new College Scorecard was released in September 2015, it did not have a ranking system.
Our takeaway

Students are better off using rankings sparingly, as a final filter if at all, rather than as a crucial pivot throughout the process. Outcome-based lists, such as those from the College Scorecard (which does not rank schools) or Payscale.com, are far better than commercial ranking lists because they keep your focus on the return on your college investment.
A Note About Rao Advisors Premium Services
Our promise is to empower you with high-quality, ethical and free advice via this website. But parents and students often ask us if they can engage with us for individual counseling sessions.
Individual counseling is part of the Premium Offering of Rao Advisors and involves a fee. Please contact us for more information.