By Rajkamal Rao
Ranking anything in the US is big business. Cars, hospitals, colleges, phone companies, home builders and consumer products are all ranked by various media outlets and customer satisfaction companies.
Consumers crave rankings because they make it easy to sift good products and services from bad without having to do any product research themselves. The organizations being ranked covet these rankings and go to extraordinary lengths to appear high up in those annual lists.
When it comes to college rankings, our position is rather radical. We strongly believe that for most students applying to U.S. colleges and graduate schools, rankings don't matter much at all. There are far better ways to choose colleges than school rankings, because there are just too many issues with current commercial rankings. We list them after the YouTube clip below.
Our approach is not to use rankings at all, but to look at what real students do when they receive offers of admission. The College Navigator, a world-class site operated by the U.S. Department of Education, maintains selectivity and yield numbers for every college under its admissions tab. Selectivity and yield are critically important metrics in higher education. Colleges with the lowest acceptance rates (selectivity) and the highest yields are obviously the most sought-after institutions, and therefore rank very high. Watch the clip below.
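To make the two metrics concrete, here is a minimal sketch of how selectivity and yield are computed from admissions counts. The figures are hypothetical and do not describe any real college.

```python
def selectivity(admitted: int, applicants: int) -> float:
    """Share of applicants who are admitted (lower means more selective)."""
    return admitted / applicants

def yield_rate(enrolled: int, admitted: int) -> float:
    """Share of admitted students who actually enroll (higher means more sought-after)."""
    return enrolled / admitted

# A hypothetical college that admits 2,000 of 40,000 applicants
# and enrolls 1,600 of those admitted:
print(f"Selectivity: {selectivity(2000, 40000):.0%}")  # 5%
print(f"Yield:       {yield_rate(1600, 2000):.0%}")    # 80%
```

A 5 percent acceptance rate paired with an 80 percent yield is the profile of a heavily sought-after school: almost everyone who is offered a seat takes it.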
Why we don't use rankings
We promised ranking enthusiasts that we would explain our skepticism, bordering on cynicism, about college rankings, so here we go.
Problem #1: Does the methodology make sense?
There are about half a dozen outfits that publish college rankings. Every outfit uses its own methodology to arrive at a rank. U.S. News, the largest and most popular college ranking organization, says that it gathers and weights data from each college on some 15 indicators of academic excellence, such as:
- Graduation and retention rates (22.5 percent)
- Faculty resources, such as salary and class size (20 percent)
The weights reflect U.S. News' judgment about how much each measure matters. As anyone who has played around with an Excel sheet knows, if you change the weights, the rankings change too. So the first step in accepting U.S. News rankings as the Holy Gospel is agreeing that its indicators and weights are just as meaningful to you as they are to U.S. News.
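The point is easy to demonstrate with a toy calculation. In this sketch, two made-up colleges swap ranks purely because the weights on the same two indicators change; every number here is invented for illustration.

```python
# Hypothetical indicator scores (0-100) for two made-up colleges.
scores = {
    "College A": {"graduation_rate": 90, "faculty_resources": 60},
    "College B": {"graduation_rate": 70, "faculty_resources": 85},
}

def rank(weights: dict) -> list:
    """Return college names ordered by weighted score, best first."""
    totals = {
        name: sum(weights[indicator] * value for indicator, value in s.items())
        for name, s in scores.items()
    }
    return sorted(totals, key=totals.get, reverse=True)

# Weight graduation rate heavily: College A comes out on top.
print(rank({"graduation_rate": 0.7, "faculty_resources": 0.3}))
# Weight faculty resources heavily: College B comes out on top.
print(rank({"graduation_rate": 0.3, "faculty_resources": 0.7}))
```

Nothing about either college changed between the two rankings; only the publisher's judgment about what matters did.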
For most people, this is a problem. Most students go to college or graduate school not only for the experience of learning and exploration, but also to find better employment after graduation. Yet the U.S. News ranking methodology does not consider how employers rate the school! Nor does it include graduate placement statistics: details about how many (and what kinds of) jobs students got after graduation. If the most famous ranking outfit does not think graduate outcomes are important, something is wrong!
Here's an excellent article in the NY Times in which a Columbia professor of math challenges Columbia's ranking. Here's his full 22-page post. And another article in the Wall Street Journal (subscription required) that describes how Columbia and other top universities pushed master’s programs that failed to generate enough income for graduates to keep up with six-figure federal loans. Columbia is one of the highest-ranked schools by U.S. News and such a high ranking probably influenced many students to sign up for programs that clearly did not produce promised outcomes.
Problem #2: Ranking is not the same as reputation
We do not need any publication to tell us that elite schools such as the Ivy League or Stanford, MIT, Caltech, Berkeley, Duke, Rice and Carnegie Mellon are outstanding institutions of learning. We already know this. That kind of reputation is earned over decades (and in the case of the Ivy League, centuries) of hard work and accomplishment. In a sense, reputed schools are famous for being famous.
Rankings, however, ebb and flow with the tide. It is nearly impossible to take an institution with millions of complex interactions involving its management, faculty, students, parents, employers and trustees over the course of a year, and reduce them all to a single number.
There is also the question of how valid the underlying data for all these interactions is. Most ranking outfits rely on the schools themselves to provide information (admissions figures, financial resources, graduation figures, alumni giving) because a transparent, central hub of data does not exist. This creates an inherent conflict of interest: if a school is truthful in its reporting to an external organization, it could end up ranked lower. Should the school be truthful, or manipulate the data a bit so that it ends up higher? And when perfect data is not available, every ranking organization makes assumptions to compute scores. Are all these assumptions valid?
The Obama White House criticized this approach in a September 2015 fact sheet as old and static, not consistent with what families and students need. “The old way of assessing college choices relied on static ratings lists compiled by someone who was deciding what value to place on different factors”. [Emphasis ours].
Problem #3: Commercial school rankings are, well, commercial
A key issue with commercial school rankings is exactly that: they are produced by commercial, for-profit companies, which love the status quo. The ultimate goal of these outfits is to sell their rankings or build a brand around them. In 2007, the U.S. News site was regularly getting about half a million hits a month. Within three days of the rankings release, traffic went up to 10 million page views, a twenty-fold increase. In 2010, the company walked away from magazine publishing altogether, focusing instead on its rankings business.
Another problem with rankings is that students are placed in the uncomfortable, counter-intuitive position of choosing a ranking system before choosing a school. Each organization uses its own method to rank, so which ranking system is best for you? If the school you like is ranked high on a few lists but lower on the others, what should you do?
The Obama administration set out to correct these flaws. Rather than rely on surveys and snapshots of data as the ranking outfits do, it proposed to use real data to rate the quality of colleges. Every student who takes out a student loan is lodged in the Department of Education's database. If a student transfers to another school, this information is also reported to the government. Every student who graduates and begins a career has to file a W-4 withholding form, so the IRS knows where that student went to work and how much she is making. If a student fails to make loan payments over a sustained period, this too is known to the government, because the IRS has the power to divert tax refunds toward unpaid loan amounts.
Problem #4: The establishment likes the status-quo
With advances in data sciences and computing power, the government has the ability to come up with a technological solution to tie all of these disparate pieces of real information into a comprehensive ranking system that is based on actual data and not subject to commercial interests. In 2013, President Obama announced that all 7,000 of the nation’s colleges would be ranked by the government. As the New York Times reported, the aim was to “publicly shame low-rated schools that saddle students with high debt and poor earning potential.”
But the plan ran into fierce opposition. "Critics, including many of the presidents at elite private colleges, lobbied furiously against the idea of a government rating system, saying it could force schools to prioritize money-making majors like accounting over those like English, history or philosophy."
This type of thinking is at odds with the outcome-based selection approach we have advocated for years. If students are truly passionate about subjects like English, history or philosophy, they may still choose careers in those fields, but that should not stop them from knowing how much they may earn after graduation, or that the return on their college investment is likely to be poor.
The Obama White House succumbed to this pressure from the entrenched establishment, and when the new College Scorecard was released in September 2015, it did not include a ranking system. The Trump administration did nothing to correct this issue, and it is unlikely that the Biden administration will rank colleges using outcomes.
Problem #5: The conflict of interest is paramount
Ranking outfits must rely on colleges to provide data and self-declare it as authentic. This presents a conflict of interest: some institutions could submit faulty data in the hope that ranking outfits do not catch it, and the result could be better rankings. According to a U.S. Department of Justice indictment involving Temple University's Fox School of Business, "relying on the false information it had received from Fox, U.S. News ranked Fox’s OMBA program Number One in the country four years in a row (2015 – 2018). U.S. News also moved Fox’s PMBA program up its rankings from No. 53 in 2014 to No. 20 in 2015, to No. 16 in 2016, and to No. 7 in 2017."
Our takeaway
Students are better off using rankings sparingly, as a final filter if at all, rather than as a crucial pivot throughout the process. Outcome-based lists, such as the College Scorecard (which does not rank) or Payscale.com, are far better than commercial ranking lists because they keep your focus on the return on your college investment.
The best approach is to use selectivity and yield numbers from the College Navigator and base your decision on what other students do.
A Note About Rao Advisors Premium Services
Our promise is to empower you with high-quality, ethical and free advice via this website. But parents and students often ask us if they can engage with us for individual counseling sessions.
Individual counseling is part of the Premium Offering of Rao Advisors and involves a fee. Please contact us for more information.