All right, time to make my Ivy League university tier list. Should be fun. I've wanted to do something like this for a while. So let's see, we've got them all down here. We'll start at D tier, that's the way you do it. All right, well, Harvard, definitely. Bad personal experience there, it's a long story. For instance, they've got the same colors as the Cincinnati Bengals, which is gross. So yeah, they go in D tier. And Dartmouth starts with a D, so D tier. Yeah, that makes sense. And Cornell starts with a C, so that'll be C tier. I think that's also where the dude from The Office went, the one I didn't like that much, so yeah. Cornell, Cornell, Cornell, Cornell. Brown, B tier, B for Brown, that seems about right to me. Also kind of a boring logo. I mean, I don't know, they could have done more. Yale, that's gotta be A tier, right? I mean, it's got Hebrew in it, what more do you want? That leaves me torn between these two. Which one's gonna end up in S tier? I know a lot of people at Columbia, so I'm gonna put that pretty high. But since I'm from Pennsylvania and I've hung out with some dudes from UPenn before, yeah, that's S tier right there. Definitely an S tier school based on the handful of people I've met from there that I really liked. So there it is, there's my tier list. What do you think?

In all honesty, my tier list is just about as useful as the college rankings that you find online. My name is Meacham, and today on the Score Channel we're gonna be talking about why college rankings are totally broken.

Okay, so my search started with just trying to understand a little bit about the different college rankings and which ones were the biggest, and I came across an article by Business Insider which kind of inspired the rest of this video. So thanks to them for making it. Now fortunately, all of these sites are nice enough to explain their methodology on a page somewhere, although in some cases it's a little bit buried. I'm gonna leave links to each of these in the description so you can check them out yourselves.

Okay, so let's start with US News. Their methodology page is pretty transparent. They give you all the variables they consider and how they weight them. On the surface, everything looks pretty reasonable. But US News is actually the worst ranking on this list, and it gives us a great opportunity to see what not to do. The rest of the rankings will make much more sense when you see how the flaws in US News show up in them too.

So let's go from largest to smallest. We've got graduation and retention rates combined at 22%. Academic reputation and faculty resources come in at 20% each. Financial spending per student is 10%. Graduation rate performance is 8%. Student selectivity is 7%. We also have the Pell Grant graduation rates, which they count as social mobility, at 5%, and graduate indebtedness at another 5%, followed lastly by alumni giving at 3%.

So my pie chart is actually a little different. I don't know if you noticed, but I separated the retention rate from the graduation rate, and my graduation rate is still the biggest factor by far. It's actually more than 25% in my pie chart, and here's why: graduation rate performance is essentially extra points in the graduation rate category. According to their explanation, graduation rate performance is simply comparing their predictions to the actual results.
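To make the arithmetic behind a ranking like this concrete, here's a minimal sketch in Python. The weights are the US News categories we just listed; the indicator values are invented numbers for a hypothetical school, and I'm assuming every indicator has already been normalized to a 0-100 scale, which glosses over how US News actually scales its raw data.

```python
# A minimal sketch of how a weighted ranking collapses into one score.
# Weights are the US News categories above; the indicator values are
# invented for a hypothetical school and assumed to be pre-normalized
# to a 0-100 scale.

WEIGHTS = {
    "graduation_and_retention":    0.22,
    "academic_reputation":         0.20,
    "faculty_resources":           0.20,
    "financial_spending":          0.10,
    "graduation_rate_performance": 0.08,
    "student_selectivity":         0.07,
    "social_mobility":             0.05,
    "graduate_indebtedness":       0.05,
    "alumni_giving":               0.03,
}

hypothetical_school = {
    "graduation_and_retention":    88,
    "academic_reputation":         75,
    "faculty_resources":           70,
    "financial_spending":          65,
    "graduation_rate_performance": 80,
    "student_selectivity":         72,
    "social_mobility":             60,
    "graduate_indebtedness":       68,
    "alumni_giving":               40,
}

def composite_score(indicators: dict, weights: dict) -> float:
    """Weighted sum: each category pulls on the final score in
    proportion to its weight."""
    return sum(weights[k] * indicators[k] for k in weights)

print(round(composite_score(hypothetical_school, WEIGHTS), 1))
```

Notice how graduation effectively shows up twice: once inside the 22% graduation-and-retention slice and again as the 8% performance bonus. That stacking is exactly why graduation ends up over 25% of my pie chart once retention is split out.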
So why is graduation rate a terrible metric to use for a college ranking? Because it's the six-year graduation rate. If you enroll in fall 2021, you're supposed to graduate with the class of 2025, not the class of 2027. A six-year rate essentially rewards universities for doing their job 50% more slowly than they should. But that brings up another point: is it really the university's job to make you graduate? You're a student, so you're responsible for your own actions. You decide whether you're gonna study for your test, whether you're gonna do your homework, whether you're gonna read that book you were assigned or not. And ultimately that is what's gonna determine whether you graduate on time. The university should not get credit for your work.

Now I wanna talk about why undergraduate academic reputation is another terrible way to rank colleges. It's a peer assessment survey, but which people are they asking? They tell you here: top academics, presidents, provosts, and deans of admissions rate the quality of peer institutions with which they are familiar. So let's just think about what this means. And to illustrate why this is messed up, I'm gonna ask you guys to do a little experiment with me. If you can, crack open a Word doc or grab a pencil or pen or whatever, and in 30 seconds, I want you to list as many of the best universities as you can, okay? I'm gonna do this right here with you so you can see exactly what I'm saying. All right, check it out. Okay, 30 seconds. I got 12 on my list. The average person, how many do you think they can name? And why did I start with Yale, Harvard, Princeton, why did I jump on those immediately? Because they're Ivy League colleges that everybody's really familiar with, and that speaks to the problem of these peer assessment surveys. If you ask people which universities they think are best, they're probably gonna remember the rankings from last year and just repeat those. It creates a self-perpetuating cycle. It's a terrible way to measure universities. Furthermore, it's asking academics about universities, not students, and you're a student, or you're going to be. What would be more valuable, in my opinion, is getting the opinions of actual people who've gone and studied there and had the full experience.

Now, one last thing on US News before I move on to the others. They have two very important metrics, but they only give them 1% each, and I wanna explain why those metrics matter and why it's terrible that they've been given almost no consideration. As a teacher, I can tell you that student-faculty ratio is probably one of the most important metrics in education. Smaller classes are generally easier to teach and give each student more individual attention. The full-time percentage is extremely important because lately a lot of universities are cutting corners by hiring a lot of part-time teachers and having teaching assistants do regular classwork. The full-time percentage says a lot about the stability of the teaching staff and about how the university takes care of its professors. In my opinion, those two variables should be weighted much, much higher, and it's kind of a shame that they're only 1% each.

Okay, so now let's go global. The next two rankings I'm gonna look at are global university rankings, which present their own problems. The World University Rankings from Times Higher Education commits, in my opinion, two huge flaws, one of which I just touched on with US News: a lot of the points here are survey-based. There's a 15% reputation survey for teaching and an 18% one for research.
The other problem is that the World University Rankings rely so heavily on research: 30% for research volume, income, and reputation, and another 30% for research influence based on citations. We're looking at 60% of the ranking coming from research alone. If the professors are focusing on their research, then they're not focusing on you. Generally speaking, professors with a lot of research and a lot of published papers have a lot less class time. So yeah, you might be going to a university with a fantastic tenured professor who's been published 150 times, but how many hours are you actually going to spend in front of that person?

Okay, so quickly, let's look at the QS World University Rankings at topuniversities.com. Honestly, their methodology is probably the worst of anybody's here. They have 40% going to academic reputation through a survey, and we've already talked about why that's not a good way to measure a university's quality. They do say it measures sentiment, but you should probably make decisions based on facts, not feelings. The other big problem is that, again, 20% goes to citations per faculty. I will give them credit: they put 20% on the faculty-student ratio, so finally somebody gets it right. But from there it's hard to salvage this one. The 10% for employer reputation is a little more defensible, since the employer is the one who's gonna give you a job at the end of the day, but overall I would stay away from topuniversities.com.

Okay, so now for our last ranking in this video, let's move on to Niche. I was really excited about Niche because they're from Pittsburgh, and that's my hometown, so they must be awesome. The homepage doesn't really have any information about methodology, and I had to look around a little. Once you click into the best colleges list, the methodology is quietly hidden behind a little button that says "read more on how this ranking was calculated," so let's do that. These are the weights of the different indicators in this ranking: academics, value, professors, campus, diversity, student life, student surveys, local area grade, and safety grade. Seems pretty straightforward, so I made this really convenient pie chart for you guys so you'd know exactly what it was.

Unfortunately, things did not go so smoothly. I started to dig a little deeper, and before long I was building a gigantic Excel sheet and doing math I did not expect to do. It was 2:30 in the morning and I barely got any sleep, because when I started clicking on these "multiple sources" buttons, what I saw terrified me. I'll give you an example. The academics grade says it's 40% of the total, so let's go in there and see what that 40% is, and immediately I see the professors grade, the student surveys, and diversity. All of these things are in there. They're being counted twice. There's a bunch of different indicators scattered all over the place, to the point where these categories lose meaning. The student survey is listed here at 5%. Seems pretty modest. But if you add up every other time the student survey appears inside other categories, the total is actually 15.81%. I ended up with two very different pie charts.
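If you want to see the mechanics of why that 5% headline number is misleading, here's a minimal sketch. The top-level weights here echo Niche's published ones, but the sub-weights inside each category are numbers I invented for illustration; Niche doesn't lay them out this cleanly. The effective weight of an indicator is just the sum of its share of every category that reuses it.

```python
# A minimal sketch of why headline weights mislead when sub-indicators
# are reused across categories. Top-level weights echo Niche's published
# figures; the sub-weights inside each category are invented here.

top_level = {
    "academics":       0.40,
    "value":           0.125,
    "student_surveys": 0.05,
    # remaining categories omitted for brevity
}

# What each top-level category is actually built from (illustrative).
composition = {
    "academics":       {"admissions_stats": 0.60, "student_surveys": 0.25,
                        "professors_grade": 0.15},
    "value":           {"net_cost": 0.80, "student_surveys": 0.20},
    "student_surveys": {"student_surveys": 1.00},
}

def effective_weight(indicator: str) -> float:
    """Sum the indicator's share of every category that reuses it."""
    return sum(top_w * composition[cat].get(indicator, 0.0)
               for cat, top_w in top_level.items())

# The survey looks like 5% of the ranking, but once its appearances
# inside other categories are added up, its real pull is much larger.
print(f"student_surveys: {effective_weight('student_surveys'):.2%}")
```

With these made-up sub-weights, the survey's true pull comes out to 17.5% instead of the advertised 5%, which is exactly the kind of gap I found in the real data: 5% on the label, 15.81% once you add everything up.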
So here's what Niche's rankings should really look like. The number one category is the student survey, which represents 15.81% of the total value, followed by student outcomes at 13.75%, which combines all the postgraduate employment and earnings data. Quality of life comes out to 10%. That's a category I made up, combining the local area grade, the safety grade, housing and food, and sports and parties. Diversity combines to 9.31% of the total value, much higher than the 5% they stated in the first place. Student level is another category I made by combining the SAT score values and the merit scholars entering the universities, coming to 9%. Admissions is another 9%, factoring in the admission rate and also the yield, the share of admitted students who actually accept. Faculty resources makes up 8.19%, calculating only professor pay, the full-time percentage, and faculty awards. Next we have costs and loans at 6.88%, which is the net cost of the schools plus the students' loan rates and default rates. Then we have the six-year graduation rate, my favorite statistic, at 6.13%. Student-faculty ratio comes in at 4.5%. Retention rates are at 3.88%, and finally research spending at 3.56%.

Okay, Niche, why two pies? Why are you mixing your categories so much that it's almost impossible to tell what they mean? I understand that companies need to protect their intellectual property, and that Niche is a private business that needs to make money, but this seems more like a deliberate attempt to obscure what's going on behind the scenes. Not only was it a little tricky to find their methodology, but even the methodology has errors inside it. Take a look at these screenshots to see what I mean. I found another methodology page that talked about their actual math, and one thing I noticed is that they said the weights could be adjusted depending on how they impact the final results. Folks, that's a fancy term for cherry-picking. You run the numbers, you don't like the results, so you change the numbers and run them again. That suggests maybe Niche is up to something they shouldn't be. It's enough to make me wonder if the real reason they move all these indicators around and ignore the boundaries of their own categories is to make sure certain universities keep their top spots. And since Niche grades on a normal distribution, meaning very few universities at the ends and a lot in the middle, there's a limited number of A-plus grades to give out. Currently it's about 90-something schools. I'll sketch out how that curve works in a second.

I don't wanna go full-on accusation against Niche, because I'll be honest: once I understood their data, I really liked it. I think it's actually the best one of all the rankings we looked at today. It focuses on variables that impact you as a student. It takes student opinions into account. It focuses on your outcomes and your finances. It's a good ranking. I just wish it was a little more transparent. I wish they would tell us more about what's going on behind the scenes, and I wish their information was more consistent with how it presents itself. I don't think it's fair to call a grade 40% academics when, in reality, almost half of that grade is made up of things from other categories. Niche, you got some work to do.
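Here's that curve sketch. This is a minimal illustration of grading on a normal distribution, not Niche's actual formula: the z-score cutoffs are my guesses and the scores are randomly generated. The point is simply that a bell curve caps how many schools can ever land in the top bucket.

```python
import random
import statistics

# A minimal sketch of curve-based grading (not Niche's actual formula).
# Scores for 1,000 fake schools; grades are assigned by how many
# standard deviations a school sits from the mean. Cutoffs are invented.
random.seed(0)
scores = [random.gauss(70, 10) for _ in range(1000)]

mean = statistics.mean(scores)
stdev = statistics.stdev(scores)

def grade(score: float) -> str:
    z = (score - mean) / stdev
    if z >= 2.0:
        return "A+"  # roughly the top 2% of a normal curve
    if z >= 1.0:
        return "A"
    if z >= 0.0:
        return "B"
    if z >= -1.0:
        return "C"
    return "D"

a_plus = sum(grade(s) == "A+" for s in scores)
print(f"{a_plus} out of 1000 schools get an A+")
```

However you set the cutoffs, the shape of the curve guarantees a fixed, small number of top slots, which is why Niche's A+ currently lands on only about 90-something schools.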
So, if the rankings are broken, what's the best way to find a university? Well, you're gonna have to subscribe to find out, because in our next video we're gonna be talking about how you can find the best university for you and what metrics you should be looking at instead of just going by these rankings. Make sure you like this video, and if it helped you out, please subscribe; you're helping us out too. We wanna keep providing you guys with content that helps you figure out what you wanna do with your studies after high school and ultimately helps you make good decisions. I think there's a lot of misinformation out there, and that's what we're trying to clear up. If you have any suggestions or any thoughts about our methodology today, drop some comments down below. And again, we've left links to all of the sites we covered today in the description so you can check them out for yourselves. Thanks for watching. I'll see you next time.