The Overestimated Impact of Median LSAT Change on USNWR Rankings

It's that time of year again—when schools report their incoming class profiles. While official ABA 509 data won't be available until mid-December, many schools, rightfully proud of the results of hard work and significant investment, are showing off their new classes. In fact, we've been keeping track of incoming class data here.

So far we're seeing modest improvement in schools' medians, though nothing like the 2017-2018 cycle, when median growth was the norm and a change of +2 wasn't uncommon. This cycle's increases look more restrained. Notably, a number of schools with median growth are also cutting class size substantially, which likely means they could not have hit their new medians at their old class sizes.

But improvement always brings the same question: "Will these new medians cause XYZ school to rise in the USNWR rankings?" Or the common reverse: "Will a median drop cause XYZ school to drop in the rankings?" Applicants and schools alike are often very focused on the U.S. News rankings. We're not here to debate the validity of the rankings (though our Senior Consultant to Law Schools and rankings expert John Stachniewicz did a fascinating video on that topic, available here) but rather to explain how median changes can (and can't) impact the rankings. We'll focus on median LSAT, but the information is broadly applicable to median GPA as well.

How Does USNWR Measure Median LSAT?

The first thing to understand is that USNWR does not measure median LSAT on a continuous 1-point scale. Rather, it works by percentile. If you've taken a disclosed LSAT, you've seen your score's percentile equivalent: each individual numerical LSAT score (134, 159, 172, etc.) has a corresponding percentile. It is this percentile scale that USNWR uses when calculating each school's score contribution from the median LSAT category.

USNWR does not reveal which score-to-percentile conversion chart it uses for its rankings. As anyone familiar with the LSAT knows, these percentiles shift slightly from test to test. Nonetheless, LSAC regularly publishes a score distribution chart; you can see a copy here. If you compare the data across different years, the exact percentile associated with each 120-180 score varies, though only very slightly. Personally, we prefer the 2014-2017 aggregate because it compiles multiple years' worth of data, something we suspect USNWR also does; remember, we're trying to replicate USNWR's practices as closely as possible. For convenience, here is a copy of each ranked law school's median LSAT and corresponding percentile.

USNWR then standardizes the data (necessary to put metrics with different scales on a common footing). The standardized data is then weighted, and so on; the specific math isn't important here.
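For readers who want a concrete picture of that standardize-then-weight step, here is a minimal sketch. Everything in it is an assumption on our part: the percentile values are invented, and while the 12.5% figure is the LSAT category's reported weight, USNWR's actual formula is not public.

```python
# Illustrative sketch of standardizing and weighting median-LSAT
# percentiles. All inputs are hypothetical; this is NOT USNWR's code.
from statistics import mean, pstdev

# Hypothetical median-LSAT percentiles for a small field of ranked schools.
percentiles = [63.9, 67.4, 75.0, 85.0, 90.9, 92.2, 97.0]

mu = mean(percentiles)
sigma = pstdev(percentiles)

# Standardize each school's percentile (z-score against the field),
# then apply the 12.5% category weight to get the LSAT contribution.
LSAT_WEIGHT = 0.125
contributions = [LSAT_WEIGHT * (p - mu) / sigma for p in percentiles]

for p, c in zip(percentiles, contributions):
    print(f"percentile {p:5.1f} -> weighted contribution {c:+.3f}")
```

Because the scores are standardized against the whole field, a school's contribution depends on where it sits relative to everyone else, not on its percentile in isolation.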

So what does this mean for schools with rising LSAT medians?

First, LSAT median growth comes with diminishing returns. Each point up the 120-180 scale brings a smaller and smaller percentile gain: from 155 to 156 the gain is 3.6 percentile points, while from 165 to 166 it's only 1.3. This produces diminishing returns, especially in the 165+ range of medians. Say you took your school's median LSAT from a 167 to a 168. This cycle, the score contribution you'd receive from that change would only be about 4% within the LSAT category (not 4% of your overall score). And that assumes everyone else stands still, which is obviously never going to happen. In fact, schools in this range may often be better off focusing their scholarship money on median GPA growth.
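The arithmetic behind those numbers can be checked with a tiny sketch. The score-to-percentile values below are illustrative approximations consistent with LSAC's published distributions; USNWR's actual conversion chart is not public.

```python
# Approximate LSAT score -> percentile values (illustrative only).
percentile = {155: 63.9, 156: 67.5, 165: 90.9, 166: 92.2}

def gain(lo, hi):
    """Percentile points gained by moving the median from lo to hi."""
    return percentile[hi] - percentile[lo]

print(gain(155, 156))  # roughly 3.6 percentile points
print(gain(165, 166))  # roughly 1.3 percentile points
```

The same one-point jump on the 120-180 scale buys far less percentile ground at the top of the range, which is exactly the diminishing-returns effect described above.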

LSAT Median Score Contribution Changes Yearly

The second issue is that a school's median LSAT isn't ranked in a vacuum. Each year it's scaled against all ranked law schools. If the overall average LSAT median goes up, your shiny new median won't be worth as much as it would have been the previous cycle. We saw this last cycle, when overall LSAT medians among ranked law schools grew by about 2%. This often leads to unpleasant conversations at schools whose median LSAT stayed the same but whose rank dropped anyway. Think about it: if your LSAT median stood still while others grew, then even though your school is no different from last year in absolute terms, it is worse off in relative terms. And relative is what matters in the rankings.

Schools need to remember that their peers are also part of the USNWR rat race. In up cycles, as the past two have been, it's incredibly important to keep up or be left behind. Standing still at a 160 for the 2018 entering class would have cost a school about 10% of the raw score contributed by the LSAT category. Exact numbers will vary by score, but the decay in the value of an unchanged score holds true any time the overall average improves.
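That stand-still-and-fall-behind dynamic is easy to demonstrate under the same z-score assumption used earlier. All of the field percentiles below are hypothetical.

```python
# Sketch of how an unchanged score loses relative value when peers
# improve. All numbers are hypothetical; not USNWR's actual data.
from statistics import mean, pstdev

def z(value, field):
    """Standardized score of `value` relative to the field."""
    return (value - mean(field)) / pstdev(field)

school = 72.0  # hypothetical percentile for an unchanged median

field_last_year = [60.0, 65.0, 72.0, 80.0, 88.0]
field_this_year = [62.0, 67.0, 72.0, 82.0, 90.0]  # peers improved

print(z(school, field_last_year))
print(z(school, field_this_year))  # lower: same score, worse standing
```

The school's input never changes, yet its standardized score drops because the field's average rose around it.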

Other Categories Matter

And finally, medians aren't the only component in the rankings. Median LSAT is weighted at 12.5% and median GPA at 10%. Significant, absolutely. But neither one measures up to the Peer Assessment or Lawyer/Judge Assessment scores; median GPA is essentially tied with Expenditures Per Student for Instruction, and the other small categories add up quickly. A single factor is unlikely to make or break a school's rank unless it is a true outlier (Yale, for instance, with its Expenditures Per Student for Instruction). We've even seen schools divert attention from one metric, say, shifting expenditures from the Direct category toward financial aid to "buy" higher medians, only to inadvertently lower their score contribution in the first category; or admit so many students to secure that median growth that their per-student expenditures, student:faculty ratio, and acceptance rate all suffer.

Will Median Changes Cause Rankings Changes?

As is so often the case, it depends. Median LSAT increases are much more likely to yield rankings improvement at mid- and lower-ranked schools, where the difference in overall score from one rank to the next is usually as small as a single point. Those schools also see a larger score impact from boosting median LSAT than higher-ranked schools do, thanks to the diminishing returns discussed earlier. But there are many other factors at play. A school might not boost its median LSAT, or even GPA... but if it invests enough money in direct expenditures, has excellent career outcomes, or sees its Peer Assessment score rise (note that we believe rankings generally lead Peer Assessment, not the other way around), it might not matter. There are too many moving parts to speak in absolutes.

Medians understandably get a lot of attention. Among applicants, it's because they're the metric most often used to gauge likelihood of admission. On the other side, schools like to show how well-credentialed their incoming students are; plus, administrations often feel medians are the metrics they can most directly control. And to some extent, they are correct. But a singular focus on medians, especially LSAT medians, is dangerous and can lead to false expectations of ranking improvement. The USNWR system is more nuanced than that, and the competition too fierce. The good news for schools is that rankings improvement is absolutely possible; we've seen it happen across all tiers. The schools that do improve focus on a broad array of programs to improve diverse metrics over a sustained period. There are no quick, easy fixes, but concerted effort will yield results.