Recently, David Frakt again raised the question of whether LSAT scores are indicative of bar-passage rates in his post on the Infilaw Schools, and it reminded me of the kerfuffle that occurred not so long ago between various scamdeans on the one hand and the National Conference of Bar Examiners on the other.
To recap: for scamdeans and other interested persons desperate to justify their existence, it's wrong to deny opportunities to law students, especially URMs, based solely on an entrance test. For the NCBE, a few intellectually honest law profs, and even the ABA (if you read their often-unenforced regulations, at least), it's wrong to let students pay $200k over three years only to pull the rug out at the end when they turn out to be unable to get a bar license.
At the core of the debate are two questions: (1) does the LSAT actually predict success on the bar exam, and (2) is there enough data to say so one way or the other? While I am no Professor of Statistics by any means, it doesn't take much work to arrive at answers of (1) a qualified yes, and (2) a qualified yes. This is based on three years of bar-passage data for 200+ law schools compiled by Law School Transparency - approximately 600 data points in all. The results are below:
Given the correlation of 0.51, there is arguably a trend here. To start, the question of LSAT predictiveness is more complicated than a simple function box, and no one has ever seriously claimed otherwise. This makes practical, experiential sense - the idea that a single variable (median LSAT) is the only factor influencing the outcome (first-time bar-passage percentage) is an oversimplification. However, to say that there is no correlation whatsoever and that the LSAT should be thrown out is also self-serving Cartel dishonesty, and takes us back to the feuds between the establishment deans and the NCBE over what this data means in the first place. Several other factors would logically apply here and would yield a better predictor of bar-exam performance, but what those factors are, and how to quantify them, remain matters of ongoing debate.
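For readers who want to see what "correlation of 0.51 with a real trend" looks like in practice, here is a minimal sketch. The points below are synthetic and purely illustrative - they are NOT the Law School Transparency dataset - but they are generated under my own assumptions to resemble the scatter described here: roughly 600 school-year points, pass rates rising with median LSAT, and noise that shrinks toward the top of the range.

```python
import numpy as np

# Hypothetical (median LSAT, first-time pass %) points, loosely shaped
# like the LST scatter described in the post -- NOT the real dataset.
rng = np.random.default_rng(0)
lsat = rng.uniform(145, 175, 600)
# Assumed signal: pass rates rise with LSAT; noise shrinks toward 175,
# mirroring the "triangular" grouping of the raw data.
noise = rng.normal(0, (175 - lsat) * 0.8)
pass_rate = np.clip(0.9 * (lsat - 120) + 33 + noise, 0, 100)

# Pearson correlation and a least-squares trendline.
r = np.corrcoef(lsat, pass_rate)[0, 1]
slope, intercept = np.polyfit(lsat, pass_rate, 1)
print(f"correlation r = {r:.2f}")
print(f"fitted pass rate at a median LSAT of 120: {slope * 120 + intercept:.0f}%")
```

Running the same two-line fit on the actual LST data would give the real slope; the point of the sketch is only that a noisy-but-genuine trend, and an extrapolation down to a 120 median, are a few lines of arithmetic rather than statistical wizardry.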
Second, scatter in the data is not license to throw out the conclusion that LSAT scores are a predictor. Horizontally, for example, you find 90%-plus first-time bar-passage rates at median LSATs of 150 and up. Notably, however, the sub-170 medians that can claim this are not numerous, and no sub-150 data point can claim it at all. Further, the variance among the 90%-plus passage rates decreases as you approach 170-175, as can be seen from the triangular grouping of the raw data.* There are no sub-90% data points at 170-175.
Vertically, the spread in passage rates can be as much as 40-50 percentage points at a median LSAT of 150, for example. By the same token, that spread narrows significantly as you approach 160, 170, and 175. The data does "cluster," and that clustering has significance. Again, what does that mean? Zip codes matter? Students matter? Schools matter? As stated above, something is going on here, and it certainly has something to do with that elusive term, "caliber."
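That narrowing spread can be put in numbers by binning schools by median LSAT and measuring the spread of pass rates within each bin. Again, the points below are synthetic stand-ins generated under my own assumption (for illustration only) that noise shrinks as medians rise - not the real LST data.

```python
import numpy as np

# Synthetic stand-ins for the LST points, with spread that shrinks
# toward the top of the LSAT range (my assumption, for illustration).
rng = np.random.default_rng(1)
lsat = rng.uniform(145, 175, 600)
passr = np.clip(0.9 * (lsat - 120) + 33 + rng.normal(0, (175 - lsat) * 0.8), 0, 100)

# Bin by median LSAT and report the spread of pass rates in each bin.
for lo in (145, 155, 165):
    in_bin = (lsat >= lo) & (lsat < lo + 10)
    print(f"LSAT {lo}-{lo + 9}: std dev of pass rates = {passr[in_bin].std():.1f} pts")
```

On data shaped like the plot above, the standard deviation falls bin by bin as the median rises - which is exactly the "clustering" argument in quantitative form.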
So, the scamdean canard that "anyone deserves a chance and can pass the bar exam, regardless of low gate-keeping predictors" may carry some weight, but again, the LSAT is indicative of bar-passage rates, full stop. The trendline would suggest that even a school with a median of 120 would see roughly 1 in 3 students pass the bar, which is somewhat believable (hey, Cooley passes half its students with a median of 146). In light of ABA regulations, the questions remain: "should that student be admitted to law school in the first place," and "what does it mean that someone who scores a 120 could pass the bar exam on occasion," among others. One might even question the value of the bar exam in the first place, or why a costly three-year program of study is required rather than an apprenticeship, if the goal is to produce quality lawyers - people who may do poorly on a standardized test, pre- or post-law school, yet can still kick ass in the courtroom.
And the GRE substitute for law school admissions is just "new clothes" anyway, buying a few years for desperate law schools to collect more student-loan money until the same trend seen in LSAT scores becomes apparent in GRE scores and the same policy argument is had all over again. Schools like Harvard can afford to be magnanimous and "lead the way" on this count, as they get a ton of 170+ LSAT takers and the results are plain to see as regards bar passage. No big risk there. Lower-tier schools, however, not so much.
In conclusion - while there are certainly "diamonds in the rough" out there (and, personally, I readily apply the "rough" designation to myself despite my so-called "success"), it's just too big a gamble to throw caution and $200k to the wind for trite, hackneyed notions of "defending liberty" and "pursuing justice." There are already too many lawyers, and the legal market is shrinking. Young people should not be used up and discarded in so callous a fashion, law school scam or not - but, somehow, the Cartel reasons otherwise...
*Interestingly, the University of Wisconsin and Marquette University both report 100% pass rates for the three years considered. WTF? Harvard and Yale only squeaked by in the mid-90s on average. It's like the Wisconsin schools don't understand the question being asked by the ABA... or maybe it's Wisconsin's diploma privilege, under which graduates of the state's two law schools are admitted to the Wisconsin bar without sitting for the exam at all. Basking in the glow of the Great Lakes and Canada apparently has its perks. What, then, explains Cooley...?