Thursday, November 20, 2014

LSAT Scores Do Not And Cannot Predict Bar Exam Scores

In a lot of the commentary about declining LSATs and declining bar examination scores this past summer, there is, shall we say, a lack of the sort of rigor that normally attaches itself to peer-reviewed publications.

For recent examples of this, please see the work of Derek Muller (Pepperdine) and Jerry Organ (St. Thomas), both somewhat supporting the "damn you, MBE!" thesis advanced by Brooklyn Dean Nicholas Allard.  (more recent nonsense here).

Simple foundational statistics and healthy skepticism can do much to completely demolish these ideas.  For example, here's Organ:
[A] comparison of the LSAT profile of the Class of 2014 with the LSAT profile of the Class of 2013 would suggest that one could have anticipated a modest drop in the MBE Mean Scaled Score of perhaps .5 to 1.0.  The modest decrease in the LSAT profile of the Class of 2014 when compared with the Class of 2013, by itself, does not explain the historic drop of 2.8 reported in the MBE Mean Scaled Score between July 2013 and July 2014
And here's Muller making a similar claim:
[W]e see a fairly significant correlation between my extremely rough approximation of a projected MBE score based on the LSAT scores of the matriculating classes, and the actual MBE scores, with one exception: this cycle. 
Just one problem with all of this:  LSAT year-over-year comparisons are more or less baseless and have no predictive value by themselves.

About the LSAT

To learn why, we need to understand where LSAT scores come from, and to recognize that they have little connection to objective reality.  An LSAT score is derived from the raw number of questions one answers correctly.  The administrators of the LSAT then "scale" the scores from 120 to 180 depending on the difficulty of the test, which is pre-determined using data from recent prior LSAT administrations (LSAC uses what is called "Item Response Theory" to model individual performance on each question, instead of treating all questions as equal the way your 7th-grade math teacher probably did).  They then "normalize" or "equate" the test to even things out across administrations.  The process is not entirely transparent (if it is laid out somewhere in clear fashion, please point it out and I will correct anything erroneous herein), but it's fairly clear that the median over several administrations lands around the 150 mark by design.

The idea is that students who take a "hard" LSAT should not be punished relative to students who take an "easier" LSAT, and therefore the former get a more forgiving curve.  That curve is set in advance because the questions are "pretested" on previous examination takers, so LSAC "knows" how hard that particular test is.
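The flavor of item response theory can be sketched in a few lines.  Below is a minimal sketch assuming the standard two-parameter logistic (2PL) model; I am not claiming LSAC uses exactly this form, and the item parameters are invented for illustration:

```python
import math

def p_correct(theta, a, b):
    """2PL IRT: probability that a taker with ability `theta` answers an
    item with discrimination `a` and difficulty `b` correctly."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Invented item bank: (discrimination, difficulty) pairs.  In the real
# world these are calibrated from how *previous* cohorts answered the
# pretested questions.
items = [(1.2, -1.0), (1.0, -0.5), (1.1, 0.0), (0.9, 0.5), (1.3, 1.0)]

def expected_raw_score(theta):
    """Expected number correct for a taker of ability `theta`."""
    return sum(p_correct(theta, a, b) for a, b in items)

print(round(expected_raw_score(0.0), 2))   # an average taker on this bank
```

The catch, per the above, is that those difficulty parameters come from pretest cohorts: if a pretest cohort is unusually weak, the items look "harder" than they really are, and the scoring curve for the live test is set more forgivingly.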

It should be obvious that this approach makes what might be a serious mistake, and certainly rests on an assumption that invalidates the test's extrinsic utility:  it assumes that the students taking each administration of the LSAT are roughly equivalent in aptitude on a year-over-year, aggregate basis.

In statistical terms, they assume any given set of administrations is a fairly representative sample of a fairly constant population of pre-law students.  This would seem to severely undercut any idea that the LSAT has any sort of non-relative value; after all, if item response theory evaluates prior responses to questions, and the prior students were either significantly brighter or significantly dumber than the current group taking the test, how can the test possibly have any year-over-year validity outside of a comparison relative to one's class?

A simple hypothetical:

In year 1, 60,000 separate students take the LSAT.  The economy was especially brutal for straight-out-of-college hires, so a large cluster of elite students decide to try law school.  The median IQ of the group is 115, and there is a spate of applicants from the Ivy League and comparable schools.  An IQ of 105 would put someone only around the 25th percentile.

In year 2, the economy is moving much better, the elite graduates have found something else to do, and the median IQ of 60,000 LSAT takers is 105.  In this group, a 115 IQ puts one in the 75th percentile.

Going strictly by the three-year percentile charts, in year 1 the student with the 115 IQ is going to score in the low 150s.  In year 2, the 115 IQ is going to be in the high 150s, gaining 6 or 7 points just by sitting with a dimmer group overall.  In year 1, the 105 IQ scores in the mid-140s.  In year 2, that same student shoots up into the low 150s.  In the world of bar predictors, this same low scorer just greatly increased his bar exam probability simply by sitting down a second time at a later date.
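That hypothetical is easy to simulate.  A toy sketch (assuming, purely for illustration, that a scaled score is a linear map of within-cohort percentile onto the 120-180 range; the real equating machinery is more elaborate):

```python
import random

random.seed(0)

def percentile_rank(score, cohort):
    """Fraction of the cohort scoring at or below `score`."""
    return sum(1 for s in cohort if s <= score) / len(cohort)

def toy_scaled_score(score, cohort):
    """Illustrative norm-referenced scale: percentile rank mapped onto 120-180."""
    return round(120 + 60 * percentile_rank(score, cohort))

# Year 1: a strong cohort (median aptitude ~115); year 2: a weaker one (~105).
year1 = [random.gauss(115, 10) for _ in range(60_000)]
year2 = [random.gauss(105, 10) for _ in range(60_000)]

# The same underlying aptitude earns very different scaled scores.
print(toy_scaled_score(115, year1))   # middle of the strong pack: about 150
print(toy_scaled_score(115, year2))   # near the top of the weak pack: about 170
```

Nothing about the test taker changed between the two runs; only the room did.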

You can claim that certain tests will land above or below the percentile mean because the variances in item response theory and "equating" work themselves out over the administrations, but when the overall mean is set at 150 and there's a nice bell curve around it, the conclusion is inescapable that a 150 in 2009 will not necessarily equal a 150 in 2014.  A 150 in 2009 could be a 142 or a 163 in 2014, depending on who else has taken the test recently and who else is in the room.

So:
  • There is no inherent connection between IQ/reasoning ability/brainpower and one's LSAT score.
  • The raw number of LSAT high scorers depends almost strictly on the number of people who take the LSAT within a certain timeframe.  A school's declining LSAT numbers are more an indication that there are fewer fish in the pool overall, and really nothing more.  In an alternative universe, a school's LSAT percentiles could drop while its bar passage rate actually increased.  LSAT scores are dropping almost everywhere outside the top rank of schools.  This, by itself, should make us question any extrinsic value the test may have.
  • No matter how dumb a large cohort is (say, a series of years where going to law school is a ridiculous idea), a certain percentage (2-3%) of students are going to inevitably score over 170, at a minimum because the test will at some point self-correct for the "difficulty" of the test, and LSAC apparently aims for a 150 median over time.  That does not mean they all have equal abilities in terms of navigating law school and the bar examination, or in subsequent law practice.  In a few years, I expect to see "new associates aren't as sharp as they used to be" writings from law partners who have had their heads in the sand.
  • There is no true transparency anywhere in this industry, not even on something as basic as an entrance examination based on what appear to be otherwise-sound mathematical models. 
  • It is entirely possible to have a four-year period with a steady decline in the quality of students taking the examination, but only a slight or modest decline in LSAT scores.  This is basic math.
Historically, there may have been a vague correlation between LSAT scores and bar exam performance because classes were relatively stable in their distributions, and thus LSAT scores more or less mimicked a generalized measure of intelligence for relatively consistent sample populations of potential law students.

After 2008, and in the wake of the most recent bar results, those bets have to be called off.  To put it bluntly, it's entirely possible the law schools slowly started enrolling collectively dumber students, and we really have no way of knowing that from LSAT medians.

It is absurd for any serious claim or inquiry about one cohort's abilities to be based on the LSAT where the LSAT has no real connection to actual real-world aptitude beyond providing a relative measurement against one's peers.  It does not - and cannot - answer the question of whether one's peers are abnormally bright, normal, or abnormally dim.  Because of that fatal flaw, LSAT scores have no predictive value whatsoever when it comes to a slightly different population taking an unrelated test that has separate controls for year-over-year validity.

As a concluding point, here's a daily koan for you:  why don't Allard and friends go back and ask how the LSAT gets scored in the first place?

Assuming Unknowns and Constants

Another fundamental flaw in the analysis provided by Organ, Muller, and others is treating the 25th, 50th, and 75th percentile LSAT scores as usable for determining anything about how a portion of that group will do on a subsequent examination where the meaningful differences occur at much finer resolution than 25-percentage-point intervals (such as bar exam pass rates).  The numbers between those guideposts can be highly variable, and the statistics manipulators are basically assuming some constancy or uniformity in the unknown numbers when drawing their conclusions.

Consider a law school entering class of 10 students.

Class A:  161, 161, 161, 156, 156, 156, 156, 154, 147, 140
Class B:  160, 160, 160, 155, 155, 155, 155, 153, 153, 153

Now, which group has the higher LSAT scores and will likely be ranked higher in the magazines?  Now, which group would you bet on to have the higher bar passage rate, if all ten students take the bar?

Multiply this little exercise by hundreds and you can quickly see where not knowing what's at the tail end of the curve (or the middle portions of the curve) is a huge problem.  We (and that includes Muller, Organ, and their peers) have no way of gauging just how terrible the bottom-tier students are at these institutions, notwithstanding any issues with the LSAT itself.  130?  125?
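The Class A / Class B exercise takes only a few lines to check; note that the under-150 "high-risk" cutoff below is my own arbitrary stand-in for elevated bar-failure risk, not anyone's published threshold:

```python
import statistics

class_a = [161, 161, 161, 156, 156, 156, 156, 154, 147, 140]
class_b = [160, 160, 160, 155, 155, 155, 155, 153, 153, 153]

# The headline number favors Class A...
print(statistics.median(class_a), statistics.median(class_b))  # 156.0 155.0

# ...but only Class A carries anyone in the danger zone.
print(sum(1 for s in class_a if s < 150))  # 2 high-risk takers
print(sum(1 for s in class_b if s < 150))  # 0 high-risk takers
```

Nearly identical headline numbers can hide completely different tails, which is exactly where bar failures live.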

What about the allowance for people who haven't even taken the LSAT? 

There are, of course, other variables the MBE critics are ignoring.  One is students who drop out or transfer (in or out).  If students with median LSATs drop out because law school is a losing bet, the school's bar pass rate is more likely to drop than not.  Similarly, if a year has abnormally high transfers, either an exodus from lower-ranked schools by high-scoring students or an influx of lower-scoring students at higher-ranked schools (both are possible), it's going to throw off any correlation between the LSAT and bar pass rates.

There are far too many variables that cannot be accounted for by law professor statistics.

Conclusion

The admissions-department heuristic that LSAT scores can predict future bar exam success rests on a statistical misunderstanding similar to the classic "correlation, not causation" mistake from Stats 101.  Allard, Muller, Organ, et al. see historical correlation, assume causation, and then cry foul when a "predicted" result doesn't happen (and, in a surely-unrelated aside, hurts their institutions).

Out of all the measurements we have available, the bar exam is probably the most consistent at measuring raw aptitude on a year-over-year basis, given that its writers aim for a minimum-competence bar rather than a "hey, let's get snowflake into law school" motivation.

The test is likely not the problem; the problem is almost certainly the students these law schools are enrolling, and no manipulation of statistics and empty claims of MBE chicanery can alter that.  There may be a problem with the exam, but given that simpler explanations seem more likely and there is no reliable proof of an error, I don't think it's much of a credible thesis.

Ultimately, this is - yet again - number manipulation by the law schools and their friends, this time to support the idea that their open admissions policies should have as few repercussions as possible (it's basically a salvo in the coming battle over bar passage numbers the lowest-ranked schools may have with the ABA).  As a concerned member of the bar, I oppose their efforts, and I oppose any effort to make the bar exam essentially match percentages with the LSAT, as all that does is ensure that a set percentage of each class WILL pass the bar no matter how dumb the cohort or three-year cohort or whatever.  We can talk about the bar exam's utility elsewhere, but if we're going to have it, it needs to mean something beyond what the law school deans want it to mean.

They have no way of supporting any claim that the class of 2014 was just as bright as previous classes, and most available evidence suggests the opposite (for one, they went to law school in fall 2011), but the schools will be damned before they let what is likely the sorry truth get in the way of blaming someone else for the mess they've ultimately created.

46 comments:

  1. I think we might be overthinking this a bit. All the evidence I've seen points to students with LSATs in the mid 150s or above having very high bar pass rates. It's when scores decline into the low 150s and 140s (students who shouldn't go to law school, attending schools that shouldn't exist) that we see large numbers of failures. As far as bar passage goes, the difference between a 158 and 162 is largely academic.

    Allow me to present some specific instances of July 2014 bar passage schadenfreude:

    45% of Cooley students who took it passed the Michigan bar this July. Dreadful. Is this the first time in history an ABA accredited law school had less than half its bar takers in its home state pass the exam? I wonder if Cooley will demote itself in the Cooley rankings.
    58% of bar takers from Infilaw's Florida tentacle passed the easier (by reputation, anyway) FL bar, the worst in the state.
    56% of Infilaw's Charlotte LS takers passed the NC bar, lowest in the state by far and a whopping 13% behind the next lowest NC school (4T embarrassment Elon).

    I was unable to find numbers by school for Arizona, where Infilaw's third sore festers. California results have not yet been released; no doubt TJSL and Whittier administrators are praying fervently.

    1. "It's when scores decline into the low 150s and 140s (students who shouldn't go to law school, attending schools that shouldn't exist) that we see large numbers of failures"

      Correct in hindsight. Going forward, I wouldn't bet on seeing more widespread failures at the mid- and upper-150s.

  2. Objection, relevancy. There are no jobs. Class C has 170's across the board, and all passed the bar. Class C now has 10 members who are unemployed and owe $200,000 each. So... move to strike.

  3. The LSAT is a joke exam. I've told this story before on here, but I tend to enjoy repeating it.

    I scored a mere 150 on the LSAT (51%ile), but still managed to get into a school with a median LSAT score of 161 (which happened to be my second choice AND a state school with a tuition of only $15K per year). I passed the bar on the first attempt in two states AND subsequently passed the Patent Bar. I even scored sufficiently high on the MBE to waive into DC.

    Of course, I did have some work experience and a masters degree, so the school overlooked my low score. Interestingly, my undergraduate and graduate GPAs weren't particularly high, but as a science major that is to be expected.

    Regardless, my LSAT score was ultimately a non-issue. Nobody ever asked me about my LSAT score, and I often joke about it when it comes up in conversation. Then again, had I scored in the 160s, you have to wonder if I could have gotten into Harvard, but I think that would be pushing it (my undergraduate GPA was still rather low).

    To this day, I laugh at the LSAT as being able to predict anything. It is a pointless little exam, which has no significance whatsoever other than being a barrier to entry (and a rather easy one to jump over at that).

    If you're an idiot, you'll do bad on any exam, whether it be the LSAT or the bar, or a spelling test. THAT is why low LSAT scores are correlated with bar failures--and we all know law schools take plenty of idiots.

    1. Amen, 10:01 AM. I scored a 151 about thirteen years ago, taking the test cold, and a 151 was considered "mouth-breather" territory on blog commentary back in 2008-2010, easy. Thank goodness for my work experience and a masters degree as well, or I might not have made it into law school.

      Oops...waitaminute...

    2. @10:34 (10:01 here) -- LOL !!

      There's another thing I've often said on here: "I can't possibly be that smart. I did, after all, go to law school."

      Point being, I just think the LSAT is a stupid, stupid exam that fails at practically everything it pretends to test for. Much like law school actually if you think about it .. LOL

    3. You are right. There is little difference between the LSAT and many of the hide-the-wiener law school exam hypotheticals.

    4. I scored in the 99th percentile with no preparation. I was so innocent back then—and it was only a few years ago—that I didn't even know that there was a whole industry of expensive courses and tutors devoted to the LSAT.

      Having since taught some of those courses, I can tell you that they work. Students who practice diligently can expect to see a substantial improvement in their score. Which means that in one sense the LSAT can be bought—and, by extension, that it favors the wealthy.

      Old Guy

  4. I absolutely love how these LawProfs became experts in statistics seemingly overnight, considering their non-mathematical BA undergrads and JD graduate degrees, thereby allowing them to pontificate at length about statistical prediction and tell those stupid math PhDs over at NCBE a thing or two.

    Are there no bounds to their sheer brilliance and academic rigor? Have they produced a theory of quantum gravity yet?

    1. Many of them don't know much about law either, but that doesn't stop them.

      Old Guy

  5. Looking at the Tennessee Bar results, one can readily see that students from the higher ranked schools performed better than students from the lower and non-ranked schools. Only 4 out of 7 from Charlotte passed while 0 out of 5 from Florida Coastal passed.

  6. "They have no way of supporting any claim that the class of 2014 was just as bright as previous classes, and most available evidence suggests the opposite (for one, they went to law school in fall 2011)"

    I love this quote. Nice article. It's interesting to think how correcting for one form of bias (that the LSAT might be harder or easier from year to year) introduces another bias - that the population of test takers is the same. The LSAT might want to back off worrying about test consistency a bit - it's probably easier to make the test equivalent than to ensure that the next round of lemmings is not dumber than the last (they are).

  7. I'm glad you're discussing this topic, because there's something I've been trying to figure out for a while now.

    Take a look at Figure 2 (entitled "Smoothed-percentage frequency of LSAT scores from 2005-2006 through 2011-2012") on page 8 of the LSAC research report linked below. The figure appears to show no significant change over the six-year period.

    My question is: If the tests are scored on the assumption that each year's body of test takers is roughly the same as that in any other year, then what is Figure 2 supposedly showing? (TIA)

    The report is "LSAT Performance with Regional, Gender, and Racial/Ethnic Breakdowns: 2005-2006 Through 2011-2012 Testing Years (Oct. 2012)" and can be accessed here:

    http://www.lsac.org/docs/default-source/research-(lsac-resources)/tr-12-03.pdf

  8. Speaking cynically, I wonder if some lawyers would prefer an easier bar exam. If lots of incompetent lawyers are out there attempting to "practice," it seems like that could create work for the competent lawyers.

    1. No, it doesn't work like that. The incompetent lawyers flood the market, and make it harder and harder to discern who is a good lawyer. Also, the over-abundance of lawyers is viewed as cheap labor, thus rendering them a type of commodity/revenue-generator for employers. This makes it harder for lawyers to actually develop skills and cultivate their talent, thus making it harder to produce "good" lawyers.

      Personally, I think the emphasis should be on the admissions process for law school, rather than the bar exam. I'd prefer a "medical school" type admissions process that really screens candidates.

    2. Agreed on the need for admissions to screen candidates. There is something deeply unethical about a profession that deems you unfit to practice only after you have sunk $200,000 into an accredited professional school courtesy of taxpayer-guaranteed, non-dischargeable loans.

    3. Well, 11:29, you seem to be assuming that consumers of legal services can identify competent lawyers. That's a big, and unwarranted, assumption.

      Old Guy

    4. Right, it's also ignoring market behavior. Consumers don't exclusively shop on experience and competence; they shop on price. If a newbie lawyer pops up every year, plucks 20-30 paying cases from other counsel before imploding because he's totally incompetent (and law is an area where you can hide it for quite some time), it's bad for everyone.

      Much, much better to control it at the admissions level. Or, we could make law school 2 years with a 2 year residency period in legal aid. But no one seems to be interested seriously in doing that.

    5. @ 8:19 -- do you think it's wise to make law school only two years?

      I think half the reason we have such a problem with an over-saturated market is that law school is generally regarded as a quick 3 year course, then you can go out and practice.

      By contrast, medical, dental and vet school require 4 years (plus post-degree training that lasts between 3-9 years, depending on the specialty and sub-specialty), and a PhD is 4-7 years (followed by a post-doc that is traditionally 2-3 years, but can last longer). I think the time factor may deter certain people from entering these other programs, since many will have to wait a long time before they are capable of practicing their profession fully.

      So, personally, I'm very leery about any talk of shortening law school to essentially being a Masters degree program. The last thing we need is more and more people thinking they can jump into law school because it's ONLY two years now, and then they can get a "real" job.

      In my opinion, law school should become a 4 year program: 2 years of lectures, 1 year of nuts&bolts training (including writing classes, trial advocacy, transactional practice workshops and appellate advocacy training) and 1 year of supervised clinical work.

      Afterwards, once the attorney passes the bar exam, there should be a mandatory apprenticeship of 1-4 years, depending on the specialty. AND, as with medical school, there should be some sort of matching program for the more selective specialties. Judges & Academics should be required to do some additional schooling.

      Additionally, there can be 1-2 years masters programs for paralegals, document review workers AND members of other professions that want some training in law. Also, should someone not be able to complete law school, they can leave half way through with a masters program so as not to have wasted their time.

      I'm certain I'm in the minority on this one, but I do believe this system will ultimately produce fewer lawyers, better trained lawyers AND deter fart-for-brain lemmings from polluting the legal profession (and screwing it up for the rest of us).

    6. I'll say this:

      At my élite law school, I asked two friends how many people from our class they would be willing (not necessarily happy) to accept as counsel to their mothers. Naming names, we identified about 8% of the class.

      And that's at one of the most highly regarded law schools. At a typical toilet (remember that I put 90% of law schools in Tier 4), there might not be anyone whom I'd regard as a decent prospective lawyer.

      Old Guy

    7. Excellent observation, Old Guy. In my T2-T3, out of my 80-person section there are maybe 5 people I would hire as my own attorney. And none of them went solo out of law school. I can't imagine how terrible the future solos of the class of 2016 are going to be.

    8. @Old Guy -- very insightful observation, as always.

      Law isn't medicine or engineering. There are no prerequisite classes, or even an interview to screen applicants.

      The ONLY things law schools use to admit people are undergraduate GPA and LSAT scores--we all know they don't actually read or care about the "personal statement" or letters of recommendation.

      So individuals that can perform well on the meaningless LSAT tests are somehow deemed to be brilliant. Therefore, the elite schools are probably filled with history or English majors (hence high grades) who happen to do well on the LSAT games.

      So, at the end of the day, it's extremely difficult to pick-out "good" candidates with the necessary talent to become "good" lawyers. Sadly, the way things are now, there doesn't seem to be any incentive to correct this flaw.

  9. A little off topic. FSU Shooter was a law school graduate: The gunman was identified by authorities as Myron May, a 2005 graduate of FSU. He graduated from Texas Tech University's law school in 2009, and practiced law in both Texas and New Mexico, according to Tallahassee Police Chief Michael DeLeo. May moved back into the area a few weeks ago.

    1. I would not be shocked if he was about to default on student loan payments and was financially destroyed by law school. Most likely his mental health was destroyed as well.

    2. I remember thinking the same as soon as I heard that he was a law school graduate. Would be interesting to know his career and financial situation.

  10. There is a double dumbing down going on. An invisible one: the overall population of LSAT takers is getting dumber because the brighter and more informed recent college grads are aware of the catastrophic outcomes for recent grads and steer clear. And a visible one: law schools are enrolling more applicants who scored poorly on the LSAT on a percentile basis. Thus, "Organ. . . found that this year’s J.D.s should have performed slightly worse. Instead, they bombed."

  11. Iowa Bar Passage rate declines... complete with audio clips!

    http://wnax.com/news/180081-fewer-iowa-law-students-passing-bar/

    1. Imagining The Open Toad, November 21, 2014 at 8:20 AM

      Weird, you'd think WNAX would have asked a reporter to get a comment/counterpoint from Nando... ;-)

      "The passage rate for Iowans taking the Bar Exam fell 11%... ...Drake University Law School associate dean, Andrea Charlow ... doesn’t agree that the schools are to blame for a decline in scores. "

    2. Indeed, ASSociate dean Andrea Charlow doesn't believe that the law skules are responsible for anything.

      Old Guy

  12. OT: CU approves new degree. "CU's law school will offer the master's of studies in law, a one-year program for professionals who need some formal legal education but don't want or need to earn a juris doctorate degree." The degree is aimed at students seeking jobs as "patent agents, compliance officers, human resources professionals and other positions require some legal knowledge."

    http://www.dailycamera.com/News/ci_26973782/CU-regents-approve-new-degrees-in-law-technology

    1. Who exactly "need[s] to earn a juris doctorate degree?"

      Soon, all schools will be doing this but I suspect that they will have few takers unless they offer the program at night and/or online. The day when people could take a year off from their career has long passed.

    2. "is aimed at students seeking jobs as "patent agents, compliance officers, human resources professionals and other positions require some legal knowledge."

      I can't imagine what they'd have on tap that they think would be useful in the work of a patent agent.

    3. Again, this seems like a degree devised by egghead academics who don't quite grasp what these professions actually do.

      My guess is that it will focus entirely on case law, but ignore regulations and agencies (i.e., all the areas associated with administrative law), which is what these professions actually have to deal with.

    4. Of course the unintended consequence is that if all HR and compliance officers get this one-year degree and they actually are taught something useful, then even fewer attorneys will be needed to support their businesses. Leave it to law schools to find a way to keep their faculty employed by further decimating the profession.

    5. As I've said, "juris doctorate" is incorrect: it would mean (in mediaeval Latin) 'O thou who hast been made a teacher of law'. The correct term is "juris doctor".

      People who don't know Latin should kindly refrain from mangling the language.

      Old Guy

    6. This new degree only reinforces the belief that JDs from even Cooley are bound for grander positions than those of compliance officer and HR paper-shuffler. In fact, many a JD saddled with a quarter of a million dollars' worth of non-dischargeable debt would kill for a crumby job in a goddamn personnel office.

      Nobody is going to sign up for this bullshit degree, which has "Mickey Mouse" written all over it.

      Old Guy

    7. Dozens of schools are adding the "Master of Jurisprudence/legal studies/juridical science" to their BS offerings. If it was a consolation prize for 1Ls who drop out/are kicked out, that would be fine, but I don't know what idiot would willfully sign up for this stupid thing.

    8. My prediction is that the JD Disadvantage blog will have to expand to cover No Master of Legal Studies/Jurisprudence/Juridical Science Need Apply jobs as well.

    9. I'm waiting for the list of Three Hundred Things that You Can Do with a Master's of Studies in Law.

      Old Guy

  13. Your argument that a 150 one year isn't the same as a 150 the next year is wrong. The reason why you have to take "experimental" sections with every test is to norm different tests and test populations on a constant scale.

    If the test was dumbed down to account for the trend of dumber people taking the test, then we wouldn't see the decline in scores we do.

    Over a period of many years this might not apply, because the experimental section results grow stale. But LSAC is well aware that the quality of students changes from test to test, and does go to great efforts to account for this. From one year to the next, 150 is a 150.

    I think the better critique of the scamprof argument is that the supposed deviation in LSAT v. MBE scores isn't statistically significant. The quote says that based on LSAT decline, MBE scores should have gone down .5 to 1 points instead of 2.8. The MBE standard deviation is 15.9 points. This discrepancy, less than 1/8 of a standard deviation, even assuming their math is right (never assume lawprofs can perform even basic stats), is so tiny that it is likely statistical noise.

    There are all sorts of other things going on here as well. For one thing, smart law grads may not be taking the bar exam because, with the improving non-legal economy and crap law job market, they have better options. The dumber law grad population, however, is stuck without any non-law options, so it takes the bar. There could also be more lower-scoring repeaters than before. They didn't even try to take this into account, despite the fact that repeat takers do much worse than first-time takers.

    The scamprofs are also using hyperbole in calling this a "historic" drop. It was unusually large, but looking at two-year changes, it is actually almost exactly the same. From 2008 to 2010 MBE scores dropped two points. From 2012 to 2014, the drop was again two points. So it is perfectly normal, not historic, if you extend the period by even one year.

    Finally, their "incoming" data seems to be based on 25th, 50th, and 75th percentiles. But we have also seen several formerly semi-selective and very large 4th-tier schools like NYLS, Cooley, and TJSL adopt de facto open admissions. That means that while the 25th percentile might have dropped only a bit, the 15th, 5th, etc. likely dropped a lot as schools struggled to fill their classes or face defaulting on loans and firing faculty.
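A toy illustration of that percentile point (the score lists below are invented purely for illustration, not real admissions data): a school can hold its reported 25th percentile steady while the bottom of its class collapses.

```python
# Hypothetical LSAT scores for one entering class, before and after a shift
# to de facto open admissions (numbers invented solely for illustration).
before = [148, 149, 150, 151, 152, 153, 154, 155, 156, 158]
after  = [136, 143, 150, 151, 152, 153, 154, 155, 156, 158]

def p25(scores):
    """Crude 25th percentile: the value one quarter of the way up the sorted list."""
    s = sorted(scores)
    return s[len(s) // 4]

print(p25(before), min(before))  # reported 25th percentile, and the bottom of the class
print(p25(after), min(after))    # same 25th percentile, much weaker tail
```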

    1. LSTC did not say that year-to-year averages change dramatically. He said that averages over certain periods change depending on the pool of test takers during that period. For example, the 2005-2009 crowd may have been sharper than the 2009-2014 crowd, but the test would not reflect that, since each year is curved individually based on prior test takers' performances. If the intelligence of test takers declines steadily over a decade, the LSAT will not reflect this decline.

    2. I addressed much of this in the article, and you're mistaken on several points.

      Lower LSATs (meaning LSAT medians and 25/75 numbers) mean nothing except that fewer people are taking the test and law schools are reaching further down the curve each year.

      Also, the "experimental" sections are not used to curve/normalize that administration, but rather to generate data to score those same questions in future administrations. The scoring "curve" for each test is set in advance.

      It isn't LSAC "dumbing down" the test at all. I would agree that LSAC uses controls to try to keep the test consistent, but its method of doing so is flawed (read about item response theory and think about how it would affect the scoring if cohorts were getting progressively dumber). That's why we cannot assume a 150 in Y1 will have an equal chance of passing the bar as a 150 in Y5.
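A minimal sketch of that worry about item response theory (my own toy model, not LSAC's actual equating procedure): under a standard two-parameter logistic (2PL) model, the same pretested item draws a lower raw percent-correct from a weaker cohort, so a calibration that reads that change as the item getting harder, rather than the cohort getting weaker, will drift.

```python
import math
import random

def p_correct(theta, a, b):
    """2PL item response model: probability that a test taker with ability
    theta answers an item with discrimination a and difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def cohort_pct_correct(mean_ability, n=10_000, a=1.0, b=0.0):
    """Simulated percent-correct on one pretested item for a cohort whose
    abilities are drawn from normal(mean_ability, 1)."""
    hits = sum(random.random() < p_correct(random.gauss(mean_ability, 1), a, b)
               for _ in range(n))
    return hits / n

random.seed(0)
# Same item, two cohorts: the later cohort's mean ability is 0.3 SD lower.
print(cohort_pct_correct(0.0))    # stronger cohort's raw percent-correct
print(cohort_pct_correct(-0.3))   # weaker cohort scores lower on the same item
```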

  14. One thing the LSAT does is more or less predict how well you will do against that same group of peers in your first year of law school. I took it and got a 151, then was pissed when I was in the lower 50% of my first semester grades. Instead of heeding the first warning (the LSAT) and the second (the confirmation, low grades at mid-1L year), I soldiered on, graduated, and now have a worthless JD that hinders my employment prospects.

    The LSAT can be a valuable predictor and should be listened to by anyone considering law school.

    But the bar is nothing more than a restraint on competition. Don't ever let me grade them - I will pass EVERYONE who did the time, paid the money, got the grades and earned the JD. At that point it's too late to pull the rug out from under them and say "haha - sorry, you're not a lawyer!" Bullshit, let 'em in. It's not like they're ever going to get to practice anyway.

    1. The only thing the LSAT can predict is whether or not you are prone to "choke" on important exams.

      Since law school is one exam per semester, choking is a very real danger for those of us who choke. That is the only reason we see any correlation between low LSAT and low grades in law school. Otherwise, the LSAT is meaningless and does not measure anything.

  15. It's really very interesting that the only sound, or at least plausible, argument without apparent flaws is the one in the article.

  16. The very definition of an aptitude test precludes preparation before the administration. If the LSAT were a true aptitude test, capable of measuring anything beyond how well one prepared, studied, and drilled, then there would be no need for prep courses, studying, or drilling before students sat for it. Aptitude tests measure innate aptitude: innate proclivities that make one more or less likely to succeed in law school. To measure such things, the test taker cannot have prepared by learning the methods and practicing the skills that raise scores. A proper aptitude test would be administered without prior preparation, to test a student's innate aptitudes for the skills relevant to law.

    The LSAT, in contrast, simply measures how well one has prepared for the test: how much one has studied, how many LSAC prep-test booklets one has bought, how many courses one has paid for. It is not indicative of law school success; it is indicative of work ethic and a commitment to studying, which gives it an indirect, correlative relationship with law school success. There is no causal relationship, beyond the obvious point that people who study harder on a more regular basis are more likely to succeed academically in graduate school.

    The LSAT is a scam. LSAC is a monopolistic organization that profiteers off fledgling applicants who have no choice but to take its inefficient and completely absurd test in order to obtain admittance into a law school. They are forced into this test by the monopolistic structure of the admissions process, much like the MCAT, the College Board, and the respective admissions markets they monopolize (don't tell me the ACT is even remotely competitive with the SAT; you need to take both if you want your full due in consideration for admission). LSAC therefore no longer has to truly defend its claims about predictive validity, and simply treats a correlation as causal. All the LSAT can predict is what school you get into, and all it tells anyone is how well you prepared to take it. A standardized test measuring aptitude cannot, by definition, require prior preparation, for you are no longer measuring innate capacities once overt measures have been taken to compensate for them.
