Applicant Data and Law School Outcomes


This is, I believe, the first multivariate analysis of applicant data against law school outcomes: in other words, the things law school admissions committees look at versus how well someone actually does in law school. I don’t believe the full paper is published yet, and, as a disclaimer, the authors are still running regressions (probably based on editor questions and feedback from whichever journal it ends up in). But we get the abstract with their permission, and I love this. Kudos to Alexia and Scott; I can only imagine the work that went into finding two schools that even have this much longitudinal data, getting permission to use it, redacting identifying information, coding it from all of the applications, and so on. But they did, and we benefit from it.

What Makes a Law Student Succeed or Fail? A Longitudinal Study Correlating Law Student Applicant Data and Law School Outcomes

Alexia Brunet Marks and Scott A. Moss, University of Colorado Law School

Despite the rise of “big data” empiricism, law school admissions remain highly impressionistic, relying on anecdotes (e.g., admitting those who resemble recent star students, or fewer who resemble recent underachievers), idiosyncratic preferences (e.g., for certain majors or jobs), or primarily on LSAT scores. The few studies of law student success typically are univariate, controlling for no other variables when finding predictive value in the LSAT or in certain personal qualities. We aim to fill this gap with a two-school, 1,400-student, 2005-2011 multivariate longitudinal study. After coding data from paper files and merging admissions, registrar, and career databases, our multivariate regressions yield predictions of law school grades (“LGPA”) from numerous independent variables: LSAT; college grades (“UGPA”), quality, and major; having a rising UGPA; job type and duration; leadership activities; other graduate degrees; disciplinary and criminal records; and variable interactions (e.g., having a high LSAT but low UGPA or vice versa, or testing rising UGPA only for recent college graduates).

Our findings include the following. First, many variables help predict LGPA: (1) UGPA predicts as powerfully as LSAT, controlling for college quality and major – and among “splitters,” high-UGPA/low-LSAT is equal to the more coveted high-LSAT/low-UGPA; (2) majoring in STEM (science, technology, engineering, math) or EAF (economics, accounting, finance) is akin to three extra LSAT points; (3) work experience, especially in teaching, strongly predicts success; (4) a criminal/disciplinary record predicts negatively, akin to five fewer LSAT points. Second, many variables had nonlinear effects: college quality has decreasing returns but UGPA has increasing returns, and a rising UGPA is an additional positive; work experience of 4-9 years is the “sweet spot,” with 1-3 years mildly positive and over 10 not significant. Third, some variables predict high LGPA variance, meaning such candidates are a heterogeneous mix of high and low performers requiring close scrutiny – most notably, those with longer work experience and, among “splitters,” those with high-UGPA/low-LSAT rather than the reverse. Fourth, many variables commonly seen as a positive had little or no effect, e.g.: public interest work; other graduate degrees; reading-intensive majors (e.g., political science or liberal arts); leadership; and military background.

These findings can help schools discern which numerically similar applicants are better bets to outperform traditional predictors (e.g., LSAT). Key caveats, however, are that some applicants projected to have middling grades retain appealing potential for leadership or lawyering success, and that no projections capture various qualities not reducible to data, so many will over- or under-perform even the best predictions.
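For readers curious what a regression like this looks like mechanically, here is a minimal sketch in Python: an ordinary least squares fit of a simulated LGPA on LSAT, UGPA, a STEM/EAF indicator, and an LSAT × UGPA interaction term (the kind of variable interaction the abstract describes). Everything here is an illustrative assumption on made-up data; it is not the authors’ model, data, or results.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical standardized predictors (simulated, purely illustrative)
lsat = rng.normal(size=n)                 # standardized LSAT score
ugpa = rng.normal(size=n)                 # standardized undergrad GPA
stem = rng.integers(0, 2, size=n)         # 1 = STEM/EAF major, 0 = other

# Simulated "true" outcome with an LSAT x UGPA interaction plus noise
lgpa = (0.30 * lsat + 0.30 * ugpa + 0.10 * stem
        + 0.15 * lsat * ugpa + rng.normal(scale=0.5, size=n))

# Design matrix: intercept, main effects, and the interaction column
X = np.column_stack([np.ones(n), lsat, ugpa, stem, lsat * ugpa])

# Ordinary least squares via numpy's least-squares solver
beta, *_ = np.linalg.lstsq(X, lgpa, rcond=None)

for name, b in zip(["intercept", "LSAT", "UGPA", "STEM/EAF", "LSAT x UGPA"], beta):
    print(f"{name:12s} {b:+.3f}")
```

With enough observations, the fitted coefficients land close to the simulated ones, which is the basic logic of the study: each coefficient estimates a variable’s association with LGPA while holding the others constant, and the interaction column is what lets “splitter” profiles (high LSAT/low UGPA and the reverse) be compared directly.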