Tuesday, December 10, 2019
A young legal scholar elsewhere writes:
I'm my faculty's most recently tenured member, so I got a US News peer assessment survey. Or, I should say, peer "assessment," since it doesn't actually ask for any assessment of anything. I knew that the methodology was shoddy for these things, but I'm still kind of shocked at what this is: just a list of all the law schools and a request to rate them on a 5-point scale. No faculty or publications or any information about them. It's just a test of what schools I happen to have heard good things about lately.
So, given that this survey cannot produce any credible measure of quality or anything else (except of who I happen to have heard good things about lately), what should I do? Should I simply ignore this nonsense? Or is there some penalty (to me? to others?) if people who recognize this as nonsense refuse to participate? Should I rank everyone outstanding? Everyone except the top twenty schools?
A few observations and suggestions:
(1) Any recently tenured faculty member (and that certainly goes for this young scholar) will, in fact, know a fair bit about the quality of scholarship (at least in his or her fields, and often cognate fields) at anywhere from a dozen to several dozen law schools. Evaluate those schools, being either generous or stingy with the scores as you see fit: e.g., give just five or six schools a "5," or give two dozen schools a "5." In general, I think evaluators should be generous, especially since higher scores will have more influence on the overall results: avoid 1s and 2s (unless you really are confident in the weakness of a particular school), and there's no harm in giving lots of 4s and 3s. (In the past, USNEWS.com used to drop a percentage of the highest and lowest scores as a check on strategic voting; I'm not sure if they still do that.) Most importantly, when you "don't know" much about a school, choose "don't know." "Don't know" does not count against (or for) a school.
(2) The academic reputation survey is, in fact, one of the few "reality checks" in the whole USNEWS.com charade: without it, the rankings would be based on nothing more than wealth and the extent to which schools "massage" the self-reported data like employment statistics and expenditures. Unfortunately, the academic reputation survey increasingly tracks the prior years' overall rank in USNEWS.com, which impedes its utility as a reality check. (This is one reason why adding citation data would, if done well, be salutary.) But evaluators can counteract that by (1) actually thinking about the quality of scholarship produced by a school's faculty (not the school's name!), and (2) looking at other data as a check on their impressions.
Here's a suggestion: everyone should give the University of San Diego at least a "4" this year in the peer assessment survey, since its overall USNEWS.com rank is preposterously low relative to the strength of the faculty (which is made up of folks who have had tenured positions or offers at lots of excellent schools, including Berkeley, Northwestern, Cornell, Minnesota, George Washington, Boston University, and elsewhere). If this works, I'll nominate more schools in future years that deserve a boost for their faculty excellence, even as they are punished by USNEWS.com on other metrics.