Does US News Degrade Law Schools?

Engines of Anxiety: Academic Rankings, Reputation and Accountability by Wendy Nelson Espeland and Michael Sauder, Russell Sage Foundation, New York, 2016

The three weekly news magazines, Time, Newsweek and US News and World Report, at one time printed and sold around ten million copies combined each week.  All have been greatly reduced in scope and circulation, unable to compete as weekly news journals with the rapid-fire instant news feeds available from online sources or 24-hour cable news channels.  One of the three, US News, under the direction of its owner Mortimer Zuckerman, chose a different course a quarter century back: becoming the bible of annual higher education rankings.  This venture has proven to be a great success, whatever the profitability of the enterprise for the company.  Today, US News rankings of undergraduate universities and colleges, top high schools, and graduate and professional programs have become a primary source of information for applicants, faculty, school staffs, employers, and alumni, and an important measure of status and achievement for the various schools and programs.

The ratings have always been controversial. How do you assess the quality of a program? Are the annual surveys measuring the right things? Are the weightings of the various factors that produce a rating score and a ranking reflective of what should matter most in evaluating schools or programs? Do the ratings capture the student experience and the value of a college or graduate program? Are the distinctions among schools real, or just an artifact of the need to rank-order?  Can the ratings be gamed? Do the ratings themselves influence some of the scores measured in the next ratings cycle, rather than just neutrally presenting a status report on a school or program?

Wendy Nelson Espeland and Michael Sauder’s new book, Engines of Anxiety, focuses on the US News rankings for one particular professional program: law schools.  The authors argue that while other professional schools and graduate programs, such as business schools, have alternative rankings to the US News survey, and many books try to evaluate and score undergraduate programs, there is no real alternative to the US News rankings of law schools. US News divides undergraduate colleges and universities into national universities and national small colleges, as well as regional universities and colleges. A business school can pick and choose a survey that ranks its program, or components of its program, highly and sell that to prospective students.  But US News has no real competition for its law school rankings (latest rankings here), in which one ranking system is applied to all law schools, virtually all of which comply with the “system” and submit their data to the magazine each year.

The authors argue that the high compliance rate relates to the fact that US News will estimate a law school’s data for each factor measured when the school does not submit its own. This includes data on job placement, admissions rates, LSAT scores, and GPAs for entering students, and the US News “estimates” are designed to be conservative -- lower rather than higher than what might be the real experience. Better to be ranked 89th with your own data than 116th when US News fills in the blanks.

The rankings as administered today divide the nearly 200 law schools into four tiers, with each of the first three tiers containing 50 schools, ranked 1-50, 51-100, and 101-150. Schools in the fourth tier are listed in alphabetical order, thereby preventing any one school from being proclaimed the worst law school in America each year. The military academies celebrate each year the graduate with the lowest GPA; that is not something the lowest-ranked law schools would find amusing. The school that comes out on top is assigned a score of 100. Every other school receives a score that is some percentage of 100, reflecting its total raw score across all the factors as compared to that of the top school. The authors argue that many of the differences among schools on the total factor score are very small, but enough to place one school a good bit higher in the rankings than others whose scores are not far behind.  The rankings create separation that may not be real.
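The scaling the authors describe can be illustrated with a small sketch (the school names and raw totals below are invented, not real US News data): tiny raw-score differences still produce distinct rank positions.

```python
# Hypothetical raw factor totals for five schools (invented numbers),
# illustrating the US News scaling: the top school is set to 100 and
# every other school gets a percentage of the top school's raw total.
raw = {"A": 78.0, "B": 61.4, "C": 61.1, "D": 60.9, "E": 60.8}

top = max(raw.values())
scaled = {s: round(100 * v / top, 1) for s, v in raw.items()}

# Schools B through E differ by well under one scaled point, yet the
# rank-ordering spreads them across four distinct positions.
ranking = sorted(scaled, key=scaled.get, reverse=True)
for rank, school in enumerate(ranking, start=1):
    print(rank, school, scaled[school])
```

Here schools B and E are separated by less than a single scaled point, but a reader of the published list sees only that one ranks four spots above the other.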

The rankings over time demonstrate a good deal of movement within tiers and between tiers.  Harvard and Yale are never going to fall out of the top five, but other schools can move 10-20 spots in a year, some up and some down, with predictable consequences.  The rankings, after all, are a zero-sum game: if one school moves up, one or more others will drop, depending on the extent of the move, and the same holds true in reverse.  The authors suggest, with plenty of statistical evidence and many interviews to back them up, that a shift in a school’s ranking, up or down, will affect the number of applications received in a subsequent cycle, the school’s yield (the percentage of accepted students who choose to attend), the quality of future applicant pools (especially as measured by median LSAT scores), and the perception of the school among recruiters, alumni, current students, faculty, and staff.  The rankings confer status, and having more or less of it is clearly meaningful.

The authors are critical of the reliance on certain measures to build the rankings. A school’s reputation accounts for 40% of the overall score: deans and other faculty are asked to rate all the other law schools, accounting for 25 of those 40 percentage points, and practitioners’ responses account for the other 15. However, the response rate for these surveys is poor, especially among practitioners, and many of the law school respondents rate only a few schools, those they think they know.  Some deans admit to marking down the schools they view as competitors, or those ranked just above or below their own school in the last survey.  It is almost certain that, at least on reputation scores, the prior year’s overall US News ranking is a good guide to how schools will be ranked on this factor in the current survey.

Since it is hard to move the needle on reputation scores, law schools have chosen other tactics to influence measures over which they have more control.  Non-need-based scholarships can be offered to applicants with high LSAT scores (responsible for 12.5% of the overall ranking score).  Lower-scoring students have been admitted to part-time or night programs so that they would not be included in the calculation of the median score (the average of the 25th and 75th percentile of all LSAT scores), making the school’s selectivity score higher.  Some schools hired unemployed graduates for “research jobs” so as to report a higher percentage employed nine months out.  The number employed often includes non-legal jobs for which passing the bar exam was not required.
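The part-time-program tactic is simple arithmetic, which a toy example can make concrete. The scores below are invented, and the percentile calculation (Python's `statistics.quantiles`) is one reasonable reading of the "average of the 25th and 75th percentile" figure the authors describe, not necessarily the magazine's exact method.

```python
from statistics import quantiles

def reported_lsat(scores):
    # The reported figure, per the authors' description: the average
    # of the 25th and 75th percentile LSAT scores of the entering class.
    q1, _, q3 = quantiles(scores, n=4)
    return (q1 + q3) / 2

# Invented LSAT scores for a small full-time entering class.
full_time = [150, 152, 155, 158, 160, 162, 165, 168]
print(reported_lsat(full_time))

# Shift the lowest scorer into the (uncounted) part-time program and
# the reported figure rises without admitting a single new student.
print(reported_lsat(full_time[1:]))
```

With these numbers the reported score climbs from 158.5 to 160.0, a meaningful jump on a measure worth 12.5% of the ranking, achieved entirely by reclassifying one admit.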

The attempts at gaming have been exacerbated by a recent weak period for law schools, with sharply declining numbers of overall applicants (close to a 50% decline from peak to trough) and a poor job market for graduates. LSAT scores for applicants and those accepted have been declining modestly almost everywhere in recent years.

The authors provide stories of deans, admissions directors, and placement directors who have lost their jobs after very small ranking declines, as small as two spots.  The US News ranking has become an all-consuming number for almost everyone involved with a law school, regardless of lip service to the contrary.

There are a few defenders of the rankings who make their case in the book for the greater transparency the ratings provide, and who think it important that the various law school constituencies have measurement tools for their programs.  The authors side more with the critics.

I once considered attending law school, and was accepted at some top-tier programs, but the Vietnam war draft ended that possibility.  I later attended business school, and have worked with lawyers for most of my career as a health care consultant, serving as an expert witness.  My own experience has been that in the field of health care law, many of the best lawyers I worked with attended a law school in the state in which they practiced.  In South Carolina, Kentucky, Florida, or some of the Midwestern states where I have worked, lawyers who grew up in the state wanted to go to law school there, since it was both less expensive (the law schools were almost always state schools) and a degree from an in-state school carried more cachet (and business connections) in that state than one from an elite school outside the region. Flaunting a top-ten law school degree in some places is viewed a lot differently than in New York, California, Washington DC or, to a somewhat lesser extent, Chicago.

The law school community, which by and large hates the US News ratings, may be exaggerating a bit the negative impact of slipping in the rankings from year to year.  Most schools serve particular identifiable markets and have some advantages in competing in those markets.  Slipping a bit looks bad, and may lead to some of the results described above, which might cause further slippage the next year.  But the authors provide no evidence of schools that have closed, or are on the verge of closing, due to rankings issues, or even due to the recent applicant decline and reduced class sizes at some schools.  Every year, nearly a quarter of law schools know they are in tier 4; another 50 are in tier 3.  They keep on going.  Slipping from 127 to 131 probably does not mean much. If you slip from 181 to 186, you likely do not even know it, since your alphabetical listing in tier 4 will be the same.  Dropping a tier might have a bigger impact.

The authors make a good case that there are some issues unique to the ranking of law schools, as opposed to rankings of other types of schools.  Pressure to move up, and not down, is likely higher than for other rankings. Law school is expensive, and many of those who graduate and get jobs are not walking down easy street financially, with big loans to repay and modest compensation. So rankings do deliver a message to prospective applicants, recruiters, and those who seek employment at a law school about whether the cost-benefit relationship is a good one. One thing I might add, given my experiences with lawyers over the years, is that the work/life balance required to succeed in much of the field is often unhealthy.  The country has over a million lawyers, and the law schools turn out another 40,000 or so every year. More undergraduates, with good reason, seem to be questioning the wisdom of choosing this profession.
