The year got off to a dismal start for law schools when, in early January 2011, the New York Times published a sensational exposé, “Is Law School a Losing Game?” In a multipage report, David Segal splayed open for public consumption the rampant practice of law schools reporting misleading employment data. The legal market was in the midst of a severe recession. Yet miraculously, ninety-seven of the top one hundred law schools, as well as a majority of the bottom hundred, claimed that more than 90 percent of their graduates were employed within nine months of graduation. Dozens of law schools posted eye-popping employment rates of 98–100 percent; and several dozen law schools posted midrange salary numbers claiming that recent graduates earned up to $160,000 a year. Judging from these numbers, going to law school was a smart move that paid off handsomely, especially at a time when so many recent college graduates were unemployed.
Segal disclosed, however, that law schools had been doctoring their employment figures for years, using a variety of fudges to jimmy them up. Several strategies did the trick. When obtaining employment information, law schools asked their graduates whether they had jobs of any kind—not just lawyer jobs. Law school graduates who were employed in a position outside of the legal field, like a grocery clerk, would be identified as “employed” in “business and industry.” This provided a nice lift to the employment numbers because most graduates must have a job of some kind to pay their bills. In another combination of moves, schools left out any graduates who were “not seeking employment” or were pursuing further education (like enrolling in LLM programs or completing a joint degree); and because US News automatically treated 25 percent of graduates whose status was unknown as “employed,” law schools made less of an effort to get answers from graduates they suspected were unemployed (successful graduates are pleased to report jobs). Finally, law schools offered unemployed graduates temporary jobs—as research assistants or interns at ten dollars an hour—which expired after the period covered by the survey, thus counting them as “employed” when it mattered. As for salary numbers, law schools artfully crafted categories (“private full-time legal employment”) and used selective reporting to elevate amounts, prominently displaying high-income figures that reflected only a small percentage of the class.
These techniques spread through law schools over time. In the 1997 US News ranking, almost all of the top twenty-five schools had placement rates in the ninetieth percentile range (the highest was 97.1 percent); the majority of the schools ranked twenty-six to fifty were in the eightieth percentile range; schools below the top fifty were scattered around the seventies and eighties (flagship state schools that dominated local legal markets were in the nineties), and a dozen schools listed placement rates below 70 percent. That was a plausible distribution. In ensuing years placement rates began to drift up at law schools across the board. By the mid-2000s, nearly every law school in the top hundred advertised employment rates in the ninetieth percentile range, as did many schools ranked lower. Some of this rise is attributable to a healthy job market for law graduates, but that does not explain such high figures across the board. Goosing the numbers evidently had become pervasive.
It was widely known, at least among law school administrators and professors who were paying attention, that advertised employment numbers were inflated. They rationalized that since most law schools were doing it, it wasn’t wrong, and any school that did not boost numbers would suffer next to competitor schools that engaged in the practice. Few people inside or outside of law schools complained about or criticized the artificially high reported employment rates. The extent of the inflation was not readily apparent.
That changed when waves of lawyer layoffs plowed through the bar in 2008 and 2009, leaving no doubt that the job market was terrible.
The percentage of graduates who obtained jobs as lawyers (among those whose job status was known) declined every year—from a high of 76.9 percent in 2007, to 74.7 percent in 2008, 70.6 percent in 2009, and 68.4 percent in 2010. The percentage of part-time jobs rose significantly, from a norm of 5 percent or less, to 11 percent in 2009. One in five jobs obtained by the class of 2010 were temporary, double the number in 2007. The only employment category to show gains was “academic” jobs for law graduates, which reached an all-time high—the result of law schools putting more unemployed graduates on their payroll with part-time jobs (to keep up faltering employment numbers).
The National Association for Law Placement (NALP), an association that collects employment information on recent law graduates, concluded that only 64 percent of 2010 graduates (whose job status was known) had found full-time lawyer jobs, and the “aggregate starting private practice salaries fell an astonishing 20% for this class.” An analysis factoring in the thousands of lawyers laid off estimated that only 19,397 lawyer jobs were available annually from 2008 to 2010—law schools produce more than two times that number of graduates each year. There was less than one job opening for every two new lawyers. By all indications, this was the worst job market in decades. Defying this reality, many law schools continued to report employment rates for graduates above 90 percent. The disparity between this cheery picture and the ongoing carnage in the legal job market was too great to go unnoticed.
Not all law schools used every technique for manipulating the employment figures and some massaged the numbers more ruthlessly than others, but most schools did some of it, including elite law schools. (Northwestern University Law School pioneered the “hire your unemployed” technique.) Through these devices, the bulk of law schools across the country were able to certify that nine out of ten of their graduates landed employment, with many scoring high salaries.
Legal educators are unapologetic about their use of these expediencies, having a ready justification for each. Legal academics insist, for instance, that lots of graduates don’t practice law yet still use the degree to advance their careers in “business.” To fixate only on lawyer jobs would understate successful outcomes. And putting unemployed grads temporarily on the law school payroll extends them a helping hand in difficult times. The reported employment numbers, they say, are truthful—not fabricated—and comply with reporting guidelines acceptable to the ABA.
This is “truthiness” in the technical sense that lawyers are infamous for, but it wasn’t honest. As legal educators know, prospective students who saw claims of “98 percent employed 9 months after graduation” would naturally assume that meant lawyer jobs at decent pay. That is the primary career law schools are selling.
Skeptical prospective students who conducted a diligent investigation into the employment numbers would have realized that something didn’t add up. Many schools advertised employment rates that exceeded their bar pass rates, which implies that not all the jobs were lawyer jobs, although a person reading the information would have to draw the connection on his or her own. Unwary students—and why should they think that law schools were substantially distorting the employment numbers?—would have been fooled. Segal called out law schools for the deception.
Public scandal hit again a month after Segal’s article when the new dean at Villanova University School of Law, John Gotanda, let it be known that the previous administration had submitted falsely inflated LSAT numbers to the ABA and US News for several years, reporting a median LSAT of 162 when it was actually 159. This was not creative accounting or truthiness—but flat-out lying, which was rewarded with a higher rank for the school. After its actual median was counted, Villanova tumbled in rank from sixty-seven to eighty-four.
Embarrassing blows to law schools kept coming month after month. In March, US Senator Barbara Boxer sent a letter to ABA president Stephen Zack demanding that the ABA implement reforms to halt the deceptive reporting practices of law schools. “Most students reasonably expect to obtain post-graduation employment,” Boxer sternly wrote, “that will allow them to pay off their student loan debts, and rely on this information [provided by law schools]—which may be false at worst and misleading at best—to inform their decision.”
In April, Segal published a follow-up piece in the New York Times likening the scholarship policies of many law schools to “bait and switch” schemes. To attract students, law schools offered sizable scholarships for three years, contingent after the first year on maintaining a minimum qualifying grade point average (GPA), a B average, for instance. This condition would not seem like much of a concern to prospective students, most of whom got high grades in college. What students were not told clearly enough is that many of their classmates, at some schools more than half the class, were offered similar deals, and first-year grading is done on a curve that strictly limits the number of students who receive Bs and above.
To see how this snare works, say that 50 percent of the class comes in with a scholarship, but at the end of the first year only 30 percent of the class achieves a B average. In an arrangement like this, four out of ten scholarship students would lose their scholarships for the second and third year, ending up paying tens of thousands of dollars more for law school than they had planned. Had students realized the magnitude of the risk, they might have decided instead to attend a higher-ranked school, paying full price all the way but obtaining better job opportunities on graduation. Law school officials defended the arrangement as an appropriate allocation of scholarships and insisted that students knew the conditions. Students who lost out were devastated and felt deceived because they were not specifically told that a significant number of students yearly forfeited scholarships. (Eighty-five percent of law schools outside the top fifty, and about half of the top fifty, attach contingencies of this sort to their scholarship offers.)
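The arithmetic of this snare can be sketched in a few lines of Python. The percentages are the hypothetical ones from the paragraph above; the "best case" assumption—that every B-average slot goes to a scholarship holder—is mine, and it shows that the four-in-ten figure is actually a floor, not an estimate:

```python
# Scholarship-retention arithmetic from the hypothetical example in the text.
class_size = 100            # hypothetical class of 100 students
scholarship_share = 0.50    # 50% enter with a conditional scholarship
b_average_share = 0.30      # the curve lets only 30% of the class earn a B average

scholarship_students = class_size * scholarship_share   # 50 students
b_average_students = class_size * b_average_share       # 30 students

# Best case for scholarship holders: every B-average slot is filled by one of them.
max_keepers = min(scholarship_students, b_average_students)  # at most 30 keep it
min_losers = scholarship_students - max_keepers              # so at least 20 lose it

print(f"At least {min_losers / scholarship_students:.0%} of scholarship "
      f"students forfeit their awards")  # → At least 40%
```

Note that 40 percent is the minimum: if any B averages go to students who entered without scholarships, the forfeiture rate among scholarship holders rises above four in ten.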
The assault on law schools escalated in May, when a group of graduates filed a class action lawsuit charging Thomas Jefferson School of Law with fraud and deceptive business practices, misinforming prospective students about job placement rates. Similar lawsuits were soon filed against Thomas Cooley Law School and New York Law School, and a dozen additional law schools were sued several months later, with more suits reportedly to follow.
In July, a second US Senator, Charles Grassley, ranking member on the Judiciary Committee, sent a letter to the president of the ABA raising concerns about law school scholarship practices, the overproducing of law graduates during a bleak job market, and the risk that growing numbers of students might default on federally backed student loans, costing taxpayers a great deal of money. The prospect of closer scrutiny by the Senate of the law school situation was implicit in the list of thirty-one questions set forth in the letter, with a demand for a prompt response.
In September a second law school was exposed for reporting false numbers. The University of Illinois College of Law advertised an LSAT median of 168, when in fact it was 163. After further investigation, it was revealed that Illinois had reported false LSAT and/or GPA medians to the ABA six times in the preceding ten years, as well as false acceptance numbers (substantially boosting its selectivity rate). This was not the first time Illinois had been caught for questionable reporting. In 2005 Illinois inflated the amount it spent on students (a factor in the ranking) by reporting to US News the estimated fair market value of electronic legal research subscriptions to Westlaw and Lexis, claiming to have spent $8.78 million instead of the $100,000 it actually paid for those services.
With the news about Illinois coming on the heels of the disclosure of Villanova’s false LSAT numbers, the obvious question was how many other law schools had been doing the same. “It really makes you wonder,” said Sarah Zearfoss, assistant dean for admissions at University of Michigan Law School. “There have been schools that my colleagues and I thought were cheating, because we knew enough about their applicant pools that their numbers didn’t seem credible. Maybe they really weren’t credible.” The Law School Admission Council (LSAC), an organization that processes law school applications and enrollment on behalf of law schools, has accurate LSAT/GPA medians for every accredited law school in its database and could easily monitor the scores submitted by law schools. When asked whether LSAC would do this, President Daniel Bernstine resisted: “That’s just not something we have done historically, and I don’t see why we would. We are not in the reporting business.” What is peculiar about this whole affair is that the ABA and LSAC jointly publish the Official Guide to ABA-Approved Law Schools. The ABA asks law schools to supply their annual LSAT/GPA medians for inclusion in the guide, when LSAC could provide this information directly to the ABA free from error or deception. The ABA has thus created an arrangement that allows law schools to report false scores with impunity.
In October Senate scrutiny shifted from talk to action, when Senator Tom Coburn and Senator Boxer jointly directed the inspector general of the Department of Education to conduct an investigation into law schools. The senators sought this report as a prelude to possible reforms of the Higher Education Act to rectify the problematic situation with law schools.
Finally moved to act by all the negative attention, the ABA Section on Legal Education met the call for greater transparency by approving new rules that would require law schools to provide prospective students with clearer and more accurate information about the employment results of recent graduates, although questions still remained. By taking so long to deal with the problem, the ABA’s action had the appearance of being forced, an attempt to head off Senate scrutiny rather than a genuine embracing of reform. ABA officials continued to deny the seriousness and pervasiveness of the problem, insisting that “the number of institutions that fail to report employment data accurately is small.”
Throughout this period, a relentless stream of invective was directed at law schools by a “scamblog” movement, two dozen active blogs by recent law graduates who dedicate themselves to exposing “the law school scam.” They warn readers that law schools lie about employment statistics and that the fate of many graduates is huge debt with no job. Law professors and deans are painted as profiteers who make money by selling a false product. The most popular of these blogs, with over 400,000 visits, is Third Tier Reality, where the author, Nando, weekly posts a detailed profile of a law school. Prominently displayed at the head of each profile is a shot of an excrement-filled toilet—a play on the phrase “third-tier toilet” or “TTT”—followed by information about tuition, expenses, ranking, job prospects, and dean and professor pay. In profanity-laced attacks, Nando ridicules the employment numbers posted by each school, exposing the tricks used to pump them up. He concludes each profile with a blunt warning to prospective students to stay away.
An uproar erupted when a law professor joined the scamblog movement with an anonymous blog, Inside the Law School Scam, presenting a series of posts contending that law professors did hardly any preparation for class, knew little about the practice of law, and produced reams of worthless scholarship. He argued that attending law school is a bad idea for most students, costing too much for a dubious economic return. The author, who later outed himself as Colorado law professor Paul Campos, was excoriated by law professors for indulging in sweeping exaggerations, while critics of law schools praised him for candidly raising issues that the legal academy was doing its best to ignore.
Law schools have always presented themselves as the upstanding conscience of the legal profession. How could this deflating series of events happen? Segal explains: “The problem, as many professors have noted, is structural. A school that does not aggressively manage its ranking will founder.” When called to account for their conduct, legal educators point the finger at the US News ranking system. Once a few law schools began to use questionable techniques to squeeze up their score in the factors that went into the ranking, others risked being punished with a lower rank if they did not follow suit.
The rankings have law schools by the throat. No question. From 1990, when US News began to issue a systematic annual ranking, its influence over law schools has grown enormously. Deceptive reporting practices are just a part of its pervasive impact. Multiple deans have resigned after a drop in rank. Schools have altered their admissions formula to maximize their ranking. The internal composition of the student body has changed in multiple ways at law schools as a result of the ranking. Schools have shifted scholarships away from financially needy students owing to the ranking. Tens of thousands of dollars are spent on promotional material by law schools hoping to improve their ranking. Faculties have formed committees and plotted strategies to chart a rise in the rankings. The fact that reputation among academics is the most heavily weighted factor in the ranking—25 percent of the score—turbocharged the market for lateral hires, boosting professor pay at the high end. The Government Accountability Office issued a report to Congress concluding that competition among law schools over the ranking is a major contributor to the increase in tuition.
Each spring, when the new annual ranking is announced, law professors and students across the nation apprehensively await their fate. A few schools are elated at a jump, a few are dejected at an unexpected slide, and everyone else is relieved to have avoided a devastating fall—at least until next time around. Because schools are tightly bunched together in their raw scores, minor fluctuations have outsized consequences. On average, about two-thirds of law schools experience a change in rank from the previous year. This absurdly high rate puts every school on edge. Dropping a tier is especially dreaded, like going off a cliff.
The annual pronouncement of the surviving rump of a defunct magazine thus mercilessly lords over legal academia—an amazing state of affairs when you think about it. Colleges and other professional schools are subject to competing rankings, so no single ranking system dominates them to the extent that US News dominates law schools, which dance to its tune.
The US News ranking gets its inordinate power because students choosing between law schools attach preeminent weight to the ranking. Students are sensible to consider rank (although its significance diminishes the further one gets from the top), alongside location and scholarship offers, because legal employers view rank as an indicator of student quality. The largest 250 corporate law firms hire heavily from the top schools. Law is an obsessively credential-focused profession. Every justice on the current Supreme Court attended top-five law schools (Harvard, Yale, Columbia), and Harvard and Yale together produce a substantial proportion of the law professors across the country.
A statistical analysis of the influence of the ranking on students’ decisions confirmed what law schools already knew: “Ranks affect how many students apply to a school, how many of those applicants have exceptionally high LSAT scores, the percentage of applicants who are accepted, and the percentage of accepted students who matriculate.” This influence shows up most dramatically when schools experience sharp movements up or down in the rankings or shifts between tiers. After a significant movement, the number and quality of applications will change from the previous year to match the shift in rank.
Legal educators endlessly gripe that the US News ranking is bunk, poking holes in every aspect of its construction and methodology. Here is just one example of an egregious flaw: the reputation rating by practitioners, which carries significant weight in the final score, is based on a survey US News sends to 750 law partners asking them to rate all the law schools across the country (on the questionable assumption that they know about the quality of particular law schools). The response rate typically is low. Consequently, the opinions of two hundred or so lawyers determine 15 percent of the final score each year. (Fewer than 120 responses were received in 2010, not enough to be credible, which forced US News to shift to an average of two years.) Complaints about the flaws in the ranking, however, have no apparent effect.
For an illustration of its impact, take Emory University School of Law, which fell from twenty-two to thirty in the 2012 ranking, a devastating plunge. From a spot close to the coveted top twenty, the school was dumped into the thirties. Whatever statistical input produced the drop must have had no connection with the quality of the school because nothing meaningful had changed in a single year. The dean resigned.
Immediately, Emory’s situation changed because of the fall. Ranked thirtieth, Emory will attract an aggregate pool of applications with a median LSAT perhaps a point or two lower than it drew when ranked twenty-second. LSAT median is all-important, and a single-point shift matters because schools are separated by fractions in the raw scores that underlie the ranking. Previously, Emory competed on an even basis for students against schools like Boston University—tied with Emory at twenty-second—offering a roughly similar scholarship to entice students. After the drop, however, it must offer higher amounts if it hopes to win students away in a head-to-head competition. Students from the Northeast, a pool Emory draws heavily from, would be reluctant to choose the thirtieth-ranked school over the twenty-second in the absence of significant financial inducement. And that might not be enough to appease risk-averse students worried about a further slide by Emory. As a consequence of the drop, Emory faces the prospect of a dual financial hit, increasing its scholarship budget as well as enrolling fewer students to stave off a drop in its LSAT median.
In this manner, the ranking creates its own reality. An initial fall precipitates further downward pressure that is costly and must be reversed immediately before becoming self-perpetuating. Schools ranked fiftieth attract applications from students who fit that LSAT/GPA profile range— likewise at hundredth or at tenth. With new crops of applicants arriving each year, a school’s current rank is what counts, in combination with general reputation and strength in the local legal market. A bit of shuffling between spots occurs, but the top fourteen law schools in 1990 have remained the top fourteen up through the present, hence the phrase “T-14,” although in 2011 Texas joined the club in a tie for fourteenth. Their rank, in a self-reinforcing fashion, secures their rank by drawing the best applications and by enhancing their elite reputation. Schools ranked in the twenties and thirties, particularly ones that draw nationally, constantly jostle with cohorts in a tight competition for students and position. In this race, any school that stumbles is run over. Further down the chain, schools worry about their rank relative to local competitors at their range. Only schools at the lowest level are free to ignore US News, helpless to alter their fate because the ranking has condemned them to the basement.
That is why law schools loathe the ranking and many do whatever they can—including crossing the ethical line—to maximize their rank. Questionable reporting practices and gaming began early on and have never let up. In the 1995 ranking issue, US News named twenty-nine schools with “disturbing discrepancies”: they had supplied the magazine with higher LSAT scores than they had reported to the ABA. The magazine also noted that, according to a firm that surveys legal salaries, salary figures reported by some schools “seem a bit high.” AALS president Dale Whitman, in his 2002 presidential address, “Do the Right Thing,” implored schools to stop gaming for ranking purposes. His criticism of six strategies schools were using did not halt them, and his disclosure of these tactics to an attentive audience of legal educators probably did more to spread them. An article in the New York Times in 2005 revealed a host of dubious moves law schools were making to manipulate their scores. A month after the 2010 ranking came out, US News discovered that Brooklyn Law School had improperly failed to report the (lower) LSAT median of its part-time students; the administration called it a “mistake.” The multiple years of false reporting by Villanova and Illinois in the mid-2000s were not mistakes.
This will not stop. And its consequences go beyond superficial gaming. Real changes have occurred in law schools as a result, with manifold consequences. Law schools closely monitor each factor counted in the ranking and strive to raise their score by any means available. The most heavily weighted factor is the reputation rating of a school based on the surveys US News sends to academics (25 percent) and practitioners (15 percent). A school cannot directly affect its reputation, but the effort to elevate reputation has fueled the hiring of star laterals and a profusion of promotional material. The second heaviest factor in the ranking, student selectivity (LSAT, GPA, acceptance rate), 25 percent of the overall score, can be shaped by law schools; the effort to boost this score has warped law schools in several ways, which I’ll elaborate in the next chapter. Placement success comes next in weight, at 20 percent, which lies behind the dubious reporting practices of law schools mentioned earlier. The final category, totaling 15 percent, covers resources for students: library expenditures, student-faculty ratio, other spending on students, and volumes in the library. To raise scores in this category, law schools spend more money per student (or use accounting tricks to claim higher expenditures)—yet another factor pushing up the spiraling cost of a legal education.
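The weighting scheme described above can be expressed as a simple weighted sum, which is all a composite ranking score is. The weights below come from the text; the component scores, function, and names are hypothetical illustrations of the structure, not US News’s actual methodology:

```python
# Sketch of a composite ranking score built from the weights stated in the text.
# Component scores are assumed to be normalized to a 0-1 scale; the example
# values are invented for illustration.
weights = {
    "academic_reputation": 0.25,      # survey of academics
    "practitioner_reputation": 0.15,  # survey of lawyers and judges
    "student_selectivity": 0.25,      # LSAT, GPA, acceptance rate
    "placement_success": 0.20,        # employment outcomes
    "resources": 0.15,                # spending, ratios, library
}
assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights cover the whole score

def composite(scores):
    """Weighted sum of normalized component scores."""
    return sum(weights[k] * scores[k] for k in weights)

# Hypothetical school: strong reputation and selectivity, weaker placement.
example = {"academic_reputation": 0.8, "practitioner_reputation": 0.7,
           "student_selectivity": 0.75, "placement_success": 0.6,
           "resources": 0.5}
print(composite(example))
```

The structure itself explains the behavior the chapter describes: because the final number is a weighted sum of a few inputs, a school can raise its rank by nudging any single input, and the heavily weighted ones (reputation, selectivity) attract the most effort.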
The “Investigative Report: University of Illinois College of Law Class Profile Reporting”—an investigation ordered by the university’s legal counsel and ethics office after its false reporting was exposed—provides a behind-the-scenes look at the extraordinary extent to which the ranking can consume a school. The stated goal of the 2006 five-year strategic plan was to move from its current twenty-fifth ranking to its former perch in the top twenty. Each proposed action in the plan begins with a statement of how much the target item counts in the US News ranking and what can be done to increase the score. The plan noted that the academic reputation rating is the most heavily weighted variable. To improve this score the faculty would expand from thirty-nine to forty-five; to retain faculty, professor pay would have to be increased to match the compensation level of peer law schools. Student credentials also count heavily, the plan noted, so it set 168 LSAT and 3.7 GPA medians as its goal. To accomplish this, the law school would hold down class size, increase scholarships to “buy high-end students,” and actively recruit higher-tuition-paying out-of-state students. In addition, the law school “launched an aggressive national Transfer Program that attracts and enrolls transfer students from other institutions in the 2L [second] year and that helps to offset the loss of tuition revenue that is entailed by recruiting a smaller incoming class.” Next, the plan observed that “the other significant remaining obstacle to our ability to climb the rankings hierarchy” is the rate of employment on graduation, which also counts heavily in the ranking. Unfortunately, the released document stops there, withholding the plan’s strategies for increasing the employment rate (the conspicuous cutoff of the released document at this point is suspicious).
Administrators utilized a calculator that a professor had constructed that duplicates the ranking methodology to determine whether “a 165/3.8 LSAT/GPA combination was preferable to a 167/3.6 combination” as the best way to raise their score. The calculator made projections on how many places the school would improve in the rank given different LSAT/GPA combinations. They collected extensive data on peer schools, estimating the raw scores of competitors on items measured in the ranking and devising tailored strategies to leap over schools in its proximity (Indiana especially). As a part of its initiative, the law school developed a program that granted admission to University of Illinois students with high GPAs without requiring that they take the LSAT exam. The administrator in charge of the program confided that it enabled him to “trap about 20 of the little bastards with high GPA’s that count and no LSAT score to count against my median.” When hearing of the plan, a correspondent admiringly responded, “That is clever. Jack up the GPA without risking the low LSAT. … nice gaming the system.”
This was the setting in which Illinois’s false reporting took place. Each year, the admissions dean falsified the LSAT, GPA, and acceptance rate just enough to meet the targets. The institutional commitment to improve its rank paid off, at least temporarily, with Illinois rising from twenty-fifth to twenty-first during this period, until sliding back to twenty-third. Although the “Investigative Report” placed the entire blame for the false reporting on the admissions dean, it also makes clear that the institutional obsession with achieving ranking benchmarks had warped internal policies.
This is not just about Illinois. Law schools across the country pay very close attention to the ranking and many follow Illinois-like strategies (false reporting aside) to boost their raw scores. These strategies have had sweeping effects on the schools—several of which are taken up in the next chapter. That educational institutions are under the thumb of the ranking to such an extreme degree is stunning.
A common refrain among legal educators is that they cannot be blamed for the unfortunate and unintended consequences of “structural” factors that govern legal academia. What they mean by this is that law schools operate in an environment in which schools compete intensely against one another for students. Since students rely heavily on the US News rankings in their decision, schools are forced to maximize their rank to succeed in the competition. Law schools are helpless to do otherwise as long as these conditions hold. The students want the ranking to be high because it adds value to their credential. The alumni want the ranking to be high for the same reason and as a matter of personal pride. No law school administrator likes posting misleading employment numbers or putting out scholarship offers that trap unwary students, but once a few less scrupulous schools used these techniques to advance their position in the ranking, other schools inevitably followed. That is how it spread. As Indiana law professor Bill Henderson put it in the Times article that initially brought scrutiny to these practices, “Enron-type accounting standards have become the norm [among law schools]. Every time I look at this data, I feel dirty.”
The structural explanation for why honorable law school administrators ended up taking disreputable actions for ranking purposes helps explain the developments of the past two decades. A conscientious dean who refused to engage in questionable number reporting or any of the other dubious practices risked not just her continued tenure as dean but the standing of her institution, which would pay the price for her scruples by looking worse than competitor institutions that were being less forthright. When serving as interim dean in 1998, after I learned (to my astonishment) from a professor at Northwestern that the school was putting its unemployed graduates temporarily on its payroll to boost its employment rate artificially, I immediately did the same—well aware that it was a bogus move.
Recognizing the structural forces that impelled us down this path does not cleanse us of responsibility. It is too convenient to assert that we collectively found ourselves in a bad place owing to structural factors but that no one did anything wrong personally, other than a couple of atypical cheaters who outright lied. Legal educators made choices every step of the way. Neither administrators nor professors stood up to say “Stop. That may be permissible under the rules but it’s not right.”
Copyright notice: Excerpted from pages 70-84 of Failing Law Schools by Brian Z. Tamanaha, published by the University of Chicago Press. ©2012 by The University of Chicago. All rights reserved. This text may be used and shared in accordance with the fair-use provisions of U.S. copyright law, and it may be archived and redistributed in electronic form, provided that this entire notice, including copyright information, is carried and provided that the University of Chicago Press is notified and no fee is charged for access. Archiving, redistribution, or republication of this text on other terms, in any medium, requires the consent of the University of Chicago Press. (Footnotes and other references included in the book may have been removed from this online version of the text.)