Posts Tagged ‘gerrymandering’

The Law is Medieval

October 25, 2017

This post is based on an article by Oliver Roeder on the FiveThirtyEight website on 17 Oct 2017 titled “The Supreme Court is Allergic to Math.”

In 1897, before he took his seat on the Supreme Court, Oliver Wendell Holmes delivered a famous speech at Boston University, advocating for empiricism over traditionalism: “For the rational study of the law…the man of the future is the man of statistics and the master of economics. It is revolting to have no better reason for a rule of law than that so it was laid down in the time of Henry IV.” HM believes that if Oliver Wendell Holmes were alive today, he would also argue for an understanding of psychology and cognitive science. Much has been learned about how and why we humans perceive, think, and act. Unfortunately, there is a poor fit between this knowledge and the law, because the law is medieval.

The article notes that this problem was on full display this month, when the Supreme Court heard arguments in a case that will determine the future of partisan gerrymandering. The issue is how to measure a map’s partisan bias and to create a standard for when a gerrymandered map infringes on voters’ rights. One proposed measure is called the efficiency gap. To calculate it, you take the difference between each party’s wasted votes—votes cast for losing candidates, plus votes for winning candidates beyond what the candidate needed to win—and divide that difference by the total number of votes cast. The aim is to measure the extent of partisan gerrymandering. A threshold then needs to be established for deciding when a gerrymander goes too far, and that is a reasonable basis for argument. Other metrics can also be proposed for measuring gerrymandering. But the only intelligent way of assessing gerrymandering is through a statistic, and this is no esoteric statistical technique. Apparently, though, it is too much for some justices’ mental capacities. HM asks himself why the term “feebleminded” came to mind while reading this. Chief Justice John Roberts dismissed attempts to quantify partisan gerrymandering: “It may be simply my educational background, but I can only describe it as sociological gobbledygook.” To be fair to Chief Justice Roberts, the fault may well lie in the educational system. Previous healthy memory blog posts have argued for teaching some basic statistics before graduating from high school. One cannot be a responsible citizen without some basic understanding of statistics, much less decide questions on the Supreme Court.
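The arithmetic behind the efficiency gap is simple enough to sketch in a few lines of Python. This is only an illustration of the idea described above, not the formulation argued before the Court; the function name and the bare-majority convention for a winner’s “needed” votes are assumptions of this sketch.

```python
def efficiency_gap(districts):
    """Efficiency gap for a two-party election.

    districts: list of (party_a_votes, party_b_votes) tuples, one per
    district. A party's wasted votes are all votes cast for its losing
    candidates, plus its winners' votes beyond the bare majority needed
    to win. The gap is the difference in wasted votes divided by the
    total votes cast.
    """
    wasted_a = wasted_b = total = 0
    for a, b in districts:
        district_total = a + b
        needed = district_total // 2 + 1  # bare majority to win
        if a > b:
            wasted_a += a - needed  # winner's surplus votes
            wasted_b += b           # all of the loser's votes
        else:
            wasted_b += b - needed
            wasted_a += a
        total += district_total
    return (wasted_a - wasted_b) / total
```

With this sign convention, a positive result means Party A wasted more votes (the map works against it), a negative result means Party B did, and a symmetric map—say, two districts each party wins 60–40—yields a gap of zero.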

Another instance of judicial innumeracy was the Supreme Court’s decision on a Fourth Amendment case about federal searches and seizures. In his opinion, Justice Potter Stewart discussed how no data existed showing that people in states with stricter rules regarding the admission of evidence obtained in an unlawful search were less likely to be subjected to these searches. He wrote, “Since as a practical matter, it is never easy to prove a negative, it is hardly likely that conclusive factual data could ever be assembled.”

But as the article’s author, Oliver Roeder, wrote: “This, however, is silly. It conflates two meanings of the word ‘negative.’ Philosophically, sure, it’s difficult to prove that something does not exist: No matter how prevalent gray elephants are, their number alone can’t prove the nonexistence of polka-dotted elephants. Arithmetically, though, scientists, social and otherwise, demonstrate negatives—as in a decrease, or a difference in rate—all the time. There’s nothing special about these kinds of negatives. Some drug tends to lower blood pressure. The average lottery player will lose money. A certain voting requirement depresses turnout.”

Ryan Enos, a political scientist at Harvard, calls this the “negative effect fallacy.” It is just one example of an empirical misunderstanding that has proliferated like a tsunami through decades of judges’ thinking, affecting cases concerning “free speech, voting rights, and campaign finance.”

Some are suspicious that this allergy to statistical evidence is really more of a screen—a convenient way to make a decision based on ideology while couching it in terms of practicality. Daniel Hemel, who teaches law at the University of Chicago, said: “[Roberts] is very smart and so are the judges who would be adjudicating partisan gerrymandering claims—I’m sure he and they could wrap their minds around the math. The ‘gobbledygook’ argument seems to be masking whatever his real objection might be.”

Reluctantly, one comes to the conclusion that there is no objective truth in the law. The corpus of law can be regarded as a gigantic projective test, analogous to the Rorschach Test: judges can look into the law and see in it what they want to see. Rarely is a decision unanimous, and frequently decisions break down along strict-constructionist lines. But the Constitution should be viewed as a changing and growing document as democracy advances. Strict constructionists feel compelled to project themselves back in time and interpret the words literally as written. HM wonders why they would want to go back to a time when slavery existed, women could not vote, and blacks were counted as a fraction of a human being. As long as time travel is involved, why not try to think of what would have been written in light of today’s knowledge? After all, today’s high school science student knows more science than Benjamin Franklin, the most distinguished scientist of his day, ever did. And the disciplines of psychology, cognitive science, and inferential statistics did not exist.

© Douglas Griffith and, 2017. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Douglas Griffith and with appropriate and specific direction to the original content.