“Fake” is a popular word these days—I probably don’t need to tell you why. Part of our education, from our parents and teachers and peers, is learning how to avoid being taken by a fake. Spotting fakes is the survival skill the gatherer uses to avoid the poisonous berry, the merchant uses to buy the genuine article, and the spy uses to flush out a double agent.
What does any of this have to do with the law? Consider the United States Supreme Court. On the bench sit brilliant, well-educated, experienced legal minds. The scales of justice remind us that balancing things is a judge’s core role. How strong is one argument versus another? Which party met its burden of proof or persuasion by adding enough pebbles to its side of the scale? Citations of fact add weight to an argument; the amount of weight depends on a fact’s certainty and its persuasiveness. But when a court can’t tell truth from fiction, and a fake goes onto the scale, the law can tip in irrational, harmful ways.
The independent investigative journalism group ProPublica reports that Supreme Court opinions contain an unsettlingly high number of factual errors. The ProPublica team found many errors that, although embarrassing, are not always consequential—say, undercounting the number of U.S. states that have passed a given law.
Other errors are startling.
In one case, the Court recited the “fact” that “88 percent of all private companies in the country conduct [background] checks” to deny a lawsuit brought by Caltech scientists who worked at the Jet Propulsion Laboratory on a contract basis. The scientists argued that the checks, which featured probing questions about sensitive health matters and questionnaires sent to their friends and family, were invasive enough to infringe on protected rights. But the Court reasoned that a practice followed by 88 percent of private companies must not be very intrusive.
Here’s the problem—nobody knows where that number came from, including the attorneys who filed the amicus brief containing that “fact.”
Some errors are even worse. The Court cited fake data about drug-sniffing dogs to water down constitutional requirements for search and seizure. And it cited a fake statistic, that no-bail imprisonment of immigrants awaiting trial averaged four months, to validate the practice. In truth, the average length of detention was 13 months.
Maybe we shouldn’t be too surprised. After all, the Supreme Court’s brilliant minds have gotten a lot of important things wrong over the years. Spotting what you might call fake logic, fallacies, is hard even for the sharpest people, and the ability to do it is impaired by the biases we carry through our lives. And yet observation-based facts should be different, and easier to keep straight. That is the promise of the scientific method. In the end, the blame probably lies with the most pernicious fallacy of all: confirmation bias. People, judges included, don’t spot a fake when it matches their expectations or supports their values. “A man hears what he wants to hear and disregards the rest.”
The ProPublica piece referred to here is available at https://www.propublica.org/article/supreme-court-errors-are-not-hard-to-find