Trending Topics

How a famous baseball statistician would change the justice system

Bill James proposes a mathematical model for evaluating evidence and calculating whether a burden of proof has been met

For all the science, technology, and engineering behind modern policing, we can’t escape the fact that the ultimate arbiter of our work relies on an immeasurable and imprecise standard we call “reasonable doubt.”

The sum of our investigative endeavors is laid before 12 citizens who must make certain decisions based on uncertain circumstances. The court will guide them on the mechanics of the law. But for the ultimate question of guilt or innocence, they are left with nothing but the good sense they brought with them and the evidence presented to them.

Bill James is a statistician and writer, best known as an obsessive baseball fan who created new methods for measuring and evaluating the game. He also wrote a terrific book called “Popular Crime,” which reveals another of his obsessions: true crime stories.

A Mathematical Model
James once wrote, “A trial is rather like a basketball game at which no one keeps score, but at the end of the game the audience is asked to vote on which team has played better.”

James is a fascinating writer and thinker. His ideas are often radical, and because of that they are easy to dismiss. But the man does not present any idea flippantly. His research is exhaustive and his analysis always seems grounded in wisdom.

In “Popular Crime,” James proposes a mathematical model for evaluating evidence and calculating whether a burden of proof has been met.

He wrote, “I know, I know, I know; this is totally impossible, it’s ridiculous, it’s absurd, it can’t be done, logic doesn’t work that way; I understand all that, I get it. I’m just saying … what if?”

James’ proposition calls for a standard (he proposes 100 points) that must be met to convict a defendant of a crime. Evidence is then assigned numerical values based on what it shows and how strongly it shows it. He presents this framework for assigning that value:

1. State the fact itself in a way that is unambiguously true.
2. State that which tends to be proven by the fact, as if it were known to be true.
3. Put the statement of what is proven (2) in a “standard evidence” form (a statement of evidence, as opposed to a statement of fact).
4. Establish the value of the statement of evidence (3) with reference to a standard set of values for such evidence.
5. Make an estimate of the extent to which the statement (3) is unproven.
6. Make an estimate of the extent to which the statement (3) is irrelevant.
7. Discount the value (4) by the extent to which the statement is unproven (5) or irrelevant (6).

For instance, suppose a homicide suspect mailed a death threat to the victim before the victim was murdered. The fact can be stated as “the suspect mailed a death threat to the victim.” What tends to be proven by that fact can be stated as “the suspect wanted to kill the victim.” That statement is then assigned a numerical value from a predetermined range for evidence of that type, and it is discounted by the extent to which it is unproven and/or irrelevant.
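To make the mechanics concrete, here is a minimal sketch in Python of how steps 4 through 7 might be encoded for a single piece of evidence. James does not spell out how the discounts should be applied; the multiplicative discounting, the EvidenceItem name, and every number in the death-threat example below are illustrative assumptions, not his figures.

    from dataclasses import dataclass

    @dataclass
    class EvidenceItem:
        """A single piece of evidence scored under the framework above.

        base_value: points from a standard table for this type of evidence (step 4)
        unproven:   estimated fraction, 0 to 1, by which the statement is unproven (step 5)
        irrelevant: estimated fraction, 0 to 1, by which the statement is irrelevant (step 6)
        """
        description: str
        base_value: float
        unproven: float = 0.0
        irrelevant: float = 0.0

        def score(self) -> float:
            # Step 7: discount the standard value by the unproven and irrelevant fractions.
            return self.base_value * (1.0 - self.unproven) * (1.0 - self.irrelevant)

    # Hypothetical numbers for the death-threat example: say the standard value for
    # "the suspect wanted to kill the victim" is 35 points, the statement is 10 percent
    # unproven (the handwriting match is contested) and 20 percent irrelevant (the
    # threat was mailed long before the murder).
    threat = EvidenceItem("mailed death threat", base_value=35, unproven=0.10, irrelevant=0.20)
    print(round(threat.score(), 1))  # 25.2 points toward a 100-point conviction threshold

Under those assumed numbers, the death threat contributes about a quarter of the proposed 100-point threshold on its own; everything else would have to come from other evidence.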

A More Scientific Method
Admittedly, the above framework is going to leave things open to interpretation. I have no doubt that two people could evaluate the same evidence using that framework and come up with different numbers. But as long as there are safeguards in place to prevent the numbers from varying too drastically, this still seems like a more scientific method than a blindfolded statue with a scale.

One of the tenets of James’ proposal is that no single piece of evidence should be sufficient to meet the burden of proof. But strong evidence should be scored appropriately high. Maybe DNA won’t get you 100 points on its own in a rape case, but it should get you close. Unless, of course, it is a question of consent, in which case the evidence is scored differently.

Another important facet of James’ proposal is that redundant evidence is discounted when it only tends to prove what has already been proven. Say you have a burglary case in which you have a suspect’s DNA, fingerprints, and shoe impressions at the scene of the crime. The DNA might be worth 60 points, the fingerprints another 60, and the shoe impressions 35, but they don’t combine for 155 points and a conviction. They all prove the same thing, and cumulatively they probably aren’t worth much more than the DNA or the fingerprints would have been worth alone.
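James does not give a formula for how heavily redundant evidence should be discounted, but the idea can be sketched. The Python function below is one hypothetical way to model it, assuming the strongest item on a given proposition counts in full and each additional item keeps only a sliver of its value; the 90 percent overlap discount is an invented parameter, not a figure James proposes.

    def combine_redundant(scores, overlap_discount=0.9):
        """Combine scores for evidence items that all prove the same proposition.

        The strongest item counts in full; each weaker item is discounted again
        for every stronger item already counted, because it only re-proves what
        has already been proven. The discount rate is an illustrative assumption.
        """
        total = 0.0
        for rank, score in enumerate(sorted(scores, reverse=True)):
            total += score * (1.0 - overlap_discount) ** rank
        return total

    # Burglary example from the text: DNA (60), fingerprints (60), shoe impressions (35),
    # all proving the same thing -- that the suspect was at the scene.
    print(combine_redundant([60, 60, 35]))  # roughly 66 points, nowhere near 155

In this sketch, only items that prove the same proposition are pooled this way; evidence going to a different element of the crime would be scored and summed separately.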

James also argues for an overhaul of the types of things that can be presented as evidence. Specifically, James argues that most forms of hearsay should be admissible as evidence and discounted appropriately when there are concerns about their veracity and their relevance.

James is not an attorney or a police officer. But he is a spectacularly well-read layperson, and it is interesting to hear him question the technicalities that we all take for granted. He has a point. When a jury is vested with so much responsibility, why do we prevent jurors from knowing things that they would undoubtedly want to know?

Barney Doyle has been a police officer for nine years. He spent five years on patrol and the last four-plus years doing white-collar crime investigations. He has a bachelor of science degree in accounting and is a Certified Fraud Examiner. Prior to law enforcement, Barney worked as a newspaper reporter. He still writes in his spare time and runs the site www.workingpolice.com.
