
TLS' David Papineau discusses statistician Thomas Bayes and his "Essay towards Solving a Problem in the Doctrine of Chances", calling it "a breakthrough in thinking about probability":

Most people who have heard of Thomas Bayes associate him primarily with "Bayes's theorem". This states that the probability of A given B equals the probability of B given A, times the probability of A, divided by the probability of B. So, in our case, Prob(biased coin/five heads) = Prob(five heads/biased coin) x Prob(biased coin) / Prob(five heads).

As it happens, this "theorem" is a trivial bit of probability arithmetic. (It falls straight out of the definition of Prob(A/B) as Prob(A&B) / Prob(B).) Because of this, many dismiss Bayes as a minor figure who has done well to have the contemporary revolution in statistical theory named after him. But this does a disservice to Bayes. The focus of his paper is not his theorem, which appears only in passing, but the logic of learning from evidence.
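The coin calculation can be made concrete. The article gives no numbers, so the prior (a 1% chance the coin is double-headed) and the likelihoods below are illustrative assumptions; the arithmetic itself, though, is just Bayes's theorem as stated above.

```python
# Bayes's theorem applied to the coin example.
# Assumed (not from the article): a 1% prior that the coin is
# double-headed; such a coin yields heads with certainty, while a
# fair coin gives five heads with probability (1/2)**5.

prior_biased = 0.01               # Prob(biased coin) -- assumed prior
p_heads_given_biased = 1.0 ** 5   # five heads is certain for a two-headed coin
p_heads_given_fair = 0.5 ** 5     # 1/32 for a fair coin

# Prob(five heads), by the law of total probability
p_five_heads = (p_heads_given_biased * prior_biased
                + p_heads_given_fair * (1 - prior_biased))

# Prob(biased coin / five heads)
#   = Prob(five heads / biased coin) x Prob(biased coin) / Prob(five heads)
posterior_biased = p_heads_given_biased * prior_biased / p_five_heads
print(round(posterior_biased, 3))  # prints 0.244
```

Five heads in a row, striking as it seems, raises a 1% hunch only to about a 24% belief: the evidence matters, but so does where you started.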

"It was this 'problem of the priors'," Papineau reminds us, "that historically turned orthodox statisticians against Bayes:"

They couldn't stomach the idea that scientific reasoning should hinge on personal hunches. So instead they cooked up the idea of "significance tests". Don't worry about prior probabilities, they said. Just reject your hypothesis if you observe results that would be very unlikely if it were true.

"In truth," writes Papineau, "this is nonsense on stilts."

The advent of modern computers has greatly expanded the application of these techniques. Bayesian calculations can quickly become complicated when a number of factors are involved. But in the 1980s Judea Pearl and other computer scientists developed "Bayesian networks" as a graph-based system for simplifying Bayesian inferences. These networks are now used to streamline reasoning across a wide range of fields in science, medicine, finance and engineering.
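The simplification Pearl's networks buy comes from encoding conditional independence, so a joint distribution factors into small local tables. A minimal hand-rolled sketch, using the textbook rain/sprinkler/wet-grass network (the structure and every number here are illustrative, not from the article):

```python
# A minimal Bayesian network, evaluated by brute-force enumeration.
# Structure: Rain -> Sprinkler, and (Rain, Sprinkler) -> WetGrass.
# All probabilities are invented for illustration.

P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: {True: 0.01, False: 0.99},   # P(sprinkler | rain)
               False: {True: 0.4, False: 0.6}}
P_wet = {(True, True): 0.99, (True, False): 0.9,  # P(wet | sprinkler, rain)
         (False, True): 0.8, (False, False): 0.0}

def joint(rain, sprinkler, wet):
    """Chain-rule factorisation that the network structure encodes."""
    p = P_rain[rain] * P_sprinkler[rain][sprinkler]
    p_wet = P_wet[(sprinkler, rain)]
    return p * (p_wet if wet else 1 - p_wet)

# P(rain | grass is wet): sum out the sprinkler, then normalise.
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r in (True, False) for s in (True, False))
print(round(num / den, 4))  # prints 0.3577
```

Real networks avoid this exhaustive enumeration (which is exponential in the number of variables) by exploiting the graph structure directly, which is what makes large-scale Bayesian inference tractable.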

"The vindication of Bayesian thinking is not yet complete," he continues:

No sane recipe can ignore prior probabilities when telling you how to respond to evidence. Yes, a theory is disconfirmed if it makes the evidence unlikely and is supported if it doesn't. But where that leaves us must also depend on how probable the theory was to start with. Thomas Bayes was the first to see this and to understand what it means for probability calculations. We should be grateful that the scientific world is finally taking his teaching to heart.
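Papineau's closing point, that the same evidence leaves different theories in different places, can be seen directly by rerunning the biased-coin arithmetic under different priors (the numbers are assumed for illustration, not from the article):

```python
# Same five-heads evidence, different priors: the posterior tracks the prior.
likelihood_biased = 1.0     # P(five heads | biased coin)
likelihood_fair = 0.5 ** 5  # P(five heads | fair coin)

for prior in (0.001, 0.01, 0.1, 0.5):
    evidence = likelihood_biased * prior + likelihood_fair * (1 - prior)
    posterior = likelihood_biased * prior / evidence
    print(f"prior {prior:>5} -> posterior {posterior:.3f}")
```

The same run of five heads leaves a sceptic (prior 0.001) barely moved and an agnostic (prior 0.5) nearly convinced: evidence shifts belief, but never from nowhere.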

For more, see Sharon Bertsch McGrayne's book on Bayes, The Theory That Would Not Die.

About this Entry

This page contains a single entry by cognitivedissident published on July 5, 2018 2:14 PM.

