Richard Jeffrey

American logician

Quick Facts

Intro: American logician
Places: United States of America
Was: Logician, philosopher
Work field: Philosophy
Gender: Male
Birth: 5 August 1926, Boston, USA
Death: 9 November 2002, Princeton, USA (aged 76 years)
Star sign: Leo
Education: Princeton University; University of Chicago
Awards: John Simon Guggenheim Memorial Foundation Fellowship

Biography

Richard Carl Jeffrey (August 5, 1926 – November 9, 2002) was an American philosopher, logician, and probability theorist. He is best known for developing and championing the philosophy of radical probabilism and the associated heuristic of probability kinematics, also known as Jeffrey conditioning.

Life and career

Born in Boston, Massachusetts, Jeffrey served in the U.S. Navy during World War II. As a graduate student he studied under Rudolf Carnap and Carl Hempel. He received his M.A. from the University of Chicago in 1952 and his Ph.D. from Princeton in 1957. After holding academic positions at MIT, City College of New York, Stanford University, and the University of Pennsylvania, he joined the faculty of Princeton in 1974 and became a professor emeritus there in 1999. He was also a visiting professor at the University of California, Irvine.

Jeffrey, who died of lung cancer at the age of 76, was known for his sense of humor, which often came through in his breezy writing style. In the preface of his posthumously published Subjective Probability, he refers to himself as "a fond foolish old fart dying of a surfeit of Pall Malls".

Philosophical work

As a philosopher, Jeffrey specialized in epistemology and decision theory. He is perhaps best known for defending and developing the Bayesian approach to probability.

Jeffrey also wrote, or co-wrote, two widely used and influential logic textbooks: Formal Logic: Its Scope and Limits, a basic introduction to logic, and Computability and Logic, a more advanced text dealing with, among other things, the famous negative results of twentieth century logic such as Gödel's incompleteness theorems and Tarski's indefinability theorem.

Radical probabilism

In frequentist statistics, Bayes' theorem provides a useful rule for updating a probability when new frequency data becomes available. In Bayesian statistics, the theorem itself plays a more limited role: it connects probabilities that are held simultaneously, but it does not tell the learner how to update those probabilities when new evidence arrives over time. This subtlety was first pointed out explicitly by Ian Hacking in 1967.

It is nevertheless tempting to adapt Bayes' theorem and adopt it as a rule of updating. Suppose that a learner forms probabilities Pold(A & B) = p and Pold(B) = q. If the learner subsequently learns that B is certainly true, nothing in the axioms of probability or the results derived from them tells him how to behave. He might, by analogy with Bayes' theorem, set Pnew(A) = Pold(A | B) = p/q.
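
As a minimal illustration, the following Python sketch carries out that step with purely made-up numbers (p, q and the events A and B are assumptions for the example, not anything from Jeffrey's text):

    # Bayesian updating on *certain* evidence, with illustrative numbers.
    p = 0.15  # P_old(A & B), assumed for the example
    q = 0.30  # P_old(B),     assumed for the example

    # On learning that B is true, Bayes' rule of updating sets
    # P_new(A) = P_old(A | B) = P_old(A & B) / P_old(B).
    p_new_A = p / q
    print(p_new_A)  # 0.5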

In fact, that step, Bayes' rule of updating, can be justified as both necessary and sufficient through a dynamic Dutch book argument, one additional to the arguments used to justify the axioms of probability. The argument was first put forward by David Lewis in the 1970s, though he never published it himself.
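
The force of the argument can be seen in a small worked example. The Python sketch below uses illustrative numbers and a bet construction in the style of the standard Lewis-type diachronic Dutch book (not a quotation of any particular published version): a learner who plans to set Pnew(A) to anything other than Pold(A | B) on learning B can be sold a pair of bets on which he loses come what may.

    # A toy check of the dynamic Dutch book argument; r, q and r2 are assumptions.
    r  = 0.5    # P_old(A | B), the learner's current conditional probability
    q  = 0.3    # P_old(B)
    r2 = 0.4    # the P_new(A) the learner plans to adopt on learning B (here r2 < r;
                # if r2 > r the bookie simply reverses the direction of every bet)

    # At t = 0 the bookie sells the learner:
    #   Bet 1: a conditional bet on A given B -- pays 1 if A & B, pays 0 if not-A & B,
    #          and is called off (the price is refunded) if not-B.  Price: r.
    #   Bet 2: a side bet paying (r - r2) if B occurs.               Price: (r - r2) * q.
    # At t = 1, if B is learned, the bookie buys Bet 1 back at the learner's new
    # fair price r2, before A is settled -- so the outcome of A never matters.
    def learner_net(B_occurs):
        side_bet = (r - r2) * (1.0 if B_occurs else 0.0) - (r - r2) * q
        conditional_bet = (r2 - r) if B_occurs else 0.0   # bought at r, resold at r2
        return side_bet + conditional_bet

    for B in (True, False):
        print(B, round(learner_net(B), 4))  # -0.03 either way: a sure loss of (r - r2) * q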

That works when the new data is certain. C. I. Lewis had argued that "If anything is to be probable then something must be certain". There must, on C. I. Lewis' account, be some certain facts on which probabilities are conditioned. However, the principle known as Cromwell's rule declares that nothing, apart from a logical law, can ever be certain, if even that. Jeffrey famously rejected Lewis' dictum and quipped, "It's probabilities all the way down." He called this position radical probabilism.

Bayes' rule of updating cannot then capture a merely subjective change in the probability of some critical fact: the new evidence may not have been anticipated, or may not even be capable of being articulated after the event. It seems reasonable, as a starting position, to adopt the law of total probability and extend it to updating in much the same way as Bayes' theorem was:

Pnew(A) = Pold(A | B)Pnew(B) + Pold(A | not-B)Pnew(not-B)

Adopting such a rule is sufficient to avoid a Dutch book but not necessary. Jeffrey advocated this as a rule of updating under radical probabilism and called it probability kinematics. Others have named it Jeffrey conditioning.
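
A minimal sketch in Python, again with purely illustrative numbers, shows the rule in action on the two-cell partition {B, not-B}: the old conditional probabilities are held fixed while the probability of B shifts.

    # Jeffrey conditioning (probability kinematics) on the partition {B, not-B},
    # with illustrative numbers.
    p_old_A_and_B    = 0.15   # P_old(A & B)
    p_old_A_and_notB = 0.10   # P_old(A & not-B)
    p_old_B          = 0.30   # P_old(B)

    # The old conditional probabilities, which the update holds fixed.
    p_old_A_given_B    = p_old_A_and_B / p_old_B            # 0.5
    p_old_A_given_notB = p_old_A_and_notB / (1 - p_old_B)   # ~0.1429

    # Uncertain evidence merely shifts the learner's probability of B, say to 0.8.
    p_new_B = 0.8

    # P_new(A) = P_old(A | B) P_new(B) + P_old(A | not-B) P_new(not-B)
    p_new_A = p_old_A_given_B * p_new_B + p_old_A_given_notB * (1 - p_new_B)
    print(p_new_A)   # 0.4285714...
    # With p_new_B = 1 the rule reduces to ordinary Bayesian conditioning on B.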

It is not the only sufficient updating rule for radical probabilism. Others have been advocated, including E. T. Jaynes' maximum entropy principle and Brian Skyrms' principle of reflection.

Jeffrey conditioning can be generalized from partitions to arbitrary condition events by giving it a frequentist semantics.

Selected bibliography

  • Formal Logic: Its Scope and Limits. 1st ed. McGraw Hill, 1967. ISBN 0-07-032316-X
    • 2nd ed. McGraw Hill, 1981. ISBN 0-07-032321-6
    • 3rd ed. McGraw Hill, 1990. ISBN 0-07-032357-7
    • 4th ed., John P. Burgess (editor), Hackett Publishing, 2006, ISBN 0-87220-813-3
  • The Logic of Decision. 2nd ed. University of Chicago Press, 1990. ISBN 0-226-39582-0
  • Probability and the Art of Judgment. Cambridge University Press, 1992. ISBN 0-521-39770-7
  • Computability and Logic (with George Boolos and John P. Burgess). 4th ed. Cambridge University Press, 2002. ISBN 0-521-00758-5
  • Subjective Probability: The Real Thing. Cambridge University Press, 2004. ISBN 0-521-53668-5
The contents of this page are sourced from a Wikipedia article retrieved on 07 Jun 2020 and are available under the CC BY-SA 4.0 license.