Harold Jeffreys on Probability
D Wrinch and H Jeffreys, On Some Aspects of the Theory of Probability, Philosophical Magazine 38 (1919), 715-731. D Wrinch and H Jeffreys, On Certain Fundamental Principles of Scientific Inquiry, Philosophical Magazine 42 (1921), 369-390. D Wrinch and H Jeffreys, On Certain Fundamental Principles of Scientific Inquiry, Philosophical Magazine 45 (1923), 368-374. In their views of probability they were influenced by William Ernest Johnson and John Maynard Keynes. Dorothy Wrinch had, in fact, attended lectures by Johnson. Jeffreys used the ideas from these papers in his book Scientific Inference published by Cambridge University Press in 1931. We give below an extract from Jeffreys' book on Probability.

Probability
by Harold Jeffreys
1. What is probability?
Suppose that a man wishes to catch a train announced to start at 1.00 p.m. When he is a quarter of a mile from the station he looks back and sees that a church clock some distance away indicates 12.55. Will he catch the train?
From previous experience he knows that a quarter of a mile in five minutes means comfortable walking without wasting time. The distance, with slight exertion, can be done in four minutes. Hence he may reasonably expect to catch the train, especially if he hurries slightly. But he has to get a ticket before he will be admitted to the platform. If he finds nobody waiting at the booking office this is a matter of ten seconds; but if there is a queue of ten people it will take two minutes, and he has no means of knowing which will occur in this case. Again, though the church clock is usually reliable, it has been known on a few occasions to be as much as three minutes slow. If that is so on this occasion, and the train is punctual, his chance of catching the train disappears. On the other hand, if the train is a few minutes late, as sometimes happens, he will catch it even if there is a queue and the clock is slow. Further, there is always the possibility of something quite unforeseen, such as an accident on the line. In that event the 11.14 train may arrive at 1.30 and his problem will be solved.
Now we notice that in this situation the man has some definite information, which is relevant to the proposition "he will catch the train". But numerous other possibilities, none of which he can foresee, are also intensely relevant. Therefore his available knowledge, though relevant to the proposition at issue, is not such as to make it possible to assert definitely that this proposition is true or false. Further, extra data will have a definite effect on his attitude to the proposition. If he meets an astronomer whose watch has just been compared with a wireless time signal, and who assures him that the church clock is accurate, he feels more confident. On the other hand, if a crowded omnibus passes him he expects his worst fears about the queue to be verified. Thus the attitude to the proposition under discussion does not amount to a definite assertion of its truth or falsehood; it is an impression capable of being modified at any time by the acquisition of new knowledge.
Probability expresses a relation between a proposition and a set of data. When the data imply that the proposition is true, the probability is said to amount to certainty; when they imply that it is false, the probability becomes impossibility. All intermediate degrees of probability can arise.
The relation of the laws of science to the data of observation is one of probability. The more facts are in agreement with the inferences from a law, the higher the probability of the law becomes; but a single fact not in agreement may reduce a law, previously practically certain, to the status of an impossible one. A specimen of a practically certain law is Ohm's law for solid conductors. Newton's inverse square law of gravitation first became probable when it was shown to give the correct ratio of gravity at the earth's surface to the acceleration of the moon in its orbit. Its probability increased as it was shown to fit the motions of the planets, satellites, and comets, and those of double stars, with an astonishing degree of accuracy. Leverrier's discovery of the excess motion of the perihelion of Mercury scarcely changed this situation, for the phenomenon was qualitatively explicable by the attraction of the visible matter within Mercury's orbit. Newton's law was first shown to be wrong, as a universal proposition, when it was found that such matter could not actually be present in sufficient quantity to account for the anomalous motion of Mercury.
The fundamental notion of probability is intelligible a priori to everybody, and is regularly used in everyday life. Whenever a man says "I think so" or "I think not" or "I am nearly sure of that" he is speaking in terms of this concept; but an addition has crept in. If three persons are presented with the same set of facts, one may assert that he is nearly certain of a result, another that he believes it probable, while the third will express no opinion at all. This might suggest that probability is a matter of differences between individuals. But an analogous situation arises with regard to purely logical inference. One person, reading the proof of Euclid's fifth proposition, is completely convinced; another is entirely unable to grasp it; while there is, at any rate, one case on record when a student said that the author had rendered the result highly probable. Nobody says on this account that logical demonstration is a matter for personal opinion. We say that the proposition is either proved or not proved, and that such differences of opinion are the result of not understanding the proof, either through inherent incapacity or through not having taken the necessary trouble. The logical demonstration is right or wrong as a matter of the logic itself, and is not a matter for personal judgment. We say the same about probability. On a given set of data p we say that a proposition q has in relation to these data one and only one probability. If any person assigns a different probability, he is simply wrong, and for the same reasons as we assign in the case of logical judgments. Personal differences in assigning probabilities in everyday life are not due to any ambiguity in the notion of probability itself, but to mental differences between individuals, to differences in the data available to them, and to differences in the amount of care taken to evaluate the probability.
2. Principles of probability
The mathematical discussion of probability depends on the principle that probabilities can be expressed by means of numbers. This depends in turn on two deeper postulates:
1. All probabilities on given data can be arranged in a single order: of any two probabilities, one is greater than the other or they are equal, and the relation "greater than" between probabilities is transitive.
2. All propositions impossible on the data have the same probability, which is not greater than any other probability; and all propositions certain on the data have the same probability, which is not less than any other probability.
Such an order once established, we can construct a correspondence between probabilities and real numbers, so that to every probability corresponds one and only one number, and so that of every pair of probabilities the less corresponds to the smaller number. When this is done the system of numbers can be used as a scale of reference for probabilities. But the choice is not yet unique. Obviously if x_{1}, x_{2}, ..., x_{n} are a set of positive numbers in increasing order of magnitude, x_{1}^{2}, x_{2}^{2}, ..., x_{n}^{2} are another set, exp(x_{1}), exp(x_{2}), ..., exp(x_{n}) a third, and so on; any such set would serve equally well as a scale, since each preserves the order.
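The non-uniqueness of the scale can be checked mechanically. The following Python sketch (not from Jeffreys' text; the numbers are illustrative) verifies that squaring and exponentiating a set of increasing positive numbers both yield sets in the same increasing order:

```python
import math

# Hypothetical probability-numbers, already in increasing order of magnitude.
xs = [0.1, 0.3, 0.7, 0.9]

# Any strictly increasing function of these numbers preserves their order,
# so each transformed set could serve equally well as a scale.
squares = [x ** 2 for x in xs]
exps = [math.exp(x) for x in xs]

assert squares == sorted(squares)  # x^2 preserves the order
assert exps == sorted(exps)        # exp(x) preserves the order
```

This is why a further convention (fixing the number for certainty, and additivity over exclusive alternatives) is needed before the correspondence becomes unique.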
Again, let us consider any set of m equally probable and mutually contradictory propositions, and call the number attached to any one of them, on the same data, x. If we select any t of them, the number attached to the proposition that one of these t is true is tx, by our rule.
Now take t = m, and suppose that on our data there is just one true proposition among the m, but that we have no means of knowing which it is. The number attached to the proposition that one of the m propositions is true is mx. But on our data this proposition is certain, and therefore mx is the number corresponding to certainty, which is a definite constant by Prop. 2. We therefore choose 1 as the constant to be attached to certainty. This is another convention. Thus mx = 1, and we derive the rule: the number attached to each of m mutually contradictory, exhaustive, and equally probable propositions is 1/m, and that attached to the proposition that one of a selected t of them is true is t/m.
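The rule can be made concrete with the standard example of a die (an illustration added here, not Jeffreys' own). With m = 6 mutually exclusive, exhaustive, equally probable outcomes, each outcome gets the number 1/m, and a selection of t outcomes gets t/m:

```python
from fractions import Fraction

m = 6                    # six mutually exclusive, exhaustive, equally probable outcomes
x = Fraction(1, m)       # the number attached to each single outcome: x = 1/m

# The proposition that some one of all m outcomes is true is certain,
# and by the convention m*x = 1 that certainty corresponds to 1.
assert m * x == 1

t = 2                    # e.g. the proposition "the throw is a 5 or a 6"
assert t * x == Fraction(t, m)   # its number is t/m = 1/3
```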
It follows from this that any probability can be made to correspond to a real number, rational or irrational. Call a probability that corresponds to a rational fraction an R-probability. Any given probability P either is an R-probability or is not. In the former case the result is granted. In the latter case every R-probability is either greater or less than P. Hence P divides the R-probabilities into two classes R_{1} and R_{2}, such that the probabilities in R_{1} are all less than P and those in R_{2} are all greater than P. Also, since the relation "greater than" among probabilities is transitive, every fraction corresponding to an R_{2}-probability is greater than every fraction corresponding to an R_{1}-probability. Hence P determines a cut in the series of rational fractions. But this is precisely the method of defining a real irrational number: when it is specified which rational fractions are on one side of the cut and which on the other side, there is one and only one real number that can occupy the cut. We then associate the probability P with this number. In this way we arrive at the result that every probability can be associated with one and only one real number.
Consistency is therefore proved for R-probabilities. For others the result is easily generalized. For if two non-rational probabilities are associated with real numbers a and b, of which a is the greater, we can find a rational fraction t/m lying between them. Then the probability associated with a is greater than that associated with t/m, and that associated with t/m is greater than that associated with b. Hence, in virtue of the transitive property of the relation "more probable than", the probability associated with a is greater than that associated with b. In other words, the greater number corresponds to the greater probability.
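The cut construction can itself be sketched in a few lines of Python (an illustration added here, with a hypothetical irrational value standing in for P): every rational fraction t/m in [0, 1] falls into a lower class R_{1} or an upper class R_{2}, and every member of R_{1} lies below every member of R_{2}.

```python
from fractions import Fraction
from math import sqrt

# A probability corresponding to no rational fraction; 1/sqrt(2) is a stand-in.
P = 1 / sqrt(2)

def cut_classes(P, max_denominator):
    """Split the fractions t/m in [0, 1] into the two classes of the cut at P."""
    R1, R2 = [], []
    for m in range(1, max_denominator + 1):
        for t in range(0, m + 1):
            f = Fraction(t, m)
            (R1 if f < P else R2).append(f)
    return R1, R2

R1, R2 = cut_classes(P, 50)

# Every fraction in the lower class is less than every fraction in the upper
# class, so the cut determines one and only one real number, namely P itself.
assert max(R1) < min(R2)
assert float(max(R1)) < P < float(min(R2))
```

Refining the cut (increasing the denominator bound) squeezes the two classes ever closer around the unique real number occupying the cut.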
We have seen how definite numbers can be associated with probabilities, so that the higher number always corresponds to the higher probability. In consequence of our fundamental assumption our rules always imply the existence of a definite probability-number. The rules, as we stated before, are conventions and not hypotheses; for if the probability-number assigned by our rules is x, any function of x that always increases with x would satisfy the fundamental assumption. But the choice that we have made seems to be far the most convenient. Henceforth we shall have no need to speak of probabilities apart from their associated numbers, and when we speak of the probability of a proposition on given data we shall mean the number associated with the probability by our rules.
JOC/EFR August 2007
The URL of this page is:
https://www-history.mcs.st-andrews.ac.uk/Extras/Jeffreys_Probability.html