© 2019

6. Balls in Urns

The Ellsberg Paradox

The question:

Suppose you have an urn that contains ninety balls. Thirty are red, and the other sixty are either black or yellow. You do not know how many are black and how many are yellow; there may be zero black balls and sixty yellow ones, one black and fifty-nine yellows, or the other way around, or anything in between. If you correctly guess the color of the ball you draw, you win a prize.

But before you draw, you must answer a question: would you prefer to bet on a red ball or a black ball?

There’s no correct answer, of course. Some people would say ‘black’ but, in general, a majority say ‘red’.[i]

Before they draw the ball out of the urn, a second question is asked, but only of those who answered ‘red’ to the first: would you prefer to bet that the ball will be ‘red or yellow’, or that it will be ‘black or yellow’? Again, there is no correct answer. Some people will opt for one answer, others for the other, but in experiments most choose ‘black or yellow’.


The paradox:

The latter group – be it a majority or just many people – presents a paradox. “If you are in [this group], you are now in trouble …” said Daniel Ellsberg, the author of this paradox. By responding ‘red’ to the first question, and ‘black or yellow’ to the second, they contradict themselves.

What’s the trouble? Well, by responding ‘red’ to the first question, people indicated their subjective belief that there are fewer than 30 black balls in the urn. (There are between 0 and 60 black balls in the urn. So, if one opts for ‘red’, of which there are exactly 30, one implicitly reveals one’s belief that the number of black balls in the urn is fewer than thirty.)

Take note that it is quite irrelevant how many black or yellow balls the urn actually contains. We only talk here about the person’s subjective beliefs.

Now, since they believe that there are fewer than 30 black balls, these people obviously believe that there are more than 30 yellow balls in the urn. (Remember that the blacks and the yellows add up to 60.) Accordingly, together with the 30 red balls, the ‘red or yellow’ balls number more than 60 – while the ‘black or yellow’ balls number exactly 60. Therefore, those people who – after responding ‘red’ to the first question – gave ‘black or yellow’ as an answer to the second, acted against their own beliefs! In the second answer they contradicted their own assessment in the first.
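The arithmetic behind the contradiction can be made explicit in a few lines of Python. This is only an illustrative sketch: the function name and the idea of summarizing a person’s beliefs by a single expected number of black balls, b, are mine, not Ellsberg’s.

```python
def win_probability(colors, b):
    """Subjective probability of winning a bet on the given set of colors,
    assuming the bettor expects b black balls (and hence 60 - b yellow)."""
    counts = {'red': 30, 'black': b, 'yellow': 60 - b}
    return sum(counts[c] for c in colors) / 90

# Any belief consistent with answering 'red' has b < 30; take b = 20.
b = 20
assert win_probability({'red'}, b) > win_probability({'black'}, b)

# But the very same belief forces 'red or yellow' (more than 60 balls)
# above 'black or yellow' (exactly 60 balls) in the second question:
assert win_probability({'red', 'yellow'}, b) > win_probability({'black', 'yellow'}, b)
```

Whatever value below 30 is chosen for b, both assertions hold, so answering ‘red’ and then ‘black or yellow’ is inconsistent with any such belief.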

We have a paradox!



The name Daniel Ellsberg may sound familiar to some readers, though in a totally different context. Ellsberg is the famous leaker of the notorious Pentagon Papers in 1971. Born in 1931 in Chicago, Ellsberg showed great talent as a child…as a piano virtuoso. But his talents were not confined to music: he obtained a Pepsi-Cola scholarship to study economics at Harvard University, from which he graduated summa cum laude. In 1954, he joined the Marine Corps and served for three years as a platoon leader and company commander. Back at Harvard in 1962, he earned a Ph.D. in Economics with a thesis on Risk, Ambiguity and Decision.

Ellsberg transferred to the State Department in mid-1965. Later, at the RAND Corporation, he worked on the top-secret McNamara study of ‘U.S. Decision-making in Vietnam, 1945-68’. To his deep shock, he learned about governmental deception and unwise decision-making, cloaked by secrecy, under four presidents. Unable to garner attention within the US Congress, he photocopied the entire 7,000-page study in the fall of 1969 and eventually leaked it, first to the New York Times, then to the Washington Post, then to seventeen other newspapers.

Ellsberg faced prosecution on twelve federal felony counts but, due to egregious governmental misconduct, the judge declared a mistrial.

* * *

Several years before he became notorious for the Pentagon Papers, Ellsberg had made a name for himself in a totally different context. While still a Fellow at Harvard, he published a paper which showed that there are problems with the way in which people handle probabilities. ‘Risk, ambiguity and the Savage axioms’, published in The Quarterly Journal of Economics, is considered a landmark in the theory of decision making.

The gist of the paper was a test that Ellsberg designed and tried out on his colleagues. It consisted of the two questions asked at the outset of this chapter.



Recall that the first question asked you to choose between ‘red’ and ‘black’. The second question asked you to choose between ‘red or yellow’ and ‘black or yellow’. A rational person, no matter what her belief about the number of black balls, should compare ‘red or yellow’ to ‘black or yellow’ in the same manner as she compares ‘red’ to ‘black’. Adding ‘yellow’ to the question should not influence her choice at all.
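That adding ‘yellow’ cannot rationally flip a preference can be checked mechanically. The short Python sketch below (my own illustration, not Ellsberg’s) runs through every possible composition of the urn and confirms that the ordering of ‘red’ versus ‘black’ always matches the ordering of ‘red or yellow’ versus ‘black or yellow’: the yellow balls cancel out of the comparison.

```python
def sign(x):
    """-1, 0, or +1, depending on the sign of x."""
    return (x > 0) - (x < 0)

# For every possible number of black balls b (0 through 60):
for b in range(61):
    yellow = 60 - b
    first  = sign(30 - b)                        # 'red' vs 'black'
    second = sign((30 + yellow) - (b + yellow))  # 'red or yellow' vs 'black or yellow'
    assert first == second  # adding 'yellow' to both bets never flips the ordering
```

The loop never raises an assertion error, because the yellow count appears on both sides of the second comparison and drops out.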

If you read the chapter about the Independence of Irrelevant Alternatives (IIA, see Chapter ///) or about the Allais Paradox (Chapter ///), it’s déjà vu all over again. All people who chose ‘red’ at first but reversed their choice when the option ‘yellow’ was added violated the IIA axiom.


Technical Supplement:

Several mathematicians and statisticians had proposed that probabilities cannot always be considered objective and that people behave as if they assign their own subjective probabilities to uncertain events. With his test, Ellsberg showed that even these subjective probabilities do not add up.

Are people who behave in that manner irrational? That depends on the definition of rational. In any case, their behavior is inconsistent with the axioms of expected utility theory that was developed by John von Neumann and Oskar Morgenstern. (See also the chapters on the St. Petersburg Paradox and on Gambling vs. Insurance.)

Ellsberg took his colleagues to task, many of them much senior to him, either because of their theoretical work or because they participated in his experiment and chose the conflicting answers. “There are those who do not violate the axioms, or say they won’t, even in these situations…. Some violate the axioms cheerfully, even with gusto… others sadly but persistently, having looked into their hearts, found conflicts with the axioms and decided… to satisfy their preferences and let the axioms satisfy themselves. Still others … tend, intuitively, to violate the axioms but feel guilty about it and go back into further analysis…. A number of people who are not only sophisticated but reasonable decide that they wish to persist in their choices.” The latter group, Ellsberg notes, includes people who felt committed to the axioms and were surprised, even dismayed, to find that when put to the task, they violated them.


Ellsberg believed that these people’s choices were motivated by ambiguity aversion: people seem to prefer taking on risk in situations where they know the odds rather than when the odds are ambiguous. When they choose ‘black or yellow’ they know that exactly 60 of the 90 balls qualify; had they opted for ‘red or yellow’, there could have been anywhere between 30 and 90.
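Ambiguity aversion can be made vivid by computing, for each bet, the range of winning probabilities over all possible urn compositions. In the sketch below (the framing in terms of min–max ranges is my own), the two bets the majority actually picks are precisely the ones whose odds are pinned down exactly, while the two rejected bets have wide, ambiguous ranges.

```python
from fractions import Fraction

def win_range(colors):
    """Min and max winning probability over all possible black counts 0..60."""
    probs = [Fraction(sum({'red': 30, 'black': b, 'yellow': 60 - b}[c]
                          for c in colors), 90)
             for b in range(61)]
    return min(probs), max(probs)

# The majority's choices have fully known odds:
assert win_range({'red'})             == (Fraction(1, 3), Fraction(1, 3))
assert win_range({'black', 'yellow'}) == (Fraction(2, 3), Fraction(2, 3))

# The rejected bets have ambiguous odds:
assert win_range({'black'})          == (Fraction(0), Fraction(2, 3))
assert win_range({'red', 'yellow'})  == (Fraction(1, 3), Fraction(1))
```

An ambiguity-averse chooser picks the bets in the first pair of assertions, even though, as the chapter shows, no single consistent belief about the urn can justify both choices at once.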



[i] The paradox can also be set up for those who answered ‘black’.
