I encountered this problem during the Google CodeJam competition. I'm paraphrasing it briefly here; the actual link is: https://codingcompetitions.withgoogle.com/codejam/round/000000000019ff7e/000...

The idea is: you have n ballpoint pens, each containing between 0 and n-1 units of ink. There is exactly one pen of each value, but the pens are randomly mixed, and the only way to tell which is which is to actually use them until they run out. You want to select two of the pens so that the chance they contain at least n units of ink between them is as high as possible. You can perform any number of tests on any pens, but your only test is to try a pen: either it is out of ink, or it writes (has ink), and the act of writing uses up one unit of that pen's ink.

For a given n (say, n=15), what strategy maximizes the chance that the two pens you choose contain at least 15 units of ink between them? If you just choose two pens at random, the chance is about 46.7%.

The problem was given as a programming problem (as part of a 2 1/2 hour contest), but I think it has enough interesting mathematics to be of general interest here. Google attributes the problem to Ian Tullis.

-tom

--
 -- http://cube20.org/ -- http://golly.sf.net/ --
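P.S. For anyone who wants to verify the 46.7% baseline, here is a quick brute-force check (a sketch in plain Python; the function name is my own). It counts, over all pairs of distinct pens drawn from {0, ..., n-1}, the fraction whose total ink is at least n; for n = 15 it prints 7/15, about 46.7%.

    from fractions import Fraction
    from itertools import combinations

    def random_pair_success(n):
        # Pens hold 0, 1, ..., n-1 units; choose two distinct pens uniformly at random.
        pairs = list(combinations(range(n), 2))
        good = sum(1 for a, b in pairs if a + b >= n)
        return Fraction(good, len(pairs))

    p = random_pair_success(15)
    print(p, float(p))   # 7/15, roughly 0.4667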