Impossible? by Julian Havil @ 02:15 pm
Impossible? Surprising Solutions to Counterintuitive Conundrums
A truly enjoyable book of recreational math, or whatever the field should be called. At times Havil delves more deeply into proofs and demonstrations than even I would like, but most of the book should be accessible to any interested reader. Its real advantage is that, in a field where the same aged chestnuts are printed and reprinted endlessly, it has a high proportion of novel material.
So you are part of a team of three people in the lair of an eccentric millionaire. He distributes red and blue hats onto your team members' heads at random. You can see your teammates' hats, but not your own. You must guess the color of your own hat, though you are allowed to 'pass'. If, as a team, there is at least one correct guess and no incorrect guesses, the eccentric will give you all a million dollars. What should your team's strategy be?
If the three of you each guess red or blue at random, there's a 1 in 8 (12.5%) chance of winning: with no passes, the team wins only if all three guesses are correct, and each hat can be either of two colors, making 2×2×2 = 8 equally likely possibilities.
If the three of you instead pick red/blue/pass at random with equal probability, a longer examination shows that the odds improve to 7/27, about 25.9%. Mixing passes in offers fewer opportunities to be wrong, which helps the odds.
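Both figures can be confirmed by brute force, enumerating every hat configuration against every combination of guesses:

```python
from itertools import product

hat_configs = list(product("RB", repeat=3))

def team_wins(hats, guesses):
    # Win: at least one correct guess and no incorrect guesses.
    right = any(g == h for g, h in zip(guesses, hats))
    wrong = any(g != "P" and g != h for g, h in zip(guesses, hats))
    return right and not wrong

# 1) All three guess red or blue at random: 1/8.
w = sum(team_wins(h, g) for h in hat_configs for g in product("RB", repeat=3))
p_guess = w / (8 * 8)

# 2) All three pick red/blue/pass at random: 7/27.
w = sum(team_wins(h, g) for h in hat_configs for g in product("RBP", repeat=3))
p_pass_mix = w / (8 * 27)

print(p_guess)     # 0.125
print(p_pass_mix)  # 7/27, about 0.259
```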
Taking that to the extreme, you can collude to decide that two of you pass and the third picks red or blue at random. Now your odds are 50% of winning the million dollars. But you can do better than that...
If you see your teammates wearing different colored hats, pass.
If you see your teammates wearing the same colored hats, choose the other color.
If all three of you have red hats, you'll each see two red hats, guess blue, and be wrong; likewise for three blue hats. But the other six cases are wins. If there are two red hats and one blue, the two red-hatted players each see mismatched hats and pass, while the blue-hatted player sees two reds and correctly chooses blue. This raises the odds of winning to 6/8 = 75%. Not too shabby, and fairly surprising when the 'naive' guessing approach gets you only 12.5%.
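The same brute-force enumeration confirms the colluded strategy wins in exactly 6 of the 8 configurations:

```python
from itertools import product

def strategy_guess(i, hats):
    # Look at your two teammates' hats.
    a, b = (h for j, h in enumerate(hats) if j != i)
    if a != b:
        return "P"                       # mismatched hats: pass
    return "B" if a == "R" else "R"      # matched hats: guess the other color

wins = 0
for hats in product("RB", repeat=3):
    guesses = [strategy_guess(i, hats) for i in range(3)]
    right = any(g == h for g, h in zip(guesses, hats))
    wrong = any(g != "P" and g != h for g, h in zip(guesses, hats))
    if right and not wrong:
        wins += 1

print(wins, "out of 8")   # 6 out of 8 = 75%
```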
Even more surprising (to me, anyway) is that if you extend this to larger teams, your odds actually get better! Havil credits Berlekamp of UC Berkeley with showing that for a team of n players, where n is one less than a power of two, a strategy can be devised with winning odds of n/(n+1). So a team of fifteen people, with randomly distributed red and blue hats, would have a 15/16 = 93.75% chance of passing the test.
Unfortunately, the method of choosing is not as simple as in the three-player case. For a team of seven, you could try something similar, say 'If you see 4 red hats, guess blue; otherwise pass', but that only gets you about 55%. The optimal solution involves Hamming codes, which are used for error correction in data transmission. My eyes started to glaze over during Havil's discussion, but the optimal strategy is: 'If the six of the seven hats you can see are one hat away from completing a Hamming codeword (taking, say, red to be 0 and blue to be 1), choose your hat color so that the hats in the room do not form a codeword; otherwise pass.' This gets you 7/8 = 87.5%. Bigger teams call for longer codewords, and this increases the chances of winning the game.
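Here's a brute-force check of that strategy over all 128 hat configurations, using the standard [7,4] Hamming code:

```python
from itertools import product

# A 7-bit word is a [7,4] Hamming codeword iff the XOR of the (1-based)
# positions of its 1-bits is zero; exactly 16 of the 128 words qualify.
def is_codeword(word):
    s = 0
    for pos, bit in enumerate(word, start=1):
        if bit:
            s ^= pos
    return s == 0

def guess(i, hats):
    # Try both possible colors (0/1) for the one hat player i can't see.
    as0 = hats[:i] + (0,) + hats[i + 1:]
    as1 = hats[:i] + (1,) + hats[i + 1:]
    if is_codeword(as0):
        return 1            # pick the color that avoids the codeword
    if is_codeword(as1):
        return 0
    return None             # neither completion is a codeword: pass

wins = 0
for hats in product((0, 1), repeat=7):
    guesses = [guess(i, hats) for i in range(7)]
    right = any(g == h for g, h in zip(guesses, hats))
    wrong = any(g is not None and g != h for g, h in zip(guesses, hats))
    if right and not wrong:
        wins += 1

print(wins, "of 128")   # 112 of 128 = 87.5%
```

The team loses exactly when the true hat arrangement is one of the 16 codewords, in which case everyone guesses and everyone is wrong; in all 112 other cases, only the one player whose flip would complete the nearby codeword guesses, and guesses right.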
Speaking of games, Havil computes the odds of various poker hands when an extra wild card, like a joker, is introduced. In regular poker (no wild), three-of-a-kind is slightly rarer than two pair, and is thus ranked as the higher hand. With one wild added, triplets become more common than two pair. Part of the reason is that the only way to make two pair using the wild card is to hold a natural pair and use the wild to match an unmatched card in the hand. But in that case you should instead use the wild to triple up your pair, rather than make a second pair. So adding a wild card adds no new two-pair hands to the possible results, but it adds a lot of triplets.
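That counting argument can be sketched in code. This is a minimal check, assuming (as the text argues) that a joker held alongside a natural pair always plays as three-of-a-kind — which is safe, since a hand containing a pair can never be a straight or a flush:

```python
from itertools import combinations
from collections import Counter
from math import comb

# Natural (jokerless) counts in a standard 52-card deck:
two_pair  = comb(13, 2) * comb(4, 2)**2 * 11 * 4    # pair ranks, suits, kicker
nat_trips = 13 * comb(4, 3) * comb(12, 2) * 4 * 4   # trip rank/suits, two kickers

# Joker-made trips: joker + a natural pair + two kickers of distinct ranks.
joker_trips = 13 * comb(4, 2) * comb(12, 2) * 4 * 4

# Cross-check that formula by brute force over all C(52,4) four-card
# completions of a joker hand, counting those with exactly one pair.
deck = [(rank, suit) for rank in range(13) for suit in range(4)]
brute = sum(
    1
    for hand in combinations(deck, 4)
    if sorted(Counter(rank for rank, _ in hand).values()) == [1, 1, 2]
)
assert brute == joker_trips

print(two_pair)                 # 123552 two-pair hands, unchanged by the joker
print(nat_trips + joker_trips)  # 137280 three-of-a-kind hands
```

The two-pair count stays put while the trips count jumps past it, which is exactly the inversion described above.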
So perhaps, to make the game 'fair', two pair should be ranked as the higher hand when a joker is added? Unfortunately, if you do that, then anyone holding a pair and a wild will use the wild to make two pair instead of three-of-a-kind, and two pair becomes the more common hand again. It turns out there is no way to rank the hands unambiguously so that higher hands are always less frequent. This is apparently a general feature of all wild-card variations (deuces wild, say, as well as an added joker).
Finally, just a quote. If you've delved into maths as far as 'precalculus', you've probably run across the harmonic series:
1 + 1/2 + 1/3 + 1/4 + ...
Each term gets smaller and smaller, and closer and closer to zero. It seems friendly enough; it should add up to something pretty quick and after a while all the remaining terms'll be so close to zero that they won't matter any more. Havil quotes Borovik's lecture to his students:
Today I said to the calculus students, "I know, you're looking at this series and you don't see what I'm warning you about. You look at it and you think, 'I trust this series. I would take candy from this series. I would get in a car with this series.' But I'm going to warn you, this series is out to get you. Always remember: The harmonic series diverges. Never forget it."
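To make the warning concrete: grouping the terms as (1/2) + (1/3 + 1/4) + (1/5 + ... + 1/8) + ... shows each group after the first adds at least 1/2, so the sum of the first 2^k terms is at least 1 + k/2, and the series diverges. A quick numerical look at just how slowly it does so:

```python
# Partial sums of the harmonic series grow without bound, but only
# about as fast as the natural logarithm of the number of terms.
H = 0.0
n = 0
for k in range(1, 21):
    while n < 2**k:
        n += 1
        H += 1.0 / n
    # The grouping argument's bound: H(2**k) >= 1 + k/2
    assert H >= 1 + k / 2

print(n, H)   # after 2**20 terms, the sum is still only about 14.44
```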
All right, just one more thing, but I'll let wiki explain it, because I would just confuse myself and you trying to explain it: Benford's Law
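Without spoiling wiki's explanation: Benford's Law predicts that in many naturally occurring datasets the leading digit d appears with frequency log10(1 + 1/d), so a leading 1 shows up about 30% of the time. The powers of 2, for example, follow it closely:

```python
from math import log10
from collections import Counter

# Tally the leading decimal digit of 2**1 through 2**1000.
counts = Counter(int(str(2**n)[0]) for n in range(1, 1001))

# Compare observed frequencies with Benford's predicted log10(1 + 1/d).
for d in range(1, 10):
    print(d, counts[d] / 1000, round(log10(1 + 1 / d), 3))
```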