MinnesotaMike
The "Kelly formula" purports to be the gambler's holy grail. It was devised by two scientists at Bell Labs in the 1950s and has been widely used in blackjack and sports betting. It tells how much to bet on any wager in order to make the most money in the long run. The Kelly formula is the subject of William Poundstone's excellent new book, "Fortune's Formula." My one disappointment is that there is nothing about poker in the book. I am trying to work out the proper Kelly formula bet for poker. I thought I'd share what I've got and see if anyone can improve on it.
Poundstone summarizes Kelly's 1956 paper, "A New Interpretation of Information Rate." In brief, Kelly offers this advice:
1. When you don't have a statistical advantage, don't bet. E.g., no betting on slot machines or lotto tickets.
2. When you have an advantage, bet a percentage of your gambling bankroll. This percentage should equal this simple formula: edge/odds (explanation several paragraphs below). The "gambling bankroll" is whatever money you have allotted to play with and hope to increase as efficiently as possible.
3. By following rules #1 and #2 you are guaranteed to end up richer, as time goes to infinity, than you would by following any other rules.
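Rules #1 and #2 can be sketched in a few lines of Python. This is just my reading of the formula, where "edge" is expected profit per dollar wagered and "odds" is the b in a "b to 1" payoff:

```python
# A minimal sketch of the Kelly rule. "edge" is the expected profit
# per dollar wagered; "odds" is the b in a "b to 1" payoff.
def kelly_fraction(edge, odds):
    """Fraction of bankroll to bet; zero when there is no advantage."""
    if edge <= 0:
        return 0.0          # rule 1: no advantage, no bet
    return edge / odds      # rule 2: bet edge/odds of the bankroll

# Even-money bet (1 to 1) with a 3 percent edge: bet 3% of bankroll.
print(kelly_fraction(0.03, 1))   # 0.03
# No edge: stay out entirely.
print(kelly_fraction(-0.02, 1))  # 0.0
```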
As to point 1, the message for poker is clear. Your advantage in poker is always contextual. It will depend on your skill and the skills of the people you play with. Maybe you play with the same five people every Friday night and regularly lose. Then your "advantage" is negative. The Kelly advice is: stop playing with those people. Maybe you do have an advantage with your friends, and wouldn't in the World Series of Poker. Play only where you have an advantage.
In blackjack and horse racing you are free to choose how much to bet within wide limits. The Kelly formula tells you the optimal bet amount. In poker the amount you bet is determined by a multi-person betting process with strategic elements. I might bet $10 and fold, or I might raise several times, investing $80. I could then lose the $80 or win a pot worth several times that.
Running out of money is a common experience in any game of chance, even when you've got a statistical advantage. One of the things the Kelly formula does is to prevent the player with an advantage from running out of money through a streak of bad luck. The player who fails to heed the Kelly advice is "overbetting." He's betting more than is justified by his advantage and the amount of money he has to gamble with. This is a sure way to end up broke.
I believe it would be useful to know how high a wager in poker constitutes overbetting. This would warn you to avoid games where the betting frequently exceeds that, or tell you that you'd better bring more money.
The Kelly formula, edge/odds, works like this. "Edge" is your statistical advantage, as a percent. In poker it's always going to depend on who or what you're playing with. You may find that in playing with a certain pokerbot, over and over, you win an average of $105 for each $100 you bet. Your edge is then 5 percent. In gambling generally, you're lucky to have a few percent edge.
The "odds" term comes from horse racing. It measures the profit assuming you win. Horse race odds are something like 10 to 1, which means that a winning wager pays $10 profit for every dollar wagered. You also get your $1 back. The Kelly formula uses just the first number (10 here), the "to 1" part being assumed.
An even-money bet has 1 to 1 odds. In that case, edge/odds reduces to just edge. The Kelly formula then says to bet a proportion of your gambling bankroll equal to your edge. This is almost the case in blackjack. A card counter who estimates the deck has a 3 percent advantage should bet about 3 percent of his chips on the next hand.
In reality, Poundstone's book says, the effective odds in blackjack are slightly higher than 1. This is because of doubling down and splitting pairs. In situations where these are called for, the optimal player must add to his original bet. This increases the average odds of the game. This element of having to add to your original bet is like the raising in poker.
The odds in poker would work something like this. In a simple game with N players, where everyone bets the same amount x and stays in, the pot is Nx. The winning player's profit is Nx-x. Divide that by how much the winner wagered (x) and the odds are N-1.
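The equal-bet case above can be checked directly:

```python
# Equal-bet case: N players each put x into the pot, one wins.
# Odds = profit per dollar the winner wagered = (N*x - x) / x = N - 1.
def equal_bet_odds(n_players, x=1.0):
    pot = n_players * x
    profit = pot - x
    return profit / x

print(equal_bet_odds(5))  # 4.0 -- five players, odds of 4 to 1
print(equal_bet_odds(2))  # 1.0 -- heads-up, even money
```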
Take a closer-to-realistic game where one person bets increment x and folds; another bets 2x and folds, and so on, until the Nth player raises to Nx and wins the pot. The pot is then x*(1 + 2 + ... + N) = x*N*(N+1)/2. Subtract the Nx the winner has bet to get his profit, and divide by Nx to get the odds. This simplifies to (N-1)/2, exactly half the odds in the less-plausible case above. I imagine that (N-1)/2 is a decent approximation to the Kelly formula odds in an N-person poker hand. Of course, the odds could be much different depending on the situation. In a no-limit game where two players raise each other repeatedly, long after everyone else has folded, the odds approach 1 (since the two raising players are each putting up nearly half the ultimate pot, and it's "double or nothing").
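The escalating-bet arithmetic can be verified the same way; the computed odds match (N-1)/2 for any N:

```python
# Escalating case: player k puts in k*x, everyone folds except the
# Nth player, who has wagered N*x and takes the whole pot.
def escalating_odds(n, x=1.0):
    pot = x * n * (n + 1) / 2       # x * (1 + 2 + ... + N)
    profit = pot - n * x            # pot minus the winner's own money
    return profit / (n * x)         # profit per dollar wagered

for n in (2, 5, 10):
    print(n, escalating_odds(n), (n - 1) / 2)  # the two columns agree
```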
What does this mean? I usually make sure I have about $200 to gamble with. I play with about five people. Optimistically, let's say I have a 5 percent edge. Then the Kelly bet is 5% divided by the odds of (5-1)/2 = 2, or about 2.5 percent of my $200 bankroll. That is a mere $5. When the betting gets above that, and it always does, I'm in the danger zone by the Kelly formula. I run the risk of going broke before getting rich, even with my 5 percent advantage.
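Putting the pieces together with the numbers above ($200 bankroll, five players, an optimistic 5 percent edge, odds approximated as (N-1)/2):

```python
# Worked example from the paragraph above. The 5 percent edge is an
# assumption for illustration, not a measured figure.
bankroll = 200.0
edge = 0.05
n_players = 5
odds = (n_players - 1) / 2        # = 2.0, the (N-1)/2 approximation
kelly_bet = bankroll * edge / odds
print(kelly_bet)                  # 5.0 -- a mere $5
```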
The conclusion is that I should either bring more money or play with fewer or dumber people.
I'm curious to know if there are any real statistics relating the size of the ultimate pot to betting limits, the number of players, and type of game. I haven't had much luck finding this on the web.
Kelly formula links:
Kelly Formula, Bell Labs, Data Transmission, and Optimal Bet Sizing
Get Rich: Here's the Math
Kelly's original paper (very technical)