In probability theory, the expected value (or expectation) of a random variable is the long-run average value over many repetitions (trials) of the experiment it represents.
The formula for expected value is EV = x1*p1 + x2*p2 + x3*p3 + ... + xk*pk, where x1, x2, x3, ..., xk are the possible outcomes of the experiment, each occurring with probability p1, p2, p3, ..., pk. Sounds complicated, right? Let me explain with an example.
When we're tossing a coin, the possible outcomes of the experiment are {Heads, Tails}, each with a probability of {1/2, 1/2}; i.e. the coin has a 50% probability of coming up heads (1/2 of the time) and a 50% probability of tails (1/2 of the time). Simple. In this case, we cannot calculate the EV directly, because we cannot multiply an outcome by its probability (i.e. we cannot multiply Heads by 1/2 or Tails by 1/2).
Let's take another example: say we're rolling a die. The possible outcomes of the experiment are {1, 2, 3, 4, 5, 6}, each outcome having a probability of 1/6. Assuming a fair die, the EV = 1*1/6 + 2*1/6 + 3*1/6 + 4*1/6 + 5*1/6 + 6*1/6 = 3.5.
Now this doesn't mean that in any given trial we'll get a value of 3.5; that's simply not possible, because 3.5 is not an outcome of the experiment. It simply means that if we were to roll the die an infinite number of times, the average value of the outcomes would be close to 3.5.
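The formula is just a weighted sum, so it's easy to sketch in a few lines of Python (the function name here is my own, not something from any poker text):

```python
def expected_value(outcomes, probabilities):
    """EV = x1*p1 + x2*p2 + ... + xk*pk: weight each outcome
    by its probability and sum."""
    assert abs(sum(probabilities) - 1) < 1e-9, "probabilities must sum to 1"
    return sum(x * p for x, p in zip(outcomes, probabilities))

# Fair die: outcomes 1..6, each with probability 1/6
print(round(expected_value(range(1, 7), [1/6] * 6), 2))  # 3.5
```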
Great, but how does it translate to a gambling situation?
Let's take a simple example: say you and your friend have decided to wager $10 on the outcome of a coin toss. If heads comes, your friend pays you $10; if tails comes, you pay your friend $10. As we've already seen, the possible outcomes of the experiment are {Heads, Tails}, each with probability 1/2. Remember we couldn't simply find the expectation of that experiment earlier (we were unable to multiply "Heads" or "Tails" by 1/2, a number), but in this experiment we've replaced each outcome with a monetary figure (a number). So if heads comes, you win $10, and if tails comes, you lose $10. The outcome of the experiment from your perspective is therefore {+10, -10}, and this can be multiplied by the probabilities.
Assuming a fair coin, EV = 10*1/2 + (-10)*1/2 = 0.
So the expected value of the game is $0, i.e. in the long run (running the experiment an infinite number of times) you expect to break even (no gain, no loss). This doesn't mean that in a particular trial you'll have a $0 profit; in each trial you'll either win $10 or lose $10.
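The coin-toss wager is the same weighted sum, just with monetary outcomes — a throwaway sketch:

```python
# The $10 coin-toss wager: win +$10 on heads, lose $10 on tails,
# each with probability 1/2 for a fair coin.
outcomes = [10, -10]
probabilities = [0.5, 0.5]
ev = sum(x * p for x, p in zip(outcomes, probabilities))
print(ev)  # 0.0
```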
You with me so far?
How do we take this concept into poker?
Again, let's start with a simple example: say you're playing Heads-Up and each player has 10 big blinds. The SB posts 0.5bb and you, sitting in the BB, post 1bb, before any card is dealt. Suppose the SB goes all-in, you look at your cards, and you see AA. As you may or may not know, the best hand against AA is 65s, which has an equity of 23%. Let's say you know for sure that he has 65s, and you've already decided to make the call (as you should); what is your expected value from the game?
EV = 0.77*11 + 0.23*(-9) = +6.4bb, where 11bb is what's already in the pot (his 10bb plus your 1bb blind) and 9bb is what you have to invest to make the call.
So in the long run, you expect to win +6.4bb every time you make this call.
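The call above can be sketched as a small function (the names are illustrative): you win the 11bb pot 77% of the time and lose your 9bb call the other 23%.

```python
def call_ev(equity: float, pot: float, to_call: float) -> float:
    """EV of calling: win the pot with probability `equity`,
    lose the call amount with probability 1 - equity."""
    return equity * pot + (1 - equity) * (-to_call)

# AA vs 65s: ~77% equity, 11bb in the pot, 9bb more to call.
print(round(call_ev(equity=0.77, pot=11, to_call=9), 2))  # 6.4
```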
Okay, I get the concept, but why should I bother with all these calculations?
Because poker is a multi-street game and you have a decision to make, and you cannot make a decision without carefully considering the different options (Call/Fold/Raise) in front of you. Most beginners try to make the decision based on the number of outs they have: if I have a flush draw, I have 9 outs, which roughly translates to 36% by the river. The problem with this is that you're already assuming the opponent's hand, a particular hand or even a small set of hands, e.g. assuming on a given board that he holds a J or a 10. That might or might not be correct, but it simplifies the problem. More advanced players try to range the opponent, give him a particular set of hands, and calculate their equity against that range. Now, there's no problem with equities, but they don't give us a clear answer to whether we should call, fold or raise. That's where EV comes into play: EV tells us which action to choose from the multiple options in front of us.
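The outs shortcut above (roughly 2% of equity per out per card to come, so 9 outs ≈ 36% with two cards to come) can be sketched and checked against the exact combinatorial figure; this is just an illustration of the rule of thumb, with my own function names:

```python
from math import comb

# Rule of 2 and 4: each out is worth roughly 2% per card to come,
# so a 9-out flush draw on the flop is ~36% to hit by the river.
def rough_equity(outs: int, cards_to_come: int) -> float:
    return outs * 0.02 * cards_to_come

# Exact figure: 47 unseen cards on the flop; probability that at
# least one of the next two cards is an out.
def exact_equity(outs: int, unseen: int = 47) -> float:
    return 1 - comb(unseen - outs, 2) / comb(unseen, 2)

print(round(rough_equity(9, 2), 2))  # 0.36
print(round(exact_equity(9), 4))     # 0.3497
```

The rule of thumb overshoots slightly (36% vs ~35% exact), which is close enough for decisions at the table.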
I do understand that it's tough to do these kinds of calculations at the table; you really need a good brain to be able to do them on the go. And as we know, most of us aren't that fortunate, so what we can do instead is make these kinds of calculations off the table and use them while playing.
The whole point of this post is to make you understand that the game has evolved and the players (opponents/playing field) have become much more knowledgeable (tougher). To stay in the competition, you need to be able to stand on the same plank as the competition; it's the only way you'll ever be able to beat them, unless of course you're just blessed by the Luck-God/Goddess.
Hope it makes sense.
Disclaimer: None of the ideas I've presented here is new; they've been discussed over and over again in multiple materials by multiple authors.