We've talked about using probability to determine what to believe. Such as whether to believe that a certain form of contraception will work, or will probably work. Or whether to believe that you have cancer when a doctor has done a test and you got a positive result, which can be pretty scary. But you still have to figure out whether to believe that you really do have cancer. Now, probability can be very useful in the formation of beliefs. But it's also extremely useful in making decisions. So, in this lecture, we want to talk about the use of probability to make decisions or choices. There are three kinds of decisions or choices that we need to discuss. First of all, decisions with certainty. Second, decisions with risk. And third, decisions with ignorance, or, as those are sometimes called, decisions with uncertainty. We'll talk about the first two, and then I'll just say a few words about the third. First, let's look at decisions with certainty. A decision with certainty is one where you know exactly what's going to happen: you know what the effects are going to be, and you know it definitely. It's not that it's probably going to happen; you know what's going to happen. You know the good effects and the bad effects. So there's no uncertainty; everything is certain. For example, suppose you go to an Italian restaurant with a friend of yours, and you have to decide whether to order pasta or pizza. There's some really nice eggplant Parmesan on top of pasta that you like at this restaurant. They also have really good mushroom pizza. So you have to think about that, but you know that if you order the eggplant Parmesan pasta, they're going to bring it to you. And you know that if you order the mushroom pizza, they'll make one for you. And you've got enough money for both of them, so there's no uncertainty. You know what you're going to get.
It still might be a difficult choice because, well, you might not be sure whether you feel more like the eggplant Parmesan pasta tonight or the mushroom pizza. And there's also potentially going to be a conflict of values. Maybe the eggplant Parmesan pasta costs more than the pizza, in which case you have to decide: even though I like it more, do I want to spend that money right now? You've got a conflict between what you feel like and how much it's going to cost. Or there might be a conflict with your friend. Your friend wants to buy the mushroom pizza and split it with you, because he doesn't want an entire pizza himself. But you really feel like the eggplant Parmesan. So now you've got a conflict involving your loyalty to your friend: you want to support your friend and help your friend get what he wants, and yet you feel like the eggplant Parmesan yourself. So these decisions can be very difficult in some cases, but there's no uncertainty. You still know that if you order the eggplant Parmesan you're going to get eggplant Parmesan, and if you order the pizza you're going to get pizza. Right? There's no uncertainty at all. No, of course there's uncertainty. You might order the pizza and it comes back burnt. Yuck. I hate burnt pizza. And you might order the eggplant Parmesan and it turns out that they really messed up the sauce, and it's got way too much salt in it tonight. Maybe they have a new chef in the kitchen who's going to mess up one but not the other. It's not really certain that when you order it, you're going to get what you wanted or that you're going to like it. Now, this is just a fact about life. Nothing is really certain in this life. You might think that you know exactly what's going to happen, but there's always at least some chance that it's not going to turn out the way you thought. So what do you do about that? Well, some people just stipulate: if I order the pizza, I'll get the pizza, and it'll be good pizza.
Well, you can of course stipulate that, but then it just makes the whole situation unrealistic, because in real life you're never going to know for certain. So another possibility is to say: we're just going to ignore these slight chances that the pizza's going to come back burnt. I've been to this restaurant a hundred times; they've always done it just right before. So what are the odds they're going to burn it this time? Pretty slim. There's a 99% chance they're going to do it the right way. Well, you might want to say: I'm just going to ignore the 1% chance of getting a burnt pizza, because it's not going to make any difference. I still want the pizza, or I still want the eggplant, whichever. The other thing you might do is say: look, nothing's certain, but the probabilities are balanced. They go on both sides. After all, there's a chance of burning the pizza, and there's also a chance of burning the eggplant. So in some cases it might make perfectly good sense to ignore these uncertainties, because they're not going to affect your decision-making process. Still, it's important to realize that throughout life, in all areas of life, there's always going to be some uncertainty. And that means there's going to be a probability that, if you take it seriously, is going to play into the decision-making process. So the kinds of cases that we want to focus on are cases of risk, where there is a probability of failure and a probability of success for the different plans you have to choose among. All realistic examples, then, are actually decisions under risk, especially the most important ones. So what is a decision under risk? It's simply a case where you're not absolutely certain what's going to happen. It's not that this effect will definitely occur and that effect will definitely occur: you'll get your pizza and it'll taste good, or you'll get your pasta and it'll taste good. Instead, there's a probability of each outcome. There's a probability that the pizza will get to the table.
There's a probability that the pizza will taste good. There's a probability that the pizza will never make it to the table but get dropped by the waiter, or that the pizza will taste horrible because it was burnt. Then you have a decision under risk. And how do we make these decisions under risk? Well, there are a lot of different theories. But one pretty common one is to apply expected value theory. Now, we're going to see that there are lots of different values, and it's hard to see how to handle them all, but let's start with a really simple case. Let's assume for now that all that matters is money. One common way, perhaps the most common way, of figuring this out is to calculate the expected monetary value, or the expected financial value. And that calculation is pretty simple. It's the probability of winning times the net gain if you do win, minus the probability of losing times the net loss if you do lose. And here we're talking about gains and losses in terms of money. But why do we say net loss and net gain? Imagine you buy a lottery ticket, and you win $20 million, and they send you a check for $20 million. Well, how much did you win? It sounds like you won $20 million, but not really. You never got back the dollar that you paid for the ticket. So really, you only won $19,999,999. Of course, that's pretty trivial if you just won the lottery. No big deal. But now imagine a different circumstance. You're playing poker and you bet $10. Your opponent raises it to twenty. And you're bluffing, so you want to bet big: you bet $200 and put that all in the pot, too. Your opponent backs down and you pull in the entire pot. How much has it got in it? $230. Well, did you win $230? No, because you put in $210. You only won twenty. The net winnings are how much you pull in minus how much you put in before you pulled in your winnings. So it's the $230 in the pot minus the $210 you put in: you only gained twenty.
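As a quick sketch (not part of the lecture itself), the expected-value formula and the net-winnings point can be written out in Python; the function name and variable names here are just illustrations:

```python
from fractions import Fraction

def expected_value(p_win, net_gain, net_loss):
    """Probability of winning times net gain, minus probability of losing times net loss."""
    p_lose = 1 - p_win
    return p_win * net_gain - p_lose * net_loss

# Net winnings in the poker hand: the pot holds $230,
# but $210 of it was your own money, so you only gained $20.
pot = 230
put_in = 210
net_winnings = pot - put_in
print(net_winnings)  # 20
```

Using exact fractions rather than floats keeps the later card-deck probabilities (quarters and thirteenths) exact instead of approximate.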
A lot of people forget that in poker, and they end up thinking they win a lot by bluffing when actually they didn't win as much as they thought they did. In any case, we have to look at the net gain and the net loss when applying this formula for expected financial value. Let's do a few examples of calculating expected financial values, and let's do them with a deck of cards, because we're familiar with that. There are thirteen different cards in each of four different suits: spades, hearts, diamonds, and clubs. We have to remember that to calculate the probabilities. Let's imagine that you can make this bet: if you bet a dollar, you'll win $5 net if you pick a spade. Now, notice that it's net. So that means either you put a dollar in and then take $6 out, and your net gain is five; or you say, if it's not a spade I'll give you a dollar, and if it is a spade you give me $5. Either way, if you pick a spade, you're going to be $5 up. What's the expected financial value of this bet? Well, we know that the odds of picking a spade are one in four. That's the probability of winning, because if you pick a spade you'll win. And the net gain is $5. So, to calculate the expected financial value, we need to multiply this one quarter, the probability of picking a spade, times five, the number of dollars that you win net if you win. And then we have to subtract the probability of losing, three quarters, because if you pick a heart, a diamond, or a club, then you lose. And what do you lose? You lose $1. So, one quarter times five is five quarters; three quarters times one is three quarters; subtract three quarters from five quarters and you get two quarters, which of course is one half. So the expected financial value of this bet is one half. Now, compare that bet to another bet. In this bet, if you bet a dollar, you're going to win ten. That sounds pretty good. Instead of winning five, you're going to win ten. But you only win $10 net if you pick an ace.
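The spade-bet arithmetic above can be checked step by step with exact fractions; a minimal sketch (the variable names are mine, not the lecture's):

```python
from fractions import Fraction

p_win = Fraction(1, 4)   # 13 spades out of 52 cards
p_lose = 1 - p_win       # 3/4: a heart, diamond, or club
net_gain = 5             # dollars, net, if you pick a spade
net_loss = 1             # dollars, net, if you don't

# 1/4 * 5 = 5/4, minus 3/4 * 1 = 3/4, leaves 2/4 = 1/2.
ev = p_win * net_gain - p_lose * net_loss
print(ev)  # 1/2
```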
And there are thirteen different cards, so we have to redo the calculations. The probability of picking an ace is one in thirteen. The amount you win is ten; that's your net winnings, remember. You have to subtract the probability of losing, twelve in thirteen, because you're going to lose if it's any card other than an ace. And the amount you'll lose is $1 net. So 1/13 * 10 = 10/13. Subtract 12/13 * 1, which is 12/13, and you get -2/13. Notice that's negative: -2/13. That means the expected financial value of that bet is negative. You ought to expect to lose money over the long run if you play that kind of bet a lot. Third example. This time you can bet a dollar and win $51 net if you pick the ace of spades. And you can't pick any old ace or any old spade; it's got to be the ace of spades. So now, what's the expected financial value of that bet? It's the probability of winning, one in 52, times the amount you win net, which is 51, minus the probability of losing, 51 in 52, times the amount you lose net if you lose, which is $1. So 1/52 * 51 = 51/52, and 51/52 * 1 is 51/52. Subtract them and you get zero. So in this bet, the expected financial value of the bet is zero. Okay? Now, we can compare these bets by asking which bet you should make. Well, the first bet was favorable. A bet is favorable if its expected value is greater than zero, and that means it's in your interest to bet, at least if your only interest is money. A bet is unfavorable if its expected value is less than zero. And a bet is neutral, or fair, if its expected value is zero. And if you have a bet that's favorable, a bet that's unfavorable, and a bet that's neutral, it shouldn't be surprising that the one you should pick is the favorable one, where the expected value is greater than zero, if you're going to bet at all. And that's the next question we have to turn to.
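To tie the three bets together, here is a sketch (again, not from the lecture) that recomputes each expected value and classifies it by its sign; the helper names are illustrative:

```python
from fractions import Fraction

def expected_value(p_win, net_gain, net_loss):
    """P(win) * net gain - P(lose) * net loss, all in dollars."""
    return p_win * net_gain - (1 - p_win) * net_loss

def classify(ev):
    """Favorable above zero, unfavorable below zero, fair at exactly zero."""
    if ev > 0:
        return "favorable"
    if ev < 0:
        return "unfavorable"
    return "fair"

spade = expected_value(Fraction(1, 4), 5, 1)            # 1/2
ace = expected_value(Fraction(1, 13), 10, 1)            # -2/13
ace_of_spades = expected_value(Fraction(1, 52), 51, 1)  # 0

print(classify(spade), classify(ace), classify(ace_of_spades))
# favorable unfavorable fair
```

Run many times, a favorable bet gains money on average, an unfavorable one loses it, and a fair bet breaks even, which is why the favorable spade bet is the one to pick.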