If you take all the results of cheating I have described to you up to now, the real fundamental issue here is conflict of interest. What happens when we are paid to do things that are slightly different from the truth? Do we see the world in a different way? Are we biased, and do we act on our bias? The answer, of course, is yes.

Now, the truth is that we all understand conflicts of interest. We all understand that if we go to a basketball game as big, fanatical fans of one team, and the referee makes a call against our team, we will think the referee is evil, vicious, blind, stupid. We all understand that in sports, conflicts of interest are everywhere, and, you know what, maybe they even make the game more fun. But ask yourself: would you go in front of a judge if you knew the judge was getting paid 5% of the verdict they would pass? Probably nobody would take that deal. Nevertheless, we have tremendous conflicts of interest. We have conflicts of interest with our mechanics, our lawyers, our accountants, and of course our governments. And of course in the financial industry, both financial advisers and bankers have tremendous conflicts of interest.

So the question is, why do we let it be? If we understand that people are blinded by conflicts of interest, shouldn't we do more to eliminate some of them? People do understand, to some degree, that conflicts of interest are important and worth trying to eliminate. But most of the effort toward elimination has gone into what are called sunshine policies: disclosure. The idea is that if I am your banker and I have a conflict of interest, I should just tell you about it; I should just put it on the website. And the moment you know that I have a conflict of interest, you will know how to discount my opinion.

There was a very nice experiment on this. Imagine two people: one is the adviser and one is the advisee. The adviser sees a jar of money from close up, and the advisee sees the same jar from far away. So the adviser has better information than the advisee about how much money is in the jar. In one condition there was no conflict of interest: the adviser gave the advisee a piece of information suggesting what to guess, the advisee guessed, and the closer the advisee's guess was to the real value, the more money they both got paid. In another condition there was a conflict of interest: all of a sudden, the adviser got paid more the more the advisee overestimated the amount. If the advisee underestimated, the adviser got nothing; the more the advisee overestimated, the more the adviser got paid.
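To make that incentive structure concrete, here is a minimal sketch in Python of the two pay schemes as just described. The dollar amounts, rates, and function names are invented for illustration; they are assumptions, not the actual parameters of the study.

```python
# Hypothetical payoff rules for the two conditions described above.
# All dollar amounts are invented; the real study used its own pay scale.

TRUE_VALUE = 10.00  # actual amount in the jar, seen clearly only by the adviser

def aligned_payoffs(guess: float) -> tuple[float, float]:
    """No conflict: both sides earn more the closer the guess is to the truth."""
    accuracy_bonus = max(0.0, 5.00 - 0.50 * abs(guess - TRUE_VALUE))
    return accuracy_bonus, accuracy_bonus  # (adviser, advisee)

def conflicted_payoffs(guess: float) -> tuple[float, float]:
    """Conflict: the adviser earns nothing on underestimates and more the
    bigger the overestimate; the advisee is still paid for accuracy."""
    overshoot = guess - TRUE_VALUE
    adviser = max(0.0, 1.00 * overshoot)
    advisee = max(0.0, 5.00 - 0.50 * abs(overshoot))
    return adviser, advisee

for guess in (9.00, 10.00, 12.00, 15.00):
    print(guess, aligned_payoffs(guess), conflicted_payoffs(guess))
```

Under the aligned scheme the adviser's best move is to report the truth; under the conflicted scheme, every dollar of exaggeration the advisee accepts is money in the adviser's pocket.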
What happened? The adviser acted on the conflict of interest: they recommended higher numbers, and indeed the advisers made more money and the advisees made less money.

But here's the interesting thing. What would happen if we added a sunshine policy, if we added disclosure? Now the adviser had to say: by the way, just so you know, I get paid more money if you exaggerate the amount. What would happen then? What we usually assume is that the moment the adviser discloses, the advisee will discount their opinion, maybe by 10% or 15%. But what could also happen is that the adviser, knowing that their advice will be discounted, will exaggerate that advice even more. So the question is which of those forces will be stronger: will the adviser exaggerate more, or will the advisee discount more? The results showed that the adviser indeed exaggerated more, and the advisee did not discount enough. So when the adviser was disclosing, the adviser actually got more money and the advisee got less money.

So conflicts of interest are everywhere. They're important, they're big, they're fundamental. They change people's view of the world in important ways, cause them to give biased advice, and lead them to behave in dishonest ways. And sunshine policies and disclosure don't seem to fix the problem.

Now on to a couple of my own personal stories about conflicts of interest. I was burned many years ago, and I spent a long time in hospital. About five or six years after I left the hospital, I came back to the burn department, and the head of the department found me and said: Dan, I have a fantastic new treatment for you, come with me. That's not something you hear too often. I went with him to his office. Now, the right side of my face is burned and the left side is not, so when I shave I have little black dots on the left side of my face, but on the right side there are no black dots, because I was burned. What was his solution?
He was going to tattoo little black dots on the right side of my face as well. So he said: why don't you go home, shave as close as you want to, come back, and I'll do the tattooing? I went home and shaved, and I shaved rather carefully, because whatever I shaved that day I would have to shave for the rest of my life. I went back to his office and said, you know what, can I see some pictures of this procedure from the other people you've done it on? He said yes, we've done it on two people; here are the pictures, but I can't show you the whole face, I'll show you just the cheek. And sure enough, these were cheeks with little dots. Then I asked: what happens when I grow older and my hair turns white? He said, oh, don't worry about it, we can laser the dots out when the time comes. I said, you know what, I don't feel so comfortable with this; let me not do it for now. Maybe I'll come back next month or so, but for now I don't want to do it.

And the next part was something I completely didn't expect. He said: Dan, what's wrong with you? Don't you want to be symmetrical? Do you get some distorted pleasure from looking different? Do women feel sorry for you because you look so odd, and give you sympathy sex? Which, by the way, never happened. He gave me this tremendous guilt trip. Now, I'm Jewish, I'm used to guilt trips, but this was really extreme. He had also been my physician for three years in hospital, and he was a fantastic physician. By the way, he did this little eyebrow here; this eyebrow took him nine hours of operations. He took two blood vessels that were going up my skull, isolated a piece of skin along with the blood vessels, and redirected them, because if the skin doesn't have a blood supply, there's a good chance it won't function well. So this was a man who had spent nine hours in the operating room fixing half of my eyebrow. He clearly cared a lot about aesthetics, and he also cared a lot about me. But this guilt trip was not something I was used to.

So I left him, quite shocked, and went to his deputy to ask what was going on. Well, he told me: they had done the procedure on two patients already, and they were looking for a third patient for an academic paper. And I was kind of an ideal patient:
half the face burned, half not; symmetrical in a way, or non-symmetrical if you prefer. That made me a very good candidate for his paper. Now, he was a fantastic, dedicated physician. He had spent an amazing number of hours with me, he cared a lot about me, and I knew him very well. But nevertheless, at that moment he was blinded. He thought that something that was in his best interest was also in mine. And I think this actually happens to all of us: when we are so engrossed in what's good for us, we feel more comfortable demanding it from other people, without realizing how the conflict of interest is working on us.

Of course, there are also cases where I have my own conflicts of interest. In one example, I ran an experiment at Harvard. The people at Harvard are very nice to let me use their lab from time to time, and on that particular occasion we ran an experiment in which we hoped one group would have a high mean and the other group a low mean. We basically got that, aside from the fact that one person in the group I was hoping would be highest had the lowest possible result you can imagine. He pulled down the mean and changed the variance; it was awful. So I looked carefully at this one annoying participant, and he was almost 25 years older than anybody else in the sample. And I remembered: there was one older gentleman who came to the experiment drunk. The lab at Harvard is right by the street, people come in off the street, and this guy came to the experiment drunk. Take him out! Who wants drunk people in the study? Of course, let's kick him out. What happened? Boom. We kicked him out, and the results looked beautiful.

But two days later, in our lab meeting, we thought about this case. We asked: what would have happened if, by luck, this one drunk individual had not been in that group but in the other group, the one we expected to have a low mean? He would have pushed that mean even lower, in the direction we hoped it would go. Would we even have noticed him in that case? The answer was probably no; the results would have looked just beautiful, and we wouldn't have taken him out. And even if we had noticed him, noticed he was the lowest, and noticed that he was drunk,
now he would be helping our results. Would we have taken him out? Probably not. Maybe we would even have thought: ooh, this is good, let's start getting more drunk people into our samples.

And the issue here, what I realized, is that I was my own worst enemy. When I originally took his data point out, I thought I was helping science along. I thought I had found the real pattern of the data and was eliminating something so that reality could shine through; I was clearing the path. But in reality, that's not what I was doing. I had an idea of what the data should look like, and I used my creativity to find an excuse, a rationalization, for why this thing that was bothering me should get out of the way. I might have been able to find all kinds of other rationalizations: had it been a different age, or ethnicity, or something else, I could have found a reason just as easily.

So the point of rules about conflicts of interest is basically to protect us against ourselves. We ourselves are likely not to see reality correctly, and we need to protect ourselves. And now I have very strict rules, and I tell my students that we have very strict rules. It's okay not to have drunk people in the study, but you can only decide that before the study; you can't announce it afterward, and especially not after looking at the data. It's okay to say we don't want people who don't understand the instructions, but again, you can only decide that up front, not later. These rules are there to protect us against ourselves, and without them we would do very badly.
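As a minimal sketch of what such a rule looks like in practice, here is one way to encode exclusion criteria in Python so that they are fixed before the data arrive and applied blindly to every participant. The field names and criteria are hypothetical, not the lab's actual procedure.

```python
# Exclusion criteria declared BEFORE the study runs; never edited after
# seeing the results. Field names and rules here are hypothetical.
PREREGISTERED_EXCLUSIONS = [
    lambda p: p["intoxicated"],           # drunk participants
    lambda p: not p["understood_task"],   # failed the comprehension check
]

def filter_participants(participants):
    """Apply the pre-registered criteria to everyone, regardless of which
    condition they were in or how their score affects the group means."""
    return [p for p in participants
            if not any(rule(p) for rule in PREREGISTERED_EXCLUSIONS)]

sample = [
    {"id": 1, "intoxicated": False, "understood_task": True, "score": 7},
    {"id": 2, "intoxicated": True,  "understood_task": True, "score": 1},
]
print(filter_participants(sample))  # id 2 is dropped by the rule, not by his score
```

The design point is that the filter never looks at the outcome variable, so there is no room for the experimenter's hopes about the means to sneak in.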
So what does this tell us about conflicts of interest more generally? It tells us that conflicts of interest basically operate within the fudge factor. Every time there's a grey zone about what exactly is okay and not okay, conflicts of interest will push us in the direction of what is good for us financially, good for our company, good for our friends. And the worst thing about it is that we don't notice it in ourselves. There are lots of biases in decision making, and among the most interesting are the ones we don't notice operating in ourselves. Conflict of interest is one of those: it works on us, it changes our view, it changes our view fundamentally, and we don't notice that the change is there because somebody is paying us.

I'll end by telling you about one last experiment. Imagine you come to a lab and we show you some art. Some of it comes from a gallery called Lone Wolf, and some from a gallery called Blue Moon. And I tell you that one of those galleries, let's say Lone Wolf, is the one paying the $100 for your participation in this study. Then I put you in an fMRI scanner, so your brain is being scanned, I show you these images, and you have to tell me how much you like them. The first thing that happens, when you look at the results, is that people like the art from their sponsoring gallery more. You could say, well, maybe people are just being nice: this gallery gave me money for the experiment, so I'll repay them by saying I like their art more. But what was more interesting is that the pleasure centers of the brain also reacted more strongly to art from the sponsoring gallery, and that extra response grew as the gallery gave the individual more money. I think this is really what happens with conflicts of interest. We feel indebted to somebody, to something, to some cause, to some goal, or to our own pocket, and it dramatically changes how our brains operate and how we view information. And because of that, we really need to fix it.

If you understand conflicts of interest, you will be very disappointed with what we've done about the financial crisis. I think the financial crisis was basically a problem of conflicts of interest. Imagine I offered you $5 million a year to view mortgage-backed securities as a better product. Don't you think you would be able to do it? And I don't mean that you would lie, that you would say: I know these are shitty products, but I'll tell my clients they are good. Don't you think that with so much money on the line, your own opinion would be shaded? And what would happen if the securities were harder to evaluate, if they were complex? Imagine sitting in front of a big Excel spreadsheet full of parameters and estimates, with the value of the mortgage-backed security at the bottom, a value that also drives your bonus. Don't you think you would play with the numbers a little, multiple steps removed from the money, to get that bottom-line number a bit higher?
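To see how little shading it takes, here is a toy valuation in Python: a simple discounted stream of payments where each assumption is nudged only slightly, and each nudge is defensible on its own, yet together they inflate the bottom line noticeably. All the numbers and the formula are invented for illustration; real mortgage-backed securities were valued with far more elaborate models.

```python
# Toy present-value model of a payment stream that may default each year.
# All parameters are invented for illustration.

def value_security(annual_cash: float, years: int,
                   default_rate: float, discount_rate: float) -> float:
    """Sum of yearly payments, shrunk by survival odds and discounting."""
    value, survival = 0.0, 1.0
    for t in range(1, years + 1):
        survival *= 1.0 - default_rate           # chance the loans still pay
        value += annual_cash * survival / (1.0 + discount_rate) ** t
    return value

conservative = value_security(100.0, 10, default_rate=0.05, discount_rate=0.07)
shaded       = value_security(100.0, 10, default_rate=0.03, discount_rate=0.06)
print(f"conservative assumptions: {conservative:7.2f}")   # ~551
print(f"slightly shaded:          {shaded:7.2f}")         # ~634, about 15% higher
```

Two small tweaks, each only a couple of percentage points, raise the valuation by roughly 15%, and nobody along the chain ever had to tell an outright lie.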
And what if everybody around you believed the same thing, everybody believed that mortgage-backed securities are wonderful? And what if you are a creative individual? All of those forces would keep on shading your opinion, getting you to believe, to a higher and higher degree, that these things are actually better than they really are. And from that perspective, we haven't really done anything. If you think that the villain in this whole story is conflicts of interest and biased payments, then we haven't done anything useful about the financial industry. It's the same for the health care industry, and of course the same for Washington, in terms of lobbyists.

Until we recognize how big a force conflicts of interest are, how paying you a hundred dollars from one gallery, or doing you a favor, or paying you a higher bonus under certain conditions colors your understanding at a deep level; until we recognize the size of conflicts of interest and the depth of their influence, and try to create systems that don't contain as many of them, we're not going to get out of these troubles. It won't be easy; eliminating conflicts of interest is hard. But I think we must look at ourselves carefully and try to eliminate as many of them as possible.