Welcome to Remarkable People. We’re on a mission to make you remarkable. Helping me in this episode is Dan Simons.

Dan is a highly acclaimed cognitive psychologist and professor at the University of Illinois at Urbana-Champaign, where he directs the Visual Cognition Laboratory. Before joining the University of Illinois, Dan spent five years on the faculty at Harvard University.

Dan is best known for his famous “Invisible Gorilla” experiment, demonstrating that people can easily miss unexpected events when focused on a specific task. You’ve probably seen the video, so allow me to remind you; it’s where two teams, one in black shirts and one in white shirts, are passing a ball. The participants are told to count how many times the players in white shirts pass the ball.

Midway through the video, a gorilla walks through the game, stands in the middle, pounds his chest, then exits. More than half the time, subjects miss the gorilla entirely. There’s a follow-on to this video, and even if you know something will happen, I bet you miss it.

Dan’s latest book, written with Chris Chabris, is Nobody’s Fool: Why We Get Taken In and What We Can Do About It. This book caused me to have a brief existential crisis. Believe me, as we head into the elections of 2024, knowing how not to get taken in could save democracy.

Please enjoy this remarkable episode with Daniel Simons: Never Get Taken Again!

 

If you enjoyed this episode of the Remarkable People podcast, please leave a rating, write a review, and subscribe. Thank you!

Follow on LinkedIn

Transcript of Guy Kawasaki’s Remarkable People podcast with Daniel Simons: Never Get Taken Again:

Guy Kawasaki: I'm Guy Kawasaki, and this is Remarkable People. We're on a mission to make you remarkable. Helping me in this episode is the remarkable Dan Simons. Dan is a highly acclaimed cognitive psychology professor at the University of Illinois at Urbana-Champaign. He directs the Visual Cognition Laboratory there. Before joining the University of Illinois, Dan spent five years on the faculty at Harvard University. Dan is perhaps best known for his famous Invisible Gorilla experiment. This demonstrated that people can easily miss unexpected events when they are focused on a separate task. If you haven't seen the video, let me give you the gist: two teams of people, one in black shirts and one in white shirts, are passing a ball. The participants in the study are told to count how many times the players in white shirts pass the ball. Midway through the video, a gorilla walks through the game, stands in the middle, pounds his chest, and then exits. More than half of the time, subjects miss the gorilla entirely. There's a follow-on to this video, and even if you know something is going to happen, I bet you will miss it.
Actually, two things happen. I bet you missed them both. Dan's latest book, written with Chris Chabris, is called Nobody's Fool: Why We Get Taken In and What We Can Do About It. Honestly, this book caused me to have a brief existential crisis when I read it. And believe me, as we head into the elections of 2024, knowing how to not get taken in could save democracy. I'm Guy Kawasaki. This is Remarkable People. And now, here is the remarkable Dan Simons.
I'd like you to explain the Possibility Grid, and then I'm going to explain why your Possibility Grid caused an existential crisis for me.
Dan Simons: The idea of the Possibility Grid is that we tend to focus primarily on the information that we have in front of us and not pay attention to information that we might be missing. So you can think of any sort of intervention or outcome in terms of a two by two grid. Let's say, for example, that whatever you do, say getting a college education, might be the top row, and not getting a college education, dropping out, might be the bottom row. And then the left column might be you succeed, you become a business success, you become a unicorn and start a billion-dollar business. And the right column might be you don't, you fail to succeed. What we tend to think about are just the successes, that left column, the people who did succeed, because we just don't know about as many of the people who didn't become unicorns.
And then we might think about prominent examples, for example, people who dropped out of college and became the founders of major companies, the Steve Jobs sorts of cases where you've got this amazing person who founded a company despite not having graduated from college. So that's the idea of the grid. You can look at the cases that succeeded and failed, and the people who tried something or didn't, and you can ask, for example, are people who dropped out of college more likely to become unicorns? There's a really nice study by Lifchits, Watts, and a few other people who asked people to choose: who's more likely to become the founder of a unicorn company? You give them two people, one a college dropout and one a college graduate, and ask them to choose which one is more likely to become the founder of a unicorn.
And instead of just asking them that, you can start by giving them a list of five very famous people who dropped out of college and became unicorn founders, or a list of five very famous people who graduated college and became unicorn founders. If you give them the list of college dropouts, people are much, much more likely, about two-thirds of them, to pick the college dropout as the future unicorn founder. If you give them the list of college graduates, and they know it's just a list of five people, so it shouldn't matter, then about eighty-five percent of them pick the college graduate. So just thinking about the cases right in front of you, these famous dropouts, makes you think dropouts are more likely to become unicorns.
But in reality, if you look at who has become a unicorn, and Chris Chabris, my co-author, and Jonathan Wai looked at a list of all 253 unicorns that were listed in the Wall Street Journal in 2015, pretty much all of them had college degrees. It's rare for dropouts to become founders, because sometimes an education is helpful to get in the door.
Guy Kawasaki: When I read that, and then I read your Possibility Grid applied to Malcolm Gladwell and his theory of tipping, where, yes, Hush Puppies tipped, and Malcolm explains that as being caused by using influencers. But you point out that many companies that used influencers didn't tip, and many companies that didn't use influencers did tip. And I've never heard anybody call out Malcolm Gladwell, who is like my hero. He writes a book about anything and five million copies sell, and here's my man Dan ripping him up.
And so my immediate thought after that, combined with this idea that Zuckerberg, Gates, and Jobs don't have college degrees and are hugely successful, but that doesn't mean you don't want a college degree. I put all that in my brain and said, "Oh Guy, shit, basically your writing and your speaking is bullshit, because I'm telling people to become evangelists." I never talk about the people who became evangelists and didn't succeed, nor the people who aren't evangelists and did succeed. And I think you can apply this to almost every business writer and every business speaker. The people who pivot: you only hear about companies that pivoted and succeeded, not all the losers who pivoted and failed. And I applied this to myself too. Am I just peddling selective bullshit?
Dan Simons: No, I wouldn't say that. But I would say that this is a pervasive problem. And we talk about Gladwell just because he's probably the best-known author in this genre. He's a terrific writer and he brings cases forward that people might not have thought about and gets people to think about them in really interesting, engaging ways. But this is a problem for the entire industry and for any evaluation of an industry where you don't have all the information. So if you think about who we're aware of, it's only the businesses that survived. We occasionally hear about the failures, but those are few and far between and they can fail at any point in the process. We mostly hear about the big success stories. And most business books focus on those big success stories. And they assume that the reason they became successful is whatever it is that that person happened to do, that founder happened to do.
Sometimes that might be right, but we don't know the rates. We don't know how often people with college degrees become unicorns versus people without college degrees. We need to think about the businesses that tried the same things and failed, the businesses that tried something totally different and succeeded, and the companies that tried something totally different and failed in order to know whether what that company did or what that founder did actually was related at all to their success. It could have just been luck. It could have just been something unrelated to what they did. And we don't really have an easy way of knowing in a lot of these cases.
Guy Kawasaki: What if my audience is listening to this and their heads are exploding and they're saying, "I just bought Malcolm Gladwell's book. I just signed up for a Tony Robbins conference, and I've been reading Deepak Chopra, and his existential stuff is just random words put together. So now, what do I do? How am I supposed to go forward in life saying, 'Oh, I just got to depend on getting lucky?'"
Dan Simons: Getting lucky helps. Being in the right place at the right time does help. But I think the critical thing is: don't take somebody's claims about how they succeeded as gospel. They might well be right. It might well be true that the things they did contributed to their success, or certainly didn't get in the way of it. But you really need to ask, "What am I missing? What information do I not have that would let me evaluate whether what they're telling me is really true?" They're very confident that this is what caused their success, but you often hear successful people saying, "I worked very hard." I know lots of people who work very hard who are not super rich, often working much harder than the people who are super rich.
But it's an assumption that I succeeded because I worked hard. Yeah, you worked hard, right? Lots of other people worked hard and didn't succeed. Lots of other people didn't work hard and did succeed. So you've got to think, what information do I not have? What's the missing information that would let me evaluate whether the things they're claiming help actually do? You need to know how often those help compared to how often other things would've helped.
Guy Kawasaki: Man.
Dan Simons: It doesn't mean they're wrong. They might be right.
Guy Kawasaki: I'm telling you, heads are exploding all over the universe right now.
Dan Simons: That sounds messy.
Guy Kawasaki: So you mentioned that you guys did further analysis where you went back to the Wall Street Journal unicorns, and nearly all of those 253 unicorns had college-graduate founders. But let's suppose we did this slightly differently. Let's suppose we went back into the Crunchbase database, did an analysis of all the companies started in a five-year period, noted whether their CEOs had college degrees or not, and then categorized them as successful or not. So instead of coming up with 253 examples of successes, we would want to know, in each corner of the two by two matrix, what percentage of people became successful, as opposed to an absolute number.
And then, I think that would be very interesting, but let's suppose we find out that there's a one percent chance that if you have a college degree, you're going to be a unicorn, and a two percent chance that if you don't have a college degree, you're a unicorn. That in and of itself is interesting. But then, I was thinking about this: let's say those are the real numbers, which they could be. What do you conclude from that? Yes, in that case you're twice as likely to become a unicorn if you don't have a college degree, but it's still only one percent versus two percent. So what do you do with something that is so low?
Dan Simons: Yeah, that's exactly right. The first conclusion you draw is that it's really rare to become a unicorn, and I'm sure the percentages have to be substantially lower than one percent. There aren't that many billion-dollar companies out there. So the rates are going to be very low, and there are a couple of things you can look at. I think the right approach is to look at those rates. If you could get those data, that would be a fantastic way to evaluate the question of whether dropping out or graduating from college is related to success. It'd be the best way to do it, because you'd actually have the rates. Now, the absolute number is important too; people should go in knowing they're not likely to build a unicorn. But there might be other measures of success, and you can decide what constitutes success. The distinction you're drawing there is between what's known as relative risk, how big an improvement you'd get, versus absolute risk.
And this comes up all the time in other disciplines. In medicine, most diseases are fairly rare, and most treatments might affect a small percentage of people, but if you get twice as good an outcome from a new treatment as from an old treatment, that could be a really big deal even if relatively few people have the disease. So relative risks and relative changes can be really important, even when the absolute numbers are very small. You have to be careful: if you're talking about five cases, that's possibly just noise, but if you've got enough companies, then half a percent of all companies could be a sizeable number. But it's the rates, those percentage differences, that are really the key.
Guy Kawasaki: Can't you just find a couple of grad students, point them at it, and say, "Okay, this is what I want you to do. Go to Crunchbase, find all the backgrounds, and then let's just decide what success is"? I think unicorn is too tough a test. Let's just say: go back ten years, record everyone's college-degree status, and correlate that with how many companies survived at all. Wouldn't that be a really interesting study?
Dan Simons: Absolutely. It'd be a fascinating thing to do. We haven't done it, but there's no reason anybody couldn't. The studies that have been done, the ones we know of, mostly look at what people think will be successful. So the study I mentioned by Lifchits and Watts was looking at whether people would be biased by what they've seen recently. If Steve Jobs and Mark Zuckerberg are in the news all the time and everybody talks about the fact that they didn't graduate college, that's really available to us. It's something that comes to mind easily. We don't hear about all of the founders who have their degrees; that's not something that gets mentioned all that often. So that's why the studies were looking at that question.
Guy Kawasaki: So you and Malcolm Gladwell should co-author this book and then it'll have the academic rigor of you and the marketing of Malcolm Gladwell. Oh my God.
Dan Simons: Somehow I think that's probably not going to happen.
Guy Kawasaki: I want to be in the acknowledgment. That's all I ask. I'm a humble person.
Dan Simons: Yeah, I'm guessing that's probably not going to happen.
Guy Kawasaki: Not if he reads your current book.
Dan Simons: Yeah, we've had discussions in the past with Malcolm about other things too.
Guy Kawasaki: One last question about this college degree, no college degree, and business authors. Do you think that if you could peer into the souls of all these influencers, thought leaders, authors, and visionaries, we're fundamentally intellectually dishonest, or just ignorant of good science?
Dan Simons: I don't see this as dishonesty. I see this as a lot like memoirists. If you're writing your memoir, it's going to be from the perspective of what mattered to you and what was true for you in your past. So it's not surprising that people who are writing about a success story are going to focus on what that person's experiences were, what their history was. I don't think that's deliberately disingenuous or attempting to deceive people, but I think it's also tapping into this issue that all of us have, that we focus on the information that's right in front of us, the information we have, and we generally don't think about the information we're missing.
And that's a real challenge, and it's one of the reasons we do get deceived a lot, even if it's not intentional. We get deceived by paying attention to what we're seeing. We hear about Mark Zuckerberg and Steve Jobs not having a college degree. That's right in front of us, that's available to us. We don't think about all the cases we're not hearing about. And because of that, our reasoning is based on those things. So people looking to deceive us could use that. I don't think business writers as a group are deliberately trying to deceive anybody. They're just presenting the case studies that are of interest to them.
Guy Kawasaki: That's like telling the IRS, "I didn't know I was supposed to report that income." But anyway.
Dan Simons: Do I think they would be better books if they actually did evaluate whether the factors that drove that person led to their success? Possibly. They might not be as engaging for us because we find those sorts of stories of success really engaging. But it's a different goal, I think, in a lot of those cases.
Guy Kawasaki: Well, we had another guest on named Derek Sivers, and he wrote a book where he said, listen, I'm going to give you advice and some of this advice is going to directly conflict with other advice. You just need to know that it's not that simple. There's no one path. Go to college, don't go to college.
Dan Simons: That seems like a wise way to approach it. Here's a story of the success of some person. And the things that they did may or may not have led to their success, but it's still interesting to study those cases to see if there's any commonality.
Guy Kawasaki: I'm glad you said that because I'm writing a book called Remarkable Mindset that is full of these things. But okay, the first exposure I had to your work, which just made me fall in love with your work, was, of course, the Invisible Gorilla video. The gist of this, for those of you not familiar, is that there are people in white shirts and black shirts, and you're told to count the number of times the people wearing one of those colors toss the ball. And you're watching this and counting, and then into the middle comes a student dressed as a gorilla, who beats his chest and walks off, and apparently fifty percent of the people don't even notice the gorilla. So now I was thinking about this: we should apply the Possibility Grid to it. We all talk about the fifty percent of people who were counting and didn't notice, but what about the other possibilities, like the people who were counting and did notice the gorilla? So the question there is, how come they're capable of doing two things that fifty percent are not? What's the explanation for that corner?
Dan Simons: So that question comes down to: is there anything different about people who notice and people who miss? And it's a great question. It's something we've been studying for more than a decade now, trying to figure out if we can predict who will notice and who will miss in these sorts of tasks. We don't generally use the gorilla video anymore because it became well enough known that we'd lose a third of our participants who'd already seen it. But we use simplified tasks, and there actually are a number of studies trying to predict who will notice and who won't. We're just now finishing up a comprehensive review of all of those studies. And the short answer seems to be that we can't predict who's going to notice and who's going to miss.
There doesn't seem to be anything that systematically differentiates them. It's really just luck; did you happen to notice? If I show you that video and you don't see the gorilla, and then I show it to you again, maybe you'd be more likely to notice it. Or let's say you do notice it, and somehow I wipe your memory of the last thirty seconds and show it to you again. The odds of noticing it the next time are again just a flip of the coin.
Guy Kawasaki: Wow.
Dan Simons: And that's, I think, what's interesting there. It shows that we're all subject to those sorts of limitations. It's not that some people always notice everything and other people always miss everything. The key thing for me, thinking about this in terms of the Possibility Grid, isn't whether or not people notice. It's what they think. So if you ask people, "Hey, if I showed you a video and you were counting passes and a person in a gorilla suit walked into the middle, thumped his chest, and walked off, would you notice?" they say, "Yeah, of course. Of course I would notice that." Right? Now, why do they think that?
What do they know about? They know about the times when they noticed something unexpected. They don't remember all the times they failed to see something unexpected. So they're only aware of the cases where there was something unexpected and they noticed it. They're not aware of the cases where there was something unexpected and they missed it, or where there wasn't anything unexpected and they hallucinated something, or where there wasn't anything unexpected and, correctly, they saw nothing. The Possibility Grid gets at that same thing. It's why we have this intuition that of course we'll see things: because we're only aware of the times when we did. So it's the same problem.
Guy Kawasaki: I just want to point out that I've hired many people in my career, and I only remember the times they worked out. That's what you're saying?
Dan Simons: Yeah. Although that's the case where if you hired a contractor and they completely bungled the repair job on your house, you'd remember that, right? That would be a significant event. But in general, we tend to remember the cases that call themselves to our attention. So you remember every time you were at the grocery store and the line took forever, and you seem to be in the slowest line. You don't remember the times that you breezed through. Right? It's just not remarkable. Noticing something like a person in a gorilla suit calls attention to itself. Not noticing something like a person in a gorilla suit doesn't. You don't realize that you've missed it.
Guy Kawasaki: I've got two more questions about this gorilla. What would happen if you knew more about the people watching? Let's say you could somehow identify people who had ADHD, OCD, or autism, and you studied the differential rates of whether people with those conditions noticed or didn't notice the gorilla. Got any predictions? Or do you think it's still a flip of the coin?
Dan Simons: I think it's a flip of the coin. And the reason is that some of those studies have been done, although not so much with ADHD or autism. We've looked at personality measures: people who are really conscientious, people who are really neurotic in how they focus on things. We've looked at cognitive abilities: people who are really great at tracking moving things versus people who aren't, so the primary task. We've looked at working memory, how good a thinker they are, how well they can remember, how well they can focus attention, how well they can spread their attention. None of these factors seems to reliably predict who will notice and who doesn't.
Guy Kawasaki: Really?
Dan Simons: And I've seen predictions about these sorts of things where people say, "Of course people with autism would always notice, because they're attuned to things." And it's like, no. If anything, it might be exactly the opposite. You could argue that people with autism, if they're really interested in that primary counting task, would be less likely to notice because they're so engaged. But I don't think it actually matters a whole lot in either case. As long as people are trying to do the task, it seems to work just as well. And when we did the original study, it was done with Harvard undergraduates, but we've done this in every population you could imagine.
We've done it worldwide, from indigenous tribes with relatively little Western contact to people in big cities all over the world, and we generally find the same pattern. So I actually don't think it's going to matter a whole lot, even though everybody thinks it will. I still get emails once or twice a month where somebody says, "I watched it and I didn't see it, but then I showed it to my daughter and she did, but my son didn't. Is there a gender difference?" And it's like, well, that's three people, and no, not that we've seen.
Guy Kawasaki: So I was going to suggest that you take the people who didn't notice the gorilla and you make them into software testers because they can focus on finding bugs and they won't be distracted. But I guess that won't work, huh?
Dan Simons: There are better ways to do that. There are lots of things you can study about people who focus well, and one of my colleagues, Stephen Mitroff, has done some of this work with baggage screeners at the airport, which is a really difficult task. You're standing there for hours trying to scan bags, and the events you care about are relatively rare. And you can find things that will predict whether or not people will spot the gun in the bag, which is something they are looking for. It's not a gorilla; like the basketball passes, it's something they are looking for, and you can predict that with other sorts of cognitive measures. What's odd is that you can't predict whether or not somebody will see something that's totally unexpected, because they're not looking for it. So it seems to be a different sort of thing.
Guy Kawasaki: Now we're going to move on to the next video, because in your book you say, "Watch the Monkey Business Illusion before you keep reading." So, of course, because I have great respect for you, I did that. And I have to tell you, in a rare moment of humility: I totally expected a gorilla to walk into the middle of this. So I knew, okay, it's not the gorilla, because we already know about the gorilla. So, Guy, really carefully watch what else unusual happens. And knock me over with a feather, I totally did not see the change in color of the curtain or the person in the black shirt walking off. So what causes people to not notice something like that?
Dan Simons: That's exactly what we were interested in there: would having known about the original gorilla video somehow act like a vaccination? Would it inoculate you against these sorts of failures of awareness?
Guy Kawasaki: Nope.
Dan Simons: And it doesn't, because we can only take in so much visual information at any time. In that case, we knew that people who had already seen the original video, like you, would know that anytime somebody asks you to count people passing basketballs, there's probably a gorilla. So you knew to look for the gorilla, and that's what you find: people who knew to look for it find the gorilla. They see it. If anything, they may be slightly less likely to notice the other things, because they're now not only counting passes, they're also looking for the gorilla. So they're using up more of their attention to find it.
But yeah, the failure to notice that sort of background change and the player leaving is a related phenomenon called change blindness: we don't notice that something's different from one moment to the next. And really, all of these failures of awareness reveal that we aren't taking in and remembering as much as we think we are. That was one of the themes of our previous book from a dozen years ago, The Invisible Gorilla, which was about these sorts of intuitions: what we notice, what we don't notice, what we remember.
Guy Kawasaki: If nothing else, I have to say that I have very little confidence in eyewitnesses in criminal cases now. If half the people don't notice the gorilla or half the people don't notice the changing background or someone leaving, how can you be sure it was a white guy, about six-two, red hair, that shot the guy?
Dan Simons: We have to be careful. Eyewitnesses can be wrong. So what you look for is independent, converging evidence from multiple people. You look for high confidence at the time they make their first identification. So the first time they look at a lineup: "Oh yeah, that's the person" versus "I'm not sure, maybe it's them." You've got to take that uncertainty really seriously. Whereas by the time they get to the courtroom, you don't want to trust their confidence judgments. You don't want to trust the person on the witness stand saying, "Oh yeah, that's the person," because confidence can change over time. You want that very first assessment, and you want converging evidence. I'd be really wary of convictions based on a single person's testimony without any other concrete evidence. So-called 'circumstantial evidence' is often more compelling, because you can verify it, whereas memory is fallible and perception is fallible.
Guy Kawasaki: There probably goes half the convictions in Texas and Florida, but we won't go down that hole.
Dan Simons: That's a lot of what the Innocence Project finds: many of the false convictions, the ones that were overturned by DNA evidence, were based on eyewitness testimony that was initially not confident. That's the most common sort of problem. So yeah, it's a big issue.
Guy Kawasaki: Dan, by the way, do you have kids?
Dan Simons: I do.
Guy Kawasaki: How old are they?
Dan Simons: Twenty-one and sixteen.
Guy Kawasaki: Okay. So do you think some of this blindness and lack of noticing could explain why teenagers have messy rooms? They don't see the toilet paper outside the wastebasket; they don't see the towel on the floor. It's not that you're a bad parent or they're bad kids, it's just a perception problem?
Dan Simons: It could well be. I think the way to think about a lot of this is that we tend to pay attention to the things that matter to us, that are interesting to us. If you're given a task like counting basketball passes, we can do that. We can pay attention to that, but we can't pay attention to everything. So if you are completely uninterested in the state of your room or the bathroom or whether a towel is on the floor, you're not going to focus much attention on it, and you might just walk right past it. And the more you walk right past it, the less you notice it at all. We all have that box we never put away after the last move, and after it sits there for three years, you no longer notice it. You're not paying attention to it anymore. It's still there.
Guy Kawasaki: Okay. So let it never be said that you cannot learn about parenting by listening to the Remarkable People podcast. Now, finally, we're going to get to the current issue. This is a big question, but hey, you're a big guy, so just give us the gist: why do people get suckered and taken in, like with Madoff or crypto or tulip mania? What happens? Why do they get taken?
Dan Simons: I think there are a lot of reasons, but I think the first thing to keep in mind is that the people who get fooled are not just gullible or naive or uneducated or dumb. We all can be fooled because we all use the same sorts of shortcuts to be efficient in the way we get around the world. We all have habits of thought that work for us the vast majority of the time. They help us be effective, efficient. We all have a bias to accept that what we're hearing is true by default and that we only check later if we want to question it. We tend to assume that when you're interacting with somebody, when you're talking with somebody, they're being truthful with you, and that's necessary to have good relationships in the world. So the problem is that those who are actively looking to deceive us hijack those sorts of tendencies.
So we have these habits of thought that we use quite a bit. We tend to focus, we've already talked about that one, we tend to focus on what's right in front of us and not think about what we're missing. But we also tend to think poorly about the way predictions and expectations work. We tend not to think carefully about whether something was actually what we predicted; when it's handed to us, we take it as, "Yeah, that's the way it should have been." We tend to become really committed to our ideas without questioning them. So we stop thinking about the assumptions that underlie our strong beliefs, and we tend not to ask enough questions. We tend to be accepting without questioning. And most of the time, those things all work fine for us. It's just that when somebody like a Madoff wants to take advantage of that, they can. They can meet expectations, they can cater to our commitments very effectively.
The other factor is that we have patterns of information that we find really compelling. And the metaphor I like to use for this is a matador. When the matador shows that red cape, it's really enticing to the bull. It's going to charge forward. It's not going to think about whether the cape is hiding a blade. It's going to charge right into it because that's a really appealing thing at that moment. For whatever reason, this is something it's got to charge. But we all have that sort of information that we find really appealing, and we charge forward without questioning it enough. Obviously we don't run right into a blade, but if somebody's trying to deceive us, they take advantage of our tendency to trust some kinds of information more than we probably should.
Guy Kawasaki: So what's the short course on how to avoid being taken?
Dan Simons: The short answer is accept a little bit less and check a little bit more, which sounds trite, but often it's just a matter of asking one or two more questions. If you're seeing a demo for a Theranos Edison machine and they say, "Okay, we're going to have this machine process this, and now you go on your tour and we'll come back and give you the results," ask the direct question: did that machine right there actually do that? When people were investigating Madoff, they asked a lot of questions. He was investigated repeatedly.
People asked a lot of questions, but they could have checked one more thing. They could have gone and checked whether the bank accounts he mentioned actually had any money in them. They didn't because they took it as truth. They had a truth bias. They accepted what he was saying as true, and if somebody's looking to dupe you, they're just going to leave out information. They're going to give you the information they want you to focus on just like a good magician will, and they're going to hide their method by using some other machines in the background that you don't know about.
Guy Kawasaki: I have to say that one of the most interesting stories in the book is something that very few people will pick up on, I think, but it is such a great tell about Theranos. You said that Bill Draper invested his personal money in Theranos, but not his venture capital fund's. Wow. That kind of explains everything, and I guarantee you, when Elizabeth Holmes was out raising money, she was saying, "Bill Draper has invested," with the implication that people would understand that as Bill Draper's fund has invested, which is very different.
Dan Simons: Yeah, this is a common tactic. Actually, the same thing happened with the Knoedler Art Gallery, where instead of giving actual provenance for artwork, they described how family members of the artists or other experts had looked at it, and they gave descriptions like, "That's a very nice canvas." They didn't say, yes, this genuinely was by Rothko. They'd say it looks like something from that time period, or it's a very nice painting. But attaching the names of those experts to the briefing sheet about the artwork gave the impression that it had been endorsed and verified and vetted, when it probably hadn't.
Guy Kawasaki: And I bet you'll agree with me on this; Theranos is still top of mind, so Sunny just went to jail, Elizabeth is about to go to jail, but next week it's going to happen again. There's going to be the same kind of phenomenon. And people are going to say, "Draper's in, aren't you in?" And it's going to happen. We just don't learn. Are we just so stupid?
Dan Simons: So I think that's the really interesting thing. The same scams have been perpetrated over and over again; they take on new costuming, but they're basically the same scams, and they have been for centuries. That's what's so interesting. There are lots of wonderful podcasts about Theranos, there's lots of media coverage of these sorts of scams. We hear about them daily, but for whatever reason, they don't sink in. And I think the reason is that we're not thinking about what leads us to be fooled, and that's really the theme we focus on in our book: what allows us to be deceived? If we understand ourselves better, then we might have a better chance of avoiding falling for these sorts of things. Smart people fall for these scams. These are not people who are clueless or idiots or just unnaturally gullible.
Most of our listeners for this podcast are not people who will fall for the Nigerian email scam. You're not going to believe that if you just give somebody a little bit of money from your bank account, they're suddenly going to give you treasure from their lost fortune. That's not believable. But we are going to buy, "Hey, this new tech startup has a technology that nobody has gotten to work before. It seems really promising. It's got a really charismatic leader who's great at selling it." You're going to take what they're saying as true. Because more often than not, when people are pitching their products and they say they can do something, they probably can. It's the cases where they really can't that get you into a lot of trouble.
Guy Kawasaki: If the premise is that your mind focuses on the cases that are visible and at hand, e.g., Zuckerberg, Gates, and Jobs didn't get college degrees and became billionaires, then the case at hand now is Theranos. So wouldn't our minds default to what's most obvious in front of our faces right now and make us all hyper-skeptical? Or do you think people want to believe the positive stuff?
Dan Simons: I think there's a mix. There is a range of how much people are willing to believe and how open they are to new ideas. So people differ in that. I probably am on the high skeptical end of the spectrum at this point, having read a lot of this stuff. A lot of people are much more trusting and believing that this is the next great thing, and people are optimistic. So yeah, we know that Theranos went down in flames, but if another founder comes to you and has a different product in a different area and says, "Hey, I have this new idea, I think this is really going to work."
You're not going to initially assume that they're Theranos. You're going to want to evaluate them. And if Steve Jobs came to you and said, "Hey, I'm going to make this new computer that's going to be easier to use, and everybody's going to love it." If he did that now, would you assume that he's doing what Elizabeth Holmes did? Probably not. There are a lot fewer frauds out there than there are genuine companies. There are a lot of companies that skirt the edges of it, but the genuine frauds are probably still pretty rare, even though they're in the media a lot.
Guy Kawasaki: Okay.
Dan Simons: I hope they are, anyway.
Guy Kawasaki: Are you a Wait Wait... Don't Tell Me! fan? You're in Chicago, you must be, right?
Dan Simons: I'm in Champaign, but yes, I'm a big fan.
Guy Kawasaki: Okay. Peter Sagal has been on this show and I said, "Peter, do you get up every morning and you thank God for the GOP? Because how hard can it be to get material? Jewish lasers, just everything." So I just want to know, do social scientists like you, do you get up in the morning and say, "Oh God, thank you for making crypto because now we have so much to study, and we can look at all the frauds and all that. I miss Tulip mania, God, but thank you for crypto." So what's your analysis of crypto?
Dan Simons: Unfortunately, there's no shortage of these sorts of overhyped, impossible success stories. We literally watch the news now and keep a list: "Okay, here's a new fraud today." It's not a rare thing to hear about these sorts of things. Crypto is an interesting case because it does bring together a lot of the same sorts of tendencies. There's this hope for something radically different, that's somehow going to be new and improved and better. There are testimonials from celebrities, who are not necessarily the people you'd want to listen to for investing advice. I don't really care what Tom Brady thinks about cryptocurrency. Why should I? So that's a standard hallmark of something that's not necessarily grounded. Good companies use testimonials too, but it's something you should be wary of. So you've got this completely new approach to currency that is not well regulated, that isn't as stable as what we know, and that doesn't have the sorts of protections in place that banks do.
Could it be something that's really useful? Yeah, maybe. But there's a lot more risk there. So you should be asking, as just a person without a huge amount of money, "Is this a smart move? Is this a get-rich-quick scheme, or is it a safe, long-term investment? And how would I know?" Ask that next question: "How would I know if this was safe? What would I do?" You can do a pre-mortem: "What would I do if this turned out to be a fraud? Would I still be okay?" If you treat it like going to play the roulette wheel or backgammon, it's, okay, I know I'll probably lose because the casino generally wins. It's a gamble. You can anticipate that in advance, so you wouldn't want to bet more than you can afford to lose. Think about an investment like crypto, or any investment: would it make sense to make this investment without checking out what would happen if you just lost everything, which could happen?
It's the same thing if you were investing in a money manager. If you're dumping your money into a mutual fund at Vanguard, for example, that's a giant company, it's probably pretty stable. You know how they're running it. You probably don't have to do a ton of vetting to make sure that Vanguard isn't a fraud. There's enough regulation in place that it's probably just fine. If you're investing in a buddy's friend who is a money manager, you probably want to check them out a lot more thoroughly than you would that mutual fund, that index fund, because you don't know.
Guy Kawasaki: But a few years ago, wouldn't you say that Bernie Madoff was just as vetted as Vanguard?
Dan Simons: Yeah. Bernie Madoff was a really interesting case because he was so credentialed. He was a former head of the Nasdaq. This is somebody who was really well-known for a long time, and there were things you could see that you could ask questions about. There were good investors who saw what he was doing and said, "Yeah, I'm not going there." The consistency of his returns from year to year was impossible. You can't get eight to fourteen percent every year for twenty-five years. Nothing does that. But people didn't question it because they probably weren't looking at it that way. He also capitalized on familiarity.
One of these information tendencies is that we tend to like things that are familiar. He targeted his own community, people who knew him, people who trusted him, which is a common tactic in these sorts of frauds. He had remarkably consistent results, which is what people wanted. What was interesting about Madoff, though, is that his was a new kind of Ponzi scheme. He really developed a new one: the Madoff scheme. Normally, you think of a Ponzi scheme as, "Hey, I'm going to give you eighty percent returns, guaranteed, no risk of loss." He never did that. In fact, he underperformed the S&P throughout his fraud, but he didn't have any huge swings. So he looked like a safe, low-risk investment. People treated it as getting eight percent with almost no risk, and that was really appealing.
Guy Kawasaki: I think this is an important message to highlight: what you're saying is counterintuitive in that the consistency of Madoff's returns was a bad sign. Just like in a data set, if everything's consistent, something is wrong. There should be inconsistencies.
Dan Simons: Yeah. We think about noise as a bad thing, that if it's not cleaned up, you're not getting the information you want because it's noisy and messy. But for most things, we undervalue noise. Noise is what we should expect. Let's take baseball, I don't know if you're a baseball fan, but in baseball, we don't always expect the best team to win a single game. It just doesn't happen. Sometimes the best team loses. Sometimes the best team loses a seven-game series, and that's not unusual because there's variability there. A batter who is a .300 hitter doesn't get three hits out of every ten at-bats. A free throw shooter in basketball who's an eighty percent free throw shooter isn't going to make eight out of every ten.
Sometimes they'll make four out of ten, sometimes they'll make ten out of ten, but they're not going to make exactly eight out of every single set of ten. We don't think about how noisy most human performance is, or most financial performance; it should be up and down a lot. Even within a day, a stock's value varies a lot. That's what we expect. If everything is constantly stable and you're always getting the same return, that should be a red flag. In science, it is. If you expect two groups to be the same and you run twenty studies and every single time they're exactly the same, that's almost certainly fraud. Because even just randomly flipping coins, you shouldn't expect two groups to always end up exactly equal. They'll vary. Sometimes one group will do better, sometimes the other, and we don't tend to like that sort of thing. So if you see somebody who gives you the same results every time, it's like, wow, that's consistent, that's great. No, that's worrisome.
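Dan's free-throw point is easy to check with a little probability. The sketch below (Python, a quick illustration added here, not anything shown in the conversation; the function names are my own) computes how often an eighty percent shooter makes exactly eight of ten attempts, both analytically and by simulation:

```python
import random
from math import comb


def exact_prob(n: int, k: int, p: float) -> float:
    """Binomial probability: a p-rate shooter makes exactly k of n attempts."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)


def simulate(trials: int, n: int = 10, p: float = 0.8, seed: int = 42) -> float:
    """Fraction of n-shot sets in which the shooter makes exactly round(n*p) shots."""
    rng = random.Random(seed)
    target = round(n * p)
    hits = sum(
        1
        for _ in range(trials)
        # each set of n shots: count makes, check if it lands exactly on target
        if sum(rng.random() < p for _ in range(n)) == target
    )
    return hits / trials


if __name__ == "__main__":
    print(f"exact:     {exact_prob(10, 8, 0.8):.3f}")
    print(f"simulated: {simulate(100_000):.3f}")
```

Both numbers come out near 0.30: even a genuinely consistent eighty percent shooter lands exactly on the "expected" eight-of-ten only about three times in ten sets, which is the kind of noise Dan says we should expect, and whose absence should worry us.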
Guy Kawasaki: But now, what if I'm CMO of Vanguard or CMO of Schwab or whatever, and all my literature says year after year, consistently, we return above the S&P. Are you saying that you're shooting yourself in the foot or most people are dumb, they're going to think that's a good thing?
Dan Simons: Well, that might be fine. So on average, do they do better than the S&P? That's informative, right, over the long haul. But on average is different from every time. If every single year they're one percent better than the S&P, that's strange, right? Because you wouldn't expect that. Sometimes they'll be five percent better, sometimes they'll be a little worse. But on average, they might do better. The problem is when we confuse "it's a better investment on average" with "it's going to be better by the same amount every single year."
Guy Kawasaki: Okay.
Dan Simons: Yeah.
Guy Kawasaki: It seems to me that Donald Trump and the Republican party have blown the Overton Window wide open. It's not a window anymore. It's a full patio deck wide open. We may have to explain what the Overton Window is, the range of ideas the public considers acceptable, but do you think that when an Overton Window expands like this, it changes people's perception so that they don't notice stuff anymore? So my congressman is a pedophile, no big deal. That's okay these days.
Dan Simons: It's worrisome when the baseline changes like this. The goalposts keep moving for what the appropriate standard for being critical should be. And yeah, it's really hard when people lie with impunity, get away with it, and then get more and more extreme. It makes it harder and harder to recognize when something is important or not. And authoritarians throughout history have known this: throw enough muck into the system, and people don't know what's true and what's not anymore. The reason for constantly upping the ante and telling lies isn't so much to get people to believe them; it's to get people to distrust in general, and that's a real problem.
There's a lot of research on attempting to counter misinformation, but the volume is pretty high right now, and it's hard to think about how to do that correctly. I think of that as somewhat different from the sorts of scams and deceit that we're talking about in our book. In a lot of these cases, it's knowingly throwing nonsense out there and seeing what sticks, as opposed to trying to persuade one individual to go ahead and do something they don't want to do. It's more just sowing mass confusion.
Guy Kawasaki: So if the Democratic Party called you up and said, "Dan, help me out here. What do we do?" What would your advice be?
Dan Simons: I have no idea. It's a really difficult problem. And I should mention that there are people who scam other people on both sides of the spectrum, but the misinformation these days seems to be coming more from one subset of the population. It's a hard problem. We're seeing that right now with the debt ceiling negotiations. And when one side is willing to shoot the hostage, it makes negotiating really hard.
Guy Kawasaki: So to give you a break, this is the hardest question of all. So after hearing all of this-
Dan Simons: This is the break?
Guy Kawasaki: Yeah, this is the break. I think people listening to this, in a sense, the bottom line question is, how do you balance accepting versus checking? Because you cannot check everything and you cannot accept everything. So how do you balance?
Dan Simons: That's the real challenge: you can't go through life distrusting everybody, assuming everybody's lying to you, being a permaskeptic about everything. It just doesn't work. You could go to the grocery store and cross-check every price on your receipt against the price listed on the sign, but it wouldn't be very productive. You could then say, "Okay, should I trust that this product that claims it's organic really is? Do I need to go to the farm and investigate what fertilizers they're using on their crops?" No, of course not. You could go down every rabbit hole and you'd never get anything done. We have to believe that other people are being truthful at least some of the time. So I think the real challenge is setting that balance.
And I think that's why that's the hardest question: how do you set this balance between checking more and asking more questions versus letting it go? I think everybody's going to set that a little bit differently. Some people are going to be really risk averse; they're going to want to make sure every time. If you're the sort of person who, when buying a new computer, spends hours checking out every single feature of all the different options before making a decision, you're probably going to be the sort of person who checks more. If you're the sort of person who says, "Yeah, they're all the same, I'll just buy it," then you're probably going to check less. But the key is thinking about: when are the times when you're at the greatest risk?
When are the times when it would be catastrophic to be deceived? Investments are an obvious case. If you're putting a lot of your money into something and you don't want to lose it, it makes sense to be much more careful. If you're an art collector buying expensive art, it makes sense to be really careful in checking out the provenance of the piece you're trying to buy. The reason is that if somebody were trying to pass off a forgery, it would be worth that scammer's while to spend a lot of time setting up fake information: if they sell one piece of art for a million dollars, they make money even if they had to spend a year or two fabricating the provenance for that painting. All they have to do is sell that one. So think about it from the scammer's perspective: what would they need to do in order to make this worth it to them?
If it's worth it to them, you need to check more carefully. So if a company is trying to raise huge amounts of venture capital, it makes sense for those VCs to investigate very carefully, and most of them do, right? Because you don't want to invest in something that might be a fraud; you could lose a lot of money that way. So thinking about what the risk to me would be if I were wrong, and then how much the scammer would stand to benefit if they were scamming me, those two things together end up helping you come to some sort of balance. I'm not going to worry too much about things that I could afford to lose. I'm going to worry a lot more about things that could affect my future.
Guy Kawasaki: I don't want to burst a bubble of yours regarding venture capitalists, but I think you're giving them way too much credit. The due diligence they do is not that great and it's not that thorough. And I also believe that most venture capitalists, they make a decision about a company in the first minute of the pitch. This is way before some associate is cranking spreadsheets. And my explanation is that the fear of losing somebody else's money is much less than the fear of missing out on the next unicorn.
Dan Simons: Yeah. One of the real challenges is what counts as due diligence. We actually talk in the book about how due diligence is one of those buzzwords that often means nothing at all. So when a company says, "Oh yeah, we're using best practices," ask: what are your best practices? What are they? If somebody says, "Oh yeah, we did our due diligence on this company," you can ask them, "What did you do? What did you actually do?" And if the answer is, "I listened to a one-minute pitch," that's not very compelling. So this is one of those cases where asking another question really helps.
If you trust that somebody is doing their diligence, they should be able to tell you exactly what they did. As a consumer, as somebody who's interacting with them, you want them to show you, not tell you. It's the standard writing principle: show, don't tell. You want them to show you what their diligence was. You want them to show that they're using best practices and to define what those best practices are. And it shouldn't be a huge imposition to ask them.
Guy Kawasaki: So you're saying that "George Shultz and Tom Brady invested" is probably not enough due diligence, huh?
Dan Simons: Probably not. No. We find it compelling, though, because they're familiar people. Yeah. Having generals on a board is not necessarily a great predictor of how good a company it is.
Guy Kawasaki: I hope you enjoyed this episode with Dan Simons. Please tell me that you're always going to be wondering what's missing, and that you will create possibility grids to ensure you think of everything that could be missing. Dan has dedicated his career to uncovering the mysteries of human perception and attention, and we are fortunate that he did. Now, go watch the gorilla videos and see if you notice everything that happens. And order his new book, Nobody's Fool: Why We Get Taken In and What We Can Do About It.
I'm Guy Kawasaki, this is Remarkable People. My thanks to Peg Fitzpatrick, Jeff Sieh, Shannon Hernandez, Alexis Nishimura, Luis "Broken Fin Box" Magana, and the drop-in queen of all of Santa Cruz, Madisun Nuismer. Until next time, paddle out, turn and burn, and catch the wave that makes you remarkable. Mahalo and Aloha.