Poker is a game of decision-making under uncertainty over time. Unlike chess, poker involves luck and hidden information. Poker is like life, and decisions are like bets, so thinking in bets can help you make better decisions in life.
Buy Thinking in Bets at: Amazon | Kobo (affiliate links)
Key Takeaways from Thinking in Bets
- A bet is a decision about an uncertain future. If you think in bets, you will make more objective and accurate decisions. When you have something at stake, you have a greater incentive to be thoughtful about your bet.
- All decisions are bets. Usually we are not betting against another person but against ourselves. We are betting that the decision we take is the best one, bearing in mind opportunity costs.
- Treating a decision as a bet makes explicit the risk that was already inherent in the decision. It makes us take a closer look at things. A bet makes you accountable to accuracy.
- Treating a decision as a bet will make us more objective. Because that’s how you win bets.
- Thinking in bets is hard, at least initially. But it gets easier over time as it becomes a habit.
- The outcome of a decision is the result of both skill and luck.
- Duke uses the term “fielding outcomes” to mean deciding whether an outcome was mainly due to skill or luck. Fielding outcomes is important. If we correctly determine an outcome was mainly due to skill, we can feed that information back to update our beliefs, creating a learning loop.
- Fielding outcomes can be challenging because an outcome may have multiple contributing causes and our biases can get in the way. Most people have a self-serving bias. We tend to view our decisions with good outcomes as being mostly due to skill and decisions with bad outcomes as being mostly due to luck. To get better at fielding outcomes, treat it as a bet.
- We can improve our skill by seeing the world more objectively. This involves being a better belief calibrator and recognising (and developing strategies to bypass) our cognitive biases.
- Luck is always a factor, so we can only improve our odds. We cannot guarantee good outcomes.
Detailed Summary of Thinking in Bets
Biases and other bad thinking patterns
We default to believing things we read or hear
When we hear something we tend to believe it is true without vetting it [unless it is inconsistent with other existing beliefs]. The vetting only comes later if we have the time or inclination.
Daniel Gilbert, a Harvard psychology professor, published a 1991 paper that summarised centuries of study on the subject. He concluded that “People are credulous creatures who find it very easy to believe and very difficult to doubt”.
Two years later, Gilbert and others ran experiments showing that our default is to believe what we hear and read. The researchers showed subjects a set of statements colour-coded as either true or false. Under pressure, subjects made errors recalling whether a statement was true or false, and those errors skewed towards belief: they were much more likely to misremember a statement labelled false as true than the reverse.
Apparently the reason is evolutionary: we didn't develop much scepticism because our beliefs were originally only about things we directly experienced. Later on, as complex language evolved, we became able to form abstract beliefs (i.e. beliefs about things outside our direct experience). But by then we had already developed a default of believing what we saw or heard.
We’re bad at updating our beliefs
This is because of:
- Motivated reasoning and confirmation bias;
- Self-serving bias; and
- Hindsight bias.
Thinking in bets can help counter these biases. If someone challenges us to a bet, that is a signal that our belief is inaccurate. This should trigger us to vet that belief and look at it more objectively.
Motivated reasoning and confirmation bias
Instead of altering our beliefs to fit new information, we tend to interpret information to fit our beliefs. Information that disagrees with us assaults our self-narrative, so we work hard to explain it away, whereas we happily embrace information that agrees with us. This is called motivated reasoning.
Hastorf and Cantril’s 1954 paper “They Saw a Game” studied how Dartmouth and Princeton students viewed a controversial Dartmouth/Princeton game. It found significant differences in the numbers of penalties each side perceived.
Dan Kahan in 2012 published a paper, “They Saw a Protest”, in which subjects watched a video of a protest. The researchers told one group it was an anti-abortion protest; the other group was told it was a protest against the ban on gays and lesbians in the military. Again, how people saw the protestors’ actions varied depending on whether they agreed with the reasons for the protest.
Smarter and more creative people are better at motivated reasoning. They can come up with more ways to explain information away.
- In a 2012 study West, Meserve and Stanovich looked at the blind-spot bias where people recognise biased reasoning in others more easily than in themselves. They found that the blind-spot bias was stronger in smarter people. Others have replicated this result.
- Dan Kahan also did a study asking subjects to analyse complex data. When the data was associated with something neutral (e.g. a skin treatment) subjects with better math skills did better at interpreting the data. But when the same data was associated with something controversial (e.g. gun violence), the subjects with better math skills interpreted the data to support their existing beliefs on gun control. And they made more mistakes in doing so (compared to less-skilled subjects with the same views).
Duke says that fake news and disinformation work because people who already hold beliefs consistent with the news won’t question it. Fake news isn’t meant to change minds, but to amplify and entrench existing beliefs.
Self-serving bias
Fritz Heider, an Austrian psychologist, discovered this bias: we take credit for good outcomes and blame bad luck for bad outcomes. (Similarly, it is natural for us to dismiss others’ successes as luck and to blame others’ bad outcomes on their decisions.)
Duke explains that self-serving bias makes it harder for us to learn from our experiences because it causes us to miss opportunities to examine bad decisions. If a decision results in a good outcome, we field it as a good decision, so no learning seems required. And if it results in a bad outcome, we view it as having been outside our control. So again, no learning.
In a footnote, Duke notes that some people, particularly women, have the opposite of self-serving bias. That is, they blame themselves for bad outcomes and chalk good outcomes up to luck. Duke says this still inhibits learning because it’s also inaccurate. [In my view, however, this anti-self-serving bias (“self-sabotaging bias”?) isn’t quite as bad for learning, because if you incorrectly chalk up a bad outcome to a bad decision, you still might try to learn from it.]
For example, Robert MacCoun found that in multiple-vehicle accidents, 91% of drivers blamed someone else. Even in single-vehicle accidents, 37% of drivers blamed someone else.
Hindsight bias and “resulting”
Hindsight bias makes us see an outcome as inevitable, even though in reality a number of other outcomes could have happened. Once something has happened, we no longer think of it as ever having been probabilistic. Duke frequently uses the example of coach Pete Carroll’s decision in the 2015 Super Bowl to call for the quarterback to pass, instead of a handoff to the running back.
Related to hindsight bias, “resulting” is a poker term for judging a decision by its outcome. But good decisions can lead to bad outcomes and bad decisions can lead to good outcomes. Resulting comes naturally to us because we evolved to see the world that way: it was natural for us to try to create order and make sense out of chaos, and we are uncomfortable with how big a role luck plays in our lives. Duke argues that resulting is bad because it inhibits our ability to learn from our decisions.
Making better decisions
If we want to make better decisions, we need to:
- stop bad habits like resulting or motivated reasoning;
- embrace not being sure about things and think probabilistically;
- get a truth-seeking social group that holds you accountable; and
- recruit your past and future selves.
But to do all that, you need to work with your brain, not against it.
Duke explains that the prefrontal cortex is a pretty thin layer on top of our big animal brain. It’s already overtaxed so we can’t just make better decisions by trying to do more in the deliberative prefrontal cortex. This is especially the case for poker hands, which are played so quickly. We need to work within the limits of our existing brains. We want to try to get our reflexive minds to execute on what our deliberative minds want. (Reflexive and deliberative minds are labels for what Daniel Kahneman describes as System 1 and System 2 thinking.)
For example, our brain is built to seek positive self-image updates, especially when compared to others. We like to feel good about ourselves. Instead of getting that feel-good feeling from a self-serving bias, we can work to get it elsewhere. We can get it from being a good learner, mistake-admitter, credit-giver, outcome-fielder, and decision-maker. Similarly, instead of feeling bad about admitting a mistake, you can feel bad about missing an opportunity to learn. You can also mitigate the embarrassment of having to admit you were “wrong” by assigning a confidence rating to statements.
Embrace not being sure
Embrace not being sure about things and avoid black-and-white thinking. The world is uncertain and unpredictable (objective ignorance is high). Rather than trying to be sure about things, try to figure out how unsure you are and make your best guess. Often your best bet still won’t have a high chance of success.
When expressing beliefs, express your confidence on a scale from 0 to 10:
- This makes you less likely to engage in motivated reasoning and more likely to update your beliefs objectively as new facts emerge. Psychologically, it is harder to admit you were “wrong” than to revise a confidence measure up or down.
- Duke argues that expressing less than 100% confidence also makes you more credible.
- It invites people to collaborate with you and share information that may update your belief. People can offer up contradictory information without feeling like they’re trying to argue with you or prove you wrong.
When we assign a probability to something occurring, we are much less likely to be “proven” wrong (unless we assigned a probability of 100% or 0%). Even if we say there’s only a 5% chance of something happening, and that thing happens, we could’ve still been right. It could’ve been the 5%.
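To make that point concrete, here is a minimal sketch (my own illustration, not from the book) of why a single occurrence can’t refute a 5% forecast. Calibration is only testable in aggregate: across many “5% chance” forecasts, the event should occur roughly one time in twenty.

```python
import random

# Toy illustration (not from the book): a "5% chance" forecast is only
# meaningfully wrong if, across many such forecasts, the event happens
# much more or less often than 5% of the time.
random.seed(42)

N = 10_000   # number of independent "5% chance" forecasts
P = 0.05     # stated probability of the event for each forecast

hits = sum(random.random() < P for _ in range(N))

print(f"Event occurred in {hits} of {N} forecasts ({hits / N:.1%}).")
# Any single occurrence proves nothing; a well-calibrated forecaster's
# observed frequency will simply hover near 5% in the long run.
```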
When Duke consults with enterprises, people often resist having to assign a probability to future events because they feel like they can’t be certain. Duke says that’s the point. Our predictions don’t have to be perfect, but we should acknowledge that we’re making a prediction by making it explicit. [I think people just don’t want to be held accountable, particularly if it’s not a psychologically safe workplace.] See also below under “Mapping the future”.
Get a truth-seeking social group
There are many benefits to enlisting other people to help with “truth-seeking” and accuracy:
- other people can naturally spot your errors better than you can and see your blind spots;
- the group can avoid resulting because you can seek advice on a decision without telling them the outcome;
- others can offer a different perspective (especially if your group is diverse);
- as humans, we crave social approval. By being accountable to a group that rewards objective thinking and shuns things like motivated reasoning, you can leverage that desire for social approval into making better decisions.
Duke thinks that the truth-seeking group’s charter should be communicated unambiguously to its members. A suggested charter includes (I’ve paraphrased rather than used the terms Duke uses):
- Full and frank disclosure. To review a decision properly, the group needs access to all the relevant information. Err on the side of disclosure. If you want to leave out a detail because it makes you uncomfortable or will require more clarification to explain away, that’s a good sign you should disclose it. Top poker players will describe an enormous amount of detail when they workshop a hand with another player.
- Assess the claim or idea impersonally. Don’t give more credence to a claim or idea because it came from someone you think is reputable. And don’t dismiss it simply because it came from someone with a bad track record. No one has only good ideas or bad ideas.
- Reduce conflicts of interest and bias. Don’t tell the group the outcome of a decision until after they’ve analysed it. Try to be matter-of-fact and not reveal your opinions or beliefs when describing the situation.
- Be sceptical. Ask why things might not be true rather than why they are true. Instead of telling someone they’re wrong, ask them how sure they are, or if they’ve considered another viewpoint.
Occasionally you may need to blow off steam or celebrate a win. It is okay to take a limited agreed “break” from the normal rules of the social group for those situations.
Recruit past and future selves
- Recruiting your past self means thinking back to previous similar experiences and learning from them.
- Recruiting your future self means thinking about how your decisions now will affect your “future self”. That is, think ahead to the consequences of your actions. For example, Duke had a loss limit in poker which she had agreed with her “truth-seeking group”. If she played beyond that limit, she’d imagine having to explain her decision to that group which made her “regret” the decision in advance and not play beyond the limit.
Duke says that we engage the same neural network when we think about the future as when we remember the past. We just put memories together in a creative way to imagine how things will turn out. Those brain pathways include the hippocampus (a key structure for memory) and the prefrontal cortex.
10-10-10 tool
One approach suggested is Suzy Welch’s 10-10-10 tool. That involves asking: what are the consequences of my decision in 10 minutes, 10 months, and 10 years? Duke also suggests using the 10-10-10 tool backwards, asking: how would I feel if I had made this decision 10 minutes ago, 10 months ago and 10 years ago? How did those decisions generally work out?
An advantage of the 10-10-10 tool is that it forces you to look at the present moment in perspective. It discourages you from blowing the present moment out of proportion and getting tilted. When we do that, our decision-making becomes reactive and we focus on quick fixes like offloading negative emotions and self-serving bias.
Duke uses the rather effective comparison to a stock ticker. When you zoom in on a given day, the movements look large and volatile. Our happiness depends not so much on how we’re doing in absolute terms, but how we’re doing relative to how we were immediately before (e.g. if you win $100 then lose $100 you’re sadder than the other way around). Zooming out is a more objective way to view your happiness.
Pre-commitments and Ulysses contracts
Duke points out that pre-commitments don’t necessarily have to stop you from doing something to be effective. It can be enough that they interrupt you and force you to slow down and think about the decision more rationally.
Scenario planning
If we’re placing a bet on the future, we should think in detail about what the possible futures might look like before placing the bet. Start with all the possibilities, and add contingent possibilities like in a decision tree.
When we look into the future, it can be hard to see possibilities past the immediate next few steps. Duke recommends working back from the end, by backcasting and doing pre-mortems:
- Backcasting is when you imagine you’ve achieved your goal and think back about what needed to happen for you to get there. Backcasting helps identify any low-probability events that must occur to reach the goal. You might then develop strategies to maximise the chances of those events occurring.
- A pre-mortem is when you imagine you’ve failed to reach your goal and think back to how it might have gone wrong. Gabriele Oettingen, a psychology professor, has conducted studies finding that people who imagine obstacles and failures are more likely to achieve their goals. By making you anticipate potential obstacles, pre-mortems improve your chances of success.
Then assign probabilities. The probabilities of all the positive futures (from backcasting) and negative futures (from the pre-mortems) have to add up to 100%.
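A minimal sketch of that bookkeeping is below. The scenarios and numbers are invented for illustration; Duke doesn’t prescribe any particular tooling.

```python
# Hypothetical scenario ledger: positive futures from backcasting plus
# negative futures from the pre-mortem, with probabilities that must
# sum to 1.0 (i.e. 100%).
scenarios = {
    "goal reached ahead of schedule": 0.10,  # backcasting
    "goal reached as planned":        0.45,  # backcasting
    "delayed by a funding shortfall": 0.25,  # pre-mortem
    "effort fails outright":          0.20,  # pre-mortem
}

total = sum(scenarios.values())
assert abs(total - 1.0) < 1e-9, f"Probabilities sum to {total:.2f}, not 1.0"

# Print the scenarios from most to least likely.
for outcome, prob in sorted(scenarios.items(), key=lambda kv: -kv[1]):
    print(f"{prob:>4.0%}  {outcome}")
```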
Advantages of scenario planning include:
- reminding us that the future is inherently uncertain and making that uncertainty explicit;
- better preparing us for how to respond to the different possible outcomes;
- protecting us against hindsight bias. We are less likely to feel unproductive regret (or euphoria) about an outcome that could’ve turned out in any number of ways but happened to turn out bad (or good).
Other Interesting Things
- A hand of poker takes about 2 minutes, and involves up to around 20 decisions. Each hand ends with a concrete result. But the result can often be due to luck, so it’s hard to use the immediate feedback from a poker hand for learning.
- If a player takes too long in poker someone can “call the clock” on them which gives them 70 seconds to make up their mind.
- A great poker player with a good-size advantage over the rest of the table, making significantly better strategic decisions, will still be losing over 40% of the time. That’s a whole lot of wrong. (A toy simulation after this list shows how a real edge can coexist with that loss rate.)
- To play poker for a living, you have to put in the hours. And the best games are at night so you’re “working” the graveyard shift.
- In poker, players spend most of their time watching others play. A good player only plays about 20% of the hands they are dealt, forfeiting the other 80%.
- If you walked into a poker room and threw around words like “always” and “never”, lots of people would challenge you to bets, because it’s easy to win a bet against someone who takes an extreme position.
- Duke refers to the System 1 and System 2 thinking described by Daniel Kahneman in Thinking, Fast and Slow, but favours the terms “reflexive mind” and “deliberative mind” used by Gary Marcus. Duke says that the differences between the systems are more than just labels: automatic processing happens in the evolutionarily older parts of the brain (the cerebellum, basal ganglia, and amygdala), whereas deliberative thinking happens in the prefrontal cortex.
- John von Neumann, a poker player, made immense contributions to the science of decision-making, yet that was just a footnote in his life. He also made huge contributions to mathematics, physics, computer science and other fields. He founded the field of game theory, which was modelled on a stripped-down version of poker. Von Neumann also invented the concept of mutually assured destruction.
- Two biologists compiled a list of all 187 species of mammals that had been declared extinct in the last 500 years. More than a third of those species have been rediscovered. This is according to The Half-Life of Facts by Samuel Arbesman.
- The US State Department has a formal Dissent Channel where employees can have their dissenting views heard without penalty. The Dissent Channel has been credited with a policy change that helped end the genocidal war in Bosnia. In 2017, approximately one thousand employees signed a dissent cable in response to Trump’s executive order suspending immigration from seven Muslim-majority countries.
- Cass Sunstein conducted a study on ideological diversity in federal judicial panels, based on three-judge Court of Appeals panels that are randomly formed from a larger pool of judges. Sunstein found that when Democratic-appointed judges sat with Republican-appointed ones (and vice versa), they expressed more moderate views, whereas an all-Democratic or all-Republican panel was much more likely to vote in a direction consistent with its politics.
- Before 2005, it was an informal badge of honour for judges (particularly conservative ones) to hire clerks with ideological backgrounds that differed from theirs. This practice has mostly stopped at the Supreme Court.
- Sociologists are overwhelmingly left-leaning. Surveys estimate 85-96% of members as left of centre or having voted for Obama. The remaining ones identified as either centrist or moderate rather than conservative.
- One study by Anna Dreber found that a betting market where scientists bet on the likelihood of some results replicating (71% correct) was more accurate than traditional peer review (58% correct). The scientists were the same scientists – the difference was they had money on the line.
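On the “losing over 40% of the time” point above, here is a toy simulation of my own (the edge and variance figures are my assumptions; Duke gives no exact numbers). It shows how a genuine per-hand edge can still leave a player in the red well over 40% of the time, because short-run variance swamps the edge.

```python
import random

# Toy simulation (my numbers, not Duke's): a player with a genuine
# per-hand edge still finishes many sessions losing money, because
# short-run variance swamps the edge.
random.seed(7)

EDGE = 0.05             # assumed average profit per hand, in big blinds
STDEV = 5.0             # assumed per-hand standard deviation, in big blinds
HANDS_PER_SESSION = 200
SESSIONS = 10_000

losing = 0
for _ in range(SESSIONS):
    profit = sum(random.gauss(EDGE, STDEV) for _ in range(HANDS_PER_SESSION))
    if profit < 0:
        losing += 1

print(f"Losing sessions: {losing / SESSIONS:.1%}")
# With these made-up parameters the edge is real (about +10 big blinds
# expected per session), yet roughly 44% of sessions lose money,
# consistent with Duke's "over 40%" remark.
```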
My Thoughts
Thinking in Bets was a decent read. I enjoyed the first 10% quite a lot but it got a bit repetitive after that. Overall I feel about 30% of the book could have been cut. The feeling of repetition was perhaps exacerbated as I had read Noise just before, which discussed some of the same or similar ideas about making better decisions.
There were some amusing or interesting anecdotes and examples throughout, usually related to poker, which I often enjoyed. And I did like Duke’s writing style.
Buy Thinking in Bets at: Amazon | Kobo <– These are affiliate links, which means I may earn a small commission if you make a purchase. I’d be grateful if you considered supporting the site in this way! 🙂
If you enjoyed this summary of Thinking in Bets, you may also like: