35 lessons from Kahneman’s Thinking, Fast and Slow that every risk manager must know


Every year I update my must-read list for risk and insurance managers, and Kahneman’s Thinking, Fast and Slow has been on it since the very beginning. A risk manager on my team just finished reading it and used RAW@AI to summarise the key points for risk professionals. Enjoy!

  1. The two systems – There are two ways of thinking: System 1 and System 2. System 1 is fast, automatic, and works without much effort. It’s the part of our brain that helps us make quick decisions. System 2 is slower, more deliberate, and takes more effort. It helps us when we need to think carefully, especially in difficult situations. These two systems work together, with System 1 handling easy tasks and System 2 stepping in when things get more complicated.
  2. Attention and effort – System 2 needs focus and energy to work. When we concentrate on something, like reading or learning a new skill, System 2 is in charge. However, because it uses a lot of energy, we often rely on System 1 to handle most tasks. System 2 kicks in when something requires careful thought or when System 1 makes a mistake.
  3. The lazy controller – System 2 is often lazy and lets System 1 take the lead, which can lead to errors. For example, when we make quick decisions without much thought, it’s usually System 1 at work. System 2 should check and correct System 1, but because it’s lazy, it doesn’t always do so. This can cause us to rely on shortcuts that might not be accurate.
  4. The associative machine – System 1 is constantly making connections between ideas, even when they aren’t really related. For instance, seeing the word “banana” might make you think of “yellow.” System 1 is always trying to make sense of things by linking ideas into patterns and coherent stories, which can shape our thoughts and decisions without us realizing it and sometimes lead to wrong conclusions.
  5. Cognitive ease – When things feel easy and familiar, System 1 works smoothly, giving us confidence in our decisions. But this ease can make us overlook important details.
  6. Norms, surprises, and causes – System 1 expects the world to follow patterns. When something surprising happens, it quickly tries to find a cause. This quick thinking can help us make sense of the world, but it can also lead us to make assumptions that aren’t always correct.
  7. A machine for jumping to conclusions – System 1 often jumps to conclusions with limited information. System 2 is supposed to check these conclusions, but it doesn’t always do so. This can lead to quick decisions that might not be well thought out.
  8. How judgments happen – This chapter explains how System 1 forms judgments quickly based on what’s easy to see or remember. System 2 should review these judgments, but it often doesn’t, especially if it’s tired or distracted. This can result in decisions that aren’t accurate.
  9. Answering an easier question – When faced with a difficult question, System 1 often replaces it with an easier one, leading us to give an answer that feels right but doesn’t actually address the real question. This is why we sometimes answer questions about complex issues with simple, unrelated answers.
  10. The law of small numbers – System 1 tends to draw conclusions from small amounts of information, which can lead to errors. Small samples produce extreme results far more often than large ones, so a few striking examples can make something look common, or a pattern look real, when it isn’t. System 2 should help by seeking more information, but it often relies on what System 1 provides. A small simulation after this list illustrates the effect.
  11. Anchors – The first piece of information we receive, called an “anchor,” strongly influences our decisions. Even if the anchor is irrelevant, it can affect how we think about a situation. This is why first impressions or initial numbers can have a big impact on our choices.
  12. The science of availability – This chapter explains how System 1 relies on information that is easy to remember. This can make us think that something is more common or important than it actually is. System 2 should help by considering all the information, but it often goes along with what System 1 suggests.
  13. Availability, emotion, and risk – Kahneman talks about how emotions influence System 1’s decisions, especially when it comes to risks. For example, if something bad happened recently, we might overestimate the chances of it happening again. This emotional influence can lead us to misjudge risks.
  14. Tom W’s specialty – Here Kahneman shows how System 1 quickly stereotypes or categorizes people and things, judging by resemblance rather than by how common each category actually is. This can lead to judgments based on incomplete or misleading information. System 2 should help by thinking more carefully, but it often doesn’t, leading to mistakes.
  15. Linda: less is more – Using the famous “Linda problem,” Kahneman shows how System 1 judges by how plausible a story sounds rather than by the rules of probability. A detailed, coherent scenario can feel more likely than a broader one even though, logically, it can never be (the conjunction fallacy, illustrated in a short example after this list). Relying on intuition rather than careful thinking makes this error easy to commit.
  16. Causes trump statistics – System 1 is drawn to stories about causes and effects, even when they aren’t supported by statistics. For example, we might believe a single dramatic story more than a set of boring statistics. This can lead us to make decisions based on compelling stories rather than actual data.
  17. Regression to the mean – Extreme events are often followed by more typical ones. For example, a sports team that has an exceptionally good season is likely to have a more average season the next year, simply because the luck that contributed to the extreme result does not repeat. System 1 struggles with this concept and invents causal explanations for what is really just statistics; a small simulation after this list shows the effect.
  18. Taming intuitive predictions – System 1 makes quick predictions that can be inaccurate because they’re based on limited and unrepresentative information. System 2 can improve them by starting from the baseline (the average outcome for similar cases) and adjusting toward the intuitive estimate only as far as the evidence justifies, but this takes effort and isn’t always done. A sketch of this correction appears after the list.
  19. The illusion of understanding – This chapter explains how System 1 often believes it understands things better than it actually does. This can lead to overconfidence and errors in judgment. System 2 should help by questioning these assumptions, but it often doesn’t, leading to mistakes.
  20. The illusion of validity – System 1 sticks with its first judgments, even when evidence suggests they’re wrong. This creates a false sense of certainty, leading us to trust our first impressions more than we should.
  21. Intuitions vs. formulas – Kahneman shows that simple formulas and statistical rules often make better predictions than expert intuition. Even experts make mistakes when they don’t use systematic approaches. System 2 should apply these rules consistently, but it often defers to intuition instead.
  22. Expert intuition: when can we trust it? – Expert intuition can be reliable when it is built on long experience in a stable, regular environment with quick feedback. In unpredictable situations, even experts make mistakes. System 2 should help by questioning intuitive judgments, but it might not always do so.
  23. The outside view – Taking a broader perspective, known as the “outside view,” helps prevent biases and errors that come from focusing only on the immediate situation. By looking at similar situations and their outcomes, we can make better decisions.
  24. The engine of capitalism – Optimism and overconfidence drive entrepreneurial ventures, fueling innovation and growth. However, they also increase the risk of failure. It’s important to balance enthusiasm with a realistic assessment of risks to avoid costly mistakes.
  25. Bernoulli’s errors – Traditional economic theories assume people make rational decisions to maximize happiness, but in reality, decisions are often influenced by irrational factors. For example, people tend to fear losses more than they value gains, leading to decisions that aren’t always logical.
  26. Prospect theory – Kahneman introduces prospect theory, which explains how people make decisions based on perceived gains and losses relative to a reference point rather than on final outcomes. System 1’s influence can lead to irrational choices, such as fearing a loss far more than valuing an equivalent gain; a small sketch of the prospect-theory value function follows this list.
  27. The endowment effect – People tend to overvalue what they already own, making them reluctant to trade or sell, even when it would be beneficial. This is known as the endowment effect, where ownership increases the perceived value of an item.
  28. Bad events – People are more sensitive to potential losses than gains. This can lead to risk-averse behavior, where we avoid taking risks even when they might lead to better outcomes. The fear of losing often outweighs the hope of gaining.
  29. The fourfold pattern – Attitudes toward risk change depending on whether people are dealing with gains or losses and whether the probabilities are high or low. This can lead to inconsistent behaviors, as people might be risk-averse in some situations but risk-seeking in others.
  30. Rare events – This chapter discusses how System 1 tends to overestimate the likelihood of rare events, especially when they are dramatic or emotional. This can lead to disproportionate responses to low-probability risks, such as being overly afraid of unlikely dangers.
  31. Risk policies – In this chapter, Kahneman emphasizes the importance of having clear policies to guide decisions about risk. Instead of relying on System 1’s quick instincts, which can be influenced by emotions or recent experiences, policies help ensure that decisions are consistent and rational across many cases. System 2 should enforce these policies, even though it is tempted to go along with System 1’s faster, easier choices.
  32. Keeping score – Kahneman explains how people tend to keep mental “scores” of their decisions, focusing on short-term gains and losses. This scorekeeping is influenced by System 1, which makes us more concerned with immediate results than long-term outcomes. This can lead to decisions that feel good in the moment but aren’t the best in the long run. System 2 should help by looking at the bigger picture, but it often gets caught up in System 1’s focus on short-term wins.
  33. Reversals – How a problem is framed can change whether we are risk-averse or risk-seeking. For example, people might choose differently if a situation is presented as a loss rather than a gain. Re-examining the framing helps avoid being misled.
  34. Frames and reality – Kahneman continues the discussion of framing, explaining that System 1 is very sensitive to how choices are presented, or “framed.” This can lead to decisions based more on how the options are described than on the actual facts. System 2 can help by stepping back and considering the objective reality, but it often lets System 1 take control, which can lead to biased decisions.
  35. Two selves – In this chapter, Kahneman introduces the idea of the “experiencing self” and the “remembering self.” The experiencing self is how we feel in the moment, while the remembering self is how we look back on those moments later. System 1 influences both selves, but they often lead to different decisions. For example, we might make choices that don’t make us happy in the moment but that we think will be memorable later.
  36. Life as a story – Kahneman explains how the remembering self often shapes our decisions by focusing on how we expect to remember events, rather than on how we experience them at the time. This can lead to choices that prioritize creating memorable stories over actually enjoying life as it happens. System 1 is driven by these stories, while System 2 should help by considering our overall happiness, but it doesn’t always succeed.
  37. Experienced well-being – There is often a gap between how we feel in the moment and how we remember those moments later. This difference can lead us to make decisions that don’t necessarily improve our overall happiness. Understanding this gap can help us make choices that lead to greater long-term well-being.
  38. Thinking about life – Our satisfaction with life is influenced by both quick judgments and reflective thinking. Recognizing the roles of both System 1 and System 2 can help us make better decisions that lead to lasting happiness.
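
A few of the lessons above are quantitative enough to show in code. First, the law of small numbers (lesson 10): a minimal simulation, with purely illustrative numbers rather than anything from the book, of the kind of two-hospital comparison Kahneman uses. Two hospitals face the same 50/50 chance of a boy at every birth, but the smaller one records far more “extreme” days.

```python
import random

# Law of small numbers: same underlying 50/50 process, very different frequency
# of "extreme" days depending on sample size. Illustrative numbers only.
random.seed(1)

def extreme_days(births_per_day, days=10_000, threshold=0.60):
    """Count days on which more than `threshold` of the babies born are boys."""
    count = 0
    for _ in range(days):
        boys = sum(random.random() < 0.5 for _ in range(births_per_day))
        if boys / births_per_day > threshold:
            count += 1
    return count

print("Small hospital (15 births/day):", extreme_days(15))
print("Large hospital (45 births/day):", extreme_days(45))
# The small hospital typically shows several times more >60%-boys days,
# even though nothing about the underlying process differs.
```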
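For lesson 15, the logic behind “less is more” is the conjunction rule: the probability of two things both being true can never exceed the probability of either one alone. The probabilities below are invented purely for illustration.

```python
# Conjunction rule: P(A and B) <= P(A), always. The Linda problem asks people to
# compare exactly these two probabilities, and the more coherent story usually wins.
p_bank_teller = 0.05                 # illustrative probability that Linda is a bank teller
p_feminist_given_teller = 0.40       # illustrative conditional probability
p_teller_and_feminist = p_bank_teller * p_feminist_given_teller  # 0.02

assert p_teller_and_feminist <= p_bank_teller
print(p_teller_and_feminist, "<=", p_bank_teller)
```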
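For lesson 17, a minimal simulation of regression to the mean, again with made-up numbers: performance is stable skill plus luck that does not carry over, so the top performers of one period look much more ordinary in the next without any causal story.

```python
import random

# Regression to the mean: score = stable skill + luck drawn fresh each season.
random.seed(42)
n = 1_000
skill = [random.gauss(0, 1) for _ in range(n)]
season1 = [s + random.gauss(0, 1) for s in skill]
season2 = [s + random.gauss(0, 1) for s in skill]

# Pick the top 10% of season-1 performers and compare their averages across seasons.
top = sorted(range(n), key=lambda i: season1[i], reverse=True)[: n // 10]
avg1 = sum(season1[i] for i in top) / len(top)
avg2 = sum(season2[i] for i in top) / len(top)
print(f"Top decile: season 1 average {avg1:.2f}, season 2 average {avg2:.2f}")
# Season 2 falls back toward the overall mean of 0 because the lucky draw does not repeat.
```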
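For lesson 18, the corrective procedure amounts to anchoring on the baseline and moving toward the intuitive estimate only in proportion to how predictive the evidence really is. The project-overrun numbers below are hypothetical.

```python
def tamed_prediction(baseline, intuitive, correlation):
    """Start at the baseline and move toward the intuitive estimate in proportion
    to the correlation between the evidence and the outcome."""
    return baseline + correlation * (intuitive - baseline)

# Hypothetical example: projects of this type overrun their budgets by 20% on average;
# an impressive project team makes your gut say 5%; team quality correlates only ~0.3
# with the eventual overrun.
print(tamed_prediction(baseline=20.0, intuitive=5.0, correlation=0.3))  # 15.5%
```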
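For lesson 26, a minimal sketch of the prospect-theory value function, using the median parameter estimates commonly cited from Tversky and Kahneman’s later work (diminishing sensitivity of about 0.88, loss aversion of about 2.25); treat the exact numbers as illustrative.

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of a gain or loss x measured from the reference point."""
    if x >= 0:
        return x ** alpha          # diminishing sensitivity to gains
    return -lam * (-x) ** beta     # losses loom larger than gains

print(round(prospect_value(1_000)))    # ~ 437
print(round(prospect_value(-1_000)))   # ~ -983: the loss hurts more than twice as much
```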
