The Art of Thinking Clearly by Rolf Dobelli


  1. Dobelli lays out some of the most common and disastrous mental biases we are susceptible to. The cognitive biases and errors we make have been made by every generation for hundreds of years. Learning how to spot and eventually mitigate these biases can greatly benefit our lives, relationships and decision making
Key Takeaways
  1. The list of fallacies laid out here is by no means complete
  2. Recognize that many of these fallacies are interconnected and play off each other. Lessening one also often improves others
  3. Some cognitive errors are necessary for living a happy, normal life so we don’t want to remove every fallacy. Removing most, however, helps avoid most large, stupid mistakes – less irrationality
  4. Survivorship bias – so easy to ignore failures and think odds of success much higher than reality. Guard against it by continuously studying ‘graveyards’
  5. Swimmer’s body illusion – The body is a result of selection, not a result of swimming. Don’t fail to recognize the role of selection in results
  6. Clustering illusion – Brain seeks patterns and rules and simply invents them if can’t find any. Don’t fall into trap of seeing patterns when there are none
  7. Social proof – Herd instinct; causes us to believe that the more people are doing something, the better an idea it must be – making it likely we follow suit. The evil behind investment bubbles, cults and more
  8. Sunk cost fallacy – don’t keep on doing something just because you have already sunk a lot of time, money, energy or love into it
  9. Reciprocity – Beware free gifts
  10. Confirmation bias – skewing new information so it fits what we already believe. Must constantly search for disconfirming evidence which is one of the hardest things to do (Darwin the master)
  11. Authority bias – Tend to blindly follow authority figures
  12. Contrast effect – Have difficulty with absolute judgments as we tend to always compare to something else. People awful at noticing small, gradual changes
  13. Availability bias – don’t think that examples that are most likely to come to mind are necessarily correct or most telling. We think dramatically, not quantitatively. People tend to prefer wrong information to no information (map is not the terrain)
  14. It’ll get worse before it gets better fallacy – A form of confirmation bias – upside for consultant either way (right if things stay bad or customer happy if things improve)
  15. Story bias – Stories simplify and distort reality as we build meaning into things only after the fact. Narratives often irrelevant but we find them irresistible. Be very aware of story teller’s intentions and incentives (you are often the story teller)
  16. Hindsight bias – keeping a journal helps keep you honest. All seems clear in retrospect
  17. Overconfidence effect – people are systematically overconfident in forecasts, knowledge, predictions and decisions on a massive scale. Experts suffer even more than laymen
  18. Chauffeur knowledge – do you truly understand something or simply surface? (Planck and chauffeur). True experts delineate their circle of competence and stick in it
  19. Control illusion – thinking we can sway an outcome when we can’t
  20. Incentive super response tendency – beware what you incentivize! Rat breeding example. People respond to incentives themselves and not the grander intentions behind them. Good incentive systems think of both intent and reward
  21. Regression to the mean
  22. Outcome bias – Never judge a decision by its outcome, rather judge the process
  23. Paradox of choice – less is more. Good enough is the new optimal
  24. Liking bias – will help or buy more from people we like, more similar the more we like them
  25. Endowment effect – liking something more merely because we own it
  26. Coincidence – most underestimate the role of chance in our lives
  27. Groupthink – reckless decisions made because social proof gets people to agree when they otherwise would not. In a tight group, speaking your mind even more important
  28. Neglect of probability – react to size or danger of event rather than likelihood of it happening. No intuitive grasp of risk
  29. Scarcity error – people more highly value what is scarce. Assess things only on price and benefits, not on scarcity
  30. Base rate neglect 
  31. Gamblers fallacy – dice do not have memory, play the probability
  32. Anchoring effect
  33. Induction – drawing universal conclusions from individual observations
  34. Loss aversion 
  35. Social loafing – individual effort and accountability decrease as we become one with the crowd. Smaller teams tend to be more effective
  36. Exponential growth – people cannot grasp the power of exponential growth
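The difficulty is easy to demonstrate with a few lines of arithmetic. A minimal sketch (the growth rates and the rule-of-70 shorthand are illustrative examples, not from the book):

```python
# How long until something growing at a fixed percentage rate doubles?
# Intuition says a long time at 5% per year; the "rule of 70" shorthand
# (doubling time ~ 70 / growth rate in percent) says roughly 14 years.
def doubling_time(rate_percent: float) -> int:
    """Return the number of periods until a quantity at least doubles."""
    value, periods = 1.0, 0
    while value < 2.0:
        value *= 1 + rate_percent / 100
        periods += 1
    return periods

print(doubling_time(5))  # 15 periods at 5% growth
print(doubling_time(7))  # 11 periods at 7% growth
```

Linear intuition would predict 20 periods at 5% (since 5% × 20 = 100%); compounding gets there in 15.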
  37. Winner’s curse – highest bidders win but typically pay too much so lose. Competition and ambiguity of true value of things cause this
  38. Fundamental attribution error – Overestimate individual’s influence and underestimate the environment’s
  39. False causality – mistaking correlation, effect or coincidence for causality
  40. Halo effect – a single bright characteristic makes everything else seem better
  41. Alternative paths – all the outcomes that could have happened but didn’t. Don’t contemplate invisible or missing outcomes or info as much as we should
  42. Forecast illusion – people horrible at predictions, even experts
  43. Conjunction fallacy – when we judge a subset as more likely than the entire set it belongs to. We all have soft spots for plausible stories
  44. Framing – information is perceived differently depending on how it is presented
  45. Action bias – people want to look active even if it accomplishes nothing, accentuated in new situations or where you’re unsure
  46. Omission bias – inaction seems more admissible than action even if both lead to the same outcome
  47. Self serving bias – attribute successes to ourselves and failures to bad luck or external factors
  48. Hedonic treadmill – we always recalibrate happiness and sadness to our situation. Avoid negative things you can’t get accustomed to, expect only short term happiness from material things, get as much free time, autonomy and deep relationships as possible
  49. Self selection bias
  50. Association bias – seeing connections where none exist
  51. Persian messenger syndrome
  52. Beginner’s luck – regression to mean always brings you back down. True skill lies in outperformance over long periods of time
  53. Cognitive dissonance
  54. Hyperbolic discounting – desire for immediate gratification causes us to make bad decisions for our long term interests
  55. Because justification – people accept reasons even if they don’t explain everything
  56. Decision fatigue – decide better when you decide less
  57. Contagion bias – things can get a negative connotation simply through association
  58. Problem with averages – often mask underlying distribution. Don’t cross a river which is on average 4 ft deep. Beware things which follow power laws (when extreme outliers dominate like Bill Gates’ wealth)
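The river example can be made concrete in a few lines. The wealth figures below are hypothetical, chosen only to mimic a power-law tail:

```python
# An average can badly mask the underlying distribution: nine people
# with ordinary savings plus one extreme outlier (a Bill-Gates-style
# power-law tail). All figures are made up for illustration.
wealth = [30_000] * 9 + [100_000_000_000]  # hypothetical, in dollars

mean = sum(wealth) / len(wealth)
median = sorted(wealth)[len(wealth) // 2 - 1]  # lower middle of 10 values

print(f"mean:   ${mean:,.0f}")    # $10,000,027,000 — dominated by the outlier
print(f"median: ${median:,.0f}")  # $30,000 — the typical person
```

The mean says everyone in the room is a billionaire; the median tells you what a typical person actually has.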
  59. Motivation crowding – surprisingly small monetary incentives crowd out other incentives (volunteering feels less good if we are compensated). Bonuses help more in jobs where people don’t get intrinsic fulfillment
  60. Twaddle tendency – excessive words hide lazy thinking or poor understanding. Jabber disguises ignorance
  61. Will Rogers phenomenon – accounting type illusions which make situations seem better but actually add no value
  62. Information bias – the delusion that more information helps us make better decisions
  63. Effort justification – overvalue things you put a lot of effort into (Ikea effect)
  64. Law of small numbers – much larger fluctuations with small numbers
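A quick simulation makes the point; the sample sizes and trial counts below are arbitrary choices, not from the book:

```python
# Law of small numbers: proportions computed from small samples
# fluctuate far more than proportions from large samples.
import random

def proportion_heads(n: int, rng: random.Random) -> float:
    """Proportion of heads in n fair coin flips."""
    return sum(rng.random() < 0.5 for _ in range(n)) / n

def spread(xs: list[float]) -> float:
    return max(xs) - min(xs)

rng = random.Random(42)  # fixed seed so the run is reproducible
small = [proportion_heads(10, rng) for _ in range(1000)]
large = [proportion_heads(1000, rng) for _ in range(1000)]

print(spread(small))  # wide: 10-flip samples swing far from 0.5
print(spread(large))  # narrow: 1000-flip samples cluster near 0.5
```

This is why the "best" and "worst" schools, hospitals, or branch offices in a ranking are so often simply the smallest ones.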
  65. Expectations – raise expectations for yourself and those you love, and lower them for things you can’t control
  66. Simple logic – scrutinize even simple sounding problems more closely
  67. Forer effect – why pseudoscience works so well; very general or flattering statements that most people want to associate with
  68. Volunteer’s folly – giving your time is often not the most effective way to volunteer; using your skill to earn money and donating it to a cause, or to those who can perform the needed skill more aptly, is often a better way to give
  69. Affect heuristic – emotional reactions determine risks and benefits, rather than expected value and probabilities. Substituting how we feel rather than what do I think
  70. Introspection illusion – internal reflection is not reliable and makes us overconfident in our beliefs; nothing is more convincing than our own beliefs. Become your own toughest critic
  71. Boat burning effect – remove options in order to go all in. Options and more choices have hidden costs and diminish will power. Invert in order to determine what to avoid
  72. Neomania – new things always seem to shine brighter. Rule of thumb – whatever has survived for X years will survive for another X years
  73. Sleeper effect – forget source of information but remember message and how it made us feel. Don’t accept any unsolicited advice, avoid ads, remember source of all info you get
  74. Alternative blindness – fail to compare your best alternative to next best alternative(s). Consider all alternatives
  75. Social comparison bias – tendency to withhold assistance from people who might outdo you even if you’ll look like a fool in the long run (hire people who are better than you)
  76. Primacy and recency effects 
  77. Not invented here syndrome – tendency to fall in love with our own ideas
  78. Black swan – unthinkable events which affect every aspect of your life; profit from the unthinkable by trying to catch a positive black swan (entrepreneur or inventor or build something which scales) and avoid negative black swans by giving yourself margin in every aspect of your life
  79. Domain dependence – insights do not pass well from one field to another, especially from theoretical to practical
  80. False consensus effect – frequently overestimate the popularity in the general public of things we like
  81. Falsification of history – remove wrong past assumptions so you think you were right all along; adjust past views to present views. Safe to assume half of what you remember is wrong
  82. In group / out group bias – even small similarities can cause in group bias and anyone outside is a potential enemy
  83. Ambiguity aversion – difference between risk and ambiguity is that with ambiguity the probabilities of outcomes are unknown. Can make calculations knowing risk but not with uncertainty
  84. Default tendency – status quo bias, cling to way things are even if not the best option
  85. Fear of regret – those who don’t follow the crowd tend to feel more regret and therefore tend to act more conservatively. Last chances evoke panic
  86. Salience effect – outstanding features get much more attention than they deserve, can lead to prejudice and changes how we interpret the past and how we act, avoid jumping to the easiest conclusions
  87. House money effect – we spend and think about money differently depending on how we got it
  88. Procrastination – Self control drains will power – eliminate distractions, set self imposed deadlines for yourself, refuel your batteries
  89. Envy – most destructive sin as it is no fun in any way, different from jealousy as jealousy requires at least 3 people, tend to envy people similar to us, stop comparing self to others, determine circle of competence and work on mastery, be only envious of the person you want to become
  90. Personification – we empathize with other people but less so if we can’t see them or don’t know them, statistics don’t stir us but people do
  91. Illusion of attention – tend to only see what we focus on and miss everything else, think the unthinkable, try to spot the Black swans, Pay attention to silences as much as noises
  92. Strategic misrepresentation – exaggerate self or promises in order to achieve some goal, look at past performance and do a cost/benefit analysis to protect self from this
  93. Overthinking – paralysis by analysis; use your emotions and intuition strategically with simple matters or areas you are highly skilled in, but use your reasoning for more complex matters
  94. Planning fallacy – people take on too much, and it is even worse in groups; we are not natural planners and underestimate the role of outside events. Use pre-mortems
  95. Man with a hammer syndrome – locate shortcomings and try to add tools (mental models) to aid you in your life, thinking and decision making
  96. Zeigarnik effect – seldom forget uncompleted tasks but immediately forget what we’ve finished; outstanding tasks gnaw at us until we have a clear and detailed view of how we will accomplish them, so create detailed, step by step instructions to complete tasks
  97. Illusion of skill – luck plays a bigger role than skill; in some areas skill plays almost no role
  98. Feature-positive effect – missing information is much harder to appreciate than what is present; we have problems perceiving non-events and the absence of things
  99. Cherry picking – selecting and showcasing only the best characteristics and hiding or not mentioning the rest. Always ask about the failures and try to notice what is missing or not mentioned
  100. Fallacy of the single cause – no single factor causes any event but people want a single cause to explain an event
  101. Intention to treat error – failed events show up (unlike in survivorship bias) but in the wrong category. Always try to determine if failed events are not included in the study
  102. News – makes people well informed but ignorant, harmful in the long run
  103. Negative knowledge or knowing what to avoid is much more important than positive knowledge or knowing what to do – via negativa. Eliminate the downside and the upside will take care of itself 
  104. Today’s world, unlike our ancestors’, rewards deep thinking and independent action. Due to our biology, this is very difficult 
What I got out of it
  1. A really complete and informative book which details some of the most common heuristics and mental biases which lead us to poor decisions or faulty thinking