Thinking, Fast and Slow: Daniel Kahneman
£10.99 (RRP £12.99, save 15%)
One of the most influential books of the 21st century: the ground-breaking psychology classic – over 10 million copies sold – that changed the way we think about thinking
‘There have been many good books on human rationality and irrationality, but only one masterpiece. That masterpiece is Thinking, Fast and Slow’ Financial Times
‘A lifetime’s worth of wisdom’ Steven D. Levitt, co-author of Freakonomics
Why do we make the decisions we do? Nobel Prize winner Daniel Kahneman, ‘the world’s most influential living psychologist’ (Steven Pinker), revolutionised our understanding of human behaviour with Thinking, Fast and Slow. Distilling his life’s work, Kahneman shows how there are two ways we make choices: fast, intuitive thinking, and slow, rational thinking. He reveals how our minds are tripped up by error, bias and prejudice (even when we think we are being logical) and gives practical techniques that enable us all to improve our decision-making. This profound exploration of the marvels and limitations of the human mind has had a lasting impact on how we see ourselves.
‘The godfather of behavioural science … his steely analysis of the human mind and its many flaws remains perhaps the most useful guide to remaining sane and steady’ Sunday Times
Additional information
Publisher | Penguin, 1st edition (10 May 2012) |
---|---|
Language | English |
Paperback | 512 pages |
ISBN-10 | 0141033576 |
ISBN-13 | 978-0141033570 |
Dimensions | 19.8 x 12.9 x 3.06 cm |
by Joseph Augustine
Stemming from the author’s Nobel Prize-winning research on the simplifying short-cuts of intuitive thinking (and their systematic errors) and on decision-making under uncertainty, both published in the journal Science, the book is a series of thought experiments that sets out to counter the prevailing rational-agent model of the world (Bernoulli’s errors): the assumption that humans have consistent preferences and know how to maximise them.
Instead, Prospect Theory shows that important choices are prone to the relativity of shifting reference points (context) and to the framing of inconsequential features of a situation, such that human preferences struggle to stay reality-bound. In particular, our decisions are susceptible to heuristic (short-cutting) biases and cognitive illusions – inconsistencies built into the design of our minds. One example is the ‘duration neglect’ of time (the less-is-more effect) when the Remembering Self recounts an experience as a story, as opposed to the Experiencing Self living it sequentially. Prospect Theory is based on the well-known dominance of threat/escape (negativity) over opportunity/approach (positivity) – a hard-wired tendency towards risk aversion that Kahneman’s grandmother would have acknowledged. Today this bias is explored by behavioural economics (psychophysics) and the science of neuroeconomics, which tries to understand what a person’s brain does while they make safe or risky decisions.
It would appear that there are two species of Homo sapiens: “Econs”, who can compare broad principles and processes across subjects, like spread bettors (broad framing) in trades of exchange; and “Humans”, who are swayed optimistically or pessimistically in their convictions and sense of fairness by attachments to material usage (narrow framing) and by a whole host of cognitive illusions – to name but a very few, the endowment effect, the sunk-cost fallacy and entitlement. Kahneman argues that these two different ways of relating to the world are heavily predicated on a fundamental split in the brain’s wet-ware architecture, delineated by two complementary but opposing perspectives:
System 1 is described as the Inside View: “fast”, HARE-like intuitive thought processes that jump to best-case scenarios and plausible conclusions based on recent events and current context (priming), using automatic perceptual-memory reactions or simple heuristic intuitions and substitutions. These are usually affect-based associations or prototypical intensity matches (comparing across categories, e.g. apples to steak). System 1 is susceptible to emotional framing: it prefers the sure choice over the gamble (risk averse) when the outcomes are good, but tends to accept the gamble (risk seeking) when all outcomes are negative. System 1 is ‘frame-bound’ to descriptions of reality rather than to reality itself, and can reverse preferences according to how information is presented, i.e. it is open to persuasion. Instead of truly expert intuitions, System 1 thrives on correlations of coherence (elegance), certainty (conviction) and causality (fact) rather than evidential truth. It has a tendency to believe, to confirm (the well-known confirmation bias), and to infer or induce the general from the particular (causal stereotyping). It does not compute base rates of probability, the influence of random luck, mere correlation (decorrelation error) or regression to the mean (causality error). System 1’s weakness is the brain’s propensity to succumb to over-confidence and hindsight on the strength of the resemblance, coherence and plausibility of the flimsy evidence of the moment – acronymically termed WYSIATI (What You See Is All There Is) – at the expense of System 2 probability. To succumb is human, as is humbly shown throughout the book, regardless of profession, institution, MENSA level or social standing. Maybe Gurdjieff was right when he observed that the majority of humans are sheep-like.
System 2, on the other hand, is the Outside View, which attempts to factor in Rumsfeld’s “unknown unknowns” by using realistic baselines of reference classes. It makes choices that are ‘reality-bound’ regardless of the presentation of facts or emotional framing, and can be regarded as “slow”, RAT-like controlled focus and energy-sapping intention: the kind used in effortful integral, statistical and complex reasoning, drawing on distributional information based on probability, uncertainty and doubt.
However, System 2 is also prone to error, especially in the service of System 1. Even though it has the capability, with application, not to confuse mere correlation with causation, and to deduce the particular from the general, it can be blocked when otherwise engaged, indolent or full of pride! As Kahneman puts it, “…the ease at which we stop thinking is rather troubling”, and what appears compelling is not always right – especially when the ego (the executive regulator of will power and concentration) is depleted of energy, or conversely when it is in a good mood of cognitive ease (not stress) deriving from situations of ‘mere exposure’ (repetition and familiarity). Experiments have repeatedly shown that cognitive aptitude and self-control are directly correlated, and biases of intuition are in constant need of regulation. That can be hard work, such as uncovering one’s outcome bias (part hindsight bias, part halo effect) based on the cognitive ease with which one lays claim to causal ‘narrative fallacies’ (Taleb) rather than “adjusting” to statistically random events born of luck!
So…
Do not expect a fun, “simples” read if you want clarity into how impulses become voluntary actions, and how impressions, feelings and inclinations so readily become beliefs, attitudes and intentions (when endorsed by System 2).
The solution…
Kahneman makes the special plea that our higher-minded intuitive statistician, System 2, take over the art of decision-making and wise judgement through “accurate choice diagnosis” to minimise the “errors in the design of the machinery of cognition”. We should learn to recognise situations in which significant mistakes are likely, making the time and putting in the analytical effort to avoid them when the stakes are high – usually when a situation is unfamiliar and there is no time to collect more information. ‘Thinking, Fast and Slow’ practically equips the reader with sufficient understanding to approach reasoning situations with a certain amount of logic, in order to balance and counter our intuitive illusions. For example, recognising the Texas sharpshooter fallacy (decorrelation error) or deconstructing a representativeness heuristic (stereotype) in one’s day-to-day affairs should be regarded as a reasonable approach to life by any yardstick, scientific or not. In another example, the System 2 objectivity of a risk policy is one remedy against the System 1 biases of optimists who think they are prudent, and of pessimists who become overly cautious and miss out on positive opportunities – however marginal a proposition may appear at first.
One chapter, “Taming Intuitive Predictions”, is particularly inspiring when it comes to correcting faulty thinking. It explores a reasonable procedure for countering systematic bias in significant decision-making situations where there is only modest validity (the validity illusion), especially between subjects. For example, when one has to decide between two candidates – be they job interviewees or start-up companies, as so often happens – the evidence is weak but the emotional impression left by System 1 is strong. Kahneman recommends that when WYSIATI applies, we be very wary of System 1’s neglect of base rates and insensitivity to the quality of information. The law of small numbers states that extreme outcomes are more likely in a small sample of information: the candidate who performs well at first, on the least evidence, tends not to keep it up over the longer term (once employed), owing to the vagaries of talent and luck – i.e. there is regression towards the mean. The candidate with the greater referential proof but less persuasive power on the day is the surer bet in the long term. Yet how often in life is the short-term effect chosen over the long-term bet? A cheeky, pertinent example here is the choice of Moyes over Mourinho as the recently installed Man Utd manager! A good choice or a bad choice?
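The law-of-small-numbers point can be illustrated with a quick simulation (a hypothetical sketch, not an example from the book): give many candidates identical underlying skill, score each on a tiny noisy sample, pick the apparent winner, and watch the winner regress towards the mean on a larger re-test.

```python
import random

random.seed(42)

def noisy_score(skill, noise=1.0):
    """One observed performance: true skill plus random luck."""
    return skill + random.gauss(0, noise)

# Many candidates with the SAME true skill; only luck separates them.
skill = 0.0
candidates = [[noisy_score(skill) for _ in range(3)] for _ in range(1000)]

# Pick the candidate whose small first sample (n=3) looks best...
first_round = [sum(c) / len(c) for c in candidates]
best = max(range(1000), key=lambda i: first_round[i])

# ...then observe a fresh, larger sample (n=100) of the same candidate.
retest = sum(noisy_score(skill) for _ in range(100)) / 100

print(f"winning first-round average: {first_round[best]:.2f}")  # well above 0
print(f"same candidate on re-test:   {retest:.2f}")             # near 0
```

The first-round “star” owes the extreme score entirely to luck, so the re-test falls back towards the true (shared) skill level – exactly the regression the chapter warns interviewers about.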
Many examples are shown of statistical algorithms in low-validity environments (the Meehl pattern) exposing failed real-world assumptions, revealing in the process the illusion of skill and of hunches in making long-term predictions. Many of these concern the clinical predictions of trained professionals, some serving the important selection criteria of interviewing practices. Flawed stories from the past that shape our views of the present world and our expectations of the future are very seductive, especially when combined with the halo effect and with making global evaluations rather than specific ratings.
For example, belief in the latest blockbusting management tool adopted by a new CEO has been statistically shown to offer at best a 10% improvement over random guesswork. In another example, a leadership group challenge used to select Israeli army leaders from cadets, intended to reveal their “true natures”, produced inaccurate forecasts after observing only one hour of behaviour in an artificial situation – put down to the illusion of validity, via the representativeness heuristic and non-regressive weak evidence. Slightly more worryingly, the same can be said for the illusory skill of persistently buying and selling stock over time. It has been shown that a narrative plays within the minds of traders: they think they are making sensible, educated guesses, when the exposed truth is that their long-term predictive success rests on luck – a fact deeply ingrained in the culture of the industry, with false credit being “taken” in bonuses!! Kahneman pulls no punches about the masters of the universe, and I am inclined to believe in the pedigree of his analysis!!
According to Kahneman, so-called experts – and he is slightly derisive in his use of the term – can produce unreliable judgements, especially long-term forecasts (e.g. the planning fallacy), in trying to justify their ability to assess masses of complexity as a host of mini-skills, owing to the inconsistency of extreme context (low- or zero-validity environments without regular practice) – a System 1-type error. Any final decision should be left to an independent person armed with the assessment of a simple, equally weighted formula, which is shown to be more accurate than when the interviewer – susceptible to personal impression and “taste” – also makes the final decision (see wine-vintage predictions). The best an expert can do is anticipate the near future using cues of recognition, and then know the limits of their validity, rather than make random hits based on subjectively compelling intuitions that are false. “Practice makes perfect”, as the well-known saying goes, yet the heuristics of judgement (coherence, cognitive ease and overconfidence) are invoked in low-validity environments by those who do not know what they are doing (the illusion of validity).
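The “simple equally weighted formula” can be sketched as follows (a minimal illustration with invented dimensions and ratings, not Kahneman’s actual instrument): rate each candidate separately on a few independent dimensions, standardise each dimension, and take the unweighted mean – no global intuitive impression allowed.

```python
from statistics import mean, pstdev

# Hypothetical interview ratings (1-10) on independent dimensions.
ratings = {
    "candidate_a": {"technical": 8, "reliability": 6, "communication": 7},
    "candidate_b": {"technical": 6, "reliability": 9, "communication": 8},
}

def equal_weight_score(ratings):
    """Standardise each dimension across candidates, then average
    with equal weights -- no intuitive 'global impression' allowed."""
    dims = next(iter(ratings.values())).keys()
    zscores = {name: [] for name in ratings}
    for d in dims:
        col = [r[d] for r in ratings.values()]
        mu, sd = mean(col), pstdev(col) or 1.0  # guard against zero spread
        for name, r in ratings.items():
            zscores[name].append((r[d] - mu) / sd)
    return {name: mean(z) for name, z in zscores.items()}

print(equal_weight_score(ratings))
```

Because every dimension is standardised before averaging, no single vivid quality (a confident interview manner, say) can dominate the score – which is the point of the formula over holistic judgement.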
Looking at other similar books on sale, “You Are Not So Smart” by David McRaney, for example, is a more accessible introduction to the same subject, but it clearly rests on the giant shoulders of Kahneman, who with his erstwhile colleagues would appear to have informed the subject area in every conceivable direction. It is hard to do justice to such a brilliant book without a rather long review. This is certainly one of the top ten books I have ever read for the benefits of rational perseverance and real-world insight, and it seems to be part of a trend of human-friendly Econ (System 2) research emanating from the USA at the moment. For example, the 2013 Nobel Prize-winning economics research of R. Shiller demonstrates that there are predictable regularities in asset markets over longer time periods, while E. Fama observes that there is no predictability in the short run.
In summary, “following our intuitions is more natural, and ‘somehow’ more pleasant than acting against them”, and we usually end up with the products of our extreme predictions – overly optimistic or pessimistic – since they are statistically non-regressive, taking no account of base rates (probability) or of regression towards the mean (the self-correcting fluctuation of scores over time). The slow, steady pace of the TORTOISE might be the right pace for our judgements, but we are prone not to give them the necessary time and perspective in a busy and obtuse world. The division of labour and conflict between the two Systems of the mind can lead either to cognitive illusion (prejudice/bias) or, if we are lucky, to wise judgement in a synthesis of intuition and cognition (called TORTOISE thinking by Dobransky in his book Quitting the Rat Race).
Close your eyes and imagine the future ONLY after a disciplined collection of objective information – unless, of course, you happen to have expert recognition of the kind described in Gladwell’s book on the subject, Blink; but then your eyes are still open and liable to be deceived. Kahneman’s way seems so much wiser, but harder nonetheless. The art and science of decision-making just got so much more interesting in the coming world of artificial intelligence!
by gadgetuser
Good book (big) but VERY SMALL print font size
– the Audible guide is just as good (ignore the references to the PDF; there aren’t many)
Interesting book.
Bought the Audible version to replace it – just as good.
by Edward B. Crutchley
This book starts by being intriguing and stimulating, and deserves to be read. The chapters are short, the writing is clear, the arguments are supported by examples of behavioural studies, and each chapter usefully ends with a few colloquial statements that sum up what has been said. However, halfway through, the book appears to lose itself – whether by fault of verbosity, repetition, loss of structure of argument, fewer references to work by others, or just sheer volume, I’m not sure. In any case, the reader finishes the book thinking of the cited example of the scratch at the end of a disc that dominates the memory of the whole. Where was the editor in all this?
The first part of the book is worth it, however. It turns out that we possess two types of thinking, default System 1 (fast) and System 2 (slow). Between them they determine how we react and make decisions. The book exposes many behavioural studies concerning the relationship between psychology and economics, and the competition between these two disciplines to explain people’s actions and decisions, including a brief mention of the new discipline of neuroeconomics.
According to the author, we are primitive in the art of prediction. We lack methodology. We suffer from illusions of validity. Why does one person decide to sell a stock while another decides to buy it? The evidence shows that more active stock sellers have worse results. Studies show that forecasts by doctors, investment advisors, sports analysts, politicians, economists and myriad other professionals do not compare favourably with machine prediction. I would add the element of ‘destructive cleverness’, whereby people apply expertise from other, non-relevant areas to a problem that exists within a different set of givens, contributing factors and noise factors that need to be properly appreciated. People perhaps think out of the box too often because of a lack of familiarity with a subject, and are invariably inconsistent in doing so. The art of decision-making needs to be demystified and transformed into a more scientific, factually based procedure. Might justice be better delivered by computer? Planning fallacies include over-optimism about costs and time, due to taking only the inside view and failing to refer to reference classes. Optimism is the lifeblood of entrepreneurs, yet only 35% of businesses in the USA survive five years. There is the notion of optimistic martyrs: firms that fail – market fodder, if you like – yet signal new markets to more qualified competitors. Potentially dangerous groupthink can be moderated by carrying out a pre-mortem: asking participants to write down a reason why the project might have failed (the discipline of FMEA, in industrial language).
We learn of hedonometry, which quantifies pleasure and pain. We have a less unfavourable memory of pain if it tails off at the end. Our memory of an experience may not be the same as the experience itself (the example of a scratch at the end of a record). There is a difference between the experiencing and remembering selves. In summing up someone’s life we are over-influenced by how it ends. One unpleasantness-index survey of women yielded a score for child-rearing double that of watching TV, which was at the same level as socialising. Being alone is more pleasurable than the presence of one’s immediate boss. Increasing focus is being placed on the measurement of well-being. Above a certain salary (quoted as $75,000 in high-cost areas), affluence does not improve the feeling of well-being – possibly, it is argued, because richer people no longer enjoy the small pleasures of life (bars of chocolate) in the same way.
The author focuses on System 1, for which the cardinal rule is WYSIATI – what you see is all there is. It is impulsive and intuitive; our minds appear over-influenced by bias and spin. It is rarely indifferent to emotional words (a ‘survival chance’ of 90% is preferred to a ‘mortality rate’ of 10%). It jumps to conclusions, and can even govern important decisions depending upon how a problem is presented. Intuition requires training in skills (a top-class chess player requires about 10,000 hours of practice). Reminding people of their mortality increases the appeal of authoritarian ideas. There is the Lady Macbeth Effect, whereby people who feel their soul is stained have the desire to clean themselves. The Florida Effect was illustrated when students who had been encouraged to think of words related to old age walked more slowly down a corridor. When we place a pen crosswise in our mouth, thereby forcing a smile, we tend to think more favourably of things. The Halo Effect occurs, for example, when people like a president’s politics because they like his voice and appearance. In the Availability Cascade, biases, popular reactions and exaggerated fears (often amplified by the media) come to influence policy. We are unduly worried about unlikely events, for example when a teenage daughter is late home at night. Terrorism speaks directly to System 1 even though, even in the worst cases, it is responsible for nowhere near the number of deaths caused by car accidents. In decisions involving numbers there is an Anchor Effect, whereby a suggested value influences our decision. Hindsight bias causes us to blame the intelligence services for 9/11. System 1, in effect, tries to make sense of the world – to stereotype, making the world predictable and explicable while overestimating its predictability. It even breeds overconfidence. It averages instead of adds.
People will assign less value to a larger set of dinner crockery that contains some broken items than to a smaller set of the same quality with no broken items. We tend to rationalise the past in order to predict the future. We suffer from theory-induced blindness. The Endowment Effect provokes an aversion to loss and determines economic behaviour: we may be prepared to sell, but only at a higher price than we would buy (the ratio is higher in the USA than in the UK). People who are poor see a small amount of income as a reduced loss rather than a gain. The brain has a rapid mechanism to detect threats, but no equivalent for good events; the negative trumps the positive. A single cockroach will destroy the appeal of a bowl of cherries, but a single cherry will have no effect on a bowl of cockroaches. A stable marital relationship has been found to require at least a 5:1 ratio of good interactions to bad ones. A friendship that took years to develop can be destroyed by one single action. Golfers try harder to avoid a bogey than to gain a birdie. Relative to rational probability, our decisions are skewed negatively near 100% and positively near 0%. People attach value to gains or losses rather than to wealth. System 1 makes us overweigh improbable outcomes unless we have prior experience. We tend to overestimate our chances and overweigh estimates. People tend to be risk-averse over potential gains and risk-seeking over potential losses. The Sunk Cost fallacy keeps people too long in poor jobs, unhappy marriages and unpromising research projects: we tend to reinvest in a project in which we are already implicated, even when the prospects have deteriorated, rather than divert our effort into a more promising venture, so as not to be part of a failure. We are reluctant to cut our losses. We tend to be risk-averse, and environmental and safety laws, for example, are set up to protect us – yet such laws might have prevented the development of the airplane, X-rays and open-heart surgery.
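The asymmetries described here – losses looming larger than gains, risk aversion over gains and risk seeking over losses – are captured by Kahneman and Tversky’s prospect-theory value function. The sketch below uses the parameter values commonly reported from their 1992 estimates (α ≈ β ≈ 0.88, loss-aversion coefficient λ ≈ 2.25); it deliberately ignores probability weighting to keep the illustration simple.

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value function: concave for gains,
    convex and steeper (loss aversion) for losses."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** beta)

# A 100-unit loss hurts roughly twice as much as a 100-unit gain pleases.
print(prospect_value(100), prospect_value(-100))

# A sure 50 is valued above a 50/50 shot at 100: risk aversion in gains
# (probability weighting omitted for simplicity).
sure = prospect_value(50)
gamble = 0.5 * prospect_value(100)
print(sure > gamble)  # True
```

The concavity over gains makes the sure thing beat the gamble, while the λ multiplier on losses is exactly the “negative trumps the positive” asymmetry the review describes.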
We prepare ourselves for the feeling of regret. We avoid being too hopeful about a potential football win. People more readily forgo a discount than pay a surcharge even when the end result is identical.
System 2 includes rational thinking and reasoning, but it is inherently lazy. To counteract the negative effects of System 1 determining a choice, we can require ourselves to produce more arguments to support it. Disbelieving something is hard work. Governments should make decisions based solely on hard facts and statistics, as opposed to popular reactions.
by Mrs. K. A. Wheatley
I found this largely fascinating. Understanding how our brain works to code information and make decisions is quite the trip, and Kahneman makes it easy for the layperson to understand. His aim, he says, is to create watercooler conversation rather than academic debate. I was enthralled by the first half of the book and found the second half heavier weather, but it was still well worth reading. It made me think about the things I tell myself, and it was also good to know that I am not the only person with a lazy brain – I’m just more open about my laziness.