This is one of the best books I've ever read. It deals with cognitive errors in decision making, which are actually a driving force in capitalism. It is interesting, improves your decision making, and could change your view of others, society, and the world. It is very useful for everyone to understand how people feel, think, and behave in the real world, not in a theoretical, ideal world.
System 1:
Intuitive decisions and selections are derived from (1) the skills, recognition, and memories of experts, backed by long-term training and practice, and (2) heuristics (shortcuts like "Do I like it or not?" or "Something is wrong.").
We try (1) and switch to (2) when we cannot come up with an idea by using (1).
[1] System 1: Automated, proactive vs System 2: Controlled
[2] System 1 is not designed to deal with statistics.
[3] Overconfidence (hindsight) and underestimation of uncertainty
[4] Prospect theory, framing (decision making is distracted by non-essential matters)
[5] Two types of identity: The present and the past (memory) that have conflicting interests.
[Conclusion]
Two types of identity
Concept of an individual in traditional economics and behavioral finance
Automated System 1 and System 2, which requires effort
Merit of small talk / chit chat
How to improve an organization's decision and selection
[Two papers]
1. Decision making under uncertainty
2. Prospect theory and framing
[1] Two Systems
System 1: Fast and intuitive
System 2: Slow and logical
Avoidance of cognitive demand (effort) - a rule of minimum effort (cost)
3.
System 2 consists of two parts: (1) "algorithmic", intelligence for complicated calculations and (2) "rational".
4.
Priming effect:
A prime (e.g., a preceding stimulus) evokes related words, actions, etc. For instance, "Florida", "bald head", and "wrinkle" remind people of the elderly and make them walk slowly (the ideomotor effect).
On the other hand, when you walk slowly, you quickly recognize "old", "solitude", etc.
That is, priming effect is bidirectional.
Also, you smile when you're happy. You're happy when you smile. (Whatever you feel, always be nice and kind to others. You'll be in a kindly mood.)
5. Cognitive Ease
People feel cognitive ease about repeated experiences, clear exhibition, and ideas with Prime, especially when they are in a good mood.
With the cognitive ease, people feel intimate, believable, good, and easy.
Mere exposure effect: The more you get used to it, the more you like it.
When you feel good, System 2 loosens its control and does not work as well.
6. Greatness and limitation of System 1
Humans can distinguish cause and effect relationships in physical and psychological world.
System 1 does not have the ability to deal with statistical reasoning, while System 2 needs professional education for it; not many people have that professional statistical education.
7. System 1 jumps at a conclusion.
System 2 deals with uncertainty and doubt while thinking of conflicting interpretations simultaneously. That needs intellectual efforts.
System 1 is easily fooled and is biased toward believing things first, especially when they strengthen your existing beliefs (confirmation bias).
System 2 doubts and can decide not to believe something, but System 2 can be too busy and is often lazy, especially when one is tired.
Halo Effect:
A first impression colors all subsequent perception; what comes first matters most.
What you see is all there is. (WYSIATI)
The most important thing about a story is consistency, not completeness.
Overconfidence; framing (how facts are presented affects how people think); ignorance of objective, statistical facts (a well-organized, quiet, picky person is guessed to be a librarian rather than a farmer, even though there are far more farmers than librarians).
8. How to make a judgement
Intensity matching (by System 1): leveling of different dimensions (concepts) to make them comparable
Mental shotgun (by System 1): System 1 cannot focus only on the things indicated by System 2.
9. Answering an easier question (heuristic question) - substitution
This is due to (1) mental shotgun and (2) leveling of different dimensions (concepts)
Affect heuristics:
System 2 has an ability to slow down a process, do a logical analysis, and turn down a proposal by System 1. However, System 2 supports System 1's "feeling"; System 2 tries to find facts that do not conflict with the feeling.
Summary: System 1
If something is easy for System 1 to recognize, System 1 believes it's true, feels good, and lets one's guard down.
Confirmation bias
Halo effect
WYSIATI
Average, not sum
Leveling of different dimensions (concepts)
Mental shotgun
Heuristic question
Prospect theory (change, rather than the status)
Overestimate a low probability
Diminishing sensitivity
Avoidance of loss (loss, rather than profit)
Framing and treating problems individually
[2] Heuristics and Bias
10. Law of "small numbers"
Extremely high or low rates are more likely to be found in small samples ("an artifact of the small sample size").
Law of large numbers:
The larger the sample size, the more accurate the estimate.
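The small-sample effect can be sketched in a few lines of Python (an illustration, not from the book; the 20%/80% cutoffs for "extreme" are arbitrary):

```python
import random

# Sketch: draw many samples of a fair coin and record how often the
# observed proportion of heads is "extreme" (<= 20% or >= 80%).
# Small samples produce extreme proportions far more often.
random.seed(42)

def extreme_rate(sample_size, trials=10_000):
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        p = heads / sample_size
        if p <= 0.2 or p >= 0.8:
            extreme += 1
    return extreme / trials

print(extreme_rate(5))    # small sample: extreme outcomes are common
print(extreme_rate(100))  # large sample: extreme outcomes are very rare
```

With five flips, a run of 80%+ heads is unremarkable; with a hundred flips it is practically impossible, which is exactly the "artifact of the small sample size" above.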
11. Anchoring Effect
If you estimate an unknown number after looking at a certain large (small) number, your estimate could be biased upward (downward).
Anchoring effects by System 2: cautious adjustment
Anchoring effects by System 1: automated Priming
Remember that any figures shown could cause an anchoring effect. All you can do is ignore it and derive your number backed by objective facts.
12. Availability Heuristics
When you recall something easily, then you tend to overestimate the probability of it.
Both System 1 and 2 have something to do with it.
13. Availability, Emotion, and Risk
The world in one's head is not a copy of the real world.
Affect heuristic: like or dislike; a strong or weak emotional response.
We tend to extremely overestimate or underestimate risks; we either ignore a risk completely or treat it as all-important. (neglect of probability)
14. Representativeness (similarity with a stereotype) and Base Rate (probability of an original sample)
It's difficult to estimate a probability, but it's easier to find similarity.
(1) To estimate a probability, you should use an appropriate Base Rate as an anchor.
(2) Always have doubts about a result.
Bayesian estimation:
Sample question
A man is meticulous about everything and works in a well-organized manner. Estimate the probability that his major is (was) Computer Science.
Posterior odds = Likelihood ratio * Prior odds
# of Computer Science majors / # of all students = 3% (base rate)
Prior odds = 3% / (1 - 3%) ≈ 0.031
Likelihood ratio = 4 : 1 (when five people are meticulous and work in a well-organized manner, 4 out of 5 are Computer Science majors)
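The update can be computed directly; a minimal sketch using the example's own numbers (the 3% base rate and the 4:1 likelihood ratio given above):

```python
# Bayesian update for the Computer Science example above.
prior_odds = 0.03 / (1 - 0.03)        # ~0.031 : 1, from the 3% base rate
likelihood_ratio = 4.0                # evidence: meticulous, well organized
posterior_odds = likelihood_ratio * prior_odds
posterior_prob = posterior_odds / (1 + posterior_odds)
print(round(posterior_prob, 3))       # ~0.11: the evidence helps, but the
                                      # low base rate keeps it unlikely
```

Even a strongly matching description only lifts the probability to about 11%, which is why ignoring the base rate is such a large error.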
15. Plausibility, likelihood
Conjunction Fallacy:
Linda is a single, 31-year-old woman. She used to be interested in issues of discrimination and social justice, and she got involved in anti-nuclear activities.
Which probability is higher?
(A) Linda is a bank employee.
(B) Linda is a bank employee, and an activist of feminism.
Many people answer (B), but the correct answer is (A): a conjunction can never be more probable than either of its parts, so P(B) ≤ P(A).
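The conjunction rule behind the answer can be checked with any numbers; the probabilities below are purely illustrative, not from the book:

```python
# Made-up probabilities for the Linda problem: whatever values you choose,
# P(A and B) can never exceed P(A).
p_bank_employee = 0.05                  # P(A), assumed for illustration
p_feminist_given_bank = 0.60            # P(feminist | bank employee), assumed
p_both = p_bank_employee * p_feminist_given_bank   # P(A and B)

assert p_both <= p_bank_employee        # the conjunction rule always holds
print(p_bank_employee, p_both)
```

The vivid description makes (B) feel more representative of Linda, and System 1 substitutes "which story fits better?" for "which is more probable?".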
16. Cause and Statistics
Our brains emphasize causal base rates over statistical base rates.
17. Regression to the Mean
Success = Talent + Luck
Luck can be considered a random draw; when it is very bad, it is expected to be better next time.
People look for cause and effect (Talent) while ignoring Luck (regression to the mean).
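A small simulation (an illustration, not from the book) makes the point: with Success = Talent + Luck, the worst performers in one round improve in the next even though talent is unchanged:

```python
import random

# Sketch: Success = Talent + Luck. Pick the worst performers in round 1
# and observe that their round-2 average is better, with no change in
# talent -- pure regression to the mean.
random.seed(0)
talent = [random.gauss(0, 1) for _ in range(10_000)]
round1 = [t + random.gauss(0, 1) for t in talent]   # talent + luck
round2 = [t + random.gauss(0, 1) for t in talent]   # same talent, new luck

worst = sorted(range(len(round1)), key=lambda i: round1[i])[:1000]
avg1 = sum(round1[i] for i in worst) / len(worst)
avg2 = sum(round2[i] for i in worst) / len(worst)
print(avg1, avg2)  # round-2 average is markedly closer to the mean
```

Nothing caused the improvement; the extreme bad luck simply did not repeat. A coach who shouted at the worst performers would wrongly credit the shouting.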
18. Adjustment of Intuitive Forecast
[3] Overconfidence
19. Hindsight and Halo Effect
Narrative Fallacy: We always try to explain what happened as a story. A story is simple and concrete (not abstract), explained by talent, foolishness, or mindset (not randomness). We focus on what happened, not on what did not happen. We come up with cause and effect for a recent significant event, which is the theme of hindsight. A good story explains people's behavior, will, tendencies, personality, etc. in a consistent and simple manner; it makes it easier to match cause and effect.
Our brains are poor at dealing with luck and regression to the mean; our brains want reasons, cause and effect, backed by a story, i.e., consistent explanation.
20. Illusion of Validity
The less information we have, the easier it is to make a coherent story; the quality and quantity of the information are hardly considered.
A critic might be a good storyteller but not a good forecaster. Critics do not admit their failures to forecast, and even when they have to, they have tons of excuses.
21. Intuition and Algorithm: judgment by experts is inferior to statistics
One's intuition could add value, after evaluating objective facts individually with a strict rule.
22. Whether or not an expert's intuition is trustworthy
Recognition-primed decision model: System 1 comes up with a tentative plan, and then System 2 does a simulation to check whether or not the plan works.
Learning paired with strong emotion can happen in a short time. Experts' knowledge and skills are difficult to learn and take a long time, because they are a combination of many smaller skills (e.g., chess).
People easily recognize a consistent and plausible story; that does not necessarily mean it's true.
It is likely that intuition can be acquired as a skill when:
(1) the environment is regular and moves in predictable patterns
(2) there are opportunities to learn those patterns through long practice, with a clear and quick feedback loop
23. External-info-based approach: why a forecast is not correct
External information: distribution of results (probability) for similar projects
Many people tend to downplay or ignore past distribution info. That is a main cause of forecasting errors. It is very important to consider all available distribution info.
24. Optimism
Illusion of control
Optimism can be harmful when making a decision; it is helpful when executing one.
Overconfidence is caused by the intrinsic nature of System 1; it is not completely controllable. Whatever the quality and quantity of the information backing your decision, if your own story is consistent, subjective confidence forms.
Premortem: "Imagine it's one year later; we executed the decision we made, and it failed. Summarize how it failed in 5-10 minutes."
[4] Selection
25. Utility is a function of change from a reference point (Prospect Theory)
For a possible profit, people don't take risk (risk-averse); for a possible loss, people do take risk (risk-seeking).
26. Prospect Theory (reference point, i.e., initial status, and loss aversion)
Q1 Which do you choose?
You can receive 900 dollars for sure.
Or
You can receive 1,000 dollars with a probability of 90%.
(Answer: former)
Q2 Which do you choose?
You incur a loss of 900 dollars for sure.
Or
You incur a loss of 1,000 dollars with a probability of 90%.
(Answer: latter)
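A quick check (not in the book's text) shows that expected value cannot explain the flip in Q1 and Q2: both pairs have identical expected values, yet the typical choices differ:

```python
# Expected values for Q1 and Q2 above. The sure option and the gamble
# are worth the same on average in both questions, yet most people take
# the sure gain in Q1 and the gamble in Q2.
ev_sure_gain = 900
ev_risky_gain = 0.90 * 1000    # 900.0: same EV, yet most take the sure 900

ev_sure_loss = -900
ev_risky_loss = 0.90 * -1000   # -900.0: same EV, yet most gamble
print(ev_risky_gain, ev_risky_loss)
```

Since the expected values are identical, the reversal must come from how gains and losses are valued, which is what prospect theory describes.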
Utility depends on the change in wealth, not on the final amount of wealth. Preference for profit, aversion to loss.
(1) evaluation is made relative to a reference point (or AL, adaptation level)
(2) diminishing sensitivity
(3) loss aversion
Under prospect theory, the reference point (the current status) has zero value.
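The three properties above can be sketched as a value function. The parameter values below are the standard estimates from Tversky and Kahneman's 1992 paper, not numbers given in this summary:

```python
# Sketch of the prospect-theory value function.
# alpha = beta = 0.88 and lambda = 2.25 are the Tversky-Kahneman (1992)
# estimates, used here only for illustration.
ALPHA, BETA, LAMBDA = 0.88, 0.88, 2.25

def value(x):
    """Value of a gain/loss x relative to the reference point (x = 0)."""
    if x >= 0:
        return x ** ALPHA              # diminishing sensitivity to gains
    return -LAMBDA * ((-x) ** BETA)    # losses loom larger than gains

print(value(0))       # the reference point has zero value
print(value(100))     # a gain of 100
print(value(-100))    # a same-sized loss feels much worse
```

The curvature gives diminishing sensitivity (the second 100 dollars feels smaller than the first), and lambda > 1 gives loss aversion.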
27. Endowment Effect
People's buying price < selling price
e.g., When you have a bottle of great wine, it's painful to sell it; when you do not have the bottle, it's a joy to buy it. Because of loss aversion, as described by prospect theory, the selling and buying prices are not the same.
There are two types of goods: (1) goods to trade/exchange and (2) goods to consume. The bottle of wine above is (2). For a consumer, money is (1); for a wine merchant, bottles of wine are (1).
28. Avoid loss, rather than pursuing profits
People pay more attention to threatening words (war, crime) than to happy words (peace, love). Negative things overwhelm positive ones.
To build a good relationship with people, avoiding bad things is more important than pursuing good things.
In a negotiation, if their compromise is a pain for them and your profit, they feel that pain more than your profit. That could cause an incentive to keep the current status.
When people punish others altruistically to maintain social order and fairness, they feel better.
On the other hand, our brains are not designed to feel better when we behave generously toward others.
Again, it is confirmed that loss and profit are not symmetric.
29. The fourfold pattern - when we pursue risks
Possibility Effect: probability changes from zero to, e.g., 5%; a small chance is overweighted (e.g., insurance and lotteries, where (expected value) < (money invested))
Certainty Effect: probability changes from 95% to 100%
People do not take a stated probability at face value; when making a decision, they apply a "decision weight" that is not consistent with the stated probability. They overweight improbable outcomes (Possibility Effect) and underweight nearly certain ones (Certainty Effect).
This is not explained by expected utility theory.
The Fourfold Pattern
                                       Gains               Losses
High probability (Certainty Effect)    (*1) Risk-averse    (*4) Risk-seeking
Low probability (Possibility Effect)   (*2) Risk-seeking   (*3) Risk-averse
(*2) e.g., buying a lottery ticket to have a dream
(*3) e.g., buying insurance to feel safe and secure
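The over- and underweighting that drive the fourfold pattern can be sketched with a probability-weighting function; the functional form and gamma = 0.61 are Tversky and Kahneman's 1992 estimates for gains, not figures from this summary:

```python
# Sketch of a probability-weighting function (Tversky-Kahneman, 1992).
# It maps a stated probability p to the "decision weight" people act on.
GAMMA = 0.61   # parameter estimated in the 1992 paper, for illustration

def decision_weight(p):
    num = p ** GAMMA
    return num / (num + (1 - p) ** GAMMA) ** (1 / GAMMA)

print(decision_weight(0.01))   # > 0.01: small chances are overweighted
print(decision_weight(0.95))   # < 0.95: near-certainties are underweighted
```

Overweighting at the low end makes lotteries and insurance attractive (Possibility Effect); underweighting at the high end makes people pay to lock in near-certain gains (Certainty Effect).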
30. Rare cases - denominator neglect
A probability and the decision weight placed on it are different things.
If you can form a vivid image of an outcome, you overweight its impact while neglecting its probability (e.g., "1 out of 10,000 people dies" feels worse than "0.01% of people die", although they are the same).
31. Risk Policy - comprehensively treating decision making with risk
Framing: How a question is asked, or how questions are combined, affects the answers. Broader framing (e.g., a set of combined questions) clearly brings better answers, or at least no worse results. Also, risk aversion for gains and risk seeking for losses mean you pay a premium for them.
Econ, namely, a rational person in economics, uses a broader framing while humans prefer a narrower framing to save energy.
32. Mental Accounting
Disposition Effect: Selling a stock with unrealized gain is preferred to selling a stock with unrealized loss.
When you do not choose a default selection and it goes badly, you regret it more.
33. Preference Reversals
When you evaluate things in a parallel manner, it is likely you can make a rational decision. Think in a large framework.
34. Frame and Objective Facts
Framing (how a question is asked) affects answers.
35. Two Identities - one experiences, the other remembers
Experienced Utility (EU): pleasure and pain
Decision Utility (DU): preference, desirability
For Econ, EU = DU.
For a human, EU is not necessarily equal to DU.
Peak-end-rule:
When people evaluate things based on their memories, what matters is the average of the pain at its peak and the pain at the end.
Ignorance of duration:
The duration of the test (pain) does not affect the evaluation of the total amount of pain.
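A toy example (illustrative numbers, not from the book) of the peak-end rule and ignorance of duration: adding a milder ending makes an episode longer yet remembered as less painful:

```python
# Two pain episodes scored minute-by-minute (0 = no pain, 10 = worst).
# Episode B contains all of A's pain plus a milder tail, yet the
# peak-end rule predicts B is remembered as LESS painful.
episode_a = [2, 5, 8]            # ends at its peak
episode_b = [2, 5, 8, 4, 2]      # same pain plus a gentler ending

def remembered_pain(episode):
    """Peak-end rule: average of the worst moment and the final moment."""
    return (max(episode) + episode[-1]) / 2

print(remembered_pain(episode_a))  # (8 + 8) / 2 = 8.0
print(remembered_pain(episode_b))  # (8 + 2) / 2 = 5.0
```

The Remembering Self prefers B even though the Experiencing Self suffered strictly more in B, which is exactly the conflict between the two identities above.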
Experiencing Self vs Remembering Self
Experiencing Self: "Does this hurt?"
Remembering Self: "What do you think about it? Did it hurt?"
When we make a decision based on our experience, i.e., memories, it is dependent on Remembering Self, not Experiencing Self.
36. A life is a story. "All's Well That Ends Well"
Memories, rather than the duration.
37. Happiness of Experiencing Self
38. Thinking about a life.
Focusing Illusion: Whatever you are thinking about seems to be the most important thing in your life at that moment.
Conclusion
There are two systems.
System 1: fast and intuitive
System 2: slow and contemplative, monitoring System 1, controlling with limited resources
There are two species.
Econ: imaginary, theoretical
Human: act in the real world
There are two identities.
Experiencing Self: experience in the real world
Remembering Self: memorize and then select the memory
Two Identities.
Ignorance of the duration.
Peak-end rule. (People remember the peak and the end of experience.)
Experiencing Self ignores time, the most precious resource.
For economists, being rational means logical consistency; whether it is decent or normal does not matter. All preferences should be consistent from their perspective. In that sense, Econs are rational while Humans are not necessarily so. Econs are not distracted by priming, narrow framing, reliance on internal information, or preference reversals, and do not assume that what they see is all there is; Humans cannot always avoid these.
Reasonable economic models cannot describe humans very well. Daniel Kahneman does not say that humans' choices are irrational. Humans have limited thinking ability and time; we should not expect humans to stick to very strict logical consistency.
Humans are not irrational, but they need help (policy, regulation, systems, etc.) to make a right decision.
In behavioral economics, freedom involves a cost. The cost is charged not only to the individual who made a bad decision, but also to the society that has to help that individual. In this sense, it is very troubling for behavioral economists to decide whether or not we should help the individual.
On the other hand, Chicago School does not face the difficulty. For them, rational economic entity does not make a bad decision; freedom is free for them.
The Financial Journal is a blog for all financial industry professionals. This blog has been, and always will be, an interactive, intellectually stimulating, and open platform for all readers.