Thinking

Thinking as a human activity has always fascinated me. In this blog post, I will write down some of the lessons I learned from two books I have read. Both books were great, by the way!
The Art of Thinking Clearly [Amazon]
  • The short lesson from this book is to be skeptical.
  • Survivorship bias - When we concentrate on only a few success stories and forget about those who didn't succeed.
  • Ad models illusion - When we see advertisements for cosmetic products featuring good-looking models, we think that the product made the models look beautiful. In fact, beautiful models were chosen in the first place to make the product look more attractive. Does Harvard really make you smarter, or does Harvard just recruit smart people to begin with?
  • Patterns illusion - The human brain is designed to see shapes and patterns in otherwise insignificant things. So when you see such a shape or pattern, first ask yourself if it's a coincidence, and if it is not, take help from numbers and statistics.
  • Social proof - In a concert, when one person starts to clap, all others do the same. Sometimes we do things purely out of herd mentality. Just because everyone is doing it doesn't mean it's right.
  • Sunk cost fallacy - If, in the middle of a movie, we realize it's bad but still sit through it just because we bought an expensive ticket, we are being irrational: the money is gone and it's not coming back.
  • Reciprocity - There are no free lunches. When random people offer free gifts, they might be playing with your guilt. Later, when they ask for a donation or some other favor, that guilt makes us more likely to give in.
  • Confirmation bias - The most predominant of all. Once we start believing in something, we tend to ignore exceptions to that belief and pay more attention to evidence that supports it. Identifying the rule behind the 2-4-6 sequence is a great example; see the sketch below.
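To make the 2-4-6 example concrete, here is a minimal Python sketch of Wason's task (the test triples below are my own illustration; the original experiment was run with human participants). The hidden rule is simply "any strictly increasing sequence", but subjects shown 2-4-6 tend to test only triples that confirm their guess of "numbers increasing by 2", so they never see an answer that could falsify it.

```python
def hidden_rule(a, b, c):
    """The experimenter's secret rule: any strictly increasing triple."""
    return a < b < c

# Confirming tests: all fit the guess "increasing by 2", all get "yes",
# and the wrong hypothesis survives every test.
confirming = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]

# Disconfirming tests: triples that should get "no" if "increasing by 2"
# were the real rule. The "yes" answers here falsify that hypothesis.
disconfirming = [(1, 2, 3), (5, 10, 20), (3, 2, 1)]

for triple in confirming + disconfirming:
    print(triple, "->", "yes" if hidden_rule(*triple) else "no")
```

Running it shows that only the tests designed to fail can reveal the real rule, which is exactly what confirmation bias steers us away from.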
Thinking, Fast and Slow [Amazon]
  • We have two systems of thinking (we don't really, but it helps to think that way). System one (S1) is fast and we generally have no control over it, for example, our first reaction when we see an image. System two (S2) is lazy and slow and requires focus, for example, when we are asked to multiply, say, 17 × 43.
  • Sometimes what S1 suggests might not be correct and we need to question it. It's also important to identify when to question S1; questioning it all the time won't do any good.
  • Müller-Lyer illusion - A line with normal arrowheads on both ends looks shorter than an identical line with inverted arrowheads on both ends.
  • In a normal working day, very few tasks that we engage in require as much effort as storing six digits for immediate recall.
  • Holding multiple ideas at the same time, or doing an S2 task under time pressure - both add to the effort required of S2.
  • The law of least effort says that if there are several ways to do a particular task, people will take the one that requires the least effort.
  • "Because S1 operates automatically and cannot be turned off at will, errors of intuitive thought are often difficult to prevent. Biases can not always be avoided, because S2 may have no clue to the error. Even when cues to likely errors are available, errors can be prevented only by the enhanced monitoring and effortfull activity of S2. As a way to live your life, however, continuous vigilance is not necessarily good, and it is certainly impractical. Constantly questioning our own thinking would be impossibly tedious, and S2 is much too slow and inefficient to serve as substitute for S1 in making routine decisions. The best we can do is compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when stakes are high."
  • If we have more talent or have developed the skill over a period of time, the same S2 task takes less effort.
  • Idea priming - The brain and body act in accordance with each other. If S1 thinks about an odd or ugly situation, the body gets ready for defensive action automatically. If we perform the acts of smiling or frowning, our thoughts get influenced accordingly. People put more into a donation box when a picture of a person's eyes was hanging above it, and less when the picture was of flowers. The eyes gave the impression that someone was watching, which made people feel more socially accountable and resulted in larger donations. All of this happened involuntarily, without anyone realizing it.
  • S1 is gullible and ignores exceptions that contradict the model of the world it has built over time. S2 is the doubter, but it is lazy, so when we are tired we are more likely to believe things.
  • S2 consumes glucose. Parole judges approved more requests right after lunch or a snack break and tended to reject them otherwise.
  • S1 is involuntary.
  • When we are in a happy mood, S2 has its guard down and can make errors of judgement. When we are frowning, S2 is focusing more.
  • S1 impacts action - A group of people who were shown words related to old age as part of an exercise walked more slowly through the corridor afterwards.
  • S1 is gullible and biased to believe. S2 is in charge of doubting and unbelieving. But S2 is sometimes busy and often lazy. So when S2 is otherwise engaged, we will believe almost anything. That’s why commercials result in late night purchases when we are tired and depleted.
  • Law of small numbers: We know that a large dataset is more precise, but S1 has a tendency to reach a conclusion without thinking about the size of the dataset. So we can make mistakes by jumping to conclusions based on a small dataset (see the simulation at the end of this post).
  • Anchoring Effect: This occurs when S1 anchors on a particular value for an unknown quantity before estimating that quantity. Car companies use this strategy by anchoring the negotiation on a higher sticker price.
  • Availability Effect: We tend to give more importance to events that are fresh in our memory. For example, there is a general spike in flood insurance purchases after a major storm hits elsewhere, fear of flying after news of a plane crash, or fear of terrorism because it is almost always in the news. We should instead be more analytical and give importance only to the facts.
  • Simplicity Effect: When asked a hard question, S1 tends to find a simpler, similar question, answer that simple question, and treat the answer as the answer to the hard question. S2, being lazy or ignorant, approves the answer. For example, to answer the question "How happy are you with your life these days?", you will probably answer the question "What is my mood right now?"
  • Halo Effect: For example, when you meet a person at a party and find him/her easy to talk to and personable, and later you are asked how generous that person is towards charities, you will associate generosity with being nice and conclude that the person is also generous.
  • Jumping to Conclusions: We tend to ignore the quantity and quality of evidence and go by the coherence of the story instead.
  • Confirmation Bias: Once we have started believing in an idea, we tend to favor evidence that feeds the belief and reject evidence that contradicts it.
  • Framing Effect: Different ways of presenting the same information evokes different emotions. 90% fat-free sounds better than 10% fat.
  • Base-rate neglect: When given some weak evidence, we tend to give it much more importance than the available base rates. We should discipline our intuition to anchor our judgment on a plausible base rate and then increase or decrease the probability based on the trustworthiness of the evidence. Probability matters; the worked cab-problem example at the end of this post shows the gap between intuition and the base-rate answer.
  • Hindsight bias - S1 likes to create causal stories. So we might assign a low probability to an event before it happens, but after it has happened, we start to think that it was always going to happen (i.e., it had a high probability). We very quickly forget that we didn't think it would happen before it happened.
  • Outcome bias - If a gambler took a crazy shot and won, we think that the gambler was smart and courageous. If he fails, we think that he was just taking too many risks. 
  • Luck - There is a lot of luck involved in many of the situations but since S1 tries to make up a causal story, we underestimate the impact of luck.
  • Expert views are inferior to algorithms and statistical data because experts try to be clever, think outside the box and consider complex combinations of features in making their predictions. Complex combinations may work in certain cases but more often than not they reduce the validity.
  • Unnoticed stimuli have substantial influence on our thoughts and actions. The brief pleasure of a cool breeze on a hot day may make you slightly more positive and optimistic about whatever you are evaluating at that time.
  • To maximize predictive accuracy, final decisions should be left to formulas or algorithms.
  • Simple statistical rules are superior to intuitive clinical judgments.
  • Intuition adds value but only AFTER a disciplined collection of objective information and disciplined scoring of separate traits. We should not simply trust intuitive judgment but we should not dismiss it either.
  • So, to create a formula to make a decision on some issue: pick at most six dimensions as prerequisites, keeping them as independent of each other as possible. Then create questions for judging those dimensions and rate each on a 1-5 scale. A small sketch of this appears at the end of this post.
  • Loss Aversion: Response to a loss is stronger than a response to a corresponding gain. 
  • Prospect theory - Choices between gambles and sure things are resolved differently, depending on whether the outcome is good or bad.
  • S1 is often able to produce quick answers to difficult questions by substitution, creating coherence when there is none.
  • When can we trust an experienced professional who claims to have an intuition? When they have had a lot of opportunity to practice their skill in a sufficiently regular environment and the feedback was quick (S1).
  • Planning fallacy - plans and forecasts that are unrealistically close to best-case scenarios. The reasons are generally the optimism of planners, the desire to get the plan approved, and so on. In such cases, it is important for the approver to take an outside view of the plan.
  • Optimistic bias - Most of us have a favorable view of the world and tend to think that goals are much more achievable than they usually are. In a way it's related to survivorship bias, because we base our optimism on the successful results of others but ignore the failures and the failure rate. Then there is competition neglect: we are generally unaware of what the competition is doing. Lastly, overconfidence plays a big role: when we estimate a quantity, we rely on information that comes to mind and construct a coherent story in which the estimate makes sense.
  • People overestimate the probabilities of rare events and overweight unlikely events in their decisions. For example, tsunamis are very rare even in Japan, but we tend to overestimate their probability.
  • Narrow framing vs broad framing - Amateur investors get worried by a small dataset of a few trades or by daily price fluctuations, while experienced traders look at the broad dataset and are not affected by a few losses. Narrow framing combined with loss aversion is a costly curse.
  • Keeping score, sunk cost fallacy, and the disposition effect - A rational agent would take a comprehensive view of the portfolio and sell the stock least likely to do well in the future, without considering the price at which it was purchased. But we generally keep score and have a massive preference for selling winning stocks and keeping the losing ones.
  • When you see cases in isolation, you are likely to be guided by an emotional reaction of S1.
  • Frame-bound vs reality-bound - Sometimes, depending on how a problem is framed, S1 takes different decisions based on emotions rather than on reality. For example, if a gas station advertises a "discount" for people paying by cash, we are generally OK with giving up that discount and paying by card. But had it been written as a "credit card surcharge", we would have been more reluctant to use our credit card, even though the two mean exactly the same thing.
  • Duration neglect - The way we remember things gives more weight to the peak of the pain and less weight to its duration. We want pain to be brief and pleasure to last. This makes our memories of certain experiences deviate from reality and causes us to make wrong decisions about related situations in the future.
  • If your car breaks down on the way to work in the morning and you are in a foul mood because of it, your answers to any survey that day, even about something unrelated like job satisfaction, will be colored by the morning incident and will not really reflect reality.
  • It's important to learn probability, so I think I will write a blog post refreshing the probability lessons I took in school.
  • The way to block errors that originate in S1 is simple in principle: recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from S2. It will take a lot of practice to actually be able to do this consistently.
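As promised above, here is a small simulation of the law of small numbers (the sample sizes and the 70% "extreme" threshold are my own arbitrary choices). We flip a fair coin in batches of different sizes and count how often a batch looks extreme, the kind of result S1 would happily turn into a causal story.

```python
import random

random.seed(42)

def extreme_rate(sample_size, trials=10_000):
    """Fraction of fair-coin samples whose heads rate is >= 70%."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        if heads / sample_size >= 0.7:
            extreme += 1
    return extreme / trials

for n in (5, 20, 100):
    print(f"sample size {n:>3}: extreme result in {extreme_rate(n):.1%} of samples")
```

Roughly 19% of the 5-flip samples come out extreme, versus about 6% of the 20-flip samples and essentially none of the 100-flip samples. Small samples produce extreme results routinely, with no cause behind them.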
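The base-rate neglect point is easiest to feel with the cab problem from the book: 85% of a city's cabs are Green and 15% are Blue; a witness says the cab in a hit-and-run was Blue and is right 80% of the time. Intuition anchors on the witness's 80%; Bayes' rule anchors on the base rate first. A quick calculation:

```python
p_blue = 0.15                # base rate of Blue cabs
p_green = 0.85               # base rate of Green cabs
p_says_blue_if_blue = 0.80   # witness is right 80% of the time
p_says_blue_if_green = 0.20  # witness is wrong 20% of the time

# Bayes' rule: P(Blue | witness says Blue)
numerator = p_blue * p_says_blue_if_blue
posterior = numerator / (numerator + p_green * p_says_blue_if_green)
print(f"P(cab was Blue | witness says Blue) = {posterior:.2f}")  # 0.41
```

The posterior is only about 0.41, so even with a fairly reliable witness, the cab is still more likely to have been Green, which is what the base rate was hinting at all along.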
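Finally, a minimal sketch of the decision formula described above (the dimensions and the candidate's ratings are hypothetical, just to show the mechanics): pick at most six independent dimensions, rate each on a 1-5 scale based on factual questions, and let the total score drive the decision.

```python
# Hypothetical dimensions for, say, screening a job candidate.
DIMENSIONS = ("technical skill", "reliability", "communication",
              "sociability", "calm under pressure", "motivation")

def total_score(ratings):
    """Sum of 1-5 ratings, one per dimension."""
    for dim, score in ratings.items():
        assert dim in DIMENSIONS, f"unknown dimension: {dim}"
        assert 1 <= score <= 5, f"rating out of range: {dim}={score}"
    return sum(ratings.values())

candidate = {"technical skill": 4, "reliability": 5, "communication": 3,
             "sociability": 2, "calm under pressure": 4, "motivation": 5}
print(f"total: {total_score(candidate)} / {5 * len(DIMENSIONS)}")  # total: 23 / 30
```

As the notes above say, intuition can still be consulted at the end, but only after the disciplined scoring is done, not instead of it.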