
Thinking, Fast and Slow Book Summary, Review, Notes

Daniel Kahneman’s “Thinking, Fast and Slow” was written to help readers identify common cognitive errors. The author’s aim is to help readers understand and recognize these mental processes in themselves, and to see how their failings might be corrected. The book builds on earlier research on prospect theory, collected and published in the 2000 academic volume “Choices, Values, and Frames.”

“Thinking, Fast and Slow” updates that earlier work with new research findings, numerous examples from everyday life, and references to other recent work in the field.

Book Title — Thinking, Fast and Slow
Author — Daniel Kahneman
Date of Reading — October 2022
Rating — 9/10

What Is Being Said In Detail

Part 1. Two Systems

In this first section, the author defines System 1 and System 2 thinking. System 1’s mental processes run automatically, whereas System 2’s require conscious effort. The former can produce complex patterns of ideas, but only the latter can construct thoughts as an orderly series of steps. Each system has its own strengths, weaknesses, and limitations. System 1 covers abilities people are born with, as well as routines practiced for so long that they no longer require thought. Most of these skills are involuntary, while others, such as chewing, can be controlled but usually run automatically. System 2 handles activities that demand full attention, and such processes are typically disrupted when that attention is pulled elsewhere. In this kind of thinking, the emphasis is on the end result. Genuine multitasking is practically impossible, since the brain is wired to prioritize one demanding task over the others.

The two systems interact in numerous ways. When System 1 runs into difficulty, it calls on System 2 for more detailed processing. System 2 also regulates self-control. The two divide the work between them so as to accomplish the most with the least effort. Reacting quickly to threats or seizing significant opportunities improves the odds of survival, so System 1 typically takes the lead in a crisis; survival instincts take precedence. Here Kahneman introduces the “law of least effort,” which holds that when there are several ways to accomplish the same task, people gravitate toward the least demanding one. The brain prefers efficient solutions, so it tends to choose the simplest. Both self-control and deliberate thought are forms of mental work. Simply put, when System 2 is busy with something else, System 1 has more influence over behavior. A person’s self-control can also be weakened by factors other than cognitive load, such as lack of sleep or alcohol, because exercising self-control requires conscious effort; in other words, System 2 is responsible for it.

System 2 is responsible for enforcing norms and monitoring the suggestions made by System 1, permitting, suppressing, or modifying them as needed. Most people are overconfident and trust their gut instincts rather than doing any research. When people believe a conclusion is true, they are likely to believe the arguments that support it as well. In the end, intelligence is not just the ability to reason, but also the ability to retrieve relevant material from memory and to deploy attention when it is needed.

Here, Kahneman explains a psychological phenomenon he calls associative activation: the process by which one idea triggers a cascade of related ideas in the mind. These linked ideas reinforce one another. When two words appear together that seem out of place side by side, System 1 will try to work out why they are there; even the body joins in the effort to make sense of the situation. Research shows that hearing one word makes related words easier to call to mind, a phenomenon known as the priming effect. Primed words can, in turn, prime further thoughts, and not only words but also feelings and behaviors can be triggered. The ideomotor effect describes this kind of priming, in which an idea influences an action. A core job of System 1 is to maintain a sense of what is normal in a person’s world: it makes connections and assigns meaning to every experience.

The primary goal of System 1 is, at its core, to make assumptions. That may sound harmless, but it is dangerous in unfamiliar situations or when the stakes are high and there is no time to gather the information a sound choice would require. In such moments it is easy to make mistakes based on what you think you know, because System 1 essentially makes a call based on learned patterns and precedents. Kahneman then elaborates on the halo effect: the tendency to form a strong overall impression of a person from very limited evidence, even when other evidence points the other way. The order in which we encounter a person’s characteristics is often a matter of chance, yet it matters a great deal, because first impressions color how everything that follows is interpreted. Evaluations in System 1 are carried out automatically. It compares quantities only by intensity matching, which is not always accurate. And when posed a question, System 1 produces more responses than are strictly necessary, a tendency Kahneman calls the “mental shotgun”: like a shotgun, it scatters rather than focusing narrowly on a single target.

Daniel Kahneman Quote: “Declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true.”

And finally, we get to the meaning of “substitution.” When System 1 struggles to find an answer to a difficult question, it often selects a simpler, related question and answers that one instead. System 2, for its part, is lazy by nature: it looks for information that fits what a person already thinks and believes.

Part 2. Heuristics and Biases

System 2’s operations depend, in many cases, on the data that System 1 generates for it through its associative processes. This makes System 1 prone to mistakes with “merely statistical” information, where the outcome is governed by the rules of statistics rather than by the kind of cause-and-effect relationships System 1 is built to detect. Because of how System 1 works, people hold mistaken beliefs and understand little about sampling effects; they tend to exaggerate how representative the cases they happen to see are. Research also shows that people place far too much faith in what can be learned from a handful of observations, even though everyone knows a large sample is more reliable than a small one. People make large mistakes when judging how variable truly random events are, because they prefer to think in terms of causes; they fall into these traps by looking for patterns and assuming the world makes sense. Most of the time, people weigh what a message says more heavily than how reliable its source is, and most causal explanations of random events are simply wrong.

Later in this part of the book, Kahneman explains the anchoring effect: when people consider a particular value for an unknown quantity before estimating it, their estimates stay close to that value, which is why it is called an anchor. Buying a house is a good example: the asking price typically serves as the starting point for one’s own estimate of its worth. Adjustment happens when there are clear reasons to move away from the anchor, and that adjustment takes effort; when the mind is tired, people adjust less and stay closer to the anchor. Anchoring effects can also arise through priming, and they occur whenever adjustment is insufficient. In short, anchoring leaves people more suggestible than they would like to be, simply because of how their minds work.

Availability cascades happen when the media over-cover a simple or minor event, which biases the way people think about it; policies are then made on the basis of those biases. When dealing with small risks, people either ignore them entirely or give them far too much weight.

Representativeness takes over when all attention goes to a stereotype and the base rate is ignored, along with any doubts about where the stereotype or label came from or how accurate it is.

Kahneman and Tversky devised the “Linda Problem” to show that rules of thumb, or heuristics, are poor decision tools and clash with logic. Participants in the study were introduced to a fictional person named Linda, described as someone deeply concerned about discrimination and social justice, and were asked to rank a list of statements about her by how likely they were. Two of the options were “Linda is a bank teller” and “Linda is a bank teller and is active in the feminist movement.” Participants rated the second option as more probable. However, the set of bank tellers is larger and already includes the bank tellers who are feminists, so “bank teller” on its own must be at least as likely. Representativeness, or the stereotype, won out over logic, and this was a failure of System 2. Breaking a rule of logic like this is called a fallacy; the Linda problem illustrates the “conjunction fallacy,” in which people judge the conjunction of two events to be more probable than one of the events alone in a direct comparison. The result also showed how easy it is to confuse probability with coherence and plausibility. In a joint evaluation, you get to compare two sets side by side; in a single evaluation, you see only one of them. System 1 sometimes shows a “less-is-more” pattern because it averages more easily than it adds, and that is exactly what happens with Linda: even though “Linda as a feminist bank teller” is a subset of “Linda as a bank teller,” participants still judged it the more likely description.
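The conjunction rule behind this result can be made concrete with a toy calculation. The numbers below are invented purely for illustration and do not come from the book; the point is only that, whatever values you choose, the probability of a conjunction can never exceed the probability of one of its parts:

```python
# Toy illustration of the conjunction rule behind the Linda problem.
# The probabilities are invented for illustration; they are not from the book.
p_teller = 0.05                 # assumed P(Linda is a bank teller)
p_feminist_given_teller = 0.30  # assumed P(Linda is a feminist, given she is a bank teller)

# P(bank teller AND feminist) = P(bank teller) * P(feminist | bank teller)
p_teller_and_feminist = p_teller * p_feminist_given_teller

print(p_teller)               # 0.05
print(p_teller_and_feminist)  # 0.015 -- necessarily no larger than p_teller
```

Whatever numbers you plug in, P(A and B) cannot exceed P(A), which is exactly the rule the study participants violated.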

Most people sort others into groups based on stereotypes, and they are more readily persuaded by causal information than by non-causal information. Even so, compelling causal statistics will rarely overturn long-held or deeply learned beliefs.

The author then turns to regression to the mean: when a variable is extreme on its first measurement, it tends to be closer to the average on the second measurement, and, symmetrically, if it is extreme on the second measurement it will tend to have been closer to the average on the first. This is not a causal force but an inevitable feature of any process that mixes skill with chance. The brain, however, strongly prefers causal explanations and handles numbers and statistics poorly, so whenever regression shows up, people reach for a causal story. Those stories are wrong: regression to the mean has an explanation, but it does not have a cause.
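A small simulation makes the effect visible. This is not from the book; it is a generic skill-plus-luck model, with the sizes and noise levels chosen arbitrarily:

```python
import random

random.seed(42)

# Each score is a stable "skill" plus random luck.
N = 10_000
skill = [random.gauss(0, 1) for _ in range(N)]
day1 = [s + random.gauss(0, 1) for s in skill]
day2 = [s + random.gauss(0, 1) for s in skill]

# Select the people who scored in roughly the top 10% on day 1.
cutoff = sorted(day1, reverse=True)[N // 10]
top = [i for i in range(N) if day1[i] >= cutoff]

def mean(xs):
    return sum(xs) / len(xs)

print(round(mean([day1[i] for i in top]), 2))  # roughly 2.5: far above the overall mean of 0
print(round(mean([day2[i] for i in top]), 2))  # roughly 1.25: still good, but pulled back toward the mean
```

Nothing “happened” to the top performers between the two days; their luck simply ran out on average, which is the whole explanation.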

Life constantly requires predictions. Yet when people are asked to make one, they often deliver an evaluation of the evidence instead, without realizing they are answering a different question, and such predictions are biased in a consistent way: they ignore regression to the mean entirely. The conclusion of this part is that correcting intuitive predictions is System 2’s job. It takes real work to find the relevant reference class, estimate a baseline prediction, and evaluate the quality of the evidence. System 1, meanwhile, produces overly extreme predictions, which people tend to trust far too much, and regression remains a persistent problem for System 2 because regressing to the mean is hard to explain and hard to understand.

Part 3. Overconfidence

In this part of the book, Kahneman explains what he calls “narrative fallacies,” a term introduced by the trader and statistician Nassim Taleb. Flawed stories about the past shape how people see the world and what they expect from it. People commit this fallacy as they keep trying to explain everything around them, and above all by putting too much emphasis on skill and not enough on luck; when luck plays a large role in a process or story, there is far less to learn from it. At bottom, the brain is a very thorough machine for making sense of the past. Hindsight bias arises when a person can no longer reconstruct what they believed before an event occurred, which leads them to misjudge how surprising it actually was. Outcome bias, for its part, leads people to blame decision makers when sound decisions turn out badly, and to give them too little credit when their decisions turn out well. What seems like a good idea at the time can look foolish, or even negligent, in hindsight.

Daniel Kahneman Quote 2: “Nothing in life is as important as you think it is, while you are thinking about it.”

Next comes the illusion of validity: even when people’s predictions are barely better than random guesses, they remain confident that those predictions are right. Stock pickers are equally susceptible to the related illusion of skill. Research shows that stock pickers make educated guesses under great uncertainty, and on close inspection those educated guesses turn out to be about as accurate as guesses made with no information at all. Professional cultures often sustain both illusions: as long as a belief enjoys strong social support, with the believer surrounded by people who share it, it tends to persist and grow. Despite what most people assume, outcomes are produced by many interacting forces, luck among them, so things can go in any direction.

Emotional learning, such as acquiring a fear, can happen quickly and easily, but developing genuine expertise takes far longer. Being an expert at a task or job comes down to having a large repertoire of smaller skills at your disposal. Learning a skill requires two things: an environment regular enough to be predictable, and the opportunity to learn its patterns through prolonged practice. Reliable intuition is impossible without such regularities, and without frequent, deliberate practice you will never develop the level of mastery that excellence requires.

When people work on a project and try to forecast how it will turn out, they tend to take the inside view; the outside view, by contrast, considers how similar cases handled by other people in the same reference class have turned out. The planning fallacy occurs when forecasts and plans are unrealistically close to the best-case scenario. To avoid it, planners should take the outside view and compare their project against relevant statistics from similar cases, an approach known as reference class forecasting.

In conclusion, those inclined to see the bright side of things are more likely to take chances despite the high risk involved. Entrepreneurs with an optimism bias are more likely to believe their venture will succeed even when the odds say otherwise, and they keep persisting even when the results of their efforts are disappointing.

Part 4. Choices

In this part of the book, the author turns to psychophysics and to Daniel Bernoulli, the Swiss mathematician and physicist who first studied how people evaluate risky choices. Given a choice between a gamble and a sure thing of equal expected value, people tend to take the sure thing. Bernoulli argued that decisions are driven not by the monetary value of outcomes but by their psychological value, or utility. Later research showed, however, that what matters is the change in wealth relative to a person’s reference point, and Bernoulli’s model had no reference point.

Bernoulli’s utility theory evaluates the benefit of a gain by comparing the utilities of two states of wealth, and its flaw is precisely the missing reference point: the state from which gains and losses are measured. Prospect theory went beyond utility theory by explaining how people actually make decisions, especially when risk is involved and they must choose between different prospects, and by recognizing that choices are driven by psychological responses rather than by the facts alone. Prospect theory is more complicated than utility theory and rests on three cognitive features. First, people evaluate outcomes relative to a neutral reference point, which is usually the status quo. Second, the principle of diminishing sensitivity applies both to changes in wealth and to sensory experience. Third, people are loss averse: losses loom larger than equivalent gains.
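These three features are often summarized with a simple value function. The parametric form and the numbers below come from Tversky and Kahneman’s later cumulative prospect theory work rather than from this book’s text, so treat this as an illustrative sketch only:

```python
def value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain or loss x, measured from the reference point."""
    if x >= 0:
        return x ** alpha          # diminishing sensitivity to gains
    return -lam * (-x) ** alpha    # losses loom larger than equal gains (loss aversion)

print(round(value(100), 1))   # about 57.5: subjective value of gaining $100
print(round(value(-100), 1))  # about -129.5: losing $100 hurts more than twice as much
```

The asymmetry between the two printed values is the loss aversion described above, and the fact that value(200) is less than twice value(100) is the diminishing sensitivity.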

He later explains the Endowment Effect: an item seems to gain value simply because you own it, so giving it up feels more painful than acquiring the same thing feels pleasurable. The key distinction is that goods held for exchange are valued differently from goods held for use and enjoyment.

The brain is built to give more weight to bad news than to good, a tendency captured perfectly by loss aversion. Norms of fairness are built around this asymmetry: the rules about what may and may not be done are stricter for imposing losses than for withholding gains.

Kahneman goes on to explain how people’s preferences follow a consistent pattern. When evaluating a complex prospect, the mind assigns each possible outcome a decision weight, and these weights do not match the outcomes’ probabilities. This produces the possibility effect, in which highly unlikely outcomes are weighted far more than they deserve, and the certainty effect, in which outcomes that are almost certain receive less weight than their probability justifies. In prospect theory, people attach value to gains and losses rather than to states of wealth, and the decision weights they assign to outcomes differ from the probabilities. The fourfold pattern of preferences is regarded as the core of the theory. First, when there is a high probability of a large gain, people are risk averse and prefer the sure thing. Second, when a large prize is on offer, they ignore how small their chances of winning are, as with lottery tickets. Third, people will pay more than a risk is objectively worth just to buy peace of mind and stop worrying; insurance is the classic example. Lastly, people left with only bad options will often gamble, even though that is likely to make things worse, for a small chance of avoiding a large loss.

Now, something interesting happens when people are exposed to the media, related to the availability cascade described earlier. Whenever the media show stark, extreme images of damage, System 1 produces an automatic, uncontrollable emotional response, and even if System 2 knows such events are unlikely, it cannot stop System 1 from reacting. At that point probability goes out the window: only the possibility of the outcome registers. People give unlikely events too much weight in their decisions because they judge them more likely than they really are, a judgment shaped by how easily and how vividly the scenario can be pictured. There is also denominator neglect, in which the vividness of an event makes it loom larger in a decision than the underlying ratio warrants. This is also why different ways of describing the same risk can have very different effects on different people.

Narrow framing means treating a series of small decisions separately; broad framing means combining them into a single comprehensive choice (in Kahneman’s example, considering all four combinations of two decisions at once). A broad frame blunts the emotional response to individual losses and makes a person more willing to take worthwhile risks. For people inclined toward narrow framing, it helps to adopt a risk policy that can be applied whenever a similar problem comes up. The outside view and a risk policy are complementary remedies: the outside view corrects the planning fallacy’s optimism, while a risk policy corrects the excessive caution produced by loss aversion. People often keep score of what they do in terms of success and self-regard, and keeping score in this way helps them keep things in check and feel in control.

People’s judgments of what is good and bad are not as stable as they assume, which is why their preferences can reverse. Their moral intuitions are not always consistent either, especially across different situations. System 1 usually governs emotional reactions and produces only single evaluations; a joint evaluation requires a more thorough, careful comparison, which is System 2’s job. The world is divided into categories, and people carry norms for each one. As long as the things being compared belong to the same category, the preferences and decisions made about them are coherent; when they come from different categories, the results can stop making sense.

Statements can be tied to reality, but often their meaning has more to do with what they evoke in a person’s System 1 as they try to make sense of the situation. People cannot be considered fully rational, because logically equivalent statements can produce different reactions depending on how they are framed and on how the person feels at the time. Since System 1 is not bound to reality, the choices people make are tied to descriptions, or frames, of reality rather than to reality itself, and moral feelings attach to those frames more than to the underlying facts.

Part 5. Two Selves

In this last part, the author describes our two mental selves: the experiencing self and the remembering self. The experiencing self answers questions like “How are things right now?” while the remembering self answers questions like “How was it, all in all?” Over a lifetime, all a person gets to keep are memories, and confusing an experience with the memory of it is a cognitive illusion. The remembering self’s job is to compose stories and store them for future reference. Most people put a great deal of thought into their life story and go out of their way to make it a good one. When whole lives, or even small episodes, are judged intuitively, the peaks and the endings matter far more than the duration.

Daniel Kahneman Quote 3: “Intelligence is not only the ability to reason; it is also the ability to find relevant material in memory and to deploy attention when needed.”

People experience well-being when their lives are full of activities they would rather continue than stop. One sign of a good time is total absorption in what you are doing, a state called “flow”; not wanting to be interrupted is a good indicator that you are enjoying yourself. The U-index measures the proportion of a person’s time spent in an unpleasant state. The idea of “well-being” has two main components: how people feel as they go about their daily lives, and the judgment they make about their life whenever they stop to think about it. As for whether money buys happiness: poverty makes people miserable, and wealth can improve a person’s satisfaction with their life, but beyond a point it does not make their day-to-day experience happier. One of the easiest ways to become happier is to take charge of your time: making more room for the things you enjoy doing raises experienced well-being.

Lastly, the focusing illusion, affective forecasting, and the passage of time are the three main influences on how satisfied a person is with their life. As time goes on, people stop paying attention to something new because it is no longer new. The mind is built to process stories with ease, but it is not built to process time.

Most Important Keywords, Sentences, Quotes

Part 1. Two Systems

“A happy mood loosens the control of [caution and analysis] over our performance: when in a good mood, people become more intuitive and more creative, but also less vigilant and more prone to logical errors.”

“We are prone to overestimate how much we understand about the world and to underestimate the role of chance in events.”

“We can be blind to the obvious, and we are also blind to our blindness.”

Part 2. Heuristics and Biases

“Declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true.”

“The experiencing self does not have a voice. The remembering self is sometimes wrong, but it is the one that keeps score and governs what we learn from living, and it is the one that makes decisions. What we learn from the past is to maximize the qualities of our future memories, not necessarily of our future experience. This is the tyranny of the remembering self.”

“Experts who acknowledge the full extent of their ignorance may expect to be replaced by more confident competitors, who are better able to gain the trust of clients. An unbiased appreciation of uncertainty is a cornerstone of rationality–but it is not what people and organizations want.”

“The idea of mental energy is more than a mere metaphor. The nervous system consumes more glucose than most other parts of the body, and effortful mental activity appears to be especially expensive in the currency of glucose.”

“A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth. Authoritarian institutions and marketers have always known this fact.”

“Nothing in life is as important as you think it is, while you are thinking about it”

“Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.”

“The psychologist, Paul Rozin, an expert on disgust, observed that a single cockroach will completely wreck the appeal of a bowl of cherries, but a cherry will do nothing at all for a bowl of cockroaches.”

Part 3. Overconfidence

“Intelligence is not only the ability to reason; it is also the ability to find relevant material in memory and to deploy attention when needed.”

“If you care about being thought credible and intelligent, do not use complex language where simpler language will do.”

“The idea that the future is unpredictable is undermined every day by the ease with which the past is explained.”

“Odd as it may seem, I am my remembering self, and the experiencing self, who does my living, is like a stranger to me.”

“A general “law of least effort” applies to cognitive as well as physical exertion. The law asserts that if there are several ways of achieving the same goal, people will eventually gravitate to the least demanding course of action. In the economy of action, effort is a cost, and the acquisition of skill is driven by the balance of benefits and costs. Laziness is built deep into our nature.”

“This is the essence of intuitive heuristics: when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.”

“I have always believed that scientific research is another domain where a form of optimism is essential to success: I have yet to meet a successful scientist who lacks the ability to exaggerate the importance of what he or she is doing, and I believe that someone who lacks a delusional sense of significance will wilt in the face of repeated experiences of multiple small failures and rare successes, the fate of most researchers.”

“We are prone to overestimate how much we understand about the world and to underestimate the role of chance in events.”

“The confidence that individuals have in their beliefs depends mostly on the quality of the story they can tell about what they see, even if they see little.”

Part 4. Choices

“Money does not buy you happiness, but lack of money certainly buys you misery.”

“We can be blind to the obvious, and we are also blind to our blindness.”

“The world makes much less sense than you think. The coherence comes mostly from the way your mind works.”

“You are more likely to learn something by finding surprises in your own behavior than by hearing surprising facts about people in general.”

“Familiarity breeds liking.”

“The illusion that we understand the past fosters overconfidence in our ability to predict the future.”

“The easiest way to increase happiness is to control your use of time. Can you find more time to do the things you enjoy doing?”

“The test of learning psychology is whether your understanding of situations you encounter has changed, not whether you have learned a new fact.”

“[…]acquisition of skills requires a regular environment, an adequate opportunity to practice, and rapid and unequivocal feedback about the correctness of thoughts and actions.”

Part 5. Two Selves

“People tend to assess the relative importance of issues by the ease with which they are retrieved from memory—and this is largely determined by the extent of coverage in the media. Frequently mentioned topics populate the mind even as others slip away from awareness. In turn, what the media choose to report corresponds to their view of what is currently on the public’s mind. It is no accident that authoritarian regimes exert substantial pressure on independent media. Because public interest is most easily aroused by dramatic events and by celebrities, media feeding frenzies are common”

“We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact.”

“We focus on our goal, anchor on our plan, and neglect relevant base rates, exposing ourselves to the planning fallacy. We focus on what we want to do and can do, neglecting the plans and skills of others. Both in explaining the past and in predicting the future, we focus on the causal role of skill and neglect the role of luck. We are therefore prone to an illusion of control. We focus on what we know and neglect what we do not know, which makes us overly confident in our beliefs.”


“Because we tend to be nice to other people when they please us and nasty when they do not, we are statistically punished for being nice and rewarded for being nasty.”

“The premise of this book is that it is easier to recognize other people’s mistakes than our own.”

Book Review (Personal Opinion):

I believe this is a wonderful book that catalogs a large number of the mistakes we make in our everyday decisions. It is easy to read and divided into short sections that can each be finished in thirty minutes or less, which makes it an excellent choice for reading on public transportation, for example. It shows how our brains process information and make sense of the world using both intuitive and analytical mental processes, and it explains and demonstrates the judgment errors people make. Everything is grounded in statistical and psychological research, and the studies show why and how we produce these mental fallacies. The book also supplies new vocabulary for talking about and describing these phenomena. It is an excellent choice for readers who want to dig deeper into the human psyche and gain a more thorough understanding of the subject matter at hand.

Rating: 9/10

If You Want To Learn More

Here is a talk by Daniel Kahneman: “Thinking, Fast and Slow | Daniel Kahneman | Talks at Google”

How I’ve Implemented The Ideas From The Book

The dichotomy Kahneman sets up on the first page was one of the most interesting things I noticed about the book. He tells the reader to think of the book as a story about two main characters. The first, System 1, is the unsung “hero” of our mind’s story: it is always on and produces the first response to anything we encounter. It is fast, more “emotional,” and more prone to mistakes and biases; it works efficiently, requires little effort, and runs constantly in the background. In other words, System 1 relies on what Kahneman and Tversky call “heuristics” to make decisions. Alongside it, Kahneman introduces a supporting character, System 2, that believes it is the protagonist. System 2 is deliberate and aware, and it is what we think of when we talk about ourselves. Knowing this has helped me a great deal in improving my thinking and decision-making: I now try to think consciously about the decisions I make instead of simply doing the first thing that comes to mind.
