ILLUSIONS OF REMEMBERING
The word illusion brings visual illusions to mind, because we are all familiar with pictures that mislead. But vision is not the only domain of illusions; memory is also susceptible to them, as is thinking more generally.
David Stenbill, Monica Bigoutski, Shana Tirana. I just made up these names. If you encounter any of them within the next few minutes you are likely to remember where you saw them. You know, and will know for a while, that these are not the names of minor celebrities. But suppose that a few days from now you are shown a long list of names, including some minor celebrities and “new” names of people that you have never heard of; your task will be to check every name of a celebrity in the list. There is a substantial probability that you will identify David Stenbill as a well-known person, although you will not (of course) know whether you encountered his name in the context of movies, sports, or politics. Larry Jacoby, the psychologist who first demonstrated this memory illusion in the laboratory, titled his article “Becoming Famous Overnight.” How does this happen? Start by asking yourself how you know whether or not someone is famous. In some cases of truly famous people (or of celebrities in an area you follow), you have a mental file with rich information about a person—think Albert Einstein, Bono, Hillary Clinton. But you will have no file of information about David Stenbill if you encounter his name in a few days. All you will have is a sense of familiarity—you have seen this name somewhere.
Jacoby nicely stated the problem: “The experience of familiarity has a simple but powerful quality of ‘pastness’ that seems to indicate that it is a direct reflection of prior experience.” This quality of pastness is an illusion. The truth is, as Jacoby and many followers have shown, that the name David Stenbill will look familiar when you see it because you will see it more clearly. Words that you have seen before become easier to see again—you can identify them better than other words when they are shown very briefly or masked by noise, and you will be quicker (by a few hundredths of a second) to read them than to read other words. In short, you experience greater cognitive ease in perceiving a word you have seen earlier, and it is this sense of ease that gives you the impression of familiarity.
TOPIC: #CognitiveBiases
SOURCE: Thinking, Fast and Slow by Daniel Kahneman
HALO EFFECT
If you like the president’s politics, you probably like his voice and his appearance as well. The tendency to like (or dislike) everything about a person, including things you have not observed, is known as the halo effect. It is one of the ways in which the representation of the world that System 1 generates (see the post on System 1 and System 2 below) is simpler and more coherent than the real thing.
For example: You meet a woman named Joan at a party and find her personable and easy to talk to. Now her name comes up as someone who could be asked to contribute to a charity. What do you know about Joan’s generosity? The correct answer is that you know virtually nothing, because there is little reason to believe that people who are agreeable in social situations are also generous contributors to charities. But you like Joan and you will retrieve the feeling of liking her when you think of her. You also like generosity and generous people. By association, you are now predisposed to believe that Joan is generous. And now that you believe she is generous, you probably like Joan even better than you did earlier, because you have added generosity to her pleasant attributes.
Real evidence of generosity is missing in the story of Joan, and the gap is filled by a guess that fits one’s emotional response to her. In other situations, evidence accumulates gradually and the interpretation is shaped by the emotion attached to the first impression.
TOPIC: #CognitiveBiases
SOURCE: Thinking, Fast and Slow by Daniel Kahneman
HOW OUR BRAIN WORKS
Our brains consist of two characters: one that thinks fast, System 1, and one that thinks slow, System 2.
System 1 operates automatically, intuitively, involuntarily, and effortlessly—like when we drive or read an angry facial expression…
MIDDLE GROUND
You claimed that a compromise, or middle point, between two extremes must be the truth.
Much of the time the truth does indeed lie between two extreme points, but this can bias our thinking: sometimes a thing is simply untrue and a compromise of it is also untrue. Halfway between truth and a lie is still a lie.
Example: Holly said that vaccinations caused autism in children, but her scientifically well-read friend Caleb said that this claim had been debunked and proven false. Their friend Alice offered a compromise that vaccinations must cause some autism, just not all autism.
TOPIC: #LogicalFallacy
SOURCE: yourlogicalfallacyis.com
THE GAMBLER'S FALLACY
You said that 'runs' occur in statistically independent phenomena such as roulette wheel spins.
This commonly believed fallacy can be said to have helped create an entire city in the desert of Nevada, USA. Though the overall odds of a 'big run' happening may be low, each spin of the wheel is itself entirely independent from the last. So whilst there may be a very small chance that heads will come up 20 times in a row if you flip a coin, the chances of heads coming up on each individual flip remain 50/50, and aren't influenced by what happened before.
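To make that arithmetic concrete, here is a minimal Python sketch (my own illustration, not from the source) contrasting the low advance probability of a 20-heads run with the unchanged probability of heads on the very next flip:

# Fair coin: a long run, priced in advance, vs. the next single flip.
p_heads = 0.5

# Probability, before any flips, of seeing 20 heads in a row.
p_run_of_20 = p_heads ** 20
print(f"P(20 heads in a row) = {p_run_of_20:.8f}")   # ~0.00000095

# Probability of heads on flip 20 given 19 heads already observed.
# The flips are independent, so the history changes nothing.
print(f"P(heads on flip 20 | 19 heads so far) = {p_heads}")   # 0.5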
Example: Red had come up six times in a row on the roulette wheel, so Greg knew that it was close to certain that black would be next up. Suffering an economic form of natural selection with this thinking, he soon lost all of his savings.
TOPIC: #LogicalFallacy
SOURCE: yourlogicalfallacyis.com
FRAMING WITH LEADING QUESTIONS AND NEGATION
Framing occurs when a question is asked in a way that suggests what the correct response should be. The reader is “led” into assuming a particular perspective or point of view.
Consider the following problem:
Imagine that the U.S. is preparing for the outbreak of an unusual disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimates of the consequences of the programs are as follows:
"If Program A is adopted, 200 people will be saved. If Program B is adopted, there is 1/3 probability that 600 people will be saved, and 2/3 probability that no people will be saved. Which of the two programs would you favor?"
Now consider the same problem, and select between the following two programs:
"If Program C is adopted, 400 people will die. If Program D is adopted there is 1/3 probability that nobody will die, and 2/3 probability that 600 people will die. Which of these two programs would you favor?"
When this problem was presented to college students, 72% of those given the first set of choices selected Program A, while 78% of those given the second set of choices selected Program D. Look closely at the choices. Programs A and C are effectively identical—they differ only in that A is described in terms of the number of lives saved, while C is described in terms of the number who will die. Programs B and D are also identical, differing only in the language used to describe the outcomes. It seems that most people are risk averse, which means that they prefer options that do not involve loss. When an alternative makes a potential loss prominent (e.g., focuses on the number that die), people will reject that alternative. It is clear that when an option is stated in terms of a loss, it is judged more negatively than a mathematically identical statement about gains. This is an important result, showing that human judgments and preferences can be readily manipulated by changes in the way questions are asked or framed. If I tell you that a new medical treatment has a 50% success rate you will be more likely to endorse its use than if I tell you that it has a 50% failure rate. The only difference is whether the information was presented in a positive (success rate) frame or a negative (failure rate) frame.
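The equivalence is easy to verify with expected values. Here is a minimal Python sketch (my own arithmetic, not part of Halpern's text) computing expected deaths out of the 600 at risk under each program:

# Expected deaths out of 600 for each program in the disease problem.
TOTAL = 600

expected_deaths = {
    "A": TOTAL - 200,        # 200 saved for certain -> 400 die
    "B": (2/3) * TOTAL,      # 1/3 chance all saved, 2/3 chance none saved
    "C": 400,                # 400 die for certain
    "D": (2/3) * TOTAL,      # 1/3 chance nobody dies, 2/3 chance all 600 die
}

for name, deaths in expected_deaths.items():
    print(f"Program {name}: {deaths:.0f} expected deaths")
# All four print 400 -- only the framing differs.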
Framing can be used to influence thinking in many different contexts, so its effects can be powerful. If you understand how they work, you can use framing to your advantage and recognize when others are using it to their advantage. Suppose you are interviewing for a job and you have gotten to the “sticky” issue of negotiating salary. If you said to the prospective employer that you really wanted $50,000 a year, but you are willing to take $45,000, the employer begins to see this offer as a gain of $5,000. Similarly, if the prospective employer were to say that she was ready to offer $40,000 but is willing to go as high as $45,000 (after all you studied critical thinking and should be worth more salary), then you would have the “feeling” of having gained $5,000.
TOPIC: #CognitiveBiases
SOURCE: Thought and knowledge: an introduction to critical thinking by Diane F. Halpern
Which rubric do you want to see more?
anonymous poll
more #CognitiveBiases – 199 (29%)
more #Psychology – 194 (28%)
more #Books – 131 (19%)
more #LogicalFallacy – 104 (15%)
more #Explanations – 65 (9%)
693 people voted so far.
THE SCARCITY PRINCIPLE
According to the scarcity principle, objects become more attractive when there are not very many of them. This scarcity may be either real or imagined. People assume that because others appear to want something, and it is in short supply, it must be valuable.
Brands can use the scarcity principle to persuade people to fill out a lead form, purchase a product, or take another desired action. Here’s an example: On many air travel booking sites, such as KAYAK, flight listings are displayed with a note that only a few seats are left at a certain price.
We know that airfare pricing is incredibly volatile — that’s why some of us wait until certain times or days of the week to make purchases — so the knowledge that only one seat is available at that price makes us think we should buy it now, instead of waiting and running the risk of paying more later.
TOPIC: #Psychology
SOURCE: Influence: The Psychology of Persuasion by Robert B. Cialdini
PRIMING EFFECTS
As is common in science, the first big breakthrough in our understanding of the mechanism of association was an improvement in a method of measurement. Until a few decades ago, the only way to study associations was to ask many people questions such as, “What is the first word that comes to your mind when you hear the word DAY?” The researchers tallied the frequency of responses, such as “night,” “sunny,” or “long.” In the 1980s, psychologists discovered that exposure to a word causes immediate and measurable changes in the ease with which many related words can be evoked. If you have recently seen or heard the word EAT, you are temporarily more likely to complete the word fragment SO_P as SOUP than as SOAP. The opposite would happen, of course, if you had just seen WASH. We call this a priming effect and say that the idea of EAT primes the idea of SOUP, and that WASH primes SOAP.
Priming effects take many forms. For example, common gestures can unconsciously influence our thoughts and feelings. In one demonstration, people were asked to listen to messages through new headphones. They were told that the purpose of the experiment was to test the quality of the audio equipment and were instructed to move their heads repeatedly to check for any distortions of sound. Half the participants were told to nod their head up and down while others were told to shake it side to side. The messages they heard were radio editorials. Those who nodded (a yes gesture) tended to accept the message they heard, but those who shook their head tended to reject it. Again, there was no awareness, just a habitual connection between an attitude of rejection or acceptance and its common physical expression. You can see why the common admonition to “act calm and kind regardless of how you feel” is very good advice: you are likely to be rewarded by actually feeling calm and kind.
TOPIC: #CognitiveBiases
SOURCE: Thinking, Fast and Slow by Daniel Kahneman
OVERCONFIDENCE PHENOMENON
By definition, there is always some uncertainty in probabilistic events. Yet, research has shown that people tend to be more confident in their decisions about probabilistic events than they should be.
In an experimental investigation of the overconfidence phenomenon, people were asked to provide answers with a specified degree of confidence to factual questions (Kahneman & Tversky, 1979). Try it with this question: “I feel 98 percent certain that the number of nuclear plants operating in the world in 1980 was more than _____ and less than ____.” Fill in the blanks with numbers that reflect 98% confidence. The researchers investigating this effect found that nearly one-third of the time, the correct answer did not lie between the two values that reflected a 98% level of confidence. (The correct answer to this question is 189.) This result demonstrates that people are often highly confident when their high degree of confidence is unwarranted.
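You can run the same calibration check on yourself: collect a set of 98% intervals and count how often the true answer lands inside. A minimal Python sketch (illustrative only; every response below is hypothetical except the nuclear-plant answer of 189, which is from the text):

# Calibration check for 98%-confidence intervals.
# Each tuple is (stated_low, stated_high, true_value); data are hypothetical.
responses = [
    (50, 150, 189),      # nuclear plants operating in 1980 (true answer: 189)
    (500, 2000, 1250),   # made-up quiz item
    (10, 30, 54),        # made-up quiz item
]

hits = sum(low <= true <= high for low, high, true in responses)
coverage = hits / len(responses)
print(f"Stated confidence: 98%, actual coverage: {coverage:.0%}")
# Well-calibrated 98% intervals should miss only ~2% of the time;
# in the study, roughly a third of them missed.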
Have you ever bought a lottery ticket? Do you know what the odds are against your hitting the jackpot? The laws of probability dictate that you should expect to lose, yet countless numbers of people expect to win. In fact, a disturbing poll published in Money magazine revealed that almost as many people are planning for their retirement by buying lottery tickets (39%) as are investing in stocks (43%) (Wang, 1994).
Overconfidence about uncertain events is a problem even for experts in many fields where there is great uncertainty. In an analysis of political predictions (who is likely to win, for example), Silver (2011, para. 3) wrote:
“Experts have a poor understanding of uncertainty. Usually, this manifests itself in the form of overconfidence: experts underestimate the likelihood that their predictions might be wrong.”
Overconfidence can be disastrous for financial investors. In a study of individual investors, two economists found that most people fail to recognize the role that chance plays in the stock market, so they tend to attribute gains to their own expertise in picking stocks and losses to external forces that they could not control. The result is that overconfident investors trade their stocks far too often because they believe that they are making wise choices (Gervais & Odean, 2001). If they could recognize the effects of random fluctuations in the stock market instead of attributing the changes to their own trading behaviors, they would have traded less often and ended up in better financial shape.
TOPIC: #CognitiveBiases
SOURCE: Thought and knowledge: an introduction to critical thinking by Diane F. Halpern
THE ILLUSION OF UNDERSTANDING
Nassim Taleb, in The Black Swan, introduced the notion of a narrative fallacy to describe how flawed stories of the past shape our views of the world and our expectations for the future. Narrative fallacies arise inevitably from our continuous attempt to make sense of the world.
Consider the story of how Google turned into a giant of the technology industry. Two creative graduate students in the computer science department at Stanford University come up with a superior way of searching information on the Internet. They seek and obtain funding to start a company and make a series of decisions that work out well. Within a few years, the company they started is one of the most valuable stocks in America, and the two former graduate students are among the richest people on the planet.
I intentionally told this tale blandly, but you get the idea: there is a very good story here. Fleshed out in more detail, the story could give you the sense that you understand what made Google succeed; it would also make you feel that you have learned a valuable general lesson about what makes businesses succeed. Unfortunately, there is good reason to believe that your sense of understanding and learning from the Google story is largely illusory.
Like watching a skilled rafter avoiding one potential calamity after another as he goes down the rapids, the unfolding of the Google story is thrilling because of the constant risk of disaster. However, there is an instructive difference between the two cases. The skilled rafter has gone down rapids hundreds of times. He has learned to read the roiling water in front of him and to anticipate obstacles. There are fewer opportunities for young men to learn how to create a giant company, and fewer chances to avoid hidden rocks—such as a brilliant innovation by a competing firm. Of course there was a great deal of skill in the Google story, but luck played a more important role in the actual event than it does in the telling of it.
I have heard of too many people who “knew well before it happened that the 2008 financial crisis was inevitable.” This sentence contains a highly objectionable word, which should be removed from our vocabulary in discussions of major events. The word is, of course, knew. Some people thought well in advance that there would be a crisis, but they did not know it. They now say they knew it because the crisis did in fact happen. This is a misuse of an important concept. In everyday language, we apply the word know only when what was known is true and can be shown to be true. We can know something only if it is both true and knowable. But the people who thought there would be a crisis (and there are fewer of them than now remember thinking it) could not conclusively show it at the time.
The core of the illusion is that we believe we understand the past, which implies that the future also should be knowable, but in fact we understand the past less than we believe we do. Know is not the only word that fosters this illusion. In common usage, the words intuition and premonition also are reserved for past thoughts that turned out to be true. The statement “I had a premonition that the marriage would not last, but I was wrong” sounds odd, as does any sentence about an intuition that turned out to be false. To think clearly about the future, we need to clean up the language that we use in labeling the beliefs we had in the past.
TOPIC: #CognitiveBiases
SOURCE: Thinking, Fast and Slow by Daniel Kahneman
THE MENTAL SET
A mental set is a tendency to only see solutions that have worked in the past. This type of fixed thinking can make it difficult to come up with solutions and can impede the problem-solving process.
For example, let's imagine that your vacuum cleaner has stopped working. When it has stopped working in the past, a broken belt was the culprit. Since past experience has taught you that the belt is a common issue, you replace the belt again, but this time the vacuum continues to malfunction.
You ask a friend to come take a look at the vacuum, and he discovers that one of the hose attachments was not connected, causing the vacuum to lose suction. Because of your mental set, you failed to notice a fairly obvious solution to the problem.
TOPIC: #CognitiveBiases
SOURCE: verywell.com
FUNCTIONAL FIXEDNESS
Did you read the previous post? Yes? Then you already know what a mental set is. Functional fixedness is a specific type of mental set: the tendency to see only those solutions that use objects in their normal or expected manner.
Imagine that you need to drive a nail into a wall so you can hang a framed photo. Unable to find a hammer, you spend a significant amount of time searching your house to find the missing tool. A friend comes over and suggests using a metal wrench instead to pound the nail into the wall.
Why didn't you think of using the metal wrench? Psychologists suggest that something known as functional fixedness often prevents us from thinking of alternative solutions to problems and different uses for objects.
TOPIC: #CognitiveBiases
SOURCE: verywell.com
The Anchoring Bias
(reading time – 50 sec.)
We tend to be overly influenced by the first piece of information that we hear. For example, the first number voiced during a price negotiation typically becomes the anchoring point from which all further negotiations are based. Researchers have even found that having participants choose a completely random number can influence what people guess when asked unrelated questions, such as how many countries there are in Africa.
This tricky little cognitive bias influences more than just negotiations. Doctors, for example, can become susceptible to the anchoring bias when diagnosing patients. The physician’s first impressions of the patient often create an anchoring point that can sometimes incorrectly influence all subsequent diagnostic assessments.
Topic: #CognitiveBiases
Source: verywell.com
The Actor-Observer Bias
(reading time – 50 sec.)
The way we perceive others and how we attribute their actions hinges on a variety of variables, but it can be heavily influenced by whether we are the actor or the observer in a situation. When it comes to our own actions, we are often far too likely to attribute things to external influences. You might complain that you botched an important meeting because you had jet lag or that you failed an exam because the teacher posed too many trick questions.
When it comes to explaining other people’s actions, however, we are far more likely to attribute their behaviors to internal causes. A colleague screwed up an important presentation because he’s lazy and incompetent (not because he also had jet lag) and a fellow student bombed a test because she lacks diligence and intelligence (and not because she took the same test as you with all those trick questions).
Topic: #CognitiveBiases
Source: verywell.com
The False-Consensus Effect
(reading time – 1 min.)
People have a surprising tendency to overestimate how much other people agree with their own beliefs, behaviors, attitudes, and values, an inclination known as the false consensus effect. This can lead people not only to incorrectly think that everyone else agrees with them – it can sometimes lead them to overvalue their own opinions.
Researchers believe that the false consensus effect happens for a variety of reasons. First, the people we spend the most time with, our family and friends, do often tend to share very similar opinions and beliefs. Because of this, we start to think that this way of thinking is the majority opinion. Another key reason this cognitive bias trips us up so easily is that believing that other people are just like us is good for our self-esteem. It allows us to feel "normal" and maintain a positive view of ourselves in relation to other people.
Topic: #CognitiveBiases
Source: verywell.com
Anecdotal
(reading time – 1 min.)
You used a personal experience or an isolated example instead of a sound argument or compelling evidence.
It's often much easier for people to believe someone's testimony as opposed to understanding complex data and variation across a continuum. Quantitative scientific measures are almost always more accurate than personal perceptions and experiences, but our inclination is to believe that which is tangible to us, and/or the word of someone we trust over a more 'abstract' statistical reality.
Example: Jason said that that was all cool and everything, but his grandfather smoked, like, 30 cigarettes a day and lived until 97 - so don't believe everything you read about meta analyses of methodologically sound studies showing proven causal relationships.
Topic: #LogicalFallacy
Source: yourlogicalfallacyis.com
The Self-Serving Bias
(reading time – 35 sec.)
A cognitive bias that distorts your thinking is known as the self-serving bias. Basically, people tend to give themselves credit for successes but lay the blame for failures on outside causes. When you do well on a project, you probably assume that it’s because you worked hard. But when things turn out badly, you are more likely to blame it on circumstances or bad luck. This bias does serve an important role; it helps protect our self-esteem. However, it does often lead to faulty attributions, such as blaming others for our own shortcomings.
Topic: #CognitiveBiases
Source: verywell.com
The Optimism Bias
(reading time – 55 sec.)
Another cognitive bias that has its roots in the availability heuristic is known as the optimism bias. Essentially, we tend to be too optimistic for our own good. We overestimate the likelihood that good things will happen to us while underestimating the probability that negative events will impact our lives. We assume that events like divorce, job loss, illness, and death happen to other people.
So what impact does this sometimes unrealistic optimism really have on our lives? It can lead people to take health risks like smoking, eating poorly, or not wearing a seat belt.
The bad news is that research has found that this optimism bias is incredibly difficult to reduce. There is good news, however. This tendency toward optimism helps create a sense of anticipation for the future, giving people the hope and motivation they need to pursue their goals. So while cognitive biases can distort our thinking and sometimes lead to poor decisions, they are not always so bad.
Topic: #CognitiveBiases
Source: verywell.com
The Misinformation Effect
(reading time – 1 min.)
Our memories of particular events also tend to be heavily influenced by things that happened after the actual event itself, a phenomenon known as the misinformation effect. A person who witnesses a car accident or crime might believe that their recollection is crystal clear, but researchers have found that memory is surprisingly susceptible to even very subtle influences.
In one classic experiment by memory expert Elizabeth Loftus, people who watched a video of a car crash were then asked one of two slightly different questions: “How fast were the cars going when they hit each other?” or “How fast were the cars going when they smashed into each other?”
When the witnesses were then questioned a week later, the researchers discovered that this small change in how questions were presented led participants to recall things that they did not actually witness. When asked whether they had seen any broken glass, those who had been asked the “smashed into” version of the question were more likely to report incorrectly that they had seen broken glass.
Topic: #CognitiveBiases
Source: verywell.com
The Hindsight Bias
(reading time – 40 sec.)
One common cognitive bias involves the tendency to see events, even random ones, as more predictable than they are. In one classic psychology experiment, college students were asked to predict whether they thought then-nominee Clarence Thomas would be confirmed to the U.S. Supreme Court. Prior to the Senate vote, 58% of the students thought Thomas would be confirmed. The students were polled again following Thomas's confirmation, and a whopping 78% of students said they had believed Thomas would be confirmed.
This tendency to look back on events and believe that we “knew it all along” is surprisingly prevalent. Following exams, students often look back on questions and think “Of course! I knew that!” even though they missed it the first time around. Investors look back and believe that they could have predicted which tech companies would become dominant forces.
Topic: #CognitiveBiases
Source: verywell.com
The Availability Heuristic
(reading time – 40 sec.)
After seeing several news reports of car thefts in your neighborhood, you might start to believe that such crimes are more common than they are. This tendency to estimate the probability of something happening based on how many examples readily come to mind is known as the availability heuristic. It is essentially a mental shortcut designed to save us time when we are trying to determine risk.
The problem with relying on this way of thinking is that it often leads to poor estimates and bad decisions. Smokers who have never known someone to die of a smoking-related illness, for example, might underestimate the health risks of smoking.
Topic: #CognitiveBiases
Source: verywell.com