Critical thinking – Telegram
OVERCONFIDENCE PHENOMENON

By definition, there is always some uncertainty in probabilistic events. Yet, research has shown that people tend to be more confident in their decisions about probabilistic events than they should be.

In an experimental investigation of the overconfidence phenomenon, people were asked to provide answers with a specified degree of confidence to factual questions (Kahneman & Tversky, 1979). Try it with this question: “I feel 98 percent certain that the number of nuclear plants operating in the world in 1980 was more than _____ and less than ____.” Fill in the blanks with numbers that reflect 98% confidence. The researchers investigating this effect found that nearly one-third of the time, the correct answer did not lie between the two values that reflected a 98% level of confidence. (The correct answer to this question is 189.) This result demonstrates that people are often highly confident when their high degree of confidence is unwarranted.
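The calibration idea behind this experiment can be made concrete with a short sketch (the data here are hypothetical, not from the study): collect each respondent's stated 98% interval alongside the true value, then compute how often the truth actually falls inside. A well-calibrated respondent should miss only about 2% of the time; the study's participants missed nearly a third.

```python
# Illustrative sketch with hypothetical responses: how well-calibrated
# are someone's "98% confident" interval estimates?

def calibration_rate(answers):
    """Fraction of (low, high, truth) triples where truth falls inside the interval."""
    hits = sum(1 for low, high, truth in answers if low <= truth <= high)
    return hits / len(answers)

# Hypothetical responses to the nuclear-plant question: (stated low, stated high, true value).
responses = [
    (50, 150, 189),   # miss: the truth lies above the stated upper bound
    (100, 300, 189),  # hit
    (10, 60, 189),    # miss
    (150, 250, 189),  # hit
    (170, 200, 189),  # hit
]

rate = calibration_rate(responses)
print(f"Interval hit rate: {rate:.0%}")  # a calibrated 98% interval should hit ~98% of the time
```

A hit rate far below the stated confidence level is the signature of overconfidence: the intervals were drawn too narrow.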

Have you ever bought a lottery ticket? Do you know what the odds are against your hitting the jackpot? The laws of probability dictate that you should expect to lose, yet countless people expect to win. In fact, a disturbing poll published in Money magazine revealed that almost as many people were planning for their retirement by buying lottery tickets (39%) as by investing in stocks (43%) (Wang, 1994).

Overconfidence about uncertain events is a problem even for experts in many fields where there is great uncertainty. In an analysis of political predictions (who is likely to win, for example), Silver (2011, para. 3) wrote:
“Experts have a poor understanding of uncertainty. Usually, this manifests itself in the form of overconfidence: experts underestimate the likelihood that their predictions might be wrong.”

Overconfidence can be disastrous for financial investors. In a study of individual investors, two economists found that most people fail to recognize the role that chance plays in the stock market, so they tend to attribute gains to their own expertise in picking stocks and losses to external forces that they could not control. The result is that overconfident investors trade their stocks far too often because they believe that they are making wise choices (Gervais & Odean, 2001). If they could recognize the effects of random fluctuations in the stock market instead of attributing the changes to their own trading behaviors, they would have traded less often and ended up in better financial shape.
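The overtrading argument can be illustrated with a toy Monte Carlo sketch (all parameters hypothetical): on the very same random price path, a frequent trader pays transaction costs without gaining any edge over chance, so the trader necessarily ends up behind a buy-and-hold investor.

```python
import random

def simulate(n_days=250, trades_per_year=50, cost_per_trade=0.002, seed=42):
    """Compare buy-and-hold vs frequent trading on the SAME random price path.
    On a random walk, churning the portfolio adds costs but no edge."""
    rng = random.Random(seed)
    daily_returns = [rng.gauss(0.0003, 0.01) for _ in range(n_days)]

    hold = 1.0   # buy-and-hold portfolio value
    trade = 1.0  # overconfident trader's portfolio value
    trade_every = max(1, n_days // trades_per_year)
    for day, r in enumerate(daily_returns, start=1):
        hold *= 1 + r
        trade *= 1 + r
        if day % trade_every == 0:          # trader churns the portfolio...
            trade *= 1 - cost_per_trade     # ...paying round-trip costs, gaining no edge
    return hold, trade

hold, trade = simulate()
print(f"buy-and-hold: {hold:.3f}, frequent trader: {trade:.3f}")
```

Because both portfolios ride identical returns, the only difference is the accumulated trading cost, so the frequent trader always finishes behind regardless of the random seed.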


TOPIC: #CognitiveBiases
SOURCE: Thought and knowledge: an introduction to critical thinking by Diane F. Halpern
THE ILLUSION OF UNDERSTANDING

Nassim Taleb, in The Black Swan, introduced the notion of a narrative fallacy to describe how flawed stories of the past shape our views of the world and our expectations for the future. Narrative fallacies arise inevitably from our continuous attempt to make sense of the world.
Consider the story of how Google turned into a giant of the technology industry. Two creative graduate students in the computer science department at Stanford University come up with a superior way of searching information on the Internet. They seek and obtain funding to start a company and make a series of decisions that work out well. Within a few years, the company they started is one of the most valuable stocks in America, and the two former graduate students are among the richest people on the planet.

I intentionally told this tale blandly, but you get the idea: there is a very good story here. Fleshed out in more detail, the story could give you the sense that you understand what made Google succeed; it would also make you feel that you have learned a valuable general lesson about what makes businesses succeed. Unfortunately, there is good reason to believe that your sense of understanding and learning from the Google story is largely illusory.

Like watching a skilled rafter avoiding one potential calamity after another as he goes down the rapids, the unfolding of the Google story is thrilling because of the constant risk of disaster. However, there is an instructive difference between the two cases. The skilled rafter has gone down rapids hundreds of times. He has learned to read the roiling water in front of him and to anticipate obstacles. There are fewer opportunities for young men to learn how to create a giant company, and fewer chances to avoid hidden rocks—such as a brilliant innovation by a competing firm. Of course there was a great deal of skill in the Google story, but luck played a more important role in the actual event than it does in the telling of it.

I have heard of too many people who “knew well before it happened that the 2008 financial crisis was inevitable.” This sentence contains a highly objectionable word, which should be removed from our vocabulary in discussions of major events. The word is, of course, knew. Some people thought well in advance that there would be a crisis, but they did not know it. They now say they knew it because the crisis did in fact happen. This is a misuse of an important concept. In everyday language, we apply the word know only when what was known is true and can be shown to be true. We can claim to know something only if it is both true and demonstrable. But the people who thought there would be a crisis (and there are fewer of them than now remember thinking it) could not conclusively show it at the time.

The core of the illusion is that we believe we understand the past, which implies that the future also should be knowable, but in fact we understand the past less than we believe we do. Know is not the only word that fosters this illusion. In common usage, the words intuition and premonition also are reserved for past thoughts that turned out to be true. The statement “I had a premonition that the marriage would not last, but I was wrong” sounds odd, as does any sentence about an intuition that turned out to be false. To think clearly about the future, we need to clean up the language that we use in labeling the beliefs we had in the past.


TOPIC: #CognitiveBiases
SOURCE: Thinking, Fast and Slow by Daniel Kahneman
THE MENTAL SET

A mental set is the tendency to see only those solutions that have worked in the past. This kind of fixed thinking can make it difficult to generate new solutions and can impede the problem-solving process.

For example, let's imagine that your vacuum cleaner has stopped working. When it stopped working in the past, a broken belt was the culprit. Since past experience has taught you that the belt is a common issue, you replace the belt, but this time the vacuum continues to malfunction.

You ask a friend to come take a look at the vacuum, and he discovers that one of the hose attachments was not connected, causing the vacuum to lose suction. Because of your mental set, you failed to notice a fairly obvious solution to the problem.


TOPIC: #CognitiveBiases
SOURCE: verywell.com
FUNCTIONAL FIXEDNESS

Did you read the previous post? If so, you will recognize that functional fixedness is a specific type of mental set: the tendency to see only those solutions that involve using objects in their normal or expected manner.

Imagine that you need to drive a nail into a wall so you can hang a framed photo. Unable to find a hammer, you spend a significant amount of time searching your house to find the missing tool. A friend comes over and suggests using a metal wrench instead to pound the nail into the wall.

Why didn't you think of using the metal wrench? Psychologists suggest that something known as functional fixedness often prevents us from thinking of alternative solutions to problems and different uses for objects.


TOPIC: #CognitiveBiases
SOURCE: verywell.com
The Anchoring Bias
(reading time – 50 sec.)

We tend to be overly influenced by the first piece of information that we hear. For example, the first number voiced during a price negotiation typically becomes the anchoring point from which all further negotiations are based. Researchers have even found that having participants choose a completely random number can influence what people guess when asked unrelated questions, such as how many countries there are in Africa.

This tricky little cognitive bias influences far more than negotiations. Doctors, for example, can become susceptible to the anchoring bias when diagnosing patients. The physician’s first impressions of the patient often create an anchoring point that can sometimes incorrectly influence all subsequent diagnostic assessments.

Topic: #CognitiveBiases
Source: verywell.com
The Actor-Observer Bias
(reading time – 50 sec.)

The way we perceive others and how we attribute their actions hinges on a variety of variables, but it can be heavily influenced by whether we are the actor or the observer in a situation. When it comes to our own actions, we are often far too likely to attribute things to external influences. You might complain that you botched an important meeting because you had jet lag or that you failed an exam because the teacher posed too many trick questions.

When it comes to explaining other people’s actions, however, we are far more likely to attribute their behaviors to internal causes. A colleague screwed up an important presentation because he’s lazy and incompetent (not because he also had jet lag) and a fellow student bombed a test because she lacks diligence and intelligence (and not because she took the same test as you with all those trick questions).


Topic: #CognitiveBiases
Source: verywell.com
The False-Consensus Effect
(reading time – 1 min.)

People have a surprising tendency to overestimate how much other people agree with their own beliefs, behaviors, attitudes, and values, an inclination known as the false consensus effect. This can lead people not only to incorrectly think that everyone else agrees with them – it can sometimes lead them to overvalue their own opinions.

Researchers believe that the false consensus effect happens for a variety of reasons. First, the people we spend the most time with, our family and friends, often do share very similar opinions and beliefs. Because of this, we come to assume that our way of thinking is the majority opinion. Another key reason this cognitive bias trips us up so easily is that believing that other people are just like us is good for our self-esteem. It allows us to feel "normal" and maintain a positive view of ourselves in relation to other people.


Topic: #CognitiveBiases
Source: verywell.com
Anecdotal
(reading time – 1 min.)

You used a personal experience or an isolated example instead of a sound argument or compelling evidence.

It's often much easier for people to believe someone's testimony as opposed to understanding complex data and variation across a continuum. Quantitative scientific measures are almost always more accurate than personal perceptions and experiences, but our inclination is to believe that which is tangible to us, and/or the word of someone we trust over a more 'abstract' statistical reality.

Example: Jason said that that was all cool and everything, but his grandfather smoked, like, 30 cigarettes a day and lived until 97 - so don't believe everything you read about meta-analyses of methodologically sound studies showing proven causal relationships.


Topic: #LogicalFallacy
Source: yourlogicalfallacyis.com
The Self-Serving Bias
(reading time – 35 sec.)

A cognitive bias that distorts your thinking is known as the self-serving bias. Basically, people tend to give themselves credit for successes but lay the blame for failures on outside causes. When you do well on a project, you probably assume that it’s because you worked hard. But when things turn out badly, you are more likely to blame it on circumstances or bad luck. This bias does serve an important role; it helps protect our self-esteem. However, it does often lead to faulty attributions, such as blaming others for our own shortcomings.


Topic: #CognitiveBiases
Source: verywell.com
The Optimism Bias
(reading time – 55 sec.)

Another cognitive bias that has its roots in the availability heuristic is known as the optimism bias. Essentially, we tend to be too optimistic for our own good. We overestimate the likelihood that good things will happen to us while underestimating the probability that negative events will impact our lives. We assume that events like divorce, job loss, illness, and death happen to other people.

So what impact does this sometimes unrealistic optimism really have on our lives? It can lead people to take health risks like smoking, eating poorly, or not wearing a seat belt.

The bad news is that research has found that this optimism bias is incredibly difficult to reduce. There is good news, however. This tendency toward optimism helps create a sense of anticipation for the future, giving people the hope and motivation they need to pursue their goals. So while cognitive biases can distort our thinking and sometimes lead to poor decisions, they are not always so bad.


Topic: #CognitiveBiases
Source: verywell.com
The Misinformation Effect
(reading time – 1 min.)

Our memories of particular events also tend to be heavily influenced by things that happened after the actual event itself, a phenomenon known as the misinformation effect. A person who witnesses a car accident or crime might believe that their recollection is crystal clear, but researchers have found that memory is surprisingly susceptible to even very subtle influences.

In one classic experiment by memory expert Elizabeth Loftus, people who watched a video of a car crash were then asked one of two slightly different questions: “How fast were the cars going when they hit each other?” or “How fast were the cars going when they smashed into each other?”

When the witnesses were then questioned a week later, the researchers discovered that this small change in how questions were presented led participants to recall things that they did not actually witness. When asked whether they had seen any broken glass, those who had been asked the “smashed into” version of the question were more likely to report incorrectly that they had seen broken glass.


Topic: #CognitiveBiases
Source: verywell.com
The Hindsight Bias
(reading time – 40 sec.)

One common cognitive bias involves the tendency to see events, even random ones, as more predictable than they are. In one classic psychology experiment, college students were asked to predict whether they thought then-nominee Clarence Thomas would be confirmed to the U.S. Supreme Court. Prior to the Senate vote, 58% of the students thought Thomas would be confirmed. The students were polled again following Thomas's confirmation, and a whopping 78% of students said they had believed Thomas would be confirmed.

This tendency to look back on events and believe that we “knew it all along” is surprisingly prevalent. Following exams, students often look back on questions and think “Of course! I knew that!” even though they missed them the first time around. Investors look back and believe that they could have predicted which tech companies would become dominant forces.


Topic: #CognitiveBiases
Source: verywell.com
The Availability Heuristic
(reading time – 40 sec.)

After seeing several news reports of car thefts in your neighborhood, you might start to believe that such crimes are more common than they are. This tendency to estimate the probability of something happening based on how many examples readily come to mind is known as the availability heuristic. It is essentially a mental shortcut designed to save us time when we are trying to determine risk.

The problem with relying on this way of thinking is that it often leads to poor estimates and bad decisions. Smokers who have never known someone to die of a smoking-related illness, for example, might underestimate the health risks of smoking.


Topic: #CognitiveBiases
Source: verywell.com
Confirmation Bias
(reading time – 40 sec.)

Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms one's preexisting beliefs or hypotheses. People display this bias when they gather or remember information selectively, or when they interpret it in a biased way (even if two individuals have the same information, the way they interpret it can be biased).

Confirmation bias can lead investors to be overconfident, ignoring evidence that their strategies will lose money. In studies of political stock markets, investors made more profit when they resisted bias. For example, participants who interpreted a candidate's debate performance in a neutral rather than partisan way were more likely to profit. To combat the effect of confirmation bias, investors can try to adopt a contrary viewpoint "for the sake of argument". In one technique, they imagine that their investments have collapsed and ask themselves why this might happen.


Topic: #CognitiveBiases
Source: Wikipedia
Burden of Proof
(reading time – 45 sec.)

You said that the burden of proof lies not with the person making the claim, but with someone else to disprove.

The burden of proof lies with the person making a claim; it is not upon anyone else to disprove it. The inability, or disinclination, to disprove a claim does not render that claim valid, nor give it any credence whatsoever. However, it is important to note that we can never be certain of anything, and so we must assign value to any claim based on the available evidence; to dismiss something on the basis that it hasn't been proven beyond all doubt is also fallacious reasoning.

Example: Bertrand declares that a teapot is, at this very moment, in orbit around the Sun between the Earth and Mars, and that because no one can prove him wrong, his claim is therefore a valid one.


Topic: #LogicalFallacy
Source: yourlogicalfallacyis.com
Appeal to Authority
(reading time – 40 sec.)

You said that because an authority thinks something, it must therefore be true.

It's important to note that this fallacy should not be used to dismiss the claims of experts, or scientific consensus. Appeals to authority are not valid arguments, but neither is it reasonable to disregard the claims of experts who have a demonstrated depth of knowledge unless one has a similar level of understanding and/or access to empirical evidence. However, it is entirely possible that the opinion of a person or institution of authority is wrong; therefore the authority that such a person or institution holds does not have any intrinsic bearing upon whether their claims are true or not.

Example: Not able to defend his position that evolution 'isn't true' Bob says that he knows a scientist who also questions evolution (and presumably isn't a primate).


Topic: #LogicalFallacy
Source: yourlogicalfallacyis.com
Reciprocation
(reading time – 1 min.)

In social psychology, reciprocity is a social rule that says people should repay, in kind, what another person has provided for them; that is, people give back (reciprocate) the kind of treatment they have received from another. By virtue of the rule of reciprocity, people are obligated to repay favors, gifts, invitations, etc. in the future. If someone receives a gift for their birthday, a reciprocal expectation may influence them to do the same on the gift-giver's birthday. This sense of future obligation associated with reciprocity makes it possible to build continuing relationships and exchanges. Reciprocal actions of this nature are important to social psychology as they can help explain the maintenance of social norms.

A person who violates the reciprocity norm by accepting without attempting to return the good acts of others is disliked by the social group. Individuals who benefit from the group's resources without contributing any skills, helping, or resources of their own are called free riders. Both individuals and social groups often punish free riders, even when this punishment results in considerable costs to the group. Therefore, it is unsurprising that individuals will go to great lengths to avoid being seen as a moocher, freeloader, or ingrate.

The rule enforces uninvited debts and can trigger unfair exchanges.


Topic: #Psychology
Source: Influence: The Psychology of Persuasion by Robert B. Cialdini
Embodied Cognition
(reading time – 20 sec.)

Embodied cognition is the idea that the mind is not only connected to the body but that the body influences the mind.

In other words, it reflects the argument that the motor system influences our cognition, just as the mind influences bodily actions.

For example, when somebody holds a pencil in their teeth engaging the muscles of a smile, they comprehend pleasant sentences faster than unpleasant ones, while holding a pencil between their nose and upper lip to engage the muscles of a frown has the reverse effect.


Topic: #CognitiveBiases
Source: Wikipedia
Framing Effect
(reading time – 30 sec.)

The framing effect is a cognitive bias in which we react, without realizing it, to the way information is conveyed to us rather than to the information itself.

Consider the simple example of a pessimist and an optimist looking at a glass of water: "half-full" and "half-empty" are equivalent truths. Yet when the glass is portrayed in a negative frame, you think of it as half-empty; portrayed in a positive frame, you see it as half-full.

Advertisers can use such 'frames' as marketing gimmicks to trick consumers into buying their products.


Topic: #CognitiveBiases
Source: psychologenie.com
Strawman
(reading time – 30 sec.)

You misrepresented someone's argument to make it easier to attack.

By exaggerating, misrepresenting, or just completely fabricating someone's argument, it's much easier to present your own position as being reasonable, but this kind of dishonesty serves to undermine honest rational debate.

Example: After Will said that we should put more money into health and education, Warren responded by saying that he was surprised that Will hates our country so much that he wants to leave it defenceless by cutting military spending.


Topic: #LogicalFallacy
Source: yourlogicalfallacyis.com
Subjective Validation
(reading time – 30 sec.)

Subjective validation is a cognitive bias by which a person will consider a statement or another piece of information to be correct if it has any personal meaning or significance to them.

In other words, a person whose opinion is affected by subjective validation will perceive two unrelated events to be related because their personal belief demands that they be related. Subjective validation is an important element in cold reading. It is considered to be the main reason behind most reports of paranormal phenomena.


Topic: #CognitiveBiases
Source: Wikipedia