relevant articles
🔗Edward Thorp's single most important piece of advice: think for yourself and think critically
Fallacy & Cognitive Bias
critical thinking and reasoning
First you need to be able to think critically, to think clearly, to use reason, logic, common sense, without being misled by your own biases and animal impulses.
You need to be able to systematically follow through your thinking to see if it works under all kinds of conditions, not just a simple case.
your circle of competence
You need to know what you know, and know what you don’t know.
You also need street smarts.
Some of the most genuine communication is hidden below the surface: in how people say things, in what they're NOT saying, and in their body language. What people say and write is often calculated, with an agenda. Having an agenda doesn't necessarily imply a sinister motive.
Unfortunately, they don't formally teach you these skills in school. Either you have good pāramī and instincts that you carry over from your past life, or you're sharp enough to learn from the personal examples around you, from people who do possess sharp critical thinking skills.
Follow the money
A more polite way of expressing this principle is "look for the incentive," but I'm more direct. One of the most important life skills is figuring out whom you can trust: are they telling the truth? Are they reliable? Non-arahants and non-ariya especially, no matter how spiritually developed, are unreliable to some extent, so in the back of your mind you should always stay open to that possibility. And if you're not ariya yourself, you have no real assurance that the person you think is ariya actually is. So treat everyone, no matter how ariya-like they behave, as potentially non-ariya, and capable of deception and agendas. Again, having an agenda or being deceptive doesn't automatically mean an evil motive. It just means they are capable of being blinded by their own biases, and of doing something unethical because they think the ends justify the means.
what they do and don't say
If you learn how to carefully observe people, their actions, their communication, etc., over a long period of time, and understand their motive, their incentive, what they have to gain, it goes a long way toward calibrating your lie-detection system and ascertaining the reliability of what they say.
Everyone, everything is my teacher
If you're biased against those younger than you, older than you, or more conceited than you, and refuse to learn from them based on that bias, it's your loss. It's hard to learn lessons even from the best teacher under the best of circumstances; it would be pretty dumb to limit your opportunities to learn by being too selective about the teacher. If you really value truth and learning, you learn from anyone, any animal, any thing, any situation, like a man dying of thirst licking water from a puddle. If you have good sense, you'll just focus on learning and seeing truth wherever and whenever you can, not waiting for an ideal teacher with the perfect polite demeanor who may not even exist. If Hitler has something insightful or valuable to say, I'm going to absorb it. Truth and the messenger are two separate things. Ignoring the truth because of the messenger is cutting off your nose to spite your face.
Think like a thief
The Thai forest Ajahns (I think this comes from Ajahn Lee) had a real practical approach. They said you have to think like a thief. No one is going to hand you the truth and the perfect teaching on a silver platter. You have to use all your wits to discover it, to steal the art for yourself.
lessons of the instructive dead
That's a phrase I heard from Charlie Munger. It means you learn from the mistakes of others that led to their death or failure. Why learn from the school of hard knocks, through firsthand experience of dangerous and fatal mistakes, when you can study the mistakes of dumb and/or careless people and simply not make those mistakes yourself?
This is similar to his teaching 'invert, always invert', which is reminiscent of MN 117: looking at the 'right' forms of the noble eightfold path in terms of not doing the 'wrong' form of each.
(excerpt from seekingalpha.com)
Inspired by the mathematician Carl Jacobi, he said:
Invert, always invert: Turn a situation or problem upside down. Look at it backward. What happens if all our plans go wrong? Where don't we want to go, and how do you get there? Instead of looking for success, make a list of how to fail instead - through sloth, envy, resentment, self-pity, entitlement, all the mental habits of self-defeat. Avoid these qualities and you will succeed. Tell me where I'm going to die, that is, so I don't go there. - Charlie Munger
Munger says that the best way to achieve success is by avoiding failures. He implies that it is not brilliance that made Berkshire Hathaway succeed. They consistently avoided stupidity.
It is easier to avoid failures than to strive for success directly. Looking at problems and scenarios from a different perspective helps us identify obstacles more clearly.
Step-by-step guide to inversion
1. Figure out what you want to achieve.
2. What do you not want to happen? This is the worst-case scenario.
3. How could the worst-case scenario happen?
4. How can you avoid the worst-case scenario?
⚠️ Fallacy & Cognitive Bias
(A person who lies 🤥 is capable of any evil)
♦ 5. musā-vāda-suttaṃ (KN 4.25)
KN Iti 25: lying-speech-discourse
(cst4)
(derived from thanissaro)

♦ 25. vuttañh-etaṃ bhagavatā,
§25. {This}-was-said (by) the Blessed One,
vuttam-arahatāti me sutaṃ --
said-(by the)-Arahant (so) I have-heard:

♦ “eka-dhammaṃ atītassa, bhikkhave, purisa-puggalassa
“one-thing transgressed, monks, (for a) person,
nāhaṃ tassa kiñci pāpa-kammaṃ a-karaṇīyanti
there is no evil-action (that is) not-to-be-done,
vadāmi.
(this I) say.
katamaṃ eka-dhammaṃ?
Which one thing?
yadidaṃ bhikkhave, sampajāna-musā-vādo”ti.
just-this, monks: deliberate-lie-telling.”
etam-atthaṃ bhagavā avoca.
this-(is the)-meaning (of what) the-blessed-one said.
tatth-etaṃ iti vuccati —
(with regard to)-that, thus (was it) said.

Dictionary result for fallacy
noun: fallacy; plural noun: fallacies
a mistaken belief, especially one based on unsound argument.
"the notion that the camera never lies is a fallacy"
synonyms: misconception, mistaken belief, misbelief, delusion, false notion, mistaken impression, misapprehension, misjudgment, miscalculation, misinterpretation, misconstruction, error, mistake, untruth, inconsistency, illusion, myth, fantasy, deceit, deception, sophism, sophistry, casuistry, faulty reasoning, unsound argument
"the fallacy that we all work from nine to five"
Logic
a failure in reasoning which renders an argument invalid.
faulty reasoning; misleading or unsound argument.
"the potential for fallacy which lies behind the notion of self-esteem"
Definition of Cognitive Bias
A cognitive bias is a mistake in reasoning, evaluating, remembering, or other cognitive process, often occurring as a result of holding onto one's preferences and beliefs regardless of contrary information. Psychologists study cognitive biases as they relate to memory, reasoning, and decision-making.
(Chegg.com)
The most common cognitive biases are confirmation bias, anchoring, the halo effect, and overconfidence.
https://en.wikipedia.org/wiki/List_of_cognitive_biases
Ad hominem
Ad hominem (Latin for "to the person"[1]), short for argumentum ad hominem, is a fallacious argumentative strategy whereby genuine discussion of the topic at hand is avoided by instead attacking the character, motive, or other attribute of the person making the argument, or persons associated with the argument, rather than attacking the substance of the argument itself.
fake ad hominem attack
excerpt from
https://laurencetennant.com/bonds/adhominem.html
Therefore, if you can't demonstrate that your opponent is trying to counter your argument by attacking you, you can't demonstrate that he is resorting to ad hominem.
...
Actual instances of argumentum ad hominem are relatively rare. Ironically, the fallacy is most often committed by those who accuse their opponents of ad hominem, since they try to dismiss the opposition not by engaging with their arguments, but by claiming that they resort to personal attacks. Those who are quick to squeal "ad hominem" are often guilty of several other logical fallacies, including one of the worst of all: the fallacious belief that introducing an impressive-sounding Latin term somehow gives one the decisive edge in an argument.
example 1: Ven. T and Ven. A on Bhikkhuni Ordination
https://notesonthedhamma.blogspot.com/2019/02/fake-ad-hominem-accusation-is-refuge-of.html
👑 Argument from authority
(Wikipedia) An argument from authority (argumentum ab auctoritate), also called an appeal to authority, or argumentum ad verecundiam, is a form of defeasible[1] argument in which a claimed authority's support is used as evidence for an argument's conclusion. It is well known as a fallacy, though it is used in a cogent form when all sides of a discussion agree on the reliability of the authority in the given context.[2][3]
appeal to authority is valid when...
https://www.logicallyfallacious.com/tools/lp/Bo/LogicalFallacies/21/Appeal-to-Authority
Exception: Be very careful not to confuse "deferring to an authority on the issue" with the appeal to authority fallacy. Remember, a fallacy is an error in reasoning. Dismissing the counsel of legitimate experts and authorities turns good skepticism into denialism. The appeal to authority is a fallacy in argumentation, but deferring to an authority is a reliable heuristic that we all use virtually every day on issues of relatively little importance. There is always a chance that any authority can be wrong, that’s why the critical thinker accepts facts provisionally. It is not at all unreasonable (or an error in reasoning) to accept information as provisionally true by credible authorities. Of course, the reasonableness is moderated by the claim being made (i.e., how extraordinary, how important) and the authority (how credible, how relevant to the claim).
The appeal to authority is more about claims that require evidence than about facts. For example, if your tour guide told you that Vatican City was founded February 11, 1929, and you accept that information as true, you are not committing a fallacy (because it is not in the context of argumentation) nor are you being unreasonable.
Tip: Question authority -- or become the authority that people look to for answers.
Argument to moderation
(Wikipedia) Argument to moderation (Latin: argumentum ad temperantiam)—also known as false equivalence, false compromise, [argument from] middle ground, equidistance fallacy, and the golden mean fallacy[1]—is an informal fallacy which asserts that the truth must be found as a compromise between two opposite positions.[2][3] An example of a fallacious use of the argument to moderation would be to regard two opposed arguments—one person saying that the sky is blue, while another claims that the sky is in fact yellow—and conclude that the truth is that the sky is green.[4] While green is the colour created by combining blue and yellow, therefore being a compromise between the two positions—the sky is obviously not green, demonstrating that taking the middle ground of two positions does not always lead to the truth.
Vladimir Bukovsky points out that the middle ground between the Big Lie of Soviet propaganda and the truth is itself a lie, and one should not be looking for a middle ground between disinformation and information.[5] According to him, people from the Western pluralistic civilization are more prone to this fallacy because they are used to resolving problems by making compromises and accepting alternative interpretations—unlike Russians, who are looking for the absolute truth.[5]
Bait-and-switch
(rationalwiki) The bait-and-switch is a logical fallacy that occurs when someone presents a partial, appealing truth while hiding an unappealing falsehood.
It is a fallacious argument style.
Bait-and-switch tactics are used in cults and other authoritarian settings to conceal doctrine from outsiders and new inductees. Since many such groups base themselves on principles that outsiders often find outlandish, such groups often choose to spoon-feed doctrine to inductees rather than letting them study it independently, in hopes that the extreme points of doctrine will be much more acceptable to the new believer after a period of conditioning (read: brainwashing).
The difference between bait-and-switch and telling people the introductory-level concepts of a religion before teaching the more complex concepts of a religion is twofold. In a bait-and-switch:
the more complex parts of the religion are deliberately hidden (as opposed to hidden for practical reasons), and
the more complex parts of the religion are also the parts which are most objectionable, as opposed to merely being more nuanced versions of more basic beliefs.
Would a rat enter a trap if it knew in advance that it was a trap?
Hypnotic Bait and Switch
https://www.logicallyfallacious.com/tools/lp/Bo/LogicalFallacies/105/Hypnotic-Bait-and-Switch
Description: Stating several uncontroversially true statements in succession, followed by a claim that the arguer wants the audience to accept as true. This is a propaganda technique, but also a fallacy when the audience lends more credibility to the last claim because it was preceded by true statements. The negative can also be used in the same way.
This is a classic sales technique often referred to as, “getting the customer used to saying ‘yes’!”
Bandwagon effect
This is the tendency for people to do or think things because other people do or think them.
🛏️ Bed of Procrustes
In Greek mythology, Procrustes (Ancient Greek: Προκρούστης Prokroustes) or "the stretcher [who hammers out the metal]", also known as Prokoptas or Damastes (Δαμαστής, "subduer"), was a rogue smith and bandit from Attica who attacked people by stretching them or cutting off their legs, so as to force them to fit the size of an iron bed.
Begging the question
See circular reasoning. Begging the question is closely related to circular reasoning, and in modern usage the two generally refer to the same thing.
🍒 cherry picking
cher·ry-pick·ing
noun: cherry-picking;
the action or practice of choosing and taking only the most beneficial or profitable items, opportunities, etc., from what is available.
"it is an exaggeration based on the cherry-picking of facts"
The 🍒 cherry picking strategy is frequently abused along with confirmation bias and the narrative fallacy. Basically, one has a preconceived notion, an agenda, or a biased view. For example:
Since stream entry can be attained with first jhāna, first jhāna must be a very difficult practice, and (V&V💭) vitakka & vicāra, directed-thought & evaluation, must be more subtle and fundamentally different from ordinary thinking.
Then, reading through the suttas, MN 128 in this case, we look for evidence to support our agenda (confirmation bias), quote out of context, and quote selectively to support our agenda (🍒 cherry picking). And if the evidence doesn't quite fit, we distort and twist it until it can appear to fit, like the bed of Procrustes.
In the case of MN 128, the four jhānas are not mentioned. And instead of the 5 hindrances, there are 11 upa-kilesas (defilements). The sutta instead works with 3 ways of samādhi, which are in fact another way of mapping out the four jhānas. But an important point needs to be made here. When Bhikkhu Anālayo, Bhikkhu Sujato, and Ajahn Brahm need a samādhi to be the four jhānas, they call it four jhānas. In other contexts, when the samādhi is obviously also of jhāna quality, as it is here, they say, "it's a samādhi that is lower than four-jhāna quality." This is cherry picking, inconsistent application of a samādhi standard, confirmation bias, and intellectual dishonesty. In trying to find evidence in the EBT to support their view of jhāna, they employ these strategies often. And in the long debate between Ṭhānissaro Bhikkhu and Bhikkhu Anālayo on bhikkhuni ordination, Bhikkhu Anālayo does this in abundance. I didn't follow the whole debate and read all of their back-and-forth essays, but I read enough of it (very carefully), and looked at some of the pāḷi text to verify some of their claims, enough to see that Bhikkhu Anālayo uses the same bag of intellectually dishonest strategies there as he does in gathering evidence from the EBT to support his view on jhāna.
Circular reasoning
(wikipedia) (Latin: circulus in probando, "circle in proving";[1] also known as circular logic) is a logical fallacy in which the reasoner begins with what they are trying to end with.[2] The components of a circular argument are often logically valid because if the premises are true, the conclusion must be true. Circular reasoning is not a formal logical fallacy but a pragmatic defect in an argument whereby the premises are just as much in need of proof or evidence as the conclusion, and as a consequence the argument fails to persuade. Other ways to express this are that there is no reason to accept the premises unless one already believes the conclusion, or that the premises provide no independent ground or evidence for the conclusion.
Cognitive dissonance
(Wikipedia) In the field of psychology, cognitive dissonance is the mental discomfort (psychological stress) experienced by a person who holds two or more contradictory beliefs, ideas or values. This discomfort is triggered by a situation in which a person’s belief clashes with new evidence perceived by the person. When confronted with facts that contradict beliefs, ideals and values, people will find a way to resolve the contradiction to reduce their discomfort.[1][2]
In A Theory of Cognitive Dissonance (1957), Leon Festinger proposed that human beings strive for internal psychological consistency to function mentally in the real world. A person who experiences internal inconsistency tends to become psychologically uncomfortable and is motivated to reduce the cognitive dissonance, by making changes to justify the stressful behavior, either by adding new parts to the cognition causing the psychological dissonance or by avoiding circumstances and contradictory information likely to increase the magnitude of the cognitive dissonance.[1]
Conjunction Fallacy
(also known as: conjunction effect)
excerpt from
https://www.logicallyfallacious.com/tools/lp/Bo/LogicalFallacies/72/Conjunction-Fallacy
Description: The assumption that more specific conditions are more probable than general ones. This fallacy usually stems from thinking the choices are alternatives, rather than members of the same set. The fallacy is further exacerbated by priming the audience with information leading them to choose the subset as the more probable option.
Logical Form:
X is a subset of Y.
Therefore, X is more probable than Y.
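A minimal statement of why this form fails, in standard probability notation (the notation is mine, not from the source): a special case can be at most as probable as the general category that contains it, never more.

$$X \subseteq Y \;\Longrightarrow\; P(X) \le P(Y), \qquad \text{so in particular}\quad P(A \wedge B) \le \min\!\big(P(A),\, P(B)\big)$$

Adding a detail to a condition can only shrink (or leave unchanged) the set of outcomes satisfying it, so every added specific lowers or preserves the probability; it never raises it.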
Example #1:
While jogging around the neighborhood, you are more likely to get bitten by someone’s pet dog, than by any member of the canine species.
Explanation: Actually, that is not the case. “Someone’s pet dog”, assuming a real dog and not some robot dog, would also be a member of the canine species. Therefore, the canine species includes wolves, coyotes, as well as your neighbor’s Shih Tzu, who is likely to bite you just because he’s pissed for being so small.
Example #2: Mr. Pipp is a sharp dresser, too good-looking, works as an interior decorator, and loves everything Barbra Streisand. Is Mr. Pipp more likely to be a man or a gay man?
Explanation: It would be fallacious to say that Mr. Pipp is more likely to be a gay man—even if we found out that Mr. Pipp worked nights as a dancer at a drag queen show. There is a 100% chance Mr. Pipp is a man, and a smaller chance that he is a gay man because the group “man” includes all the members of the group “gay man”.
Exception: When contradicting conditions are implied, but incorrectly stated.
In the example above, the way the question reads, we now know that there is a 100% chance Mr. Pipp is a man and a smaller chance that he is a gay man. However, if the questioner meant to imply, “straight man” or “gay man” as the choices, then it could be more of a poorly phrased question than a fallacy.
References:
Kahneman, D. (2013). Thinking, Fast and Slow (1st edition). New York: Farrar, Straus and Giroux.
Equivocation
(Wikipedia) In logic, equivocation ('calling two different things by the same name') is an informal fallacy resulting from the use of a particular word/expression in multiple senses throughout an argument leading to a false conclusion.[1][2] Abbott and Costello's "Who's on first?" routine is a well known example of equivocation.[3][4]
It is a type of ambiguity that stems from a phrase having two distinct meanings, not from the grammar or structure of the sentence.[1]
Some examples of equivocation in syllogisms (a logical chain of reasoning) are below:
Since only man [human] is rational,
and no woman is a man [male],
Therefore, no woman is rational.[1]
A feather is light [not heavy].
What is light [bright] cannot be dark.
Therefore, a feather cannot be dark.
In the above example, distinct meanings of the word "light" are implied in contexts of the first and second statements.
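The break in the chain can be made explicit by indexing the two senses of "light" (a sketch in predicate notation; the subscripts are mine, not from the source):

$$\forall x\,\big(\mathrm{Feather}(x) \to \mathrm{Light}_{1}(x)\big), \qquad \forall x\,\big(\mathrm{Light}_{2}(x) \to \neg\,\mathrm{Dark}(x)\big)$$

Nothing connects $\mathrm{Light}_{1}$ (not heavy) to $\mathrm{Light}_{2}$ (bright), so $\neg\,\mathrm{Dark}(\text{feather})$ does not follow; written this way, the syllogism visibly has four terms rather than the three a valid syllogism requires.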
All jackasses have long ears.
Carl is a jackass.
Therefore, Carl has long ears.
Here, the equivocation is the metaphorical use of "jackass" to imply a simple-minded or obnoxious person instead of a male donkey.
False equivalence
(wikipedia) False equivalence is a logical fallacy in which two completely opposing arguments appear to be logically equivalent when in fact they are not. This fallacy is categorized as a fallacy of inconsistency.[1]
Characteristics
A common way this fallacy is perpetrated: one shared trait between two subjects is assumed to show equivalence, especially in order of magnitude, when equivalence is not necessarily the logical result.[2] False equivalence is a common result when an anecdotal similarity is pointed out as equal, but the claim of equivalence doesn't hold because the similarity is based on oversimplification or ignorance of additional factors. The pattern of the fallacy is often as such: "If A is the set of c and d, and B is the set of d and e, then since they both contain d, A and B are equal." d is not even required to exist in both sets; only a passing similarity is required for this fallacy to be used.
False equivalence arguments are often used in journalism[3][4] and in politics, where the minor flaws of one candidate may be compared to major flaws of another.[5][6]
Examples
The following statements are examples of false equivalence[7]:
They're both living animals that metabolize chemical energy. There's no difference between a pet cat and a pet snail.
(the "equivalence" is in factors that are not relevant to the animals' suitability as pets).
The Deepwater Horizon oil spill is no different from your neighbor dripping some oil on the ground when changing oil in his car.
(the comparison is between things differing by many orders of magnitude: Deepwater Horizon spilled 210 million US gal (790 million l) of oil, your neighbor might spill perhaps a pint (0.5 l).)
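A quick back-of-envelope ratio (my arithmetic, using the figures quoted above) makes the gap concrete:

$$\frac{790{,}000{,}000\ \text{l (Deepwater Horizon)}}{0.5\ \text{l (dripped pint)}} \approx 1.6 \times 10^{9}$$

That is roughly nine orders of magnitude, so the shared trait "spilled some oil" does nothing to establish equivalence between the two events.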
False balance (media term)
(Wikipedia) False balance is a media bias in which journalists present an issue as being more balanced between opposing viewpoints than the evidence warrants. (For the related informal fallacy, see Argument to moderation above; for the fallacy of inconsistency, see False equivalence.)
Examples of false balance in reporting on science issues include the topics of man-made versus natural climate change, the alleged relation between thimerosal and autism[1] and evolution versus intelligent design.[2]
False balance can sometimes originate from similar motives as sensationalism, where producers and editors may feel that a story portrayed as a contentious debate will be more commercially successful than a more accurate account of the issue. However, unlike most other media biases, false balance may stem from an attempt to avoid bias; producers and editors may confuse treating competing views fairly—i.e., in proportion to their actual merits and significance—with treating them equally, giving them equal time to present their views even when those views may be known beforehand to be based on false information.
Examples
Climate change
Main article: Media coverage of climate change
False balance has been cited as a major cause of spreading misinformation.[4] An example of false balance is the debate on global warming: although the scientific community almost unanimously attributes global warming to the effects of the industrial revolution,[5][6][7][8] there is a very small number, a few dozen scientists out of tens of thousands of scientists, who dispute the conclusion.[9][10][11] Giving equal voice to scientists on both sides makes it seem like there is a serious disagreement within the scientific community, when in fact there is an overwhelming scientific consensus that anthropogenic global warming exists.
💸 Follow the money
"Follow the money" is a catchphrase popularized by the 1976 drama-documentary motion picture All the President's Men, which suggests political corruption can be brought to light by examining money transfers between parties.
Look for the incentive, the underlying motivation.
What are they REALLY trying to accomplish?
Are they honest?
Do they have integrity?
good track record?
good credit (behavior/character) history?
Learn to perceive the communication and information that happens apart from the explicit message. What's NOT said, what's implied, etc., often carries much more information about their true motivation.
😇 Halo effect
(Wikipedia) The halo effect is a type of immediate judgement discrepancy, or cognitive bias, where a person making an initial assessment of another person, place, or thing will assume ambiguous information based upon concrete information.[1][2][3] A simplified example of the halo effect is when an individual noticing that the person in the photograph is attractive, well groomed, and properly attired, assumes, using a mental heuristic, that the person in the photograph is a good person based upon the rules of that individual's social concept.[4][5][6] This constant error in judgment is reflective of the individual's preferences, prejudices, ideology, aspirations, and social perception.[3][6][7][8][9] The halo effect is an evaluation by an individual and can affect the perception of a decision, action, idea, business, person, group, entity, or other whenever concrete data is generalized or influences ambiguous information.[10][11][8][12]
The halo effect can also be explained as the behavior (usually unconscious) of using evaluations based on things unrelated, to make judgments about something or someone. The halo effect specifically refers to when this behavior has a positive correlation, such as viewing someone who is attractive as likely to be successful and popular. When this judgement has a negative connotation, such as someone unattractive being more readily blamed for a crime than someone attractive, it is referred to as the horn effect.[13]
example 1: ride coattail of reputable field
https://notesonthedhamma.blogspot.com/2019/02/halo-effect-and-sujatos-article-on.html
It turns out that if you want to convince someone that your explanation for something is the best way to explain it, you might want to tack on some useless (though accurate) information from a tangentially related scientific field.
It turns out that when you tack on additional information from a respected field of study, people think that makes an explanation more credible.
… And while this is a new finding, it's just one of several cognitive biases we have in favor of certain types of explanations. We think longer explanations are better than short ones and we prefer explanations that point to a goal or a reason for things happening, even if these things don't actually help us understand a phenomenon.
Hindsight bias
Hindsight bias is a term used in psychology to explain the tendency of people to overestimate their ability to have predicted an outcome that could not possibly have been predicted.
(wikipedia) Hindsight bias, also known as the knew-it-all-along phenomenon[1] or creeping determinism,[2] refers to the common tendency for people to perceive events that have already occurred as having been more predictable than they actually were before the events took place.[3][4] As a result, people often believe, after an event has occurred, that they would have predicted, or perhaps even would have known with a high degree of certainty, what the outcome of the event would have been, before the event occurred. Hindsight bias may cause distortions of our memories of what we knew and/or believed before an event occurred, and is a significant source of overconfidence regarding our ability to predict the outcomes of future events.[5] Examples of hindsight bias can be seen in the writings of historians describing outcomes of battles, physicians recalling clinical trials, and in judicial systems as individuals attribute responsibility on the basis of the supposed predictability of accidents.[6][7][2]
♾️🦆 Infinite Duck Dilemma
If it looks like a duck, quacks like a duck, waddles like a duck, it's jhāna!
Unless you're Bhikkhu Anālayo or Bhikkhu Sujato, and you need the evidence to say something else. When you need samādhi to be four jhānas, then you call it four jhānas. If you need to show the samādhi is less than four jhānas, then it's just a bird species that looks incredibly similar to a duck, quacks like a duck, etc., but is not actually a duck.
So what happens when you cherry-pick some ducks and call them ducks, and label all the other ducks not-ducks? You have an Infinite Duck Dilemma. And you've destroyed the integrity and coherence of the suttas in the EBT, making it impossible to interpret and understand the meaning of the EBT if that arbitrary, cherry-picking way of interpreting texts were applied consistently across the rest of the teaching.
🤥 Intellectual dishonesty
(from wikipedia)
Intellectual honesty is an applied method of problem solving, characterized by an unbiased, honest attitude, which can be demonstrated in a number of different ways:
One's personal faith does not interfere with the pursuit of truth;
Relevant facts and information are not purposefully omitted even when such things may contradict one's hypothesis;
Facts are presented in an unbiased manner, and not twisted to give misleading impressions or to support one view over another;
References, or earlier work, are acknowledged where possible, and plagiarism is avoided.
Harvard ethicist Louis M. Guenin describes the "kernel" of intellectual honesty to be "a virtuous disposition to eschew deception when given an incentive for deception".[1]
Intentionally committed fallacies in debates and reasoning are called intellectual dishonesty.
narrative fallacy
One of the limits to our ability to evaluate information objectively is what's called the narrative fallacy. In other words, we love stories, and we let our preference for a good story cloud facts and our ability to make rational decisions.
often used to create fallacious cause and effect
Evaluating events in hindsight, we attribute cause-and-effect relationships where none are justified. Because humans like to reframe the past and search for insight through a tidy story (a 'narrative'), we often fool ourselves with this device.
🔪 Occam's razor
Occam's razor (also Ockham's razor or Ocham's razor; Latin: novacula Occami), further known as the law of parsimony (Latin: lex parsimoniae), is the problem-solving principle that simpler solutions are more likely to be correct than complex ones.
When simpler and more complex solutions have equal explanatory power and causal predictability, there's no need to introduce more complex variables. That is to say, when everything is equal, go for the simpler explanation--but the caveat is that everything has to be equal.
Why is Occam's razor called Occam's Razor?
Occam's “Razor” is the stated principle that, all things being equal, the simplest explanation is usually the correct one. This principle cuts away, or slices and leaves aside, a host of potentially competing conclusions or arguments, leaving the simplest and most likely conclusion in place. Thus it is a “Razor”.
In science,
Occam's razor is used as an abductive heuristic in the development of theoretical models, rather than as a rigorous arbiter between candidate models. In the scientific method, Occam's razor is not considered an irrefutable principle of logic or a scientific result; the preference for simplicity in the scientific method is based on the falsifiability criterion. For each accepted explanation of a phenomenon, there may be an extremely large, perhaps even incomprehensible, number of possible and more complex alternatives. Since one can always burden failing explanations with ad hoc hypotheses to prevent them from being falsified, simpler theories are preferable to more complex ones because they are more testable.
Proof by assertion
From Wikipedia, Proof by assertion, sometimes informally referred to as proof by repeated assertion, is an informal fallacy in which a proposition is repeatedly restated regardless of contradiction.[1] Sometimes, this may be repeated until challenges dry up, at which point it is asserted as fact due to its not being contradicted (argumentum ad nauseam).[2] In other cases, its repetition may be cited as evidence of its truth, in a variant of the appeal to authority or appeal to belief fallacies.[3]
This fallacy is sometimes used as a form of rhetoric by politicians, or during a debate as a filibuster. In its extreme form, it can also be a form of brainwashing.[1] Modern politics contains many examples of proofs by assertion. This practice can be observed in the use of political slogans, and the distribution of "talking points", which are collections of short phrases that are issued to members of modern political parties for recitation to achieve maximum message repetition. The technique is also sometimes used in advertising.[4]
argument by assertion fallacy:
(rational wiki)
“A lie told often enough becomes the truth.”
—Lenin
“If you repeat a lie often enough, it becomes the truth.”
—Goebbels
“… I have said it thrice:
What I tell you three times is true.”
—The Bellman, from The Hunting of the Snark
Argument by assertion is the logical fallacy where someone tries to argue a point by merely asserting that it is true, regardless of contradiction. While this may seem stupid, it's actually an easy trap to fall into and is quite common.
This is not the same as establishing initial axioms on which to build a framework of logic or ideas.
🔴🐟 Red herring
(From Wikipedia)
A red herring is something that misleads or distracts from a relevant or important issue.[1] It may be either a logical fallacy or a literary device that leads readers or audiences towards a false conclusion. A red herring might be intentionally used, such as in mystery fiction or as part of rhetorical strategies (e.g., in politics), or it could be inadvertently used during argumentation.
The term was popularized in 1807 by English polemicist William Cobbett, who told a story of having used a kipper (a strong-smelling smoked fish) to divert hounds from chasing a hare.
🎲 skin in the game
Skin in the game (phrase)
(Wikipedia) To have "skin in the game" is to have incurred risk (monetary or otherwise) by being involved in achieving a goal.
In the phrase, "skin" is a synecdoche for the person involved, and "game" is the metaphor for actions on the field of play under discussion.[1] The aphorism is particularly common in business, finance, and gambling, and is also used in politics.[1]
The origin of the phrase is unknown.[1]
It has commonly been attributed to Warren Buffett, referring to his own investment in his initial fund.[2] However, William Safire disputes that Buffett is the source of the phrase, pointing to earlier instances.[3]
Another possible explanation is that the phrase draws its origins from William Shakespeare's play The Merchant of Venice, in which the antagonist Shylock stipulates that the protagonist Antonio must promise a pound of his own flesh as collateral, to be exacted by Shylock in the event that Antonio's friend Bassanio defaults on the loan to which Antonio is guarantor.
spotlight fallacy
https://rationalwiki.org/wiki/Spotlight_fallacy
The spotlight fallacy (not to be confused with the spotlight effect) is a logical fallacy that occurs when highly publicized data on a group is incorrectly assumed to represent a different or larger group.
The fallacy is an overgeneralization and an informal fallacy.
straw man
noun: straw man; plural noun: straw men; noun: strawman; plural noun: strawmen
1. an intentionally misrepresented proposition that is set up because it is easier to defeat than an opponent's real argument.
"her familiar procedure of creating a straw man by exaggerating their approach"
2. a person regarded as having no substance or integrity.
"a photogenic straw man gets inserted into office and advisers dictate policy"
stealth strawman
A straw man doesn't necessarily have to be obvious and weak looking.
In the hands of a disingenuous and skilled person, the straw man can be incognito and compelling. You have to really understand the context, pay careful attention, and think things through to detect the skilled handiwork of a well-crafted stealth straw man.
example 1:
https://notesonthedhamma.blogspot.com/2019/02/mn-19-bhikkhu-sujato-and-stealth-straw.html
Survivorship bias
(wikipedia) Survivorship bias or survival bias is the logical error of concentrating on the people or things that made it past some selection process and overlooking those that did not, typically because of their lack of visibility. This can lead to false conclusions in several different ways. It is a form of selection bias.
example:
https://notesonthedhamma.blogspot.com/2019/02/mn-117-understanding-survivorship-bias.html
Misc.
good resources:
https://en.wikipedia.org/wiki/List_of_fallacies
https://yourlogicalfallacyis.com/
http://utminers.utep.edu/omwilliamson/ENGL1311/fallacies.htm
https://www.logicallyfallacious.com/tools/lp/Bo/LogicalFallacies