Logic and
Language


Copyright James R Meyer 2012-2024, https://www.jamesrmeyer.com

A List Of Fallacious Arguments

This is a list of fallacious arguments originally published on the Internet by Don Lindsay; the original list no longer seems to be available anywhere else online.

  • Ad Hominem (Argument To The Man):

    Attacking the person instead of attacking his argument. For example: “Von Daniken’s books about ancient astronauts are worthless because he is a convicted forger and embezzler.” Which is true, but that’s not why they’re worthless. Another example is this syllogism, which alludes to Alan Turing’s homosexuality:

    “Turing thinks machines think. Turing lies with men. Therefore, machines don’t think.”

    Also note here the equivocation in the use of the word “lies”.

     

    A common form is an attack on sincerity, for example: “How can you argue for vegetarianism when you wear leather shoes?” This is related to the Two Wrongs Make A Right fallacy.

     

    A variation (related to Argument By Generalization) is to attack a whole class of people. For example: “Evolutionary biology is a sinister tool of the materialistic, atheistic religion of Secular Humanism.”

     

    Another variation is attack by innuendo: “Why don’t scientists tell us what they really know; are they afraid of public panic?”

     

    There may be a pretense that the attack isn’t happening: “In order to maintain a civil debate, I will not mention my opponent’s drinking problem.” Or “I don’t care if other people say you’re opinionated/boring/overbearing.”

     

    Attacks don’t have to be strong or direct. The attacker can merely show disrespect, or try to cut down the opponent’s stature by saying that they seem to be sweating a lot, or that they have forgotten what they said last week. Some examples:

    “I used to think that way when I was your age.”

    “You’re new here, aren’t you?”

    “You weren’t breast fed as a child, were you?”

    “What drives you to make such a statement?”

    “If you’d just listen…”

    “You seem very emotional.”

     

    Sometimes the attack is on the other person’s intelligence. For example: “If you weren’t so stupid you would have no problem seeing my point of view.” Or: “Even you should understand my next point.”

     

    And sometimes the stupidity attack can be reversed. For example, dismissing a comment with: “Well, you’re just smarter than the rest of us.” or: “It’s clear that you’re too clever by half”, implying to the audience that the person couldn’t be so smart. It is related to Not Invented Here and Changing The Subject.

     

    But Ad Hominem is not fallacious if the attack goes to the credibility of the argument. For instance, the argument may depend on its presenter’s claim that he’s an expert. In that case a counter by Ad Hominem undermines an Argument From Authority. Trial judges allow this category of attacks.

     

  • Needling:

    This is attempting to make the other person angry, without trying to address the argument at hand. Sometimes this is a delaying tactic.

     

    Needling is also Ad Hominem if it also insults the opponent. The attacker may instead insult something the other person believes in, interrupt, clown to show disrespect, be noisy, fail to pass over the microphone, and numerous other tricks. All of these work better if the attacker is running things - for example, if it is their radio show, they can cut off the other person’s microphone. Or, if the host or moderator is on the attacker’s side, that is almost as good as the attacker running the show themselves. It can work even better if the debate is videotaped, and the attacker is the person who will edit the video.

     

    If the attacker winks at the audience, or in general clowns in their direction, then that tends towards an Argument By Personal Charm.

     

    Usually, the best way to cope with insults is to show mild amusement, and remain polite. A humorous comeback will probably work better than an angry one.

     

  • Straw Man (Fallacy Of Extension):

    Attacking an exaggerated or caricatured version of the opponent’s position. For example, the claim that “Evolution means a dog giving birth to a cat.”

     

    Another example: “Senator Jones says that we should not fund the attack submarine program. I disagree entirely. I can’t understand why he wants to leave us defenseless like that.”

     

    On the Internet, it is common to exaggerate the opponent’s position so that a comparison can be made between the opponent and Hitler.

     

  • Inflation Of Conflict:

    Arguing that because scholars debate a certain point, they must know nothing, and their entire field of knowledge is “in crisis” or does not properly exist at all. For example, two historians debated whether Hitler killed five million Jews or six million Jews. A Holocaust denier argued that this disagreement made his claim credible, even though his death count is three to ten times smaller than the known minimum.

     

    Similarly, in the book “The Mythology of Modern Dating Methods” (John Woodmorappe, 1999), one page states that two scientists “cannot agree” about which of two geological dates is “real” and which is “spurious” - but it fails to mention that the two dates differ by less than one percent.

     

  • Argument From Adverse Consequences (Appeal To Fear, Scare Tactics):

    Saying an opponent must be wrong, because if he is right, then bad things would ensue. For example, God must exist, because a godless society would be lawless and dangerous. Or the defendant in a murder trial must be found guilty, because otherwise husbands will be encouraged to murder their wives.

     

    Closely related is wishful thinking, for example: “My home in Florida is one foot above sea level. Therefore I am certain that global warming will not make the oceans rise by fifteen feet.” Of course, wishful thinking can also be about positive consequences, such as winning the lottery, or eliminating poverty and crime.

     

  • Special Pleading (Stacking The Deck):

    Using the arguments that support a position, but ignoring or somehow disallowing the arguments against it.

    For example, Uri Geller used special pleading when he claimed that the presence of unbelievers (such as stage magicians) made him unable to demonstrate his psychic powers.

     

  • Excluded Middle (False Dichotomy, Faulty Dilemma, Bifurcation):

    Assuming there are only two alternatives when in fact there are more. For example, assuming Atheism is the only alternative to Fundamentalism, or being a traitor is the only alternative to being a patriot.

     

  • Short Term Versus Long Term:

    This is a particular case of the Excluded Middle. For example: “We must deal with crime on the streets before improving the schools.” But why can’t we do some of both? Similarly, “We should take away the scientific research budget and use that money to feed starving children.”

     

  • Burden Of Proof:

    The claim that whatever has not yet been proved false must be true (or vice versa). Essentially the attacker claims that he should win by default if his opponent can’t make a strong enough case.

     

    This is a Non Sequitur - simply because something has not been proved false at a given moment in time, that does not imply that it must be true - absence of evidence is not evidence of absence.

     

  • Argument By Question:

    Asking the opponent a question which does not have a snappy answer. (Or anyway, no such answer that the audience has the background to understand.) The opponent has a choice: he can look weak or he can look long-winded. For example: “How can scientists expect us to believe that anything as complex as a single living cell could have arisen as a result of random natural processes?”

     

    Actually, pretty well any question has this effect to some extent. It usually takes longer to answer a question than to ask it. Variants are the rhetorical question and the loaded question, such as: “Have you stopped beating your wife yet?”

     

  • Argument by Rhetorical Question:

    Asking a question in a way that leads to a particular answer. For example: “When are we going to give the old folks of this country the pension they deserve?” The speaker is leading the audience to the answer “Right now.”

     

    Alternatively, he could have said: “When will we be able to afford a major increase in old age pensions?” In that case, the answer he is aiming at is almost certainly not: “Right now.”

     

  • Fallacy Of The General Rule:

    Assuming that something true in general is true in every possible case. For example: “All chairs have four legs.” Except that rocking chairs don’t have any legs, and what is a one-legged “shooting stick” if it isn’t a chair?

     

    Similarly, there are times when certain laws should be broken. For example, ambulances are allowed to break speed laws.

     

  • Reductive Fallacy (Oversimplification):

    Over-simplifying to conceal unwanted details. As Einstein said, everything should be made as simple as possible, but no simpler. Political slogans such as “Taxation is theft” fall into this category.

     

  • Genetic Fallacy (Fallacy of Origins, Fallacy of Virtue):

    The claim that because an argument (or the person making it) has some particular origin, the argument must be right (or wrong). The idea is that things from that origin, or that social class, have virtue or lack virtue - being poor or being rich may be held out as virtuous. The plea is therefore that the actual details of the argument can be overlooked, since its correctness can be decided without any need to listen or think.

     

  • Psychogenetic Fallacy:

    Arguing that, since you know the psychological reason why the opponent likes an argument, then he’s biased, so his argument must be wrong.

     

  • Argument Of The Beard:

    An argument that two ends of a spectrum are the same, since one can travel along the spectrum in very small steps, and no single step can be said to be the dividing step between the two ends. The name comes from the idea that being clean-shaven must be the same as having a big beard, since in-between beards exist. This is the same argument as claiming that all piles of stones are small, since if you add just one stone to a small pile of stones it remains small (Wang’s paradox).

     

  • Argument From Age (Wisdom of the Ancients):

    The argument that very old (or very young) arguments are superior. This is a variation of the Genetic Fallacy, but has the psychological appeal of seniority and tradition (or innovation).

     

    Products labelled “New! Improved!” are appealing to a belief that innovation is of value for such products. It’s sometimes true, but not always. And then there can be an appeal to the past, such as: “Old Fashioned Baked Beans”.

     

  • Not Invented Here:

    Arguing that ideas from elsewhere are unwelcome. “This Is The Way We’ve Always Done It.”

     

    This fallacy is a variant of the Argument From Age. It gets a psychological boost from feelings that local ways are superior, or that local identity is worth any cost, or that innovations will upset matters. An example of this is the common assertion that America has “the best health care system in the world”, an idea challenged by a 2007 New York Times editorial.

     

    People who use the Not Invented Here argument may be accused of being stick-in-the-muds. Conversely, foreign and “imported” things may be held out as superior.

     

  • Argument By Dismissal:

    An idea is rejected without saying why. Dismissals usually have overtones. For example: “If you don’t like it, leave the country” implies that the cause is hopeless, or that the person is unpatriotic, or that their ideas are foreign, or maybe all three. “If you don’t like it, go live in a Communist country” adds an emotive element.

     

  • Argument To The Future:

    Arguing that evidence will someday be discovered which will (then) support that point.

     

  • Poisoning The Wells:

    Discrediting the sources used by the opponent without providing any definitive evidence that diminishes the value of the sources. This can be a variation of Ad Hominem.

     

  • Argument By Emotive Language (Appeal To The People):

    Using emotionally loaded words to sway the audience’s emotions instead of their minds. Many emotions can be useful: anger, spite, envy, condescension, and so on. For example, an argument by condescension: “Support the ERA? Sure, when the women start paying for the drinks! Hah! Hah!”

     

    Americans who don’t like the Canadian medical system have referred to it as “socialist”, but is this intended to mean “foreign”, or “expensive”, or simply guilty by association?

     

    Cliche Thinking and Argument By Slogan can be used as adjuncts, particularly if the audience can be persuaded to chant the slogan. People who rely on this argument may seed the audience with supporters or “shills”, who laugh, applaud or chant at proper moments. This is the live-audience equivalent of adding a laugh track or music track. Now that many venues have video equipment, some speakers give part of their speech by playing a prepared video. These videos are an opportunity to show a supportive audience, use emotional music, show emotionally charged images, and the like. The idea is old; there used to be professional cheering sections; Monsieur Zig-Zag, pictured on the cigarette rolling papers, acquired his fame by applauding for money at the Paris Opera.

     

    If the emotion in question isn’t harsh, Argument By Poetic Language helps the effect. Flattering the audience doesn’t hurt either.

     

  • Argument By Personal Charm:

    Getting the audience to cut you some slack. An example was Ronald Reagan. And it helps to have an opponent with much less personal charm. Charm may create trust, or the desire to “join the winning team”, or the desire to please the speaker. This last can have the greatest effect if the audience feels sex appeal.

     

    Reportedly George W. Bush lost a debate when he was young, and said later that he would never be “out-bubba’d” again.

     

  • Appeal To Pity (Appeal to Sympathy, The Galileo Argument):

    “I did not murder my mother and father with an axe! Please don’t find me guilty; I’m suffering enough through being an orphan.”

     

    Some authors want the audience to know they’re suffering for their beliefs. For example: “Scientists scoffed at Copernicus and Galileo; they laughed at Edison, Tesla and Marconi; they won’t give my ideas a fair hearing either. But time will be the judge.”

     

    There is a strange variant which shows up on the internet: a person might refuse to answer questions about their arguments, on the grounds that the asker is mean, or has hurt their feelings, or that the question is too personal.

     

  • Appeal To Force:

    Threats, or even violence. On the Net, the usual threat is of a lawsuit. The traditional religious threat is that one will burn in Hell. However, history is full of instances where expressing an unpopular idea could get you beaten up on the spot, or worse.

    “The clinching proof of my reasoning is that I will cut anyone who argues further into dog-meat.”

    attributed to Sir Geoffery de Tourneville, ca 1350 A.D.

     

  • Argument By Vehemence:

    Being loud. Trial lawyers are taught this rule:

    If you have the facts, pound on the facts.

    If you have the law, pound on the law.

    If you don’t have either, pound on the table.

    The above rule paints vehemence as an act of desperation. But it can also be a way to seize control of the agenda, use up the opponent’s time, or just intimidate the easily cowed. And it’s not necessarily aimed at winning the day. A tantrum or a fit is also a way to get a reputation, so that in the future, people might not want to tackle you.

     

    This is related to putting a post on the internet in UPPERCASE, aka SHOUTING in text.

     

    Depending on what you’re loud about, this may also be an Appeal To Force, Argument By Emotive Language, Needling, or Changing The Subject.

     

  • Begging The Question (Assuming The Answer, Tautology):

    Reasoning in a circle. The thing to be proved is actually one of the assumptions, or is effectively contained within an initial assumption. For example: “We must have a death penalty to discourage violent crime”, which simply assumes that the death penalty does discourage crime. Or: “The stock market fell because of a technical adjustment” - but is an “adjustment” just a stock market fall?

     

  • Stolen Concept:

    Using what one is trying to disprove. That is, requiring the truth of something for your proof that it is false. For example, using science to show that science is wrong. Or, arguing that you do not exist, when your existence is clearly required for you to be making the argument.

     

    This is a relative of Begging The Question, except that the circularity there is in what one is trying to prove, instead of what one is trying to disprove. It is also a relative of Reductio Ad Absurdum, where one temporarily assumes the truth of something.

     

  • Argument From Authority:

    The claim that the speaker is an expert, and so should be trusted. But there are degrees and areas of expertise. The speaker is actually claiming to be more expert, in the relevant subject area, than anyone else in the room. There is also an implied claim that expertise in the area is worth having. For example, claiming expertise in something hopelessly quack (like iridology) is actually an admission that the speaker is gullible.

     

  • Argument From False Authority:

    A strange variation on Argument From Authority. For example, the TV commercial which starts “I’m not a doctor, but I play one on TV.” Just what are we supposed to conclude from this?

     

  • Appeal To Anonymous Authority:

    An Appeal To Authority is made, but the authority is not named. For example: “Experts agree that…”, or: “Scientists say …” or even: “They say …”. This makes the information impossible to verify, and brings up the very real possibility that the arguer himself doesn’t know who the experts are. This can just be a case of spreading a rumor.

     

  • Appeal To Authority:

    This is often used to try to sway an argument without actually delving into the details. But authorities can be wrong. And if one were to always accept unquestioningly what current authorities claim, then in many cases there could never be any advancement from that position of authority, even if it were wrong.

     

    On the other hand, introducing the claim of an authority is acceptable as an introduction when what the authority claims is then backed up by reasoned argument.

     

    A variation is to appeal to unnamed authorities.

     

  • Appeal To False Authority:

    A variation on Appeal To Authority, but the authority is outside his area of expertise. For example: “Famous physicist John Taylor studied Uri Geller extensively and found no evidence of trickery or fraud in his feats.” But Taylor was not qualified to detect trickery or fraud of the kind used by stage magicians, and Taylor later admitted Geller had tricked him.

     

    A variation is to appeal to a non-existent authority. For example, someone reading an article by the creationist Dmitri Kuznetsov tried to look up articles referenced by him, and some of the references turned out to be to non-existent journals.

     

    Another variation is to misquote a real authority; there are several kinds of such misquotation. A quote can be inexact, edited, or taken out of context. For example, Chevy Chase: “Yes, I said that, but I was singing a song written by someone else at the time.” Another example is that it’s easy to “prove” that Mick Jagger is an assassin. In “Sympathy For The Devil” he sang: “I shouted out, who killed the Kennedys, When after all, it was … me…”

     

    Other variations can be the “gluing together” of separate quotes, or parts of a quote can be excluded to give a different meaning to the original.

     

  • Statement Of Conversion:

    The speaker says “I used to believe in X.”

     

    This is simply a weak form of asserting expertise. The speaker is implying that he has learned about the subject, and now that he is better informed, he has rejected X. So perhaps he is now an authority, and so this is an implied Argument From Authority.

     

    Another version of this is “I used to think that way when I was your age.” The speaker hasn’t said what is wrong with the argument; he is merely claiming that his age has made him an expert. But “X” has not actually been countered unless there is agreement that the speaker has that expertise.

     

    In general, any bald claim always has to be buttressed. For example, there are a number of Creationist authors who say they “used to be evolutionists”, but this does not imply that they must therefore have some expertise in evolution.

     

  • Bad Analogy:

    Claiming that two situations are highly similar, when they aren’t. For example: “The solar system reminds me of an atom, with planets orbiting the sun like electrons orbiting the nucleus. We know that electrons can jump from orbit to orbit; so we must look to ancient records for sightings of planets jumping from orbit to orbit also.”

     

    Or: “Minds, like rivers, can be broad. The broader the river, the shallower it is. Therefore, the broader the mind, the shallower it is.”

     

    Or: “We have pure food and drug laws; why can’t we have laws to keep movie-makers from giving us filth?”

     

  • Extended Analogy:

    The claim that two things, both analogous to a third thing, are therefore analogous to each other. For example, this debate:

    “I believe it is always wrong to oppose the law by breaking it.”

    “Such a position is odious: it implies that you would not have supported Martin Luther King.”

    “Are you saying that cryptography legislation is as important as the struggle for Black liberation? How dare you!”

     

    A person who advocates a particular position (For example, about gun control) may be told that Hitler believed the same thing. The clear implication is that the position is somehow tainted. But Hitler also believed that window drapes should go all the way to the floor. Does that mean people with such drapes are monsters?

     

  • Argument From Spurious Similarity:

    This is a relative of Bad Analogy. It is suggested that some resemblance is proof of a relationship. There is a World War 2 story, probably apocryphal, about a British lady who was trained in spotting German airplanes. She made a report about a certain very important type of plane. While being quizzed, she explained that she hadn’t been sure until she noticed that it had a little man in the cockpit, just like the little model airplane at the training class.

     

  • Reifying:

    An abstract thing is talked about as if it were concrete. Possibly a Bad Analogy is being made between concept and reality. For example: “Nature abhors a vacuum.”

     

  • False Cause:

    Assuming that because two things happened, the first one caused the second one. But sequence is not causation. For example: “Before women got the vote, there were no nuclear weapons.” Or: “Every time my brother Bill accompanies me to Fenway Park, the Red Sox are sure to lose.”

     

    Essentially, these are arguments that the sun goes down because we’ve turned on the street lights.

     

  • Confusing Correlation And Causation:

    Earthquakes in the Andes were correlated with the closest approaches of the planet Uranus. Therefore, Uranus must have caused them. But Jupiter is nearer than Uranus, and more massive too.

     

    When sales of hot chocolate go up, street crime drops. Does this correlation mean that hot chocolate prevents crime? No, it means that fewer people are on the streets when the weather is cold.

     

    The bigger a child’s shoe size, the better the child’s handwriting. Does having big feet make it easier to write? No, it means the child is older.
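    The hidden common cause in these examples can be sketched numerically. Below is a minimal simulation (all numbers are invented for illustration) in which a third variable, age, drives both shoe size and handwriting score, producing a strong correlation even though neither causes the other:

```python
import random

random.seed(1)

# Hypothetical data: age (the hidden common cause) drives both
# shoe size and handwriting score; neither causes the other.
ages = [random.uniform(5, 15) for _ in range(1000)]
shoe_size = [0.8 * a + random.gauss(0, 1) for a in ages]
handwriting = [5.0 * a + random.gauss(0, 10) for a in ages]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# A strong correlation appears, despite the absence of any causal link
# between shoe size and handwriting.
print(pearson(shoe_size, handwriting))
```

    Controlling for age (comparing children of the same age) would make the correlation largely disappear, which is how confounders of this kind are detected.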

     

  • Causal Reductionism (Complex Cause):

    Trying to use one cause to explain something, when in fact it had several causes. For example: “The accident was caused by the taxi parking in the street.” But other drivers went around the taxi. Only the drunk driver hit the taxi.

     

  • Cliche Thinking:

    Using as evidence a well-known wise saying, as if that is proven, or as if it has no exceptions.

     

  • Exception That Proves The Rule:

    A specific example of Cliche Thinking. This is used when a rule has been asserted, and someone points out the rule doesn’t always work.

     

    The cliche rebuttal is that this is “the exception that proves the rule.” Many people think that this cliche somehow allows a person to ignore the exception, and continue using the rule. In fact, the cliche originally did no such thing. There are two standard explanations for the original meaning.

     

    The first is that in the expression the word “proves” meant “tests”. That is why the military takes its equipment to a Proving Ground to test it. So, the cliche originally said that an exception tests a rule. That is, if you find an exception to a rule, the cliche is saying that the rule is being tested, and perhaps the rule will need to be discarded.

     

    The second explanation is that the stating of an exception to a rule proves that the rule exists. For example, suppose it was announced that, “Over the holiday weekend, students do not need to be in the dorms by midnight.” This announcement implies that normally students do have to be in by midnight.

     

    In either case, the cliche is not about waving away objections.

     

  • Appeal To Widespread Belief (Bandwagon Argument, Peer Pressure, Appeal to Common Practice):

    The claim, as evidence for an idea, that many people believe it, or used to believe it, or do it. If the discussion is about social conventions, such as “good manners”, then this is a reasonable line of argument.

     

    However, in the 1800s there was a widespread belief that blood-letting cured sickness. All of these people were not just wrong, but horribly wrong, because in fact it made people sicker. Clearly, the popularity of an idea is no guarantee that it’s right.

     

    Similarly, a common justification is “Everybody does it”, for example in relation to bribery, and in the past this was a justification for slavery.

     

  • Fallacy Of Composition:

    Assuming that a whole has the same simplicity as its constituent parts. But in fact, a great deal of science is the study of emergent properties. For example, if you put a drop of oil on water, there are interesting optical effects. But the effect comes from the oil/water system: it does not come just from the oil or just from the water.

     

    Another example is: “A car makes less pollution than a bus. Therefore, cars are less of a pollution problem than buses.” Or: “Atoms are colorless. Cats are made of atoms, so cats are colorless.”

     

  • Fallacy Of Division:

    Assuming that what is true of the whole is true of each constituent part. For example, human beings are made of atoms, and human beings are conscious, so atoms must be conscious.

     

  • Complex Question (Tying):

    Unrelated points are treated as if they should be accepted or rejected together. In fact, each point should be accepted or rejected on its own merits. For example: “Do you support freedom and the right to bear arms?”

     

  • Slippery Slope Fallacy (Camel’s Nose)

    The argument that something must be wrong because it could slide towards something that is wrong, or simply because it appears to be closely associated with something that is wrong. This is fallacious when it merely assumes, without justification, that the slide will actually happen.

     

    For example: “Allowing abortion in the first week of pregnancy would lead to allowing it in the ninth month.” Or: “If we legalize marijuana, then more people will try heroin.” Or: “If I make an exception for you then I’ll have to make an exception for everyone.”

     

    But there can be a link between two things that appear to be associated such that one can be the cause of the other, in which case, the claim of a slippery slope is not fallacious - but the reason why one causes the other should be described rather than relying solely on the “slippery slope”.

     

  • Argument By Pigheadedness (Doggedness):

    Refusing to accept something even though the evidence against it is overwhelming. For example, there are still Flat Earthers.

     

  • Appeal To Coincidence:

    Asserting that some fact is due to chance. For example, a person has had a dozen traffic accidents in six months, yet he insists they weren’t his fault. This may be Argument By Pigheadedness.

     

    But on the other hand, coincidences do happen, so this argument is not always fallacious.

     

  • Argument By Repetition (Argument Ad Nauseam):

    If someone says something often enough, some people will begin to believe it. There are some internet screwballs who keep reposting the same articles, presumably in hopes that repetition will have that effect.

     

  • Argument By Half Truth (Suppressed Evidence):

    This is hard to detect, of course. You have to investigate. For example, an amazingly accurate “prophecy” of the assassination attempt on President Reagan was shown on TV. But was the tape recorded before or after the event? Many stations did not ask this question. (In fact it was recorded afterwards).

     

    A book on “sea mysteries” or the “Bermuda Triangle” might tell us that the yacht Connemara IV was found drifting crewless, southeast of Bermuda, on September 26, 1955. None of these books mention that the yacht had been directly in the path of Hurricane Iona, with 180 mph winds and 40-foot waves.

     

  • Argument By Selective Observation:

    Also called cherry picking, the enumeration of favorable circumstances, or, as the philosopher Francis Bacon described it, counting the hits and forgetting the misses. For example, a state boasts of the Presidents it has produced, but is silent about its serial killers. Or the claim “Technology brings happiness”, which counts technology’s benefits while forgetting its harms - itself a neat example of selective observation.

     

    Casinos encourage this human tendency. There are bells and whistles to announce slot machine jackpots, but losing always happens silently. This makes it much easier to think that the odds of winning are good.

     

  • Argument By Selective Reading:

    Making it seem as if the weakest of an opponent’s arguments was the best he had. If the opponent had a strong argument X and also a weaker argument Y, an attacker can concentrate on rebutting Y and claim that the opponent has made a weak case.

     

    This is a relative of Argument By Selective Observation, in that the arguer overlooks arguments that he does not like. It is also related to Straw Man (Fallacy Of Extension), in that the opponent’s argument is not being fairly represented.

     

  • Argument By Generalization:

    Drawing a broad conclusion from a small number of perhaps unrepresentative cases; the cases may be unrepresentative because of Selective Observation. For example: “They say 1 out of every 5 people is Chinese. How is this possible? I know hundreds of people, and none of them is Chinese.” So, by generalization, there aren’t any Chinese anywhere.

     

    This is connected to the Fallacy Of The General Rule.

     

    Similarly, “Because we allow terminally ill patients to use heroin, we should allow everyone to use heroin.”

     

    It is also possible to under-generalize. For example: “A man who had killed both of his grandmothers declared himself rehabilitated, on the grounds that he could not conceivably repeat his offense in the absence of any further grandmothers.” - from “Ports Of Call” by Jack Vance.

     

  • Argument From Small Numbers:

    For example: “I’ve thrown three sevens in a row. Tonight I can’t lose.” This is Argument By Generalization, but it also assumes that small numbers are the same as big numbers. Three sevens is actually a common occurrence. Thirty three sevens is not.
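
    The arithmetic behind “three sevens is common” can be checked directly. A minimal sketch, assuming two fair six-sided dice (so a seven comes up with probability 1/6 on each throw):

```python
from fractions import Fraction

# Two fair six-sided dice: 6 of the 36 equally likely outcomes sum to 7.
p_seven = Fraction(6, 36)            # = 1/6

p_three_in_a_row = p_seven ** 3      # 1/216, about 0.46% - a common event
p_thirty_three = p_seven ** 33       # vanishingly small - effectively never

print(float(p_three_in_a_row))   # ~0.0046
print(float(p_thirty_three))
```

    A run of three is to be expected now and then in an evening of play; a run of thirty-three is not.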

     

    Or: “After treatment with the drug, one-third of the mice were cured, one-third died, and the third mouse survived.” Does this mean that if we treated a thousand mice, 333 would be cured? Well, no.

     

  • Misunderstanding The Nature Of Statistics (Innumeracy):

    President Dwight Eisenhower expressed astonishment and alarm on discovering that fully half of all Americans had below average intelligence. Similarly, some people get fearful when they learn that their doctor wasn’t in the top half of his class. But that applies to half of the class!

    “Statistics show that of those who contract the habit of eating, very few survive.” - Wallace Irwin.

    Very few people seem to understand “regression to the mean”. This is the tendency for an unusually extreme measurement to be followed by a more typical one. If you feel normal today, does it really mean that the headache cure you took yesterday performed wonders? Or is it just that your headaches are nearly always gone the next day?
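
    A toy simulation (all numbers invented purely for illustration) shows the effect: when severity is nothing but random noise, the day after a very bad day looks like a dramatic recovery, even though nothing at all was done.

```python
import random

random.seed(0)
# Toy model with made-up numbers: daily headache severity is pure
# independent noise on a 0-10 scale; nothing trends or improves.
days = [random.uniform(0, 10) for _ in range(10_000)]

# Days bad enough that you might reach for a "cure" (severity above 8)...
bad = [i for i in range(len(days) - 1) if days[i] > 8]

avg_overall = sum(days) / len(days)
avg_bad = sum(days[i] for i in bad) / len(bad)
avg_after = sum(days[i + 1] for i in bad) / len(bad)

print(round(avg_overall, 2))  # about 5
print(round(avg_bad, 2))      # about 9: the days the "cure" was taken
print(round(avg_after, 2))    # about 5 again - with no treatment at all
```

    The “cure” was only ever taken on the worst days, so the next day was almost bound to be better regardless.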

     

    Journalists are notoriously bad at reporting risks. For example, in 1995 it was loudly reported that a class of contraceptive pills would double the chance of dangerous blood clots. The news stories mostly did not mention that “doubling” the risk only increased it by one person in 7,000.
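
    The gap between relative and absolute risk is easy to make concrete. A sketch, taking the 1-in-7,000 increase quoted above as implying a 1-in-7,000 baseline (an assumption made purely for illustration):

```python
# Illustrative figures only, inferred from the report above: a baseline
# clot risk of 1 in 7,000 per year, which the pill "doubles".
baseline = 1 / 7_000
doubled = 2 * baseline

relative_increase = (doubled - baseline) / baseline   # 1.0: "risk doubled!"
absolute_increase = doubled - baseline                # 1 extra case per 7,000

print(f"relative increase: {relative_increase:.0%}")             # 100%
print(f"absolute increase: 1 in {round(1 / absolute_increase)}")  # 1 in 7000
```

    “100% higher” makes a headline; “one extra case per 7,000 women per year” does not, yet both describe the same numbers.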

     

    The reports of “Cell phones cause brain cancer” take this further, with the supposed increase in risk being at most one or two cancers per 100,000 people per year. So, if the fear-mongers are right, your cellphone has increased your risk from “very little” to “very little”.

     

  • Inconsistency:

    For example, the declining life expectancy in the former Soviet Union is due to the failures of communism. But the rather high infant mortality rate in the United States is not a failure of capitalism.

     

    This is related to Internal Contradiction.

     

  • Non Sequitur:

    Something that just does not follow. For example: “Tens of thousands of Americans have seen lights in the night sky which they could not identify. The existence of life on other planets is fast becoming certainty!” Another example is someone arguing at length that their religion is of great help to many people, and then concluding that the teachings of that religion are undoubtedly true.

     

    Or: “Bill lives in a large building, so his apartment must be large.”

     

  • Meaningless Questions:

    Irresistible forces meeting immovable objects, and the like.

     

  • Argument By Poetic Language:

    If it sounds good, then surely it must be right. Songs often use this effect to create a sort of credibility. For example: “Don’t Fear The Reaper” by Blue Oyster Cult. Politically oriented songs should be taken with a grain of salt, precisely because they sound good.

     

  • Argument By Slogan:

    If it’s short, and connects to an argument, it may seem to be an argument. But the Reductive Fallacy of oversimplification tends to apply to pithy slogans. Being short, a slogan increases the effectiveness of Argument By Repetition. It also helps Argument By Emotive Language (Appeal To The People), since emotional appeals need to be punchy. And also, the gallery can easily chant a short slogan. Using an old slogan is Cliche Thinking.

     

  • Argument By Prestigious Jargon:

    Using long, complicated or impressive sounding words so that one will seem to be an expert. Why do people use “utilize” when they could utilize “use”? For example, some crackpots used to claim they had a Unified Field Theory. Now the word Quantum is popular, much used in pseudo-scientific psychobabble. And then there are Zero Point Fields.

     

  • Argument By Gibberish (Bafflement):

    This is the extreme version of Argument By Prestigious Jargon. An invented vocabulary helps the effect, but perfectly ordinary words can also be used to baffle. For example: “Omniscience is greater than omnipotence, and the difference is two. Omnipotence plus two equals omniscience. META = 2.” - from R. Buckminster Fuller’s No More Secondhand God.

     

    Gibberish may come from people who can’t find meaning in technical jargon, so they think they should copy style instead of meaning. Or it can also be a “snow job”, AKA “baffle them with bullshit”, by someone actually familiar with the jargon.

     

    Or it could be Argument By Poetic Language; an example of poetic gibberish is: “Each autonomous individual emerges holographically within egoless ontological consciousness as a non-dimensional geometric point within the transcendental thought-wave matrix.”

     

  • Equivocation:

    Using a word to mean one thing, and then later using it to mean something different. For example, sometimes “Free software” costs nothing, and sometimes it is without restrictions. Some more examples are:

    The sign said “Fine for parking here”, and since it was fine, I parked there.

    “All trees have bark; all dogs bark; therefore, all dogs are trees.”

    “Consider that two wrongs never make a right, but that three lefts do.”

     

  • Euphemism:

    The use of words that sound better. The lab rat wasn’t killed, it was sacrificed. Mass murder wasn’t genocide, it was ethnic cleansing. The death of innocent bystanders is collateral damage. Microsoft doesn’t find bugs, or problems, or security vulnerabilities; they just discover an issue with a piece of software.

     

    This is related to Argument By Emotive Language, since the effect is to make a concept emotionally palatable.

     

  • Weasel Wording:

    This is very much like Euphemism, except that the word changes are done to claim a new, different concept rather than soften the old concept. For example, an American President may not legally conduct a war without a declaration of Congress, and so various Presidents have conducted “police actions”, “armed incursions”, “protective reaction strikes,” “pacification,” “safeguarding American interests”, and a wide variety of “operations.”

     

    Similarly, War Departments have become Departments of Defense, and untested medicines have become alternative medicines. The book “1984” has some particularly good examples.

     

  • Error Of Fact:

    For example: “No one knows how old the Pyramids of Egypt are.” Except, of course, for the historians who’ve read records and letters written by the ancient Egyptians themselves.

     

    Typically, the presence of one error suggests that there are others to be uncovered.

     

  • Argument From Personal Astonishment:

    Errors of Fact caused by stating offhand opinions as proven facts. The speaker’s thought process is typically “I don’t see how this is possible, so it isn’t.” The speaker may not be lying, since he may well believe what he says, but it can seem that way to people who know more about the subject than the speaker does.

     

  • Lies:

    Intentional Errors of Fact. In some contexts this is called bluffing. If the speaker thinks that lying serves a moral end, this would be a Pious Fraud.

     

  • Contrarian Argument:

    In science, espousing some thing that the speaker knows is generally ill-regarded, or even generally held to be disproven. For example, claiming that HIV is not the cause of AIDS, or claiming that homeopathic remedies are not just placebos.

     

    In politics, the phrase may be used more broadly, to mean espousing some position that the establishment or opposition party does not hold.

     

    This type of argument is sometimes done to make people think, and sometimes it is needling, or perhaps it supports an external agenda. But it can also be done just to oppose conformity, or as a pose or style choice: to be a “maverick” or lightning rod. Or, perhaps just for the ego of standing alone.

    “It is not enough to succeed. Friends must be seen to have failed.” - Truman Capote.

    “If you want to prove yourself a brilliant scientist, you don’t always agree with the consensus. You show you’re right and everyone else is wrong.” - Daniel Kirk-Davidoff discussing Richard Lindzen.

     

    Calling someone contrarian risks the Psychogenetic Fallacy. People who are annoying are not necessarily wrong. On the other hand, if the position is ill-regarded for a reason, then defending it may be uphill. Trolling is Contrarian Argument done to get a reaction; trolling on the Internet often involves pretense.

     

  • Hypothesis Contrary To Fact:

    Arguing from something that might have happened, but didn’t.

     

  • Internal Contradiction:

    Saying two contradictory things in the same argument. One author wrote on one page, “Sir Arthur Conan Doyle writes in his autobiography that he never saw a ghost.”, but later on in the same book we find “Sir Arthur’s first encounter with a ghost came when he was 25, surgeon of a whaling ship in the Arctic.”

     

    This is much like saying “I never borrowed his car, and it already had that dent when I got it.”

     

    This is related to Inconsistency.

     

  • Changing The Subject (Digression, Red Herring, Misdirection, False Emphasis):

    This is sometimes used to avoid having to defend a claim, or to avoid making good on a promise. In general, there is something the other people are not supposed to notice.

     

    This is connected to various diversionary tactics, which may be obstructive, obtuse, or needling. For example, someone who quibbles about the meaning of a word you used may be quite happy when you rise to the bait, since the quibble changes the subject and draws attention away from the substance of your argument.

     

    People may pick nits in wording, perhaps asking for a definition, or they may deliberately misunderstand the speaker, for example:

    “You said this happened five years before Hitler came to power. Why are you so fascinated with Hitler? Are you anti-Semitic?”

     

    This type of argumentative tactic is also connected to various rhetorical tricks, such as announcing that there cannot be a question period because the speaker must leave and there is no more time.

     

  • Argument By Fast Talking:

    If one goes from one idea to the next quickly enough, the audience won’t have time to consider in detail what is being said. This is connected to Changing The Subject and (for some audiences) Argument By Personal Charm.

     

    Some psychologists consider that to understand what you hear, you must for a brief moment believe it. If this is true, then rapid delivery does not leave people time to reject what they hear.

     

  • Having Your Cake (Failure To Assert, or Diminished Claim):

    Almost claiming something, but backing out. For example: “It may be, as some suppose, that ghosts can only be seen by certain so-called sensitives, who are possibly special mutations with, perhaps, abnormally extended ranges of vision and hearing. Yet some claim we are all sensitives.”

     

    Another example is: “I don’t necessarily agree with the liquefaction theory, nor do I endorse all of Walter Brown’s other material, but the geological statements are informative.” The strange thing here is that liquefaction theory (the idea that the world’s rocks formed in flood waters) was demolished in 1788. To “not necessarily agree” with it, today, is in the category of “not necessarily agreeing” with 2 + 2 = 3. Note that the writer implies some study of the matter, and only partial rejection.

     

    A similar thing is the failure to rebut. Suppose I raise an issue, and I get the response that “Woodmorappe’s book talks about that and states that …” This could possibly be a reference to a resounding rebuttal. Or maybe it’s a lie; or perhaps the responder hasn’t even read the book yet, and doesn’t know himself. How can we tell? We can’t, but we can point out that the responder hasn’t proved anything.

     

  • Ambiguous Assertion:

    A statement is made, but it is sufficiently unclear that it leaves some sort of leeway. For example, a certain book about Washington politics did not place quotation marks around quotes; this created ambiguity about which parts of the book were first-hand reports and which parts were second-hand reports, assumptions, or outright fiction.

     

    If a statement can have two different meanings, this is an amphibology, for example: “Last night I shot a burglar in my pyjamas.” But the intended meaning should be clear from the context.

     

  • Failure To State:

    If one makes enough attacks, and asks enough questions, one may never have to actually define one’s own position on the topic.

     

  • Outdated Information:

    Information is given, but it is not the latest information on the subject. For example, some creationist articles talk about the amount of dust on the moon, quoting a theoretical quantity based on measurements made in the 1950s. But many much better measurements have been done since then, debunking the quoted quantity; see, for example, NASA’s article Dust on the moon.

     

  • Amazing Familiarity:

    The speaker seems to have information that there is no possible way for him to get, on the basis of his own statements. For example: “The first man on deck, seaman Don Smithers, yawned lazily and fingered his good luck charm, a dried seahorse. To no avail! At noon, the Sea Ranger was found drifting aimlessly, with every man of its crew missing without a trace!”

     

  • Least Plausible Hypothesis:

    Ignoring all of the most reasonable explanations. This makes the desired explanation into the only one. For example: “I left a saucer of milk outside overnight. In the morning, the milk was gone. Clearly, my yard was visited by fairies.”

     

    There is an old rule to assist in reaching a decision as to which explanation is the most plausible, usually referred to as “Occam’s Razor”, and it basically argues that the simplest is usually the best. The current phrase among scientists is that an explanation should be “the most parsimonious”, meaning that it should not introduce new concepts (like fairies) when old concepts (like neighborhood cats) will do.

     

    On ward rounds, medical students love to come up with the most obscure explanations for symptoms rather than suggest common maladies.

     

  • Argument By Scenario:

    Telling a story which ties together unrelated material, and then using the story as proof they are related.

     

  • Affirming The Consequent:

    This is a reversal of logic. A correct statement of the form “if P then Q” gets turned into “Q therefore P”. For example: “All cats die; Socrates died; therefore Socrates was a cat.”
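
    The reversal can be checked mechanically. Material implication “if P then Q” is false only when P is true and Q is false, so it leaves open the case where Q holds and P does not; a brute-force sketch over the four truth assignments finds exactly that counterexample:

```python
from itertools import product

def implies(p, q):
    # Material implication: "if p then q" is false only when p and not q.
    return (not p) or q

# Affirming the consequent claims: from (P implies Q) and Q, conclude P.
# Enumerate all four truth assignments and look for a counterexample.
counterexamples = [(p, q)
                   for p, q in product([False, True], repeat=2)
                   if implies(p, q) and q and not p]

print(counterexamples)   # [(False, True)]: the premises hold but P is false
```

    Socrates can die (Q true) without being a cat (P false), and the premise “all cats die” remains perfectly true.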

     

    Another example: “If the earth orbits the sun, then the nearer stars will show an apparent annual shift in position relative to more distant stars (stellar parallax). Observations show conclusively that this parallax shift does occur. This proves that the earth orbits the sun.” In reality, it proves that Q [the parallax] is consistent with P [orbiting the sun]. But it might also be consistent with some other theory. In the past, other theories did exist, but they are now dead, because although they were consistent with a few facts, they were not consistent with all the facts.

     

    Another example: “If space creatures were kidnapping people and examining them, the space creatures would probably hypnotically erase the memories of the people they examined. These people would thus suffer from amnesia. But in fact many people do suffer from amnesia. This tends to prove they were kidnapped and examined by space creatures.” This is also a Least Plausible Hypothesis explanation.

     

  • Moving The Goalposts (Raising The Bar, Argument By Demanding Impossible Perfection):

    If the opponent successfully addresses some point, then an attacker can say he must also address some further point. If the attacker can make these points more and more difficult (or diverse) then eventually the opponent must fail. If nothing else, such an attacker may eventually find a subject that the opponent isn’t totally au fait with.

     

    This is related to Argument By Question. Asking questions is easy: it’s answering them that’s hard. If each new goal causes a new question, this may come to seem like an infinite regress.

     

    It is also possible to lower the bar, reducing the burden on an argument. For example, a person who takes Vitamin C might claim that it prevents colds. When they do get a cold, they move the goalposts by saying that the cold would have been much worse if not for the Vitamin C.

     

  • Appeal To Complexity:

    If the arguer doesn’t understand the topic, he concludes that nobody understands it, and he believes that his opinions are as good as anybody’s, so others should accept his opinions.

     

  • Common Sense:

    Unfortunately, what counts as common sense isn’t always clear, and many questions simply have no common-sense answer. In politics, for example, there are many issues on which people disagree, and each side thinks that its answer is common sense. Clearly, some of these people are wrong. They are wrong because common sense depends on the context, knowledge and experience of the observer. That is why instruction manuals often contain paragraphs like these:

    When boating, use common sense. Have one life preserver for each person in the boat.

    When towing a water skier, use common sense. Have one person watching the skier at all times.

    If the ideas are so obvious, then why the second sentence? Why do they have to spell it out? The answer is that here “use your common sense” is not a good substitute for “pay attention, I am about to tell you something that inexperienced people often get wrong.”

     

    Science has discovered a lot of situations which are far more unfamiliar than water skiing. Not surprisingly, beginners find that much of it violates their common sense. For example, many people can’t imagine how a mountain range would form. But in fact anyone can take good GPS equipment to the Himalayas, and measure for themselves that those mountains are rising today.

     

    Hence, if a speaker tells an audience that he supports using common sense, it is very possibly an Ambiguous Assertion.

     

  • Argument By Laziness (Argument By Uninformed Opinion):

    The arguer hasn’t bothered to learn anything about the topic. He nevertheless has an opinion, and will be insulted if his opinion is not treated with respect.

     

  • Disproof By Fallacy:

    If a conclusion can be reached in an obviously fallacious way, then the conclusion is incorrectly declared wrong. For example:

    “Take the division 64/16. Now, canceling a 6 on top and a 6 on the bottom, we get that 64/16 = 4/1 = 4.”

    “Wait a second! You can’t just cancel the six!”

    “Oh, so you’re telling us 64/16 is not equal to 4, are you?”

    Note that this is different from Reductio Ad Absurdum, where the opponent’s argument can lead to an absurd conclusion. In this case, an absurd argument leads, by chance, to a correct conclusion.
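
    This “anomalous cancellation” gives the right answer purely by coincidence. A short sketch (in Python, for illustration) can confirm the arithmetic and find every proper two-digit fraction where the same bogus trick works; 64/16 is simply the first of them turned upside down:

```python
from fractions import Fraction

# "Cancelling the 6" turns 64/16 into 4/1 - a bogus step that happens
# to give the right answer, since 64/16 really does equal 4.
assert Fraction(64, 16) == Fraction(4, 1)

# Find every proper two-digit fraction where the same bogus cancellation
# (strike the units digit of the top against the tens digit of the bottom)
# still gives the correct value.
hits = []
for n in range(10, 100):
    for d in range(n + 1, 100):
        tens_n, units_n = divmod(n, 10)
        tens_d, units_d = divmod(d, 10)
        if units_n == tens_d and units_d != 0 \
                and Fraction(n, d) == Fraction(tens_n, units_d):
            hits.append((n, d))

print(hits)   # [(16, 64), (19, 95), (26, 65), (49, 98)]
```

    Only four such coincidences exist among proper two-digit fractions, which is precisely why the “method” cannot be trusted.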

     

  • Reductio Ad Absurdum:

    Showing that the opponent’s argument leads to some absurd conclusion. This is in general a reasonable and non-fallacious way to argue. If the issues are razor-sharp, it is a good way to completely destroy their argument. However, if the waters are a bit muddy, perhaps it will only succeed in showing that the opponent’s argument does not apply in all cases.

     

    That is, using Reductio Ad Absurdum is sometimes using the Fallacy Of The General Rule. However, if one is faced with an argument that is poorly worded, or only lightly sketched, Reductio Ad Absurdum may be a good way of pointing out the holes.

     

    An example:

    Bertrand Russell, in a lecture on logic, mentioned that in the sense of material implication, a false proposition implies any proposition. A student raised his hand and said:

    “In that case, given that 1 = 0, prove that you are the Pope”.

    Russell immediately replied:

    “Add 1 to both sides of the equation: then we have 2 = 1. The set containing just me and the Pope has 2 members. But 2 = 1, so it has only 1 member; therefore, I am the Pope.”

     

  • False Compromise:

    If one does not understand a debate, it might be argued that it must be “fair” to split the difference, and agree on a compromise between the opinions, for example:

    “Some say the sun rises in the east, some say it rises in the west; the truth lies probably somewhere in between.”

    But one side might be wrong; in any case one could simply suspend judgment. Note that journalists often invoke this fallacy in the name of “balanced” coverage; television reporters like balanced coverage so much that they may give half of their report to a view held by a small minority of the people in question. Viewers need to be aware of this tendency.

     

  • Fallacy Of The Crucial Experiment:

    Claiming that some idea has been proved (or disproved) by a pivotal discovery. This is the “smoking gun” version of history. Scientific progress is often reported in such terms. This is inevitable when a complex story is reduced to a soundbite, but it can be a distortion. In reality, a lot of background happens first, and a lot of buttressing (or retraction) happens afterwards. And in natural history, most of the theories are about how often certain things happen (relative to some other thing). For those theories, no one experiment could ever be conclusive.

     

  • Two Wrongs Make A Right (Tu Quoque, You Too, What’s sauce for the goose is sauce for the gander):

    A charge of wrongdoing is answered by a rationalization that others have sinned, or might have sinned. For example, Bill borrows Jane’s expensive pen, and later finds he hasn’t returned it. He tells himself that it is okay to keep it, since he thinks that she would have taken his in similar circumstances.

     

    War atrocities and terrorism are often defended in this way. Similarly, some people defend capital punishment on the grounds that the state is killing people who have killed.

     

    This is related to Ad Hominem (Argument To The Man).

     

  • Pious Fraud:

    A fraud done to accomplish some good end, on the theory that the end justifies the means. For example, a church in Canada had a statue of Christ which started to weep tears of blood. When analyzed, the blood turned out to be beef blood. We can reasonably assume that someone with access to the building thought that bringing souls to Christ would justify the deception.

     

    In the context of debates, a Pious Fraud could be a lie. More generally, it occurs when an emotionally committed speaker makes an assertion that is shaded, distorted or even fabricated. For example, British Prime Minister Tony Blair was accused in 2003 of “sexing up” his evidence that Iraq had weapons of mass destruction.

     

    Around the year 400, Saint Augustine wrote two books on this subject, De Mendacio [On Lying] and Contra Mendacium [Against Lying]. He argued that the sin is not in what you do (or don’t) say, but in your intention to leave a false impression. He strongly opposed Pious Fraud.
