Week 4: Fallacies, Biases, and Rhetoric

Just as it is important to find truth, it is equally important to learn to avoid error. It is analogous to playing defense. The main way we play defense in logic is by guarding against fallacies and biases. Fallacies are common forms of inference that are not good; they do not adequately support their conclusions. The best way to learn to avoid them is to learn to identify them so that you will notice when they occur.

Since there are literally hundreds of fallacies, we will only have time to discuss a few. However, we will focus on some of the most common, and readers can go on to learn more, both from our book and from other online resources. Here is a brief summary of a few of the most important and most common (these are explained in much greater detail in the book, and many more fallacies are addressed there as well, so make sure to read Chapter 7 before doing the activities of the week).

This week's guidance will cover the following topics:

  1. Begging the Question

  2. The Straw Man Fallacy

  3. The Ad Hominem Fallacy

  4. The Appeal to Popular Opinion

  5. The Appeal to Emotion

  6. Other Fallacies

  7. Cognitive Biases

  8. Argumentative Devices

  9. Things to Do This Week

Begging the Question

Possibly the most commonly committed fallacy is begging the question, in which an argument assumes a main point at issue in its premises.

Circular reasoning is an extreme version of begging the question in which a premise is identical to the conclusion.

Here are some examples of each:

  1. Don’t listen to that candidate; he’s untrustworthy.

  2. You shouldn’t bet on that horse; it’s going to lose.

  3. Don’t buy a Mac since PCs are better.

  4. Marijuana should not be legalized because that would be disastrous.

  5. You should join my religion because it’s the true one.

  6. That food is bad for you because it is unhealthy.

How to Avoid Begging the Question

To avoid this fallacy, use premises that do not assume the point at issue but that instead rest on principles and observations that both parties could, in principle, accept.

Can you think of ways to fix each of the above arguments? What premises could you add so that the arguments are not only substantive but also support their conclusions in ways that are likely to be acceptable to someone who doesn’t already agree?

An Example of Avoiding Begging the Question by Creating a Supporting Argument

Suppose you want to say why abortion is wrong, and you use the premise that abortion kills a human being. This argument simply assumes that a human fetus is a human being, which is a major point at issue. One way that you might seek to get out of this problem is to come up with a supporting argument for that premise. That is, you might construct a piece of reasoning intending to demonstrate to the other parties why a fetus should count as a human being.

To do this without begging the question will be difficult, but it typically will involve appealing to a definition of what counts as a human being. Once you are done, the argument might look as follows:

Premise 1: By definition, a human being is a living organism of the human species.

Premise 2: A human fetus is a living organism from conception.

Premise 3: A human fetus is of the human species.

Conclusion: Therefore a human fetus is a human being from conception.

Providing a supporting argument is a way to support a premise rather than merely asserting it. This particular argument may or may not be persuasive to the other party (the central question may not be simply the entity’s species but the stage at which it begins to have rights), but at least it is a sincere effort to demonstrate the reasoning for one’s side in a way that attempts to start on neutral ground, without simply assuming one answer to the question.

Learning not to beg the question can be a lifelong battle, but it is one that, in the attempt, will make you not only a sharper thinker but also a more fair-minded one.

The Straw Man Fallacy

One of the most challenging things to do when confronting positions that are strongly contrary to one’s own is to learn to represent the other side’s position fairly. This means actually learning their side and then representing it well. We often find it so much easier simply to make fun of the other side’s position by making it sound ridiculous; however, to do so is to commit the straw man fallacy.

Intentional or Unintentional? It’s Bad Either Way

As the video mentions, people commit the straw man fallacy either intentionally or unintentionally. Either case is bad. If someone commits it unintentionally, it is generally because they don’t fully understand the other side’s position. It is not possible to come to a good conclusion about an issue if you don’t properly understand the other position. Committing the straw man in this way might fool people who also don’t understand the other side, but to those on the other side, it just reveals one’s ignorance.

The case is worse if someone commits the straw man intentionally. This would mean that one perhaps knows that one’s argument cannot stand up to the real opposing arguments, so one is trying to fool one’s audience by putting one’s argument against a weakened or absurd version of that side’s views. This approach demonstrates a kind of dishonesty in reasoning and is not part of how a mature critical thinker would advocate for a position.

Furthermore, people who hold the other view often find it ignorant or even offensive to hear someone misrepresent their views. Have you ever heard your views misrepresented? It can be rather irksome, and it does not lead to high respect for the person who does so. This can be particularly troubling in controversial arenas like religion, politics, and personal relationships.

Here are some more examples of possible straw man fallacies that might occur in daily life:

  1. You never want to have any fun (in response to someone declining to go to one party)

  2. You think I should spend my whole life studying (in response to parents asking their child to finish homework before going out)

  3. Republicans don’t care about poor people

  4. Democrats want the government to control everything

  5. Israel just wants to wipe Palestine off the map

  6. Palestine just wants to wipe Israel off the map

  7. Vegetarians want us all to live on salad

  8. Meat eaters don’t care how animals are treated

  9. You hate my family (in response to expressing stress about a visit)

How to Avoid the Straw Man Fallacy

One of the great challenges for a sophisticated thinker is to appreciate views that are contrary to one’s own. It may actually be impossible to get there without learning to appreciate that those views are not so dumb after all. The reason that otherwise reasonable and decent people subscribe to a certain view is usually not that they are all stupid or evil, but that they have good reasons for those views. However, those reasons are often complex and deeply rooted in their upbringing, their religion, their subculture, their friends, their feelings, and their personal experiences. Therefore, it is not easy to present opposing views in a convincing manner unless one can appreciate the depth and breadth of the other person’s point of view.

Gay marriage could be an example. Here are two straw man versions of people’s views:

  1. People oppose gay marriage just because they are homophobic.

  2. People who support gay marriage want to ruin the family.

Neither of those seems like a very fair representation of people’s actual views and feelings. To understand those views, you might have to do the following:

  1. Actually listen to what people on the other side say. Read high-quality articles on both sides; listen to someone explain their view without arguing back. Remember that they may not yet have fully worked out their views themselves, so understanding them may require some thought. Be patient and charitable.

  2. Practice empathy; think about why someone might feel that way and learn some of the experiences that led them to those positions.

  3. Repeat the process until their position actually makes sense to you and until you can explain their views back to them without their objecting.

You can tell that you are on your way to overcoming this fallacy when their views no longer seem offensive; you may still not fully agree, but you can understand why a perfectly intelligent person might come to such a view and you can represent those views fairly to others.

Exercise

See if you can fix the examples of the straw man above. Actually write down what would be a more accurate interpretation of what someone on that side of the issue probably thinks. Writing them down will help you to develop skills for writing more sophisticated interpretations in the future.

The Ad Hominem Fallacy

It is important in life to focus on the issues at hand rather than on the person arguing for a position. It is common, for example, for people to pick sides of an issue based on the types of people they like to spend time with. We then dismiss the views of others not because they are wrong but because we object to the person him- or herself. To do so is to commit the ad hominem fallacy: the dismissal of a person’s views simply because of who said them, not because of the merits of the arguments.

Here are two examples (one from each side):

  • Environmentalism is wrong because they are just a bunch of dirty hippy tree huggers.

  • You should support the environment because people who don’t are heartless and greedy.

If we are to address the question of whether to support a particular piece of environmental legislation, we should carefully look at the merits of the arguments on each side of the issue, not simply take a global stance because of the types of people who advocate either side.

One version of the ad hominem dismisses someone’s view because he or she does not fully live up to that view. This would be a more specific type of fallacy called tu quoque, which is Latin for “you too!” The fact that Al Gore has a large house, for example, has been used to dismiss his climate advocacy as insincere or hypocritical. How he personally lives does not change the merits of his arguments.

The Difference Between Ad Hominem Fallacy and Questioning an Appeal to Authority

Sometimes the person giving an argument is relevant to the strength of the argument. This is the case if the argument is an appeal to authority. It is fully legitimate in such arguments to call into question a source’s credibility; to fail to do so might lead us to commit a fallacious appeal to inadequate authority.

The difference between a legitimate questioning of an authority and an ad hominem fallacy, then, is largely about whether we are dismissing someone’s views (and reasoning) because of who said them or whether we are simply challenging the trustworthiness of a source. Here is an example of each:

Legitimate: I do not trust the claims in that ad since it was produced by the company that stands to make money from it.

Fallacious: We shouldn’t listen to his reasoning about this issue because he has been to jail twice.

It is sometimes tricky to tell the difference, for example, when we object that a speaker does not have a degree in the subject matter. Some subjects require vast amounts of knowledge (and perhaps a degree), but people can often have good arguments even without having majored in the field. Often the best way to tell the difference is to ask whether we are seeking to dismiss or ignore the person’s reasoning; to do so is generally fallacious. To avoid this fallacy, we should be careful to assess the quality of the reasoning itself rather than just the person giving it.

Exercise

See if you can identify whether each of the following is fallacious, and see if you can fix each one that is:

  1. You have no right to tell me how to parent my kids; just look at how yours turned out!

  2. You say that climate change is real, but then why do you drive a car?

  3. Why should I trust your arguments about wearing seat belts? You smoke!

  4. I heard that politician’s speech, and he was loud and obnoxious. I would not vote for him.

  5. I wouldn’t listen to that study. The people who wrote it are employees of the industry being studied.

  6. I would not listen to chiropractors; they just make claims in order to get your money.

The Appeal to Popular Opinion

The appeal to popular opinion (covered in the book) is more common than one might think. Whether we realize it or not, most of our views are formed by the culture in which we live. Therefore, much of what we take to be true we have received from the popular views of our society. It can even be very difficult to think truly independently. The great philosopher Immanuel Kant defined “enlightenment” as the ability to think for oneself; it can be a rare and even dangerous skill.

Here are some examples of appeals to popular opinion based on things that are commonly taken to be true:

  • “It is a truth universally acknowledged that a single man in possession of a good fortune, must be in want of a wife.” (Austen, n.d.)

  • One would have to be crazy to give all of one’s money to famine relief; that is just too extreme.

  • You should obviously take the job that will bring you the most wealth and prestige. To drop out and live as a poor person is very irresponsible.

The list could go on and on. Closely related to the appeal to popular opinion is the appeal to tradition in which one reasons that something is right because that is what people have believed or done for generations. Though tradition is good in many cases for building unity within a people, there are plenty of cases in which there are good reasons to break from a tradition or to modify it.

Exercise

See if you can tell which of the following are fallacious appeals to popular opinion and which are legitimate appeals to authority:

  1. I won’t buy a Ford; everyone says that they are unreliable.

  2. Everyone eats meat, so it must be right.

  3. 97% of climate scientists agree that global warming is real (NASA, n.d.), so it is.

  4. All physicists believe that Einstein was wrong about Quantum Mechanics, so he was.

  5. Hershey’s is by far the most popular candy bar, so it must be the best.

  6. Rotten Tomatoes (a movie reviewing site) has that movie at 95% approval, so I am going to see it.

The Appeal to Emotion

We all have emotions, and we are all deeply influenced by them all of the time. It is not wrong or fallacious to care about one’s emotions and to act in ways that are meant to make us happy. The fallacy of appeal to emotion is to use emotion in place of, or in opposition to, careful reasoning.

If one buys a car partly because it will be fun to drive (provided one can afford it), this is not necessarily fallacious. However, if one buys a car because it will be fun even though it is financially unwise, one has committed the fallacy.

It is not always easy, however, to know where our emotions end and our reasoning begins. On the issue of capital punishment, for example, we often feel that a person deserves to be killed for certain heinous crimes. Is this feeling merely an emotion, or is it based in reason? These are sometimes difficult questions for philosophers and ethicists.

Closely related is the appeal to pity, in which one argues that a decision should be made out of pity rather than on the basis of reason.

There is a famous commercial for the ASPCA in which a song by Sarah McLachlan tears at one’s heartstrings while images of dejected puppies in cages are shown. This is seen by many as a blatant appeal to pity. It may, however, be a borderline case. To the extent that the commercial informs viewers of actual realities that happen to be very sad, it may not be fallacious. Emotion in response to viewing suffering may not be an instance of a fallacy but of compassion. However, if the money given would not do much to actually relieve the suffering of animals, and the commercial merely tugs on people’s emotions to make them feel that they should give, then it would be fallacious. Determining whether the commercial is fallacious, therefore, may require doing actual research into the specifics of the ASPCA’s charitable practices.

The appeal to fear is in the same category, but the emotion is fear instead. Whether the appeal is fallacious is generally based on whether the fear is rationally justified or not.

Exercise

Try to make a reasoned determination about whether each of the following constitutes an appeal to emotion or appeal to pity fallacy:

  1. Judge, you should let this guy have a short sentence, because he has had a hard life.

  2. You shouldn’t break up with me because I need you.

  3. You should give money to famine relief because children are starving (pictures shown).

  4. You should not fly in a plane because it could crash, and you would die.

  5. You should marry her because you love her, and you will be happier.

  6. Without this war, our country could be taken over by terrorists.

  7. He should get the death penalty because he is a monster.

Other Fallacies

There are many other fallacies covered in Chapter 7 of the book, including the appeal to force, appeal to ignorance, appeal to ridicule, biased sample, cherry picking, equivocation, fallacy of accident, fallacy of composition, fallacy of division, false cause, false dilemma, hasty generalization, non sequitur, poisoning the well, red herring, shifting the burden of proof, and the slippery slope.

Rather than focus on these fallacies in more detail, we instead will take some time to go over the very common phenomenon of cognitive biases. Here is a set of exercises to test your skills at identifying fallacies:

Exercise

Match the fallacy to the example (be aware that many examples could illustrate more than one fallacy; however, one fallacy is meant to be dominant in each):

Examples:

  1. “What you are doing is wrong because it isn’t right.”

  2. “I wore the red shirt and won; it must be lucky.”

  3. “You should catch your refrigerator because it’s running!”

  4. “Both of my friends didn’t like the movie, so no one does.”

  5. “You either agree with me or you are a bad person.”

  6. “Don’t listen to him; he’s a Cowboys fan.”

  7. “Speeding is bad, so don’t speed on the way to the emergency room.”

  8. “If we legalize marijuana, then soon we will have to legalize heroin too.”

  9. “John is going to talk to you; don’t believe a word he says.”

  10. “Don’t buy that kind of soda or you will lose all of your teeth.”

  11. “Don’t bet on that horse; everyone knows he will lose.”

  12. “If you don’t agree with me, you are fired!”

  13. “Don’t buy that car because, well, hey, didn’t you like that movie yesterday?”

  14. “You should agree with me unless you can prove I am wrong.”

  15. “Each fish is light, so a net full of them should be light.”

  16. “You should give me an A; my dog just died.”

Fallacies:

  a. Ad hominem

  b. Appeal to popular opinion

  c. Appeal to emotion/pity

  d. Appeal to fear

  e. Appeal to force

  f. Appeal to ignorance/Shifting the burden of proof

  g. Begging the question/Circular reasoning

  h. Biased sample/Hasty generalization

  i. Equivocation

  j. Fallacy of accident

  k. Fallacy of composition/division

  l. False cause

  m. False dilemma

  n. Non sequitur/Red herring

  o. Poisoning the well

  p. Slippery slope

Cognitive Biases

We are not as rational as we might think. It turns out that all of us are subject to tendencies to think of things in ways that are not truly objective; we see things in ways that favor maintaining certain beliefs and activities, for example, those that are approved of by our social groups. The general term for such tendencies that lead us away from pure objectivity is cognitive bias.

There are many specific types of cognitive bias. One of the most famous and common is confirmation bias: the tendency to accept only information that confirms what we already believe. We tend to notice, remember, and accept only things that confirm our opinions. Studies have even shown that when faced with evidence contrary to our cherished beliefs, we not only fail to change our views but actually dig in and become even more persuaded than before; this is known as the backfire effect (Silverman, 2011).

The bandwagon effect is very similar to the appeal to popular opinion, except that it is not so much an inference as a tendency to believe what other people believe. Such views are socially reinforced, as we have all kinds of social incentives to agree with our friends, family, and other associates. Examples throughout history have shown that people are often willing to go as far as to dismiss the rights of entire categories of human beings if almost all others in their society do.

We also tend to be more aware of information that stands out to us. Information that is emotionally charged or in some other way noticeable is the type that we remember and focus on. This is the phenomenon of the availability heuristic, in which we give much greater weight to information that is more available to our minds. This, combined with the negativity bias, in which people have greater recall of negative memories than positive ones, may help explain so-called Murphy’s Law. Even if we hit a normal number of red lights and long checkout lines, we tend to remember the times when everything goes wrong; we therefore think of ourselves as victims of Murphy’s Law when we are really just susceptible to the availability heuristic.

Paradigms and Bias

Combine these tendencies and what we get is a real resistance to entertaining views that are not our own and that are not shared by our social groups. Politics is a good example of this phenomenon. We tend to listen to people whose views agree with our own, and we have whole paradigms (loosely defined as the interwoven set of things that we believe) that are constantly being reinforced against opposition.

What happens next is that when we encounter a contrasting view, especially about an emotionally charged topic, we have a hard time fitting it into our own paradigms. As a result, we can see the view as ridiculous or unsupported. That is because we do not share the surrounding beliefs that help to explain why others believe it.

The Interconnected Webs of Belief

Our beliefs form interconnected webs, which is part of what makes it so difficult to fully understand what other people think. We can’t understand just one belief in isolation; we have to understand all of the other beliefs that surround and support it, and all the beliefs that surround and support those, and so on. We have to understand a whole paradigm! Expanding one’s point of view can therefore be a lifelong exercise in open-minded exploration.

Bias and the Media

People like to have their views confirmed, and they especially do not like to have them shot down. This is one of the reasons that people tend to listen to media sources that support their own points of view (McRaney, 2010). This can lead to deeper levels of entrenchment when it comes to things like politics and religion. Unfortunately, these levels of entrenchment are part of the source of deep political divides and can even harm close relationships. Is there a solution to this problem?

Logic may be part of the solution by challenging us to see the reasons for our differences of interpretation. Logic demands that we do not misrepresent other views or dismiss them easily by merely assuming that they are wrong because they don’t agree with our own views.

One of the ways that the media can influence us is by telling us how to interpret the information that is out there. The same piece of information can be taken in totally different ways, in ways that support opposite conclusions, depending upon who is telling us the information. This is a bias called the framing effect. Our media source frames the information for us, telling us what it means and how to interpret it. Sometimes most of the work in drawing conclusions from a piece of information lies in the interpretation. The same event in the Middle East, for example, can be seen as vindicating either Democratic or Republican foreign policies depending upon who is explaining the event to us. This supplies yet another reason to hear from multiple points of view on any question.

A common way to react to views that we do not like is to reject their source as biased. This is known as the hostile media effect, in which we consider a source to be biased because what it presents is so contrary to our own views.

One way to try to avoid these types of biases is to seek out high-quality media sources whose views are contrary to one’s own. One can also open one’s mind to principles and interpretations of facts from multiple perspectives. These types of activities can lead one to become more fair-minded, as will be discussed in the final week of this course.

Argumentative Devices

In addition to giving arguments, there are many other techniques that can help strengthen one’s presentation of an argument. However, each can be misused as well. The book explains a few examples of argumentative devices and how they can be misused:

Evaluative language describes something in a positive or negative way. For example, we can call a smart person a genius (positive) or a nerd (negative). Slanting is the trick of overusing evaluative language, or of using evaluative language as a substitute for actual arguments.

Assuring terms use phrases like “I know,” “studies show,” and “definitely” to indicate to our audience a high degree of certainty that our claim is true. These tools are not necessarily fallacious, since there often are good reasons in support of our claims even if we are not in a position to give the details at the moment. However, this tool can also be abused, as when we use it as a substitute for good reasoning or when there may not be an actual source as strong as we claim. There is even such a thing as an abusive assurance, in which one gives an assurance designed to make people feel stupid if they disagree. Examples of abusive assurances include “obviously,” “everyone knows,” and “there is no question.”

Proof surrogates occur when someone claims authoritative support when that support may not exist. We should be wary of falling for an assurance that implies that we should not challenge the support for a claim.

Qualifiers (or guarding/hedging terms) weaken a claim so that it is more likely to be true. Words like “most” (rather than “all”), “some” (rather than “most”), and “usually” (rather than “always”) are used to make a claim much more plausible. The good news is that sentences with guarding terms are much more likely to be true than their stronger versions. The bad news is that the claim is not as strong. This can be a problem especially when the sentence is being used as a premise but the guarding term makes it no longer strong enough to support the conclusion adequately.

The mistake of overguarding a premise so that it no longer adequately supports the conclusion is known as the disappearing hedge (because guarding terms are also called hedging terms, and the hedge in these cases seems to have disappeared by the time we get to the conclusion). Qualifying a claim so much that it says very little at all turns it into a weasel word, as discussed in the book.

Other rhetorical devices discussed in the book include weasel words, euphemisms and dysphemisms, proof surrogates, hyperbole, and innuendo and paralipsis. Make sure to read Chapter 8 to understand these terms.

An Example of Logical/Rhetorical Analysis

One useful exercise is to circle and label the terms discussed in this section in a newspaper article, noticing the rhetorical moves and logical tricks committed by things that we see every day.

This helps demonstrate the ways in which these types of analyses can be relevant to daily life. We can take apart nearly everything we read, noticing the logical and rhetorical maneuvers, both legitimate and otherwise. This kind of ability can be highly useful!

Things to Do This Week
  1. Read the required materials for the week, including this guidance and Chapters 7 and 8 of the textbook.

  2. Watch the weekly intro video and all of the videos under the “Lectures” tab for this week of the course and view all other required materials.

  3. Post a timely (initial post by day 3) and thorough response to the first discussion forum as well as substantive replies to peers. This discussion topic is up to your instructor. The specific prompt will be posted by your instructor as the first response in the discussion forum. Make sure to follow carefully all elements of the discussion prompt.

  4. Respond to the second discussion prompt as well (initial post by day 3), along with responses to peers. This discussion topic is also up to your instructor.

  5. Take the Quiz for the week (by day 7). It covers the central concepts of the course as covered in the textbook, this guidance, and the lecture videos for this week.

  6. Post a response to the journal prompt as well. Make sure that your response is at least one full page (double spaced) and that you respond to all aspects of the instructions.

  7. Start working on your final papers (due next week and 20% of your grade). An early start will help you to succeed.

If you have any questions, make sure to let your instructor know, either via email or in the Ask Your Instructor forum.

References

Argento, G. (2013). Love is a fallacy [Video file]. Retrieved from https://www.youtube.com/watch?v=7_81fz6kUJI

Austen, J. (n.d.). Pride and Prejudice. Retrieved from https://www.gutenberg.org/files/1342/1342-h/1342-h.htm#link2HCH0001

Hardy, J. (2013). Arguments vs. explanations [PDF file]. College of Liberal Arts and Sciences, Ashford University, San Diego, CA.

Konnikova, M. (2014, May 16). I don't want to be right. The New Yorker. Retrieved from http://www.newyorker.com/science/maria-konnikova/i-dont-want-to-be-right

McRaney, D. (2010). Confirmation bias. Retrieved from http://youarenotsosmart.com/2010/06/23/confirmation-bias/

Mobil (1992, August 27). Currents of change. The New York Times.

NASA (n.d.). Consensus: 97% of climate scientists agree. Retrieved from http://climate.nasa.gov/scientific-consensus/

PBS Idea Channel (2014). The ad hominem fallacy | Idea Channel | PBS Digital Studios [Video file]. Retrieved from https://www.youtube.com/watch?v=IVFK8sVdJNg

PBS Idea Channel (2014). The strawman fallacy | Idea Channel | PBS Digital Studios [Video file]. Retrieved from https://www.youtube.com/watch?v=cGZkCPo7tC0

Philosophy Tube (2014). Begging the question – the gentleman thinker [Video file]. Retrieved from https://www.youtube.com/watch?v=OAXKc-rvMa8

Silverman, C. (2011). The backfire effect: More on the press’s inability to debunk bad information. Retrieved from http://www.cjr.org/behind_the_news/the_backfire_effect.php?page=all

Woolbob96’s channel (2013). Commercial appeal to emotion.wmv [Video file]. Retrieved from https://www.youtube.com/watch?v=RwhzbzZSAYg
