
Modes of Thinking

Logical Thinking

Thinking logically and identifying reasoning fallacies in one’s own and in others’ thinking is the heart of critical thinking.

Reasoning

Reasoning is a process by which we use the knowledge we have to draw conclusions or infer something new about the domain of interest.

There are a number of different types of reasoning:

  • deductive,

  • inductive, and

  • abductive.

We use each of these types of reasoning in everyday life, but they differ in significant ways.

Deductive Thinking

Deductive reasoning derives the logically necessary conclusion from given premises: it begins with two or more premises and draws a conclusion that must follow from them. The basic form of deductive thinking is the syllogism. An example of a syllogism follows:

  1. All squares have four sides.

  2. This figure is a square.

  3. Therefore, this figure has four sides.

Usually our thinking is not as formal as this but takes on a shorter form: “Because all squares have four sides and this figure is a square, it has four sides.” To understand the logic behind our shortened thought, we need to understand the structure that supports it: the syllogism. A syllogism is a three-step form of reasoning that has two premises and a conclusion. (Premises are statements that serve as the basis or ground of a conclusion.) Not all syllogisms are alike. We will look at three types: the categorical, the hypothetical, and the disjunctive (Kirby, 1999).

Categorical Syllogisms

The classic example of a categorical syllogism concerns the philosopher Socrates. Updated for gender, it goes as follows:

  1. MAJOR - All human beings are mortal.

  2. MINOR - Ann is a human being.

  3. CONCLUSION - Therefore, Ann is mortal.

We can see that categorical syllogisms categorize. In the example above “human beings” are put in the “mortal” category. “Ann” is in the “human being” category. And in the last statement, “Ann” is in the “mortal” category. If the first line of the syllogism above read, “some human beings are mortal,” then only some human beings would be in the “mortal” category.
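The categorizing at work in the Ann example can be sketched with Python sets. The encoding below is my own illustration, not from the text: class membership stands in for categorical propositions.

```python
# Categorical syllogism as set membership (a sketch; the names are invented).
human_beings = {"Ann", "Kerry"}

# Major premise: all human beings are mortal, encoded by putting
# every member of the "human being" category into the "mortal" category.
mortals = set(human_beings)

assert "Ann" in human_beings   # minor premise: Ann is a human being
assert "Ann" in mortals        # conclusion: Ann is mortal
print("conclusion follows")
```

If the major premise said only “some human beings are mortal,” this encoding would no longer work, since membership in `human_beings` would not guarantee membership in `mortals`.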

A categorical syllogism is a form of argument that contains statements (called categorical propositions) that either affirm or deny that a subject is a member of a certain class (category) or has a certain property. For example, “Toby is a cat” is a categorical statement because it affirms that Toby (the subject) is a member of a class of animals called “cats” (Kirby, 1999).

“Toby is brown” affirms that Toby has the property of brownness. Similarly, “Toby is not a cat” and “Toby is not brown” are categorical statements because they deny, respectively, that Toby belongs to a class of animals called “cats” and that Toby has the property of brownness. All valid syllogisms must have at least one affirmative premise.

In the standard form of a categorical syllogism, the major premise always appears first. It contains the “major” term (in this case “mortal”), which is the term that appears as the predicate in the conclusion:

  1. MAJOR - All human beings are mortal.

  2. MINOR - Ann is a human being.

  3. CONCLUSION - Therefore, Ann is mortal.

What is a predicate? It is simply the property or class being assigned to the subject in the last line. In our example above, the subject in the last line is Ann, and the property of Ann is that she is “mortal.” If a syllogism concluded with the words “Robert is intelligent,” then “intelligent” would be the predicate because in this sentence it is the property of the subject, “Robert.” “Intelligent” is also the major term and would appear in the first (or major) premise:

  1. MAJOR - Our faculty are intelligent.

  2. MINOR - Kerry is one of our faculty.

  3. CONCLUSION - Therefore, Kerry is intelligent.

Let’s look at the other parts of the syllogism and see how they combine to form a valid argument. The minor premise introduces the minor term (in our examples, “Ann” and “Kerry”).

  1. MAJOR - All human beings are mortal.

  2. MINOR - Ann is a human being.

  3. CONCLUSION - Therefore, Ann is mortal.

 

  1. MAJOR - Our faculty are intelligent.

  2. MINOR - Kerry is one of our faculty.

  3. CONCLUSION - Therefore, Kerry is intelligent.

The minor premise makes a connection between the minor term and the major term. It makes this connection through the “middle term,” which then disappears in the conclusion:

  1. MAJOR - All human beings are mortal.

  2. MINOR - Ann is a human being.

  3. CONCLUSION - Therefore, Ann is mortal.

 

  1. MAJOR - Our faculty are intelligent.

  2. MINOR - Kerry is one of our faculty.

  3. CONCLUSION - Therefore, Kerry is intelligent.

In summary: the major premise contains the major term, the minor premise contains the minor term, the middle term appears in both premises and links them, and the major and minor terms reappear in the conclusion as its predicate and subject.

Three Kinds of Propositions

You may have noticed by now that some of the premises refer to all members of a class, as in “All humans are mortal.” These kinds of propositions are called universal propositions. They may also take the obverse form, “No humans are immortal” or simply “Humans are mortal,” when the statement implies “all humans.” All syllogisms must have at least one universal premise. The syllogisms above have only one universal premise, but two universals are also allowed:

  1. All whales are mammals.

  2. All mammals breathe by means of lungs.

  3. Therefore: all whales breathe by means of lungs.
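The two universal premises above chain together, which can be sketched as subset relations (an illustration of my own; the animal names are invented):

```python
# Universal premises as subset relations (sketch; examples invented).
whales = {"humpback", "orca"}
mammals = whales | {"dog", "human"}        # all whales are mammals
lung_breathers = mammals | {"lungfish"}    # all mammals breathe by lungs

assert whales <= mammals                   # premise 1: whales ⊆ mammals
assert mammals <= lung_breathers           # premise 2: mammals ⊆ lung-breathers
assert whales <= lung_breathers            # conclusion, by transitivity of ⊆
print("conclusion follows")
```

The conclusion follows mechanically because the subset relation is transitive, which is exactly what two chained universal premises provide.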

It is important to note that in modern logic, universal propositions do not imply that the subject actually exists – only that if the subject exists, it would have the characteristics of the predicate. Thus, “All unicorns are white” does not imply that unicorns exist, but only that if they do, they would be white. Of course, in our everyday use of logic, we usually know that at least one member of the subject exists, as in “All faculty are human beings.”

The other two kinds of propositions are particular and singular. Particular propositions refer to some members of a class, as in “Some faculty members are female.” In logic, some means “at least one.” Singular propositions refer to a single, named individual, as in “Ann is a human being”; in syllogistic logic they are treated as universal statements about a one-member class.

Four Figures

There are four possible variations, or figures, of the categorical syllogism. The figure depends on the placements of the major, minor, and middle terms:

Figure       #1     #2     #3     #4
Major        M P    P M    M P    P M
Minor        S M    S M    M S    M S
Conclusion   S P    S P    S P    S P

(M = middle term, S = subject/minor term, P = predicate/major term)
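The four placements can also be written out programmatically. This is a sketch of my own using the S, P, M labels from the table, with each premise reduced to a (subject, predicate) pair:

```python
# The four figures as (major premise, minor premise) term templates.
# S = minor term, P = major term, M = middle term.
figures = {
    1: (("M", "P"), ("S", "M")),
    2: (("P", "M"), ("S", "M")),
    3: (("M", "P"), ("M", "S")),
    4: (("P", "M"), ("M", "S")),
}

for n, (major, minor) in figures.items():
    # In every figure the middle term appears once in each premise ...
    assert "M" in major and "M" in minor
    # ... and the conclusion is always S P.
    print(f"Figure {n}: {' '.join(major)} / {' '.join(minor)} -> S P")
```

Only the placement of the terms differs between figures; the quantifiers (all, no, some) are a separate dimension.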

Validity of Categorical Syllogism

By "valid" we mean that the reasoning from premises to conclusion is correct: if the premises are true, the conclusion must be true. An argument can be valid or invalid but not true or false (only premises and conclusions are true or false). An argument can be valid even if it contains false premises and a false conclusion. Conversely, it is possible to have an invalid argument with true premises and a true conclusion.

The goal of a good thinker is to develop syllogisms that have both true premises and validity. When we have a valid syllogism with true premises, we have what is called a "sound" argument. In sound arguments the conclusion must be true – and therein lies the beauty and usefulness of the syllogism (Kirby, 1999).

The Rules of Validity

Negation

  1. No valid syllogism can have two negative premises.

  2. If the conclusion is negative, at least one premise must be negative and vice versa.

Distribution

  1. The middle term in at least one premise must be distributed.

  2. If a term is distributed in the conclusion, it must be distributed in the premise in which it occurs.

Particularity

  1. If the conclusion is particular, at least one premise must be particular.
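The negation and particularity rules above can be checked mechanically. The sketch below uses an encoding of my own (each proposition tagged as negative and/or particular) and leaves the distribution rules aside, since those require tracking individual terms:

```python
def broken_rules(major, minor, conclusion):
    """Return the negation/particularity rules a syllogism violates.

    Each argument is a dict such as {"negative": False, "particular": False}.
    (Encoding assumed; the distribution rules are not modeled here.)
    """
    broken = []
    negatives = major["negative"] + minor["negative"]
    # Negation rule 1: no valid syllogism has two negative premises.
    if negatives == 2:
        broken.append("two negative premises")
    # Negation rule 2: negative conclusion iff exactly one negative premise.
    if conclusion["negative"] != (negatives == 1):
        broken.append("negation mismatch between premises and conclusion")
    # Particularity: a particular conclusion needs a particular premise.
    if conclusion["particular"] and not (major["particular"] or minor["particular"]):
        broken.append("particular conclusion from two universal premises")
    return broken

aff = {"negative": False, "particular": False}
neg = {"negative": True, "particular": False}
print(broken_rules(aff, aff, aff))   # the Ann syllogism: no violations
print(broken_rules(neg, neg, aff))   # violates the two-negative-premises rule
```

Passing these checks does not make an argument sound; it only rules out the formal fallacies these rules describe.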

Reasoning in Everyday Life

We interpret events, information, data, and arguments all the time. When we wake up in the morning and look to see what time it is, we interpret the numbers on the clock in order to tell whether it’s time to get up. As we walk, we interpret the speed of an approaching car in order to decide whether the driver is going to stop at a crosswalk and allow us to cross the street. When we read an article or essay, we try to determine what information we can get from it. As we perform each of these tasks, our interpretation is often accompanied by another important activity: in each case, we have to decide between two or more possible actions. When we wake up, we have to decide either to get up and go to work or to go back to sleep. As we look at the approaching car, we have to decide whether we should start to cross the street or wait until the car stops or drives by. Each of these decisions involves weighing the possibilities and then deciding which one is in our best interest.

The decision-making process requires us to make an informal argument, a case for one action or another based on the available evidence, such as whether we would rather stay in bed or go to class. Several factors might influence our decision: the instructor’s attendance policy, our attitude toward the subject taught, our current average in the class, the number of times we have missed class before, the weather, and our state of health.

While we are presented with such informal situations all the time, we are also often required to present our decision-making process to others on a more formal level. For example, let’s say that, after waking late with a cold on a rainy day, you decide not to go to work, and when you return a couple of days later, you find out that your boss refused to pay you for those days. Now you have to try to persuade your boss to pay you for your absence. The process that you use to convince your boss is similar to the one you used to decide to stay in bed.

This section discusses the basic elements of making these kinds of arguments.

Argument 101

At its most fundamental level, an argument is a claim supported by one or more reasons. By “claim” we mean a statement that you believe to be true but whose validity might be questioned by others. For example, if you missed a couple of days, you might argue that you should be paid. In this case, your claim would be:

  • I should be paid.

You believe that this statement is true, but your boss might not. However, a claim by itself is not an argument. To be an argument, you also need at least one reason why your boss should also believe this claim. In this case, your reason might be:

  • I was sick and unable to come to work.

Taken together, these two sentences compose a basic argument:

  • I should be paid.

  • I was sick and unable to come to work.

Reasons

After you check to make sure that an argument’s claim is debatable, it is important to make sure that it also has at least one logical reason to support its validity. Many factors influence whether a reason offers valid support for the argument’s claim. These include the reason’s logical connection to the claim and the rhetorical situation in which the argument is being made.

Checking the Connections

Common sense tells us that every claim must have some reason to support it and that every reason must have some legitimate connection to that claim. For example, the following argument immediately evokes our opposition:

I should be paid because I’m smart.

This argument immediately raises the question of what being smart has to do with being paid. It might also lead the boss to ask, “If you’re so smart, why didn’t you make it to work?” In this case, being smart does not rationally connect to being paid. One might as well say that I should be paid because I’m pretty or because I’m a man. These are all personal attributes that have nothing to do with whether the boss should pay you for the missed days.

Inductive Thinking

An inductive argument moves from particular premises to a universal conclusion. Inductive reasoning begins with a set of evidence or observations about some members of a class; from this evidence, we draw a conclusion about all members of the class. Because of this move from the particular to the general, the conclusions of good inductive reasoning follow from the observations only probably, not absolutely: the conclusions of inductive reasoning are not logically contained in the premises.
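The probabilistic character of induction can be sketched with the classic swan example. The data below are invented for illustration:

```python
# Induction: from observed members to the whole class (sketch; data invented).
observed = ["white"] * 10            # every swan observed so far is white
sample_rate = observed.count("white") / len(observed)
print(sample_rate)                   # 1.0 within the sample

# Yet the universal conclusion "all swans are white" may still be false:
whole_class = observed + ["black"]   # one unobserved black swan
print(all(color == "white" for color in whole_class))
```

No matter how many confirming observations accumulate, a single unobserved counterexample can overturn the generalization, which is why inductive conclusions are only probable.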

Inductive Logic

The first part of this chapter dealt with deductive arguments, which are said to be valid or invalid. In the domain of deductive logic, we had a great deal of certainty.

Fallacies in deductive logic, which occur when a person violates one of the rules of validity (for example, by arguing from two negative premises), are called formal fallacies. Deductive logic, with its truth tables and Venn diagrams, has something of a mathematical flavor.

Now we have crossed over into the area of inductive logic. In the area of inductive logic, arguments are said to be stronger or weaker. There is less certainty.

Fallacies in inductive logic are said to be informal fallacies. Informal fallacies have names. Anyone who has played chess is familiar with openings such as the Benoni counter gambit, the Stone Wall, King’s Indian, Queen’s Indian, and so on. These fallacies are like chess openings in that they can become easy to recognize with some practice. They are used on us every day by our employers, children, spouses, coworkers, politicians, advertisers, or anyone who is trying to persuade us to do something.

Informal Fallacies

Argumentum ad hominem (circumstantial) or the argument against the person.

How can Bill Clinton know anything about health care? He went to law school, not to medical school. Here, instead of attacking the president’s health care plan, the arguer attacks the president’s circumstances.

How can Bill Clinton be smart? Everybody that I have ever met who grew up in Arkansas is totally stupid.

Immigrants are not very smart. They don’t speak good English.

Bill Clinton can’t be a good president because he cheated on his wife.

A better argument might be that he is a bad husband because he cheated on his wife or that he is setting a bad example for his daughter because he cheated on his spouse.

Argumentum ad hominem (abusive) or the argument against the person, in this case by being abusive. Essentially name calling.

George Bush is a murderer. He signed off on 125 executions.

Murder is the unjust killing of an innocent person. Some would argue that Bush did not kill anyone. Just as Bill Clinton did not kill anyone in Yugoslavia. He just gave the order. Is it fair to call it murder if it is during war?

Anytime that a participant in an argument resorts to name calling it is the abusive form of argumentum ad hominem. Unless it is true. Jeffrey Dahmer was a serial killer. It would not be abusive to call him a murderer. John Wilkes Booth was an assassin. It would not be inappropriate to call him an assassin. Benedict Arnold was a traitor. It would not be inappropriate to call him a traitor.

Argumentum ad baculum [appeal to the stick (threat)] is any argument that contains a threat. The threat can be explicit (spelled out), such as, “If you say that about my mother’s apple pie again, I am going to punch you in the nose.” Or the threat can be implicit, such as when a parent says to his/her child, “Clean up your room or else.” The child does not know if it is going to be a spanking, loss of allowance, or loss of privileges, but he/she knows that he/she has been threatened.

Argumentum ad populum (bandwagon) is the everybody-is-doing-it argument. Politicians frequently use it. Everybody else is taking soft money. Teenagers also use it. All of the other kids have their own car. All of the other kids are buying $500 class rings. All of the other kids can stay out until 3:00 a.m. All of the other kids smoke marijuana.

Argumentum ad ignorantiam (argument from ignorance) is frequently used in business. I have never seen it work before. I have never seen God, so he must not exist. I have never seen a UFO, so they must not exist. I have never met an honest man.

Complex question (the “wife-beating question”) is usually illustrated by the question, “When did you stop beating your wife?” or “When did you stop copying Dave’s homework?” It is complex because there are really two questions in one: no matter what you answer, you are trapped into admitting something, because an assertion hidden inside the question accuses you of doing something wrong.

Argumentum ad verecundiam [appeal to authority (expertise)] is frequently used in advertising. Four out of five doctors who smoke, smoke Camels. Nine out of ten dentists who chew gum, chew Trident. Tiger Woods uses Titleist golf balls. Michael Jordan wears Nike sneakers. Caution: “authority” in this fallacy means expertise, not authority in the sense of a judge or a police officer.

Argumentum ad misericordiam (appeal to pity): I should have a higher grade even though I didn’t do well on the exam because I had the flu. School is hard because I am a single mother. I don’t do well at work because I am an alcoholic. I don’t do well in school because my parents are divorced. Maybe you didn’t do well because you didn’t study. Maybe you didn’t do well because you didn’t come to class. Maybe you didn’t do well because you don’t have an aptitude for the subject. But don’t blame your parents.

Post hoc ergo propter hoc (after this, therefore, because of this) involves confusion about cause and effect. I lost money at the casino because I forgot to bring my lucky rabbit’s foot. Every time I wash my car, it rains. My spouse (boyfriend or girlfriend) always starts a fight with me during final exams week. Male students frequently think that middle-aged male professors give female students higher grades because they are female. Male students also think that female teachers give higher grades to female students because they are of the same gender. It may be the case that females are just better students. Or, it may be the case that the distribution of good grades is about the same regardless of gender.

Hasty generalization. This fallacy usually takes the form of prejudice. This is taking one example and applying it to an entire category of people. When a woman cuts Jim off in traffic, he shouts, “Women drivers.” Blondes are dumb. Blondes have more fun. Chinese are clever and deceptive.

Scottish people are thrifty. My husband is Scottish, and he is definitely not thrifty. Irishmen are drunkards. I have an Irish friend, and he does not drink at all.

Practicing Inductive Thinking

Identify the informal fallacies:


  • Einstein became a great physicist because his parents and his teachers left him alone to dream. Had they badgered him to study, he never would have gotten beyond the Swiss patent office.

    • Answer: Since there is no causal connection between "leaving someone alone to dream" and "becoming a great physicist," the fallacy of false cause occurs.

  • As I drove to school this morning, not one car which was turning had its turn signal on. Thus, I conclude that the drivers in this state are not well trained since they do not ever use their turn signals.

    • Answer: The number of examples and the method of selection are not reliable methods of generalization; hence, the fallacy of converse accident occurs.

  • The best definition distinguishing man from the other animals is that man is a rational animal. Therefore, you, as a person, should spend more time studying and using your brain than you should spend having a good time.

    • Answer: Although all persons, as human beings, have rational capacities, it does not follow that in this specific case one should necessarily be rational more often--fallacy of accident is committed.

  • I can see that you are greatly impressed by the power of logic and argument. Therefore, are you going to sign up for Philosophy 102: Introduction to Philosophic Inquiry this semester or next semester? It's got to be one or the other.

    • Answer: The question presupposes that the listener will sign up for a logic course; hence, the fallacy of complex question occurs.

  • The Smithson Foundation is investigating whether or not police officers are using excessive force in traffic arrests of minorities. Hence, we may conclude that some police officers, at least, use excessive force in that kind of arrest.

    • Answer: An investigation does not entail that any evidence has been forthcoming so far. Since no evidence is adduced, one cannot justifiably come to a conclusion. The fallacy of ad ignorantiam occurs in this passage.

  • The testimony of the defendant accused of manslaughter in this indictment should be disallowed because she has been arrested for shoplifting on many occasions.

    • Answer: Strictly speaking, one should evaluate the cogency of the testimony and evaluate it on its own merit. Fallacy of ad hominem occurs because being a shoplifter does not entail not telling the truth.

  • Why haven't you written to your mother as often as you should? You would feel much better about yourself if you would attend to the details of life which are this important.

    • Answer: The supposition that the mother is not written to sufficiently often is assumed without evidence and is used as the evidence for drawing another conclusion; hence, the fallacy of complex question is committed.

  • It should be no surprise to you that the state is, again, headed into either a recession or perhaps a deep economic downturn. After all, a Republican has just been elected governor.

    • Answer: The locutor assumes, without evidence, that the election of a Republican will cause a slowing down of the economy. The fallacy of false cause occurs.

  • When I was shopping at Bess's Fine Clothing, not one person gave me the time of day. I guess Bess's is not a very friendly place to work.

    • Answer: The speaker is generalizing from one experience. More evidence would be necessary to reach the conclusion that Bess's is not a good place to work. The speaker commits the fallacy of converse accident.

  • John Bardeen, a professor at the Advanced Institute of Physics, has gone on record to say that the American Medical Association needs to raise its standards for physicians. The opinion of a man of that brilliance should not be disregarded.

    • Answer: An authority in physics is being cited outside of his field of expertise. The ad verecundiam fallacy occurs.

  • If we took a poll right now, almost every American would agree that a vaccine for AIDS will soon be found. Therefore there can be little doubt that AIDS will be practically wiped out in the near future.

    • Answer: Simply from the fact that most persons believe a statement is true, it does not follow logically that the statement is true--ad populum fallacy.

  • I made low grades on my first tests in math and English. I must really be dumb.

    • Answer: Too few examples are used to justify such a conclusion; fallacy of converse accident is committed.

  • When I was four, my father taught me the beauty of numbers, and I have excelled at mathematics ever since. My conclusion on why females do not score as high on math tests? The males with a high aptitude for mathematics are not spending enough time with their daughters.

    • Answer: The author of this example assumes that her case would be typical of all or most other daughters if they had had similar experiences. The fallacy of converse accident occurs.

  • I think that the tests given in this class were more than fair, and I think you will agree with me because, if you do not, your grade in this course will certainly be in jeopardy.

    • Answer: The threat of a poor grade is logically unrelated to the fairness of tests; hence the ad baculum fallacy is committed.

  • The result of my doing well in economics is very simple. I eat Post Toasties every morning for breakfast, and this breakfast helps my ability to analyze in great depth. I think it must be all those complex carbohydrates.

    • Answer: No causal evidence is given for the relation between eating a breakfast cereal and ability to analyze, so the fallacy of false cause occurs.

  • Oriental Philosophy is the best course taught at Lander University. I know this because all of my friends say so.

    • Answer: Although most friends think so, that doesn't make it so. Ad populum fallacy occurs.

  • Look Mr. IRS examiner, of course I owe taxes--I'm not denying that. However, I was unable to file on time because my wife was sick and my two children need my attention. Surely the IRS is not opposed to keeping the family together.

    • Answer: The unfortunate circumstances of the taxpayer are logically independent of his responsibility to pay his taxes--ad misericordiam fallacy.

  • Mr. Smith, maybe there is some truth in what you say about me being rude to sales people, but I have certainly heard many sales people complain about your manners, so you are certainly not the person to point this out to me.

    • Answer: The ad hominem variation of "you're another" or tu quoque is offered.

  • Sir, don't you want to look more closely at our aluminum siding for your new home? When we put this up, your home will take on the glow of beauty, and you will be admired by others as someone who cares. Not only that, but your life will be richer as you invite with pride others to your home to share the better way of life.

    • Answer: Some logicians would classify this passage as an instance of the ad populum fallacy. However, the passage is regarded here as "rhetoric and persuasion."

  • It is easy to see that goodness is in the world and not just in our minds because as we look at the world, some things are obviously not evil in themselves.

    • Answer: Although "good" and "evil" are not complementary classes, this passage can be analyzed as petitio principii, since ceteris paribus the meanings are similar enough to constitute circular reasoning. If this analysis is acceptable, then, in a sense, this fallacy turns on the fallacy of false dichotomy.

  • Mr. Watkins has clearly and concisely detailed his arguments concerning the relative safety of tobacco products for third world countries. But, let me remind you that we could hardly expect him to say anything else because he has worked in the tobacco industry for the last twenty years.

    • Answer: Although Mr. Watkins worked for the tobacco industry, it does not follow necessarily that he does not speak the truth. One might even offer the argument that his expertise is actually relevant to the subject of the argument. Fallacy of ad hominem is committed.

  •  All persons act in order that they might get pleasure. Even so-called altruistic persons who help others so much that they do almost nothing for themselves get pleasure out of giving. Otherwise, they wouldn't do it. Suppose a person hits himself over the head with a hammer. He must get pleasure from it because why else would he do it if he didn't get pleasure from it?

    • Answer: The fallacy is petitio principii or circular argument. The premise that all persons act from the motive of pleasure is the same statement as the conclusion.

  • The Roper Organization says that more persons watch CBS's 60 Minutes than any other news program on television. Therefore, it must be the best news programming on TV.

    • Answer: Simply because a program is popular, the conclusion doesn't logically follow that the program is the best--unless, of course, one defines "best" as "the most popular" as is sometimes done in marketing. Logically speaking, the fallacy of ad populum occurs.

  • Hilda Robinson, an old backwoods, ignorant lady who never got past the fourth grade in school, claims that chicken soup is good for a cold. What does she know? She is ignorant of the scientific evidence.

    • Answer: The attack on character and circumstances is characteristic of the ad hominem fallacy.

  • Watch the Business Report at 7:00 on channel 6. It's the best report on current dealings on Wall Street because no comparative study of business reports has ever proved to our satisfaction that there is any better.

    • Answer: From the fact that a conclusion has not been proved, no other conclusion can be drawn. This passage illustrates one common version of the ad ignorantiam fallacy.

Abductive Thinking

Abductive thinking reasons from a result and a generalization to propose a specific case. It is more closely aligned with deductive than with inductive thinking, though it involves both at times, and it tends to be used in finding causes, as well as in fields of inquiry such as artificial intelligence. For example, given that the children on my block are tall (result) and that tall parents have tall children (generalization), I may propose that their parents are most likely tall (case).

Abduction is a method of logical inference introduced by Charles Sanders Peirce which comes prior to induction and deduction and for which the colloquial name is guessing. Abductive reasoning starts when an inquirer considers a set of seemingly unrelated facts, armed with the hunch that they are somehow connected. The term "abduction" is commonly presumed to mean the same thing as hypothesis; however, an abduction is actually the process of inference that produces a hypothesis as its end result. It is used in both philosophy and computing.

Abductive Thinking – A Foundation for Creativity … No it’s not about aliens, but the concept is alien to many. Up until now we’ve talked about Deductive and Inductive Logic. There’s another “thinking” mechanism that many believe is part of the foundation of creativity. It is called Abductive Reasoning. Here’s the short version.

Deduction goes like this: All marbles in your bucket are red. If you pick a marble from your bucket, it will be red. If the premises are true, the conclusion must be true.

Induction goes like this: You have pulled 10 marbles from a bucket, and they have all been red. You plan to pull out another marble. It will probably be red. The outcome has a probability associated with it. Almost all of our everyday thinking is based on induction.

Abduction works sort of the other way around. You find a red marble. You observe that there is a bucket of red marbles on the other side of the room. You conclude that the red marble probably came from the bucket with red marbles. It’s a guess but not based on either deductive or inductive logic. There is no direct evidence, no set of experiments, no direct cause and effect, but a hunch, a guess, a feeling, a “relationship” that is “seen.” In HeadScratching we call this process “Association.”
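The three marble scenarios above can be laid side by side in code. This is a sketch; the bucket contents and names are assumptions for illustration:

```python
# Deduction: rule + case -> result (certain, given the premises).
bucket = ["red"] * 20                      # all marbles in the bucket are red
picked = bucket[0]                         # you pick a marble from the bucket
assert picked == "red"                     # it must be red

# Induction: repeated results -> rule (probable, not certain).
draws = ["red"] * 10                       # ten draws, all red so far
print(f"next draw probably red ({draws.count('red')}/{len(draws)} observed)")

# Abduction: result + rule -> case (a plausible guess at the source).
found = "red"                              # you find a red marble
buckets = {"near_bucket": "red", "far_bucket": "blue"}
guesses = [name for name, color in buckets.items() if color == found]
print(guesses)                             # the likely source(s), not a proof
```

The three sections use the same rule ("marbles from that bucket are red") in three different directions, which is the whole contrast the marble story is making.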

Deduction allows deriving b as a consequence of a. In other words, deduction is the process of deriving the consequences of what is assumed. Given the truth of the assumptions, a valid deduction guarantees the truth of the conclusion; it is true by definition and is independent of sense experience. For example, if it is true (given) that the sum of the angles is 180° in all triangles, and a certain triangle has angles of 90° and 30°, then it can be deduced that the third angle is 60°.

Induction allows inferring that a entails b from multiple instances of a occurring together with b. Induction is thus the process of inferring a probable general rule from observing many particular cases, and an inductive statement requires empirical evidence to be accepted as true. For example, the statement “it is snowing outside” cannot be accepted until one looks or goes outside to see whether it is true. Induction requires sense experience. Abduction allows inferring a as an explanation of b: the precondition a is inferred from the consequence b. Deduction and abduction thus differ in the direction in which a rule like “a entails b” is used for inference. As such, abduction is formally equivalent to the logical fallacy of affirming the consequent, because there are multiple possible explanations for b.

Abductive Reasoning is best explained as the development of a hypothesis which, if true, would best explain the presented evidence.

Because it infers cause from effect, abduction is akin to the logical fallacy Post hoc ergo propter hoc (after this, therefore because of this).

Charles Peirce first introduced abductive thinking to modern logic, using it in his work before 1900 to describe the use of a known rule to explain an observation.

Abductive validation, also called reasoning through successive approximation, is the validation of one’s hypothesis through abductive reasoning. According to abductive validation, an explanation is valid if it gives the “best” explanation for the data (best being defined in terms of elegance and simplicity, i.e., Occam’s razor). In science, abductive validation is common practice in hypothesis formation.
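Returning to the marble example, abductive validation can be sketched as picking the simplest hypothesis among those that explain the evidence. The hypotheses and their complexity scores below are invented for illustration; in practice "simplicity" is much harder to quantify:

```python
# Each hypothesis: (description, explains_the_evidence, complexity score).
hypotheses = [
    ("the marble came from the red-marble bucket", True, 1),
    ("a visitor dropped a red marble and left",    True, 3),
    ("the marble spontaneously appeared",          False, 5),
]

# Crude Occam's razor: of the hypotheses that explain the evidence,
# choose the one with the lowest complexity.
best = min((h for h in hypotheses if h[1]), key=lambda h: h[2])
print(best[0])  # -> the marble came from the red-marble bucket
```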

The “Doubting Game”

This approach to problem solving emphasizes argument and a rigidly reductive method of rationality: problem definition, problem analysis, presentation and evaluation of alternatives, detailing of solutions. The doubting game forces one to poke holes in ideas, tear apart assertions, probe continually, and be analytical. Careful development of generalizations in research requires an emphasis on the doubting game. Its view of rationality makes a person feel “rigorous, disciplined, tough minded.” Conversely, if a person refrains from playing the doubting game, he or she feels “unintellectual, irrational, sloppy.”

In his book Writing Without Teachers, Peter Elbow introduces the concept of the “believing” and “doubting” games--complementary methods of approaching texts which he claims are both vital to the “intellectual enterprise.”

Elbow says that most academics or intellectuals are obsessed with one method of approaching new texts and ideas--the doubting game--at the expense of the other. The doubting game allows you to approach a text “critically,” to look for errors and contradictions; it is a game of “self-extrication” from a text's underlying assumptions and conclusions, which you flush out into the open with your hard-headed, scientific skepticism. In Elbow's words, “The truer it seems, the harder you have to doubt it.” By playing the "doubting game," you can come to realize your own opinions and positions by reacting against those of another writer, by engaging in what Elbow calls a “dialectic of propositions.”

Elbow’s article is a plea for a more balanced approach that also includes the “believing game.” Rather than extricating yourself from the text, the believing game allows you to project yourself into a writer's point of view, to try it on for size, to “try to have that experience of meaning.” You intentionally believe everything--taking in a text, as Elbow says, the way an owl eats a mouse--and trust your “organism” eventually to sort out the useful from the unuseful.

Ultimately, the philosophy of the believing game sees ideas not as inherently true or false, but as tools: “By believing an assertion,” writes Elbow, “we can get farther and farther into it, see more and more things in terms of it or 'through' it, use it as a hypothesis to climb higher and higher to a point from which more can be seen and understood.”

The “Believing Game”

In this approach to problem solving, the first rule is to refrain from doubting. It initially believes all assertions so as to see further into them. Effective planning and design require the believing game in order to engender “breakthrough” solutions. The believing game is complementary, not contradictory, to the doubting game. That is, successful solution finding requires use of both approaches. Yet the believing and doubting games cannot be played simultaneously. Each must be employed at different times in the solution-finding process.

The Believing Game is based on the concept of “methodological belief,” developed by educator Peter Elbow, which asserts that the emphasis on critical, deductive thinking reinforces doubt and promotes the one right answer, or “true” conclusion.

The Believing Game is a methodology of thinking that entails believing everything about a given idea or object even if the parts seemingly or actually contradict. If it somehow does not work, a way must be found to MAKE it work.

Imagine that you see something in an inkblot that interests and pleases you - but your colleagues or classmates don’t see what you see. In fact, they think you are crazy or disturbed for seeing it. What would you do if you wanted to convince them that your interpretation makes sense?

If it were a matter of geometry, you could prove you are right (or wrong!). But with inkblots, you don’t have logic’s leverage. Your only hope is to get them to enter into your way of seeing - to have the experience you are having. You need to get them to say the magic words: “Oh now I see what you see.”

This means getting them to exercise the ability to see something differently (i.e., seeing the same thing in multiple ways), and also the willingness to risk doing so (not knowing where it will lead). In short, you need them to be flexible both cognitively and emotionally. You can’t make people enter into a new way of seeing even if they are capable of it. Perhaps your colleagues or classmates are bothered by what you see in the inkblot. Perhaps they think it’s aberrant or psychotic. If you want them to take the risk, your only option is to set a good example and show that you are willing to see it the way they see it.

From Inkblots to Arguments

 What do you see in this picture?

Interpreting inkblots or any ambiguous stimuli is highly subjective, but the process serves to highlight how arguments also have a subjective dimension. Few arguments are settled by logic. Should we invade countries that might attack us? Should we torture prisoners who might know what we need to know? Should we drop a nuclear bomb on a country that did attack us? And by the way, what grade is fair for this paper or this student? Should we use grades at all?

There is significant force in logic. Logic can uncover a genuine error in someone’s argument. But logic cannot uncover an error in someone’s position. Even if we could have proven that Iraq had no weapons of mass destruction, that wouldn’t have proven that it was wrong to invade Iraq. “We should invade Iraq” is a claim that is impossible to prove or disprove. We can use logic to strengthen arguments for or against the claim, but we cannot prove or disprove it. Over and over we see illogical arguments for good ideas and logical arguments for bad ideas. We can never prove that an opinion or position is wrong - or right. No wonder people seldom change their minds when someone finds bad reasoning in their argument. (By the same token - or at least a very similar token - it is impossible to prove or disprove the interpretation of a text.)

This explains a lot about how most people deal with differences of opinion:

  • Some people love to argue and disagree, and they do it for fun in a friendly way. They enjoy the disagreement and the give-and-take, and they let criticisms and even attacks roll right off their backs. It’s good intellectual sport for them.

  • Some people look like they enjoy the sport of argument. They stay friendly and rational---they’re “cool”---because they’ve been trained well. “Don’t let your feelings cloud your thinking.” But inside they feel hurt when others attack ideas they care about. They hunker down into their ideas behind hidden walls.

  • Some people actually get mad, raise their voices, dig in, stop listening, and even call each other names. Perhaps they realize that language and logic have no power to make their listeners change their minds---so they give in to shouting or anger.

  • And some people - seeing that nothing can be proven with words - just give up on argument. They retreat. “Let’s just not argue. You see it your way, I’ll see it my way. That’s the end of it. There’s no use talking.” They sidestep arguments and take a relativist position: any opinion is as good as any other opinion. (It’s worth pondering why so many students fall into this attitude.)

But sometimes people actually listen to each other and come to see real merit in opinions they started off fighting about. Through listening to someone else’s views, they do something amazing: they actually change their thinking. Sometimes strong differences of opinion are resolved---even heated arguments.

When this happens people demonstrate the two inkblot skills: the ability and the willingness to see something differently - or in this case to think or understand something differently. (People often say “I see” when they “understand” something differently). These are precious skills, cognitive and psychological. You won’t have much luck encouraging them in other people unless you develop them in yourself.

With inkblots, the risk seems small. If you manage to see a blot the way a classmate or colleague sees it, you don’t have to say, “Stupid me. I was wrong.” It’s “live and let live” when you are dealing with inkblots. With arguments, however, it feels like win or lose. You often want people not just to understand your position; you often want them to give up their (“wrong, stupid”) position.

Inkblots were used earlier to look for the subjective dimension in most arguments (given that logic cannot prove or destroy a position). Now inkblots can teach you something else. They can teach you that there’s actually a “live-and-let-live” dimension in many arguments - probably most. But people often feel arguments as win/lose situations because they so naturally focus on how their side of an argument differs from the other person’s side. They assume that one person has to say, “Stupid me. I was wrong.”

The believing game will help you understand ideas you disagree with, and thereby help you see that one need not lose or give up one’s central idea. The believing game can help you see that both sides in an argument are often right; or that both are right in a sense; or that both positions are implicitly pointing to some larger, wiser position that both arguers can agree on.

What is the Believing Game?

In a sense, it was already explained with the analogy between inkblots and arguments. Now, contrasting it with the doubting game will shed even more light on the subject.

The doubting game represents the kind of thinking most widely honored and taught. It’s the disciplined practice of trying to be as skeptical and analytical as possible with every idea we encounter. By doubting well, we can discover hidden contradictions, bad reasoning, or other weaknesses in ideas that look true or attractive. We scrutinize with the tool of doubt. This is the tradition that Walter Lippmann invokes:

The opposition is indispensable. A good statesman, like any other sensible human being, always learns more from his opponents than from his fervent supporters. For his supporters will push him to disaster unless his opponents show him where the dangers are. So if he is wise he will often pray to be delivered from his friends, because they will ruin him. But, though it hurts, he ought . . . to pray never to be left without opponents; for they keep him on the path of reason and good sense.

The widespread veneration of “critical thinking” illustrates how our intellectual culture venerates skepticism and doubting. (“Critical thinking” is a fuzzy, fad term, but its various meanings usually appeal to skepticism and analysis for the sake of uncovering bad thinking. When people call a movement “critical linguistics” or “critical legal studies,” they are saying that the old linguistics or legal studies are flawed by being insufficiently skeptical or critical---too hospitable to something that’s wrong.)

The believing game is the mirror image of the doubting game or critical thinking. It’s the disciplined practice of trying to be as welcoming as possible to every idea we encounter: not just listening to views different from our own and holding back from arguing with them, but actually trying to believe them. We can use the tool of believing to scrutinize not for flaws but to find hidden virtues in ideas that are unfashionable or repellent. Often we cannot see what’s good in someone else’s idea (or in our own!) until we work at believing it. When an idea goes against current assumptions and beliefs---or seems alien, weird, dangerous---or if it’s poorly formulated---we often cannot see any merit in it.

“Believing” is a Scary Word

Many people get nervous when they hear ‘believing.’ They point to an asymmetry between our sense of what “doubting” and “believing” mean. Believing seems to entail commitment whereas doubting does not. It commonly feels as though we can doubt something without committing ourselves to rejecting it---but that we cannot believe something without committing ourselves to accepting it and even living by it. Thus it feels as though we can doubt and remain unscathed, but believing will scathe us. Indeed believing can feel hopelessly bound up with religion. (“Do you BELIEVE? Yes, Lord, I BELIEVE!”)

This contrast in meanings is a fairly valid picture of natural, individual acts of doubting and believing. (Though I wonder if doubting leaves us fully unchanged.) But it’s not a picture of doubting and believing as methodological disciplines or unnatural games. Let me explain the distinction.

Natural individual acts of doubting happen when someone tells us something that seems dubious or hard to believe. (“You say the earth is spinning? I doubt it. I feel it steady under my feet.”) But our culture has learned to go way beyond natural individual acts of doubting. We humans had to struggle for a long time to learn how to doubt unnaturally as a methodological discipline. We now know that for good thinking, we must doubt everything, not just what’s dubious; indeed the whole point of critical thinking is to try to doubt what we find most obvious or true or right (as Lippman advises).

In order to develop systematic doubting, we had to overcome believing: the natural pull to believe what's easy to believe, what we want to believe, or what powerful people tell us to believe. (It’s easy to believe that the earth is stationary.) As a culture, we learned systematic doubting through the growth of philosophical thinking (Greek thinkers developing logic, Renaissance thinkers developing science, and Enlightenment thinkers pulling away from established religion). And we each had to learn to be skeptical as individuals, too - for example, learning not to believe that if we are very good, Santa Claus/God will bring us everything we want. As children, we begin to notice that naïve belief leads us astray. As adults we begin to notice the dreadful things that belief leads humans to do - like torturing alleged witches/prisoners until they “confess.”

Now that you have finally learned systematic doubting with its tools of logic and strict reasoning and its attitude of systematic skepticism - critical thinking - you are likely to end up afraid of believing itself. You had to learn to distrust natural believing (“My parents/country/God will take care of me whenever I am in need.”). So believing can seem a scary word because our culture has not yet learned to go beyond natural acts of naïve believing to develop unnatural believing as a methodological discipline. In short, the believing game is not much honored or even known (though it’s not new).

The methodology of the doubting game gives you a model for the methodology of the believing game. When the doubting game asks you to doubt an idea, it doesn’t ask you to throw it away forever. You couldn’t do that because the game teaches you to doubt all ideas, and you’ll learn to find weaknesses even in good ideas. You can’t throw all ideas away. The scrutiny of doubt is methodological, provisional, conditional. So when a good doubter finally decides what to believe or do, this involves an additional act of judgment and commitment. The doubting game gives good evidence, but it doesn’t do our judging and committing for us.

Similarly, when the believing game asks you to believe all ideas - especially those that seem most wrong - it cannot ask you to marry them or commit yourself to them. Your believing is also methodological, conditional, provisional - unnatural. (It’s hard to try to believe conflicting ideas all at once, but we can try to enter into them one after another.) And so too, if you commit yourself to accepting an idea because the believing game helped you see virtues in it, this involves an additional act of judgment and commitment. The believing game gives you good evidence, but it doesn’t do your deciding for you.

A Surprising Blind Spot for the Doubting Game

The flaws in your own thinking usually come from your assumptions - ways of thinking you accept without noticing. But it’s hard to doubt what you can’t see because you unconsciously take it for granted. The believing game comes to the rescue here. Your best hope for finding the invisible flaws in your own thinking is to enter into different ideas or points of view - ideas that carry different assumptions. Only after you’ve managed to inhabit a different way of thinking will your currently invisible assumptions become visible to you.

This blind spot in the doubting game shows up frequently in classrooms and other meetings. When smart people are trained only in critical thinking, they get better and better at doubting and criticizing other people’s ideas. They use this skill particularly well when they feel a threat to their own ideas or their unexamined assumptions. Yet they feel justified in fending off what they disagree with because they feel that this doubting activity is “critical thinking.” They take refuge in the feeling that they would be “unintellectual” if they said to an opponent what in fact they ought to say: “Wow, your idea sounds really wrong to me. It must be alien to how I think. Let me try to enter into it and see if there’s something important that I’m missing. Let me see if I can get a better perspective on my own thinking.” In short, if we want to be good at finding flaws in our own thinking (a goal that doubters constantly trumpet), we need the believing game.

The Believing Game is Not Actually New

If you look closely at the behavior of genuinely smart and productive people, you will see that many of them have exactly this skill of entering into views that conflict with their own. John Stuart Mill is a philosopher associated with the doubting game, but he also advises good thinkers to engage in the central act of the believing game:

[People who] have never thrown themselves into the mental position of those who think differently from them . . . do not, in any proper sense of the word, know the doctrine which they themselves profess. (129)

Yet this skill of sophisticated unnatural belief is not much understood or celebrated in our culture - and almost never taught.

Imagine, for example, a seminar or a meeting where lots of ideas come up. One person is quick to point out flaws in each idea as it is presented. A second person mostly listens and gets intrigued with each idea - and tends to make comments like these: “Oh I see” and “That’s interesting” and “Tell me more about such and such” and “As I go with your thinking, I begin to see some things I never noticed before.” This second person may be appreciated as a good listener, but the first person will tend to be considered smarter and a better thinker because of that quick skill at finding flaws.

Some people used to feel that they were unintelligent because when one person gave an argument they would feel, “Oh, that’s a good idea,” but then when the other person argued the other way, they found themselves feeling, “Oh, that sounds good, too.” They may have wondered what was the matter with their loose, sloppy minds that let them agree with people and ideas that are completely at odds with each other. The “smart people” tended to argue cleverly and find flaws that may not be so obvious. But the ability to play the believing game is not just “niceness” or sloppy thinking; it is a crucial intellectual strength rather than a weakness - a discipline that needs to be taught and developed.

There is nothing wrong with the doubting game. You need the ability to be skeptical and find flaws. Indeed, the doubting game probably deserves the last word in any valid process of trying to work out trustworthy thinking. For even though the scrutiny of belief may lead us to choose a good idea that most people at first wanted to throw away, nevertheless, we mustn’t commit ourselves to that idea before applying the scrutiny of doubt to check for hidden problems.

But it shouldn’t dominate thinking. People need both disciplines.

Concrete Ways to Learn to Play the Believing Game

If you want to learn good thinking skills, it helps to notice the inner stances - the cognitive and psychological dispositions - you need for doubting and believing:

  • If you want to doubt or find flaws in ideas that you are tempted to accept or believe (perhaps they are ideas that “everyone knows are true”), you need to work at extricating or distancing yourself from those ideas. There’s a kind of language that helps here: clear, impersonal sentences that lay bare the logic or lack of logic in them.

  • If, on the other hand, you want to believe ideas that you are tempted to reject (“Anyone can see that’s a crazy idea”) - if you are trying to enter in or experience or dwell in those ideas - you benefit from the language of imagination, narrative, and the personal experience.

Here are some specific practices to help you experience things from someone else’s point of view.

  1. If people are stuck in a disagreement, you can invoke Carl Rogers’ application of “active listening.” John must not try to argue his point until he has restated Mary’s point to her satisfaction.

  2. But what if John has trouble seeing things from Mary’s point of view? His lame efforts to restate her view show that “he doesn't get it.” He probably needs to stop talking and listen; keep his mouth shut. Thus, in a discussion where someone is trying to advance a view and everyone fights it, there is a simple rule of thumb: the doubters need to stop talking and simply give extended floor time to the minority view. The following three concrete activities give enormous help here:

    • The three-minute or five-minute rule. Any participant who feels he or she is not being heard can make a sign and invoke the rule: no one else can talk for three or five minutes. This voice speaks, we listen; we cannot reply.

    • Allies only - no objections. Others can speak---but only those who are having more success believing or entering into or assenting to the minority view. No objections allowed. (Most people are familiar with this “no-objections” rule from brainstorming.)

    • “Testimony session.” Participants having a hard time being heard or understood are invited to tell stories of the experiences that led them to their point of view and to describe what it's like having or living with this view. Not only must the rest of us not answer or argue or disagree while they are speaking; we must refrain, even afterwards, from questioning their stories or experiences or feelings. We may speak only to their ideas. (This process is particularly useful when issues of race, gender, and sexual orientation are being discussed.)

    • The goal here is safety. Most speakers feel unsafe if they sense we are just waiting to jump in with all our objections. But we listeners need safety, too. We are trying to enter into a view we want to quarrel with or feel threatened by. We’re trying to learn the difficult skill of in-dwelling. It’s safer for us if we have permission simply not to talk about it anymore for a while. We need time for the words we resist just to sink in for a while with no comment.

  • The language of story and poetry helps us experience alien ideas. Stories, metaphors, and images can often find a path around our resistance. When it’s hard to enter into a new point of view, try telling a story of someone who believes it; imagine and describe someone who sees things this way; tell the story of events that might have led people to have this view of the world; what would it be like to be someone who sees things this way? Write a story or poem about the world that this view implies.

  • Step out of language. Language itself can sometimes get in the way of trying to experience or enter into a point of view different from our own. There are various productive ways to set language aside. We can draw or sketch images (rough stick figures are fine). What do you actually see when you take this position? It’s also powerful to use movement, gesture, dance, sounds, and role-playing.

  • Silence. For centuries, people have made good use of silence for in-dwelling. If we’re having trouble trying to believe someone’s idea, sometimes it’s helpful for no one to say anything for a couple of minutes. That’s not much time out of a meeting or conference or class hour, but it can be surprisingly fertile.

  • Private writing. There's a kind of silence involved when everyone engages in private writing. Stop talking and do 7-10 minutes of writing for no one else’s eyes. What's crucial is the invitation to language in conditions of privacy and safety.

  • Use the physical voice. When it’s hard to enter into a piece of writing that feels difficult or distant, for example something written by someone very different from us---or an intricate work like a Shakespeare sonnet---it helps to try to read it aloud as well and meaningfully as possible. (When I’m teaching a longer text, I choose crux passages of a few paragraphs or a page.) The goal is not good acting; the goal is simply to say the words so that we feel every meaning in them---so that we fully mean every meaning. Get the words to “sound right” or to carry the meanings across—for example, to listeners who don’t have a text. After we have three or four different readings of the same passage, we can discuss which ones manage to “sound right”---and usually these readings help us enter in or assent. (It’s not fair to put students on the spot by asking them to read with no preparation time. I ask students to prepare these readings at home or practice them briefly in class in pairs.)

  • This activity illustrates something interesting about language. It’s impossible simply to say words so they “sound right" without dwelling in them and thus feeling their meaning. So instead of asking students to “study carefully” this Shakespeare sonnet, I say, “Practice reading it aloud until you can say every word with meaning.” This involves giving a kind of bodily assent.

  • Nonadversarial argument. Finally, the classroom is an ideal place to practice nonadversarial forms of argument. Our traditional model of argument is a zero-sum game: “If I'm right, you must be wrong.” Essays and dissertations traditionally start off by trying to demolish the views of opponents. “Unless I criticize every other idea,” the assumption goes, “I won’t have a clear space for my idea.” But this approach is usually counterproductive--except with readers who already agree with you and don’t need to be persuaded. This traditional argument structure says to readers: “You cannot agree with my ideas---or even hear them---until after you admit that you’ve been wrong or stupid.”

The structure of nonadversarial argument is simple, but it takes practice and discipline: argue only for your position, not against other positions. This is easy for me here since I have no criticisms at all of the doubting game or critical thinking in itself. It’s much harder if I really hate the idea I’m fighting. It’s particularly hard if my essential argument is negative: “Don’t invade Iraq.” So yes, there are some situations in which we cannot avoid arguing why an idea is wrong. Yet even in my position on Iraq, there is, in fact, some space for nonadversarial argument. I can talk about the advantages of not invading Iraq---and not try to refute the arguments for invasion. In this way, I would increase the chances of my opponent actually hearing my arguments.

The general principle is this: If all I have to offer are negative reasons why the other person’s idea is bad, I’ll probably make less progress than if I can give some positive reasons for my alternative idea---and even acknowledge why the other person might favor his/her idea.

Look back at the inkblots. Arguments that look conflicting might both be somehow valid or right. They might need to be articulated better or seen from a larger view - a view the disputants haven't yet figured out. I may be convinced that someone else’s idea is dead wrong, but if I’m willing to play the believing game with it, I will not only set a good example, but I may even be able to see how we are both on the right track. Nonadversarial argument and the believing game help us work out larger frames of reference and better ideas.

Automatic Thinking

Definition: Automatic thoughts are just what the name implies. They are the thoughts that occur constantly as our minds seek to narrate what is going on around us.

Automatic thinking occurs without conscious thought. It enables people to act swiftly and effectively in a wide variety of situations, on the basis of relatively permanent cognitive structures known variously as mental maps, scripts, schemata, belief structures, or theories of action.

It includes:

  • Reflexes

  • Involuntary bodily actions

  • Assumptions

  • Stereotypes

  • Labeling

  • Actions that have become “second nature” over time.

Everything we think is an automatic thought. A problem arises when our automatic thoughts manifest as cognitive distortions. Cognitive distortions are automatic thoughts that are based on deeply ingrained core beliefs, and they are irrational reactions we habitually have to situations. We often don’t even know that we see the world in terms of these cognitive distortions. Just as the name implies, they are based on faulty reasoning. There are several types of common cognitive distortions:

Assumptions: An assumption is a proposition that is taken for granted - treated as true by presupposition, without a preponderance of facts.

The natural thing to do is the thing you have always done. Every time you approach a problem, you bring your accumulated experience, knowledge, and training to bear on it. But this includes your accumulated assumptions and biases – conscious and unconscious. The more experienced and expert you are, the more likely you are to assume outcomes by extrapolating from the known facts and experiences to predict a result. This mental baggage can prevent you from accepting innovative ideas.

Sometimes the way you frame a problem contains an assumption that prevents you from solving it. In the Middle Ages, the definition of astronomy was the “study of how the heavenly bodies move around the Earth”; that is, the Earth was considered to be the center of the universe, which resulted in a chain of wrong explanations of various phenomena.

Overgeneralization: As we go through life, we learn from our experiences. It is a natural process of trial and error. Problems arise when we lump all similar experiences together and decide that all experiences of a certain nature will always turn out the same way. See the uses of the words “all” and “always” in that last sentence? That’s a hint at overgeneralization.

Stereotypes: a fixed set of characteristics people tend to attribute to all members of certain groups. One way to simplify things is to organize people into groups. Stereotypes enable you to make quick judgments, but these are often wrong.

Examples:

  • Ethnic stereotypes. They may have changed, or they may just have gone underground. Substantial changes in stereotypes were reported between 1932 and 1967. Compared with 1932 undergraduates, fewer 1967 undergraduates characterized Americans as industrious or intelligent, Italians as artistic or impulsive, blacks as superstitious or lazy, and Jews as shrewd or mercenary. However, awareness that negative stereotyping is bigoted and socially undesirable also increased over that period, so the later reports may be biased by attempts to hide bigotry.

  • Gender stereotypes. Males are considered more independent, dominant, aggressive, scientific, and stable in handling crises. Females are seen as more emotional, sensitive, gentle, helpful, and patient. People also have stereotypes of feminists - one study showed feminists were assumed to be less attractive even though that was not the case.

Labeling: It can take the form of making sweeping overgeneralizations about a group of people based on the actions of only a few of them. It can also manifest as self-labeling. Self-labeling can have extremely negative effects. If a person gets a bad grade on a math test and automatically says, “I’m a bad math student,” that person won’t take the steps necessary to improve his/her math skills. “Bad math student” is a label that the student has applied to him/herself, and it most likely is not true.

Other examples: Mind Reading: “They Must Think I’m an Idiot!”

Fortune Telling: “There’s no way they’ll consider me for this job.”

Emotional Reasoning: “I must not be very well prepared. Otherwise, I wouldn’t be so nervous. I’m going to make a fool out of myself!”

Shoulding: “I should be doing THAT instead of doing this.”

Blame: “It's my fault that the world exploded and fifty billion people died.” Blaming yourself for something that is beyond your control.

Automatic thinking is like the experience of learning to drive a stick-shift car. Did you learn to drive a standard transmission? Do you remember what it was like? You had to concentrate very hard on coordinating your hands and your feet. With one hand on the steering wheel and one on the gearshift, you had to use your foot to press the clutch at just the right moment to change gears. If it was not done smoothly, you either stalled or did the “bunny hop” of starting and stopping down the street. With practice you improved, and today you can drive across town, play a CD, talk on the phone, and arrive alive without really knowing how you did it. You don’t have to think about it anymore because it is automatic. Changing gears and steering happen outside your awareness. You only pay close attention in an emergency.

Your thinking is the same. You have practiced certain thoughts and reactions so many times that they are automatic. You make a mistake and immediately think, “I can’t do anything right.” A coworker is rude and you say to yourself, “She’s out to get me again.” Someone cuts you off in traffic, and you think, “What an idiot.” These automatic responses produce a mood, influence your behavior, and create your reality. You never stop and challenge what you think because you are not aware of the process. What you tell yourself just seems to be the truth: you really do always mess up, your coworker really is out to get you, the world really is full of jerks. But there are other possibilities. If you learn to pay attention to what you say to yourself, then you can challenge it. You can make different choices. Is it true that you always make mistakes? Could the boss have just criticized your coworker, and that is why she was upset? Was the other driver rushing to the hospital? If you see another possibility, then you get another reality and a different experience. Automatic thinking can create trouble that you don’t need. It can create a distressing reality that is based upon your habitual thinking.

Yet the very advantages of automatic thinking for dealing with routine situations are likely to become disadvantages in the face of change and uncertainty. Automatic thinking leads people to see only what they already know, to ignore crucial information, and to rely on standard behavioral repertoires when change is necessary. Learning from experience requires switching to a more conscious, reflective mode; yet researchers have noted that people often resist making this switch precisely when it is most called for. Reflective thinking is especially useful under conditions of ambiguity and threat.

Automatic thinking also means mastery of certain skills. Once a skill becomes automatic, you can perfect it in your own creative way.

Automatic thinking means ‘thinking inside the box.’ Before you start thinking ‘outside the box,’ you need to know what is inside the box. If you want to become an artist, you need to practice drawing, painting, or computer graphics until it becomes automatic.

What you can do to overcome negative sides of automatic thinking:

  • Learn to pay more attention to what you say to yourself.

  • Look for other explanations for what happened.

  • Consider the possibility that your first thought may not be right.

  • Challenge your thinking and give life the best meaning that you can.

  • Enliven life by looking for fresh possibilities rather than falling back on old, automatic habits.

Reversal Thinking

Testing assumptions is probably the second most important creative thinking principle because it is the basis for all creative perceptions. You see only what you think you see. Whenever you look at something, you make assumptions about reality. Optical illusions - one form of creative perception - depend on this phenomenon.

Every day we act before thinking through what we are doing or the possible consequences. In fact, we make so many daily decisions that it is impossible to test all the potential assumptions.

For instance, the simple act of talking with someone else involves many assumptions. You must assume that the other person actually heard what you said and understood you, that the person’s nonverbal reactions indicate what you think they indicate, and that you can figure out any hidden meanings or purposes.

Testing assumptions can help you shift perspectives and view problems in a new light.

There is an old joke that illustrates this point:

Two men were camping in the wilderness when they were awakened one morning by a large bear rummaging through their food supply. The bear noticed the men and started lumbering toward them. The men still were in their sleeping bags and didn’t have time to put on their boots, so they picked up their boots and began running away from the bear. The terrain was very rough, however, and they couldn’t make much progress. The bear was gaining on them.

Suddenly, one of the men sat down and began pulling on his boots. His friend couldn’t believe what he was seeing and said, “Are you nuts? Can’t you see that the bear is almost here? Let’s go!”

The man on the ground continued putting on his boots. As he did this, he looked up at the other man and said, “Well, Charlie, the way I look at it, I don’t have to outrun the bear - I only have to outrun you!”

And so, another problem is resolved by testing assumptions. In this case, both men originally assumed the problem was how to outrun the bear. When one of the men tested this assumption, a creative solution popped out. This single act provided that man with one critical extra option. His spontaneous creative thinking enabled him to gain an edge over his “competitor.”

For years, bankers assumed that their customers preferred human tellers. In the early 1980s, Citibank concluded that installing automatic tellers would help it cut costs. However, the Citibank executives could not imagine that customers would prefer dealing with machines, so they reserved human tellers for people with more than $5,000 in their accounts and relegated modest depositors to the machines. The machines were unpopular, and Citibank stopped using them in 1983. Bank executives took this as proof of their assumption about people and machines.

Months later, another banker challenged this assumption and looked at the situation from the customer’s perspective. He discovered that small depositors refused to use the machines because they resented being treated as second-class customers. He reinstituted the automatic tellers with no “class distinctions,” and they were an instant success. Today, Citibank reports that 70 percent of its transactions are handled by machine.

The banker challenged the dominant assumption by looking at it from the customer’s perspective. A good way to challenge any assumption is by looking at it from someone else’s perspective.

Reverse assumptions. Reversing your assumptions broadens your thinking. You may often find yourself looking at the same thing as everybody else, yet seeing something different. Many creative thinkers get their most original ideas when they challenge and reverse the obvious.

Consider Henry Ford. Instead of answering the usual question, “How can we get the workers to the material?” Ford asked, “How can we get the work to the people?” With this reversal of a basic assumption, the assembly line was born.

Alfred Sloan took over General Motors when it was on the verge of bankruptcy and turned it around. His genius was to take an assumption and reverse it into a “breakthrough idea.” For instance, it had always been assumed that you had to buy a car before you drove it. Sloan reversed this to mean you could buy it while driving it, pioneering the concept of installment buying for car dealers.

He also changed the American corporate structure by challenging the conventional assumptions about how organizations were run. He quickly realized that GM’s haphazard growth was stifling its potential. So he reversed the basic assumption that major companies are run by an all-powerful individual, creating a new theory that allowed for entrepreneurial decision-making, while still maintaining ultimate control. Under Sloan, GM grew into one of the world’s biggest companies, and his reversal became the blueprint for the modern American corporation.

Reverse some of your basic assumptions about business. For instance, you might start with the idea that “A salesperson organizes the sales territory,” then reverse it to “The sales territory organizes (controls) the salesperson.”

This reversal would lead you to consider the demand for new salespeople as territories become more complex. A salesperson with a large territory may be too well “controlled” by it to react to new accounts and sales possibilities.

Another reversal might be: “A salesperson disorganizes the sales territory.” This would lead to consideration of how to make salespeople more efficient. You could add telemarketing support personnel, in-office follow-up systems, and so on, to organize the territories for salespeople.

Harry Seifert, CEO of Winter Gardens Salads, used reversal to cook up a winning recipe for productivity. Instead of giving employees a bonus after the busy times of the year, he gives them their bonus before the busiest time of the year.

Just before Memorial Day, when demand for coleslaw and potato salad is at its peak, Seifert dishes out $50 to each of his 140 employees to arouse their enthusiasm for filling all of the holiday orders as efficiently as possible. “Because employees are trying to achieve a goal,” he observes, “they don’t feel like they are being taken advantage of during the intense periods.” Production has risen 50 percent during the bonus period.

Once an assumption is reversed and a breakthrough idea achieved, you may be startled by how obvious the idea seems. The reversal need not be a 180-degree turn - just a different angle on a problem. Years ago, shopkeepers assumed they had to serve customers. Someone changed that to shoppers serving themselves and the supermarket was born.

Suppose my challenge is: “In what ways might I create a new business for airports and train stations?”

My basic assumptions are:

  • Airports and train stations are for people who are travelling from one point to another;

  • Planes and trains are constantly arriving and departing;

  • People depart rapidly.

I reverse these assumptions to:

  • Airports and train stations are for people who are not travelling;

  • Planes and trains are not arriving and departing;

  • People are not departing rapidly.

I now have a new perspective on the challenge. Perhaps I could create a business to serve people caught by bad weather, strikes, missed trains or planes, or those who have long delays or layovers and want to rest - people who, for some reason, are not able to travel. They would need lodging but would not want or be able to leave the terminal.

The idea: A capsule hotel that would provide basic amenities in modular, Pullman-style sleeping compartments that could be stacked two or three high. Each capsule would come with a TV, radio, alarm clock, and reading light. A community shower would be available to all guests. The front desk would be staffed twenty-four hours a day and would carry razors, soap, toothpaste, toothbrushes, and so on. The price for a twenty-four-hour stay would be 50 percent cheaper than airport hotels and would also have low hourly rates, perhaps $10 an hour. Such hotels are already in use in Japan, and are quite popular.

The capsules would be easy to clean and maintain. Because they are modular, they could be easily moved between locations. They could also be leased to cities as temporary lodging for the homeless or for people who are forced out of their homes by fires and floods.

Assumption reversal enables you to:

  1. Escape from looking at a challenge in the traditional way;

  2. Free up information so that it can come together in new ways;

  3. Think provocatively. You can take a novel position and then work out its implications;

  4. Look for a breakthrough.

Reversal. The reversal method for examining a problem or generating new ideas takes a situation as it is and turns it around, inside out, backwards, or upside down. A given situation can be "reversed" in several ways; there is no one formulaic way.

For example, the situation, "a teacher instructing students" could be reversed as

  • students instructing the teacher

  • the teacher uninstructing students

  • students instructing themselves

  • students instructing each other

  • teacher instructing him/herself

  • students uninstructing (correcting?) the teacher
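The role-swapping at the heart of this method is mechanical enough to sketch in code. The snippet below is a toy illustration only - the function name `reversals` and the crude `un-` negation are inventions for this sketch, not part of any method described here. It permutes the subject and object of a "subject verb object" situation and negates the verb, leaving the human thinker to judge which rearrangements are provocative.

```python
from itertools import permutations

def reversals(subject: str, verb: str, obj: str) -> list[str]:
    """Generate candidate reversals of a 'subject verb object' situation.

    Toy sketch: it only swaps roles, negates the verb, and adds
    reflexive forms; a person still filters the useful provocations.
    """
    ideas = set()
    actors = (subject, obj)
    verbs = (verb, f"un-{verb}")  # crude negation of the action
    # Swap who acts on whom, with and without the negated verb.
    for a, b in permutations(actors, 2):
        for v in verbs:
            ideas.add(f"{a} {v} {b}")
    # Reflexive forms: each actor acting on itself or within its group.
    for actor in actors:
        ideas.add(f"{actor} {verb} themselves")
        ideas.add(f"{actor} {verb} each other")
    return sorted(ideas)

for idea in reversals("the teacher", "instructing", "students"):
    print(idea)
```

Running it on "the teacher instructing students" mechanically reproduces most of the bulleted reversals above; as the text notes, the point is not that every output makes sense, but that the rearrangement provokes new ideas.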

Example problem: a motorist came up behind a flock of sheep in the middle of the road and told the shepherd to move the sheep to the side so that he could drive through. The shepherd knew that on such a narrow roadside he could not easily keep all his sheep off the road at once. Reversal: instead of driving the car around the sheep, drive the sheep around the car - have the car stop while the shepherd drives the flock around and behind it.

Example: going on vacation: bring vacation home, stay on vacation most of year and then "go on work" for two weeks, make work into a vacation, send someone on vacation for you to bring back photos and souvenirs, etc.

Example: how can management improve the store?

  • how can the store improve management?

  • how can the store improve itself?

  • how can management make the store worse?

  • how can the store make itself worse?

  • how can the store hinder management?

Note that in some reversals, ideas are generated which can then be reversed back into ideas applicable to the original problem. Take the reversal “How can management hurt the store?” Hurt it by charging high prices for low-quality goods, dirtying the floors, being rude to customers, hiring careless employees, encouraging shoplifting, leaving prices off everything and charging whatever you feel like, or requiring a price check on every item. These bad ideas can then be reversed - be nice and helpful to customers, make sure all items are priced, and so on - to supply a good number of usable ideas. Sometimes it is easier to think negatively first and then reverse the negatives.

Example: What can I do to make my relationship with my boss or spouse better? Reversal: what can I do to make it worse? Have temper tantrums, use insults, pretend not to hear, etc. Reverse: control temper, use compliments, be solicitous to needs and requests.

In another example, a variety store chain was being hurt by the competition. Some possible reversals include these:

  • how can the store hurt competition?

  • how can competition help the store?

  • how can the competition hurt itself?

  • how can the store help itself?

The second reversal, “How can competition help the store?” was chosen. It was implemented by sending employees to competing stores every week to examine displays, sales, floor plans, goods quality and selection - anything that appeared to be effective or useful. The employees brought these ideas back to the company, compared them, and implemented the best of them in the store. Result: the competition helped the store.

The value of reversal is its "provocative rearrangement of information" (de Bono's term). Looking at a familiar problem or situation in a fresh way can suggest new solutions or approaches. It doesn't matter whether the reversal makes sense or not.

Ask nonexperts

Sometimes it seems that the test of a truly brilliant idea is whether or not the “experts” discount it. Often these are the ideas that later seem obvious - even to the experts.

Charles Duell, commissioner of the U.S. Patent Office, in 1899: “Everything that can be invented has been invented.”

Grover Cleveland in 1905: “Sensible and responsible women do not want to vote.”

Robert Millikan, Nobel prizewinner in physics, in 1923: “There is no likelihood man can ever tap the power of the atom.”

Lord Kelvin, president of the Royal Society, in 1895: “Heavier-than-air flying machines are impossible.”

The King of Prussia predicted the failure of the railroad: “No one will pay good money to get from Berlin to Potsdam in one hour when he can ride his horse in one day for free.”

There are as many stories about experts failing to understand new ideas as there are keys on a piano. In 1861, Philipp Reis, a German inventor, built an instrument that could transmit music and was very close to transmitting speech. Experts told him there was no need for such an instrument, as “the telegraph is good enough.” He discontinued his work, and it was not until fifteen years later that Alexander Graham Bell patented the telephone.