We consistently misperceive how others feel about us. For instance, most people think most others see them like they see themselves. That isn't true (Kenny & DePaulo, 1993). Other people's reactions to and feelings about us vary greatly; we are not liked equally by everybody, just as we don't like everyone equally. But we think most people see us in about the same way. We are largely unaware of the discrepancy between how we think another person views us and reality (and many other people hope to keep it that way).

 Many people also tend to find psychological causes for events and ignore other causes: "My head is hurting, I must be uptight," "I forgot to call him, I must not want to do it." Other people find mystical causes: "Hypnotic regression to past lives and the experiences of people who have died and come back to life prove that there is a life after death." Most of us find "good" socially acceptable causes for what we do, called rationalizations (see chapter 5). But, if we do harm someone, we may illogically attempt to deny our responsibility by denying any intention to harm, "I didn't mean to hurt you," or by blaming the victim, "He was a scum." These are all biases.

The greatest discovery of my generation is that human beings can alter their lives by altering their attitudes of mind.
-William James, 1890

 c. Self-deception --when some thought or awareness makes us uncomfortable, we have a variety of ways to avoid it (Horowitz, 1983):

 I would add to this list: avoid reality by believing in mystical forces and myths. Did you know that more people in America believe in ESP than believe in evolution? That 1 in 4 Americans think they have had a mental telepathy experience? That 1 in 6 have spoken with the dead? That 66% of Americans believe in the devil? That 1 in 10 say they have talked with the devil? There is some payoff for believing in superstition, astrology, and psychics. But to the extent we surrender to or depend on mystical forces, we lose a chance to discover the real causes and make things better.

 Daniel Goleman (1985) provides a fascinating book about self-deception as a way of avoiding stress. Lockard and Paulhus (1988) have edited a more specialized text. When split-brain patients (whose cerebral hemispheres have been surgically disconnected) are given written instructions to the right half of the brain only, e.g. "leave the room," they do not realize they received the directions. Yet, they obey the instructions. Furthermore, they believe they are directing their own behavior and say, "I want to get a drink." Perhaps many of the things we think we have consciously decided were actually decided by unconscious thought processes for reasons unknown to us. Denying our blind spots makes it impossible to cope. Admitting our blind spots gives us a chance to cope.

 We are taught as children to deny the causes of our emotions. Children hear: "You make me so mad," "You make me so proud," "I can't stand the messes you make," and on and on. Is it any wonder that adults still assume that other people cause their feelings?

 It isn't just that we avoid the unpleasant. We also seek support for our beliefs, our prejudices, our first impressions, our favorite theories, etc. Examples: The psychoanalyst finds sex and aggression underlying every problem. The behavioral therapist finds the environment causing every problem. The psychiatrist finds a "chemical imbalance" behind every unwanted emotion. The religious person sees God everywhere; the atheist sees Him nowhere. We all like to be right, so "don't confuse me with too many facts." As we think more about an issue, our opinion usually becomes more extreme.

The mind is like a parachute. It only works when it is open.

 In all fairness, it must be mentioned that investigators are busy documenting that self-deception may at times be beneficial to us physically and emotionally (Snyder and Higgins, 1988; Taylor, 1989). Examples would include certain kinds of rationalizations, excuses, unrealistic optimism, denial of negative information, illusions enhancing oneself, and so on. They make us feel better.

 d. Attack the messenger --if you can't attack the person's argument or reasoning, attack him or her personally. If you don't like what a person is arguing for but can't think of good counterarguments, call the speaker names, such as Communist, homo, women's liber, a dope, etc., or spread nasty rumors about him/her. An "ad hominem" attack means "against the man," not the argument, such as "If you aren't a recovered alcoholic, you can't know anything about addiction."

 Likewise, if you are being criticized by someone, there is a tendency to counterattack with, "You do something that is worse than that," which is totally irrelevant. Besmirching the speaker, "You're so stupid," doesn't invalidate the message.

 Another way to unfairly attack an argument is to weaken it by restating it in a foolish form. This is called a straw man argument. Examples: "The only reason to stop smoking is to save money." "You won't make love with me because you have a hang-up about sex."

 e. Misleading analogies --making comparisons and drawing conclusions that are not valid. Keep in mind, many analogies broaden and clarify our thinking. But, other analogies often confuse our reasoning, e.g. suppose you are arguing against nuclear arms by saying that nothing could justify killing millions of innocent people. Your opponent challenges, "Wouldn't you have the guts to fight if someone were raping your daughter?" That is a silly, irrelevant, hostile analogy which is likely to stifle any additional intelligent discussion. Suppose someone expresses an idea and others laugh at it. The person might respond, "They laughed at (some great person) too!" But that is hardly proof that his/her idea is great. Many foolish ideas have been laughed at too.

 f. Citing authority --reverence for a leader or scholar or authority can lead us astray. Aristotle was revered for centuries; he was smart but not infallible. We are raised to respect authorities: "My daddy says so," "My instructor said...," "Psychologists say...," "The Bible says...." Some people become true believers: "Karl Marx said...," "The president says...," "E. F. Hutton says...." Any authority can be wrong. We must think for ourselves; circumstances change and times change.

 Sometimes the authority cited is "everybody" or common sense, as in "Everybody knows...," "54% of Americans believe...," "Everybody wants a Mercedes," "It is perfectly clear...," "If you aren't stupid, you know...." Likewise, an old adage or proverb may be used to prove a point, but many adages are probably not true, e.g. "Early to bed, early to rise...," "Shallow brooks are noisy," "He who hesitates is lost," "The best things in life are free," etc. Knowing the truth takes more work--more investigation--than citing a trite quote.

 A similar weakness is over-relying on general cultural beliefs. It is called "arguing ad populum" when social values are blindly accepted as truths: "Women should stay home," "Men should fight the wars," "Women are more moral than men," "God is on our side," "Marriage is forever," etc.

 Another undependable authority is one's intuition or "gut feelings." "I just know he is being honest with me. I can tell." We are especially likely to believe a feeling if it is strong, as when we say, "I'm sure it is true, or I wouldn't be feeling it so strongly." A Gestalt therapist might say, "Get in touch with your gut feelings and do what feels right." Neither intuitive feelings nor brains have a monopoly on truth or wisdom.

 g. Over-dependence on science and statistics --we take one scientific finding and pretend that it provides all the answers. Just as we revere some authority and look to him/her for the answers, we accept conclusions by scientists without question. While science is the best hope for discovering the truth, any one study and any one researcher must be questioned. Read Darrell Huff's (1954) book, How to Lie with Statistics. Also, watch out for predictions based on extending recent trends: life expectancy and the divorce rate have doubled or more while SAT scores and the birth rate have drastically declined, but it does not follow that in 2100 humans will live for 200 years and have several spouses but only a few dull-witted children. Don't be intimidated by numbers. Ask the statistician: "How did you get these numbers?" Ask yourself: "Does this make sense?"

 h. Emotional blackmail --implying God, great causes, "the vast majority," your company, family or friend supports this idea. Propagandists make emotional references to our belief in God (and our distrust of the unbeliever), to freedom, to a strong economy, to "this great country of ours," to family life or family values, to "the vast majority" who supposedly support their ideas. When you hear these emotional appeals, better start thinking for yourself. Remember: in war both sides usually think God is on their side. Remember: 100 million Germans can be wrong. Remember: freedom and wealth (while others are starving, uneducated and poor) may be sins, in spite of being in a "Christian" democracy. Remember: millions have gone to war, but that doesn't make war right or inevitable.

 When it is implied that your friends and/or family won't like you, unless you believe or act certain ways, that is emotional blackmail, not logical reasoning. Cults, religions and social cliques use this powerful method when they threaten excommunication, damnation, and rejection.

 By the same token, it may become clear to you that your company, lover, friend, family and so on would be really pleased if you think or act in a certain way. This is a powerful payoff, but it does not make the argument logical or reasonable. In the same way, many want to buy and wear what is "really in" this spring. To buy something just because millions of others have done so is called the fallacy of the appeal to the many.

 An appeal to pity may be relevant at some times (Ethiopians are starving) but not at others (give me a good evaluation because I need the job). A good job evaluation must be based on my performance, not my needs.

 i. Irrelevant or circular reasoning --we often pretend to give valid reasons but instead give false logic. Muslims believe their holy book, the Koran, is infallible. Why? "Because it was written by God's prophet, Muhammad." How do you know Muhammad is God's prophet and wrote the book? "Because the Koran says so." That's circular and isn't too far from the child who says, "I want a bike because I need one." Or, from saying, "Clay knows a lot about self-helping because he has written a book about it." Or, from, "Man is made in God's image. God is white. Therefore, blacks are not human."

 To argue that grades should be eliminated because evaluations ought not exist is "begging the question"; it gives no reasons. Likewise, "I avoid flying because I'm afraid," and "I'm neurotic because I'm filled with anxiety" are incomplete statements. Why is the person afraid? ...what causes the anxiety?

 To argue that people should help each other because people should always do what feels good is illogical--feeling good is not necessarily relevant to the issue of doing good unto others; helping others frequently involves making sacrifices, not having fun.

 j. Explaining by naming --by merely naming a possible cause we may pretend to have explained an event. Of course, we haven't, but many psychological explanations are of this sort. Examples: Ask a student why he/she isn't studying more and he/she may say, "I'm not interested" or "I'm lazy." These comments do clarify the situation a little, but the real answers involve "Why are you disinterested? ...lazy?" How often have you heard: "He did it because he is under stress... hostile... bisexual... introverted... neurotic... self-centered"? True understanding involves much more of an explanation than just a name.

 k. Solving something by naming the outcome goals --when I ask students how to deal with a certain problem, such as procrastination or shyness, they often say, "Stop putting things off" or "Go out and meet people." They apparently feel they have solved the problem. Obviously, solving a problem involves specifying all the necessary steps for getting where you want to go, not just describing the final destination. Freeman and DeWolf (1989) describe "ruminators" as regretting their past and wishing they had lived life differently. Such persons think only of final outcomes, not of the process of getting to the end point. Langer (1989) says a self-helper will focus on the steps involved in getting what he/she wants, not simply on the end result. A student must study before he/she becomes a rich doctor.

 l. Irrational expectations and overestimating or underestimating the significance of an event should also be avoided --believing things must or must not be a certain way (see method #3). Making wants into musts: "I have to get her/him back." "I shouldn't make mistakes." "Things should be fair." "I should get what I want." A related process is awfulizing or catastrophizing: "I'll bet my boy/girlfriend is out with someone else." "I don't know what I'll do if I don't get into grad school." "If something can go wrong, it will." "Flying is terribly dangerous." In short, making mountains out of molehills. Of course, there is the opposite: "Oh, it (getting an A) was nothing" or "Employers don't care about your college grades, they want to know what you can do" or "I'm pregnant but having a baby isn't going to change my life very much." That's making molehills out of mountains.

 It is fairly common for certain people in a group to assume that others are watching or referring to them specifically. Often, such a person makes too much out of it. Thus, if someone makes a general but critical comment or walks out of a meeting, such people feel the individual's action is directed at them. Or, if a party flops, certain people will believe that it is their fault. This is called personalizing. Another common assumption is that the other person intended to make you feel neglected, inferior, unathletic, or whatever. This thinking that you know what the other person is thinking is called mind reading.

 m. Common unrealistic beliefs are similar to the irrational ideas in l. above and in method #3 (Flanagan, 1990). Included are the assumptions that most people are happy and that you should be too. This idea may come from people putting on their "happy face," so they look happier than they are. Seeking constant happiness is foolish; with skill and luck we can avoid constant unhappiness. Secondly, we humans often assume that others agree with us and do or want to do what we do. Sorry, not true. We are very different. If you sat in one seat in one room alone for month after month (as I am doing writing this), many of you would feel tortured. A few of you, like me, would like it. Some of us love silence; many people experience sensory deprivation if music isn't playing most of the time. The party animal can't understand the person who wants to quietly stay at home. Many of these differences can cause serious conflicts if one person or both start to assume the other person has a problem and is weird, a nerd or boor, a social neurotic, etc. Lastly, there is the very inhibiting belief that you can't change (see chapter 1) and that others won't change. These beliefs exist because they meet certain needs, like a need to be right or accepted, or reflect wishful thinking, like wanting to be very happy. Instead, they may cause unhappiness.

 n. Blocks to seeing solutions --a very clever book by James L. Adams (1974) describes many blocks to perceiving and solving a problem. These may be perceptual blocks, such as stereotyping and inflexibility; emotional blocks, such as a fear of taking a risk and a restricted fantasy life; cultural blocks, such as thinking intuition and fantasy are a waste of time; or intellectual blocks, such as lacking information, trying to solve the problem with math when words or visualization would work better, and poor problem-solving skills. Adams also suggests ways of overcoming the blocks and cites many other good books.

It is so easy and there are so many ways to be wrong, but it is so hard and there are so few ways to be right.

 By reading this bewildering collection of unreasonableness, it is hoped you will detect some of your own favorite errors. Unfortunately, I was probably able to gather only a small sample of our brain's amazing productivity of nonsense (for more see Gilovich, 1991, and Freeman & DeWolf, 1992; for overcoming it, see Gula, 1979). Next, you need to diagnose your unique cognitive slippage.
