STEP ONE: Recognize common errors in thinking and arguments.
I think it will amaze and maybe horrify you to see how many ways the human mind makes mistakes. This isn't a complete list. Indeed, certain irrational ideas have already been discussed extensively in previous cognitive methods, especially #3 above. These thoughts lead to unwanted emotions which, in a circular fashion, further distort our thinking. In addition, we all have our "touchy topics" or "sore points" that set our minds reeling and mess up our thinking. For example, making a mistake or being surprised may shut down your brain for a moment, being laughed at or treated with disrespect may infuriate you, being envious or jealous may distract your thoughts, etc. It is important to understand what is happening to our thinking in these situations, in order to gain some control and peace of mind.
The recent emphasis on Cognitive Therapy has led to several books cataloging an assortment of toxic ideas or beliefs. For example, Freeman and DeWolf (1992) say the 10 dumbest mistakes are (1) assuming a catastrophe is about to happen, (2) thinking we know what other people are thinking (or they should know what we think), (3) assuming responsibility for other people's troubles or bad moods, (4) believing too many good things about ourselves and our future, (5) believing too many bad things about ourselves and our future, (6) insisting on being perfect, (7) competing or comparing with everyone and losing, (8) worrying about events that never happen, (9) being abused by our own excessive "shoulds," and (10) finding the negative aspect of everything good. They offer solutions too.
Other books (Lazarus, Lazarus & Fay, 1993) list thoughts that cause us trouble, such as "it is awful every time something unfair happens," "why would anyone settle for being less than perfect?" "I'm always losing," "you can't count on others, if you want something done right, you've got to do it yourself." Likewise, McKay & Fanning (1991) discuss basic beliefs that define our personality and limit our well-being. Shengold (1995), a psychoanalyst, contends that infantile beliefs ("I'm omnipotent," "Mom loves me most") continue into adulthood and mess up our lives. Sutherland (1995) and vos Savant (1996) also attempt to explain why and how we don't think straight.
By becoming aware of the following typical "errors in thinking" or "cognitive distortions," you should be able to catch some of your own false reasoning and correct it. An additional corrective step might be to explore your history to gain some insight into the original experiences that now prompt the experience-based mind to think in these stressful, unhelpful ways.
Also included in this list are fallacious, misleading strategies used by debaters to persuade the opponent of their viewpoint. These are ways we get fooled and fool ourselves too.
a. Over-generalizing and common mental errors --coming to a conclusion without enough supporting data. We hear about many teenagers using drugs and alcohol, then conclude that the younger generation is going "to pot." We hear that many black men desert their families and that many black women go on welfare, then assume (pre-judge) that most black men are sexually irresponsible and most black women want babies, not work. On a more personal level, we may suspect the next teenager or black person we meet of being "high" or unfaithful. We are turned down by two people for a date, then conclude "no woman/man will go with me." We have found school uninteresting and conclude that we will never like to study. We find two red spots on our nose and conclude we have cancer (also called catastrophizing).
Anecdotal evidence is another example of taking one incident and assuming it proves a larger principle. Example: "I had a case once in which the marital problems disappeared as soon as the woman learned to have orgasms, so I do sex therapy with all couples." This thinking won't surprise anyone, but there is a troubling tendency to give more weight to a single person's opinion or experience--especially if the information is given to us face to face--than to a statistical summary of many people's opinions or experience. One person's story is not an accurate sample! Frankly, there is evidence that we don't read tables very well, e.g. we attend more to what a diagnostic sign (like a depression score) is related to, than we do to what the absence of the sign is related to. Let's look at an example.
The situation may become a little complicated, however. Suppose you had a psychological test that you knew was 95% accurate in detecting the 5% of people who are depressed in a certain way. Further suppose that 35% of non-depressed people are misdiagnosed as being depressed by this test. If a friend of yours got a high depression score on this test, what are the chances he/she really is depressed? What do you think? The majority of people will say 65% or higher. Actually the chances are only about 13%! The test is very good at detecting the 5% who are depressed (and we notice this score), but the 35% rate of "false positives" is terrible (and goes unnoticed), i.e. the test misdiagnoses over a third of the remaining 95% of people as depressed when they are not. Unless we guard against ignoring the base rates (the ratio of non-depressed to depressed persons in the population), we will, in this and similar cases, err in the direction of over-emphasizing the importance of the high test score. Guard against over-generalizing from one "sign." One swallow doesn't make a summer. Also, guard against ignoring missing information; this is a general human trait which results in wrong and more extreme judgments.
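The roughly 13% figure can be checked with Bayes' rule, using only the three rates given above. A minimal sketch in Python:

```python
# Bayes' rule applied to the depression-test example in the text.
base_rate = 0.05       # 5% of the population is depressed
sensitivity = 0.95     # the test catches 95% of depressed people
false_positive = 0.35  # 35% of non-depressed people also score "depressed"

# Overall chance of getting a high score (true positives plus false positives):
p_high = sensitivity * base_rate + false_positive * (1 - base_rate)

# Chance of actually being depressed, given a high score:
p_depressed_given_high = (sensitivity * base_rate) / p_high

print(round(p_depressed_given_high, 3))  # 0.125, i.e. roughly 13%
```

Because non-depressed people outnumber depressed people 19 to 1, even a modest false-positive rate swamps the true positives.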
In short, we often jump to wrong conclusions and make false predictions. We spill our morning juice and conclude we are going to have a bad day. We may make too much of a smile or a frown. We may sense sexual attraction where there is none. We see the teacher as disapproving when he/she is not. Indeed, perhaps the most common errors of all are our "mental filters" in one of two opposite directions: negative expectations (of ourselves, of others, or of the world, as we saw in chapter 6) and excessive optimism. The latter is sometimes an "oh, no problem" or an "everything will work out fine" attitude, which is anxiety reducing and advantageous if you still work diligently on solving the problem. If you neglect the problem, it is an attitude that will bring you grief.
Gathering all the relevant information before deciding something is hard work, time consuming, and, often, impossible. We of necessity must operate most of the time with very limited information; most of the time incomplete data isn't a serious problem but sometimes it is.
b. Over-simplification and cognitive biases --it is far easier to have a simple view of a situation, but the simple view is usually wrong, e.g. "Abortion is either right or wrong!" And we have favorite ways of being wrong. Examples: we think things are true or false, good or bad, black or white, but mostly things are complex--gray. We ask, "Is this leader competent or incompetent?" In reality, there are hundreds of aspects to any job, so the question is very complex, "How competent is he/she in each aspect of the job?" You ask, "Will I be happy married to this person forever?" The answer almost certainly is, "You will be happy in some ways and unhappy in others." A simple view of life is appealing, but it isn't real.
For every complex problem, there is a simple answer--and it is wrong!
Yet, humans (especially the experience-based mind) use many devices to simplify things. The truth is we must interpret so many situations and events every day, we can't do a thorough, logical analysis every time. So we make mistakes. If we make too many misinterpretations, they start to accumulate, our minds go over the edge, and we either become unreasonable in our behavior or we become emotional--depressed, angry, scared, etc. The more reasonable we can stay, still using both our rational intelligence and our experience-based intelligence, the better off we will be. Therefore, we need to recognize the common kinds of mistakes we make.
We use categorical (either-or) thinking and labeling. Some people believe others are either on their side or against them, either good or bad, good socializers or nerds, intelligent or stupid, etc. Then once they have labeled a person in just one category, such as bad, nerd, real smart, etc., that colors how the entire person is judged and responded to, and inconsistent information about the person is ignored. Likewise, if there are either sophisticated or crude people, and you are sure you aren't sophisticated, then you must be crude. The world and people are much more complex than that.
When explaining to ourselves the causes of a situation, we often commit the fallacy of the single cause. There are many examples: Traits of adults are attributed to single events, such as toilet training (Freud), being spoiled, birth order, being abused, parents' divorce, etc. It's usually far more complex than that. When a couple breaks up, people wonder "who was at fault." There are many, many complex causes for most divorces. The first method in chapter 15, "Everything is true of me," addresses this issue. Usually 15 to 20 factors or more "cause" a behavior.
If we do not attend to all the factors, such as the multiple causes of our problems or the many ways of self-helping, we are not likely to understand ourselves or know how to change things (see chapter 2). For example, if you assume your friend is unhappy because of marital problems, you are less likely to consider the role of the internal critic, irrational ideas, hormones, genes, children leaving home, or hundreds of other causes of depression. Similarly, if you assume that the person who got the highest SAT in your high school will continue to excel at every level of education and in his/her career, you are likely to be wrong. There are many factors involved, resulting in the "regression to the mean" phenomenon: a person who scores unusually high or low on some trait tends, in time, to score closer to average.
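Regression to the mean can be demonstrated with a small simulation (the numbers here are invented for illustration): each observed score is a stable "true ability" plus luck, and the people with the most extreme first scores drift back toward average when retested.

```python
import random

random.seed(42)

# Each observed score = stable true ability + random luck on that day.
abilities = [random.gauss(100, 10) for _ in range(10_000)]
first = [a + random.gauss(0, 10) for a in abilities]   # test 1
second = [a + random.gauss(0, 10) for a in abilities]  # test 2

# Pick the 500 highest scorers on the first test...
top = sorted(range(10_000), key=lambda i: first[i])[-500:]

avg_first = sum(first[i] for i in top) / 500
avg_second = sum(second[i] for i in top) / 500

# ...and their retest average falls back toward the mean of 100.
print(round(avg_first, 1), round(avg_second, 1))
```

The top scorers were partly able and partly lucky; on the retest the ability remains but the luck does not, so the group average drops.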
On the other hand, having a lot of evidence is sometimes not enough. Even where you have considerable evidence for a certain view, such as for ESP or life after death, that evidence must be stronger than the evidence against the view or for an alternative interpretation. Consider another example: "Drugs have reduced panic attacks and since intense stress is caused biochemically, psychological factors have little or nothing to do with treating panic attacks." You must weigh the evidence for and against all three parts of the statement: drugs work, stress is chemical, and panic is reduced only by chemicals. All three statements would be hard to prove.
Few of us are without sin (misjudgment). Almost every judge is biased on some issue, e.g. at the very least, the therapist or scientist or sales person wants his/her product to be the best. When evaluating other people's judgments, we have many biases, including a tendency to give greater weight to negative factors than to positive factors, e.g. being told "he sometimes exaggerates" is likely to influence us more than "he is patient." Likewise, in marriage, as we all know, one scathing criticism or hurtful act may overshadow days of love and care.
Another favorite way to over-simplify is to find fault: "It was my spouse's fault that we got divorced." "I failed the exam because it had a lot of trick questions." Obviously, this protects our ego, as does an "I-know-that" hindsight bias: When asked to predict behavior in certain situations, people may not have any idea or may do no better than chance if they guess, but when told that a certain behavior has occurred in that situation, people tend to say, "I expected that" or "I could have told you that."
Another common error is the post hoc fallacy --A preceded B, so A must have caused B. Example: Young people started watching lots of television in the 1950s and '60s, and ACT and SAT scores have steadily declined since; thus, TV watching must interfere with studying. In truth, TV may or may not contribute to the declining scores. We don't know yet (too many other changes have also occurred).
Likewise, a correlation does not prove the cause. Examples: the economy gets better when women's dresses get shorter. Also, the more Baptist ministers there are in town, the more drinking is done. Obviously, women showing more leg don't improve the economy nor do ministers cause alcoholism. Other more complex factors cause these strange relationships. (On the other hand, a correlation clearly documents a relationship and if it seems reasonable, it may be a cause and effect relationship. Thus, in the absence of any other evidence of cause and effect, the correlation may suggest the best explanation available at this time. But it is not proof.)
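The ministers-and-drinking example is a classic "third variable" problem: town size drives both counts. A hypothetical simulation (all numbers invented for illustration) shows how a lurking variable manufactures a strong correlation between two things that have no causal link:

```python
import random

random.seed(1)

# A lurking third variable (town population) drives both counts.
populations = [random.randint(1_000, 100_000) for _ in range(200)]
ministers = [p / 2_000 + random.gauss(0, 5) for p in populations]
drinking = [p / 50 + random.gauss(0, 200) for p in populations]

def correlation(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# The correlation is strong, yet neither count causes the other.
print(round(correlation(ministers, drinking), 2))
```

Bigger towns have more ministers and more drinkers; remove the population variable and the relationship disappears.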
Research has shown another similar fallacy: the most visible person or aspect of a situation, e.g. the loudest or flashiest person, is seen, i.e. misperceived, as the moving force in the interaction (Sears, Peplau, Freedman & Taylor, 1988), even though he/she isn't.
The answer or hunch that first comes to our mind, perhaps merely because of a recent or a single impressive experience, will often be the basis for our judgment--and it's often wrong. Examples: If a friend has recently won the lottery or picked up someone in a bar, your expectation that these things will happen again increases. If you have recently changed your behavior by self-reinforcement, you are now more likely to think of using rewards. In a similar way, assuming how-things-are-supposed-to-be or using stereotypical thinking impairs our judgment. Examples: If you hear the marital problems of one person in a coffee shop and the same problems from another person in a Mental Health Center, you are likely to judge the latter person to have more serious problems than the coffee shop patron. We expect clients in Counseling Centers to have grave problems. Guard against these impulsive first impressions.
Here is a clever illustration of the power of the first impression to influence our overall judgment. Without figuring, what do you guess the answers are?

A. If you start with 8 and multiply it by 7 X 6 X 5 X 4 X 3 X 2 X 1 =

B. If you start with 1 and multiply it by 2 X 3 X 4 X 5 X 6 X 7 X 8 =

The average guess for A is 2250 and 513 for B. The correct answer for both is 40,320. Your ability to guess numbers isn't very important, but it is important that we recognize the fallibility of our minds. Our ability to judge the actual outcome of some economic or political "theory" or promise is not nearly as high as the certainty with which we hold our political beliefs. Likewise, our first impressions of people tend to last even though they are inconsistent with later evidence. This is true of trained therapists too.
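Both sequences multiply the same eight numbers, so the two products must be equal; only the first number seen (the anchor) differs. A quick check:

```python
import math

a = 8 * 7 * 6 * 5 * 4 * 3 * 2 * 1  # sequence A, descending from 8
b = 1 * 2 * 3 * 4 * 5 * 6 * 7 * 8  # sequence B, ascending from 1

# Both are simply 8 factorial.
assert a == b == math.factorial(8)
print(a)  # 40320
```

People anchor on the first numbers they see, so starting with 8 produces much larger guesses than starting with 1.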
It may come as a surprise to you but considerable research indicates that, in terms of predicting behavior, better trained and more confident judges are frequently not more accurate than untrained, uncertain people. Why not? It seems that highly confident judges go out on a limb and make unusual or very uncommon predictions. They take more chances and, thus, make mistakes (which cancels out the advantages they have over the average person). The less confident predictor sticks closer to the ordinary, expected behavior (high base rate) and, thus, makes fewer mistakes. (Maybe another case where over-simplification is beneficial.)
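The advantage of sticking to the base rate can be made concrete with a toy simulation (all numbers hypothetical): if a behavior occurs 80% of the time, a cautious judge who always predicts it is right 80% of the time, while a "bold" judge who often predicts the rare outcome does considerably worse.

```python
import random

random.seed(7)

BASE_RATE = 0.8  # the ordinary, expected behavior occurs 80% of the time
outcomes = [random.random() < BASE_RATE for _ in range(10_000)]

# Cautious judge: always predicts the ordinary behavior.
cautious_correct = sum(outcomes) / len(outcomes)

# Bold judge: predicts the rare behavior 40% of the time, at random.
bold_guesses = [random.random() < 0.6 for _ in outcomes]  # True = ordinary
bold_correct = sum(g == o for g, o in zip(bold_guesses, outcomes)) / len(outcomes)

print(round(cautious_correct, 2), round(bold_correct, 2))
```

Unless the bold judge has real information about which cases are exceptions, every departure from the base-rate prediction costs accuracy on average.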
While it is not true of everyone (see chapter 8), there is a tendency to believe we are more in control of our lives than we really are (not true for depressed people). For example, people think their chances are better than 50-50 if you put a blue and a red marble in a hat and tell them that they will win a real car if they pick out the blue marble, but they get only a match box car if they draw out the red marble. Gamblers have this I'm-in-control feeling throwing dice, obviously an error. We want to believe we are capable of controlling events and we like others who believe in internal control (Sears, Peplau, Freedman & Taylor, 1988); it gives us hope. This is also probably related to misguidedly believing in "a just world," i.e. thinking people get what they deserve. We believe good things happen to good people ("like me") and bad things happen to bad people. There is little data supporting this belief, but, if bad things have happened to you, people will conclude you must have been bad and deserve what happened (and, therefore, many will feel little obligation to help you).
Some people believe they are the sole cause of other people's actions and feelings: "I am making him so depressed." Not only do some people feel in control, others feel they should be in control, i.e. have special privileges (a prince in disguise). "I shouldn't have to help clean up at work." "Everybody should treat me nicely."
A special form of over-simplification is cognitive bias, i.e. a proneness to perceive or think about something in a certain way to the exclusion of other ways. One person will consistently see challenges as threats, while another person will respond to the same challenging assignments as opportunities to strut his/her stuff. Cognitive biases have already been mentioned in several psychological disorders, e.g.:
Disorder -- Thinking bias
Anxiety -- Expectation that things will go wrong.
Anorexia -- A belief that one is getting fat and that's terrible.
Depression -- Negative view of self, the world, the future.
Anger -- A belief that others were unfair and hurtful; exaggeration of the importance of pleasing others.
Social addiction -- "I can only have fun with my friends."
There is one cognitive bias so common it is called the fundamental attribution error: we tend to see our behavior and feelings as caused by the environment but we think others' behavior and feelings are caused by their personality traits, needs, and attitudes. In short, we are psychoanalysts with others but situationists with ourselves. Example: When rules are laid down to a teenager, the action is seen by the parents as being required by the situation, i.e. to help the adolescent learn to be responsible, but the teenager becomes a little Freud and sees the rules as being caused by the parents' need to control, distrust, or meanness. When rules are broken, however, it is because "the kid is rebellious" (parents now do the psychoanalyzing) or "my friends wanted me to do something else and, besides, my parents' rules are silly" (the teenaged Freud suddenly doesn't apply this psychology stuff to him/herself). This kind of thinking is over-simplified and self-serving. More importantly, it causes great resentment because the troubles in a relationship are attributed to the bad, mean, selfish traits of the other person.
In spite of the fundamental attribution error, we will make an exception for ourselves when we are successful: Our successes are attributed to positive internal, not situational, factors--our ability, our hard work, or our good traits. In keeping with the fundamental attribution error, our failures are usually considered due to bad external factors--the lousy system, the terrible weather, someone else's fault, bad luck, and so on. Sometimes we are so desperate to protect our ego from admitting we don't have the ability to do something that we will actually arrange to have a handicap (see self-handicapping in method #1) or excuse for failing, "I was drunk," "I didn't get any sleep," "I forgot," etc. Sometimes, we just lie and make up an excuse, "I was sick," "I'm shy," "I have test anxiety," "I've had bad experiences," etc. Likewise, people exaggerate their contributions to any desirable activity; they tend to see themselves as being more important or more responsible than others. And, we believe that the majority of others agree with our opinions, even when that is clearly not the case. These misconceptions--self-cons really--help us feel better about ourselves by overlooking important facts.