
INTERVIEW

An interview with Carol Tavris

Anna Lena Phillips

Why do people persist in believing things that have been proved to be untrue? Social psychologist Carol Tavris, author of Anger and The Mismeasure of Woman, joins fellow social psychologist Elliot Aronson to answer this question in Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts (Harcourt, 2007). The authors use cognitive dissonance theory to analyze issues and disputes in the worlds of politics, medical science, psychiatry, the criminal justice system and personal relationships. The theory can't explain everything, Tavris says, but it can shed light on a surprising number of issues.

American Scientist assistant book review editor Anna Lena Phillips interviewed Tavris by telephone and e-mail in August and September 2007. 

How did you become interested in the subject of cognitive dissonance, and how did you and Elliot Aronson determine the course you would take in writing the book?

Well, we have been friends and colleagues for many years. We were sitting around one afternoon talking about George W. Bush and the fact that commentators from right, left and center were all shouting at him to admit that he was wrong about weapons of mass destruction and wrong about everybody dancing in the streets to greet us, and wondering why he didn't just say so. Andy Rooney, in a commentary for 60 Minutes, actually wrote him a mock speech and begged him to deliver it to the country: "I told you Saddam Hussein tried to buy the makings of nuclear bombs from Africa. That was a mistake and I wish I hadn't said that. I get bad information sometimes just like you do."

But as students of dissonance theory, we figured it was extremely unlikely that Bush would do that. We predicted that a politician who is self-justifying will not be self-correcting, and will even be willing to lose the support of the entire country and his party rather than admit error and change direction.

Elliot's lifework has been on dissonance theory, hypocrisy and the psychology of self-justification, and on how you can apply dissonance theory in real-life situations to improve people's behavior—to get them to save energy, use condoms, practice safe sex. My work as a social psychologist and as a communicator of psychological science has often focused on the harms caused by scientifically unsupported fads and myths, and on the epidemics of "moral panic" that sweep the country from time to time. How come most of the perpetrators of these harmful beliefs and epidemics don't later say, "Sorry, we were wrong; thank goodness for scientific research"? Like Bush, they continue to justify their beliefs. So Elliot and I realized we had a lot to say about why this happens. We decided to apply contemporary dissonance theory to the many domains in which the inability to say "I was wrong" can create so much damage and injustice.

It sounds like Mistakes Were Made was pretty fun as a collaborative project. Do you remember any particularly funny or thought-provoking incidents that arose during your work on the book?

I'll tell you the story that's most poignant to both of us. We knew at the outset that Elliot was going to be losing his vision to macular degeneration. We both thought he'd have a few more years—his vision was worsening, but he was able to compensate by enlarging the font on his computer, and he could see well enough. Shortly after we signed the contract for this book, though, his vision bottomed out quite abruptly, and he became blind. He has peripheral vision, he can walk with a cane, he can still see some things up close—but he can't read. His computer reads text aloud to him.

Now, you can imagine that for a scholar, scientist and writer like Elliot, this was a devastating blow. And for me, as his friend, it was devastating. At first he was going to bail out. But we felt from the beginning that we were going to do this book together, that it was truly a blend of our ideas as well as our writing, and his bailing out was not an option. So we had to learn a different way to write—and that was writing through speaking. We would meet, we would discuss what should be in a chapter, what kind of research it might cover; he would dictate ideas and suggest research—and I would go away and draft something and come back and read it to him, and he would listen and edit.

As a writer, I found this to be a fascinating experience, because it's only when you read something out loud that you can hear that you're trying to sneak something past the reader. When I came to a paragraph or section that was graceless, or that didn't make sense, I found that I was reading that part a little more quickly. But Elliot's a terrific listener, and he would stop me and say, "What are you talking about? That doesn't make sense!" They used to say that trying to sneak a fastball past Hank Aaron was like trying to sneak a sunrise past a rooster. That's how it felt for me. I couldn't throw a fastball sentence past Elliot.

And so we both learned to write in a new collaborative way, by speaking and listening along with putting the words on paper. What started as a frustrating struggle to find a solution became an exhilarating process for us both.

Could you talk about the pyramid as a metaphor for cognitive dissonance and how you came to it?

I said to Elliot the other day, "Which of us came up with the metaphor of the pyramid?" Because, of course, in my self-justifying way, I thought I had. But he gently corrected my memory and told me that no, on the contrary, the pyramid is a metaphor he's been using for years, to show how self-justification can move people in a direction they might never have imagined going.

It works like this: Consider two students who have the same attitude about cheating. They don't think it's a terrible thing, but they know it's not a good or honorable thing either. Suppose that they now have to take a test—say, one that's going to determine whether they get into graduate school. They freeze on a crucial essay question, and suddenly the student in front of them, the one who has the most beautiful and legible handwriting on the planet, makes some answers visible.

Each of them makes an impulsive decision: One cheats to get a good grade; the other resists cheating to preserve his or her integrity. Now they will justify the choice they made. The student who cheated will minimize the seriousness of cheating and thereby become more vulnerable to cheating again. The one who resisted cheating will become even more adamant that cheating is unethical and wrong. Over time, through the process of self-justification, these two students will move further and further away from each other in their beliefs about cheating. It is as if they had started out at the top of a pyramid, close in their beliefs, but, having taken a step down in different directions, by the time they reach the bottom they are far apart. Moreover, they will come to believe that they always felt that way about cheating. Elliot developed the metaphor of the pyramid from an early experiment that Judson Mills did with children, which got precisely these results. The kids who cheated justified their behavior, and so did the ones who resisted.

That is what self-justification does: It sets us off on a course of action that moves us further and further from the original choice point and then begins to blind us to the possibility that we were wrong. The danger is not so much in the first step we take off the pyramid, but in how far we have come from our original beliefs or intentions by the time we are at the bottom.

You know, once you have this metaphor of the pyramid in your mind you see it everywhere in society. One example that we did not include in the book is the seemingly endless argument between new mothers who make the decision to stay home with their children and those who continue working outside the home. What is the reason for each side's certainty that there's only one right way to be? The data show that children are not harmed when their mothers work; women have been working throughout the centuries; in Europe, children go to nurseries and daycare and that's that—no one thinks their little psyches will be damaged. But in the books by women who have left the work force to be full-time mothers (a tiny minority, by the way, of working women—most can't afford that luxury) I find that the heat they bring to the subject often stems from their ambivalence about their decision.

Most decisions have positive and negative consequences. When you're making an important life decision for which there is no single right answer—as is the case so often in our lives—then that decision is going to be followed by huge postdecision dissonance. You will look for all the reasons to justify the decision you made and notice everything negative about the choice you rejected. If you are not comfortable with the decision you made, you may feel the need to disparage and criticize the people who took a different path. They are, after all, a constant reminder of the road you didn't take.

In an upcoming interview, the Scientists' Bookshelf will be talking with Wendy Williams, coauthor with Stephen J. Ceci of Why Aren't More Women in Science? (APA Books, 2007). Given your research for this book and for The Mismeasure of Woman, I'm wondering if you have thoughts either on the book's title question or, if you've seen it, on the book itself.

There are limits to dissonance theory. It doesn't actually explain everything in the world. However, what it does explain is why so many of us are not as open-minded as we think we are or would like to be. The stronger our intellectual, moral, political or religious beliefs—the beliefs that most define us—the less likely we are to be open-minded about evidence or information suggesting that we could be wrong. People who, in addition, have vested economic interests in the status quo will be even more likely to dismiss evidence that might threaten their position. As Stephen Jay Gould wrote in The Mismeasure of Man about biases in the study of race and intelligence, and as I wrote in my book about biases in the study of gender, people's beliefs about the origins of race and gender differences are often "consonant" with their views about the possibility, or impossibility, of equality and change.

I haven't seen the book, but I'm looking forward to it—I know Wendy Williams and Steve Ceci and admire their work, and I know their book considers all aspects of the complicated question about women in science. But for me, the bottom line is the "so what?" question. Suppose someone finds a molecule in the brain that makes 4 of every 10 men interested in science and only 3 of every 10 women interested in science. So what? We can't afford to lose those three women!

The questions we ought to be asking are, What can we as a society do, in this era of rising scientific illiteracy, to change our schools and our scientific institutions to make science more appealing and accessible to anybody who's interested? And what can we as a society do to make sure that science is taught well, at every level, so that more people of both sexes become interested? Countless studies of women scientists show that they have often felt marginalized and excluded and have been made to feel abnormal for wanting to enter a field that was traditionally dominated by men. So let's fix the institutional and organizational conditions, and then let's see who's interested in science and who's not.

In the chapter on blind spots, you discuss the emergence of Big Pharma and the cases of some research scientists who relinquished objectivity when they were investigating potentially profitable drugs. How can scientists work to combat conflicts of interest and be aware of their own motivations and biases?

Elliot's favorite word in answer to this question is "vigilance." Universities maintained a firewall between research and industry for decades precisely because it is so difficult to be vigilant about conflicts of interest. Good people, by definition, see themselves as being above the possibility of being corrupted. If they are dedicated scientists and are also taking money from industry—and increasingly, this is an economic necessity—they have to be hyper-alert for ambiguous findings. Independent investigators are more likely to look for worrisome findings that, say, indicate a drug's potential harm. Drug-company-funded investigators are more likely to resolve ambiguities in the sponsor's favor. The greater a scientist's incentive to unconsciously suppress disconfirming data, the greater the need to be vigilant about looking for it—to look for the dog that didn't bark, as Sherlock Holmes said. We all listen for the dogs that bark; we don't realize it's just as much of a clue when the dog doesn't bark.

Can you offer any suggestions for science teachers who want to help their students become aware of the potential for dissonance in conducting research?

Understanding how dissonance works is critical for us as teachers—and learners—for two reasons: First, it explains why, faced with scientific information that disconfirms their important beliefs, most people will tell you to get lost and take your data with you. Scientists may despair of creationists who remain unpersuaded by 8,000 studies of evolution, but scientists too have been known to dismiss 8,000 studies opposing their own cherished position on a political or intellectual issue.

Second, understanding dissonance helps us discuss findings in better, more persuasive ways—without making the other person feel stupid for believing something now shown to be false: "How could you possibly believe that!" or "Look, isn't it interesting that your lifelong theory of child development is wrong?" We can try to present science not in a negative, debunking way but in a positive way—to show what is fun, exciting and creative even about disconfirming research. Scientists understand that there is nothing inherently dissonant about disconfirming results; they may not welcome such findings, but they see them (or should!) as important information that moves us a little further along the path of knowledge.

Early in Mistakes Were Made you say, "Most people have a reasonably positive self-concept." Accordingly, most of the book is focused on instances in which people's cognitive dissonance works to make them think better of themselves. But as you mention, there are also cases in which it works in the opposite direction. How did you make the decision to focus so predominantly on cases of positive self-concept, and what role do you think the negative has to play?

We know from studies that the overwhelming majority of people believe themselves to be at least moderately competent, smart and ethical. So most of the examples in our book illustrate the dissonance caused when such individuals are faced with evidence that they just did something incompetent, foolish or unethical. But dissonance also applies to somebody who has a poor self-concept and who then gets evidence that they actually did something terrific: Their self-concept remains the same, and they dismiss the compliments as phony or untrue. That is what is so powerful about understanding dissonance. We will put ourselves into contortions to preserve the beliefs that are most central to us—even when they're clearly wrong.

Now, another aspect of this is the problem for people who don't reduce dissonance enough. One of the things we say in the book is that the ability to reduce dissonance is adaptive, precisely because it allows us to sleep at night. You couldn't get anything done if you had to keep reassessing every decision you made and every belief you held 40 times a day. It's beneficial to stay with a set of beliefs that guide how you live your life. And it's also a fine thing to be able to reduce dissonance after you've made a decision and bought that car and married that person and moved to Cincinnati, so that you won't beat yourself up about everything that you might have done wrong. The people who don't reduce dissonance enough suffer from regret and remorse, and it can be just as dysfunctional for them not to reduce dissonance as it is for other people to reduce dissonance too quickly, too mindlessly.

This issue leads to a great existential question: What happens when the action we've taken has really been a devastating one to ourselves or others, whether it's in the course of our professional lives, our private lives or a war? How do we live with the realization that we committed a devastating mistake, caused devastating harm? How do we forgive ourselves? What do we do not just to bury the dissonance, but to accept what we did, because it can't be undone? How can we understand what we did wrong, and not just make a superficial apology, but learn in some deep way from the harm that we caused, so that we don't make the same mistake again? That's the goal. That's the reason we wrote this book.

