How Do People Excuse Their Bad Actions?
Why humans use complex mental strategies to balance their moral bank accounts. A Q&A with Texas McCombs faculty members Robert Prentice and Adrian Ward
Based on the research of Robert Prentice and Adrian Ward
Most people don’t set out to be crooks. But at one time or another, many people do things that, deep down, they know are wrong, whether it’s pilfering a few office supplies or covering up financial fraud.
Often, they excuse themselves for such behavior, making a momentary mental calculation that a dirty deed isn’t really so dirty. When they do it at the highest levels of business — as was alleged of FTX’s cryptocurrency king, Sam Bankman-Fried — the repercussions can bring down corporations.
Adrian Ward, associate professor of marketing at Texas McCombs, recently published research on how people view their prior immoral actions. He and Robert Prentice, a professor of business law and business ethics, talk about how and why we let ourselves off the hook, what that means for business and society, and ways to avoid such actions in the first place.
Why do people excuse themselves for bad deeds?
AW: People are generally motivated to hold a positive view of themselves, but bad behavior complicates matters. How can we maintain a positive moral identity in the face of our own behavior, which isn’t always perfectly moral? We excuse our own misdeeds to avoid the cognitive dissonance that would arise from fully acknowledging our imperfections.
RP: We want to be good people, but we also sometimes want the rewards that come with doing bad things. When we succumb to those temptations, our brains often create mechanisms to reconcile our good self-image with our bad deeds.
What are the different ways our brains do that?
AW: Our research has found that one way is by thinking of those immoral actions in very mechanistic terms that remove the meaning from them. It’s related to what psychologists call construal level theory.
You can mentally view, or construe, any action anywhere from a high level, focused on meaning and purpose, down to a low level that is purely mechanistic. If I turn on the light in my office, I can think of this either as brightening things up or as simply flipping a switch. They are both ways of describing the same act, but one feels more meaningful.
In general, people prefer high-level construal of their behavior, because they want to live meaningful lives. But our research shows that when people think about their prior bad behavior, they prefer to think about it in low-level, mechanistic terms. They were not cheating on a test but simply checking someone else’s answer. They were not stealing from the library but keeping a book.
RP: Another way we make this work is through rationalization. We tell ourselves, “I know it was wrong, but my boss made me do it.” Or “My competitors do stuff that’s even worse.” Or “Nobody was really hurt.”
What can be the fallout from these forms of self-protection?
AW: Other people haven’t gone through the same reasoning process as you have. So, if you’re trying to explain yourself or apologize, you might have to deal with this disconnect.
Here’s an example. If your company makes a big mistake — say, it adopts a new billing system that overbills thousands of customers — you figure out what exactly went wrong so it doesn’t happen again.
But if you communicate to the public using mechanistic terms — “We did an audit. Here are the things that happened” — it can seem like you’re not taking enough responsibility. You feel you’re being accurate and thorough. But to others, it seems you’re not acknowledging the higher-level meaning, which is that you hurt many people.
RP: Another form of fallout may be an unconscious lowering of our own moral standards. Legal ethicist David Luban says: “When our conduct clashes with our prior beliefs, our beliefs swing into conformity with our conduct.” In other words, acts that we once recognized as wrong can become acceptable, because we’re the ones who did them.
How do we slip into these actions in the first place?
RP: It’s often very subtle. We may succumb to incrementalism, unconsciously lowering our ethical standards over time through small changes that balloon into something much bigger. Virtually every Ponzi scheme, securities fraud, and embezzlement began with relatively small numbers and grew over time.
Another bias that fuels immoral action is overconfidence. A large majority of people believe that they are more ethical than their peers and their competitors. They wind up making poor moral choices by not being careful or reflective and not seeking others’ advice.
Can good intentions set the stage for immoral actions?
RP: There’s a concept called moral licensing. Most of us keep a sort of moral scoreboard in our heads, comparing our view of ourselves as good people with our actions. If we do something good, or plan to, it can produce a surplus on that scoreboard, giving us license to cut a few moral corners here and there.
What are some ways we can catch our biases and avoid unethical decisions?
AW: It starts at a personal level. Do we want to believe that we are good, or do we want to actually be good? Taking a hard look at whether our actions line up with how we see ourselves can be uncomfortable, but it’s an important step toward being good — or at least being better.
RP: Most important might be to monitor your own rationalizations. For example, if you catch yourself thinking, “This is a bad thing, but I can do it and still be a good person, because my boss told me to,” then a little bell should go off in your head.
The same is true for moral licensing, incrementalism, and overconfidence. I tell students, “OK, now you understand how these things work, and how all of us can fall prey to them. You’ve got no excuse now. Keep your ethical antennae up at all times.” This is hard stuff.
Adrian Ward is co-author of “Making Molehills Out of Mountains: Removing Moral Meaning from Prior Immoral Actions,” published in the Journal of Behavioral Decision Making, with Chelsea Helion and Ian O’Shea of Temple University and David Pizarro of Cornell University.
Robert Prentice is faculty director of McCombs’ Ethics Unwrapped, a free resource on practical applications of behavioral ethics research, with more than 10,000 users in universities and businesses around the world.
Story by Sally Parker