Most people walk around the world thinking of themselves as
pretty good, law-abiding, upstanding citizens.
Sure, you might exceed the speed limit every once in a while, but the
speed limit is a guideline. You might
make a little extra cash helping a friend with something, but not declare that
on your income tax. After all, the
government isn’t really interested in pocket change.
These kinds of justifications for doing something that
violates the law are called moral disengagement. You don’t want to change your self-concept
as a basically good person, so when you take a negative action, you reframe it to make it feel more morally acceptable.
How are you able to pull this off, though? If you break a law or cheat, you really have done
something that violates some kind of moral code. Even if you try to describe it as being
acceptable, wouldn’t your memory of doing something wrong lead to feelings of
guilt? In Edgar Allan Poe’s classic
story, The Tell-Tale Heart, a murderer
is pursued by the sound of his victim’s beating heart.
A paper in the March 2011 issue of Personality and Social Psychology Bulletin by Lisa Shu, Francesca
Gino, and Max Bazerman suggests that memory itself is one mechanism that supports
this process of moral disengagement. In
particular, when people cheat, they tend to have poor memory for aspects of the
situation that might lead to feelings of guilt.
The authors developed a scale to measure people’s overall
degree of moral disengagement. It had
questions like, “Rules should be flexible enough to be adapted to different
situations” that measure whether people want to be bound by strict codes. A preliminary study in the paper found that
people showed higher levels of moral disengagement on this scale when they
imagined themselves in a situation in which they had cheated on an exam than when
they imagined themselves with an opportunity to cheat that they did not take.
In an elaborate study, the experimenters gave people a task
developed by Nina Mazar, On Amir, and Dan Ariely that provides an experimental
setting where people can cheat.
Participants are brought together in a large group. They do a series of complicated math
problems. Then, they are given an answer
key and score their performance. After that, people pay themselves from an
envelope of money for each correct answer.
Because the room is large, people pay themselves, and there is
no real oversight by the experimenter, there is a chance for people to cheat. Mazar, Amir, and Ariely found that people
often took more money than they should have in this situation.
Shu, Gino, and Bazerman used a version of this task. Half the people given this task were paid by
the experimenter, so they could not cheat.
The other half paid themselves, so they could cheat if they wanted
to.
These two groups (no possible cheating vs. potential to
cheat) were crossed with three further conditions. One group just performed the task. A second group read an honor code that talked
about the rights and responsibilities of students before doing the math
test. A third group read the honor code
and signed it before doing the math test.
So, what happened?
The group given the opportunity to cheat took about twice the amount of
money they should have. The group that
read the honor code took about 50% more money than they should have. The group that read and signed the honor code
did not cheat much at all.
People completed the moral disengagement scale after the math
test. Those who had no opportunity to
cheat showed very low levels of moral disengagement overall, though reading the
honor code and signing it actually made them feel strongly that they should
stick to the rules. When people had the
chance to cheat, those who did not read the honor code showed a pretty high
level of moral disengagement. That is,
they used moral disengagement as a strategy for keeping themselves from feeling
guilty about cheating. Those who just
read the honor code, but didn’t sign it also showed a moderate amount of moral
disengagement. But, those who signed the
honor code showed very low levels of moral disengagement.
At the end of the study, people who read the honor code were
asked to recall as many of the items on the honor code as they could. The people who read and signed the honor code
remembered a lot of it regardless of whether they had the opportunity to cheat
or not. Interestingly, the people who just
read the honor code remembered much less about it if they had the opportunity
to cheat than if they did not.
People read the honor code before they did the math test,
so the difference in memory could not reflect differences in how the code was
learned; it had to arise from something about cheating itself. Indeed, those specific
individuals who cheated on the test were the ones who showed the worst memory
for the items on the honor code. That
means that people were systematically suppressing information that might have
made them feel guilty about their behavior.
What do these results tell us?
First, the results suggest that people have a finely honed
mechanism for helping them to justify cheating.
On the one hand, people come to think of rules as more flexible when they
have just cheated. That helps them to
view their own behavior as more socially acceptable.
At the same time, people’s memory for the rules (and
probably for their own behavior) is worse when they have cheated than when they have
not. Forgetting the details of the rules
helps people to avoid guilt.
That said, there is a hopeful side to these results. The people who read and signed the honor code
tended not to cheat. That suggests that an
organization that expresses strong moral norms, and asks its members to commit
to them actively, can promote good behavior among its people.
Cheating need not be the norm.