Imagine that you go to City Hall for a construction permit to renovate your house. The employee who receives your form says that, because of the great number of applications the office has received, the staff will take up to nine months to issue the permit. But if you give her $100, your form will make it to the top of the pile. You realize she has just asked for a bribe: an illicit payment to obtain preferential treatment. A number of questions are likely to go through your head. Will I pay to speed things up? Would any of my friends or relatives do the same? You would probably not wonder, however, whether being exposed to the request would, in and of itself, affect a subsequent ethical decision. That is the kind of question behavioral researchers ask to investigate how corruption spreads.
The extent of bribery is hard to measure, but estimates from the World Bank suggest that corrupt exchanges involve $1 trillion annually. In 2018 Transparency International reported that more than two thirds of the 180 countries it surveyed got a score of less than 50 on a scale from 0 (“highly corrupt”) to 100 (“very clean”). Major scandals regularly make global headlines, such as when Brazilian construction company Odebrecht admitted in 2016 to having paid upward of $700 million in bribes to politicians and bureaucrats in 12 countries. But petty corruption, involving small favors between a few people, is also very common. Transparency International’s Global Corruption Barometer for 2017 shows that one in every four of those surveyed said they had paid a bribe when accessing public services in the previous year, with almost one in three reporting such payments in the Middle East and North Africa.
Corruption, big or small, impedes the socioeconomic development of nations. It affects economic activities, weakens institutions, interferes with democracy and erodes the public’s trust in government officials, politicians and their neighbors. Understanding the underlying psychology of bribery could be crucial to tackling the problem. Troublingly, our studies suggest that mere exposure to corruption is corrupting. Unless preventive measures are taken, dishonesty can spread stealthily and uninvited from person to person like a disease, eroding social norms and ethics—and once a culture of cheating and lying becomes entrenched, it can be difficult to dislodge.
Suppose you refused the City Hall employee’s request for a bribe. How would the experience influence your response to a subsequent ethical dilemma? In laboratory studies we conducted with behavioral researchers Vladimir Chituc, Aaron Nichols, Heather Mann, Troy Campbell and Panagiotis Mitkidis, which are currently under review at an academic journal, we sought an answer to that question.
We invited individuals to our university’s behavioral lab to play a game that involved throwing a virtual die for a reward. Everyone was told that they would be compensated based on the outcome of multiple rolls. In practice, however, they could misreport their rolls to earn more money. So all participants faced a conflict between playing the game by the rules and behaving dishonestly to earn more. We created this setup to assess how individuals balance external and internal—or psychological—rewards when making ethical decisions. Research that Nina Mazar, On Amir and one of us (Ariely) published in 2008 indicates that most people act unethically to the extent that they can benefit while also preserving their moral self-image—an observation they described as the theory of self-concept maintenance.
Our game involved rolling a virtual die 30 times on iPads. Many behavioral economists have used similar paradigms involving physical dice and coins to assess dishonesty in so-called decontextualized games—that is, games that are not affected by social or cultural norms. Prior to each roll, participants were instructed to choose a side of the die in their mind—top or bottom—and report their choice after seeing the outcome of the roll. They would earn a fixed amount of money per dot on the side they reported each time. So everyone had a financial incentive to cheat by reporting the high-paying side. For example, if the outcome of the roll was two on the top of the die and five on the bottom of the die, people might be tempted to say they had chosen “bottom” before the roll even if they had not.
This paradigm does not allow us to know whether someone cheated in a specific roll. Nevertheless, when results are aggregated across all rolls and participants in a group, the proportion of favorable rolls chosen can be compared against chance (50 percent) to assess the magnitude of dishonesty.
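The measurement logic above can be sketched in a short simulation. This is an illustrative model, not the study’s actual code or data: the participant counts, the cheating probability and the assumption that a cheater always reports the high-paying side are all hypothetical.

```python
import random

def simulate_session(n_participants=100, n_rolls=30, p_cheat=0.2, seed=42):
    """Simulate the die-reporting game described above.

    Each roll exposes a 'top' and a 'bottom' face. An honest report
    matches the side privately chosen before the roll, which is the
    favorable (higher-paying) side 50 percent of the time in expectation.
    With probability p_cheat, a participant is a cheater (a hypothetical
    simplification) who always claims to have chosen the high side.
    Returns the aggregate proportion of favorable reports.
    """
    rng = random.Random(seed)
    favorable = 0
    total = 0
    for _ in range(n_participants):
        cheats = rng.random() < p_cheat
        for _ in range(n_rolls):
            if cheats:
                favorable += 1                    # always reports high side
            else:
                favorable += rng.random() < 0.5   # honest: 50/50 by chance
            total += 1
    return favorable / total

# No single roll reveals a lie, but the aggregate rate does:
# anything reliably above 0.5 quantifies the group's dishonesty.
rate = simulate_session()
print(round(rate, 3))
```

With a fifth of participants cheating on every roll, the expected aggregate rate is 0.2 × 1.0 + 0.8 × 0.5 = 0.6, so the excess over chance maps directly onto the magnitude of cheating in the group.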
After participants received instructions about the game and how they would make money in the session, which they would get to take home, they were randomly assigned to a low- or a high-payment version. Those in the high-payment game would do exactly the same thing as those in the low-payment game but earn 10 times more. Everyone was told about the existence of the other game. Then, half the participants in the low-payment condition were offered the possibility of paying a bribe to be switched to the high-payment game.
The research assistant administering the session framed that opportunity as illegal to engender a moral dilemma similar to one that might arise in real life. The person mentioned that the boss was not around and that the participant could easily be switched to the high-paying game without anyone finding out. Thus, we ended up with three groups of people: low-payment no bribe, high-payment no bribe, and bribe exposed; the last group could be further split into bribe payers and bribe refusers. This arrangement allowed us to assess how ethically those exposed to the bribe would behave after having encountered the offer.
We administered three versions of the test to a total of 349 individuals in our behavioral lab. In the first two studies, some participants were offered the possibility of paying a $2 bribe to be placed in the high-payment version of the game, and 85 percent of them paid. Crucially, we observed that in the games they went on to play, bribe-exposed participants cheated more than participants who did not receive such a request. In the second study, for example, bribe-exposed participants cheated 9 percent more than those who played the high-payment version of the game and 14 percent more than participants who played the low-payment version of the game but had not been asked for a bribe.
In a third study, we tested whether people act more immorally when they pay a bribe or when they are merely exposed to one. We made the bribe costlier at $12, and 82 percent turned down the request, giving us a large sample size of bribe refusers. Disturbingly, even when we limited our analysis to this group of apparently ethical individuals, we found that bribe-exposed individuals cheated more than those who did not receive the illegal request. Taken together, results from these three experiments suggest that receiving a bribe request erodes individuals’ moral character, prompting them to behave more dishonestly in subsequent ethical decisions.
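Comparisons between groups like those above boil down to asking whether one group’s favorable-report rate exceeds another’s beyond what chance allows. A standard two-proportion z-test captures that logic; the counts below are hypothetical stand-ins, not the studies’ actual data.

```python
import math

def two_prop_ztest(fav_a, n_a, fav_b, n_b):
    """Two-proportion z-test: is group A's rate of favorable reports
    higher than group B's beyond chance? Returns (z, one-sided p)."""
    p_a, p_b = fav_a / n_a, fav_b / n_b
    p_pool = (fav_a + fav_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p = 0.5 * math.erfc(z / math.sqrt(2))  # one-sided p via normal CDF
    return z, p

# Hypothetical counts: 30 bribe-exposed and 30 control participants,
# 30 rolls each (900 reports per group), 66% vs. 60% favorable reports.
z, p = two_prop_ztest(594, 900, 540, 900)
```

Here a 6-percentage-point gap over 900 reports per group yields z ≈ 2.6, small enough to arise by chance well under 1 percent of the time, which is the sense in which a difference of this size between conditions counts as evidence of extra cheating.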
Our work suggests that bribery is like a contagious disease: it spreads quickly among individuals, often by mere exposure, and as time passes it becomes harder and harder to control. This is because social norms—the patterns of behavior that are accepted as normal—impact how people will behave in many situations, including those involving ethical dilemmas. In 1991 psychologists Robert B. Cialdini, Carl A. Kallgren and Raymond R. Reno drew the important distinction between descriptive norms—the perception of what most people do—and injunctive norms—the perception of what most people approve or disapprove of. We argue that both types of norms influence bribery. Simply put, knowing that others are paying bribes to obtain preferential treatment (a descriptive norm) makes people feel that it is more acceptable to pay a bribe themselves. Similarly, thinking that others believe that paying a bribe is acceptable (an injunctive norm) will make people feel more comfortable when accepting a bribe request. Bribery becomes normative, affecting people’s moral character.
In 2009 Ariely, with behavioral researchers Francesca Gino and Shahar Ayal, published a study showing how powerful social norms can be in shaping dishonest behavior. In two lab studies, they assessed the circumstances in which exposure to others’ unethical behavior would change someone’s ethical decision-making. Group membership turned out to have a significant effect: When individuals observed an in-group member behaving dishonestly (a student with a T-shirt suggesting he or she was from the same school cheating in a test), they, too, behaved dishonestly. In contrast, when the person behaving dishonestly was an out-group member (a student with a T-shirt from the rival school), observers acted more honestly.
But social norms also vary from culture to culture: What is acceptable in one culture might not be acceptable in another. For example, in some societies giving gifts to clients or public officials demonstrates respect for a business relationship, whereas in other cultures it is considered bribery. Similarly, gifts for individuals in business relationships can be regarded either as lubricants of business negotiations, in the words of behavioral economists Michel André Maréchal and Christian Thöni, or as questionable business practices. And these expectations and rules about what is accepted are learned and reinforced by observation of others in the same group. Thus, in countries where individuals regularly learn that others are paying bribes to obtain preferential treatment, they determine that paying bribes is socially acceptable. Over time the line between ethical and unethical behavior becomes blurry, and dishonesty becomes the “way of doing business.”
Interestingly, in cross-cultural research we published in 2016 with behavioral researchers Heather Mann, Lars Hornuf and Juan Tafurt, we found that people’s underlying tendency to behave dishonestly is similar across countries. We studied 2,179 native residents in the U.S., Colombia, Portugal, Germany and China. Using a game similar to the one in our bribing studies, we observed that cheating levels in these countries were about the same. Regardless of the country, people were cheating to an extent that balanced the motive of earning money with that of maintaining a positive moral image of themselves. And contrary to commonly held beliefs (which we assessed among a different set of participants) about how these countries vary, we did not find more cheaters in countries with high corruption levels (such as Colombia) than in countries with low corruption levels (Germany).
So why do we observe huge international differences in levels of corruption and bribery? It turns out that although individuals’ innate tendencies to behave honestly or otherwise are similar across countries, social norms and legal enforcement powerfully influence perceptions and behaviors. In 2007 economists Raymond Fisman and Edward Miguel published a study of parking violations among United Nations diplomats living in Manhattan. They found that diplomats from high-corruption countries accumulated more unpaid parking violations. But when enforcement authorities could confiscate diplomatic license plates of offenders, the number of unpaid violations decreased significantly. Their work suggests that cultural norms and legal enforcement are key factors in shaping ethical behavior.
But what are the psychological mechanisms involved in the exchange of a bribe? Behavioral researchers have examined these in the lab and the field. For example, in recent research behavioral economists Uri Gneezy, Silvia Saccardo and Roel van Veldhuizen studied the psychology behind the acceptance of bribes. They conducted a lab study with 573 participants, divided into groups of three. Two participants competed for a prize by writing jokes, and the third chose the winner. The writers could bribe the referees by including $5 in an envelope when submitting their entry. Gneezy and his colleagues studied how referees reacted and how receiving a bribe distorted their judgment. They found that when the referees could keep only the winner’s bribe, bribes distorted their judgment, but when the referees could keep the bribe regardless of who won, bribes no longer influenced their decision. This study suggests that people are swayed by bribes out of self-interest and not because they want to return the favor to whoever paid the bribe.
In related studies, published in 2017, Nils Köbis, now at the University of Amsterdam, and his colleagues tested the idea that severe corruption emerges gradually through a series of increasingly dishonest acts. They found that, in fact, participants in their four experiments were more likely to behave unethically when given the opportunity to do so in an abrupt manner—that is, when tempted with a single opportunity to behave unethically for a large gain rather than when faced with a series of choices for small benefits. As the researchers concluded, “sometimes the route to corruption leads over a steep cliff rather than a slippery slope.”
Given how damaging corruption is to societies, we believe it is crucial to further probe its psychological roots. Three areas beg for future research. First, we need a fuller accounting of what drives a culture toward less ethical behavior. What, for example, prompts someone to ask for a bribe? What impacts the likelihood of accepting a bribe? Second, what are the consequences of bribery? Clearly, bribery and, more broadly, dishonesty are contagious. But future research could investigate the lasting effects of bribery over time and across domains: What happens when people are consistently exposed to bribes? Does recurring exposure to bribery strengthen or weaken the effect of bribes on individual dishonesty? Last, what kinds of interventions would be most effective in reducing bribe solicitations and acceptance?
Going back to our initial example, we see that the corrupt exchange the City Hall employee offered might seem trivial, or at least like an isolated event. Sadly, a single bribe request affects both the requester and the recipient. And notably, its dominolike effect can reach many individuals over time, spreading quickly across a society and, if left unchecked, entrenching a culture of dishonesty.
This article was originally published with the title “Contagious Dishonesty” in Scientific American 321, 3, 62–66 (September 2019). doi:10.1038/scientificamerican0919-62