by Michael Bret Hood
If you had to rate yourself on how ethical you are, using a scale of 0 (not ethical at all) to 100 (absolutely ethical), what score would you give yourself if the average person scored 50? Did you rate yourself above average? If you did, you are like most people. But if you stopped for a second and really considered the question, are you as ethical as you think you are?
It is relatively easy to say that you would act ethically in a given situation, but actually doing so is harder than you think.
“Most of us drastically underestimate the degree to which our behavior is affected by incentives and other situational factors.”[1]
Acting ethically seems easy enough in theory but what you fail to take into account is the inner battle that transpires between knowing what you should do versus knowing what you want to do.
“The should self dominates before and after we make a decision, but the want self often wins at the moment of decision.”[2]
As an investigator, you will frequently focus on the actions of the perpetrator when he/she succumbed to the want self. It would be easy for you to say that your suspect was simply not as ethical as people believed, but that explanation wouldn’t take into account how our brains work. Most people who have acted unethically would likely have predicted that they would act ethically in a similar situation, but scientific research has shown that our predictions of our own behavior are frequently incorrect.
“General principles and attitudes drive our predictions; we see the forest but not the trees. As the situation approaches, however, we begin to see the trees, and the forest disappears. Our behavior is driven by details, not abstract principles.”[3]
This has led to a concept called bounded ethicality.
“Over the past couple of decades, psychologists have documented many different ways that our minds fail to see what is directly in front of us. They’ve come up with a concept called ‘bounded ethicality’: that’s the notion that cognitively, our ability to behave ethically is seriously limited, because we don’t always see the ethical big picture.”[4]
If your mind manipulated the facts to alter the ethical issue to a more favorable perspective, would it be easier for you to make an unethical choice? Think about a fraudster. How many of them have stolen millions of dollars but, upon being caught, provided the excuse that he/she intended to pay it all back?
Investigators tend to believe that their suspects operate according to rational choice theory, weighing each decision based on which option provides the greatest utilitarian benefit. An investigator who assesses motives and evidence using this model has a significant chance of being wrong.
“The reasons we make decisions are not always rational and can’t be isolated from who we are, where we are, or maybe even how long it took us to decide what outfit to wear that morning.”[5]
Among the factors that affect your ability to make ethical decisions are your implicit biases (biases of which you are consciously unaware), advantageous comparison, diffusion of responsibility, and framing.
Contrary to popular opinion, implicit bias does not equate to racism or sexism; rather, it is how we generate labels for people as we come into contact with them. The next time you are walking down the street and notice someone, stop for a second and try to determine whether you have attached some kind of label to the person you just saw. The label could be an assumed character trait, an assumed profession, or an assumed designation such as dangerous or funny. Whatever label you choose probably originates from your life experiences, your culture, and/or your personal beliefs. This is your implicit bias at work.
Whether you realize it or not, the labels you assign to people influence your decision-making in ways of which you are not always aware. If you choose a favorable label for a person, your decision is likely to be different than if you labeled the person unfavorably. The classic “us vs. them” groupings can lead to unethical choices caused by bounded ethicality.
“The human tendency to favor the self and ingroup creates a gravitational pull toward one set of interests, even when that pull is quite invisible, even to the self.”[6]
Given how the brain works, it is not uncommon for people to exhibit unethical behavior when assisting people they like while fully believing their decisions and actions to be completely ethical.
Another way your brain clouds your ethical decisions is through advantageous comparison: justifying your behavior, decision, and/or action by citing or remembering someone else who acted more egregiously than you did. This method allows you to preserve your positive self-image because the unfavorable comparison makes your action, decision, and/or behavior appear more acceptable. A common refrain in advantageous comparison is, “I know I did this, but look at what the other guy did!” What would happen if you chose to compare your actions to someone who had acted ethically? Would you be able to preserve your positive self-image if you chose a comparison that made your action look unfavorable?
Crowds can also affect your decision of whether or not to act ethically. When the people around you are acting unethically, it is easier for you to diffuse your personal responsibility and follow suit. Setting ethics aside for a moment, can you recall a time when you were influenced by the actions of a group? Did you find yourself adopting the behaviors of the leaders without fully comprehending why? Protesters who turn into looters after seeing other protesters do the same are examples of people who suffer from bounded ethicality because of crowd diffusion. The typical excuse offered to justify such behavior is,
“Everybody was doing it.”[7]
Advantageous comparison and crowd diffusion have similar effects on your ethical boundaries. By watching and/or comparing your actions to others, you can unconsciously expand what is acceptable behavior, thereby adjusting your ethical values to fit your desires in that moment. Organizational leaders who act unethically invariably increase the chances that their followers as well as the organizational culture will mirror their unethical behavior.
In addition, how you frame a decision has a direct impact on whether or not you choose to act ethically. Dr. Ann Tenbrunsel demonstrated this by separating volunteers into two groups, asking one group to think about a business decision and the other to think about an ethical decision. After distracting them for a moment with an unrelated task, Tenbrunsel presented both groups with an opportunity to gain a personal benefit by cheating. Participants who had been primed to think about a business decision were significantly more likely to lie than those primed to think about an ethical decision.
“The business frame cognitively activates one set of goals – to be competent, to be successful; the ethics frame triggers other goals. And once you’re in a, say, business frame, you become really focused on meeting those goals, and other goals can completely fade from view.”[8]
Inasmuch as business decisions sometimes conflict with ethical decisions, converting an ethical dilemma into a business decision is an easy way for you to step outside your normal ethical boundaries.
In the 1970s, Ford safety employees pre-testing the Pinto discovered a design flaw that frequently caused the gas tank to explode if the vehicle was involved in a rear-end collision. Despite knowing about the flaw, Ford executives rushed the Pinto into production. Before deciding to proceed, they performed a cost-benefit analysis and determined that repairing the flaw (about $11 per vehicle at that time) would be more expensive than paying the families whose loved ones were injured or killed while driving or riding in the Pinto.[9] What are the chances that these Ford executives would have rated themselves above average in how ethical they were compared to others?
The Ford executives effectively took the ethical dilemma out of the decision-making process when they performed the cost-benefit analysis. Looking strictly at the numbers allowed the executives to retain their positive self-image while taking an action that would achieve business goals, thereby maximizing the company’s earnings. What if Ford executives had changed the frame of their discussion from a simple cost-versus-benefit perspective to something more personal? What if the question they asked was,
“Would I allow someone I love dearly to drive the Pinto in its current design state?”
Do you think that looking at the issue from a more personal perspective would have changed the ethical perspective of their decision?
Bounded ethicality occurs not only in your suspects and your co-workers but also in you. While an investigator is charged with seeking the truth, your brain sometimes interferes in the process without your knowing it. Decisions that seem ethical may not, in actuality, be as ethical as you perceive them to be. When you are investigating a case and wondering why a person would do such a thing, don’t automatically default to the belief that the individual fully understood the ethical implications of what he/she chose to do. It is commonplace for an investigator to label his/her suspect as a bad person, but doing so can potentially lead the investigator into his/her own version of bounded ethicality. An investigator has to consider the fact that, on some occasions, bad things are actually done by good people.
[1] Bazerman, M. & Tenbrunsel, A. (2011). Blind Spots: Why We Fail To Do What’s Right And What To Do About It, Princeton University Press. P. 37
[2] Ibid. P. 68
[3] Ibid.
[4] Joffe-Walt, C. & Spiegel, A. (2012). Psychology of Fraud: Why Good People Do Bad Things, NPR, May 1, 2012. Taken from the Internet on April 5, 2017 at http://www.npr.org/2012/05/01/151764534/psychology-of-fraud-why-good-people-do-bad-things
[5] Ye, L. (2015). The Psychology Behind How We Make Choices, HubSpot, May 29, 2015. Taken from the Internet on April 5, 2017 at https://blog.hubspot.com/agency/psychology-choices#sm.0009slfciodgfbc11gw1kllwrpz7g
[6] Chugh, D., Bazerman, M.H., & Banaji, M.R. (2005). Bounded Ethicality As A Psychological Barrier To Recognizing Conflicts of Interest. In Conflicts of Interest: Problems and Solutions from Law, Medicine and Organizational Settings. Cambridge University Press.
[7] Riggio, R. (2014). The Science of Why Good People Do Bad Things, Psychology Today, November 1, 2014. Taken from the Internet on April 5, 2017 at https://www.psychologytoday.com/blog/cutting-edge-leadership/201411/the-science-why-good-people-do-bad-things
[8] Joffe-Walt, C. & Spiegel, A. (2012). Psychology of Fraud: Why Good People Do Bad Things, NPR, May 1, 2012. Taken from the Internet on April 5, 2017 at http://www.npr.org/2012/05/01/151764534/psychology-of-fraud-why-good-people-do-bad-things
[9] Bazerman, M. & Tenbrunsel, A. (2011). Blind Spots: Why We Fail To Do What’s Right And What To Do About It, Princeton University Press. Pp. 70-71