by Michael Bret Hood
An Investigator’s Guide To Ethics, Part 1
When you bring up the idea of ethics and compliance training in your organization, what kind of reaction do you get? Why do you think people react in the way that they do? In your opinion, do you believe current compliance and ethics training methods are effective?
Organizations spend hundreds of thousands of dollars each year on ethics and compliance training. Often this training is mandatory for employees, and labeling it as such may have the opposite of the intended effect.
“When someone discourages you from doing something, you often feel that your freedom is being threatened, which motivates you to regain choice and control by doing exactly the opposite.”[1]
This threat to freedom of choice, and the psychological reactance it provokes, is exactly why traditional ethics and compliance training programs do not work.
In his landmark book, Thinking, Fast and Slow, author Daniel Kahneman describes two different systems that control our thinking, labeling the processes System 1 and System 2. In System 1 thinking, your brain works on automatic pilot, making choices of which you are not always consciously aware.[2] When you suddenly swerve to avoid something in the road while driving, you are employing System 1 behavior: you react to a stimulus without taking the time to consider the effects of that action.
System 2 thinking activates the rational processes in your brain, causing you to actively consider what you deem to be relevant data before deciding to take action.[3] When you go to purchase a car, you are likely using System 2 thinking in that you have probably decided on the model of car you want, the features you want, and the price range you are willing to pay. You would probably say that on an average day you spend more time using System 2 thinking than System 1 thinking, but think back to a time when you purchased a car or negotiated some other significant expense. In your negotiations, did you agree to a price at the high end of, or just beyond, your initial limits? If you did, did you notice how System 1 immediately took over, creating reasons why you were justified in extending the upper range of your price limit?
System 1 thinking is very prevalent in how you address ethical and compliance matters. While you may think that every decision you make, including decisions involving ethical and compliance issues, is based in System 2, more often than not System 1 has taken over the decision-making process without your being aware of it. Modern ethics and compliance training, whether in person or virtual, normally relies on scenarios that, more often than not, have an obvious answer. In these situations, System 1 works jointly with System 2 to achieve the goal: complete the training in whatever manner is needed so that you can go back to doing whatever it is you deem important. Therefore, you will answer in ways that may or may not reflect how you would respond if a similar scenario occurred in real life.
In addition, some ethical and compliance training utilizes a third-party perspective in which you are placed in the role of an unattached observer charged with gauging the proper and/or improper behavior of others. Seeing facts vicariously alters your answers. For example, how would you feel if you were shown a video of someone who clearly cuts in front of a car, causing the other driver to slam on the brakes? If you were a police officer who pulled the person over for that infraction, what would you say and do to that driver? Would you lecture them? Would you give them a ticket?
Conversely, recall a time when you inadvertently cut into another driver’s lane because you didn’t see them there. Would you react in the same way as you did above? Chances are you would be quick to justify your driving because you were fully aware of your intentions. You didn’t mean to drive recklessly, but likely neither did the other driver in the previous scenario. When you are actively placed in a situation as opposed to witnessing it, the psychological processes that occur change significantly.
Max H. Bazerman and Ann E. Tenbrunsel refer to this battle as the “want self” and the “should self”.[4] In this premise, you know what you should do, but knowing what you should do doesn’t always coincide with what you want to do.
“When it comes time to make a decision, our thoughts are dominated by thoughts of how we want to behave, thoughts of how we should behave disappear.”[5]
This “want self” versus the “should self” phenomenon is easily reflected when you diet to lose weight. The “want self” will clearly urge you to eat something that violates the parameters of your diet while the “should self” urges you to refrain. Which entity typically wins the battle?
Traditional ethics and compliance training typically neglects the battle between the “want self” and the “should self” by focusing on decision outcomes rather than the processes that produce the decision. In effect, it ignores System 1.
“Ethics training presumes that emphasizing the moral components of decisions will inspire executives to choose the moral path. But the common assumption this training is based on – that executives make explicit trade-offs between behaving ethically and earning profits for their organizations – is incomplete. This paradigm fails to acknowledge our innate psychological responses when faced with an ethical dilemma.”[6]
When you are faced with an ethical dilemma that has positive ramifications for you, System 1 is likely to re-arrange the dilemma to allow you to maintain your positive self-image. “When we fail to recognize a decision as an ethical one, whether due to our own cognitive limitations or because external forces cause ethical fading, this failure may very well affect how we analyze the decision and steer us toward unintended, unethical behavior.”[7]
Enron Corporation’s Code of Ethics booklet ran sixty-five pages. Former CEO Kenneth Lay opened it with an introduction to Enron’s corporate ethics in which he promised to conduct business in both legal and moral ways, and all Enron employees were required to sign the code as a condition of employment.[8] Yet despite this extensive ethical code, Enron executives created a counter-culture that negated any training, ethics, and compliance programs. Enron Corporation and its executives were accused, among other things, of avoiding and evading taxes, bribing foreign officials, manipulating markets to the detriment of consumers, falsifying accounting records, and conspiring to deceive regulatory officials as well as the investing public.[9] In Enron’s Code of Ethics booklet, Lay wrote,
“Enron’s reputation finally depends on its people, on you and me. Let’s keep that reputation high.”
Formal documents, systems, and training will never be as powerful as informal systems. Ralph Larsen, a former CEO of Johnson & Johnson, once said,
“All the laws in the world cannot ensure that corporate executives will observe them day in and day out.”[10]
Investigators need not only to look at the ethical codes of conduct, compliance rules and procedures, and training attended, but also to study the informal mechanisms that control group behavior throughout the organization. Executives clearly exert influence through the example they set, but the things they choose to overlook are just as important to the investigator. Failure to discern the informal attitudes and behaviors of the organization can lead to incorrect assumptions as well as irrelevant and unnecessary investigative paths.
For ethical and compliance training programs to be successful, they have to address natural psychological processes such as bounded ethicality, ethical fading, and motivated blindness. These processes are rooted in System 1 and are difficult to overcome unless you are consciously aware of their effects. While you may never be able to truly replicate the battle between the “want self” and the “should self,” the participant should be immersed in the experience through multi-sensory perception so that the ethical dilemma or scenario seems as real as possible. Approaches such as filmmaker Edouard Getaz’s Inside Risk (www.insiderisk.com) combine interactive filmmaking with fully immersive decision-making. By placing people inside the problem and making them part of the decision-making process, you can create and improve an informal culture of ethical behavior and compliance. Instead of judging events, participants are transported into them, which, if done effectively, can initiate the same psychological processes in the brain as if they were actually involved. It is in these moments that you are more likely to learn how you would really react to various ethical and compliance dilemmas, as opposed to reading a scenario and consciously or unconsciously acting in the way you believe you are expected to act.
When conducting a risk assessment, an investigator needs to recognize whether an organization has taken the next step into 21st-century ethics and compliance training techniques. Organizations that haven’t updated are taking the chance that their boards as well as their employees will succumb to the “want self” instead of the “should self,” just as the Enron executives did.
See why people consistently rate Bret as a top presenter by booking him for your next event. With 25 years’ experience as an FBI Special Agent who taught ethics to new Special Agents, Bret’s unique style of teaching and his completely interactive methods help people understand how good people can end up doing bad things while believing in their own minds that they are acting ethically. An investigator who understands how the brain can trick itself has a better chance of successfully resolving issues while also creating both a formal and an informal culture that promotes ethical behavior as well as compliance with applicable laws. If you are interested in one-of-a-kind training described by previous participants as genuinely thought-provoking, fun, and incredibly worthwhile, message Bret through LinkedIn or email him at [email protected] to find out how you can increase the chances of preventing fraud and other unethical behavior in your organization.
[1] Grant, A. (2013). Why People Often Do The Exact Opposite Of What They’re Told, Business Insider, August 14, 2013. Taken from the Internet on March 21, 2017 at http://www.businessinsider.com/why-people-dont-follow-directions-2013-8
[2] Kahneman, D. (2015). Thinking, Fast And Slow, New York: Farrar, Straus and Giroux.
[3] Ibid.
[4] Bazerman, M. & Tenbrunsel, A. (2011). Blind Spots: Why We Fail To Do What’s Right And What To Do About It, Princeton University Press, 2011. pp. 67-68
[5] Ibid.
[6] Bazerman, M. & Tenbrunsel, A. (2011). Blind Spots: Why We Fail To Do What’s Right And What To Do About It, Princeton University Press, 2011. p. 4
[7] Bazerman, M. & Tenbrunsel, A. (2011). Blind Spots: Why We Fail To Do What’s Right And What To Do About It, Princeton University Press, 2011. p. 31
[8] Miller, M. (2002). Enron’s Ethics Code Reads Like Fiction, Columbus Business First, April 1, 2002. Taken from the Internet on March 21, 2017 at http://www.bizjournals.com/columbus/stories/2002/04/01/editorial3.html
[9] Johnson, C. (2003). Enron’s Ethical Collapse: Lessons For Leadership Educators, Journal of Leadership Education, Vol 2(1), Summer 2003. Taken from the Internet on March 21, 2017 at http://www.journalofleadershiped.org/attachments/article/30/JOLE_2_1_Johnson.pdf
[10] Bazerman, M. & Tenbrunsel, A. (2011). Blind Spots: Why We Fail To Do What’s Right And What To Do About It, Princeton University Press, 2011. p. 118