1. What kind of leader would you most like to work under and why? Give examples.
2. What kind of leader would you least like to work under and why? Give examples.
3. How do you make difficult decisions?

Decision Making and Decision Biases

Even well-meaning leaders can fall into decision-making biases that render their decisions unethical or problematic. Messick and Bazerman (1996, 2000) claim that many unethical organizational decisions are not really based on greed, neglect, or cruelty, but rather on the biased ways in which people process information to make their decisions. Leaders often suffer from biases regarding the way the world works, about other people, and about themselves.

Biases about how the world works:
- Ignoring the possibility that low-probability events can occur, even if they pose serious consequences later.
- Limiting the search for stakeholders and, as a result, overlooking the needs of important groups.
- Ignoring the possibility that the public may find out about an action.
- Discounting the future by putting immediate needs ahead of long-term goals.
- Underestimating the impact of decisions on other groups or stakeholders.
- Believing the world is certain instead of unpredictable.
- Failing to acknowledge and confront risk.
- Framing risk differently for followers than for leaders.
- Blaming people when larger systems are at fault.
- Excusing those who fail to act when they should.

Biases about other people:
- Believing that people in our group, culture, or circle are more normal, better, and more trustworthy, while people outside our group, culture, or rank are inferior, strange, or less trustworthy.
- Giving special consideration and aid to people in our in-group.
- Judging or evaluating people according to their group membership (stereotyping).

Biases about ourselves:
- Rating ourselves more highly than other people.
- Underestimating the likelihood that negative things like illness, accidents, failures, and addictions can happen to us.
- Believing that we can control random events or other people.
- Overestimating the contributions we make to our organization.
- Overconfidence, which prevents us from learning more about a situation.
- Believing that normal rules and obligations don't apply to us.

Contextual pressures:
- Organizations that promote conformity and obedience over dissent and innovation.
- Putting profits over people.
- Putting appearances above reality.
- A division of labor that allows subordinates to claim they were "just following orders" and superiors to claim they only set broad policies and cannot be held accountable for unethical or illegal acts done by their subordinates. When tasks are broken up into small segments, we may not even know we are doing anything wrong. For example, a low-level government secretary may unknowingly shred documents that are needed in a criminal case. Sometimes the pressure to do well on national tests may lead elementary and high school teachers to give their students more time or help than is ethically allowed.

1. When do these biases creep into your decision making?
2. Which one are you most prone to, and what has happened when you have let it influence your decision making?

Implicit Bias: Bias as a Cognitive Process

Explicit bias reflects the attitudes or beliefs that we knowingly endorse on a conscious level. In contrast, implicit bias is bias in judgment and/or behavior that stems from cognitive processes operating below our conscious awareness and without our intentional control (Staats, 2014). Implicit bias arises out of the human tendency toward snap judgment. Everyone is a "cognitive miser"; people want to avoid thinking any more than necessary when they make quick decisions (Fiske & Taylor, 1991).
In doing so, people avoid "fact checking" the "evidence" that supports their judgment or decision. So, for example, instead of having a conscious reason for an assessment (smoking is bad because one has seen a damaged lung), one makes an implicit association (a boisterous Arabic man on the street reminds one of the angry, shouting crowds of Middle Easterners seen in the media). Everyone is susceptible to implicit biases (Rutland et al., 2005), and from a young age, children develop biased associations directly from authority figures and indirectly through observation (Castelli, Zogmaister, & Tomelleri, 2009; Kang, 2012) and media exposure (Dasgupta, 2013). As adults, implicit biases, which encompass both benevolent and unfavorable assessments, are activated involuntarily and without an individual's awareness or intentional control (Blair, 2002; Rudman, 2004). Because implicit associations arise outside of conscious awareness, they do not necessarily align with an individual's declared beliefs or even the stances they would explicitly endorse (Graham & Lowery, 2004; Greenwald & Krieger, 2006; Reskin, 2005). Emotionally, people generally associate bias or prejudice with hostility or antipathy toward an out-group member. However, research suggests that episodes of implicit bias are accompanied by feelings of fear, anxiety, a sense of being startled, and discomfort, rather than emotions such as anger, hostility, or a sense of disapproval (Fazio et al., 1995; Dovidio et al., 1997). While implicit biases are deeply rooted and largely unconscious, they are nonetheless malleable and manageable when brought to awareness. Through counter-bias training, the implicit associations people form can be gradually unlearned and replaced with new mental associations (Blair, 2002; Blair, Ma, & Lenton, 2001; Dasgupta, 2013; Kang & Lane, 2010; Roos et al., 2013).

Research also suggests that making a sense of personal accountability conscious (i.e., imagining being called upon to justify one's attitudes or actions) can decrease the influence of bias (Reskin, 2000, 2005). Similarly, taking the perspective of others and considering contrasting viewpoints has shown promise at reducing automatic biases (Benforado & Hanson, 2008).

To show how pervasive implicit bias can be, Chugh, Milkman, and Akinola (2014) sent emails to more than 6,500 randomly selected professors from 259 American universities. Each email was from a (fictional) prospective out-of-town student whom the professor did not know, expressing interest in the professor's Ph.D. program and seeking guidance. The emails were identical and written in impeccable English, varying only in the name of the student sender. The messages came from students with names like Meredith Roberts, Lamar Washington, Juanita Martinez, Raj Singh, and Chang Huang, names that earlier research participants consistently perceived as belonging to either a white, black, Hispanic, Indian, or Chinese student. In total, they used 20 different names in 10 different race-gender categories (e.g., white male, Hispanic female).

What did they discover? First, despite not knowing the students, 67 percent of the faculty members responded to the emails, and remarkably, 59 percent of the responders even agreed to meet on the proposed date with a student about whom they knew little and who did not even attend their university. When Chugh's team computed the average response rate for each category of student (e.g., white male, Hispanic female), by dividing the number of responses from the professors by the number of emails sent by students in a given race or gender category, their analyses revealed that the response rates did indeed depend on students' race and gender identity. Professors were more responsive to white male students than to female, black, Hispanic, Indian, or Chinese students in almost every discipline and across all types of universities. Chugh and her colleagues found the most severe bias in disciplines paying higher faculty salaries and at private universities. For example, the discipline of business showed the most bias, with 87 percent of white males receiving a response compared with just 62 percent of all females and minorities combined.

Surprisingly, several supposed advantages that some people believe women and minorities enjoy did not materialize in their data. For example: Were Asians favored, given the model-minority stereotype they supposedly benefit from in academic contexts? No. In fact, Chinese students were the most discriminated-against group in their sample. Did reaching out to someone of the same gender or race, such as a black student emailing a black professor, reduce bias? No. The researchers saw the same levels of bias in same-race and same-gender faculty-student pairs as in pairs not sharing a race or gender (the one exception was Chinese students writing to Chinese professors). Did it help to be in a discipline with a greater representation of women and minorities? Again, no. Faculty members in those more diverse disciplines, like criminal justice, were no less likely to discriminate than those in less diverse disciplines, like statistics.
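To make the response-rate arithmetic concrete, here is a minimal sketch in Python of the computation described above. The category names and counts in it are hypothetical placeholders chosen only to echo the reported business-discipline gap, not the study's actual data.

```python
# Minimal sketch of the per-category response-rate computation described in
# the text. All category names and counts below are hypothetical
# placeholders, not data from Chugh, Milkman, and Akinola's study.

emails_sent = {
    "white male": 650,       # emails sent by senders in this category
    "Hispanic female": 650,
    "Chinese male": 650,
}
responses_received = {
    "white male": 565,       # professors who replied to those emails
    "Hispanic female": 404,
    "Chinese male": 390,
}

# Response rate = responses received / emails sent, per race-gender category.
for category, sent in emails_sent.items():
    rate = responses_received[category] / sent
    print(f"{category}: {rate:.0%} response rate")
```

With these placeholder counts, the sketch prints roughly 87%, 62%, and 60%, illustrating the kind of gap the study reports.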
3. Describe an instance when you fell into an implicit bias.
4. Describe an instance when you felt implicit bias from someone else directed against you.
5. Have you ever caught yourself in a moment of implicit bias? What happened?

REFERENCES:

Benforado, A., & Hanson, J. (2008). The great attributional divide: How divergent views of human behavior are shaping legal policy. Emory Law Journal, 57(2), 311-408.

Blair, I. V. (2002). The malleability of automatic stereotypes and prejudice. Personality and Social Psychology Review, 6(3), 242-261.

Castelli, L., Zogmaister, C., & Tomelleri, S. (2009). The transmission of racial attitudes within the family. Developmental Psychology, 45(2), 586-591.

Dasgupta, N. (2013). Implicit attitudes and beliefs adapt to situations: A decade of research on the malleability of implicit prejudice, stereotypes, and the self-concept. Advances in Experimental Social Psychology, 47, 233-279.

Dovidio, J. F., Kawakami, K., Johnson, C., Johnson, B., & Howard, A. (1997). On the nature of prejudice: Automatic and controlled processes. Journal of Experimental Social Psychology, 33, 510-540.

Fazio, R. H., Jackson, J. R., Dunton, B. C., & Williams, C. J. (1995). An individual difference measure of motivation to control prejudiced reactions. Personality and Social Psychology Bulletin, 23, 316-326.

Fiske, S. T., & Taylor, S. E. (1991). Social cognition (2nd ed.). New York: McGraw-Hill.

Graham, S., & Lowery, B. S. (2004). Priming unconscious racial stereotypes about adolescent offenders. Law and Human Behavior, 28(5), 483-504.

Greenwald, A. G., & Krieger, L. H. (2006). Implicit bias: Scientific foundations. California Law Review, 94(4), 945-967.

Kang, J., & Lane, K. (2010). Seeing through colorblindness: Implicit bias and the law. UCLA Law Review, 58(2), 465-520.

Messick, D. M., & Bazerman, M. H. (1996). Ethics for the 21st century: A decision making approach. MIT Sloan Management Review, 37(2), 9-22.

Reskin, B. (2000). The proximate causes of employment discrimination. Contemporary Sociology, 29(2), 319-328.

Reskin, B. (2005). Unconsciousness raising. Regional Review, 14(3), 32-37.

Roos, L. E., Lebrecht, S., Tanaka, J. W., & Tarr, M. J. (2013). Can singular examples change implicit attitudes in the real-world? Frontiers in Psychology, 4(594), 1-14.

Rutland, A., Cameron, L., Milne, A., & McGeorge, P. (2005). Social norms and self-presentation: Children's implicit and explicit intergroup attitudes. Child Development, 76(2), 451-466.

Staats, C. (2014). State of the science: Implicit bias review 2014. Kirwan Institute for the Study of Race and Ethnicity, The Ohio State University.