Studying how the human brain works has helped scientists learn more about decision-making. Here are some findings and related tips that are especially relevant for lawyers.
This article was first published in the Wisconsin Lawyer on January 7, 2022.
Most of us learned the classic, reason-or-emotion concept of decision-making. As described long ago by Plato, human thinking is understood as akin to a chariot driver trying to control two horses, one guided by rational impulse and the other driven by irrational passions. Plato taught that when reason predominates, we are more fully human and less likely to err.
Lawyers embrace the Platonic model. If we can just put our emotions in a box, we believe, we can rely on glorious reason to carry the day for our clients and ourselves.
If only.
Recent advances in the science of decision-making undercut long-held assumptions about how people make decisions. Science now teaches that a human being’s cerebral cortex (and its deliberate, logical power) does not solely or separately rule the day. Instead, logic or reason (described as “system 2” thinking) operates alongside and in conjunction with the evolutionary brain and its quick, instinctual impulses (described as “system 1” thinking). To put this construct in everyday terms, system 1 thinking is like driving a car down an open highway while system 2 thinking is like parallel parking on a busy street.
Nobel Laureate Daniel Kahneman, one of the leaders in the developing understanding of human decision-making, explains that system 1 silently “takes over” from system 2 by introducing unconscious mental shortcuts known as “heuristics.” As detailed in his book, Thinking, Fast and Slow, these mental shortcuts “are not chosen; they are the consequence of the mental shotgun, the imprecise control we have over targeting our responses to questions.” Thus, we are not even aware of the tricks our mind is playing in order to solve a problem. “The mental shotgun makes it easy to generate quick answers to difficult questions without imposing much hard work on your lazy System 2.” Because this process occurs without conscious awareness, these shortcuts take over our thinking easily and automatically.
Armed with insights into heuristics – along with strategies to overcome routine thinking errors – lawyers can become better decision-makers. Here are two examples to whet your appetite to learn more.
1) Affect Heuristic
This devilish mental shortcut substitutes an easier, and often irrelevant, question in place of a difficult question – one that requires the use of logic and facts to answer correctly. In so doing, we get the right answer to the wrong question.
Kahneman gives this example of the affect heuristic at work. He asked the chief investment officer (CIO) of a large financial firm why the CIO had invested tens of millions of dollars in the stock of Ford Motor Company. The right answer, according to economists, would have involved an explanation of how the current market price of Ford stock was below where it should have been (that is, Ford’s stock was undervalued by the market); how the CIO had reason to know this; and thus why the CIO chose to take advantage of his insight into the true, higher stock value by buying at the below-value market price.
But that is not what the sophisticated CIO said. The CIO instead replied that he had recently attended an auto show and had been impressed by the Ford cars he saw: “Boy, do they know how to make a car,” he said. He made it clear that in making the stock investment he had trusted his gut feeling.
Kahneman points out that this sophisticated investor had avoided the difficult question of weighing Ford’s market price versus its “true” value and instead allowed the affect heuristic to take over: He liked the cars, he liked the company, and he liked the idea of owning the stock. Instead of focusing on whether Ford stock was currently underpriced, system 1 and the affect heuristic guided his judgment with a feeling of liking and disliking, with little deliberation or reasoning. “If a satisfactory answer to a hard question is not found quickly, System 1 will find a related question that is easier and will answer it.” The question the investor faced (should I invest in Ford stock?) was difficult, but the answer to an easier and related question (do I like Ford cars?) came readily to his mind and determined his choice.
Hiring decisions are one area in which the affect heuristic comes into play for lawyers. We scan a candidate’s resumé, look for points of common experience, have a 30-minute conversation, and then answer the question, “do I like this person?” If you or I do, and if our colleagues likewise like the person, the person is hired. And then we’re disappointed when the person’s skills don’t meet our needs. If all we consider is whether we like someone, then we’re not asking the right questions.
Kahneman’s latest book, NOISE: A Flaw in Human Judgment, offers a way to overcome the affect heuristic. Companies working to overcome hiring flaws now practice “structured interviewing.” One such company is Google, which, despite significant efforts to make good hiring decisions, found in an audit of its recruiting that there was “zero relationship” between the predicted and actual hiring outcomes. Google thus revamped its system. Now, it first carefully identifies the specific skills needed for a given job (for lawyers, this could be brief writing, taking depositions, client-generation abilities, and so on). Next, it requires interviewers to assess each key skill separately so that a high score in one category does not influence another category in which the rating should be low. Also, and especially important, interviewers work with a common set of predefined questions so that each candidate is being asked the same questions, without which fairly comparing the candidates becomes difficult or impossible.
While Google adds other data to the mix, including work sample tests to check job-related knowledge, the final decision also includes the judgment and intuition of the hiring committee. Instead of leading with intuition, however, as the affect heuristic wants us to do, structured interviewing delays the application of intuition until all the evidence is gathered and analyzed.
Now imagine for a moment how this approach can improve the process for hiring lawyers. Instead of a series of random interviews that yield little in the way of meaningful information, you and your colleagues can use a structured approach that first decides which skills are important, next identifies the questions likely to yield evidence about those skills, then has interviewers ask these questions of each candidate while separately recording scores for each skill category. Once this is done, you assemble the data and apply your experience and intuition to the evidence before you.
2) Confirmation Bias
Confirmation bias is a second powerful mental shortcut that can lead to poor decision-making. Because our minds dislike the discomfort of uncertainty, we want to answer questions or solve riddles as quickly as possible. We do so by creating a hypothesis and then looking for facts to support – “confirm”– that hypothesis. In short, first we pick an answer and then we look for facts to support that choice.
The problem with confirmation bias is that it closes our minds to contradictory evidence received later, which, if we considered it, could cause us to change our minds. Instead, confirmation bias takes over and causes us to ignore, or discount as untrustworthy, evidence that contradicts what we already have come to believe. To paraphrase Paul Simon’s lyric in The Boxer, “we hear what we want to hear and disregard the rest ….”
Confirmation bias often is mentioned in the political arena, with good reason. In a 2004 study, brain scans were conducted on participants as they read statements from then-presidential candidates George W. Bush and John Kerry in which the candidates clearly contradicted themselves. Republican participants were far more critical of John Kerry’s statements, ignoring the contradictions of their own candidate, while Democratic participants took George W. Bush to task and gave Kerry a pass. The brain scans revealed that as the participants evaluated the statements, the part of the brain most associated with reasoning was dormant while the parts of the brain associated with emotions and resolving conflict were highly active. “Essentially, it appears as if partisans twirl the cognitive kaleidoscope until they get the conclusions they want, and then they get massively reinforced for it, with the elimination of negative emotional states and activation of positive ones.”
So what can lawyers do to account for and offset confirmation bias? How do we think about solutions to clients’ problems so we are receptive to information that contradicts our initial assessment? How do we step away from the warm feelings offered by certainty and actually consider whether an answer is wrong?
Advice from the legendary Judge Learned Hand might help. In his June 28, 1951, Senate Committee testimony, Judge Hand warned against unquestioned certainty. Judge Hand quoted from Oliver Cromwell’s letter asking the Scots to reconsider their position and avoid what became the Battle of Dunbar. Judge Hand then urged that Cromwell’s plea – “I beseech ye in the bowels of Christ, think that ye may be mistaken.” – be inscribed over the door of every courthouse and used to begin every court session. Hand understood the value of challenging one’s own thinking and wanted that value broadly and permanently expressed in American courts.
Science has affirmed the usefulness of Judge Hand’s admonition – to consider that we “may be mistaken” – as a way to improve decision-making. Professor Charles G. Lord and colleagues tested this strategy and described their findings in Considering the Opposite: A Corrective Strategy for Social Judgment. Their work investigated whether confirmation bias could be overcome either 1) by an instruction to “be unbiased” or 2) by an instruction to “consider the opposite.” The “consider the opposite” instruction was worded as follows: “Ask yourself at each step whether you would have made the same high or low evaluations had the same study produced results on the other side.”
The experiment showed that the “consider the opposite” approach worked. While those instructed to “be unbiased” became more extreme in their beliefs, those told to “consider the opposite” were able to offset confirmation bias. The authors reported, “We cannot but conclude that Judge Hand’s advice should be taken literally.”
The military and intelligence communities long have understood the value of “red teams,” whose role is to challenge the accepted wisdom of the group. You sometimes see participants in a meeting purporting to play that role, usually announcing in advance of their remarks the qualifier, “Just to play devil’s advocate ….” But this comment suggests that going against the group consensus needs to be noted and excused, rather than expected and encouraged.
Humility drives a culture that values red teams and devil’s advocates. Leaders who value humility strengthen their decision-making by understanding they may be wrong. We all know people who are not always right but who are never in doubt. The superficial attraction of such self-assured people wears thin when the consequences of their narrow field of vision come home to roost.
Conclusion
I hope this introduction to decision-making science whets your appetite to learn more about ways to avoid common thinking errors. If so, watch for a future article on how anchoring and loss aversion play important roles in negotiating.