Tuesday, March 5, 2013
None of us wants to feel that our opinions are tainted by bias. The ability to recognize when bias influences an expert's opinion, and the skill to overcome one's own biases, are integral to an expert's credibility.
Experts make decisions, expressed as opinions, by applying analytical methods developed through training, education, and practice. Such prior experience may induce biases that lead the expert to use trusted methods without considering alternatives. Forensic science seeks to produce reliable evidence that is clearly reported (Sjerps & Meester, 2009). Experts must recognize when their own biases, and those of others, influence their decisions.
Black's Law Dictionary defines bias as "a predisposition to decide a cause or an issue in a certain way" (Garner, 2009). Prior experiences, learning paradigms, individual beliefs, and other influences can cloud the understanding of what is important. There are two types of bias: cognitive bias and motivational bias (Giannelli, 2008). Cognitive biases, which operate at the subconscious level, frequently interfere with people's ability to make good decisions. Motivational bias, which can operate at the conscious or subconscious level, results from a person's desire to deliver expected results.
Experts work in private laboratories, crime laboratories, and other settings. ASTM reported that 80% of studied laboratories showed laboratory bias (Lawrey, 2009); twenty percent of the laboratories displayed "significantly high bias." This bias was the result of interactions among many people. Griffin and Tversky (1992) attributed similar bias to people's tendency to be more confident in their judgments than is warranted by the facts. When we select evidence that is not independent of the forensic analysis, problems occur (Sjerps & Meester, 2009). Schwab (2008) showed that bias induces experts to be overconfident in rating their abilities.
The National Academy of Sciences (NAS) reported that bias is a severe problem in the forensic sciences (National Research Council, 2009). Cognitive biases were described as "common features of decision making, and they cannot be willed away." NAS reported that judges are subject to bias in their rulings. The report cites studies in which bias was introduced into half of the fingerprint examination procedures, and it recommends removing crime laboratories from the administrative control of police agencies to reduce motivational bias (National Research Council, 2009). A medical examiner's bias can be reduced if the expert is not aware of which side has hired him or her (Baer, 2005).
Research demonstrates that awareness of the source of a cognitive bias is insufficient to prevent a person from being trapped by it (Ariely, 2008; Cialdini, 2001). Arzy, Brezis, Khoury, Simon & Ben-Hur (2009) discovered that when one misleading detail was included in a patient's case description, practicing physicians misdiagnosed the case 90% of the time. Telling a control group that the description contained one misleading detail did not reduce the diagnostic error. When the misleading detail was omitted, the misdiagnosis rate fell to 30% (Arzy et al., 2009). As experts, we must be able to sort through evidence so that we do not follow a trail of misleading information to a flawed opinion.
The way information is presented is known as framing. When a problem is framed in a manner that appears logically sound, the problem solver will accept the framing and attempt to solve the problem within it (Bernstein, 1996). A study conducted at Stanford University tested the impact of framing a situation and then adding information about the decision to be made. Subjects were given sufficient information about a courtroom trial (Kahneman & Tversky, 1995). One group was given additional detail about the defendant; another group was given additional information about the plaintiff. Although the groups knew the data they received was biased, they were unable to mentally balance the information. The biased groups were more confident about an outcome favoring the side whose information was more voluminous than was the group with balanced information (Kahneman & Tversky, 1995).
Webber (2008) reported that "juries … typically base their decisions on whichever story seems most plausible to them, rather than weighing the evidence." These decisions are made regardless of whether the information is accurate. McAuliff, Kovera & Nunez (2009) expanded on Webber's findings, stating that when jurors' motivation is low or their ability to understand the presented information is poor, they rely on heuristics and on what they recognize from real-life situations. McAuliff et al. discovered that jurors are "insensitive to the presence of a confound or experimenter bias in the expert's research," yet the jurors relied on their flawed analysis of the expert's evidence when rendering a verdict.
A sharp attorney can bias a jury by framing the questions posed to the expert. Framing bias can cause the jury to view the expert as qualified or unqualified. McAuliff et al. (2009) found a positive relationship between verdicts and jurors' evaluations of the expert's evidence. They reported that jurors are not able to evaluate statistical evidence and methodologies. They also reported that "judges are unable to differentiate between valid and junk science," leading to admission of invalid research at trial. The expert must reduce these potential biases by presenting clear, easy-to-understand evidence to support an opinion.
References
Ariely, D. (2008). Predictably irrational: The hidden forces that shape our decisions. New York, NY: Harper Perennial.
Arzy, S., Brezis, M., Khoury, S., Simon, S. & Ben-Hur, T. (2009). Misleading one detail: a preventable mode of diagnostic error? Journal of Evaluation in Clinical Practice, 15, 804-809.
Baer, M. A. (2005). Is an independent medical examination independent? The Forensic Examiner, Winter, 33.
Garner, B. A. (Ed.). (2009). Black's Law Dictionary, Ninth Edition. St. Paul, MN: Thomson Reuters.
Cialdini, R. (2001). Influence: Science and practice. Boston, MA: Allyn & Bacon.
Giannelli, P. C. (2008). Confirmation bias in forensic testing. GP Solo, 25(2), 22.
Kahneman, D., & Tversky, A. (1995). Conflict resolution: A cognitive perspective. In K. Arrow et al. (Eds.), Barriers to conflict resolution. New York, NY: W. W. Norton & Company.
McAuliff, B. D., Kovera, M. B. & Nunez, G. (2009). Can jurors recognize missing control groups, confounds, and experimenter bias in psychological science? Law and Human Behavior, 33, 247-257.
National Research Council. (2009). Strengthening forensic science in the United States: A path forward. Washington, DC: The National Academies Press.
Schwab, A. P. (2008). Putting cognitive psychology to work: Improving decision-making in the medical encounter. Social Science & Medicine, 67, 1861-1869.
Webber, S. (2008). The dark side of optimism: Why looking on the bright side keeps us from thinking critically. The Conference Board Review, 45, 30-36.
Author bio
Mike Wakshull is a forensic document examiner based in Temecula, CA. He holds a graduate certificate in forensic document examination from East Tennessee State University and a master's degree in technology management from the University of Denver, and he is a certified quality engineer. Mike was Chair of the 2012 National Association of Document Examiners Conference in San Diego and presented two papers at the World Congress of Forensics in Chongqing, China, in October 2011.
Mike managed corporate quality risk management at Amgen and managed the implementation of global quality risk management procedures for Abbott Vascular. He is a major contributor to the risk management section of the ANSI/ISO standard A Guide to the Project Management Body of Knowledge.
He can be reached at mikew@quality9.com or 951-252-4929.