Friday, July 3, 2015

DNA Evidence Causes Wrongful Convictions 15% of the Time!?

Today's Forensic Magazine reports that
Recent estimates indicate that as many as 15 of every 100 incarcerated offenders where DNA was an element in their trial may be wrongfully convicted because of misused DNA evidence matching techniques. One common reason for this error is scientifically invalid testimony on forensic evidence. [FULL STORY]
Did I read this correctly? Fifteen percent -- one out of every six or seven -- of the men and women in prison are innocent? That is almost four times the 4% rate estimated in Gross et al. (2013). But that study, the leading one in the field, was limited to defendants with death sentences. Their cases would have been scrutinized especially carefully, so the 4% figure is probably on the low side. Maybe 15% for the broader prison population is realistic.

But wait. Forensic Magazine was not referring to the entire prison population, but only to those cases that went to trial. And within that group, 15% is the figure for defendants for whom "DNA was an element in their trial[s]." Because almost no defendants introduce DNA evidence at trial, these must be cases in which the prosecution has linked the defendant to the crime by DNA testing and the state "misused DNA ... techniques." Can it be that 15% of inmates convicted because of DNA tests are falsely convicted? That is an intolerable error rate for what is supposed to be the gold standard in forensic science. Is it time to halt DNA testing until we can find out why it generates so much "scientifically invalid testimony"?

Or are the editors of Forensic Magazine seeking sensational news rather than reading what they are reporting? Let's look at the "recent estimates." To get to them requires a few steps backwards. Forensic Magazine lists the National Institute of Justice (NIJ) -- a part of the U.S. Department of Justice -- as the source of its story. NIJ funded the Rand Corporation to do a study on improving expert performance in computing a posterior probability (which is not what forensic experts routinely do). Rand investigated two questions: "Is bias reduced when experts do not know whether the prosecution or defense is the hiring party?," and "Is bias reduced by expert consensus feedback, wherein expertise is culled from multiple sources and those sources examine the majority view to move toward a group consensus?"

The summary at the start of the Rand research report begins with the very two sentences that Forensic Magazine broadcast. The researchers gave no references to indicate where these "recent estimates" came from, but page one of the report supplies an apparent answer. It reads as follows:
As many as 15 of every 100 incarcerated offenders may be wrongfully convicted, according to DNA evidence–matching techniques (Roman et al., 2012). One reason for this is scientifically invalid testimony on forensic evidence (e.g., Gould et al., 2012; Innocence Project, 2013).
This passage reveals that Rand misrepresented the "recent estimates" from the start: they are not estimates of convictions caused by misused DNA evidence at all. Roman et al. (2012) is an NIJ-funded study from the Urban Institute that involved no cases of wrongful convictions because of misused DNA evidence. Instead, that study "analyzed the results of new DNA testing of old physical evidence from 634 sexual assault and homicide cases that took place in Virginia between 1973 and 1987." In other words, it looked at postconviction exonerations in cases in which the trials took place before DNA evidence was even available. This study of wrongful convictions for reasons other than faulty DNA evidence "found that in five percent of homicide and sexual assault cases DNA testing eliminated the convicted offender as the source of incriminating physical evidence. When sexual assault convictions were isolated, DNA testing eliminated between 8 and 15 percent of convicted offenders and supported exoneration."

Are there any cases of false convictions caused by DNA evidence? Almost certainly. Does the rate of such convictions approach 15%? Neither Forensic Magazine, NIJ, nor the Rand Corporation offers any reason to believe it.

Friday, June 26, 2015

Peering into Peer Review

After the Supreme Court, in Daubert v. Merrell Dow Pharmaceuticals, listed publication in peer reviewed scientific journals as an important factor in ascertaining whether a scientific method produces results that can be admitted in court, the term "peer review" became commonplace in opinions on scientific evidence.

But it takes more than a bland statement that publications appear in a peer reviewed journal to show that a scientific discipline treats the claims in those publications as worthy of respect. For example, some experts, like celebrity doctor Andrew Weil, tout publication in the Journal of Clinical Ecology as indicative of credible scientific findings. But courts, following the advice of the broader medical community, do not. It is not enough to have a Society for Clinical Ecology (reinvented as the American Academy of Environmental Medicine) publish a journal reviewed by peers who are true believers. Rather, "[i]t is the publication of ... basic research, using accepted research techniques, providing scientific (as opposed to anecdotal) evidence ... to which the Daubert inquiry is directed." 1/

In this spirit, the National Commission on Forensic Science recently emphasized that “[s]cientific literature comprises manuscripts that report empirical data and have been independently peer-reviewed for quality, originality, and relevance to the discipline. To strengthen confidence in results obtained in forensic examinations, each forensic discipline must identify resources that are scientifically credible, valid and with a clear scientific foundation. Such foundational literature in forensic practice should conform to norms across all scientific disciplines.”

In this light, consider the opinion of the U.S. District Court for the Eastern District of New York in United States v. Ashburn. 2/ In response to a motion to exclude testimony from "Detective Salvatore LaCova ... that all of the cartridge casings and deformed bullets ... were fired from [a particular] gun," the court ran through the usual checklist of Daubert factors. It quickly found "that the AFTE [Association of Firearm and Tool Mark Examiners] methodology has been published and subject to peer review, weighing in favor of admission of LaCova's testimony." This positive endorsement followed entirely from the fact that "[t]he AFTE itself publishes within the field of toolmark and firearms identification." It is not clear the court knew anything more about the journal than the remarks in earlier district court cases that "articles submitted to the AFTE Journal are subject to peer review" and that the "AFTE Journal [has a] formal process for the submission of articles." 3/

This "formal process" apparently consists of an editorial board of firearms and tool mark examiners who screen articles, plus “post-publication review by the members of the Association of Firearm & Tool Mark Examiners,” whose unsolicited reactions appear in the “AFTE Peer Review and Letters to the Editor” section of the Journal. 4/ Surely, postpublication review limited to the organization’s members does not meet the Commission's call for publications that “conform to norms across all scientific disciplines.”

Information in journals that exist at the fringes or even outside the corpus of scientific literature can be quite valuable. But there is no excuse for courts to continue to rely on the existence of such publications as proof of scientific validity. After all, the nature of the AFTE Journal is no secret to the legal profession. One prominent law review article explained:
Another major limitation of the current forensic science culture relates to several of the publication venues for the pattern identification field. Several of the most significant journals focused on publishing pattern identification research simply do not comport with broader norms of access, dissemination, or peer review typically associated with scientific publishing. For example, the AFTE Journal, a quarterly publication of the Association of Firearm and Toolmark Examiners, has published numerous articles on firearms identification. WorldCat—the largest online catalog of library materials, which includes the holdings of 72,000 libraries worldwide, including virtually every university-based library in the United States—lists only eighteen libraries with a copy of this journal in their holdings. Furthermore, the AFTE Journal does not appear to be indexed or included in any major indexing service anywhere. The only available index to AFTE was created by an individual firearms examiner on his own initiative and was not continued past 2005. Moreover, peer review of submissions to AFTE is not blind; the author and the reviewer are both aware of each other’s identity. In addition, the peer reviewers appear to come entirely from the editorial board, which consists entirely of AFTE members, and therefore includes no members from outside the toolmark and firearms practitioner community. This journal therefore appears to have extremely limited dissemination beyond the members of AFTE itself; completely lacks integration with any of the voluminous networks for the production and exchange of scientific research information; and engages in peer review that is neither blind nor draws upon an extensive network of researchers. None of this is compatible with an accessible, rigorous, transparent culture of research. 5/
This does not mean that no scientific literature exists on "the AFTE method" in which Detective LaCova was trained. Journals that meet the criteria listed by the National Commission have published articles on various aspects of the process for matching striations and the inferences that can be drawn from them. It is this literature that courts interested in "publications and peer review" should consult before they make up their minds.

Notes
  1. Gabbard v. Linn-Benton Housing Authority, 219 F.Supp.2d 1130, 1137 (D. Or. 2002).
  2. No. 11–CR–0303 (NGG), 2015 WL 739928 (E.D.N.Y. Feb. 20, 2015).
  3. See also, e.g., United States v. Otero, 849 F.Supp.2d 425, 433 (D.N.J. 2012) (“AFTE theory is subject to peer review through submission to and publication by the AFTE Journal of validation studies which test the theory.”); Commonwealth v. Pytou Heang, 942 N.E.2d 927, 939 n.20 (Mass. 2011) (“The Association of Firearm and Toolmark Examiners (AFTE) is an organization of firearm and toolmark examiners that publishes the peer-reviewed AFTE Journal.”).
  4. The journal's webpage, from which these quotations are taken, does not list the institutional affiliations of its editorial board members.
  5. Jennifer L. Mnookin et al., The Need for a Research Culture in the Forensic Sciences, 58 UCLA L. Rev. 725, 754-56 (2011) (notes omitted).

Wednesday, June 24, 2015

Frontline's Expose of DNA Testing: Yes and No

The content of a recent Frontline story on “The Surprisingly Imperfect Science of DNA Testing: How a Proven Tool May Be Anything But” will come as no surprise to anyone familiar with the professional and academic literature on forensic DNA identification. The work is important, though, because the main points of the story need to be widely understood. I highlight them below. At the same time, the story relies on some putative problems that are more perceived than real. 

The story boils down to the following claims (my annotations follow each one):

Complex DNA mixtures can be tough to interpret, especially when the amounts of DNA are so small that stochastic effects in amplifying DNA sequences are important. In these situations, analysts using a variety of cues and procedures--and even following the same general procedure--can reach different conclusions. False arrests and convictions can follow. True. See, e.g., Erin Murphy, The Art in the Science of DNA: A Layperson's Guide to the Subjectivity Inherent in Forensic DNA Typing, 58 Emory L.J. 489 (2008).

There can be a big difference between the answer to the question “What is the probability of a match between a specific DNA profile and the profile of an individual picked at random?” and “What is the probability of a match between a specific DNA profile and at least one profile in a large database of profiles picked at random?” Yes, different questions give different answers. The size of the database affects the answer to the second question, but not the first. But which question should be addressed in court? Probably neither. See, e.g., Ian Ayres & Barry Nalebuff, The Rule of Probabilities: A Practical Approach for Applying Bayes’ Rule to the Analysis of DNA Evidence, 67 Stan. L. Rev. 1447 (2015); David J. Balding, The DNA Database Search Controversy, 58 Biometrics 241 (2002); David H. Kaye, Rounding Up the Usual Suspects: A Legal and Logical Analysis of DNA Database Trawls, 87 N. Car. L. Rev. 425 (2009).
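The gap between the two questions can be sketched numerically. The figures below are purely hypothetical, chosen only to show how the size of the database enters the second calculation but not the first:

```python
# Hypothetical illustration -- not figures from any actual case or database.
def prob_at_least_one_match(p, n):
    """P(at least one match) when one profile is compared against n
    unrelated profiles, each with single-comparison match probability p."""
    return 1 - (1 - p) ** n

p = 1e-6          # hypothetical random-match probability for one comparison
n = 1_000_000     # hypothetical number of profiles in the database

single = prob_at_least_one_match(p, 1)   # first question: stays at p
trawl = prob_at_least_one_match(p, n)    # second question: grows with n

print(single)   # 1e-6
print(trawl)    # roughly 0.63 -- a coincidental hit is likely somewhere
```

The point is only that a "cold hit" probability computed for a single comparison cannot be carried over unchanged to a whole-database trawl; whether either number belongs in court is the separate question the cited articles debate.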

Thinking about how traces of DNA ended up where they did—and not just whose DNA ended up there—can be crucial to an investigation and prosecution. If investigators, lawyers, and jurors do not understand this, there can be mistaken arrests, prosecutions, and convictions. A very important point. See, e.g., Peter Gill, Misleading DNA Evidence (2014); David H. Kaye, David Bernstein & Jennifer Mnookin, The New Wigmore on Evidence: Expert Evidence (2d ed. 2011).

A few things will be surprising

The 2011 Hampikian-Dror “experiment” with a complex DNA mixture was good proof that examiner bias (from knowing what detectives believed) affected two examiners' interpretations. Not quite. From the viewpoint of experimental design, a potentially important confounding variable was obviously present. See D.H. Kaye, The Design of “The First Experimental Study Exploring DNA Interpretation”, 52 Science & Justice 256 (2012). As Dror later wrote, the study was merely “suggesting that the extraneous context of the criminal case may have influenced the interpretation of the DNA evidence” (emphasis added by Dror in a reply letter). This conclusion is reminiscent of a microscopic hair association reported as “suggesting that the [defendant] may have [left the hair at the crime scene].” How helpful is that?

"It’s not clear how often coincidental matches occur." Indeed, it might not be a rare event at all, considering that "a rogue Arizona state employee had run tests on the state’s database without the FBI’s permission and found" an inexplicably high number of partial matches. The employee was not a “rogue”; she did not need the FBI’s permission to use a state database this way; the results were presented on behalf of the state laboratory at an International Symposium on Human Identification. They are not especially anomalous, but largely a consequence of trawling for partial matches among all possible pairs of profiles in a database that includes close relatives. See, e.g., David H. Kaye, Trawling DNA Databases for Partial Matches: What Is the FBI Afraid Of?, 19 Cornell J. L. & Public Pol'y 145 (2009).
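The arithmetic behind this point is simple: trawling a database for partial matches among all possible pairs of profiles multiplies the opportunities for coincidence, because the number of pairs grows quadratically with the size of the database. The numbers below are hypothetical stand-ins, not the Arizona figures:

```python
# Hypothetical illustration -- database size and per-pair probability
# are invented for the sketch, not taken from the Arizona search.
from math import comb

n = 65_000                # hypothetical number of profiles in the database
pairs = comb(n, 2)        # every profile compared with every other profile
p_partial = 1e-8          # hypothetical per-pair partial-match probability

expected = pairs * p_partial

print(pairs)      # over 2 billion pairwise comparisons
print(expected)   # dozens of expected partial matches, by chance alone
```

With billions of pairwise comparisons, even a tiny per-pair probability yields many expected partial matches, and the presence of close relatives in offender databases pushes the count higher still.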



Monday, June 22, 2015

48 Hours for DNA

Today's New York Times reports that
DNA matching that of two escaped killers was found in a cabin in the remote resort of Mountain View, N.Y., 15 heavily wooded miles west of the state prison in Dannemora, an official briefed on the investigation said on Monday morning.

The forensic evidence indicated that the men had been there within the last 48 hours, according to the official, who was not authorized to discuss the search and spoke on the condition of anonymity.

A pair of prison-issued underwear was also found in the cabin, the official said.
Andy Newman & William K. Rashbaum, DNA of Escaped Convicts Found in Cabin, Official Says, N.Y. Times, June 22, 2015 

It would be interesting to know what "forensic evidence indicated that the men had been there within the last 48 hours." DNA itself carries no known signs of how long it has sat on some surface. If the DNA were in saliva on, say, an apple core left from a snack, would the extent of oxidation of the food allow it to be dated within a 48-hour period? Seventeen days have elapsed since the two men escaped.

Saturday, June 6, 2015

Maryland v. King and Fourth Amendment Doctrine

In Maryland v. King,1/ the Supreme Court upheld the practice of routine DNA sampling soon after arrest for certain crimes. The impact of the decision on the doctrinal framework for applying the Fourth Amendment to searches has been the subject of debate. The most extreme view is that the opinion presages the collapse of the doctrine that criminal investigatory searches are per se unreasonable unless they fall within some well-defined exception to the requirement of a warrant based on probable cause.2/ Another destabilizing view is that the case establishes “that any suspicionless search of an arrestee is allowed if it will be useful to solve crimes.” 3/ Other observers were less alarmed.4/

A case having nothing to do with DNA suggests that King is not the death knell of either the reasonable-suspicion requirement for a search incident to arrest or the warrant requirement. In Riley v. California,5/ Chief Justice Roberts wrote that
As the text [of the Amendment] makes clear, “the ultimate touchstone of the Fourth Amendment is reasonableness." Our cases have determined that "[w]here a search is undertaken by law enforcement officials to discover evidence of criminal wrongdoing, ... reasonableness generally requires the obtaining of a judicial warrant." Such a warrant ensures that the inferences to support a search are "drawn by a neutral and detached magistrate instead of being judged by the officer engaged in the often competitive enterprise of ferreting out crime." In the absence of a warrant, a search is reasonable only if it falls within a specific exception to the warrant requirement. 6/
Applying this framework, the Court unanimously held that police generally need a warrant to search a cellphone. The mere fact that the phone was acquired incident to an arrest is insufficient to justify rummaging through its contents. Thus, if Riley is any indication, Maryland v. King changed neither the basic framework of Fourth Amendment analysis nor the parameters of searches incident to arrest.

Notes
  1. 133 S. Ct. 1958 (2013).
  2. Erin Murphy, License, Registration, Cheek Swab: DNA Testing and the Divided Court, 127 Harv. L. Rev. 161 (2013).
  3. Tracey Maclin, Maryland v King: Terry v Ohio Redux, 2013 Supreme Court Review 359, 403.
  4. David H. Kaye, Why So Contrived? The Fourth Amendment and DNA Databases After Maryland v. King, 104 J. Crim. L. & Criminology 535 (2014); David H. Kaye, Maryland v. King Per Se Unreasonableness, the Golden Rule, and the Future of DNA Databases, 127 Harv. L. Rev. F. 39, 40, 42-43 (2013); Orin Kerr, A Few Thoughts on Maryland v. King, The Volokh Conspiracy, June 3, 2013 (“while King is very important from a practical standpoint, there isn’t a whole lot of academically-interesting stuff happening in the King opinions.”).
  5. 134 S. Ct. 2473 (2014).
  6. Id. at 248 (citations and internal quotation marks omitted).

Wednesday, June 3, 2015

Spitting in Syracuse: Another Disgusting DNA Case

Police have linked restaurant and grocery store employees to expectoration in food or drink. The latest case is described in Syracuse.com, which reports how a server at the Chili's Restaurant in Clay, NY, was caught after the act and convicted of disorderly conduct. Now the affected customers plan to sue him -- and the corporate owners of Chili's.

Earlier and even more unpleasant cases come from Seattle and Albuquerque.


Saturday, May 23, 2015

No Relief for Jeffrey MacDonald After FBI Declares It “Exceeded the Limits of Science” with Hair Analysis

It was not yet 3:30 a.m. on February 17, 1970, when tragedy struck Captain Jeffrey MacDonald’s family at 544 Castle Drive, Fort Bragg, North Carolina. His pregnant wife, Colette, “had both her arms broken and was stabbed repeatedly in the chest and neck with a paring knife and an ice pick” (Anthony 2013). Five-year-old Kimberley “was beaten across the head with a club and stabbed multiple times in the neck. Two-year-old Kristen was stabbed over 30 times in the back, chest and neck ... . MacDonald himself received relatively minor injuries except for a single stab wound that punctured his lung” (Ibid.)

MacDonald, who was a surgeon with the Green Berets, spoke of an attack “by four intruders — two white men, a black man and a white woman. He said the woman held a candle and chanted ‘Acid is groovy’ and ‘Kill the pigs’. On the headboard in the marital bedroom the word ‘PIG’ was written in blood.” (Ibid.) It was eerily similar to the depraved murders of Charles Manson’s followers in Los Angeles. “At Roman Polanski's home they killed the director's pregnant wife, Sharon Tate, and with her blood smeared the word ‘PIG’ on a wall.” (Ibid.) Indeed, Army investigators found an article on the Manson murders in the living room.

After an extended preliminary hearing culminated in a report exonerating Captain MacDonald, he left the Army with an honorable discharge and moved to California. But his father-in-law’s relentless pursuit led to the case being placed before a federal grand jury in 1974. An indictment came the next year. In 1979, he was convicted of the three murders in federal court. Appeals and post-conviction motions ensued. The case generated a “small library of books, a TV mini-series, countless documentaries and a forest of newsprint.” (Ibid.)

The latest opinion in this “wilderness of error” (to use the title of the most recent book on the case) is from a federal district court in North Carolina. The court issued this opinion last week, in the midst of an ongoing investigation into FBI reports and testimony about hair comparisons in thousands of cases before 2000. MacDonald’s case is now one of many in which the Department of Justice has confessed error in the presentations of its FBI laboratory personnel who compared hair samples from crime scenes to those of suspects.

Thus, last year, the Department advised MacDonald’s counsel that:
We have determined that the microscopic hair comparison analysis testimony or laboratory report presented in this case included statements that exceeded the limits of science and were, therefore invalid: (1) the examiner stated or implied that the evidentiary hair could be associated with a specific individual to the exclusion of all others—this type of testimony exceeded the limits of science; (2) the examiner assigned to the positive association a statistical weight or probability or provided a likelihood that the questioned hair originated from a particular source, or an opinion as to the likelihood or rareness of the positive association that could lead the jury to believe that valid statistical weight can be assigned to a microscopic hair association—this type of testimony exceeded the limits of science. (A copy of the documents upon which our determination is based is enclosed.) We take no position regarding the materiality of the error in this case.
According to the court, the FBI and the Innocence Project (IP) identified three errors in the lab reports or trial testimony. None of them prompted the court to change an earlier order denying MacDonald post-conviction relief.

In light of the perception of award-winning journalists that the FBI “faked an entire field of forensic science” (Lithwick 2015), that the Bureau placed “pseudoscience in the witness box” (ibid.), and that it performed “virtually worthless” analyses (Blakemore, 2015), it is worth looking carefully at the descriptions of the self-reported “invalid” science. Not having the FBI-IP report cited by the court at my disposal, I rely solely on the court’s description of it. If this description is accurate and if the report on MacDonald's case is representative, one may want to exercise some caution with respect to the surprising number of FBI reports that are said to exude "junk science" (Editorial 2015).

Hair analysis figured into the MacDonald case in an unusual way. It was not performed to associate MacDonald with the crime scene. He was lying in the house, wounded and apparently floating in and out of consciousness. Hairs in the house — especially ones on or around the bodies of the victims — were significant only because they might have come from the invading Manson-like killers. But visual and microscopic inspections of various hairs from the house did not seem to support MacDonald's extraordinary story. Instead, the features seen in the hairs were consistent with hairs sampled from the MacDonalds themselves.

1

A bedspread on the floor of the master bedroom of the MacDonald home contained a hair entangled with a purple cotton thread. An FBI lab technician mounted the hair on a slide marked “Q96 H (from thread).” Paul Stombaugh, who was in charge of the Chemistry Branch of the Chemistry and Physics Section of the FBI crime laboratory, examined the Q96 thread and hair, and wrote:
Light brown to blond head hairs that microscopically match the K1 head hairs of COLLETE MACDONALD were found in specimens ... Q96.... The Q96 hair was found entangled around a purple cotton sewing thread like that used in the construction of the Q12 pajama top [belonging to defendant]. Further, this hair had bloodlike deposits along its shaft.
The 2014 report found no errors in this 1974 laboratory report or in Stombaugh’s testimony at the 1979 trial that “this hair—in conducting a comparison examination with the comparison microscope—microscopically matched the head hairs of Colette MacDonald.”

On cross-examination, however, defense counsel suggested that it was peculiar that the thread and the hair “were still wrapped around together after four years of having been in the laboratory custody.” He asked, “Doesn't it make a difference to you to find out what treatment or handling a hair would have had before you examined it in the laboratory?” Stombaugh replied that “The hair was not mounted sir, as were many other ones in this submission. We opened the vials up and identified what was inside. If they were hairs, we would mount it on a slide and then they were compared.” The following exchange then occurred:
Q. Mr. Stombaugh, the question was: weren't you concerned with what might have been done to that hair that might possibly lead you to a wrong conclusion unless you found out what they had done with it?
A. Sir, the only conclusion on the hair examination that I was going to make was its origin.
Q. That is pretty serious about whose hair it is. That is a fundamental question you were being asked.
A. That is correct.
This last exchange is what, in the eyes of the Inspector General and the FBI and IP reviewers, moved Stombaugh’s testimony beyond the limits of science—he said he was examining the hair to reach a “conclusion” of some sort about “its origin” and that this was a “fundamental question.” But he never presented any definitive conclusion of identity. Neither did he try to quantify the probability of identity. To be sure, he did state that the hairs had matching colors and microscopic features. But the reviewers did not deem this conclusion improper or unacceptable. Somehow the conclusion became “invalid” because Stombaugh explained that he was not overly concerned with how the hair had come to be entangled with the thread. This event, he said, was not a problem for him to consider because his task was strictly limited to ascertaining whether there was a possible association between that hair and the sample of known hairs. Considering this testimony about “the origin” in context, it hardly seems like an egregious example of “pseudoscience” or the like.

2

The second instance of “invalid science” reported in 2014 was a 1999 laboratory report of Robert Fram, an examiner in the FBI Lab Hairs and Fiber Unit. At this point in the post-conviction proceedings, the district court had ordered the FBI to ship the hairs to the Armed Forces DNA Identification Laboratory for mitochondrial DNA testing. Fram documented the contents of the slides and sample being packed up and sent. During this process, he examined a glass microscope slide marked “19 1/2 L2082 Q96 PMS,” which contained four hairs. He observed that:
A forcibly removed Caucasian head hair found on one of the Q96 resubmitted glass microscope slides . . . exhibits the same microscopic characteristics as hairs in the K2 specimen. Accordingly, this hair is consistent with having originated from KIMBERLY MACDONALD, the identified source of the K2 specimen.
Fram also stated in the report that “[h]air comparisons are not a basis for personal identification.”

Again, condemning these observations as erroneous seems harsh. Although the phrase “consistent with” is far from ideal, no one seems to doubt that the hair truly was “consistent with” the little girl’s, and MacDonald did not contend that it originated from anyone else.

3

In response to MacDonald’s original 1990 Petition for Post Conviction Relief, FBI laboratory analyst Michael Malone studied one hair found near Colette MacDonald. Malone was to become notorious for giving false or dubious testimony in other cases (Earl 2014). In this phase of the MacDonald case in 1991, however, he simply wrote that:
This hair [Q79] was compared to the pubic hair sample of JEFFREY MACDONALD (specimen K22). This hair exhibits the same individual microscopic characteristics as the pubic hairs of JEFFREY MACDONALD, and accordingly is consistent with having originated from JEFFREY MACDONALD.
Like Fram, he added a qualification. But where Fram cautioned that “[h]air comparisons are not a basis for personal identification,” Malone noted that “hair comparisons do not constitute a basis for absolute personal identification.”

Despite the addition of the word “absolute,” on their face, these statements do not seem to “state[] or impl[y] that the evidentiary hair could be associated with a specific individual to the exclusion of all others,” and they do not “assign[] to the positive association a statistical weight or probability or provide[] a likelihood that the questioned hair originated from a particular source.” Finding matching physical features is consistent with the proposition that the hair was MacDonald's. At the same time, “hair comparisons do not constitute a basis for absolute personal identification” -- the match does not exclude everyone else in the world. Thus, Malone's statements do not seem to be scientifically invalid (at least with respect to the two criteria in the Inspector General's letter).

Rather, the legitimate concern is psychological -- without a literal statement that the observed similarities are also consistent with the possibility that the hair was not MacDonald's, the reader might give the match more weight than it logically deserves. This misconstruction of the report by a lay reader is certainly possible, and I would not want reports about hair matches to be written like Malone's and Fram's were. But this objection is different from dismissing the findings as invalid on the theory that the statements in the report logically imply that the only individual in the world who could have been the source of the hair was MacDonald. In reaching the latter conclusion, the Inspector General may have gone too far.

Microscopic hair comparison is only a rough indicator of identity. Many people could share the same characteristics. But this limitation does not make the field fraudulent. Many disease symptoms, for example, are overinclusive when used to make a diagnosis, but that fact does not render them invalid or worthless as diagnostic criteria.
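The diagnostic analogy can be made concrete with Bayes' rule: a rough, overinclusive indicator still shifts the odds of identity, just not decisively. Every number below is hypothetical, chosen only to illustrate the logic:

```python
# Hypothetical illustration -- none of these figures come from the
# MacDonald case or from any real hair-comparison study.
def posterior_odds(prior_odds, match_frequency):
    """Update the odds of identity after a reported match, treating the
    likelihood ratio as 1 / (population frequency of the matching features)."""
    return prior_odds * (1 / match_frequency)

prior = 1 / 1000   # hypothetical prior odds that the hair is the suspect's
freq = 0.05        # hypothetical: 1 in 20 people share the hair features

odds = posterior_odds(prior, freq)
posterior_prob = odds / (1 + odds)

print(odds)             # 0.02 -- the match raised the odds twenty-fold...
print(posterior_prob)   # ...yet the probability of identity remains about 2%
```

On these assumed numbers, the match is genuinely probative (a twenty-fold shift in the odds) while remaining far from proof of identity, which is exactly the status the text claims for overinclusive diagnostic criteria.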

Likewise, the consistency that Malone reported was not sufficient to establish to a near certainty that the hair was MacDonald’s rather than an intruder’s. In fact, the later mitochondrial DNA testing excluded MacDonald, his wife, and his children as the source of the hair. Consequently, Malone’s reported similarity could have been false (if Malone did not make accurate observations, or if he lied about what he observed). Or, perhaps the Q79 hair was physically similar to MacDonald’s, as Malone said, but it nevertheless originated from someone else. As Malone and Fram explicitly stated, physical similarity alone is probative but not definitive of identity.
