Friday, January 04, 2008


Autopsy of a Fraud

Neil Munro and Carl Cannon undertook an autopsy of the highly suspect 2006 study published by the British medical journal The Lancet, purporting to estimate "excess" Iraqi deaths after the 2003 invasion at 654,965.

Munro and Cannon publish the results of their autopsy in National Journal. In brutal summary, based on their analysis, the state of the “science” behind the Study has only further decomposed since its publication. Yet somehow, this will remain the rotting corpse of the Iraq debate, the stench of which the mainstream media (MSM) won’t notice.

I criticized the study when it was first released in October 2006, highlighting my objections at MILBLOGS, as did my fellow MILBLOGGERS Soldier's Dad and Steve Schippert.

Munro and Cannon are painstaking in their dissection of the many flaws in the study, as well as the rather obvious circumstantial evidence that the Study was intended as an assault on US electoral politics:

Three weeks before the 2006 midterm elections gave Democrats control of Congress, a shocking study reported on the number of Iraqis who had died in the ongoing war. It bolstered criticism of President Bush and heightened the waves of dread -- here and around the world -- about the U.S. occupation of Iraq.

Published by The Lancet, a venerable British medical journal, the study [PDF] used previously accepted methods for calculating death rates to estimate the number of "excess" Iraqi deaths after the 2003 invasion at 426,369 to 793,663; the study said the most likely figure was near the middle of that range: 654,965. Almost 92 percent of the dead, the study asserted, were killed by bullets, bombs, or U.S. air strikes. This stunning toll was more than 10 times the number of deaths estimated by the Iraqi or U.S. governments, or by any human-rights group.

In December 2005, Bush had used a figure of 30,000 civilian deaths in Iraq. Iraq's health ministry calculated that, based on death certificates, 50,000 Iraqis had died in the war through June 2006. A cautiously compiled database of media reports by a London-based anti-war group called Iraq Body Count confirmed at least 45,000 war dead during the same time period. These were all horrific numbers -- but the death count in The Lancet's study differed by an order of magnitude.

Queried in the Rose Garden on October 11, the day the Lancet article came out, Bush dismissed it. "I don't consider it a credible report," he replied. The Pentagon and top British government officials also rejected the study's findings.

Such skepticism would not prove to be the rule.

That’s an understatement, at least as it applied to the MSM, and Munro and Cannon provide several examples.

MILBLOGGERS, those of us on the ground at the time or recently returned, knew the numbers bore no resemblance to reality for lots of obvious reasons. Here’s the gist of my initial reaction:

You don't have to be an "expert" in social scientific "method" to recognize crap when you see it.

Much like polls in general, anything based on anecdotal evidence is going to be hopelessly biased and potentially orders of magnitude from reality. Even if we take these researchers at their word that they "checked 92%" of death records, how did they ensure they didn't double count? Did they keep copies and reconcile the records to eliminate duplicates? In a tribal community, many "families" would claim the same family member as "one of their own."

I remember clearly when the earlier report came out from these researchers. Then, it was clear that any “insurgent” who managed to die away from the location of combat would almost surely be counted as a “civilian” casualty, as the Al Qaeda in Iraq and Baathist holdouts we were fighting at the time purposely hid their identities and wore no uniforms. Many injured and killed were showing up at Iraqi hospitals and morgues, mis-identified as “civilians.” Call it an early prototype of the same public relations and deception efforts that Hezbollah would later professionalize.

The same objection applies to this new report. Death certificates are no doubt completed in many cases without adequate or sufficient information to know whether those deaths were the result of combat or not, or whether the corpse was that of a combatant.

And more on the sample: how did they statistically ensure that their neighborhoods, streets, blocks, cities were a good sample? Methinks the answers sought dictated the scope for questioning.

Surveys often breed response bias: survey subjects steer their answers toward the subtle biases of the survey or the survey taker. They get social "credit" and approval for providing information.

One last point on methods. In Social Science in particular, statistical extrapolations are notoriously unreliable. Results need to be checked for validity (call it the sniff test).

As this post suggests, the idea that more than 700 (out of 770) deaths go unreported daily -- in an atmosphere where reporters and their sources are rewarded for high body counts -- is unbelievable on its face.
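The back-of-envelope arithmetic behind that smell test is worth spelling out. A minimal sketch, using round-number assumptions rather than figures from the study itself (the study's 654,965 midpoint, a roughly 1,200-day window from the invasion to the mid-2006 survey, and Iraq Body Count's roughly 45,000 media-confirmed deaths; the per-day figures shift with the window assumed):

```python
# Sanity check: what daily death rate does the Lancet midpoint imply,
# and what share of those deaths would have to go unreported?
# All inputs are round-number assumptions for illustration.

LANCET_MIDPOINT = 654_965   # study's "most likely" excess-death estimate
WINDOW_DAYS = 1_200         # ~March 2003 invasion to the mid-2006 survey
IBC_CONFIRMED = 45_000      # media-confirmed deaths over roughly the same span

implied_daily = LANCET_MIDPOINT / WINDOW_DAYS   # deaths/day the study implies
reported_daily = IBC_CONFIRMED / WINDOW_DAYS    # deaths/day actually reported
unreported_share = 1 - reported_daily / implied_daily

print(f"implied: {implied_daily:.0f}/day, "
      f"reported: {reported_daily:.0f}/day, "
      f"unreported share: {unreported_share:.0%}")
```

Note that the unreported share doesn't depend on the window at all; it is just 1 − 45,000/654,965, so over 90 percent of the deaths would have to be invisible to reporters no matter what dates one assumes.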

That these boobs from Johns Hopkins register zero incredulity at the magnitude of the disconnect between physical evidence and the anecdotal "data" they've gathered says far more about their own biases than about denial on the part of us skeptics.

Of course, what do I know? I’m one of those neocon, rabid warbloggers, unable and unwilling to see facts or reason. Munro and Cannon point out that even other anti-war observers dismiss the Lancet 2006 numbers as completely inaccurate, and they summarize some of the reasons offered why:

How to explain the enormous discrepancy between The Lancet's estimation of Iraqi war deaths and those from studies that used other methodologies? For starters, the authors of the Lancet study followed a model that ensured that even minor components of the data, when extrapolated over the whole population, would yield huge differences in the death toll. Skeptical commentators have highlighted questionable assumptions, implausible data, and ideological leanings among the authors, Gilbert Burnham, Riyadh Lafta, and Les Roberts.

Some critics go so far as to suggest that the field research on which the study is based may have been performed improperly -- or not at all. The key person involved in collecting the data -- Lafta, the researcher who assembled the survey teams, deployed them throughout Iraq, and assembled the results -- has refused to answer questions about his methods.

Some of these questions could be resolved if other researchers had access to the surveyors' original field reports and response forms. The authors have released files of collated survey results but not the original survey reports, citing security concerns and the fact that some information was not recorded or preserved in the first place. This was a legitimate problem, and it underscored the difficulty of conducting research in a war zone.

Over the past several months, National Journal has examined the 2006 Lancet article, and another [PDF] that some of the same authors published in 2004; probed the problems of estimating wartime mortality rates; and interviewed the authors and their critics. NJ has identified potential problems with the research that fall under three broad headings: 1) possible flaws in the design and execution of the study; 2) a lack of transparency in the data, which has raised suspicions of fraud; and 3) political preferences held by the authors and the funders, which include George Soros's Open Society Institute.
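Munro and Cannon's first point, that the model ensured even minor components of the data would yield huge differences when extrapolated, is easy to see in numbers. A minimal sketch, assuming the widely reported sample size of roughly 12,800 people scaled to a population of about 26 million (both figures are assumptions here, not taken from the study text):

```python
# Each death recorded in the sample gets multiplied by the ratio of the
# national population to the sample size when extrapolated nationwide.
# Population and sample figures are assumptions for illustration.

POPULATION = 26_000_000   # approximate Iraqi population at the time
SAMPLE_SIZE = 12_800      # approximate people covered by the survey

scale = POPULATION / SAMPLE_SIZE   # national deaths per sampled death

# A handful of miscounted, double-counted, or misattributed deaths in
# the sample therefore swings the national estimate by thousands:
for miscounted in (1, 5, 25):
    print(f"{miscounted} sample deaths -> "
          f"~{miscounted * scale:,.0f} in the national estimate")
```

Under these assumed figures each sampled death carries a weight of about two thousand in the national estimate, which is why the double-counting and misattribution objections above matter so much.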

Read the whole thing. A CSI-worthy dissection, with gruesome results. The Lancet, and Johns Hopkins, should be shamed into publicly renouncing the Study.

Via Glenn Reynolds, who links to more over at Pajamas Media and Bizzy Blog.
