The Iraqi Ministry of Planning and Development Cooperation, in conjunction with the U.N. Development Program, has released an extensive survey of conditions in Iraq, called the Iraq Living Conditions Survey 2004 (ILCS). The part of the survey dealing with war-related deaths appears to strongly undercut the Lancet Iraqi Mortality Survey (LIMS), of which I have been so critical.
The home page for the ILCS contains links to PDFs of the full report divided into three volumes: Volume I: Tabulation Report, Volume II: Analytical Report, and Volume III: Socio-economic Atlas. The Analytical Report is the main document.
I haven’t had time to thoroughly digest the new study but I do have some preliminary observations. Take them with a grain of salt. I’m not ready to declare myself vindicated yet.
(1) The two studies do not cover identical time frames. The LIMS was conducted in August of 2004, whereas the ILCS was conducted mostly from March 22 through May 25 of 2004. Significant fighting broke out between April and July of 2004, mostly associated with Sadr’s uprising, so the ILCS might have missed deaths that the LIMS measured (although, with the Falluja cluster excluded, most of the deaths measured by LIMS occurred before March 2004).
(2) The ILCS study doesn’t seek to measure the results of the war per se but rather to provide a good picture of overall conditions in Iraq. It seeks to measure the history of conditions during Saddam’s reign as well as conditions after the liberation. Unlike LIMS, it is intended to provide actually useful data to decision makers, so it concentrates on healthcare, housing, education, etc. Deaths from violence are not the main focus of the study and don’t appear to be broken out in detail anywhere.
(3) Like all U.N. studies, it relies wholly on the cooperation of the governing authority of the country in which it is conducted. This is why U.N.-sponsored studies on, say, Saddam-era Iraqi infant mortality or Cuban healthcare standards are so suspect. In the case of the ILCS, the governing authority was the Iraqi provisional government under the Coalition. If you don’t trust them, perhaps you shouldn’t trust the study.
(1) The ILCS uses a huge sample. Like the LIMS, it uses cluster sampling, but with a large number of small clusters. The LIMS sampled 990 households divided into 33 clusters of 30 houses each; it assigned the clusters by choosing specific points and then sampling the 30 households closest to each point. The ILCS sampled 22,000 households divided into 2,200 clusters of 10 houses each. Further, the 10 households were not physically contiguous, as with LIMS, but were randomly selected from all houses in a defined area. The ILCS sampling is both far larger and more randomized than that of the LIMS. And, unlike the LIMS, the ILCS sampled all the governorates.
For these reasons, the ILCS data will be significantly more solid than the LIMS data. LIMS looks like a high-school science project in comparison.
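The statistical advantage of many small clusters over a few large ones can be sketched with the standard design-effect formula, DEFF = 1 + (m - 1) * ICC, where m is the cluster size and ICC is the intra-cluster correlation. The ICC value below is an assumed illustrative figure, not one taken from either study:

```python
# Rough comparison of effective sample sizes for the two surveys.
# DEFF = 1 + (m - 1) * ICC inflates the variance of a clustered sample
# relative to a simple random sample of the same size.

def effective_sample_size(households: int, cluster_size: int, icc: float) -> float:
    """Sample size after deflating by the cluster design effect."""
    deff = 1 + (cluster_size - 1) * icc
    return households / deff

ICC = 0.05  # ASSUMPTION: illustrative intra-cluster correlation

lims_eff = effective_sample_size(990, 30, ICC)     # 33 clusters of 30 households
ilcs_eff = effective_sample_size(22_000, 10, ICC)  # 2,200 clusters of 10 households

print(f"LIMS effective sample size: {lims_eff:.0f}")
print(f"ILCS effective sample size: {ilcs_eff:.0f}")
print(f"ILCS advantage: {ilcs_eff / lims_eff:.0f}x")
```

Whatever ICC one plugs in, the smaller clusters and larger household count leave the ILCS with an effective sample dozens of times larger than the LIMS's.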
(2) As much as I hate to defend Roberts and the other authors of LIMS, I can’t say at present whether it is fair to report, as most news stories have, that the ILCS measured only 24,000 war-related deaths whereas the LIMS measured 100,000. It is not immediately clear what the ILCS considers a “war-related” death [Analytical Report, logical page 54], which is where the 24,000 number appears to come from. If the ILCS definition of “war-related” is significantly narrower than the range of causes that LIMS counted, then the LIMS might be off by only a factor of 2 or so. Just a reminder of how badly the media suck at reporting science.
LIMS looked at changes in mortality from all causes. The 100,000 figure (what the authors call a “conservative” estimate) includes deaths from illness and accident, crime, insurgents/terrorists and Coalition actions. The percentage breakdown in deaths (excluding the Falluja cluster) was roughly 66% from violent causes and 33% from non-violent causes. That means that, crudely, the Lancet study reports 66,000 deaths from violence. Most of those deaths (57%) resulted from non-Coalition actors, who were either insurgents or economic criminals. If the ILCS did not include crime in its “war-related” category, and we include deaths that occurred after the end of May 2004, then the ILCS deaths could bubble up to 30,000 or so. On the other hand, LIMS excluded military deaths from its estimate whereas the ILCS seems to include them, so perhaps it all cancels out.
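The crude arithmetic above can be laid out explicitly; every figure is a LIMS number as reported (Falluja cluster excluded):

```python
# Breakdown of the LIMS "conservative" 100,000 excess-death estimate,
# using the percentages reported by the survey (Falluja excluded).
total_excess = 100_000    # excess deaths, all causes
violent_share = 0.66      # share attributed to violent causes

violent_deaths = total_excess * violent_share   # deaths from violence
non_coalition = violent_deaths * 0.57           # insurgents / economic criminals
coalition = violent_deaths - non_coalition      # remainder: Coalition actions

print(f"violent deaths: {violent_deaths:.0f}")
print(f"non-Coalition: {non_coalition:.0f}, Coalition: {coalition:.0f}")
```

So even within the LIMS's own numbers, well under half of the violent deaths are attributed to Coalition actions.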
The ILCS authors do imply that their 24,000 figure is meant to be compared with the LIMS 100,000 figure, because they refer to the latter directly [Analytical Report, logical page 54]:
Another source (Roberts et al. 2004) estimates the number to be 98,000, with a confidence interval of 8,000 to 194,000. The website “Iraq Body Count” (http://www.iraqbodycount.net/) estimates that between 14,619 and 16,804 deaths have occurred between the beginning of 2003 and 7 December 2004 (IBC 2004).
Since the LIMS executive summary makes it clear that it measures mortality from all causes, I think it safe to assume that the ILCS considers “war-related” deaths to cover the same causes of death as LIMS. Moreover, the Iraqi ministries rejected the LIMS conclusions when they were originally published, stating that their own research showed nothing like the death rate reported by LIMS. Since the ILCS had already been conducted, they may have been relying on its then-unpublished data to reject the LIMS.
(Interesting note: LIMS was conducted, analyzed and published in under 90 days. The ILCS took over a year.)
In any case, I think it fair to say that the LIMS is high by at least a factor of 2. When you consider that the authors have repeatedly asserted that the 100,000-death figure was a “conservative” estimate and “may be much higher,” the LIMS really doesn’t look very good. Its minimum estimate is at least twice that of the much better ILCS.
Still, comparing the ILCS’s 24,000 war-related deaths to LIMS 100,000 “conservative” estimate might be inaccurate.
(3) The study demolished many of the claims made about infant mortality, both during the sanctions era and in the periods before and after the war.
There never was a huge spike in infant mortality in the 1996-1998 era, as many (including LIMS co-author Richard Garfield) had asserted. IIRC, Garfield’s paper in the Lancet claimed a mortality rate of 109/1000. As many suspected at the time, it appears likely that those studies were subverted by Saddam in an attempt to build support for the lifting of sanctions. In that sense, LIMS was following an established tradition.
There also doesn’t seem to have been any spike in infant mortality after the war. LIMS portrayed infant mortality as falling from a peak in 1998 until the invasion, then jumping upward afterward. The ILCS shows a steadily worsening infant mortality rate across the 1990s, then a decrease from 1998 that reversed in 2001, with the rate steadily increasing up to the time of the study [Analytical Report, Fig. 26, logical page 52]. The rate seems to have bottomed out in the upper 20s in early 2001 and then begun to rise steadily. There doesn’t seem to be any discontinuity associated with the war, and at no time did the mortality rate exceed 40/1000.
LIMS had reported 29/1000 in the year before the invasion and 57/1000 afterward. ILCS measured infant mortality at 32/1000 for the entire period 1999-2003. That means that quite a lot of the excess deaths that LIMS claimed to have measured either didn’t occur or didn’t have anything to do with the war.
I was really surprised (pleasantly) by the low rate. I assumed that LIMS underestimated the pre-invasion rate, but instead it appears to have overestimated the post-invasion rate. What threw me off was the assumption that the 1998 rate must have been above 60/1000 when it was actually around 40/1000; I knew the rate couldn’t have changed that quickly. (Infant mortality doesn’t change quickly because so many factors influence it. That is why it is often used as a proxy for the health of entire populations.)
(4) I can say with absolute confidence that all the numbers produced by including the Falluja cluster data are complete garbage. With the Falluja cluster included, LIMS shows an excess of 250,000 dead (nearly 1% of the total population of Iraq) from violence alone. If anything approaching that number of people had died, the huge ILCS sample would easily have uncovered it. If Roberts et al. were honest researchers, they would have tossed out all the data produced by that cluster. Instead they have relied on it for all their primary conclusions except for their “conservative” 100,000 total-death figure. That remains my primary criticism of the study.
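A back-of-envelope check shows why a sample the size of the ILCS's would almost certainly have detected deaths on that scale. The average household size used here is an assumed illustrative value, not a figure from the report:

```python
# If ~1% of Iraqis had died from violence, how many such deaths should
# a 22,000-household survey have recorded?
population = 25_000_000      # approximate population of Iraq
claimed_deaths = 250_000     # LIMS excess violent deaths, Falluja included
death_rate = claimed_deaths / population

households_sampled = 22_000  # ILCS sample
avg_household_size = 6.9     # ASSUMPTION: illustrative average household size

people_covered = households_sampled * avg_household_size
expected_in_sample = people_covered * death_rate

print(f"violent deaths expected in the ILCS sample: {expected_in_sample:.0f}")
```

On those assumptions, the ILCS interviewers should have run into well over a thousand violent deaths in their sample, far too many to miss by chance.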
I think the news stories to date may be giving the false impression that the LIMS is worse than it is. But given the dishonest way in which its authors have marketed the study, it would be karmic if the media came away convinced that the LIMS was wildly inaccurate (as it may actually be). It is clear that the LIMS “conservative” estimate is at best far too high, and any “non-conservative” estimates the authors might offer are clearly wrong. LIMS is at best a very weak study that many hailed as a very strong one.
For the Lancet, this new study appears to strongly contradict two controversial studies it has published on Iraq. I suppose it is too much to hope that this will prompt some sort of internal review for potential political bias.
I’ll go over the ILCS in more detail as my time permits.
UPDATE: Commenter AMac has posted another thoughtful analysis (written before the UN study came out) of the Lancet study at Winds of Change. [Update by JG.]