There He Goes Again: Mann Claims His Hockey Stick was Given “Clean Bill of Health”
Spinmeister Michael Mann is quoted in this article from the Telegraph yesterday as follows:
Prof Hand (Head of the UK Royal Statistical Society) praised the blogger Steve McIntyre of Climate Audit for uncovering the fact that inappropriate methods were used which could produce misleading results. “The Mann 1998 hockey stick paper used a particular technique that exaggerated the hockey stick effect,” he said.
Prof Mann, who is Professor of Earth System Science at the Pennsylvania State University, said the statistics used in his graph were correct. “I would note that our ’98 article was reviewed by the US National Academy of Sciences, the highest scientific authority in the United States, and given a clean bill of health,” he said. “In fact, the statistician on the panel, Peter Bloomfield, a member of the Royal Statistical Society, came to the opposite conclusion of Prof Hand.”
Mann has been repeating this arrogant, duplicitous spin continuously since Climategate and refuses to acknowledge any problems whatsoever with his infamous doomsday hockey stick graph. Mann always points to the subtly worded US National Academy of Sciences (NAS) report as his ally, because he knows that McIntyre & McKitrick, the Wegman Report, Hans von Storch, et al., and now the head of the Royal Statistical Society have minced no words in debunking his hockey stick. But what did the NAS report and its authors actually say about the Mann hockey stick? In fact, the NAS report validated all of the significant criticisms of McIntyre & McKitrick (M&M):
1. The NAS indicated that the hockey stick method systematically underestimated the uncertainties in the data (p. 107).
2. In subtle wording, the NAS agreed with the M&M assertion that the hockey stick had no statistical significance and was no more informative about the distant past than a table of random numbers. The NAS found that Mann’s methods had no validation (CE) skill significantly different from zero; previously, it had always been claimed that the method had a significant nonzero validation skill, and methods without validation skill are usually considered useless. Mann’s data set does not have enough information to verify its ‘skill’ at resolving the past, and has such wide uncertainty bounds as to be no better than the simple mean of the data (p. 91). M&M said that the appearance of significance was created by ignoring all but one type of test score, thereby failing to quantify all the relevant uncertainties. The NAS agreed (p. 110), but, again, did so in subtle wording.
3. M&M argued that the hockey stick relied for its shape on the inclusion of a small set of invalid proxy data (called bristlecone, or “strip-bark” records). If they are removed, the conclusion that the 20th century is unusually warm compared to the pre-1450 interval is reversed. Hence the conclusion of unique late 20th century warmth is not robust—in other words, it does not hold up under minor variations in data or methods. The NAS panel agreed, saying Mann’s results are “strongly dependent” on the strip-bark data (pp. 106-107), and they went further, warning that strip-bark data should not be used in this type of research (p. 50).
4. The NAS said “Mann et al. used a type of principal component analysis that tends to bias the shape of the reconstructions,” i.e., a method that can produce hockey sticks from baseball statistics, telephone-book numbers, and Monte Carlo random numbers.
5. The NAS said Mann downplayed the “uncertainties of the published reconstructions…Even less confidence can be placed in the original conclusions by Mann et al. (1999) that ‘the 1990s are likely the warmest decade, and 1998 the warmest year, in at least a millennium.’”
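Point 4 can be demonstrated numerically. The sketch below is only an illustration of the decentered-PCA critique, not Mann’s actual code or data: it feeds trendless AR(1) red noise (no climate signal at all) into principal component analysis twice, once with conventional full-period centering and once with “short” centering on the calibration-era mean alone, the step M&M identified. All parameter choices (series count, persistence, calibration length) are hypothetical.

```python
import numpy as np

def ar1_noise(n_years, n_series, phi, rng):
    """Trendless AR(1) 'proxy' series with no common signal."""
    x = np.zeros((n_years, n_series))
    e = rng.normal(size=(n_years, n_series))
    for t in range(1, n_years):
        x[t] = phi * x[t - 1] + e[t]
    return x

def pc1(M):
    """Leading principal component of the columns of M."""
    u, s, _ = np.linalg.svd(M, full_matrices=False)
    return u[:, 0] * s[0]

def hockey_index(p, calib):
    """Offset of the calibration-era mean from the shaft mean, in std units.
    A large value means a hockey-stick shape."""
    return abs(p[-calib:].mean() - p[:-calib].mean()) / p.std()

rng = np.random.default_rng(0)
n_years, n_series, calib, trials = 200, 50, 50, 20

short_idx, full_idx = [], []
for _ in range(trials):
    X = ar1_noise(n_years, n_series, phi=0.9, rng=rng)
    # "Short" centering: subtract only the calibration-period mean
    short_idx.append(hockey_index(pc1(X - X[-calib:].mean(axis=0)), calib))
    # Conventional centering: subtract the full-period mean
    full_idx.append(hockey_index(pc1(X - X.mean(axis=0)), calib))

print("short-centered index:", np.mean(short_idx))
print("full-centered index: ", np.mean(full_idx))
```

On typical runs the short-centered index comes out noticeably larger, illustrating how the decentering step preferentially weights series whose calibration-era mean happens to differ from their long-run mean, manufacturing a hockey-stick-shaped leading PC from pure noise.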
Mann never mentions that a subsequent report to the House Energy and Commerce Committee, authored by statistician Edward Wegman, totally destroyed the credibility of the ‘hockey stick’ and devastatingly ripped apart Mann’s methodology as ‘bad mathematics’. Furthermore, when Gerald North, the chairman of the NAS panel (which Mann claims ‘vindicated’ him), and panel member Peter Bloomfield, who Mann says above came to the opposite conclusion of Prof Hand, were asked at the House Committee hearings whether they agreed with Wegman’s harsh criticisms, they said they did:
CHAIRMAN BARTON. Dr. North, do you dispute the conclusions or the methodology of Dr. Wegman’s report?
DR. NORTH. No, we don’t. We don’t disagree with their criticism. In fact, pretty much the same thing is said in our report.
DR. BLOOMFIELD. Our committee reviewed the methodology used by Dr. Mann and his co-workers and we felt that some of the choices they made were inappropriate. We had much the same misgivings about his work that was documented at much greater length by Dr. Wegman.
DR. WALLACE (American Statistical Association). ‘The two reports were complementary, and to the extent that they overlapped, the conclusions were quite consistent.’
Thus, despite Mann’s incredible spin, Dr. Bloomfield did not “come to the opposite conclusion” of Prof Hand, nor of Dr. Wegman, Steve McIntyre, or Dr. McKitrick.