A Lawyer’s Examination of the IPCC “Evidence” for Man-made Global Warming

by Dennis Ambler

This is really very good, but politicians will not read it through: it is 79 pages long. It is a lawyer's examination of the IPCC evidence, conducted as if in a courtroom. He takes apart the IPCC conclusions and presents counter-evidence that undercuts their insistence that the evidence for AGW is incontrovertible.


An example of his approach:

The Ability of Climate Models to Explain Past Climate
The IPCC and the climate establishment have vastly oversold climate models by declaring that such models are able to quite accurately reproduce past climates, including most importantly the warming climate of the late twentieth century. Mainstream climate modelers have themselves explained that climate models disagree tremendously in their predicted climate sensitivity (the response of temperature to a CO2 increase) and are able to reproduce twentieth century climate only by assuming whatever (negative) aerosol forcing effect is necessary to get agreement with observations.

These kinds of explanations, by leading climate modelers, suggest that climate models do not in fact reflect understanding of the key physical climate processes well enough to generate projections of future climate that one could rely upon. It seems unlikely that climate model projections would be accorded much policy significance if the way in which they were able to “reproduce” past climate was generally understood. It seems more than plausible that policymakers (let alone the general public) take a model’s purported ability to reproduce past temperatures as an indication that the model’s assumption about climate sensitivity is correct.

If policymakers were told that this is not so, that ability to reproduce past temperatures indicates only that a particular pairing of assumptions about climate sensitivity and aerosol forcing allowed the reproduction of past temperatures, then the logical question would be: which model gets the correct pairing of sensitivity and aerosol forcing? In answer to this, climate modelers would have to say that they do not know, and the best that could be done would be to use all the models (this is called the ensemble approach). But of course it is possible that all the models were very badly wrong in what they assumed about sensitivity.
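The pairing argument above can be made concrete with some back-of-the-envelope arithmetic. In a simple linearized energy-balance picture (a sketch of the general point, not the author’s model, and using hypothetical round numbers rather than figures from the text), hindcast warming is sensitivity times net forcing; for any assumed sensitivity, one can solve for the aerosol forcing that makes the hindcast match observations:

```python
# Illustrative sketch with hypothetical numbers: dT = lam * (F_ghg + F_aer),
# where lam is climate sensitivity (degC per W/m^2), F_ghg the greenhouse-gas
# forcing, and F_aer the (negative) aerosol forcing.

OBSERVED_DT = 0.6   # degC, assumed twentieth-century warming (illustrative)
F_GHG = 2.6         # W/m^2, assumed greenhouse-gas forcing (illustrative)

def compensating_aerosol(lam):
    """Aerosol forcing needed for a model with sensitivity lam
    to reproduce the observed warming exactly."""
    return OBSERVED_DT / lam - F_GHG

for lam in (0.4, 0.8, 1.2):  # three hypothetical model sensitivities
    f_aer = compensating_aerosol(lam)
    hindcast = lam * (F_GHG + f_aer)
    print(f"lam={lam:.1f}: F_aer={f_aer:+.2f} W/m^2 -> hindcast dT={hindcast:.2f} degC")
```

Every pairing reproduces the same observed warming, so agreement with the past temperature record by itself cannot tell us which sensitivity is correct — which is precisely the point being made.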

A policymaker aware of this would then have to ask whether it would be better to base policy on climate models, or a more naïve climate forecasting method, and whether further public funding of efforts to improve climate models was worthwhile.

The Existence of Significant Alternative Explanations for Twentieth Century Warming
The IPCC and the climate establishment story expresses great certainty in arguing that late twentieth century global warming was caused by the atmospheric buildup of human ghg emissions (this is the anthropogenic global warming or AGW story).

The IPCC reports confidently assert that solar activity could not have accounted for warming during this period, because this was a period of weakening and not strengthening solar irradiance, and that there was no natural forcing during this period that could have accounted for the warming.

Yet a closer look at the literature shows that there is ongoing dispute about the possible role of the sun, with the debate coming down to conflicting views about the reliability of alternative datasets on solar activity. Perhaps even more importantly, a growing body of sophisticated theoretical work confirms that the nonlinear global climate is subject to inherent warm and cool cycles of about 20 to 30 years in duration, with substantial evidence that a warm cycle was likely to have begun in 1976.

The existence of alternative explanations for twentieth century warming obviously has enormous implications for policy, for in order to determine how much to spend to reduce human ghg emissions, one must first have some idea how harmful those emissions will be if they continue unabated.

Questionable Methodology Underlying Highly Publicized Projected Impacts of Global Warming
One of the most widely publicized numbers in the establishment climate story is the projection that 20-30 per cent of plant and animal species now existing may become extinct due to global warming. This number is also one of the most troubling, because it comes from a single study whose methodological validity has been severely questioned by a large number of biologists. These biologists agree that the methodology neglects many key processes that determine how the number of species will respond to changing climate, and will always lead to an overestimate of species loss due to climatic change.

Among the most surprising and yet standard practices is a tendency in establishment climate science to simply ignore published studies that develop and/or present evidence tending to disconfirm various predictions or assumptions of the establishment view that increases in CO2 explain virtually all recent climate change.
Perhaps even more troubling, when establishment climate scientists do respond to studies supporting alternative hypotheses to the CO2 primacy view, they more often than not rely upon completely different observational datasets which they say confirm (or at least don’t disconfirm) climate model predictions.

We should not be using public money to pay for faster and faster computers so that increasingly fine-grained climate models can be subjected to ever larger numbers of simulations until we have got the data to test whether the predictions of existing models are confirmed (or not disconfirmed) by the evidence.

Policy carrying potential costs in the trillions of dollars ought not to be based on stories and photos confirming faith in models, but rather on precise and replicable testing of the models’ predictions against solid observational data.