
Global Warming and Climate Change: what Australia knew and buried

2. Loading the dice: humans as planetary force

What we see happening with new record temperatures, both warm and cold, is in good agreement with what we predicted in the 1980s when I testified to Congress about the expected effect of global warming. I used coloured dice then to emphasize that global warming would cause the climate dice to be ‘loaded’—for risk of more extreme weather.

James Hansen, Director, Goddard Institute for Space Studies, interview with Bill McKibben, 22 December 2010
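Hansen’s ‘loaded dice’ is at heart a statistical claim: shift the mean of a temperature distribution only modestly and the frequency of extreme-heat seasons rises disproportionately, because the distribution’s tail moves past a fixed threshold. The sketch below illustrates the idea with a simple Gaussian model; the numbers (a 1-sigma warming, a 2-sigma ‘extreme’ threshold) are illustrative assumptions, not Hansen’s data.

```python
import random

random.seed(42)

def extreme_heat_fraction(mean_shift, n=100_000, threshold=2.0):
    """Fraction of simulated seasons whose temperature anomaly (in
    standard deviations) exceeds `threshold`, for a Gaussian climate
    whose mean has warmed by `mean_shift` standard deviations."""
    hits = sum(1 for _ in range(n) if random.gauss(mean_shift, 1.0) > threshold)
    return hits / n

# Unshifted climate: roughly 2% of seasons exceed the 2-sigma threshold.
baseline = extreme_heat_fraction(0.0)
# Mean warmed by one standard deviation: roughly 16% of seasons do.
warmed = extreme_heat_fraction(1.0)
print(f"baseline extremes: {baseline:.3f}, warmed extremes: {warmed:.3f}")
```

A one-sigma shift in the mean multiplies the frequency of 2-sigma extremes roughly sevenfold in this toy model, which is why small average warming produces a conspicuous change in record-breaking weather.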

The current rate at which CO2 is rising, 2 ppm per year, is unprecedented in the recent history of the Earth, with the exception of the onset of greenhouse atmospheric conditions following major volcanic episodes and asteroid and comet impacts, which led to the large mass extinctions [and ended planetary periods like the Jurassic and Cretaceous].

Andrew Glikson, The lungs of the Earth, 2009

The idea that the human species and its societies are a new ‘force of nature’ capable of altering planetary systems is a recent one that confronts long-held beliefs. That we are now in a new epoch called the Anthropocene is still resisted by some traditionally trained geologists and meteorologists, among others, and this has had implications for present-day sceptic debate.

How we got to this understanding takes us along the global warming/climate change science discovery path. Although there were earlier relevant discoveries, the path is generally described as starting with the 1890s hypothesis of Swedish scientist Svante Arrhenius that gases from burning fossil fuels could raise global temperatures. From then a complex, multidisciplinary research effort has led to the present-day understanding that human activities are changing the atmosphere–ocean–biosphere balance resulting in the warming greenhouse effect on the planet. Science historian Spencer Weart noted that at the turn of the 20th century, and for a long time thereafter, deeply embedded in human culture was the belief that either God or nature would take care of any human impacts:

Hardly anyone imagined that human actions, so puny amongst the vast natural powers, could upset the balance that governed the planet as a whole ... It was traditionally tied up with a religious faith in the God-given order of the universe ... Such was the public belief and scientists are members of the public, sharing most of the assumptions of their culture. (Weart 2004:8)

Not only was it considered unlikely that humans could affect earth systems, the rate of change indicated by climate models ran counter to long-held beliefs and principles—promoted particularly by geologists, who had a century earlier explained the phenomenon of coming and going ice ages to a disbelieving scientific community.

The discipline of geology held that changes to planetary systems and climate could only occur as they had before (and, indeed, there have been many previous hotter and colder periods), as read from geological evidence that researchers could measure on the ground. According to geologists, previous climatic changes occurred over thousands, if not millions, of years, a belief that prompted a basic scepticism about rapid change induced by human activity.

During the second half of the 20th century other scientists were looking at planetary systems in detail—including the capacity of the oceans to absorb carbon dioxide (CO2), thus delaying measurable on-land impacts for additional decades—and they were steadily learning about the connections between the world’s biomass and ecosystems. Examples are the Arctic tundra as a reservoir of methane that would be released with warming, or the weakening of the Atlantic Gulf Stream leading to paradoxical cooling in the north Atlantic.

Earth scientists of all stripes only gradually learned that climate could change rapidly in just the span of a hundred years, or even a decade, and not solely over thousands of years or geological periods as previously thought. This understanding came with the disturbing corollary that rapid climate change might manifest no differently in the first instance than natural variation.

Rapid climate change and detecting the human fingerprint

How fast can our planet’s climate change? Too slowly for humans to notice, was the firm belief of most scientists through much of the 20th century ... Today, there is evidence that severe change can take less than a decade. A committee of the (US) National Academy of Sciences (NAS) has called this reorientation in thinking of scientists a veritable ‘paradigm shift’ ... but this new thinking is little known and scarcely appreciated in the wider community of natural and social scientists and policymakers. (Weart 2004)

The same process of discovery was leading climate researchers to the conclusion that it was indeed human agency changing the atmosphere and the climate in this period of history. As early as the 1950s, US oceanographer Roger Revelle, who was studying the uptake of CO2 in the oceans and CO2 emission from industrial processes, came to a radical conclusion: ‘Human beings are carrying out a large-scale geophysical experiment of a kind that could not have happened in the past nor be reproduced in the future’ (Weart 2004: 30).

Neither Revelle nor other researchers then foresaw just how this experiment would ramp up as both industrialisation and population exploded during the next 50 years, accelerating the level of greenhouse gas emissions accumulating in the atmosphere as well as other significant and sometimes related impacts on earth systems.

Building on this background of 20th century research, ecologists and other scientists studying global change during the past decades under the International Geosphere-Biosphere Programme (IGBP) have been publishing the evidence for an ‘Anthropocene epoch’; that is, the beginning of a time span where humans are the main planetary force altering natural systems. ‘Anthropogenic climate change’ refers to this human agency.

The IGBP dissected the cumulative human impact on the previous balance within the natural systems of soil, air, water, forests and species. Australia became involved in the project in 1990. In 2008 IGBP alumnus Will Steffen, former director of The Australian National University Climate Change Institute and recent member of the now disbanded federal Climate Commission, gave a seminar that summarised why the IGBP scientists believed that they had identified a new but massive human footprint over all earth systems. These scientists proposed a reconceptualisation of history, which tracks the evolution of modern societies against natural system benchmarks—including CO2 in the atmosphere (Steffen, Crutzen & McNeill 2007; Costanza, Graumlich & Steffen 2007). The story they present in regard to the greenhouse effect goes as follows.

Stability for about 250,000 years

About 250,000 years ago, fully modern humans emerged in Africa. At that time, the concentration of CO2 in the atmosphere was low—somewhere below 200 ppm—compared with today’s 400 ppm. Atmospheric methane was similarly low. The concentration of both these gases rose for centuries at a time (but not above 240 ppm) and then fell for longer periods of time. This pattern steadied at 240 ppm from the beginning of agriculture, 5,000–7,000 years before the present, and through the great European civilisations of Greece and Rome.

Early human activities that may have contributed to relatively small elevated levels of CO2 included fire-stick farming and forest clearing. Evidence for these conclusions comes from Greenland ice cores.1 The dramatic increases in CO2 levels, however, started with the Industrial Revolution, as can be seen in this graph.


Proxy evidence from 250,000 BC to the 1800s indicates natural systems remained remarkably stable in greenhouse gas concentrations until the beginning of the Industrial Revolution.

Source: Steffen, Crutzen & McNeill 2006.

CO2 levels remained fairly constant at approximately 270 ppm or lower until the beginning of the Industrial Revolution. From around 1800 came fossil fuel energy, harnessed first by the steam engine and later the internal combustion engine, and what we call ‘progress’ made possible by modern science and technology. CO2 levels started climbing slowly. During the period called ‘Anthropocene stage 2’ (1945 to 2010 or 2020), however, CO2 levels rose rapidly and are still climbing (390 ppm at the end of 2009; 400 ppm in 2013).
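The figures quoted in this chapter are internally consistent: at roughly 2 ppm per year, the rate Glikson cites above, linear extrapolation from 390 ppm at the end of 2009 lands close to the 400 ppm observed in 2013. A quick check, with the constant growth rate as the only assumed input:

```python
def project_co2(start_ppm, start_year, end_year, rate_ppm_per_year=2.0):
    """Linear CO2 projection at a constant growth rate (ppm per year)."""
    return start_ppm + rate_ppm_per_year * (end_year - start_year)

# 390 ppm at the end of 2009, projected forward to 2013 at 2 ppm/year:
print(project_co2(390.0, 2009, 2013))  # 398.0 ppm, close to the observed 400
```

The small shortfall against the observed 400 ppm reflects the fact that the growth rate itself has been creeping upward, not constant.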

The IGBP researchers found a matching curve of impacts in a range of biophysical reactions—away from the relative stability of previous centuries in human history. Greenhouse gases, ozone depletion, full exploitation of fisheries, loss of forests, water shortages and species extinctions (among other benchmarks) zoomed upwards in just over 50 years.

The net result of maximised resource exploitation with industrialisation and population growth is described by the metaphor of a ‘global footprint’. The implication is that humans are depleting the natural capital of the planet at an unsustainable rate, measured by how many earths would be needed to keep up with our demands (more than one additional earth by 2001).

Along with humans depleting the natural physical environment has come another startling and depressing realisation: humans are now seen by many scientists as the force that will lead to the sixth great extinction of other species—akin to the impact of a comet strike, or a series of volcanic eruptions that blot out the sun for years, or a natural overdose of CO2 (which happened 252 million years ago at the end of the Permian) along with related impacts (also happening now), such as ocean acidification. The Permian climate change happened very rapidly in geologic terms.

Temperatures soared—the seas warmed by as much as 10 degrees—and the chemistry of the oceans went haywire, as if in an out-of-control aquarium. The water became acidified, and the amount of dissolved oxygen dropped so low that many organisms probably, in effect, suffocated. Reefs collapsed. (Kolbert 2014: 103)

Interestingly, the study of previous extinctions has also suggested that the Anthropocene might have started much earlier—with the extinction of the megafauna and resultant vegetation changes on various continents as humans took up habitation (Kolbert 2014).

In similar fashion, by burning fossil fuels and through other greenhouse gas-emitting activities, humans are now altering climatic patterns—with impacts we can witness—such as increased cyclone force and frequency, increased flooding, and the heatwaves of the past decades, all set to accelerate. So-called ‘one-in-a-hundred-year’ intense bushfires and extensive floods have followed in quick succession, and have devastated communities and regions. Steffen and his co-researchers postulate that, by 2050, heatwaves (and their flow-on effects) will be an everyday event.

Some theorists about Earth climate cycles see a paradox here. They argue that the rise in CO2 and other greenhouse gases, prompting higher planetary temperatures (global warming), currently shields human civilisation from the natural climate pattern of the past millions of years. The dominant pattern is ice ages broken by short interglacial periods. Agricultural civilisations emerged in a balmy interglacial that started about 20,000 years ago, and some postulate that the Earth is due for another ice age. The problem for this theory is that it offers no comfort on the ground: the bumpy ride predicted with climate change already in motion cancels out any modifying influence that might stave off ice ages, at least in any way that matters to current civilisations.

Of more immediate consideration, humans appear hardwired for short-term thinking. Psychological concepts of how we view the world around us, including ‘creeping normalcy’ or ‘landscape amnesia’, block day-to-day comprehension of what accelerating human activities represent—whether it is human population, the number of dammed rivers, forest destruction, or the impact of motor car emissions in a timespan that is geologically brief. Creeping normalcy refers to slow trends concealed in noisy fluctuations that people get used to without comment, while landscape amnesia describes forgetting how different the landscape looked 20–50 years ago (Diamond 2005: 425).

In his study of how societies fail, biogeographer Jared Diamond calls global warming a pre-eminent example of a ‘slow trend concealed by wide up and down fluctuations’ (2005: 425). He likens the denial of climate change impacts by leading politicians, including former US president George W. Bush (and his contemporary John Howard in Australia), in the late 1990s and early 2000s to the elite of ‘the medieval Greenlanders [who] had similar difficulties recognizing that their climate was gradually becoming colder, and the Maya and Anasazi (in Central and North America) [who] had trouble discerning that theirs was becoming drier’ (2005: 425).

The evidence that humans have become a geophysical force is compelling. But denying or not acknowledging this was a hallmark of the framing of climate change in Australia from at least the mid-1990s. Short-termism and, more importantly, a shift in dominant values shaped that response, as we will see in upcoming pages. But a look first at the dynamics of ‘what I say and what you hear’ helps set the outline of how communication works in practice and suggests how it can be manipulated.

1 Ice core data have been collected by the US National Oceanic and Atmospheric Administration (NOAA), among other expert worldwide agencies. Australian scientists pioneered extracting air from ice cores to ‘read’ the prior history of atmospheric concentrations of greenhouse gases. NOAA’s website states that data from polar and mountain glaciers and ice caps are archived, yielding ‘proxy’ climate indicators from the past in oxygen isotopes, methane concentrations, dust content and other parameters.
