A Climate of Consilience (or the science of certitude)

There seems to be a lot of discussion about an apparently simple question:

Can science be ‘certain’ about, well, anything? 

If a lack of 100% certainty meant not doing anything – not building a bridge; not releasing a new drug; not taking off for the flight to New York; not flying a spacecraft to Saturn; not vaccinating the whole world against polio; not taking action to decarbonise our energy supply; and so on – then it would totally debilitate a modern society, frozen with doubt and so unable to act.

But of course, we do not stop implementing solutions based on our current best knowledge of nature and the world, however limited it might be. We make judgments. We assess risks. We weigh the evidence. We act.

I think scientists often fall into the trap of answering a quite different question:

Do we have a complete and all-encompassing theory of the world (or at least, ‘this’ bit of the world – say, how black holes work or how evolution played out)?

And everyone will rush defensively to the obvious answer, “no”. Why? Because we can always learn more, we can always improve, and indeed sometimes – although surprisingly rarely – we can make radical departures from received bodies of knowledge.

We are almost 100% certain of the ‘Second Law of Thermodynamics’ and Darwin’s ‘Evolution by Natural Selection’, but almost everything else is of a lower order.

But even when we do make radical departures, it doesn’t always mean a complete eradication of prior knowledge. It does when we move from superstition, religious dogma, witch-doctoring and magical theories of illness to the germ theory of disease and a modern understanding of biology, because people get cured, and ignorance is vanquished.

But take Newtonian mechanics. This remains valid for the not too small (quantum mechanical) and not too massive or fast (relativistic) domains of nature, and so remains a perfectly good approximation for understanding snooker balls, the motion of the solar system, and even the motion of fluids.

Want to build a bridge? Move over Schrodinger and Einstein, come on board Newton! Want to understand the interaction of molecules? Thank you Schrodinger. Want to predict gravitational waves? Thank you Einstein.

That is why the oft promulgated narrative of science – the periodic obliteration of old theories to be replaced by new ones – is often not quite how things work in practice.  Instead of a vision of a singular pyramid of knowledge that is torn down when someone of Einstein’s genius comes along and rips away its foundations, one instead sees new independent pyramids popping up in the desert of ignorance.

The old pyramids often remain, useful in their own limited ways. And when confronting a complex problem, such as climate change, we see a small army of pyramids working together to make sense of the world.

As one such ‘pyramid’, we have the long and tangled story of the ‘atom’ concept, a story that began with the ancient Greeks and has taken centuries to untangle. Building this pyramid – the one that represents our understanding of the atom – we follow many false trails as well as brilliant revelations. Dalton’s understanding of the uniqueness and differentiation of atoms was one such hard-fought revelation. There was the kinetic theory of gases that cemented the atomic/molecular role in the physical properties of matter: the microscopic behaviour giving rise to macroscopic properties such as temperature and pressure. Then there was the appreciation of the nuclear character and the electronic properties of atoms, leading ultimately to an appreciation of the fundamental reason for the structure of the periodic table, with a large dose of quantum theory thrown in. And then, with Chadwick’s discovery of the neutron, a resolution of the reason for isotopes’ very existence. Isotopes that, thanks to Urey’s brilliant insight, have found use in diverse paleoclimatological applications, bringing glaciologists, chemists and atmospheric physicists together to track the progress of our climate and its forcing agents.

We can trace a similar story of how we came to be able to model the dynamical nature of our weather and climate. The bringing together of the dynamics of fluids, their thermodynamics, and much more.

Each brick in these pyramids started as a question or conundrum, then led to decades of research, publications, debate and resolution – and yes, often to many new questions.

Science never was and never will be the narrative of ignorance overcome overnight by the heroic brilliance of some hard-pressed crank-cum-genius. Galileo was no crank, neither was Newton, nor was Einstein.

Even if our televisual thirst for instant gratification demands a science with instant answers, the reality is that the great majority of science is a long process of unfolding and developing the consequences of the fundamental principles, to see how these play out. Now, with the help of the computational facilities that are part of an extended laboratory (to add to the test tube, the spectrometer, X-ray diffraction, and so much more) we can see further and test ideas that were previously inaccessible to experimentation alone (this is true in all fields). Computers are the microscope of the 21st Century, as one molecular biologist has observed.

When we look at climate change we have a subject of undoubted complexity, one that combines many disciplines. Maybe for this reason, it was only in the late 1950s that these disparate disciplines recognised the need to come together: meteorology, glaciology, atmospheric chemistry, paleoclimatology, and much more. This convergence of disciplines ultimately led, some 30 years later, to the formation of the IPCC in 1988.

At its most basic, global warming is trivial, and beyond any doubt: add more energy to a system (by adding more infra-red-absorbing carbon dioxide to the atmosphere), and the system gets hotter (because, being knocked out of equilibrium, it will gain heat faster than it loses heat to space, until it reaches a new equilibrium). Anyone who has spent an evening getting a frying pan to the point where it is hot enough to fry a pancake (and many to follow) will appreciate the principle.
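For the more quantitatively minded, this ‘warm up until the losses catch up’ idea can be captured in a toy calculation. The sketch below (in Python) is purely illustrative – the forcing, feedback and heat-capacity numbers are made up for the example, and this is in no sense a climate model:

```python
# Toy zero-dimensional "energy balance": push a system out of equilibrium with
# a constant extra forcing and watch it warm until the outgoing losses catch up.
# All numbers are illustrative only - this is not a climate model.

def warm_to_new_equilibrium(forcing=1.0,       # extra energy input (arbitrary units)
                            feedback=1.2,      # extra loss per degree of warming
                            heat_capacity=8.0, # thermal inertia of the system
                            dt=0.1, years=100):
    """Integrate C * dT/dt = F - lambda * T and return the warming path."""
    warming = 0.0                 # warming relative to the old equilibrium
    path = []
    for _ in range(int(years / dt)):
        imbalance = forcing - feedback * warming   # still-unbalanced input
        warming += (imbalance / heat_capacity) * dt
        path.append(warming)
    return path

path = warm_to_new_equilibrium()
print(f"after 10 years:  {path[int(10 / 0.1) - 1]:.2f} degrees of warming")
print(f"after 100 years: {path[-1]:.2f} degrees (near the new equilibrium, F/lambda ~ 0.83)")
```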

Today, we have moved out of a pre-existing equilibrium and are warming fast, and have not yet reached a new equilibrium. That new equilibrium depends on how much more fossil fuels we burn. The choice now is between very serious and catastrophic.

The different threads of science that come together to create the ‘climate of consilience’ are diverse. They involve the theory of isotopes; the understanding of Earth’s meteorological system; the nature of radiation and how different gases interact with different types of radiation; the carbonate chemistry of the oceans; the dynamics of heat and moisture in the atmosphere, based on Newtonian mechanics applied to fluids; and so much more.

Each of these threads has a well established body of knowledge in its own right, confirmed through published evidence and through their multiple successful applications.

In climate science these threads converge, and hence the term consilience.

So when did we know ‘for certain’ that global warming was real and is now happening?

Was it when Tyndall discovered in 1859 that carbon dioxide strongly absorbed infra-red radiation, whereas oxygen and nitrogen molecules did not?  Did that prove that the world would warm dangerously in the future? No, but it did provide a key building block in our knowledge.

As did the findings of those that followed.

At each turn, there was always some doubt – something that suggested a ‘get out clause’, and scientists are by nature sceptical …

Surely the extra carbon dioxide added to the atmosphere by human activities would be absorbed by the vast oceans?

No, this was shown from the chemistry of the oceans to be wrong by the late 1950s, and thoroughly put to bed when sufficient time passed after 1958, when Charles Keeling started to accurately measure the concentration of carbon dioxide in the atmosphere. The ‘Keeling Curve’ rises inexorably.

Surely the carbon dioxide absorption of heat would become ‘saturated’ (unable to absorb any more heat) above a certain concentration?

No, this was raised in the early 20th Century but thoroughly refuted in the 1960s. Manabe & Wetherald’s paper in 1967 was the final nail in the coffin of denial for those that pushed against the ‘carbon dioxide’ theory.  To anyone literate in science, that argument was over in 1967.

But will the Earth system not respond in the way feared … won’t the extra heat be absorbed by the oceans?

Good news, bad news. Yes, 93% of the extra heat is indeed being absorbed by the oceans, but the remainder is more than enough to ensure that the glaciers are melting; the great ice sheets are losing ice mass (the losses winning out over any gains of ice); seasons are being affected; sea levels are rising inexorably; and overall the surface temperature is rising. No need for computer models to tell us what is happening: it is there in front of us, for anyone who cares to look.

Many pour scorn on consensus in science.

They say that one right genius is better than 100 fools, which is a fine argument, except when uttered by a fool.

Even the genius has to publish; fools never will, or can, but instead shout from the sidelines and claim genius. All cranks think they are geniuses, whereas the converse is not true.

Einstein published, and had to undergo scrutiny. When the science community finally decided that Einstein was right, they did so because the integrity of the theory and the weight of evidence were sufficient. It was not a show of hands immediately after he published; but in a sense, it was a show of hands after years of work to interrogate and test his assertions.

It was consilience followed by consensus (that’s science), not consensus followed by consilience (that’s political dogma).

We are as certain that the Earth has warmed due to increases in greenhouse gases – principally carbon dioxide, arising from human activities – as we are of the effects of smoking on human health, or the benefits of vaccination, and much more. And we are in part reinforced in this view because of the impact that is already occurring (observations, not only theory).

The areas of doubt are there – how fast will the West Antarctic Ice Sheet melt? – but these are doubts in the particulars, not in the general proposition. Over 150 years of accumulated knowledge have led to this consilience, which was, until recently, received wisdom amongst leaders of all political persuasions, as important and actionable knowledge.

The same is true of the multiple lines of enquiry that constitute the umbrella of disciplines we call ‘climate science’. Not a showing of hands, but a showing of published papers that have helped create this consilience of knowledge, and yes, a consensus of those skilled in their various arts.

It would be quicker to list the various areas of science that have not impacted on climate science than those that have.

In the two tables appended to the end of this essay, I have included:

Firstly, a timeline of selected discoveries and events over a long period – from 1600 to nearly the present – over which either climate itself or the underlying threads of science have been the topic. I have also included parallel events related to institutions, such as the formation of meteorological organisations, to show both scientific and social developments on the same timeline.

Secondly, I have listed seminal papers in the recent history of the science (from 1800 onwards), with no doubt omissions that I apologise for in advance (comments welcome).

When running workshops on climate fluency I use a 5-metre-long roll – a handwritten version of the timeline – walking along it to refer to dates, personalities, stories and, of course, key publications. It seems to go down very well (beats PowerPoint, for sure) …

[Photo: the handwritten timeline roll used in the workshops]

All this has led to our current, robust, climate of consilience.

There was no rush to judgment, and no ideological bias.

It is time for the commentariat – those who are paid well to exercise their free speech in the comment sections of the media, at the New York Times, BBC, Daily Mail, or wherever –  to study this history of the science, and basically, to understand why scientists are now as sure as they can be. And why they get frustrated with the spurious narrative of ‘the science is not yet in’.

If they attempted such arguments in relation to smoking, vaccination, germ theory or Newtonian mechanics,  they would be laughed out of court.

The science of global warming is at least as robust as any of these, but the science community is not laughing … it’s deeply concerned at the woeful blindness of much of the media.

The science is well beyond being ‘in’; it is now part of a textbook body of knowledge. The consilience is robust and hence the consequent 97% consensus.

It’s time to act.

And if you, dear commentator, feel unable to act, at least write what is accurate, and avoid high school logical fallacies, or bullshit arguments about the nature of science.

Richard Erskine, 2nd May 2017.

Amended on 17th July 2017 to include Tables as streamed Cloudup content (PDFs), due to inability of some readers to view the tables. Click on the arrow on bottom right of ‘frame’ to stream each document in turn, and there will then be an option to download the PDF file itself.

TABLE 1 – Timeline of Selected Discoveries and Events (since 1600)

 

TABLE 2 – Key Papers Related to Climate Science (since 1800)

 

END of DOCUMENT


BBC Science Reporting: Evidence, Values and Pollability

In his Harveian Oration to the Royal College of Physicians on 15th October 2015, Professor Sir Mark Walport made the following observation:

“My PhD supervisor, Sir Peter Lachmann, has framed the distinction between the subjective and the objective in a different way, by considering whether questions are ‘pollable’ or ‘non- pollable’; that is, whether a question can be answered in principle by a vote (a pollable question), or whether the question has a right answer that is independent of individual preferences and opinions (a non-pollable question). This distinction can be easily illustrated by a couple of examples. It is a non-pollable question as to whether there is an anthropogenic contribution to climate change. There is a correct answer to this question and your opinion or mine is ultimately irrelevant. The fact that there may be uncertainties about the scale and the nature of the contribution does not change the basic nature of the question. In contrast, it is a pollable question as to whether nuclear energy is an acceptable solution to providing low-carbon power, and I will return to this later.”

The question presents itself: does the BBC understand the distinction between pollable and non-pollable questions related to science?

BBC Radio 4’s Today programme on Tuesday 12th September included two discussions on the nature of science reporting and how it has changed over the years, particularly at the BBC.

The first was with Steve Jones, Emeritus Professor of Human Genetics at University College London, who led a review of the way the BBC itself reports science, about the changing nature of science reporting; the second was with the evolutionary biologist Richard Dawkins and David Willetts, a former science minister, considering the “public’s evolving relationship with science, evidence and truth”.

Subsequent to this I wrote a letter to the Today team at the BBC, which I am now sharing on my blog and reproduce below:

Dear Sir/ Madam

I wanted to thank the BBC Today team for two excellent discussions that John Humphreys had, first with Prof. Steve Jones, and then with David Willetts and Richard Dawkins.

John Humphreys posed the challenge to Prof. Jones, as to why we should ‘believe’ climate change; and I am paraphrasing his words:

A. The world is warming

B. This warming is man made, and

C. There is only one way of stopping it.

This was an alarming way to approach the topic, for two reasons.

Firstly, the science – and by virtue of that statement, scientists – unequivocally answer A and B with a resounding ‘Yes’.  There is an aggregation of scientific evidence and analysis going back at least to John Tyndall in the mid 19th Century that brought us – no later than the 1980s in fact – to a consilience of science on these questions. I discuss this history and the nature of ‘consilience’ in an essay, here: https://essaysconcerning.com/2017/05/02/a-climate-of-consilience-or-the-science-of-certitude/ 

To question this is at the same level as questioning whether cigarettes cause lung cancer. There is no debate to be had here.  Yes, debate on how to get teenagers  to stop taking up smoking, but that’s a different question.  To say that everyone can have an opinion, and to set up a controversial ‘debate’ on these questions is the “false balance” Professor Jones identified in the report he did for the BBC. Representing opinions is not a license to misrepresent the evidence, by using ‘false balance’ in this way.

Secondly, however, scientists do NOT speak with one voice on how to stop it, as John Humphreys phrased his C question. That is why the UNFCCC takes up the questions here, which require policy input, and yes, the input of ‘values’. Whilst the A and B questions are not questions where it is appropriate to bring values to bear on the answers, solutions are full of value-based inputs. So the C question that John Humphreys should be opening a dialogue on is this:

C(amended): There are many solutions that can contribute to addressing man-made global warming (which is a given) – either by mitigation or adaptation – which ones do you advocate, and why?

And of course many subsidiary questions arise when debating these solutions:

  • Are we too late to prevent dangerous climate change, therefore need a massive reduction in consumption – a degrowth strategy?
  • Can we solve this with a kind of Marshall Plan to decarbonise our energy supply, but also heat buildings and transport, through electrification?
  • What role does nuclear energy play?
  • Given the long time that excess carbon dioxide levels remain in the atmosphere, and the legacy of the developed world’s emissions, how can the developing world receive carbon justice?
  • Even if we decarbonised everything tomorrow, what solutions are feasible for reducing the raised levels of carbon dioxide in the atmosphere; what degree of sea-level rise are we prepared to tolerate, ‘baked in’ already to the Earth system?
  • Is a carbon tax ultimately the only way forward, and what price do we put on carbon?
  • … and so on.

Yes, science can help answer these kinds of questions, but the values play a large part too.  

The fact the BBC still gets stuck in the groove of ‘debating’ A and B, is I think woeful. As woeful as ‘debating’ if smoking causes cancer.

I think David Willetts acknowledged the difference in these classes of question, whereas Richard Dawkins was disappointingly black and white; not recognising the role of values in the C(amended) class of questions.

David Willetts made the interesting point that in social science, there is often greater difficulty in getting to the truth, and this is highly problematic for politicians, but that for the physical sciences, if we’ve discovered the Higgs Boson, it is much clearer.  He made a lot of the need to bring values to bear on decisions and ‘not being able to wait for yet another report’. However, there is a qualitative difference with climate change: it requires long term strategic thinking and it is a challenge to the normal, national political cycles.

On the question of Lord Lawson. By all means invite him to discuss the economics of decarbonising the economy. But the last time he was invited on – more or less to do this – he had a discussion with Justin Webb, who asked him to comment on Al Gore’s statement that we need to push ahead with the solutions that are already available to us. Move on, in other words.

Instead of answering this question, Lord Lawson tried to poke holes in unequivocal science (A and B) instead of addressing C; he has no intention of moving on. He lost, and seems quite bitter about it, as he went on to make personal attacks on Al Gore. While the interviewer cannot stop Lord Lawson from saying these things, he should be called out on them.

“I am not a scientist” is a statement that US Republican Congressmen use to avoid confronting the fact that A and B are true, and not up for debate. John Humphreys should not be using the same statement (but he did on this episode).

If climate change is “the big one”, as he himself noted, surely it is time he made the effort to educate himself to the point where he understands why A and B are unequivocally “Yes”, in the same way that “Does smoking cause lung cancer?” has an unequivocally “Yes” answer. There is no shortage of scientists at the Met Office, Cambridge, Oxford, UCL and elsewhere who I am sure would be happy to help him out here.

Today was a good discussion – even a great step forward – but the BBC is still failing in its public service duty, on this topic of global warming.

Kind regards,

Richard Erskine

What seems to be clear to me is that John Humphreys is not alone amongst journalists in failing to distinguish between non-pollable (where evidence accumulated over many years holds sway, and values have no place) and pollable questions (where values can have as big a part to play as the science).

It is about time they started.

o o O o o


The Zeitgeist of the Coder

When I go to see a film with my wife, we always stick around for the credits, and the list has got longer and longer over the years … Director, Producer, Cinematographer, Stuntman, Grips, Special Effects … and we’ve only just started. Five minutes later and we are still watching the credits! There is something admirable about this respect for the different contributions made to the end product. The degree of differentiation of competence in a film’s credits is something that few other projects can match.

Now imagine the film reel for a typical IT project … Project Manager, Business Analyst, Systems Architect, Coder, Tester and we’re almost done, get your coat. Here, there is the opposite extreme; a complete failure to identify, recognise and document the different competencies that surely must exist in something as complex as a software project. Why is this?

For many, the key role on this very short credits list is the ‘coder’. There is this zeitgeist of the coders – a modern day priesthood – that conflates their role with every other conceivable role that could or should exist on the roll of honour.

A good analogy for this would be the small-scale general builder. They imagine they can perform any skill: they can fit a waterproof membrane on a flat roof; they can repair the leadwork around the chimney; they can mend the lime mortar on that Cotswold stone property. Of course, each of these requires deep knowledge and experience of the materials, tools and methods needed to plan and execute the job right. A generalist will overestimate their abilities and underestimate the difficulties, and so they will always make mistakes.

The all-purpose ‘coder’ is no different, but has become the touchstone for our digital renaissance. ‘Coding’ is the skill that trumps all others in the minds of the commentariat.

Politicians, always keen to jump on the next bandwagon, have for some years now been falling over themselves to extol the virtues of coding as a skill that should be promoted in schools, in order to advance the economy.  Everyone talks about it, imagining it offers a kind of holy grail for growing the digital economy.  But can it be true? Is coding really the path to wealth and glory, for our children and our economy?

Forgetting for a moment that coding is just one of the skills required on a longer list of credits, why do we all need to become one?

Not everyone is an automotive engineer, even though cars are ubiquitous, so why would driving a car mean we all have to be able to design and build one? Surely only a few of us need that skill. In fact, whilst cars – in the days when we called them old bangers – did require a lot of roadside fixing, they are now so good we are discouraged from tinkering with them at all.  We the consumers have become de-skilled, while the cars have become super-skilled.

But apparently, every kid now needs to be able to code, because we all use Apps. Of course, it’s nonsense, for much the same reasons it is nonsense that all car drivers need to be automotive engineers. And as we decarbonise our economy Electric Vehicles will take over, placing many of the automotive skills in the dustbin. Battery engineers anyone?

So why is this even worth discussing in the context of the knowledge economy? We do need to understand if coding has any role in the management of our information and knowledge, and if not, what are the skills we require. We need to know how many engineers are required, and crucially, what type of engineers.

But let’s stick with ‘coding’ for a little while longer. I would like to take you back to the very birth of computing, to deconstruct the word ‘coding’ and place it in context. The word originates from the time when programming a computer meant knowing the very basic operations expressed as ‘machine code’ – move a byte to this memory location, add these two bytes, shift everything left by 2 bytes – which was completely indecipherable to the uninitiated. It also had a serious drawback in that a program would have to be re-written to run on another machine, with its own particular machine code. Since computers were evolving fast, and software needed to be migrated from old to new machines, this was clearly problematic.

Grace Hopper came up with the idea of a compiler in 1952, quite early in the development of computers. Programs would then be written in a machine-agnostic ‘high-level language’ (designed to be readable, almost like a natural language, but with a simple syntax to allow logic to be expressed … If (A = B) Then [do-this] Else [do-that]). A compiler on a machine would take a program written in a high-level language and ‘compile’ it into the machine code that could run on that machine. The same program could thereby run on all machines.
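As a small modern-day illustration of that translation step – using Python purely as a stand-in, since it compiles readable source into bytecode for a virtual machine rather than into native machine code – one readable statement becomes a run of machine-like instructions:

```python
# One readable, machine-agnostic statement...
import dis

def compare(a, b):
    if a == b:
        return "do-this"
    else:
        return "do-that"

# ...and the low-level, machine-code-like instructions it is translated into.
dis.dis(compare)   # prints LOAD_FAST, COMPARE_OP and friends
```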

In place of ‘coders’ writing programs in machine code, there were now ‘programmers’ doing this in high-level languages such as COBOL or FORTRAN (both of which were invented in the 1950s), and later ones as they evolved.

So why people still talk about ‘coders’ rather than ‘programmers’ is a mystery to me. Were it just an annoying misnomer, one could perhaps ignore it as an irritant, but it reveals a deeper and more serious misunderstanding.

Coding … I mean Programming … is not enough, in so many ways.  When the politician pictures a youthful ‘coder’ in their bedroom, they imagine the next billionaire creating an App that will revolutionize another area of our lives, like Amazon and Uber have done.

But it is by no means clear that programming as currently understood, is the right skill  for the knowledge economy.  As Gottfried Sehringer wrote in an article “Should we really try to teach everyone to code?” in WiRED, even within the narrow context of building Apps:

“In order to empower everyone to build apps, we need to focus on bringing greater abstraction and automation to the app development process. We need to remove code — and all its complexity — from the equation.”

In other words, just as Grace Hopper saw the need to move from Coding to Programming, we need to move from Programming to something else. Let’s call it Composing: a visually interactive way to construct Apps with minimal need to write lines of text to express logical operations. Of course, just as Hopper faced resistance from the Coders, who poured scorn on the dumbing down of their art, the same will happen with the Programmers, who will claim it cannot be done.

But the world of digital is much greater than the creation of ‘Apps’. The vast majority of the time spent doing IT in this world is in implementing pre-built commercial packages.  If one is implementing them as intended, then they are configured using quite simple configuration tools that aim to eliminate the need to do any programming at all. Ok, so someone in SAP or Oracle or elsewhere had to program the applications in the software package, but they are a relatively small population of technical staff when compared to the numbers who go out to implement these solutions in the field.

Of course it can all go wrong, and often does. I am thinking of a bank that was in trouble because their creaking old core banking system – written in COBOL decades ago by programmers in the bank – was no longer fit for purpose. Every time changes were made to financial legislation, such as tax, the system needed tweaking. But it was now a mess, and when one bug was fixed, another took its place.

So the company decided to implement an off-the-shelf package, which would do everything they needed, and more. The promise was the ability to become a really ‘agile’ bank. They would be able to introduce new products to market rapidly, in response to market needs or to new legislation. It would take just a few weeks, rather than the 6 months it was currently taking. All they needed to do was some configuration of the package so that it would work just as they needed it to.

The big bosses approved the big budget then left everyone to it. They kept on being told everything was going well, and so much wanted to believe this that they failed to ask the right questions of the team. Well, guess what: it was a complete disaster. After 18 months, with everything running over time and over budget, what emerged? The departmental managers had insisted on keeping all the functionality from their beloved but creaking old system; the big consultancy was being paid for man-hours of programming, so did not seem to mind that the off-shored programmers were having to stretch and bend the new package out of shape to make it look like the old system. And the internal project management was so weak, they were unable to call out the issues, even if they had fully understood them.

Instead of merely configuration, the implementation had large chunks of custom programming bolted onto the package, making it just as unstable and difficult to maintain as the old system. Worse still, it made it very difficult to upgrade the package – to install the latest version (to derive benefits from new features) – given the way it had been implemented. There was now a large support bill just to keep the new behemoth alive.

In a sense, nothing had changed.

Far from ‘coding’ being the great advance for our economy, it is often, as in this sorry tale, a great drag on it, because this is how many large system implementations fail.

Schools, colleges and universities train everyone to ‘code’, so what will they do when in the field? Like a man with a hammer, every problem looks like a nail, even when a precision milling machine is the right tool to use.

Shouldn’t the student be taught how to reframe their thinking, to use different skills that are appropriate to the task in hand? Today we have too many Coders and not enough Composers, and it seems everyone is to blame, because we are all seduced by this zeitgeist of the ‘coder’.

When we consider the actual skills needed to implement, say, a large, data-oriented software package – like that banking package – one finds that activities needed are, for example: Requirements Analysis, Data Modelling, Project Management, Testing, Training, and yes of course, Composing.  Programming should be restricted to those areas such as data interfaces to other systems, where it must be quarantined, so as not to undermine the upgradeability of the software package that has been deployed.

So what are the skills required to define and deploy information management solutions, which are document-oriented, aimed at capturing, preserving and reusing the knowledge within an organization?

Let the credits roll: Project Manager; Information Strategist; Business Analyst; Process Architect; Information Architect; Taxonomist; Meta-Data Manager; Records Manager; Archivist; Document Management Expert; Document Designer; Data Visualizer; Package Configurer; Website Composer; … and not a Coder, or even a Programmer, in sight.

The vision of everyone becoming coders is not only the wrong answer to the question; it’s also the wrong question. The diversity of backgrounds needed to build a knowledge economy is very great. It is a world beyond ‘coding’ which is richer and more interesting, open to those with backgrounds in software of course, but also in science and the humanities. We need linguists as much as we need engineers; philosophers as much as we need data analysts; lawyers as much as we need graphics artists.

To build a true ‘knowledge economy’ worthy of that name, we need to differentiate and explore a much richer range of competencies to address all the issues we will face than the way in which information professionals are narrowly defined today.

(C) Richard W. Erskine, 2017

——

Note:

In this essay I am referring to business and institutional applications of information management. Of course there will be areas such as scientific research or military systems which will always require heavy-duty, specialist software engineering; but this is another world when compared to the vast need in institutions for repeatable solutions to common problems, where other skills are argued to be much more relevant and important to success.


Elf ‘n Safety and The Grenfell Tower fire

The tragic fire at Grenfell Tower breaks one’s heart.

There was a question asked tonight on BBC’s Newsnight which amounted to:

How is it, in 21st Century UK, a rich and prosperous country despite everything, that a fire can engulf a tower block in the way it did last night?

This got me thinking.

People from the council, politicians and others talk of the need to ‘learn lessons’ in a way that makes one wonder if they really believe it.

Apparently, in the British Army they ban the use of such language. Because we all know what this means. Another report. Another expert ignored. Another tragedy, and another lesson unheard, and ignored. A lesson demonstrated through a change in behaviour, great, but some aspirational statement that one will change at some indeterminate time in the future? No thanks.

We know that tragedies like this are multi-causal, so no single cause can explain it. But that doesn’t mean it was unforeseen. In this case there are factors that have been raised:

  • cladding that is not fire-retardant, but rather designed to make a building more aesthetically pleasing, with scant regard for how it undermines the underlying fire-safety of the original building;
  • a lack of any alarm to warn the residents of fire;
  • a lack of sprinklers in rooms or hallways (whereas in hotels this is standard practice; why the difference?);
  • a failure to implement a report by a Select Committee of Parliament published following a previous tower-block fire;
  • a building with only one staircase for escape;
  • building standards that are evidently not fit for purpose and widely criticised (for some time) as providing a very low bar for compliance;
  • an arms length management organisation that refused to listen to the concerns of residents.

These and no doubt other factors compounded to either make the fire worse than it should have been, or the response to the fire by residents and rescue workers less effective than it could have been.

No doubt there will be questions about how it is that experts have known about the risks of the kind of cladding used, and have published papers on this, but their knowledge has fallen on deaf ears. No one in authority has had a smidgen of the intellectual curiosity or moral impulse needed to track it down using Google. We apparently need another report to rediscover stuff we already knew, which, who knows, maybe they will read this time.

No doubt there are questions to be asked of organisations like the British Standards Institution (BSI) that produce standards which, in this case, seem to fail to challenge the industry to reach the highest common factor for health and safety, but instead arrive at a lowest common denominator of a standard. They specify tests that are clearly not real-world tests. One is bound to ask if the BSI is fit for purpose, and whether its processes lead to an excessive chumminess with the industries it works with. It has a business model where it generates and sells standards and associated consultancy. Better not rock too many boats? No doubt the standards are “pragmatic”, in the business-speak synonym for barely adequate.

Christopher Miers, in the conclusion of a report entitled “Fire Risks From External Cladding Panels – A Perspective From The UK”, wrote:

“Can anything be done about the worldwide legacy of buildings with combustible cored composite panels?  Unless something radical is done, such as national retro-fitting subsidy schemes, it seems inevitable that there will be further fires involving aluminium-faced polyethylene core panels.  Nightmare scenarios include multiple-fatality building-engulfing fires as in China, or given the proximity of towers in some districts, the ignition of neighbouring buildings’ cladding from an external cladding fire, or disintegrated burning panels igniting the roofs of lower buildings adjacent.

It is difficult to envisage owners voluntarily stripping off entire existing aluminium composite panel facades and replacing them with Fire Code-compliant cladding panels, as the cost would be prohibitive.  Partial replacement with barrier bands of fire resistant panels has been suggested to stop fires spreading, [48] but given the flame heights at the Tamweel, Torch and The Address, such barrier bands would have to be substantially large.  The works necessary to provide these barriers would involve much of the scaffolding and associated costs of full replacement.

It seems inevitable that insurers will differentiate between buildings with and without combustible aluminium composite panels and will charge higher premiums for higher risks.  One or two more fires, or a fatal fire, could lead to insurance cover being refused if the risk is considered excessive.  Insurance issues, bad publicity and loss of property value might then make retro-fitting external cladding a viable option in commercial, as well as fire safety terms.”

But despite all these unlearned lessons, there is something far more insidious at work here.

The sneering right-wing commentators like Richard Littlejohn of the Daily Fail have waged a campaign for many years against what they claim is an overweening attempt by the liberal elite to protect us from ourselves, which goes under the catchy title of “elf ’n safety” (snigger, snigger, sneer). Imagine …

Poor Johnny can’t even go diving off some rocks without someone doing a bloody risk assessment, then someone else has to hold a flag. 

Stuff and nonsense – in my day we used to ski down black runs blindfolded. Never did us any harm.

You get the picture.

I remember once doing a study for the HSE (Health & Safety Executive) back in the 90s, and some of the horror stories of what used to happen in industries like farming and chemicals would make your hair stand on end.

And of course deaths and injury in these and other industries have fallen dramatically in the last few decades, thanks to organisations like the HSE. Far from hurting productivity, it has helped it, by enhancing efficiency and professionalism. In some industries it even drives innovation, as with the noise regulations for aircraft.

And even in the more parochial area of school trips, there was plenty of evidence that just a little bit of prior planning might well prevent poor performance (and injury).

But no, to Richard Littlejohn and his ilk, the “world has gone mad”.

Too often the bureaucrats seem to have bought into – maybe unconsciously – this background noise of derision towards health and safety. They feel inclined to dismiss the concerns raised by experts, or to ride roughshod over citizens’ concerns.

What do they know? Business must go on.

And once again we have the chumminess effect: councillors too close to developers, and lacking the critical faculties to ask searching questions, or even obvious ones.

For example, one might have imagined a councillor asking the questions …

“This cladding we plan use… is it anything like that used on that tower block that went up in flames in Dubai? Have we assessed the risks? Can we assure the tenants we have investigated this, and its OK?”.  

There is good box-ticking (in the cock-pit of an aeroplane) and the bad kind. The good kind is used by engineers, pilots, surgeons, school-teachers and others who are skilled in their respective arts.

The bad kind is used by bureaucrats wanting to cover their arses. We heard some of this last night on Newsnight: “we got the design signed off”, “we followed the standards”, etc.

Where is the imagination, the critical thinking, the challenging of lazy assumptions?

And most importantly, where is the answering of tenants’ questions and concerns, and taking health and safety seriously as the number one requirement, not as an afterthought?

But risk assessment planning and execution is incessantly mocked by the sneering, curled lip brigade who inhabit the Daily Mail, Daily Telegraph and other right wing denigrators of “elf ’n safety”.

This has created a culture of jocular disregard for safety.

Try this. Go to a cafe with a few friends and ask “shall we have a chat about health and safety?”. I bet you that they will – whatever their political views – either laugh or roll their eyes.

Well, maybe not any more. Maybe they will feel suitably chastised, for a while at least, and stop their lazy sneering.

The champion sneerers have been successful through their drip, drip of cherry-picked stories or outright myths; their project has had an insidious effect, and has done its worst in undermining respect for health and safety.

But you see, it is not really health and safety that they have in their sights.  It’s just the easy to mock first hurdle in a wider programme.

There is a bigger prize: regulation!

What the de-regulators like Daniel Hannan want from Brexit is a bonfire of regulations, as he wrote about in his 2015 ‘vision’.

David Davis, the Secretary of State for Exiting the European Union, claims not to know the difference between a ‘soft’ Brexit and a hard one.

Well, here’s a guide, David.

A hard Brexit is one where we have a bonfire of regulations; where we have no truck with experts who advise us on the risks of polyethylene-cored cladding or excess carbon dioxide in our atmosphere; where ‘risk assessment’ is a joke we have down the club; where the little people enjoy the fruits of ‘trickle-down’ economics in a thriving Britain, free of (allegedly) overweening regulation.

But the British have made it clear they do not want a hard Brexit.

I hope and trust that the time is over for the sneering, arrogant advocates for de-regulation, and their puerile and dangerous disregard for people’s health, and their safety.

Whether in bringing forth and implementing effective measures to prevent another terrible fire like at Grenfell Tower, or in all the other areas of life and work in the UK that are important for a safe and secure future, the time to take experts and regulations seriously is needed now, more than ever.

 

Richard W. Erskine, 15th June 2017.


Lest we regret: science not silence

Cherish not only those who you love, but that which you love. Yesterday I went with my wife on the March for Science in Bristol, the city where we fell in love many years ago. We were on one of over 600 marches globally, to express a love for the science that has brought us so much, and promises so much more.

We do not want in the future to find ourselves mournfully recalling the words of some great poet, words of regret at our careless disregard, our taking for granted –

“When to the session of sweet silent thought,
I summon up remembrance of things past,
I sigh the lack of many a thing I sought,
And with old woes new wail my dear time’s waste….” 

(Shakespeare, Sonnet 30)

Humanity needs more experts now than ever before, but it needs poets and novelists too, to find that voice that will reach the hearts of those who will be hurt by the cynical disregard for truth, for evidence.

This is no longer the preserve of cranks, but now influences men (and it is mostly men) in power who attack the science of evolution, vaccination and climate change – science that has saved the lives of billions and promises to save the lives of billions more in the future. Notwithstanding the more prosaic inability to live without the fruits of science (try having a no-science Friday).

That is why the over 600 cities that Marched for Science yesterday spoke with a true voice. Science is for everyone and we all benefit from its fruits, but just as few really know where their food comes from, we have become blind to the processes and creativity of the scientists who will bring us the next wonders, and the next solutions to the challenges we face. We the people, and scientists, must both now pledge to remedy our careless assumption that the Enlightenment will prevail against the tide of ignorance that has reached the pinnacle of power, without strong and systemic defences.

We ignore these threats at our peril.

Let’s not regret being so careless that we allowed an opinionated, ideologically motivated few to use their positions of power to drown out the voices of reason.

Let us, most of all, not waste our dear, precious time.

. . .. o o O o o .. . .

 

Richard W. Erskine, essaysconcerning.com, 23rd April 2017

– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –

The speakers at the Bristol event were Professor Bruce Hood from the Bristol University’s School of Experimental Psychology; TV naturalist Chris Packham; science writer and scientist Dr Simon Singh; At-Bristol’s creative director Anna Starkey; and, scientist and writer Dr Suzi Gage.

Youtube videos of their speeches available here >

https://www.youtube.com/playlist?list=PLz3n5TyzhVlR88vhkd8guOjH8F53kizSt 


Beyond Average: Why we should worry about a 1 degree C rise in average global temperature

When I go to the Netherlands I feel small next to men from that country, but then I am 3 inches smaller than the average Brit, and the average Dutchman is 2 inches taller than the average Brit. So I am seeing 5 inches of height difference in the crowd around me when surrounded by Dutch men. No wonder I am feeling an effect that is much greater than what the average difference in height seems to be telling me on paper.

Averages are important. They help us determine if there is a real effect overall. Yes, men from the Netherlands are taller than men from Britain, and so my impressions are not merely anecdotal. They are real, and backed up by data.

If we want to know whether changes are occurring, averages help too, as they ensure we are not focusing on outliers but on a statistically significant trend. That’s not to say that it is always easy to handle the data correctly or to separate different factors, but once this hard work is done, the science and statistics together can lead us to knowing important things, with confidence.

For example, we know that smoking causes lung cancer and that adding carbon dioxide into the atmosphere leads to increased global warming.

But, you might say, correlation doesn’t prove causation! Stated boldly like that, no it doesn’t. Work is required to establish the link.

Interestingly, we have known the fundamental physics of why carbon dioxide (CO2) is a causative agent for warming our atmosphere – not merely correlated with it – since as early as Tyndall’s experiments, which he started in 1859, and certainly no later than 1967, when Manabe & Wetherald’s seminal paper resolved some residual physics questions related to possible saturation of the infra-red absorption in the atmosphere and the co-related effect of water vapour. That’s almost 110 years of probing, questioning and checking. Not exactly a tendency on the part of scientists to rush to judgment! And in terms of the correlation being actually observed in our atmosphere, it was Guy Callendar in 1938 who first published a paper showing rising surface temperature linked to rising levels of CO2.

Whereas, in the case of lung cancer and cigarettes correlation came first, not fundamental science. It required innovations in statistical methods to prove that it was not merely correlation but was indeed causation, even while the fundamental biological mechanisms were barely understood.

In any case, the science and statistics are always mutually supportive.

Average Global Warming

In the discussions on global warming, I have been struck, over the few years that I have been engaging with the subject, by how much air time is given to the rise in atmospheric temperature averaged over the whole of the Earth’s surface, or GMST as the experts call it (Global Mean Surface Temperature). While it is a crucial measure, this can seem a very arcane discussion to the person in the street.

So far, it has risen by about 1 degree Centigrade (1°C) compared to the middle of the 19th Century.

There are regular twitter storms and blogs ‘debating’ a specific year, and last year’s El Nino caused a huge debate as to what this meant. As it turns out, the majority of recent warming is due to man-made global warming, and this turbo-charged the also strong El Nino event.

Anyone daring to take a look at the blogosphere or twitter will find climate scientists arguing with opinion formers ill equipped to ‘debate’ the science of climate change, or indeed, the science of anything.

What is the person in the street supposed to make of it? They probably think “this is not helping me – it is not answering the questions puzzling me – I can do without the agro thanks very much”.

To be fair, many scientists do spend a lot of time on outreach and in other kinds of science communications, and that is to be applauded. A personal favourite of mine is Katharine Hayhoe, who always brings an openness and sense of humility to her frequent science communications and discussions, but you sense also, a determined and focused strategy to back it up.

However, I often feel that the science ‘debate’ generally gets sucked into overly technical details, while basic, or one might say, simple questions remain unexplored, or perhaps assumed to be so obvious they don’t warrant discussion.

The poor person in the street might like to ask (but dare not for fear of being mocked or being overwhelmed with data), simply:

“Why should we worry about an average rise of 1°C in temperature? It doesn’t seem that much, and with all the ups and downs in the temperature curve; the El Nino; the alleged pause; the 93% of extra heat going into the ocean I heard about … well, how can I really be sure that the surface of the Earth is getting warmer?”

There is a lot to unpick here and I think the whole question of ‘averages’ is part of the key to approaching why we should worry.

Unequivocally Warming World

Climate Scientists will often show graphs which include the observed and predicted annual temperature (GMST) over a period of 100 years or more.

Now, I ask, why do they do that?

Surely we have been told that in order to discern a climate change trend, it is crucial to look at the temperature averaged over a period of at least 10 years, and better still to look at a 30-year average?

In this way we smooth out all the ups and downs that are a result of the energy exchanges that occur between the moving parts of the earth system, and the events such as volcanic eruptions or humans pumping less sulphur into the atmosphere from industry. We are interested in the overall trend, so we can see the climate change signal amongst the ‘noise’.
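As an aside, the smoothing itself is simple to do. Here is a minimal sketch in Python; the input series is synthetic, standing in purely for a real annual GMST anomaly dataset such as the GISS or Met Office series:

```python
# 30-year centred rolling mean of annual temperature anomalies.
# The input series below is synthetic (a slow trend plus year-to-year noise),
# standing in for a real GMST anomaly dataset.
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=1)
years = np.arange(1880, 2021)
annual = pd.Series(0.008 * (years - 1880) + rng.normal(0.0, 0.1, years.size),
                   index=years)

smoothed = annual.rolling(window=30, center=True).mean()
print(smoothed.dropna().tail())   # the slow trend, with the annual noise averaged away
```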

We also emphasise to people – for example, “the Senator with a snowball” – that climate change is about averages and trends, as distinct from weather (which is about the here and now).

So this is why the curve I use – when asked “What is the evidence that the world is warming?” – is a 30-year smoothed curve (red line) such as the one shown below (which used the GISS tool):

[Figure: 30-year rolling average of GMST]

[also see the Met Office explainer on global surface temperature]

The red line shows inexorable warming from early in the 20th Century, no ifs, no buts.

End of argument.

When I challenged a climate scientist on Twitter, why don’t we just show this graph and not get pulled into silly arguments with a Daily Mail journalist or whoever, I was told that annual changes are interesting and need to be understood.

Well sure, for climate scientists everything is interesting! They should absolutely try to answer the detailed questions, such as the contribution global warming made to the 2016 GMST. But to conflate that with the simpler and broader question does rather obscure the fundamental message for the curious but confused public who have not even reached base camp.

They may well conclude there is a ‘debate’ about global warming when there is none to be had.

There is debate amongst scientists about many things: regional impact and attribution; different feedback mechanisms and when they might kick in; models of the Antarctic ice sheet; etc. But not about rising GMST, because that is settled science, and given Tyndall et al, it would be incredible if it were not so; Nobel Prize winning incredible!

If one needs a double knock-out, then how about a triple or quadruple knock-out?

When we add the graphs showing sea level rise, loss of glaciers, mass loss from Greenland and Antarctica, and upper ocean temperature, we have multiple trend lines all pointing in one direction: A warming world. It ain’t rocket science.

We know the world has warmed – it is unequivocal.

Now if the proverbial drunk, duly floored, still decides to get up and wants to rerun the fight, maybe we should choose not to play his games!?

So why do arguments about annual variability get so frequently aired on the blogosphere and twitter?

I don’t know, but I feel it is a massive own goal for science communication.

Surely the choice of audience needs to be the poor dazed and confused ‘person in the street’, not the obdurately ignorant opinion columnists (opinion being the operative word).

Why worry about a 1°C rise?

I want to address the question “Why worry about a 1°C rise (in global mean surface temperature)?”, and do so with the help of a dialogue. It is not a transcript, but it is along the lines of conversations I have had in the last year. In this dialogue, I am the ClimateCoach and I am in conversation with a Neighbour who is curious about climate change, but admits to being rather overwhelmed by it; they have got as far as reading the material above and accept that the world is warming.

Neighbour: Ok, so the world is warming, but I still don’t get why we should worry about a measly 1°C warming?

ClimateCoach: That’s an average, over the whole world, and there are big variations hidden in there. Firstly, two thirds of the surface of the planet is ocean, and so over land we are already talking about a rise in the mean surface temperature in excess of 1°C – about 1.5°C. That’s the first unwelcome news, the first kicker.

Neighbour: So, even if it is 5°C somewhere, I still don’t get it. Living in England I’d quite like a few more Mediterranean summers!

ClimateCoach: Ok, so let’s break this down (and I may just need to use some pictures).  Firstly we have an increase in the mean, globally. But due to meteorological patterns there will be variations in temperature and also changes in precipitation patterns around the world, such as droughts in California and increased Monsoon rain in India. This  regionality of the warming is the second kicker.

Here is an illustration of how the temperature increase looks regionally across the world.

[Figure: GISTEMP map of regional temperature change across the world]

Neighbour: Isn’t more rain good for Indian farmers?

ClimateCoach: Well, that depends on timing. The monsoon has started arriving late, and if the rains don’t arrive in time for certain crops, that has serious impacts. So the date or timing of impacts is the third kicker.

Here is an illustration.

[Figure: illustration of the timing of monsoon impacts]

Neighbour: I noticed earlier that the Arctic is warming the most. Is that a threat to us?

ClimateCoach: Depends what you mean by ‘us’. There is proportionally much greater warming in the Arctic, due to a long-predicted effect called ‘polar amplification’, in places as much as 10°C of warming, as shown in this map of the Arctic. But what happens in the Arctic doesn’t stay in the Arctic.

[Figure: map of Arctic temperature extremes]

Neighbour: I appreciate that a warming Arctic is bad for ecosystems in the Arctic – polar bears and so on – but why will that affect us?

ClimateCoach: You’ve heard about the jet stream on the weather reports, I am sure [strictly, the Arctic polar jet stream]. Well, because the Arctic warms more than the latitudes below it, the jet stream becomes more wiggly than before, which can be very disruptive. This can create, for example, fixed highs over Europe, and very hot summers.

Neighbour: But we’ve had very hot summers before, why would this be different?

ClimateCoach: It’s not about something qualitatively different (yet), but it is quantitatively different. Very hot summers in Europe are now much more likely due to global warming, and that has real impacts: 70,000 people died in Europe during the 2003 heatwave. Let me show you an illustrative graph. Here is a simple distribution curve; the blue arrow marks a temperature at and above which high impacts are expected, but with a low chance. Suppose this represents the situation in 1850.

[Figure: normal distribution with the high-impact threshold marked]

Neighbour: Ok, so I understand the illustration … and?

ClimateCoach: So, look at what happens when we increase the average by just a little bit to a higher temperature, say, by 1°C to represent where we are today. The whole curve shifts right. The ‘onset of high impact’ temperature is fixed, but the area under the curve to the right of this has increased (the red area has increased), meaning a greater chance than before. This is the fourth kicker.

In our real-world example, a region like Europe, the chance of high-impact hot summers has increased within only 10 to 15 years from being a 1-in-50-year event to being a 1-in-5-year event; a truly remarkable increase in risk.

[Figure: shifted mean and the increased chance of extremes]

Neighbour: It’s like loading the dice!

ClimateCoach: Exactly. We (humans) are loading the dice. As we add more CO2 to the atmosphere, we load the dice even more. 
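As an aside for the more numerically minded reader, here is a minimal sketch of the ‘loading the dice’ idea, assuming summer temperatures follow a simple normal distribution. The mean, spread and ‘high impact’ threshold below are invented purely to reproduce a shift of roughly 1-in-50 to 1-in-5; they are not fitted to any observational dataset.

```python
# Illustrative only: how a ~1°C shift in the mean inflates the chance of
# exceeding a fixed 'high impact' temperature threshold.
# All numbers are invented for illustration, not fitted to observations.
from scipy.stats import norm

mean_1850, mean_today = 21.0, 22.0   # hypothetical mean summer temperature (°C)
sigma = 0.8                          # hypothetical year-to-year spread (°C)
threshold = 22.65                    # hypothetical 'high impact' temperature (°C)

p_then = norm.sf(threshold, loc=mean_1850, scale=sigma)  # P(T > threshold), 1850 climate
p_now = norm.sf(threshold, loc=mean_today, scale=sigma)  # P(T > threshold), +1°C climate

print(f"Chance of a high-impact summer, 1850 climate: {p_then:.1%}")  # roughly 1 in 50
print(f"Chance of a high-impact summer, +1°C climate: {p_now:.1%}")   # roughly 1 in 5
print(f"Risk multiplied by roughly {p_now / p_then:.0f}x")
```

The particular numbers do not matter; the point is that a small shift in the mean multiplies the area in the tail many times over.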

Neighbour: Even so, we have learned to cope with very hot summers, haven’t we? If not, we can adapt, surely?

ClimateCoach: To an extent yes, and we’ll have to get better at it in the future. But consider plants and animals, or people who are vulnerable or have to work outside, like the millions from the Indian sub-continent who work in construction in the Middle East. It doesn’t take much (average) warming to make it impossible (for increasingly long periods) to work outside without heat exhaustion. And take plants: a recent paper in Nature Communications showed that crop yields in the USA would be very vulnerable to excessive heat.

Neighbour: Can’t the farmers adapt by having advanced irrigation systems? And didn’t I read somewhere that extra CO2 acts like a fertiliser for plants?

ClimateCoach: To a point, but what that research paper showed was that the warming effect wins out, especially as the period of excessive heat increases, and by the way the fertilisation effect has been overstated. The extended duration of the warming will overwhelm these and other ameliorating factors. This is the fifth kicker.

This can mean crop failures and hence impacts on prices of basic food commodities, even shortages as impacts increase over time.

Neighbour: And what if we get to 2°C? (meaning a 2°C GMST rise above pre-industrial)

ClimateCoach: Changes are not linear. Take the analogy of car speed and pedestrian fatalities. Above 20 miles per hour the fatality curve rises sharply, because the car’s kinetic energy is a function of the square of its speed, and because of vulnerability thresholds in the human frame. Global warming will similarly cross thresholds in both natural and human systems, which have been in balance for a long time, so extremes become increasingly disruptive. Take an impact on a natural species or habitat: after one very bad year there may be recovery over the following 5-10 years, which is ok if very bad years come along only once in 25-50 years. But suppose very bad years come once every 5 years? That would leave no time to recover. Nature is awash with non-linearities and thresholds like this.
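To make the square-law point concrete, here is a tiny sketch; the car mass and speeds are arbitrary illustrative values, and real fatality risk also depends on the biomechanical thresholds mentioned above.

```python
# Kinetic energy scales with the square of speed, so a modest increase in
# speed produces a disproportionate increase in the energy of an impact.
def kinetic_energy(mass_kg, speed_mph):
    speed_ms = speed_mph * 0.44704        # convert mph to metres per second
    return 0.5 * mass_kg * speed_ms ** 2  # E = 1/2 m v^2, in joules

car_mass = 1500  # kg, a typical family car (illustrative)
for mph in (20, 30, 40):
    ratio = kinetic_energy(car_mass, mph) / kinetic_energy(car_mass, 20)
    print(f"{mph} mph carries {ratio:.2f}x the impact energy of 20 mph")
```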

Neighbour: Is that what is happening with the Great Barrier Reef – I heard something fleetingly on BBC Newsnight the other night?

ClimateCoach: I think that could be a very good example of what I mean. We should talk again soon. Bring friends. If they want some background, you might ask them to have a read of my piece Demystifying Global Warming & Its Implications, which is along the lines of a talk I give.

Putting it together for the person in the street

I have explored one of many possible conversations I could have had. I am sure it could be improved upon, but I hope it illustrates the approach. We should be engaging those people (the majority of the population) who are curious about climate change but have not involved themselves so far, perhaps because they feel a little intimidated by the subject.

When they do ask for help, the first thing they need to understand is that indeed global warming is real, and is demonstrated by those average measures like GMST, and the other ones mentioned such as sea-level rise, ice sheet mass loss, and ocean temperature; not to mention the literally thousands of indicators from the natural world (as documented in the IPCC 5th Assessment Report).

There are also other long-term unusual sources of evidence to add to this list, as Dr Ed Hawkins has discussed, such as the date at which the cherry blossom flowers in Kyoto, which is trending earlier and earlier. Examples such as these are in many ways easier for people to relate to.

Gardeners the world over can relate to evidence of cherry blossom, wine growers to impacts on wine growing regions in France, etc. These diverse and rich examples are in many ways the most powerful for a lay audience.

The numerous lines of evidence are overwhelming.

So averages are crucial, because they demonstrate a long-term trend.

When we do raise GMST, we should make sure to show the right curve. If the aim is to show unequivocal global warming at the surface, then why not show one that reflects the average over a rolling 30-year period; the ‘smoothed’ curve. This avoids getting into debates with ‘contrarians’ on the minutiae of annual variations, which can come across as both abstract and arcane, and put people off.
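For anyone who wants to plot such a curve themselves, here is a minimal sketch using pandas; the file name and column names are hypothetical, standing in for any published annual GMST anomaly series (GISTEMP, HadCRUT, and so on).

```python
# Sketch: 30-year rolling mean of annual GMST anomalies, to show the
# long-term trend rather than year-to-year wiggles.
# Assumes a CSV with 'year' and 'anomaly' columns (file name is hypothetical).
import pandas as pd

gmst = pd.read_csv("gmst_annual_anomalies.csv")
gmst = gmst.sort_values("year").set_index("year")

# Centred 30-year rolling mean; edges stay blank until a full window exists
gmst["smoothed"] = gmst["anomaly"].rolling(window=30, center=True).mean()

ax = gmst[["anomaly", "smoothed"]].plot(
    title="GMST anomaly: annual values vs 30-year rolling mean")
ax.set_ylabel("Anomaly (°C)")
```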

This answers the first question people will be asking, simply: “Is the world warming?”. The short answer is “Unequivocally, yes it is”. And that is what the IPCC 5th Assessment Report concluded.

But averages are not the whole story.

There is the second but equally important question “Why worry about a 1°C rise (in global mean surface temperature)?”

I suspect many people are too coy to ask such a simple question. I think it deserves an answer and the dialogue above tried to provide one.

Here and now, people and ecosystems experience weather, not climate change, and when it is an extreme event, the impacts are viscerally real in time and place, and are far from being apparently arcane debating points.

So while a GMST rise of 1°C sounds like nothing to the untutored reader, when translated into extreme weather events it can be highly significant. The small change in the average is magnified into significant effects, as evidenced by the increasing chance of extreme events of different kinds, in different localities, which can increasingly be attributed to man-made global warming.

The kickers highlighted in the dialogue were:

  • Firstly, people live on land, so they experience a warming greater than the global (land plus ocean) average suggests (this is not to discount the impacts on oceans);
  • Secondly, geographical and meteorological patterns mean that there are a wide range of regional variations;
  • Thirdly, the timing (or date) at which an impact is felt is critical for ecosystems and agriculture, and bad timing will magnify the effect greatly;
  • Fourthly, as the average increases, so does the chance of extremes. The dice are getting loaded, and as we increase CO2, we load the dice more;
  • Fifthly, the duration of an extreme event will overwhelm defences, and an extended duration can cross dangerous thresholds, moving from increasing harm into fatal impacts, such as crop failure.

I have put together a graphic to try to illustrate this sequence of kickers:

[Figure: graphic summarising the sequence of kickers]

As noted on this graphic (which I used in some climate literacy workshops I ran recently), the same logic used for GMST can be applied to other seemingly ‘small’ changes in global averages such as rainfall, sea-level rise, ocean temperature and ocean acidification. To highlight just two of these other examples:

  • an average global sea-level rise translates into impacts such as extreme storm surges, damaging low-lying cities such as New York and Miami (as recently reported and discussed).
  • an average ocean temperature rise translates into damage to coral reefs (two successive years of extreme events have caused serious damage to two-thirds of the Great Barrier Reef, as a recent study has confirmed).

Even in the relatively benign context of the UK’s temperate climate, the Royal Horticultural Society (RHS), in a report just released, is advising gardeners on climate change impacts and adaptation. The instinctively conservative ‘middle England’ may yet wake up to the realities of climate change when it comes home to roost, and bodies such as the RHS remind them of the reasons why.

The impacts of man-made global warming are already with us, and they will only get worse.

How much worse depends on all of us.

Not such a stupid question

There was a very interesting event hosted by CSaP (Centre for Science and Policy) in Cambridge recently. It introduced some new work being done to bring together climate science and ‘big data analytics’. Dr Emily Shuckburgh’s talk looked precisely at the challenge of understanding local risks; the report of the talk included the following observation:

“Climate models can predict the impacts of climate change on global systems but they are not suitable for local systems. The data may have systematic biases and different models produce slightly different projections which sometimes differ from observed data. A significant element of uncertainty with these predictions is that they are based on our future reduction of emissions; the extent to which is yet unknown.

To better understand present and future climate risks we need to account for high impact but low probability events. Using more risk-based approaches which look at extremes and changes in certain climate thresholds may tell us how climate change will affect whole systems rather than individual climate variables and therefore, aid in decision making. Example studies using these methods have looked at the need for air conditioning in Cairo to cope with summer heatwaves and the subsequent impact on the Egyptian power network.”

This seems to be breaking new ground.

So maybe the proverbial ‘person in the street’ is right to ask stupid questions, because they turn out not to be so stupid after all.

Changing the Conversation

I assume that the person in the street is curious and has lots of questions; and I certainly don’t judge them based on what newspaper they read. That is my experience. We must try to anticipate and answer those questions, and as far as possible, face to face. We must expect simple questions, which aren’t so stupid after all.

We need to change the focus from the so-called ‘deniers’ or ‘contrarians’ – who soak up so much effort and time from hard pressed scientists – and devote more effort to informing the general public, by going back to the basics. By which I mean, not explaining ‘radiative transfer’ and using technical terms like ‘forcing’, ‘anomaly’, or ‘error’, but using plain English to answer those simple questions.

Those embarrassingly stupid questions that will occur to anyone who first encounters the subject of man-made global warming; the ones that don’t seem to get asked and so never get answered.

Maybe let’s start by going beyond averages.

No one will think you small for doing so, not even a Dutchman.

[updated 15th April]


Complexity ain’t that complex

According to Megan McArdle, writing in a Bloomberg View opinion piece, we cannot trust computer models of the climate because economists have failed when they tried to model complex economic systems.

Leaving aside the fundamental fact that the ‘atoms’ of physics (molecules, humidity, etc.) are consistent in their behaviour, whereas the ‘atoms’ of economics (humans) are fickle and prone to ‘sentiment’, this is a failed form of denialism.

You do not have to be Champagne maker Taittinger investing in sparkling wine production in Kent (England), for example, to know that global warming is real, because there are thousands of scientifically observed and published indicators of a warming world. Most of these receive little attention in the media compared to the global average surface temperature (important though it is).

She repeats what I believe is a key confusion in her piece:

“This lesson from economics is essentially what the “lukewarmists” bring to discussions about climate change. They concede that all else equal, more carbon dioxide will cause the climate to warm. But, they say that warming is likely to be mild unless you use a model which assumes large positive feedback effects.”

Matt Ridley also often rails against the fact that the feedback from increased humidity turns a warming of 1°C (from doubling CO2 from pre-industrial levels) into closer to 3°C (the mean predicted level of warming).

This has nothing to do with the inherent complexity in the climate models as it is derived from basic physics (the Infra-Red spectra of CO2 and H2O; the Clausius–Clapeyron relation that determines the level of humidity when the atmosphere warms; some basics of radiative transfer; etc.). Indeed, it is possible to get to an answer on the basic physics with pencil and paper, and the advanced computer models come to broadly the same conclusion (what the models are increasingly attempting to do is to resolve more details on geographic scales, time scales and within different parts of the Earth system, such as that big block of ice called Antarctica).
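To make the pencil-and-paper point concrete, here is a back-of-envelope sketch using standard textbook values (the logarithmic forcing formula for CO2 and an approximate Planck response); the feedback gains at the end are indicative round numbers of my own, not the output of any model.

```python
# Back-of-envelope: how feedbacks (chiefly water vapour) amplify the
# no-feedback warming from a doubling of CO2. Textbook values, not model output.
import math

delta_F = 5.35 * math.log(2)   # radiative forcing for doubled CO2, ~3.7 W/m^2
planck = 3.2                   # Planck (no-feedback) response, ~3.2 W/m^2 per K

dT_no_feedback = delta_F / planck   # ~1.2 K of warming with no feedbacks
print(f"No-feedback warming for doubled CO2: {dT_no_feedback:.1f} K")

# Express the net positive feedbacks (water vapour, lapse rate, albedo, clouds)
# as a gain factor f, so that dT = dT_no_feedback / (1 - f).
for f in (0.0, 0.5, 0.6):
    print(f"feedback gain f = {f:.1f}: warming ~ {dT_no_feedback / (1 - f):.1f} K")
```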

But even in the unlikely event that Megan McArdle were to accept these two incontrovertible points (the world is warming, and the central feedback, from H2O, is not in any way compromised by some hinted-at issue of ‘complexity’), she might still respond with something like:

“oh, but we do rely on complex models to make predictions of the future and things are too chaotic for this to be reliable.”

Well, we have learned from many great minds like Ilya Prigogine that there is complex behaviour in simple systems (e.g. the orbit of Pluto appears on one level to follow simple Newtonian mechanics, but in addition has apparently random wobbles). One therefore needs to be careful in specifying at what level of order ‘chaotic behaviour’ exists. Pluto is both ordered and chaotic.
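A classic, if well-worn, illustration of chaos arising from a trivially simple deterministic rule is the logistic map. This example is mine, not Prigogine’s, but it makes the same point in a few lines:

```python
# The logistic map, x -> r*x*(1-x): one line of deterministic arithmetic that
# settles to a steady value for some r and behaves chaotically for others.
def logistic_orbit(r, x0=0.2, n=200):
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

print("r = 2.8 (ordered, settles down):", [round(x, 3) for x in logistic_orbit(2.8)[-5:]])
print("r = 3.9 (chaotic, never settles):", [round(x, 3) for x in logistic_orbit(3.9)[-5:]])
```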

Other systems that are complex (e.g. the swirling atmosphere of Jupiter) can nonetheless display ‘emergent’ ordered behaviour (e.g. the Great Red Spot). We see this all around us in the world, and ‘complexity theory’ is now a branch of science addressing many phenomena that were otherwise inaccessible to pencil and paper: the computer is an essential tool in exploring these phenomena.

Complexity is therefore not, in itself, grounds for the lazy slur against models that prediction is impossible. There is often order to be found, at some level, in a system, however complex it is.

Yet, it can also be very simple.

At its most basic, adding energy to the climate system, as we are doing by adding heat-trapping gases to the atmosphere, tends to warm things up, because of well-established basic physics.

In a similar way, printing too much money in an economy tends to lead to inflation, despite the irreducible random factors in human nature.

It ain’t rocket science, and you don’t need to be an expert in complexity theory to understand why the world is warming.
