As you walk or commute to work, does it ever occur to you how much thought and effort goes into keeping the lights on?
I remember, many years ago, doing some consulting for a major utilities company, and on one visit being taken to a room full of PhD-level mathematicians. “What are they doing?” I asked. “Refining models for calculating the price of electricity!” The models had to calculate the price on a half-hourly basis for the market. On the supply side, the modellers had to worry about how electricity is distributed and how fast incremental supply can be brought on stream; on the demand side, the cycles of demand as well as unusual events like 10 million electric kettles being switched on at half time during a major football game.
It should be pretty obvious why modelling of electricity supply and demand over a 24-hour cycle is crucial to the National Grid, generators, distributors and consumers. If we misjudge the response of the system, that could mean ‘brown-outs’ or even cuts.
In December 2012: “… the US Department of Homeland Security and Science held a two-day workshop to explore whether current electric power grid modelling and simulation capabilities are sufficient to meet known and emerging challenges.”
As explained in the same article:
“New modelling approaches could span diverse applications (operations, planning, training, and policymaking) and concerns (security, reliability, economics, resilience, and environmental impact) on a wider set of spatial and temporal scales than are now available.
A national power grid simulation capability would aim to support ongoing industry initiatives and support policy and planning decisions, national security issues and exercises, and international issues related to, for instance, supply chains, interconnectivity, and trade.”
So we move rapidly from something fairly complex (calculating the price of electricity across a grid) to an integrated tool for dealing with a multitude of operational and strategic demands and threats. The stakeholders’ needs have expanded, and hence so have the demands on the modellers: “What if this, what if that?”
Behind the scenes, unseen to the vast majority of people, are expert modellers, backed up by multidisciplinary expertise, using a range of mathematical and computing techniques to support the operational and strategic management of our electricity supply.
But this is just one of a large number of human and natural systems that call out for modelling. Naturally, this started with the physical sciences but has moved into a wide range of disciplines and applications.
The mathematics applied to the world derives from the calculus of the 17th century, but for a long time it was restricted to those problems that were solvable analytically, using pencil and paper. It required brilliant minds like Lagrange and Euler to develop this mathematics into a powerful armoury for both fundamental science and applied engineering. Differential equations were the lingua franca of applied mathematics.
However, it is not an exaggeration to say that a vast range of problems were totally intractable using solely analytical methods or even hand-calculated numerical methods.
And even for some relatively ‘simple’ problems, like the motions of the planets, the ‘three-body problem’ meant that closed mathematical expressions for the positions of the planets at any point in time were not possible. We have to calculate the positions numerically, using an iterative method to find a solution. The discovery of Neptune was an example of how to do this, but it required laborious numerical calculations.
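To make the contrast with a closed-form solution concrete, here is a minimal sketch (in Python, purely illustrative and not the method used historically) of the kind of iterative calculation involved: advancing a planet’s position under the Sun’s gravity one small time step at a time, because no formula hands us the answer directly.

```python
import math

# Illustrative sketch only: step a planet around the Sun numerically,
# since no closed-form expression gives its position at an arbitrary time.
G_M_SUN = 4 * math.pi**2          # gravitational parameter in AU^3 / yr^2

def step(x, y, vx, vy, dt):
    """Advance position and velocity by one time step (semi-implicit Euler)."""
    r = math.hypot(x, y)
    ax, ay = -G_M_SUN * x / r**3, -G_M_SUN * y / r**3   # acceleration toward the Sun
    vx, vy = vx + ax * dt, vy + ay * dt
    return x + vx * dt, y + vy * dt, vx, vy

# Start 1 AU from the Sun at roughly circular orbital speed, then iterate.
x, y, vx, vy = 1.0, 0.0, 0.0, 2 * math.pi
for _ in range(1000):
    x, y, vx, vy = step(x, y, vx, vy, dt=0.001)
print(f"Position after 1 year: ({x:.3f}, {y:.3f}) AU")
```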
Move from Neptune to landing a man on the Moon, or to Rosetta’s Philae lander on the surface of the comet 67P/Churyumov–Gerasimenko, and pencil and paper are no longer practical; we need a computer. Move from this to modelling a whole galaxy of stars, a collision of galaxies, or even the evolution of the early universe, and we need a very large computer.
Of course some people had dreamed of doing the necessary numerical calculations long before the digital computer was available. In 1922 Lewis Richardson imagined 64,000 people each with a mechanical calculator in a stadium executing numerical calculations to predict the weather.
Only with the advent of the modern digital computer was this dream realised. And of course, the exponential growth in computing power has meant that each roughly 18-month doubling of computing power has created new opportunities to broaden or deepen model capabilities.
John von Neumann, a key figure in the development of the digital computer, was interested in two applications – modelling the processes involved in the explosion of a thermonuclear device and modelling the weather.
The innovation in the early computers was driven by military funding, and much of the pioneering work on computational physics came out of places like the Lawrence Livermore Laboratory.
The Monte Carlo method, a ubiquitous tool in many different models and applications, was invented by Stanislaw Ulam (the mathematician who co-devised the Teller–Ulam configuration for the H-bomb). It is one of many innovations used in computer models.
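As a toy illustration of the Monte Carlo idea (nothing like Ulam’s original application), the sketch below estimates π by random sampling; the same random-sampling principle underlies its use in far more elaborate models.

```python
import random

def estimate_pi(n_samples: int, seed: int = 42) -> float:
    """Toy Monte Carlo: estimate pi from the fraction of random points in the
    unit square that fall inside the quarter circle of radius 1."""
    rng = random.Random(seed)
    inside = sum(1 for _ in range(n_samples)
                 if rng.random()**2 + rng.random()**2 <= 1.0)
    return 4.0 * inside / n_samples

# The estimate improves (slowly) as the number of random samples grows.
for n in (1_000, 100_000, 10_000_000):
    print(f"{n:>10,} samples -> pi ~ {estimate_pi(n):.5f}")
```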
The same mathematics and physics used for classical analysis has been reformulated in a form susceptible to computing, so that the differential calculus is rendered as the difference calculus. The innovations and discoveries made then and since are as much a part of the science and engineering as the fundamental laws on which they depend. The accumulated knowledge and methods have served each generation.
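A minimal sketch of what rendering the differential calculus as the difference calculus means in practice, using the one-dimensional heat equation as a stand-in (the grid sizes and numbers here are illustrative assumptions, not drawn from any real model): derivatives become differences between neighbouring grid points, and the solution is marched forward step by step.

```python
# The heat equation du/dt = alpha * d2u/dx2, with derivatives replaced by
# finite differences on a grid -- the 'difference calculus' in miniature.
alpha, dx, dt = 1.0, 0.1, 0.004           # dt chosen so alpha*dt/dx**2 <= 0.5 (stability)
u = [0.0] * 51
u[25] = 1.0                                # initial hot spot in the middle of the rod

for _ in range(200):                       # march forward in time, one step at a time
    u = [u[i] + alpha * dt / dx**2 * (u[i+1] - 2*u[i] + u[i-1])
         if 0 < i < len(u) - 1 else 0.0    # fixed (cold) ends of the rod
         for i in range(len(u))]

print(f"Peak temperature after 200 steps: {max(u):.4f}")   # the hot spot has spread and decayed
```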
Some would argue that, far from merely making complex problems tractable in some passive role, computer models provide a qualitatively different approach to what was possible prior to digital computing. Because computers act like experimental devices from which insights can be gleaned, they may actually inspire new approaches to the fundamental science in a proactive manner, helping to reveal emergent patterns and behaviours in systems that are not obvious from the basic physics. This is not a new idea …
“Given this new mathematical medium wherein we may solve mathematical propositions which we could not resolve before, more complete physical theories may possibly be developed. The imagination of the physicist can work in a considerably broader framework to provide new and perhaps more valuable physical formulations.” David Potter, “Computational Physics”, Wiley, 1973, page 3.
For the most part, setting aside colliding galaxies and other ‘pure science’ problems, the types of models I am concerned with here are ones that can ultimately impact human society. These are not confined to von Neumann’s preferred physical models.
An example from the world of genomics may help to illustrate just how broad the application of models is in today’s digital world. In looking at the relationship between adaptations in the genotype (e.g. mutations) and the phenotype (e.g. metabolic processes), the complexities are enormous, but once again computer models provide a way of exploring the possibilities and patterns, teaching us something and helping to direct new applications and research. A phrase used by one of the pioneers in this field, Andreas Wagner, is revealing …
“Computers are the microscopes of the 21st Century”
BBC Radio 4, ‘Start The Week’, 1st December 2014.
For many of the complex real-world problems it is simply not practical, ethical or even possible to do controlled experiments, whether on our electricity grid, the spread of disease, or the climate. We need to be able to conduct multiple ‘runs’ of a model to explore a range of things: its sensitivity to initial conditions; how good the model is at predicting macroscopic emergent properties (e.g. Earth’s averaged global temperature); the response of the system to changing external parameters (e.g. the cumulative level of CO2 in the atmosphere over time); and so on.
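As a toy illustration of why multiple runs matter (using the logistic map as a stand-in, not any real grid or climate model), the sketch below runs a small ‘ensemble’ from nearly identical initial conditions and shows the trajectories diverging, while statistics across the ensemble can remain meaningful.

```python
# Tiny 'ensemble' of a chaotic toy model (the logistic map) started from
# near-identical initial conditions -- a sketch of the practice, not a real model.
def run(x0: float, r: float = 3.9, steps: int = 50) -> float:
    """Iterate the logistic map x -> r*x*(1-x) and return the final state."""
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

ensemble = [0.200 + i * 1e-6 for i in range(5)]   # starting points differ by a millionth
for x0 in ensemble:
    print(f"x0 = {x0:.6f} -> final state {run(x0):.4f}")

# Individual trajectories diverge and are unpredictable in detail, but the
# statistics across many runs (the model's 'climate') can still be robust.
```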
Models are thereby not merely a ‘nice to have’, but an absolute necessity if we are to get a handle on these questions, to understand these complex systems better and to explore a range of scenarios. This in turn is needed if we as a society are to be able to manage risks and explore options.
Of course, no model is ever a perfect representation of reality. I could repeat George Box’s famous aphorism that “All models are wrong but some are useful”, although, coming as it did from the perspective of a statistician working with relatively simple models, it may not be so useful when thinking about modern computer models of complex systems. May I suggest a different (but much less catchy) phrase:
“Models are often useful, sometimes indispensable and always work in progress”
One of the earliest mathematicians to use computers for modelling was the American Cecil Leith, who during the war worked on models of thermonuclear devices and later worked on models for the weather and climate. In a wide-ranging 1997 interview covering his early work, he responded to a question about those ‘skeptics’ who were critical of the climate models:
“… my concern with these people is that they have not been helpful to us by saying what part of the model physics they think may be in error and why it should be changed. They just say, “We don’t believe it.” But that’s not very constructive. And so one has the feeling they don’t believe it for other reasons of more political nature rather than scientific.”
When the early modellers started to confront difficult issues such as turbulence, did they throw their hands up and say “oh, it’s too hard, let’s give up”? No: with the help of new ideas and methods, such as those originating from the Russian mathematicians Kolmogorov and Obukhov, progress was made.
The cyclical nature of these improvements comes from a combination of improvements in methods, new insights, improved observational data (including filling in gaps) and raw computing power.
A Model of Models might look something like this (taken from my whiteboard):
In this modern, complex world we inhabit, models are not a nice-to-have but an absolute necessity if we are to embrace complexity, gain insights into these systems, and anticipate and respond to scenarios for the future.
We are not able to control many of the variables (and sometimes only a very few), but we can see what the response is to changes in the variables we do have control over (e.g. the use of storage arrays to facilitate a transition to greater use of renewable energy). This in turn is needed if we as a society are to be able to manage risks and explore options, for both mitigation and adaptation in the case of global warming. The options we take need to emerge from an inclusive dialogue, and for that we need the best information available to inform the conversation.
Some, like US Presidential candidate Ted Cruz, would prefer to shoot the messenger and shut down the conversation when they do not like what the science (including the models) is telling them (e.g. by closing down the hugely valuable climate research based at NASA).
While many will rightly argue that modelling is not the whole story, or even the main story, because the impacts of increased man-made CO2 are already evident in a wide range of observed changes (e.g. the large number of retreating glaciers), one is bound to ask, across all the diverse fields mentioned, “what is the alternative to doing these models?” Are we …
- To wait for a new disease outbreak without any tools to understand strategies and options for disease control, to know in advance the best deployment of resources, or to anticipate the likely change in the spread of disease when a virus mutates to an airborne mode of infection?
- To wait for a brown-out or worse because we do not understand the dynamical response of our complex grid of supply to large and fast changes in demand, or the complexities of an increasingly fragmented supply-side?
- To wait for the impacts of climate change and make no assessment of when and how much to invest in new defences such as a new Thames Barrier for London, or do nothing to advise policy makers on the options for mitigation to reduce the impact of climate change?
Surely not.
Given the tools we have to hand, and the knowledge and methods accumulated over decades, it would be grossly irresponsible for us as a society not to undertake modelling of these kinds. We should not be put off by the technical challenges faced in doing so, and certainly not by those naysayers who don’t ‘believe’ but never contribute positively to the endeavour.
We would live in a more uncertain world, prone to many more surprises, if we failed to model the human and natural systems on which we rely and our future depends. We would also fail to effectively exploit new possibilities if we were unable to explore these in advance (e.g. the positive outcomes possible from a transition to a decarbonised world).
Let’s be in praise of computer models, and be thankful that some at least – largely unseen and mostly unthanked – are providing the tools to help make sense of the future.
Richard Erskine, 24th May 2015