Climate change and its effects on the environment and society are among the most important and controversial issues of the present day, as can be seen at the United Nations Climate Change Conference taking place these days. But to engage in an in-depth debate on climate change, it is important to know how the models on which the predictions and recommendations are based are built, how they are tested, what kind of predictions they produce, and how reliable they are.
We start with a clear fact: it is very difficult to predict the weather. The atmosphere is a complex system in which many factors interact and in which chaotic behaviour appears. Essentially, climate is the interaction of the energy emitted by the Sun with the atmosphere, the oceans, ice, and vegetation, an evolution that we describe using laws of physics validated over the centuries. Most climate models start from weather models that offer a five-day forecast. They use the Navier-Stokes equations on a rotating sphere (a set of partial differential equations that determine the movement of the atmosphere, carrying information on the momentum and energy of the air and the oceans) and the laws of thermodynamics (which describe the evolution of temperature and the effect of the Sun's heat on air, water, and evaporation).
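As a point of reference, the momentum balance at the heart of these models, the Navier-Stokes equation in a rotating frame, can be written in a standard textbook form (a schematic statement, not the exact formulation used by any particular model):

```latex
\frac{D\mathbf{u}}{Dt} + 2\,\boldsymbol{\Omega}\times\mathbf{u}
  = -\frac{1}{\rho}\nabla p + \nu\,\nabla^{2}\mathbf{u} + \mathbf{g}
```

Here \(\mathbf{u}\) is the air velocity, \(\boldsymbol{\Omega}\) the Earth's angular velocity (giving the Coriolis term), \(\rho\) the density, \(p\) the pressure, \(\nu\) the viscosity, and \(\mathbf{g}\) gravity.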
The resulting system of equations is solved approximately by computer every six hours. To do this, the Earth and its atmosphere are divided into small cells in which the solutions are estimated; this method is called discretization. Making a weather forecast involves solving on the order of a billion equations. The process requires inverting very large matrices, which takes about an hour of computation on a supercomputer. The results obtained today are quite accurate, and, in addition, we can compare them with reality every day to determine the errors made.
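The idea of dividing space into small cells and estimating the solution in each one can be illustrated, in a drastically reduced form, with a one-dimensional heat equation solved by an explicit finite-difference scheme (a minimal sketch for illustration, far removed from a real atmospheric solver):

```python
# Toy discretization: a hot rod divided into 21 cells, with the heat
# equation advanced cell by cell. The ends are held at zero.

def heat_step(u, alpha, dx, dt):
    """Advance the cell temperatures u by one time step."""
    new = u[:]
    for i in range(1, len(u) - 1):
        # Discrete Laplacian: net heat flowing in from the two neighbours.
        new[i] = u[i] + alpha * dt / dx**2 * (u[i - 1] - 2 * u[i] + u[i + 1])
    return new

# A hot spot in the middle of a cold rod diffuses outwards.
u = [0.0] * 21
u[10] = 100.0
for _ in range(200):
    u = heat_step(u, alpha=1.0, dx=1.0, dt=0.25)
```

Real models do the analogue of this in three dimensions over the whole globe, and often use implicit schemes, which is where the very large matrix inversions come from.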
In addition to these weather forecasting algorithms, climate evolution models add physical, chemical, and even biological information. The result is systems of highly nonlinear equations that can have chaotic solutions. The aim is also to obtain forecasts not for five days, but for thousands or even millions of years. It is also necessary to take into account the current and future impact of humanity (the increase of carbon dioxide in the atmosphere from burning fossil fuels, changes in agricultural practices, or the felling of tropical rainforests...). This is perhaps what adds the most uncertainty to the model.
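The chaotic character of such nonlinear systems appears already in the Lorenz system, a famous three-equation reduction of atmospheric convection. In this sketch (a simple Euler integration, chosen only for illustration), two trajectories that start a hundred-millionth apart end up completely separated:

```python
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz equations."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0, 1.0, 1.0 + 1e-8)   # perturbed by one part in 10^8

max_separation = 0.0
for _ in range(5000):
    a, b = lorenz_step(a), lorenz_step(b)
    max_separation = max(max_separation, abs(a[0] - b[0]))
# The tiny initial difference is amplified to the scale of the system
# itself: small errors in the initial data dominate long-range forecasts.
```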
The size and complexity of the resulting systems make them extremely difficult to verify, modify, and run. It is also difficult to interpret the results, as they produce a gigantic amount of data, complicated to analyse and even to store. Fortunately, climate models have evolved considerably in recent decades, both in precision and in complexity. In parallel, computers are also improving (they are getting faster), as is the software used to obtain good approximations of the solutions.
To assess predictions, it is also essential to control the errors made, which come from the way the physics is represented, from the algorithms used to solve the equations, from the coding of those algorithms, from the data fed into the calculation, and from the initial conditions used to start the whole system. To assess the magnitude of the final error, it is necessary to compare the estimated solutions with the actual solution.
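On a problem whose exact solution is known, this comparison can be carried out directly. The sketch below (a toy example, not a climate computation) integrates du/dt = -u with Euler's method and measures the error against the exact solution e^(-t); halving the step roughly halves the error, as numerical analysis predicts for a first-order method:

```python
import math

def euler_error(dt, t_end=1.0):
    """Error of Euler's method for du/dt = -u against the exact e^(-t)."""
    u, t = 1.0, 0.0
    while t < t_end - 1e-12:
        u += dt * (-u)   # one Euler step
        t += dt
    return abs(u - math.exp(-t_end))

e1 = euler_error(0.1)    # coarse step
e2 = euler_error(0.05)   # half the step -> roughly half the error
```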
A first step is to test the weather model on which the climate model is based, which would expose, for example, any systematic coding error. Also, by mathematical arguments (from a branch called numerical analysis) it is possible to prove the convergence of the algorithms used; and tools from statistics and probability are used to quantify the uncertainty of the data being worked with, both in the initial conditions and in the possible scenarios. On the other hand, because it is impossible to test climate models directly against future data (unless we are willing to wait decades), they are compared with data from the past, using a method called hindcasting, which rests on the idea that if a model can predict the past, we have grounds to believe it will also be able to anticipate the future.
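The workflow behind hindcasting can be caricatured in a few lines: calibrate a model on the early part of a record, then check its predictions against a later part that was held back. The data below are entirely synthetic (a linear trend plus a fixed wiggle), used only to show the shape of the procedure:

```python
def fit_line(xs, ys):
    """Least-squares straight line through the points (xs, ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Synthetic "temperature anomaly" record: trend plus a periodic wiggle.
years = list(range(1950, 2020))
temps = [0.01 * (y - 1950) + 0.05 * ((y % 7) - 3) for y in years]

# Calibrate on 1950-1989, then hindcast the held-back years 1990-2019.
slope, intercept = fit_line(years[:40], temps[:40])
errors = [abs(slope * y + intercept - t)
          for y, t in zip(years[40:], temps[40:])]
mean_error = sum(errors) / len(errors)
```

A real hindcast compares full model runs against decades of observed records, but the train-on-the-past, verify-on-the-later-past logic is the same.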
But even though we are able to anticipate certain events and estimate how accurate our forecast is, it is still very difficult to explain why we get the final results. To understand this, we work with a hierarchy of models of increasing complexity: we start from simple building blocks that we are able to understand, and from them the next pieces are assembled, whose behaviour is more or less deduced from the previous ones, and so on: energy-balance models, box models, Earth models of intermediate complexity, reduced climate models, and global climate models for the atmosphere and oceans. Together, all these models rest on solid scientific foundations and offer us the best way to explain changes in past climate and to predict what will happen in the future.
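The simplest brick in that hierarchy, the zero-dimensional energy-balance model, fits in a few lines: the sunlight the planet absorbs is set equal to the infrared radiation it emits (the Stefan-Boltzmann law). The numbers below are standard textbook values:

```python
S = 1361.0          # solar constant, W/m^2
albedo = 0.3        # fraction of sunlight reflected back to space
sigma = 5.670e-8    # Stefan-Boltzmann constant, W/(m^2 K^4)

# Balance: absorbed = emitted  ->  (S / 4) * (1 - albedo) = sigma * T^4
# (the factor 4 is the ratio of the Earth's surface to its sunlit disc).
T = ((S / 4) * (1 - albedo) / sigma) ** 0.25
# T comes out near 255 K, about -18 C: colder than the observed ~288 K
# average. The gap is the greenhouse effect, which fuller models add.
```

Each later rung of the hierarchy refines a balance like this one with geography, circulation, and feedbacks.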
The final questions are: do the different models agree in their predictions? However much some politicians want to deny it, they are all consistent with the observations: they show that the climate is indeed changing, and that it is being influenced by human action. We also know how large the error in those statements is. And the conclusions, after all, are mathematics.