4.4.4 THE SCIENTIFIC CHARACTER OF CLIMATE MODELS

The current theory of human-induced climate change is based not only on measured observations (changes in the surface temperatures of the earth and the oceans, and changes in the volume of Arctic sea ice), but also on predictions made on the basis of climate modelling. A lot of resources have been invested in climate model simulations, especially in climate research conducted within meteorology. In these simulations both natural and anthropogenic factors are modelled, and their impacts on the climate, together with those of various feedback phenomena, are estimated. The results are then used to support decisions concerning both the mitigation of and adaptation to climate change.

Since World War II, more and more scientific fields have adopted computer simulation as a method. Examples include space physics, materials science, design science, fluid mechanics, evolutionary biology, climate science, ecology and economics. Certain fields, such as chaos theory, are even entirely based on computer modelling. While the method of computer modelling has experienced an explosive period of growth, its philosophical and epistemological evaluation is only in its infancy. As a matter of fact, the contribution of philosophers of science to this field of research is far from obvious, because it is not self-evident that modelling entails philosophically interesting challenges. Winsberg (2009) argues, however, that the adoption of modelling methods in science has raised questions that are significant from a philosophy of science point of view and that require input from philosophers. Simulations require an epistemology of their own, and Winsberg predicts that this field will attract the attention of philosophers in the coming years.

Philosophically, the question of climate modelling can be considered central. The focus of philosophy has, in fact, increasingly shifted from classical truth-based projects to something closer to models and modelling. In these latter types of projects, an attempt is made to simulate a real situation and, on that basis, to make a prediction about the future. Naturally, the question of the models' predictive track record also arises. (Godfrey-Smith 2003: 186–189, 230–231; Winsberg 2001: 453) The scientific practice of modelling poses both methodological and epistemological challenges.

These questions are better addressed by means of a philosophical analysis in which the weight is placed on the practice and guidelines of modelling rather than on the representational capacity of the theory (Winsberg 2001: 442–443).

One example of research on climate modelling is the work of the London School of Economics researcher Roman Frigg (2011), who holds that even the smallest changes in climatic preconditions can induce a so-called butterfly effect,101 which destroys the predictability of the models. In Frigg's opinion this actually renders modelling an unnecessary effort and a waste of energy. When a model of a non-linear system contains even the smallest error, its ability to provide significant, politically relevant and probable predictions is a lost cause. In practice, this also casts severe doubt on the large-scale climate projects that are undertaken on the basis of the models.

In Frigg's opinion, the question for the philosopher evaluating the matter is not whether or not there is enough evidence of climate change. Rather, the problem is that the detailed local effects of climate change are unpredictable, at least when estimated with the existing models. Allocating governmental investments worth millions to the development of ever more detailed modelling is therefore a waste of money and a poor use of resources. According to Frigg, sensible climate policy can be made in other ways, too. Frigg refers to Sir Francis Bacon: "Truth will sooner come out from error than from confusion." Thus, clearing up confusion and admitting a mistake is potentially a factor promoting success.

Climate models are non-linear, very complex from a computational point of view, and expensive as well. Frigg studied a simple non-linear system, the logistic map, as a proxy for large-scale climate models. Even though a real climate model does not have the clear structure of an algebraic logistic map, the problems the map exhibits are typical of non-linear systems, and it can therefore serve as a stand-in for climate modelling. The result showed that, for the system as a whole, predictability disappears very quickly. The non-linear nature of the system renders deterministic, or even probabilistic, forecasting impossible.

101 The butterfly effect is a metaphor associated with chaos theory, describing how the flap of a butterfly's wings in one place can cause a storm on the other side of the world. It is also used to describe the characteristic of non-linear systems whereby small disruptions or changes in the initial conditions or intermediate phases can bring about great changes in the final outcome of an event. The phenomenon was discovered by the meteorologist Edward Lorenz in 1961 during his research on weather forecasting, and it became known under the name "sensitivity to initial conditions". In a later presentation, Lorenz asked whether the flap of a butterfly's wings in Brazil could set off a tornado in Texas.
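The sensitivity Frigg appeals to can be made concrete with a small illustrative sketch (not taken from Frigg's study): iterating the logistic map from two initial conditions that differ by roughly one part in a billion. The parameter value, the size of the perturbation and the number of steps below are arbitrary choices made only for the illustration.

```python
# Illustrative sketch: sensitivity to initial conditions in the logistic map
# x_{n+1} = r * x_n * (1 - x_n). Two trajectories that start almost identically
# diverge within a few dozen iterations, so long-range point prediction fails.

def logistic_trajectory(x0, r=4.0, steps=60):
    """Iterate the logistic map from x0 and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000000)   # reference trajectory
b = logistic_trajectory(0.400000001)   # perturbed by about one part in 10^9

for n in (0, 10, 20, 30, 40, 50):
    print(f"step {n:2d}: |difference| = {abs(a[n] - b[n]):.2e}")
# The difference grows from about 1e-9 to the order of the variable itself
# within a few dozen steps: the butterfly effect in its simplest algebraic form.
```

The two trajectories agree to many decimal places at first and then become completely unrelated, which is why, on Frigg's reading, even a vanishingly small error in a non-linear model ruins its long-range point predictions.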

Nowadays, non-linearity is thought to be a characteristic of all complex systems. The value of the function does not depend on the value of the argument in a simple, proportional way; rather, the behaviour involves a series of points of discontinuity. Cell biology provides a good analogy: in principle, it should be possible to explain all the functions of a cell through the interactions of its macromolecules, and one could assume that modelling the cell amounts to a kind of inventory of the macromolecules and their interactions. The problem is far bigger, however, because the interactions are non-linear and cannot be presented in a linear fashion. A model of the cell cannot, in fact, be properly built by starting from the interaction between two macromolecules, and when thousands of macromolecules are involved, the problems multiply and grow. The internal behaviour of the cell is highly chaotic and does not follow the logic of reductionism.

The same is evident in climate science, which deals with non-linear functional dependencies. Where complex systems are concerned, the predictive power of mechanistic models is extremely weak, and building upon them is not philosophically justified.

In practice, however, modelling remains one of the most important methods of climate science: the goal is to produce a range of possible, likely and certain future climate scenarios, for the simple reason that the political and societal demand for them has been so high. Prediction is pivotal for climate science, and modelling and simulation are decisive for prediction. Therefore, the validity of climate models is a key question in climate science. (Lenhard & Winsberg 2010: 254)

Lenhard and Winsberg (2010) have analysed the kinds of epistemological questions that relate to modelling. The simplest starting point would be to compare the modelling with existing, known climate data. Yet the "real world" counterpart, i.e. the information concerning historical average temperatures, is itself a reconstruction derived from a large number of different sources (since measured temperatures exist only for a short period, these include, for example, ice core drillings and lake-bottom sediment samples, whose microfossils can aid in obtaining climate-historical data). In other words, the reconstruction of climate data is heavily "model-laden". But even if this were taken for granted, the uncertainty would remain. This is why, for example, the IPCC tables always include several scenarios in which the emission volume varies; these, in turn, reflect the estimates of various political models with regard to emission trends. Therefore, the term "projection" has become a standard expression in climate science. It is a weaker expression than "prediction", precisely because the results of the modelling depend on future emission trends. (Lenhard & Winsberg 2010: 255)
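The conditional character of a projection can be illustrated with a small sketch. The toy model, the sensitivity value and the emission paths below are purely hypothetical illustrations, not taken from the IPCC or from Lenhard and Winsberg; the point is only that the same model yields different outputs depending on which emission trajectory is assumed.

```python
# Hypothetical sketch: one toy temperature model run under three assumed
# emission scenarios yields three conditional "projections", not one "prediction".
# All numbers (sensitivity, scenarios, time horizon) are illustrative assumptions.

SENSITIVITY = 0.0005  # assumed warming (deg C) per GtCO2 of cumulative emissions

scenarios = {
    "low emissions":    [35, 30, 25, 20, 15],   # GtCO2 per year in each decade
    "medium emissions": [40, 42, 44, 46, 48],
    "high emissions":   [40, 50, 60, 70, 80],
}

for name, path in scenarios.items():
    cumulative = sum(path) * 10           # five decades -> total GtCO2 emitted
    warming = SENSITIVITY * cumulative    # linear toy response, not a real model
    print(f"{name:16s}: projected warming over 50 years ~ {warming:.2f} deg C")

# Which of these figures becomes relevant depends on future emission policy,
# which is why a "projection" is a weaker, conditional notion than a "prediction".
```

Each output is correct only relative to its assumed emission path; none of them is an unconditional forecast of what will in fact happen.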

The justification of simulation models depends on a variety of features – far more than just how faithfully the model reflects existing, known data. It also depends on faithfulness to the theory, on the selected calculation methods and on numerous other factors.

Therefore, Lenhard and Winsberg (2010: 253) consider climate science to be in a kind of permanent state of convergence scepticism, precisely because of the endless possibilities in climate modelling. Numerous parallel possibilities exist, and unanimity cannot be reached. This has manifest consequences for climate policy: unanimity is not even worth waiting for, and a certain kind of pluralism has to be accepted from the start.

The authors refer to a principle known as the classical Duhem-Quine thesis, according to which it is impossible to test a scientific hypothesis in isolation, because empirical testing requires one or more background assumptions, so-called "auxiliary assumptions". The set of hypotheses as a whole is put to the test – not an individual hypothesis apart from the others. Winsberg speaks here of "confirmation holism", in which confirmation attaches to the whole rather than to its parts (Lenhard & Winsberg 2010: 253–254).

In Winsberg's view, this type of "theory holism" is typical of climate science. The hypothesis in itself is not able to produce testable predictions; instead, its testable consequences typically rest on several background assumptions, from which the predictions are derived. Thus, the hypothesis cannot be falsified by empirical means in isolation from these background assumptions; only the entire set of hypotheses can be tested as a whole. In the case of climate modelling, the problem is that it is then impossible to locate the source of an error within the model.

It is essential to understand that the confidence placed in constructed models depends on many factors, none of which are guaranteed by our theoretical knowledge. It depends on what we know about our computers and graphical techniques. It depends on the trust placed in various ad hoc models, a trust built up through laboratory results and observations. It also depends on the ability to calibrate the models against the background of observations. And finally, it depends on the trust placed in the modellers' experience, skill, and powers of observation, estimation and inference.

For this reason, it can also be claimed that value judgements are explicitly present in climate science – not just for the evident reason that the problem under investigation may have significant human implications that have to be weighed, but also because climate modellers are not able to exclude non-epistemic, i.e. non-cognitive, values from their research. (Biddle & Winsberg 2010: 3)

According to them, climate modelling involves many sources of uncertainty. The first concerns the basic structure of the climate models, which they call "structural model uncertainty". The second relates to the parameters of the models, whose values and behaviour we may assess in different ways; they call this "parameter uncertainty". The third has to do with the data against which the results are checked. Climate modellers often compare their results with past climate changes. This information can be obtained from meteorological measurements or from proxy materials, in which climate history has been tracked by indirect methods. Both sources are also vulnerable to factors of uncertainty. The authors call this "data uncertainty".
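Parameter uncertainty, in particular, can be sketched with a toy perturbed-parameter ensemble (a hypothetical illustration, not an example from Biddle and Winsberg): the model structure is held fixed, one uncertain parameter is varied over a range of values that are all considered defensible, and the spread of the outputs is the uncertainty that the modeller has to communicate.

```python
# Hypothetical perturbed-parameter ensemble illustrating "parameter uncertainty":
# the model structure stays fixed, one uncertain parameter (a feedback factor)
# is varied over an assumed plausible range, and the spread of results is reported.
# The toy model and all numbers are illustrative assumptions, not a real climate model.

def toy_equilibrium_warming(forcing, feedback):
    """Toy linear response: equilibrium warming (deg C) = forcing / feedback."""
    return forcing / feedback

FORCING = 3.7                                  # W/m^2, an assumed fixed forcing
feedback_values = [0.8, 1.0, 1.2, 1.4, 1.6]    # W/m^2 per deg C, assumed range

ensemble = [toy_equilibrium_warming(FORCING, f) for f in feedback_values]
print("ensemble members (deg C):", [round(t, 2) for t in ensemble])
print(f"spread: {min(ensemble):.2f} - {max(ensemble):.2f} deg C")

# The honest report is the whole spread rather than a single number; deciding how
# to weight and summarise it is one place where non-epistemic judgements can enter.
```

Structural model uncertainty and data uncertainty behave analogously: changing the model structure, or the observational and proxy data used for comparison, produces a further spread of results on top of the one shown here.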

Biddle and Winsberg also stress that they do not want to question climate change as a scientific theory or to promote climate scepticism. They do, however, criticise the fact that climate scientists have recently presented their climate models as if these were free from moral, social and other non-epistemic values. Researchers hand over their predictions, along with the inherent uncertainties, to decision-makers, legislators and democratic representatives, who are then supposed to decide on the best way to act. Separating value-neutral from value-laden practices, however, is not so simple. Even though this is the ideal in science, in their view it does not apply to climate modelling, which is an example of a scientific field in which researchers cannot address the uncertainties of their models without making non-epistemic estimations.

Biddle and Winsberg stress repeatedly that they are not climate sceptics,102 but they still argue that more attention should be paid to the question of how large a role non-epistemic values ultimately have in climate science. (Biddle & Winsberg 2010: 32–33)