It’s the uncertainty, stupid!

Within science it is not unusual that great findings are celebrated which, at their core, consist of reduced uncertainties. “The next big thing” is sometimes rushed into publication, often in the form of a nice number, and sometimes the significance of the result is the main selling point. Unfortunately, a number on its own is worth little in most cases, as long as it is not backed up by information on how sure the author is about it. This information is usually called uncertainty and comes in many different forms: sometimes significance levels, sometimes a simple sigma level. Often these uncertainty-quantifying numbers, and especially the methods by which they were obtained, are the really important part of a publication.
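
To make this concrete, here is a minimal sketch in Python with made-up measurements. The point is not the numbers themselves, but that the reported value only becomes interpretable together with a simple uncertainty estimate, here the standard error of the mean.

```python
import numpy as np

# Hypothetical measurements, purely for illustration.
measurements = np.array([4.9, 5.1, 5.0, 5.3, 4.8, 5.2])

mean = measurements.mean()
# Standard error of the mean as one simple way to quantify the uncertainty.
sem = measurements.std(ddof=1) / np.sqrt(len(measurements))

# "5.05" alone says little; "5.05 +/- 0.08" also says how sure we are about it.
print(f"{mean:.2f} +/- {sem:.2f}")
```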

Sure, someone like me, whose main job is uncertainty quantification, sees this especially critically. But there are also good arguments that the statistical part of a publication is as important as the result itself. Any quantification of uncertainty rests on assumptions, and these assumptions decide in the end whether the big result is really big or just nice to have. That this has become a critical issue in science has been discussed for years in many fields, but this year in particular the discussions heated up and led to consequences.

The starting point of the discussions was a piece in The Lancet, which criticised the research and publication system in general. A main focus was the quality of research publications and methodologies, especially in the medical sciences. A month later Nature published an article that took aim at statistics in particular, above all the p-value. It gained a lot of publicity and the discussions kept going. One consequence could be seen earlier this month, when the journal Science announced that additional checks on statistics will be performed in future peer-review decisions.

Nevertheless, these developments may be just the start of many changes in the scientific system that can be expected in the coming years. Statistics is becoming more and more important, and by that I do not mean high-end statistical analyses to squeeze the last bit out of a given dataset. It is basic statistics that is gaining influence. Especially in connection with data reuse, it is essential that statistical information, which mainly means well-documented uncertainties, is delivered with a published dataset.
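
As a rough illustration of what delivering uncertainties with a dataset could look like in practice, the following Python sketch writes a small, entirely made-up file in which every value carries its own uncertainty and a note on how that uncertainty was estimated.

```python
import csv

# Hypothetical values; the layout, not the numbers, is the point:
# each data value travels together with its uncertainty and its method.
rows = [
    {"station": "A1", "temperature_C": 12.3, "uncertainty_C": 0.2,
     "method": "1-sigma, from sensor calibration"},
    {"station": "B2", "temperature_C": 11.8, "uncertainty_C": 0.5,
     "method": "1-sigma, from sensor calibration"},
]

with open("dataset_with_uncertainties.csv", "w", newline="") as f:
    writer = csv.DictWriter(
        f, fieldnames=["station", "temperature_C", "uncertainty_C", "method"]
    )
    writer.writeheader()
    writer.writerows(rows)
```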

One problem for the future is how to educate upcoming scientists in the basics of statistics. Currently, in many fields and courses, the norm is not much more than a quick introduction to some tools, while the basics behind them are often neglected. Since every statistical tool relies on assumptions, and every user has to know them before the methods can be applied correctly, this does not make things easier. Of course it is possible to breach the assumptions, which is often the practical way to work in new fields, but then the user has to be at least aware that this is happening and know the possible consequences.
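
To show what knowing the assumptions can mean in daily work, here is a small sketch, assuming Python with NumPy and SciPy and using synthetic data: before a classical two-sample t-test, which roughly assumes normally distributed data, the assumption is checked explicitly instead of being taken for granted.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Two synthetic samples; in practice these would be real measurements.
sample_a = rng.normal(loc=5.0, scale=1.0, size=30)
sample_b = rng.normal(loc=5.4, scale=1.0, size=30)

# Make the normality assumption of the t-test explicit with a quick check.
for name, sample in [("A", sample_a), ("B", sample_b)]:
    _, p_normal = stats.shapiro(sample)
    print(f"Shapiro-Wilk for sample {name}: p = {p_normal:.3f}")

# Only interpret the t-test if the checks above raised no red flags.
t_stat, p_value = stats.ttest_ind(sample_a, sample_b)
print(f"t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```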

It is a huge ask for the current educational system at universities to provide a better basic statistical education, but I think it will be necessary to handle the problems of the future. New tools will have to be developed for this. One approach could be to provide documentation, in a simple form, for every commonly used statistical tool. The focus has to be on the assumptions behind each of them, the possible applications and the limitations of the methods. This could live in existing environments like Wikipedia, or in new forms of communication and repositories. Nevertheless, without teaching the basics, even these will not make things simpler for the next generation.
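
What such a simple piece of documentation could look like is sketched below; the function and its wording are of course hypothetical, but the structure, with assumptions, applications and limitations sitting right next to the tool, is the idea.

```python
import numpy as np

def linear_trend(x, y):
    """Fit a straight line y = a*x + b by ordinary least squares.

    Assumptions:
        * The relationship between x and y is approximately linear.
        * Residuals are independent with roughly constant variance.

    Typical applications:
        * Simple first-guess trend estimates, e.g. in time series.

    Limitations:
        * Sensitive to outliers; this sketch returns no uncertainty
          on the fitted trend.
    """
    a, b = np.polyfit(x, y, 1)
    return a, b
```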

We in the geosciences are especially vulnerable to inaccurate determination of uncertainties, since we work in an interdisciplinary environment with many different educational backgrounds at hand. All have their advantages and disadvantages, but when it comes to communicating with each other about topics like uncertainties, many things are still in development.

What the KISS principle (Keep it simple, stupid!) is to design and programming, the ITUS principle (It’s the uncertainty, stupid!) should become to science in general. We will see which fields adapt to it best in the coming decades, but those that do will definitely have an advantage over those that hesitate.
