
Is there a validation crisis in geology?

Many experimental sciences have recognised a worrying “replication crisis” in their methods. In 2005 John Ioannidis wrote “Why most published research findings are false”, in which he argued that most published research does not meet good standards of evidence. Others, such as Brian Nosek (of the Open Science Collaboration), have also highlighted that the publish-or-perish culture in academia tends to generate falsehoods: “There is no cost to getting things wrong, the cost is not getting them published”. In the last decade the word “p-hacking” appeared, sadly never shortlisted for “word of the year”, but highlighting the mess that experimental science is currently in, and the need for a change in practices.

Why should the non-interventional science of geology be any different? The same symptoms are there: the inability of work from different disciplines to match up, resulting in insular, specialist areas (such as papers written solely on seismic with no analytical data, or with only the minimum of material that agrees). There is the same malaise of a science that is not making any breakthroughs, and does not even seem to have a direction. Lots of papers are being published, of course, but they lack interested readers. How many geologists pick up a journal, fishing for new ideas in a similar basin or topic, hoping that reading it will spark some new concepts in their subject area? I suspect close to zero. So much work is specialist and deliberately introverted, not wanting to interact with other studies – perhaps in case weaknesses are found that would prevent that important paper getting published, or the internal report meeting requirements. Perhaps I am being unfair, as there must also be cases where related data is buried in proprietary reports (kudos to recent initiatives by Petronas and Migas in SE Asia that have opened many commercial files to explorers and academics).

What is validation?

Validation is an essential step when dealing with proxy data, which is what most geological data are: the stage of fossil evolution as a proxy for age, spore colour and vitrinite reflectance for maximum thermal alteration (and thereby maximum depth of burial), and so on.

Below is one case of validation by cross-checking different disciplines. There are many, many more examples of wrong data that have been overlooked for years in SE Asia, with major implications for geological models, and I will doubtless add other posts on this topic. Each has different implications. In the example below, seismic and sequence stratigraphy had been forced into an impossibly condensed unit. This mis-correlated unconformities around the high and confused the tectono-stratigraphic history. The consequences were relatively minor, but it simply should never have happened.

Where there are multiple examples of unvalidated work in an area, the entire geological model can be severely degraded. This may be why model-based interpretations of stratigraphy, based on passive-margin techniques, managed to survive and hold back exploration of the dynamic Asian region. If the evidence-based approach is polluted with unvalidated evidence, then a project of revision can be condemned to failure before it has begun.

Dating of two nearby wells in east Java. Strontium dating is shown with measurement error bars, plus biostratigraphy as circles. The end-Eocene line is emphasised as it marks a major extinction in foraminifera and is extremely easy to determine.

The figure above is a good example of precision, accuracy and trueness from real data. In the 1990s a major company spent many thousands of dollars on the then-new technique of strontium dating. These results are shown as black dots with measurement-precision error bars. In Well 1, as well as in three other wells nearby, core and sidewall core samples gave not only precise measurements but ones that were tightly clustered and formed a linear trend. It seemed like the analytical money was well spent, and the seismic was forced to fit, especially the condensed section that included the Eocene-Oligocene boundary. This was not easy. Papers were published.
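The distinction between precision and trueness here can be sketched numerically. In the toy example below (all values invented for illustration, not taken from the wells discussed), repeat measurements cluster tightly – they are precise – yet a systematic bias means their mean sits far from the true depositional age, so they are not true:

```python
import statistics

# Hypothetical illustration: precision is the spread of repeat measurements,
# trueness is how close their mean lies to the true value. Tightly clustered
# (precise) data can still carry a large systematic bias.
true_age_ma = 30.0                              # assumed true depositional age (Ma)
measurements = [33.9, 34.1, 34.0, 33.8, 34.2]   # precise but systematically shifted

mean_age = statistics.mean(measurements)
precision = statistics.stdev(measurements)      # small spread -> "precise"
trueness_error = mean_age - true_age_ma         # large offset -> not "true"

print(f"mean = {mean_age:.1f} Ma, precision (1 s.d.) = {precision:.2f} Ma")
print(f"offset from true age = {trueness_error:+.1f} Ma")
```

The error bars on the figure report only the first quantity; nothing in the measurements themselves reveals the second, which is why an independent cross-check (here, biostratigraphy) is needed.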

Since, as we know, there was a mass extinction of the larger foraminifera (the fossils used to date these sediments) at the Eocene-Oligocene boundary, there was a problem. However, biostratigraphy had been supplanted by the new technology, and this was overlooked and ignored. Eventually the sidewall core and conventional core thin sections were obtained by a palaeontologist. Instead of Eocene foraminifera within a few feet of basement there were mid-Oligocene ones, including the very distinctive genus Austrotrillina (with a unique wall structure), its range marked in green on the figure for Well 1. Many other Oligocene genera were observed, but no earliest Oligocene ones, and no Eocene forms.

So what is happening?

One well (Well 2 above) had different strontium dating, with ages of only 30 or 31 Ma above basement, matching the biostratigraphy. However, there were only five Sr-dated samples in the 200-foot section above basement, whereas the other four wells had many dozens. This well location was unique in that it was drilled not above volcanic basement but above metasediments. It therefore seems that all the other wells with Eocene ages had been contaminated by strontium derived from the underlying volcanic rock; volcanic strontium is poor in radiogenic 87Sr, so the contamination slightly lowered the measured 87Sr/86Sr ratio and shifted the ages deduced from it to slightly older values. The effect increases with proximity to the volcanic basement.
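The leverage of this contamination is worth spelling out. Strontium dating reads an age off the seawater 87Sr/86Sr curve, which rose through the Oligocene, so on a rising curve a lowered ratio reads as an older age. The sketch below uses an assumed local slope for the curve (an illustrative order-of-magnitude value, not fitted to the east Java data) to show how a tiny ratio shift becomes a multi-million-year age error:

```python
# Hedged sketch: approximate the seawater 87Sr/86Sr curve as locally linear
# with an ASSUMED slope (illustrative only). A small admixture of
# unradiogenic volcanic Sr lowers the measured ratio, which on a rising
# curve maps to an apparently older age.
SLOPE_PER_MYR = 4e-5  # assumed local slope of the seawater curve (ratio units/Myr)

def apparent_age_shift_myr(ratio_shift: float) -> float:
    """Age error (Myr) caused by a shift in measured 87Sr/86Sr.

    A negative ratio_shift (contamination by low-87Sr/86Sr volcanic Sr)
    gives a positive, i.e. older, apparent age on a rising curve.
    """
    return -ratio_shift / SLOPE_PER_MYR

# A contamination-induced ratio drop of only 0.00016 ...
print(apparent_age_shift_myr(-0.00016))  # -> 4.0 (Myr older)
```

Because the ratio shift scales with the amount of contaminant strontium, wells closer to the volcanic basement show the larger apparent-age offsets, consistent with the pattern described above.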

So this is a case where precise, apparently accurate, data is untrue. As noted above, this is by no means unique, but only when experienced people read original data reports does this kind of error come to light.

The consideration of precision, accuracy and trueness is a key part of validation. It is a modern application of the Socratic method of repeated questioning of the details and elimination of hypotheses. Nullius in verba (Latin for “take nobody’s word for it”). Do not believe modern high-tech methods just because they are high-tech.


References

Ioannidis, J. P. A., 2005. Why most published research findings are false. PLoS Med 2(8), e124

Open Science Collaboration, 2015. Estimating the reproducibility of psychological science. Science 349(6251), aac4716

Published in Principles and methods
