Quantification and Representation of Uncertainty in Geological Interpretation
Many aspects of the geological sciences are inherently inexact. Geological and geophysical observations and measurements are always prone to error, yet field geologists generally lack a suitable framework for quantifying (or even estimating) sources of error, or for expressing the inherent uncertainty associated with geological interpretation. In this talk we consider some of the main sources of uncertainty typically associated with field and exploration geology and, with reference to other presentations in the conference, summarise current and future strategies that can be used to analyse and convey geological uncertainty.
Projects involving spatial data from fieldwork and/or sub-surface studies (e.g. map production by national survey organisations, field-based PhD projects, hydrocarbon/mineral exploration and production) remain a cornerstone of the geosciences. In common with other observational sciences, a geoscience project of this type typically involves the acquisition of new observations and measurements (i.e. data), which are then processed, synthesised and analysed to derive a coherent interpretation (the geological model) within the context of existing geological knowledge and understanding. Uncertainty exists at each step of this process: it is compounded from the individual errors associated with each data acquisition through to the final model, which inevitably represents a complex hierarchical web of facts, interpretations, multiple-stage inferences, prejudices, gut feelings and guesswork.
In the geosciences, the complexity of the information structure existing in most geological models is often compounded by a number of factors:
o Measurement error: spatial precision is rarely quoted; measurement error of field equipment is rarely quantified. This is easy to rectify from a technical perspective, but needs a radical change in current best practice, so that raw data (plus errors) remain accessible alongside the final model;
o Dimensionality: a geological model should be 3D or, ideally, 4D (i.e. it should also capture temporal variation). This is sometimes extremely challenging because of the sparseness of data (particularly in the 3rd and 4th dimensions). Much work is still needed to analyse quantitatively the natural variation in 3D geometry of geological surfaces, and to develop improved methods to interpolate and extrapolate between sparse data points;
o Scaling and scale dependence: geological architectures can span many orders of magnitude, but methods of data acquisition typically capture only a limited range of scales. More work is needed to test scale invariance of different spatial aspects of geological surfaces. Improvements are also needed in visualisation software and hardware so that more types of data can be combined into a single multi-scale model that can be zoomed seamlessly across many orders of magnitude.
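As a minimal sketch of the measurement-error point above, consider propagating the quoted error of a compass-clinometer dip reading into a derived quantity such as true bed thickness (t = w sin(dip)). All numerical values here are illustrative assumptions, not data from the talk; the point is simply that attaching an error to the raw measurement lets a spread, rather than a single number, accompany the model.

```python
import math
import random

def true_thickness_samples(map_width_m, dip_deg, dip_error_deg, n=10000, seed=0):
    """Monte Carlo propagation of dip measurement error into true bed
    thickness, t = w * sin(dip). The Gaussian error model and all input
    values are illustrative assumptions only."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        dip = rng.gauss(dip_deg, dip_error_deg)  # perturbed dip reading
        samples.append(map_width_m * math.sin(math.radians(dip)))
    return samples

# Hypothetical outcrop: 50 m map width, 30 degree dip read to +/- 2 degrees.
samples = true_thickness_samples(map_width_m=50.0, dip_deg=30.0, dip_error_deg=2.0)
mean = sum(samples) / len(samples)
sd = math.sqrt(sum((s - mean) ** 2 for s in samples) / (len(samples) - 1))
print(f"true thickness = {mean:.1f} +/- {sd:.1f} m")
```

The same pattern applies to any derived quantity: store the raw reading and its error with the model, and the spread can always be regenerated.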
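On the dimensionality point, a simple sketch of interpolating between sparse data points is inverse-distance weighting, shown below for a surface elevation. This is an assumption-laden toy, not a recommended method: real geological surfaces usually warrant geostatistical approaches (e.g. kriging) that also yield an estimation variance.

```python
def idw_interpolate(points, query, power=2.0):
    """Inverse-distance-weighted estimate of a value (e.g. horizon
    elevation) at `query` = (x, y), from sparse (x, y, z) control
    points. Illustrative sketch only; gives no uncertainty estimate."""
    num = den = 0.0
    for x, y, z in points:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0.0:
            return z  # query coincides with a control point
        w = 1.0 / d2 ** (power / 2.0)  # weight decays with distance
        num += w * z
        den += w
    return num / den

# Three hypothetical borehole picks of a horizon (x, y, depth in m):
picks = [(0.0, 0.0, 100.0), (10.0, 0.0, 120.0), (0.0, 10.0, 110.0)]
print(f"estimated depth at (5, 5): {idw_interpolate(picks, (5.0, 5.0)):.1f} m")
```

Note that the estimate is always bounded by the control values, which is one reason such interpolators can misrepresent geological geometry away from data.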
There are many strategies available to improve the quantification of uncertainty in geological models. These range from straightforward attempts to quantify measurement errors and maintain a direct link between the raw data and the final model, to more elaborate methods that consider a range of competing interpretations. The latter can involve sophisticated knowledge-engineering techniques that aim to trace (and make explicit) the dependency between facts and inferences within the information structure of the model, or a more brute-force approach that produces multiple realisations of a model by varying input parameters and examining the stability of the resulting solutions. Key to all of these methods are improved ways to represent geological data and knowledge in computer-compatible forms.
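The brute-force approach above can be sketched in a few lines: perturb one interpretation parameter, rerun the model, and inspect the spread of the ensemble. The scenario (a planar horizon whose dip is known only to within a Gaussian error, queried for depth at a map distance) and all values are hypothetical assumptions for illustration.

```python
import math
import random
import statistics

def depth_realisations(dip_deg, dip_sd_deg, distance_m, n=500, seed=1):
    """Brute-force ensemble: perturb a single interpretation parameter
    (the dip of a planar horizon, assumed Gaussian) and collect the
    resulting model predictions (depth to the horizon at a given map
    distance, d * tan(dip)). The ensemble spread is a crude measure of
    model stability. All inputs are illustrative assumptions."""
    rng = random.Random(seed)
    return [distance_m * math.tan(math.radians(rng.gauss(dip_deg, dip_sd_deg)))
            for _ in range(n)]

depths = depth_realisations(dip_deg=10.0, dip_sd_deg=1.5, distance_m=1000.0)
print(f"depth = {statistics.mean(depths):.0f} m "
      f"+/- {statistics.stdev(depths):.0f} m over {len(depths)} realisations")
```

Real geological models have many coupled parameters, so practical realisation studies sample them jointly; the principle, however, is the same as in this single-parameter sketch.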