The production of oil and gas, or hydrocarbons as they are collectively known, is big business. Between them, the world’s five largest privately owned Exploration and Production (E&P) companies, known as the ‘Super Majors’ – ExxonMobil, Royal Dutch Shell, Chevron, BP and Total – have combined profits running into the hundreds of billions of dollars.
However, tougher regulations, higher taxes, political instability in the Middle East and the rising cost of finding and extracting hydrocarbons are all presenting difficulties for the industry. These pressures are in turn passed on to the consumer, reflected in the price of Brent crude oil, which has risen nearly 70 per cent in the past year.
To continue to thrive, the super majors have to look at more efficient and cost-effective methods of exploration while optimising existing gas and oil fields. Exploration is where these companies bear much of the cost and financial risk associated with the production of gas and oil. For example, drilling an exploratory well can cost around £100 million, yet only one in four produces a financially viable amount of hydrocarbons. With that level of investment at stake, increasing the efficiency of exploration is a top priority.
At sea, exploration involves seismic surveys, using the latest technology, of large areas of the seabed (typically between 2,500 km² and 10,000 km²). Ships systematically sweep areas of the ocean, using massive air guns to send sound waves through the layers of rock below the seabed. These are reflected back up to acoustic sensors (hydrophones), which are towed on cables several kilometres long and spaced at regular intervals. Each hydrophone stores a trace of several seconds of the received seismic energy. When underway, such arrays are the largest moving objects on Earth. The data collected is then processed by vast computing clusters, initially onboard the ship and later in the data centres of the acquisition organisations, to build an image of the subsurface geology.
The process creates vast amounts of data - often terabytes in current surveys. Currently, because of the difficulties involved in storing, accessing and processing such a quantity of data, information is sampled at a lower resolution, which means that detail is lost before it is passed on to the geoscientist for interpretation. A similar process occurs when someone wants to send a high-resolution picture via email and has to reduce the size of the file: the picture is still recognisable, but some of the fine detail is missing.
For geologists the problem is that, by reducing the resolution, subtleties that might be vitally important are lost. For instance, because data is acquired by sending a sound wave straight down, it is difficult to image vertical features such as faults in the rocks. Subtle differences in the make-up of the rock can indicate to a geologist where these are. This information is vital when trying to understand fluid pathways for the saline used to displace oil or gas within a reservoir. Faults can act as a conduit for the fluid, or they can compartmentalise the reservoir, closing off potentially significant reserves.
Geologists need to be able to locate significant faults accurately, but the technology in use today does not provide them with the data they need in a timely or interactive manner. Currently, applications talk to huge files, usually stored locally on disk. This involves regularly moving big chunks of data in and out of RAM so an application can work on them. Additionally, on most systems the lowest level of detail is the seismic trace; traces are gathered together in a 3D grid that can then be sliced and diced into smaller cubes. This is very inefficient.
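To see why, consider how a trace-ordered file behaves. The sketch below (illustrative Python; the file name and grid dimensions are invented for the example) memory-maps such a volume. Reading a single trace is cheap because its samples sit together on disk, but a horizontal time slice needs one sample from every trace, so the read pattern strides across the entire file:

    import numpy as np

    # A hypothetical trace-ordered volume: 1,000 x 1,000 trace locations,
    # each trace 1,000 samples deep (4 s sampled every 4 ms).
    N_INLINE, N_XLINE, N_SAMPLES = 1000, 1000, 1000

    # Memory-map the file (assumed to exist already) so we only pay
    # for the bytes we actually touch.
    volume = np.memmap("survey.f32", dtype=np.float32, mode="r",
                       shape=(N_INLINE, N_XLINE, N_SAMPLES))

    # One whole trace is a cheap, contiguous read.
    trace = volume[500, 500, :]

    # A horizontal (constant-time) slice pulls one sample from every
    # trace, touching the whole file for a fraction of its contents.
    time_slice = volume[:, :, 250]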
There is evidently a case for enabling geologists to examine the seabed subsurface quickly and in more detail. The solution is to make data easier to store, access and analyse at a more granular level.
For instance, if a four-second trace is digitised every four milliseconds and traces are spaced 25 metres apart, then you know that you have a volume cell - or voxel - of 25 m x 25 m x 4 ms. The voxel is your data point, not the trace – the trace is effectively the file. By exposing just the data points required by the interpretation and modelling tools, much of the computational overhead of data access is removed, meaning that more detailed data can be accessed quickly.
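The arithmetic behind those figures is simple enough to sketch. Taking the numbers above, plus the 2,500 km² survey area mentioned earlier, a few lines of Python show the scale at which voxels accumulate (the figures are illustrative, not from a specific survey):

    # Back-of-the-envelope voxel arithmetic for the example above.
    TRACE_LENGTH_S = 4.0       # four-second trace
    SAMPLE_INTERVAL_S = 0.004  # digitised every 4 ms
    TRACE_SPACING_M = 25.0     # traces 25 m apart

    samples_per_trace = int(TRACE_LENGTH_S / SAMPLE_INTERVAL_S)  # 1,000 voxels deep

    # For a hypothetical 2,500 km2 survey:
    survey_area_m2 = 2_500 * 1_000_000
    traces = survey_area_m2 / (TRACE_SPACING_M ** 2)   # 4 million trace locations
    voxels = traces * samples_per_trace                # 4 billion voxels

    print(f"{samples_per_trace} samples per trace, {voxels:.0e} voxels in total")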
Accessing data this way makes the exploration workflow more efficient. Firstly, all spatially located data types, including well data, outcrop data and infrastructure locations, are decomposed in the same way. These are then presented to applications and decision-making tools, through simple interfaces, as the binary objects the software expects. This removes the need to export data in a particular format to pass it to another application in the workflow – a step with the potential to lose valuable derived information and insight along the way.
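As a minimal sketch of that idea - with invented names throughout, standing in for whatever platform actually holds the data - a thin interface might hand each application only the voxels it asks for, packed as the binary object that tool expects:

    import struct

    # Toy voxel store keyed by (x, y, t) indices; illustrative only.
    class VoxelStore:
        def __init__(self):
            self.voxels = {}  # (ix, iy, it) -> amplitude

        def put(self, ix, iy, it, amplitude):
            self.voxels[(ix, iy, it)] = amplitude

        def subvolume(self, xs, ys, ts):
            """Yield amplitudes for the requested index ranges, zero-filling gaps."""
            for ix in xs:
                for iy in ys:
                    for it in ts:
                        yield self.voxels.get((ix, iy, it), 0.0)

    def as_binary(store, xs, ys, ts) -> bytes:
        """Pack the requested voxels as little-endian float32 - the kind
        of binary object an interpretation tool might expect."""
        values = list(store.subvolume(xs, ys, ts))
        return struct.pack(f"<{len(values)}f", *values)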
Furthermore, enabling the E&P companies to access all of their data across all areas of activity means that there is an enterprise-scale knowledge base available; this reduces the time it takes to transform geological data into actionable information and reduces risk. Huge savings can be made if the current 25 per cent success rate can be increased by one percentage point through using data in a more enlightened way.
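A rough illustration of that claim, using the £100 million-per-well figure quoted earlier, shows why even one percentage point matters:

    # Expected drilling spend per successful discovery at two success rates.
    WELL_COST_GBP = 100e6  # article's figure for one exploratory well

    for success_rate in (0.25, 0.26):
        wells_per_discovery = 1 / success_rate
        cost = wells_per_discovery * WELL_COST_GBP
        print(f"{success_rate:.0%}: ~£{cost / 1e6:.0f}m of drilling per discovery")

    # 25% -> ~£400m per discovery; 26% -> ~£385m, i.e. roughly £15m
    # saved per successful well - substantial at portfolio scale.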
Ensuring all subsurface and operational data can be stored on a single relational platform and accessed quickly and efficiently means that the E&P companies are able to optimise their operations, increase business insight and improve their safety and regulatory compliance over the life of a field.
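To make this concrete, the sketch below uses SQLite purely as a stand-in for an enterprise relational platform (the schema and figures are invented): once voxels and operational data such as well locations share one queryable store, a single join answers a question that would otherwise mean exporting files between applications:

    import sqlite3

    # Subsurface and operational data side by side in one relational store.
    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE voxel (ix INT, iy INT, it INT, amplitude REAL);
        CREATE TABLE well  (name TEXT, ix INT, iy INT);
    """)
    con.execute("INSERT INTO voxel VALUES (500, 500, 250, 0.37)")
    con.execute("INSERT INTO well  VALUES ('W-1', 500, 500)")

    # One query joins subsurface and operational data in place - no export step.
    rows = con.execute("""
        SELECT w.name, v.it, v.amplitude
        FROM well w JOIN voxel v ON v.ix = w.ix AND v.iy = w.iy
    """).fetchall()
    print(rows)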
With the industry becoming an increasingly challenging environment, oil and gas companies need such new technology if they are to meet the scalability and complexity problems presented by the volumes of data they must work with on an operational basis.
Dr Duncan Irving is Teradata’s EMEA industry consultant for the oil and gas industry. He has spent his entire career in the fields of geophysics, subsurface imaging and seismic interpretation. He has a degree in physics and geophysics from the University of Edinburgh, and a PhD in glacial geophysics and numerical modelling from Cardiff University/ETH Zurich.
For further information please visit: www.teradata.com