Digital Advances Transform Seismic Data Processing

Over the last several years, the industry’s downturn has spurred the development of more efficient ways to interpret both new and old seismic data, allowing operators to continue to explore and discover with reduced staff.

The availability of cloud technology and supercomputing resources, together with the application of machine learning techniques such as neural networks, is transforming the ability to interpret seismic data and create evergreen Earth models – proof of the adage that necessity, indeed, is the mother of invention.

Seeking a better return on their investments, operators have looked to service companies to streamline the daunting task of interpreting large volumes of seismic data. In response, Landmark, a division of Halliburton, developed a way for geoscientists, drilling and reservoir engineers to share geological, geophysical and engineering information in one accessible space, called DecisionSpace.

“Clients want to see the subsurface and drilling information together to make better decisions about planning and drilling,” said Milos Milosevic, senior director of technology for Landmark. “When multiple teams – drilling, subsurface geology, geophysics and economics – all integrate, how does that change the modeling and economics of a field?” A streamlined system of sharing information reduces the amount of time and money spent on interpreting data and making decisions about how to move forward.

It’s not just software that’s advancing, but also accessibility to high-performance computing resources. In the past, geophysicists worked with an average of 3.7 gigabytes of RAM on their computers, each programmed to handle one project at a time, explained Deborah Sacrey, AAPG Member and owner of Auburn Energy. Recently, Sacrey built a workstation that incorporates 80 CPUs and 512 gigabytes of RAM, with the ability to process seismic data covering 100 square miles and 100 milliseconds of time in just one or two hours.

Applications from Cloud Technology

Furthermore, the ability to affordably store large quantities of seismic data also aids in processing larger volumes. The provision of public cloud services has enabled many operators to move away from internal data centers and gain access to computational power that can be leveraged on demand and in an instant, Milosevic said.

Cloud technology has removed the limitations on storing and accessing data, including well logs, drilling and completion information, and measurements gathered from sensors.

“Today’s technology allows geoscientists to see their assets from a ‘digital twin’ – a digital replica of the physical field and well, and with this information they can make inferences and predict what will happen in the future,” Milosevic said.

Cloud technology has also allowed geoscientists to apply machine learning, which is revolutionizing seismic interpretation.

“Seismic processing is a very labor-intensive area,” Milosevic said. “With the cloud and innovative algorithms, you can automatically sift through large datasets of any size, which was not possible before. Processing that data and the return of results now takes hours instead of weeks. It frees geophysicists to really explore the data and be creative about what they need to be doing, rather than focusing on the ‘drudgery’ of cleaning data and manually extracting features in a dataset.”

Neural Networks

Advances in Earth modeling have also aided subsurface interpretation. By leveraging the power of cloud computing and optimizing computational algorithms, geoscientists can now build ultra-high-resolution "giga models" (billions of points) in less than a minute rather than in weeks or months. Scalable Earth modeling allows geoscientists to rapidly incorporate new data into a model, rather than rebuilding it with each new piece of information.

“You can add new data on the cloud and do it quite rapidly. It gives geologists a way to incorporate data on different scales and not lose fidelity,” Milosevic said. Hundreds of different scenarios can now be processed quickly to help companies quantify risks associated with sparse data and make better informed decisions in real time.  
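The idea of incorporating new data without a rebuild can be illustrated with a minimal sketch. This is not Landmark's actual implementation – the class, cell scheme and values below are all hypothetical – but it shows the principle: a model stored as a sparse grid lets a new measurement update only the cell it touches, leaving the rest of the model intact.

```python
# Simplified, hypothetical sketch of incremental model updating: new
# measurements fold into their grid cell instead of forcing a rebuild.

class IncrementalEarthModel:
    def __init__(self, cell_size):
        self.cell_size = cell_size   # grid resolution in model units
        self.cells = {}              # (i, j) -> (running sum, sample count)

    def _cell(self, x, y):
        return (int(x // self.cell_size), int(y // self.cell_size))

    def add_point(self, x, y, value):
        """Fold one new measurement into its cell's running average."""
        key = self._cell(x, y)
        total, count = self.cells.get(key, (0.0, 0))
        self.cells[key] = (total + value, count + 1)

    def value_at(self, x, y):
        """Return the averaged property for the cell containing (x, y)."""
        total, count = self.cells.get(self._cell(x, y), (0.0, 0))
        return total / count if count else None

model = IncrementalEarthModel(cell_size=50.0)
model.add_point(10.0, 20.0, 2.5)   # initial survey point
model.add_point(30.0, 40.0, 3.5)   # new data landing in the same cell
print(model.value_at(0.0, 0.0))    # -> 3.0 (cell re-averaged, not rebuilt)
```

Each update costs one cell lookup regardless of model size, which is what makes the approach scale to models with billions of points.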

In addition, the continued improvement of neural networks has led to discoveries from old seismic data that was once too voluminous to process, Sacrey said.

Working with unlabeled data – data that comes with no training sets or known end points – Sacrey uses unsupervised neural networks to find patterns in large volumes of seismic data without human input or direction.

By studying samples of seismic data rather than wavelets, “I am looking at 15 times the amount of information than if I were mapping a peak or a trough,” she said, referring to the envelope of data in a wavelet. “This is a huge amount of statistical information for a sample of different attributes,” she added.
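The core of this unsupervised workflow – grouping individual seismic samples by their attribute signatures, with no labels supplied – can be sketched with a simple clustering routine. This is a stand-in for the neural networks Sacrey describes, not her actual tool; the attribute names and values below are purely illustrative.

```python
import random

def kmeans(samples, k, iters=20, seed=0):
    """Minimal k-means: group multi-attribute seismic samples into k
    natural clusters with no training labels or human direction."""
    rng = random.Random(seed)
    centers = rng.sample(samples, k)
    for _ in range(iters):
        # assign each sample to its nearest cluster center
        groups = [[] for _ in range(k)]
        for s in samples:
            d = [sum((a - c) ** 2 for a, c in zip(s, ctr)) for ctr in centers]
            groups[d.index(min(d))].append(s)
        # move each center to the mean of its assigned samples
        for i, g in enumerate(groups):
            if g:
                centers[i] = tuple(sum(col) / len(g) for col in zip(*g))
    return centers

# Each sample: hypothetical (amplitude, frequency, coherence) values
samples = [(0.1, 30.0, 0.9), (0.2, 32.0, 0.95),   # background facies
           (2.0, 12.0, 0.3), (2.1, 11.0, 0.25)]   # anomalous facies
centers = kmeans(samples, k=2)
labels = [min(range(2), key=lambda i: sum((a - c) ** 2
          for a, c in zip(s, centers[i]))) for s in samples]
```

The algorithm separates the two facies purely from the statistics of the attribute vectors – the same principle, at toy scale, as letting a network find patterns "absent of human input or direction."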

Neural networks are capable of finding attributes such as depositional environment, lithology, channels, reefs, carbonate banks, pinch-outs and lap-outs, to name a few.

Convolutional neural networks allow a geophysicist to train neurons to recognize patterns or attributes from certain datasets and subsequently look for them in separate datasets or within the same dataset but in different seismic lines.

“If you have five lines on which you have interpreted faults, the neurons can go in and learn those faults, their thrusts, their lengths, how they match together, and they take a digital picture and go through the rest of the dataset and find what they see as similar faults,” Sacrey explained. “It saves a lot of time, because the most boring task of interpreting 3-D seismic is picking out faults.”
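The "digital picture" idea in Sacrey's description – capture a fault's pattern on interpreted lines, then scan the rest of the dataset for lookalikes – can be loosely illustrated with template matching. A real convolutional network learns many filters from the interpreted examples rather than using one fixed template; the amplitude grids below are hypothetical toy data.

```python
import math

def match_score(patch, template):
    """Normalized correlation between a seismic patch and a fault
    template: 1.0 = identical pattern, near 0 = unrelated."""
    dot = sum(p * t for row_p, row_t in zip(patch, template)
              for p, t in zip(row_p, row_t))
    norm_p = math.sqrt(sum(p * p for row in patch for p in row))
    norm_t = math.sqrt(sum(t * t for row in template for t in row))
    return dot / (norm_p * norm_t) if norm_p and norm_t else 0.0

def scan_line(line, template):
    """Slide the template across one seismic line (2-D amplitude grid)
    and return the column offset with the strongest fault response."""
    h, w = len(template), len(template[0])
    best_col, best_score = 0, -1.0
    for col in range(len(line[0]) - w + 1):
        patch = [row[col:col + w] for row in line[:h]]
        score = match_score(patch, template)
        if score > best_score:
            best_col, best_score = col, score
    return best_col, best_score

# Toy 'fault' pattern picked on an interpreted line, then searched
# for in an uninterpreted line.
template = [[1, -1],
            [-1, 1]]
line = [[0, 0, 1, -1, 0],
        [0, 0, -1, 1, 0]]
col, score = scan_line(line, template)   # strongest match at column 2
```

The scan finds the offset where the learned pattern recurs – the automated stand-in for the "most boring task" of picking faults by hand.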

Calling neural networks the “latest and greatest” of seismic data interpretation, Sacrey said stratigraphic units are more easily identified, including salt features, which often complicate seismic interpretation.

“Neural technology is on the cusp of becoming common in the industry,” she said. “Some companies are now assembling whole teams of people to expand this technology, especially when there is no way to ground truth certain features.”

In a recent project in Brazoria County, Texas, Sacrey used neural networks to reprocess seismic data shot 15 years ago. The landowner was pleasantly surprised by a 2-million-barrel discovery on his land, prompting her to ask, "How many other reservoirs like that are out there?"

Over the years, Sacrey has seen technology go from analog to digital, 2-D seismic to 3-D seismic, and she now sees the application of neural networks in artificial intelligence as the next digital revolution in the industry.

Looking forward, she said, “I can’t even imagine what’s going to happen 10 or 15 years from now.”