Food for Thought: Developing Tools Could Optimize Seismic Data Processing

Fiber, multi-parameter full wave inversion, and quantum computing may help make seismic data easier to digest.

Once upon a time, the most data a budding geophysicist could access was a ball in a toad’s mouth, or millennia later, some squiggly lines from a pencil hanging above a rotating drum.

Today, geoscientists have access to more seismic data than we might know what to do with. So, what are the best ways to measure and process it?

As with most questions in geophysics, the answer is a giant “it depends,” but some tools and processing methods have the potential to help us find new oil and more effectively pull oil out of already discovered reservoirs. These tools produce high-resolution data around the interaction of our wellbore and fracturing jobs, and they might be able to provide different or better insights without requiring new, expensive seismic surveys. Three such tools are fiber, multiparameter full waveform inversion and quantum computing.

Fiber, Good for Data Digestion

OK, it’s not the broccoli of geophysics, but fiber does help produce a lot of digestible data – easily terabytes every day. Fiber measurements typically fall into two types: distributed temperature sensing (DTS) and distributed acoustic sensing (DAS). Both take advantage of the physics of how light scatters. Every measurement begins with a laser pulse from an interrogator placed at the start of the fiber. The light scatters back along the fiber – Raman scattering in the case of DTS and Rayleigh scattering in the case of DAS.
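The interrogator turns that backscatter into a distributed measurement by timing it: light travels at roughly c divided by the fiber's refractive index, so the two-way time of flight of each returning echo maps to a position along the fiber. A minimal sketch of that mapping, assuming a typical silica refractive index (the exact value depends on the fiber):

```python
C = 299_792_458.0   # speed of light in vacuum, m/s
N_GLASS = 1.468     # assumed refractive index of silica fiber

def scatter_position_m(two_way_time_s):
    """Distance along the fiber to a scattering point, from the
    round-trip travel time of the backscattered light."""
    return (C / N_GLASS) * two_way_time_s / 2.0

# An echo arriving 20 microseconds after the pulse left the
# interrogator scattered roughly 2 km down the fiber.
print(scatter_position_m(20e-6))
```

Because every point along the fiber scatters, one pulse samples the entire wellbore at once, which is what makes the data volumes so large.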

Readings are produced along the full length of the fiber at high time resolutions – high enough to detect microseismic events triggered as fractures move through the reservoir, or temperature changes as fluids infiltrate the wellbore at specific locations. Though distance limits vary, fiber has more than enough reach to cover a wellbore!

Better ways to store and analyze the data that fiber collects are on the horizon, but for right now, some data might just be thrown out or not utilized properly (like the extra broccoli on your kids’ dinner plate!).

Chewing Your Food

In the world of edge-to-cloud computing, cloud storage, machine learning and AI, processing opportunities abound! It’s not hard to imagine a setup in which fiber data is streamed through an algorithm that looks for anomalies and sends them to the cloud for storage, while the rest is moved to cold storage or left on-site to be removed later. This opens the door to discovering events beyond the timeframes when we know operations are occurring. Imagine having a tool downhole that constantly monitors your wellbore with data you can analyze automatically, so that alerts pop up when something is amiss. I’m not sure if that’s actually happening, but the tools to do it are all there.
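A triage loop like the one imagined above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's pipeline: each incoming frame of fiber data is compared against a rolling baseline, and frames with an amplitude spike are tagged for "hot" cloud storage while quiet frames go to cold storage.

```python
from collections import deque

def triage(frames, window=50, k=4.0):
    """Yield (frame, 'hot' | 'cold') for each frame of fiber samples.

    A frame is 'hot' when its peak amplitude exceeds k times the
    rolling mean of recent peak amplitudes -- a crude stand-in for a
    real anomaly detector."""
    history = deque(maxlen=window)  # recent peak levels
    for frame in frames:
        level = max(abs(x) for x in frame)
        baseline = sum(history) / len(history) if history else level
        yield frame, ("hot" if level > k * baseline else "cold")
        history.append(level)

# Ten quiet frames, one spike, then quiet again: only the spike
# should be routed to hot storage.
frames = [[0.1, -0.1]] * 10 + [[5.0, 0.1]] + [[0.1, 0.1]] * 5
tags = [tag for _, tag in triage(frames)]
```

A real deployment would swap the threshold test for a trained detector and run this at the edge, but the routing idea is the same.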

Another data processing option can provide an even clearer picture: multiparameter full waveform inversion (MPFWI). Utilizing hardware and software advancements, processing company DUG Technology (formerly DownUnder GeoSolutions) has found a way to apply MPFWI to high-frequency datasets, working beyond a typical limitation. “(MPFWI) needs lower frequencies than older vibrators can provide. This means you basically can’t apply it to any land data that wasn’t acquired with FWI in mind,” said independent geophysicist Damon Parker. But DUG is doing it: the company recently completed stage 1 of a joint venture project led by TGS in which it utilized data across 39 surveys acquired from 1989 to 2014 (see sidebar).
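Under the hood, full waveform inversion is an optimization problem: iteratively update an earth model until simulated waveforms match the recorded ones. The toy below captures only that core loop, with assumed numbers – real MPFWI inverts for velocity, density and anisotropy over millions of grid cells, not one layer velocity from one travel time.

```python
DEPTH_M = 2000.0    # assumed reflector depth, m
TRUE_VEL = 2500.0   # the "unknown" layer velocity we hope to recover, m/s

def travel_time(v):
    """Two-way vertical travel time through the layer, seconds."""
    return 2.0 * DEPTH_M / v

observed = travel_time(TRUE_VEL)  # stands in for the recorded data
v = 2000.0                        # starting model guess

for _ in range(200):
    residual = travel_time(v) - observed
    # Gradient of the misfit 0.5 * residual**2 with respect to v
    grad = residual * (-2.0 * DEPTH_M / v**2)
    v -= 1.0e5 * grad             # step size tuned for this toy problem
```

After a couple hundred iterations the model velocity converges to the true value. The low-frequency requirement Parker mentions comes from this same loop: without low frequencies, the misfit surface has many local minima and the gradient descent can stall at the wrong model.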

Finally, companies such as IBM are not-so-quietly working on beefing up quantum computing’s capabilities. Quantum computing is best applied to inversion and optimization problems. Bob Parney, IBM’s quantum global lead for sustainability, oil and gas, energy and utilities, said, “Work that has been done in oil and gas includes XoM routing of marine vessels and dynamics of molecular simulation, BP simulation of hydrogen dynamics and Woodside Energy’s optimization of LNG production.”
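Optimization problems like vessel routing are often posed for quantum hardware in QUBO form (quadratic unconstrained binary optimization): binary decision variables, with costs on each choice and penalties on conflicting pairs. A hypothetical three-variable example, solved classically by brute force just to show the formulation – the problem data here is invented, not from any of the projects Parney mentions:

```python
import itertools

# Hypothetical QUBO: should a vessel make supply runs 0, 1 and 2?
# Diagonal entries are (negative) payoffs for making a run;
# Q[1][2] penalizes scheduling runs 1 and 2 together.
Q = [[-5, 0, 0],
     [ 0, -4, 6],
     [ 0, 0, -3]]

def energy(bits):
    """QUBO objective: bits^T Q bits, lower is better."""
    return sum(Q[i][j] * bits[i] * bits[j]
               for i in range(3) for j in range(3))

# Brute force over all 2^3 assignments; a quantum annealer or
# gate-based optimizer would search this same landscape.
best = min(itertools.product([0, 1], repeat=3), key=energy)
```

At three variables a laptop wins easily; the pitch for quantum hardware is the exponential growth of that search space as variables are added.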

Ali Tura at the Colorado School of Mines said he has a student who is conducting studies using quantum computing, and scientists can rent time on these machines! But the tech is not perfect – yet.

“You can test and try problems on them, but there’s just not enough quantum computers or quantum computing resources to be able to run big jobs … that’ll change over time, as the technology grows,” said Tura.

Fortunately, geoscientists can use simulators to help optimize their time while working on real data and projects.

Time will show the full effects of how fiber, MPFWI and quantum computing can help geophysicists better measure and analyze data, but the interim will require folks willing to experiment with these new and exciting tools to help make them more mainstream.