While artificial intelligence and data-driven innovation have made significant strides in many industries over the past few decades, particularly in workflow automation and in revealing hidden insights from data, there is a widespread sense within the oil and gas industry that adoption there is not progressing as rapidly.
Sridharan Vallabhaneni, data science manager for Halliburton’s Big Data Center of Excellence, said it may just be the general ebb and flow of how the industry works. Once AI is unleashed, however, he is confident it will have a direct and positive impact on revenues, cash flows and capital expenditures.
“It always takes some time for any industry to adopt new technologies,” he said.
This is because even if the technology is available, it requires extensive testing before it can be applied broadly. Most E&P organizations are willing to leverage AI but may not know where or how to begin.
“First and foremost,” Vallabhaneni said, “a data-driven approach needs to evaluate all the data by breaking down silos between the departments within an organization. This approach enables holistic big data analytics and provides hidden insights about the business processes and efficiencies. Right now, this is a well-recognized challenge in our industry.”
One of the reasons the technology is needed in oil and gas is that, for years, the industry has been using statistics and geostatistics, most extensively in reservoir characterization to create 3-D property models.
“The problem is these models may not capture geological variations in detail as they rely heavily on spatial interpolation techniques,” said Vallabhaneni.
“A hybrid machine-learning-based approach, on the other hand, can establish a relationship between petrophysical properties and seismic attributes and use this relationship to support the property modeling between the wells,” he added.
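The idea can be illustrated with a deliberately minimal sketch: fit a relationship between a seismic attribute sampled at well locations and a petrophysical property measured there, then use that relationship to predict the property between wells. All names and numbers below are synthetic illustrations, not Halliburton's method or field data, and a real workflow would use far richer attributes and machine-learning models than a single linear fit.

```python
# Hypothetical sketch: relate a seismic attribute (here, acoustic impedance)
# sampled at wells to measured porosity, then predict porosity away from the
# wells. Synthetic values for illustration only.

def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Synthetic well control: impedance vs. porosity (fraction) at five wells.
impedance = [5.0, 5.5, 6.0, 6.5, 7.0]
porosity = [0.30, 0.26, 0.22, 0.18, 0.14]

a, b = fit_linear(impedance, porosity)

def predict_porosity(z):
    """Predict porosity at an inter-well location from seismic impedance."""
    return a * z + b

print(round(predict_porosity(6.25), 3))  # → 0.2
```

The point of the sketch is the workflow shape, not the model: the seismic volume supplies dense spatial coverage between wells, so a property learned at the wells can be carried into the inter-well space by the attribute rather than by spatial interpolation alone.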
Similarly, petrophysicists have been using shallow neural networks for log and core predictions for several decades. These models offer a limited degree of customization and do not handle complex scenarios as well as fully customizable deep learning or convolutional neural network models can. With advancements in computational capabilities, though, scientists are in a better position today to make use of AI by integrating data from multiple sources at different scales and obtaining actionable insights in near real time, if not in real time.
The first step before any change happens, though, is to set up internal teams for data science and digital transformation. Vallabhaneni has no quarrel with the progress the industry in general, or specific sectors in particular, is making, or with the time it is taking to do so. As with any technology, a more complete transition can take place as the models mature with a wider range of training data. The timeline will depend on where an organization is in its transformation journey and the actual business problem that needs to be solved.
He is not pushing for a wholesale change in how the industry operates.
“We would rather call it a transformation rather than a revolution. We believe the industry should progress in a more agile manner rather than completely replacing existing technologies,” he said.
Vallabhaneni emphasized that organizations should prioritize addressing their business challenges with AI and data-driven innovation only when conventional techniques are not providing satisfactory results.
“If successful, these models can be deployed in real time and organizations can start realizing value sooner,” he said.
He cautions, though, that the new methods, by themselves, will not always be better.
“There are cases,” Vallabhaneni said, “where existing techniques may continue to perform better than AI and data-driven approaches, especially in cases with high data sparsity.”
As an example, he said, the new methods cannot replace the expertise of a skilled interpreter in all scenarios.
“The better approach is to assess the business problem on a case-by-case basis, look for data from analog basins in case of data sparsity, and evaluate what can be achieved through machine learning to help efficiently deploy the resources of the skilled interpreter,” he said.
Ultimately, overlapping the two approaches is a sound strategy.
“We also try to support data-driven models with physics-based models so that they complement each other and provide a better solution. In order to assist this effort, we include a domain scientist, along with data scientists as a part of the scrum team, while working on projects,” Vallabhaneni explained.
Better, Stronger, Faster
He said that when the technologies are up and running, the interpretations may, in fact, be 20 percent more accurate and 10 times faster than is presently the case with conventional technologies.
“Typically, it takes several months to years to evaluate a prospect using conventional techniques and manual interpretations. A geophysicist took about two weeks to manually interpret faults from a 3-D seismic volume. With a well-trained model, we could interpret faults from the same seismic volume in less than a day, including quality control,” he said.
It can be even faster if cloud computing can be leveraged both for seismic attribute generation and to run the model itself.
“On the whole, this will help organizations quickly screen for prospects and make faster business decisions to increase their profitability,” Vallabhaneni said.
Interpretation tasks can often be repetitive and involve pattern recognition or quantitative classification techniques.
“It is a well-known fact from other industries that computer vision and machine learning can do similar tasks much quicker than humans,” he said.
AI also can mitigate both human error and bias.
Vallabhaneni also spoke of the transformative effects of AI-based solutions and how they can – though he hates the word – revolutionize the current method of building subsurface models. Overall, AI can overcome some of the limitations of conventional techniques used during the previous few decades without compromising the quality of models. To put this another way, he said, “If a machine can detect a human face, it can help detect a fault from the seismic section.”
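The face-detection analogy rests on a shared building block: convolution over a grid of values. As a deliberately simplified sketch (synthetic data, pure Python, not a trained model), the same horizontal-gradient kernel that underlies edge detectors in computer vision will light up at a lateral discontinuity in a seismic amplitude grid, which is a crude proxy for how a trained convolutional network picks out faults.

```python
# Illustrative sketch only: convolution, the core operation behind face
# detection in computer vision, highlighting a lateral discontinuity (a crude
# fault proxy) in a synthetic seismic amplitude grid. A real workflow would
# use a trained convolutional neural network on 3-D volumes.

def convolve_rows(grid, kernel):
    """Apply a 1-D horizontal kernel to each row of a 2-D grid (valid mode)."""
    k = len(kernel)
    return [
        [sum(row[j + i] * kernel[i] for i in range(k))
         for j in range(len(row) - k + 1)]
        for row in grid
    ]

# Synthetic section: constant amplitude with an abrupt offset at column 3,
# mimicking a reflector displaced across a fault.
section = [
    [1.0, 1.0, 1.0, 0.2, 0.2, 0.2],
    [1.0, 1.0, 1.0, 0.2, 0.2, 0.2],
]

# Simple horizontal-gradient kernel, the building block of edge detectors.
edges = convolve_rows(section, [-1.0, 1.0])

# The largest response marks the discontinuity between columns 2 and 3.
fault_col = max(range(len(edges[0])), key=lambda j: abs(edges[0][j]))
print(fault_col)  # → 2
```

A face detector stacks many such learned kernels into deep layers; the sketch shows only the first rung of that ladder, but it is the same mathematics applied to a seismic section instead of a photograph.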
Vallabhaneni will lead a discussion at this year’s AAPG Annual Convention and Exhibition called, “Next-Generation Interpretation Workflows.”