The most widespread source of subsurface data is 2-D and, increasingly, 3-D seismic.
Borehole data -- such as well logs and rock samples -- provide crucial complementary information and calibration parameters.
The prevailing approach to the interpretation of these subsurface data has been, and largely still is, static.
This means that great efforts are made to describe subsurface structures and property distributions in their present state. However, understanding and modeling the past geological processes that were responsible for the present state of the subsurface have so far not been sufficiently emphasized.
In petroleum exploration and production it is essential to understand these past geological processes -- especially petroleum generation and migration -- because they determine whether or not a trap contains hydrocarbons. Hence, it is crucial to understand the dynamics of the processes responsible for present-day geological conditions.
As modeling of geological processes relies entirely on a subsurface database and related, intelligently structured data archives (often called data models), it is essential that the numerical simulation be linked as closely as possible to these data sources. This is easily achieved by direct binary access to seismic data and interpretation tools such as OpenWorks, GeoFrame, SeisWorks and IESX.
It is common practice to organize
and store subsurface data in more or less sophisticated data archives
that can be screened and manipulated electronically. An electronic data
archive enables information to be exchanged, reviewed and thereby enriched
and updated.
Even the most refined interpretation utilizing advanced interpretation software and databases, however, produces static information on stratal terminations, seismic facies, lithofacies, property distributions and so on.
Such static data archives can be brought to life -- and at the same time generate a great deal of added value -- by dynamically modeling the geological processes behind them.
The conversion of static data into a dynamic process interpretation starts with a rigorous analysis of the stratigraphic time record of the sedimentary column and the assignment of absolute ages. In this way an absolute time sequence of critical geological events is derived and a conceptual geological process model is created, forming the backbone of the dynamic process interpretation and the chain of logic for a computer model.
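As a purely illustrative sketch, such an absolute time sequence of events can be organized as a simple ordered list that later drives the simulation. All names, ages and field names below are hypothetical, not taken from any particular dataset or product:

```python
from dataclasses import dataclass

@dataclass
class GeoEvent:
    """One entry in a conceptual geological process model (hypothetical)."""
    name: str            # stratigraphic unit or event label
    age_start_ma: float  # absolute age at onset, million years before present
    age_end_ma: float    # absolute age at the end of the event
    kind: str            # "deposition", "erosion", "hiatus", ...

# Hypothetical event sequence; ages are illustrative only
events = [
    GeoEvent("Source shale deposition", 155.0, 150.0, "deposition"),
    GeoEvent("Carrier sandstone deposition", 150.0, 140.0, "deposition"),
    GeoEvent("Regional uplift and erosion", 65.0, 60.0, "erosion"),
    GeoEvent("Seal deposition", 60.0, 40.0, "deposition"),
]

# The simulation later processes these events in order of decreasing absolute age
events.sort(key=lambda e: e.age_start_ma, reverse=True)
```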
A petroleum system comprises the entire hydrocarbon source, carrier and accumulation system, and the goal must be to reconstruct its complete geological history, from origin to the present.
The main focus must be on the location and 3-D configuration of drainage areas for mature source rocks through time, and on the possible migration pathways along which the corresponding hydrocarbon charge is collected.
The modeling of the petroleum
system, i.e. the numerical simulation of the relevant processes, rigorously
follows the geological time axis.
The principal concepts and methods of this kind of modeling are well established in existing basin modeling techniques. The simulation commences with the deposition and compaction of the oldest stratigraphic units at the bottom of the system and works its way upward through younger and younger events to the present day.
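A minimal sketch of this forward, event-driven logic might look as follows, reusing the hypothetical GeoEvent list from the earlier sketch; the placeholder operations stand in for the real physics, and none of the names belong to an existing simulator:

```python
def simulate_basin(events):
    """Sketch of a forward basin simulation: events are processed from the
    oldest to the youngest along the geological time axis (placeholder logic)."""
    column = []  # stratigraphic column, oldest unit first

    for event in sorted(events, key=lambda e: e.age_start_ma, reverse=True):
        if event.kind == "deposition":
            # a new unit is added on top; all older units are buried deeper
            column.append({"name": event.name, "thickness_m": 100.0})
        elif event.kind == "erosion" and column:
            # erosion removes material from the top of the column
            column[-1]["thickness_m"] *= 0.8
        # a real simulator would also update compaction, heat flow,
        # hydrocarbon generation and migration at every event step
    return column

present_day_column = simulate_basin(events)  # 'events' from the earlier sketch
```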
The resulting dynamic modeling requirements mean that our models must be able to take the most important factors that change through geologic time into account (a minimal sketch of such a time-dependent cell state follows the list below). These include:
- Changing geometries.
- Multi-dimensional, non-steady
state thermal histories.
- Overpressures due to compaction
disequilibrium and hydrocarbon generation.
- Changing hydrocarbon phase
relationships as a function of temperature and pressure.
- Many other processes.
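Concretely, and as referenced above, these changing factors amount to a time-dependent state per grid cell that the simulator updates at every event step. A minimal, hypothetical sketch; the field names are illustrative only:

```python
from dataclasses import dataclass

@dataclass
class CellState:
    """Hypothetical per-cell state updated at every event step;
    field names are illustrative, not taken from any product."""
    depth_m: float            # changing geometry: current burial depth of the cell
    porosity: float           # reduced by compaction through time
    temperature_c: float      # from the non-steady-state thermal history
    pore_pressure_mpa: float  # may exceed hydrostatic (overpressure)
    hydrostatic_mpa: float    # reference hydrostatic pressure at this depth
    oil_saturation: float     # hydrocarbon phase behaviour depends on T and p
    gas_saturation: float

    @property
    def overpressure_mpa(self) -> float:
        """Overpressure is the pore pressure in excess of hydrostatic."""
        return self.pore_pressure_mpa - self.hydrostatic_mpa
```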
Software programs today can
provide all of this functionality. Petroleum migration processes can
be modeled in two dimensions (2-D) along geological cross sections,
but any attempt to quantify hydrocarbons in a simulated system must
be based on three dimensional (3-D) data archives and modeling techniques.
The geometric resolution depends, on the one hand, on the quality and resolution of the data and, on the other, on such crucial parameters as grid density, number of cells, available computational power and allowable computing time.
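A back-of-the-envelope example (all dimensions hypothetical) shows how quickly the cell count, and with it the computing time, grows with grid density:

```python
# Hypothetical regional model: 100 km x 80 km area, 20 stratigraphic layers,
# lateral grid spacing of 500 m.
nx = 100_000 // 500      # 200 cells in x
ny = 80_000 // 500       # 160 cells in y
nz = 20                  # one cell per layer
print(nx * ny * nz)      # 640,000 cells

# Halving the lateral spacing to 250 m quadruples the cell count:
print((100_000 // 250) * (80_000 // 250) * nz)  # 2,560,000 cells
```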
Due to the need to reduce cycle times in exploration (i.e. the time between acreage evaluation and the drilling of a successful well), and the need to run multiple models to test sensitivities, computing time is an important issue in petroleum systems modeling. Fast computing times are needed to model changing configurations of sources and migration pathways through geologic time (i.e. 4-D).
Simulation runs that reconstruct the geological history of a petroleum system, including multi-phase migration modeling, should typically be performed in several hours on a normal workstation or workstation cluster. Such "short" processing times can at present only be achieved with hybrid migration simulators that enable fully integrated 3-D Darcy flow/flowpath (also called ray tracing) modeling.
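The hybrid idea can be illustrated with a simple, hypothetical rule: solve the expensive Darcy flow equations only where permeability is low (source rocks and seals), and switch to fast flowpath (ray tracing) migration in high-permeability carrier beds. The sketch below shows that selection logic under these assumptions; the threshold value is illustrative and it is not the algorithm of any particular simulator:

```python
def choose_migration_method(permeability_md: float,
                            threshold_md: float = 1.0) -> str:
    """Hypothetical selector for a hybrid migration simulator: full Darcy
    flow in low-permeability rocks, fast flowpath (ray tracing) migration
    in high-permeability carriers. The threshold is illustrative only."""
    return "darcy" if permeability_md < threshold_md else "flowpath"

# Example: a tight source shale vs. a carrier sandstone (permeability in mD)
print(choose_migration_method(0.001))  # -> "darcy"
print(choose_migration_method(500.0))  # -> "flowpath"
```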
Simulation runs with this technology not only reconstruct the most likely generation, migration, accumulation and spilling history in a petroleum system, but at the same time reveal possible weaknesses of the 3-D database and/or inconsistencies in the conceptual geological model.
Overpressure zones can be fairly well predicted by geological process modeling, so the technology can even help to improve seismic interpretations, for instance with respect to selecting the right seismic interval velocities in overpressure-prone regions.
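One way to see the link: seismic velocity in clastic rocks broadly increases with vertical effective stress, which by Terzaghi's relation is the overburden stress minus the pore pressure; a modeled overpressure therefore implies a lower effective stress and hence a lower expected interval velocity. The sketch below uses Terzaghi's relation together with a purely illustrative, uncalibrated velocity trend:

```python
def effective_stress_mpa(overburden_mpa: float, pore_pressure_mpa: float) -> float:
    """Terzaghi's relation: vertical effective stress = overburden - pore pressure."""
    return overburden_mpa - pore_pressure_mpa

def expected_velocity_ms(eff_stress_mpa: float) -> float:
    """Purely illustrative velocity trend (not a calibrated rock-physics model):
    velocity increases with effective stress from a near-mudline value of ~1600 m/s."""
    return 1600.0 + 50.0 * eff_stress_mpa

# Hypothetical cell: 60 MPa overburden, ~30 MPa hydrostatic pore pressure
print(expected_velocity_ms(effective_stress_mpa(60.0, 30.0)))  # ~3100 m/s
# Same cell with 15 MPa of overpressure: lower effective stress, slower interval velocity
print(expected_velocity_ms(effective_stress_mpa(60.0, 45.0)))  # ~2350 m/s
```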
The new simulation technology enables regional-scale 3-D models with a million or more cells -- and, consequently, very reasonable resolution -- to be processed within acceptable time spans. It also reduces the risk of upscaling geological models to the point where oversimplification limits their value.
This kind of 3-D modeling can
therefore now be used as a guidance tool and a framework for play and
prospect evaluation throughout an entire exploration campaign. With
new data or insights it can be updated continuously.
The great advantage of this
technology is its potential to directly and immediately provide the
best possible understanding of all crucial processes responsible for
petroleum accumulation in a reproducible and quantitative manner.
Geological process modeling is thus the logical continuation and refinement of static subsurface data interpretation. It is the crucial step from a static to a dynamic interpretation of subsurface data.
Today a complete array of technological facilities is already available to extend "classical" but static subsurface data interpretations into dynamic process modeling in a sequential manner -- first seismic interpretation, then process modeling.
The next step is to extend the integration of the various technologies and data types to create even more value through synergies. This means providing proper interfaces between the relevant software packages, together with intelligent tools to interactively manipulate original data and results on both sides.
This step, without any doubt,
will dramatically accelerate the application of more intelligent (dynamic)
data interpretation tools.
The cost of this type of dynamic interpretation compares favorably with, for instance, the cost of sophisticated seismic processing including attribute analysis, or, more obviously, with the cost of drilling dry wells in deep-water environments.
All in all, a dynamic interpretation of subsurface data greatly improves our understanding of crucial geological processes -- and it narrows the range of uncertainty. Furthermore, it is the ideal vehicle for integrating the different geoscientific disciplines and for creating real links between exploration and exploitation data archives and processing tools.
The result? Logically organized workflows in interdisciplinary teams.