Reservoir models ease uncertainty
This article reviews the parameters of a good reservoir model and the aspects oil and gas engineers should consider.
As the use of petroleum for energy grew over the last century, engineers and scientists sought to better understand the nature of oil and gas reservoirs. Their efforts have been rewarded.
Immense data sets related to reservoirs are today transformed by some of the world’s most powerful computers into highly detailed 3-D geological and simulation models. Moreover, many reservoir modeling capabilities are available on PCs and laptops. The models help develop new fields and keep wells productive.
While models embody positive knowledge about a reservoir, often what they offer are probabilities. Reservoir management itself is a dynamic process that recognizes the uncertainties in reservoir performance resulting from the inability to fully characterize reservoirs and flow processes. The model addresses those uncertainties, which ensure that exploiting the reservoir remains as much art as science.
Let’s briefly review some of the measurable parameters and uncertainties involved.
The where and how of it
A reservoir is a place where fluid collects. A petroleum reservoir is a subsurface hydrocarbon pool found in the pore spaces of porous or fractured rock formations. The naturally occurring hydrocarbons, including crude oil or natural gas, found in a "conventional" reservoir are trapped by overlying rock having less permeability than that holding the hydrocarbons.
Exploiting any reservoir requires an in-depth knowledge of its geology. This includes depositional history, reservoir layering, relations among its various elements, rock attributes and fluid content, as well as fluid behavior within the geological setting.
Other types of measures relevant to the reservoir are derived from scientific and engineering disciplines that include geography, geology, geophysics, petrophysics, geomechanics, petroleum engineering and production economics.
These various measures come together in the reservoir model, where the relationships among the parameters are discovered, explored, and optimized. Rather than leave a project dependent on the engineers’ intuitive grasp of the details documented in a large body of paper documents, the interactive model exists as a single source of truth for all things reservoir, integrating multiple kinds of reservoir data acquired or generated by the engineering disciplines.
As such, the reservoir model is the basis for estimating a reservoir’s production-recovery potential and for deciding the methods of exploitation to be pursued, including investments and other important decisions.
Evolution of a tool
A well-made model is key to effective reservoir management. Geological and simulation models are used in hydrocarbon exploration, delineation, development, and production and are decision-making tools for geology and geophysics professionals, as well as oil and gas industry reservoir engineers and asset managers.
Since the introduction of reservoir models as a tool for upstream oil and gas production environments, they have evolved from a 2-D form (a stacked 2-D approach) to the now common 3-D form. In fact, the latest models encompass a fourth dimension, tracking changes in reservoir attributes over time. Finally, the growing input of real-time data into models further increases their value.
Models support decisions on where to drill, what production strategies to adopt and how to maximize oil and gas recovery from operator assets. Accurate reservoir models form the basis for increased recovery rates. A good model not only reflects the reservoir accurately, it supports assessment of the environment’s inherent uncertainties and risks.
Users are challenged to keep reservoir models current, whether in oil and gas or elsewhere. Ease in model construction and updating alleviates this challenge and prepares the way for increased use of modeling. A cost-effective, pragmatic approach to modeling is especially important in today’s low-price oil and gas environment.
As noted, good reservoir models are based on interdisciplinary knowledge. Before building a reservoir model, asset team members first work together to confirm the model’s purpose. Questions to consider include the following:
- Is the model for short-term volume analyses only or is it meant to be a sector model? A sector model is smaller and carved out of a larger regional model. It supports quick-look scenarios when trying to understand near-well-bore and drive-mechanism effects.
- Will the model be a foundation for the complete field lifecycle?
- Will the model be flexible enough to expand the areas of investigation? Based on production data, for example, can existing layers be modified and new layers and faults incorporated easily?
- What if some barriers are not structural but lithological? (Editor’s note: the lithology of a rock unit is a description of its physical characteristics visible at the outcrop, in hand or core samples or with low magnification microscopy, such as color, texture, grain size, or composition.)
The answers dictate the approach. A good model addresses each discipline’s concerns and is flexible enough to accommodate new data and goals.
Flexibility also means incorporating uncertainties into the model when they matter. Uncertainties enter the model’s workflow at different stages. At each stage they are analyzed, qualified, and if need be, incorporated. If, for example, three geophysical interpretations are postulated, all three are incorporated into the workflow. This could lead to three or more structural models, likewise incorporated. Ultimately, the workflows must include those uncertainty parameters that impact reservoir volumes.
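One way to picture how alternative interpretations multiply through the workflow is to enumerate their combinations. The sketch below is purely illustrative; the interpretation names, fault treatments, and contact depths are hypothetical placeholders, not values from any real study.

```python
from itertools import product

# Hypothetical alternatives carried at each uncertainty stage.
geophysical_picks = ["pick_shallow", "pick_base", "pick_deep"]
fault_models = ["sealing", "partially_sealing"]
contact_depths_ft = [7200, 7250]

# Every combination becomes one candidate structural model / workflow branch.
scenarios = list(product(geophysical_picks, fault_models, contact_depths_ft))
print(len(scenarios))  # 3 * 2 * 2 = 12 candidate models
```

Even a handful of alternatives per stage grows quickly, which is why workflows keep only the uncertainty parameters that materially impact reservoir volumes.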
Kinds of uncertainties include the following:
- Measurement uncertainties arising during data collection, due to tool calibration or operating specifications.
- Interpretation uncertainties relating to data analysis. For example, identical data is interpreted differently by different petrophysicists, giving rise to varying results. Similarly, geophysicists’ interpretations differ.
- Workflow uncertainties can be introduced, for example, during the upscaling of a fine-resolution geological model to a coarser-resolution simulation model due to the upscaling techniques used or geological-grid orientation relative to the simulation grid. Upscaling is necessary to overcome model-size or hardware limitations.
Uncertainties also arise due to discrepancies between the scale at which data is measured and that at which it is applied. For example, in a reservoir model, initial uncertainty may arise when upscaling from the log resolution (typically well-log data is measured in half-foot increments) to the geo-model resolution. Typically, the data is transferred to 5 to 10 feet geo-modeling cells in the vertical direction and to much larger increments, 50 to 200 feet, in the x and y directions. Similar uncertainty arises when upscaling core data to the geo-modeling grid or upscaling geo-modeling grid data to the simulation grid, which is coarser in resolution than the geological grid.
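The vertical upscaling step described above can be sketched as a simple averaging exercise. This is a minimal illustration, assuming a synthetic half-foot porosity log and equal-weight arithmetic averaging into 5-foot cells; real workflows choose the averaging rule (arithmetic, harmonic, geometric) per property, and that choice is itself a source of uncertainty.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical porosity log sampled every 0.5 ft over a 50-ft interval.
log_step_ft = 0.5
porosity_log = rng.uniform(0.10, 0.25, size=100)

# Upscale to 5-ft geo-model cells: each cell averages 10 log samples.
cell_height_ft = 5.0
samples_per_cell = int(cell_height_ft / log_step_ft)  # 10
cell_porosity = porosity_log.reshape(-1, samples_per_cell).mean(axis=1)

print(len(porosity_log), "log samples ->", len(cell_porosity), "cells")
```

The averaging preserves the interval mean but discards thin-bed detail, which is exactly the information loss the article flags as workflow uncertainty.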
Uncertainties due to sparse conditioning data must be accounted for in modeling different geological scenarios. For example, fluvial channel characteristics are rendered different ways when the well data is too sparse to guide the volume fraction parameter, or if there is not enough seismic data to establish the channel width and azimuth parameters. These are called "parameter uncertainties" and can result in the scenarios shown in figure 3.
The same input data can yield narrow or wide channels traversing the same wells. Stochastic techniques likewise derive multiple realizations, or outlooks, from the same input data.
In fact, many different narrow-channel realizations are possible simply by changing the seed number used to produce each realization, giving rise to "random seed uncertainty."
In simple terms, a seed number is an integer (a whole number) that introduces a random element into the calculation of a predicted outcome. Different seed numbers realize different narrow-channel outcomes. Uncertainties can be reduced, however, by characterizing the model using data inputs drawn from a wider range of disciplines.
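The seed behavior described above can be demonstrated in a few lines. This is a toy sketch, assuming a hypothetical normal distribution of channel widths; it is not any particular geostatistical algorithm, only an illustration of how a seed controls reproducibility.

```python
import numpy as np

def realize_channel_widths(seed, n_channels=5):
    """Draw one stochastic realization of channel widths (ft) from an
    assumed distribution; only the seed differs between runs."""
    rng = np.random.default_rng(seed)
    return rng.normal(loc=100.0, scale=20.0, size=n_channels)

# Different seeds give different, equally plausible outcomes from
# identical inputs; repeating a seed reproduces its realization exactly.
r1 = realize_channel_widths(seed=1)
r2 = realize_channel_widths(seed=2)
r_repeat = realize_channel_widths(seed=1)
```

Because each seed is equally valid, practitioners typically run many seeds and study the spread of outcomes rather than trusting any single realization.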
Thus, if a seismic attribute containing a channel-system imprint is used to condition the reservoir model, width and azimuth scenarios that deviate from that imprint are eliminated, reducing uncertainty.
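The scenario-elimination idea can be sketched as a simple filter. All numbers and field names below are hypothetical, assuming the seismic attribute has been reduced to plausible ranges for channel width and azimuth.

```python
# Hypothetical seismic-derived constraints on channel geometry.
seismic_width_range_ft = (80.0, 160.0)
seismic_azimuth_range_deg = (40.0, 70.0)

candidate_scenarios = [
    {"width_ft": 60.0, "azimuth_deg": 55.0},   # too narrow
    {"width_ft": 120.0, "azimuth_deg": 50.0},  # consistent with seismic
    {"width_ft": 140.0, "azimuth_deg": 85.0},  # azimuth deviates
]

def honors_seismic(scenario):
    """Keep only scenarios whose geometry fits the seismic imprint."""
    w_lo, w_hi = seismic_width_range_ft
    a_lo, a_hi = seismic_azimuth_range_deg
    return (w_lo <= scenario["width_ft"] <= w_hi
            and a_lo <= scenario["azimuth_deg"] <= a_hi)

retained = [s for s in candidate_scenarios if honors_seismic(s)]
```

Each additional discipline's data acts as another such filter, which is why conditioning on more inputs narrows the uncertainty envelope.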
More about workflow
A good reservoir model can be updated quickly and efficiently. User friendliness shows in appropriate model-building steps that are documented in detail. When new data arrives and updates are called for, modeling tasks must be easy to comprehend and execute.
The workflow and attendant documentation inform asset team members about the model’s construction, which uncertainties were considered, and parameter cutoffs and algorithms used to calculate model attributes. Workflow notes capture the "custom" aspects of the model.
A correct workflow allows model attributes to be reproduced time and time again. When new data is added, quick, automatic updates save time. A properly integrated workflow encompasses the contributions of many disciplines and synchronizes the geological and dynamic models. New input propagates through the entire workflow, from seismic to simulation and beyond.
We’ve looked at some reservoir model attributes that ensure its use as a management decision-making tool. Several things are needed to benefit from it: gathering knowledge across the engineering disciplines; understanding the model’s purpose from the get-go; and addressing, by means of simulation, the uncertainties that most impact reservoir behavior.
A user-friendly workflow and a focus on integration from seismic to simulation are the best way to address uncertainty through reservoir modeling. Get these elements right and reservoir modeling secures its rightful place as a focal point for decision-making and a crucial tool in helping operators develop and produce resources.
Raj Damodaran is chief geoscientist for Roxar at Emerson Process Management.