5905657 | Performing geoscience interpretation with simulated data | 1999-05-18 | Celniker | 364/578 |
5838634 | Method of generating 3-D geologic models incorporating geologic and geophysical constraints | 1998-11-17 | Jones et al. | 367/73 |
5648937 | Method and apparatus for correlating geological structure horizons from velocity data to well observations | 1997-07-15 | Campbell | 367/27 |
5539704 | Bayesian sequential Gaussian simulation of lithology with non-linear data | 1996-07-23 | Doyen et al. | 367/73 |
5416750 | Bayesian sequential indicator simulation of lithology from seismic data | 1995-05-16 | Doyen et al. | 367/73 |
5321613 | Data fusion workstation | 1994-06-14 | Porter et al. | 364/420 |
5103920 | Surveying system and method for locating target subterranean bodies | 1992-04-14 | Patton | 175/45 |
4679174 | Method for seismic lithologic modeling | 1987-07-07 | Gelfand | 367/38 |
4340934 | Method of generating subsurface characteristic models | 1982-07-20 | Segesman | 364/422 |
FR2755244A1 | 1998-04-30
WO1998027498A1 | Modeling geological structures and properties | 1998-06-25
This application claims the benefit of U.S. Provisional Application No. 60/119,821, filed Feb. 12, 1999, and U.S. Provisional Application No. 60/165,333, filed Nov. 12, 1999.
This invention is related to subsurface modeling, and is more particularly concerned with a parametric subsurface modeling method, apparatus, and article of manufacture that use uncertainty estimates of subsurface model parameters.
Subsurface models are typically created by geoscientists and engineers to allow development strategies for the subsurface area to be evaluated. Models of this type are commonly created in connection with the development of hydrocarbon reservoirs and mining sites, but they can also be used during drilling and related activities where the physical properties of the subsurface area are important. This patent application will focus on the process of creating and updating a model of a subsurface hydrocarbon reservoir, but it should be understood that this merely represents one specific example of how a model of any subsurface area may be created and updated.
Currently, hydrocarbon reservoir modeling is performed most commonly in high-risk, high-profile situations. Typical applications include discoveries in new areas, deepwater exploration, fields in which production surprises or drilling hazards have been encountered, fields in which secondary and tertiary recovery activities are planned, and fields which are being considered for sale or abandonment. The failure to adequately model hydrocarbon reservoirs can have numerous adverse financial consequences, including inaccurate reserve calculations, drilling or completion problems, improper production facility sizing, and suboptimal well placement.
The general problem addressed by this invention is how to construct a model of a subsurface area that is in agreement with multiple sets of measurement data. A model that is in agreement with all of the measurement data obtained from the reservoir can help address many of the problems noted above. By ‘reservoir model’ we mean a quantitative parameterized representation of the subsurface in terms of geometries and material properties. The geometrical model parameters will typically identify geological boundaries, such as contacts between different geologic layers, faults, or fluid/fluid interfaces. The material model parameters will typically identify properties of distributed subsurface materials, such as seismic wave velocities, porosities, permeabilities, fluid saturations, densities, fluid pressures, or temperatures.
By ‘agreement’ we mean that the data predicted from the reservoir model fit measurements made on the actual reservoir (seismic data, drilling data, well logging data, well test data, production history data, permanent monitoring data, ground penetrating radar data, gravity measurements, etc.). Virtually all types of measurement data have quantifiable uncertainties and the reservoir model agrees with the measurement data when the difference between data predicted by the reservoir model and measurement data obtained from the reservoir is less than this inherent measurement uncertainty. While creating a reservoir model that fits one data set is a relatively straightforward task, it is much more difficult to ensure that the model is in agreement with multiple data sets, particularly if the data sets consist of different types of data.
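As a concrete (and deliberately simple) reading of this definition, the sketch below checks whether predicted data fit measurements to within a per-sample uncertainty; the function name and the reduced chi-square threshold are illustrative assumptions, not part of this disclosure:

```python
import numpy as np

def in_agreement(d_measured, d_predicted, sigma):
    """Model agrees with the data when the misfit, normalized by the
    measurement uncertainty sigma, is no larger than the noise alone
    would produce (reduced chi-square <= 1)."""
    r = (np.asarray(d_measured) - np.asarray(d_predicted)) / sigma
    return float(np.mean(r ** 2)) <= 1.0
```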
A reservoir model, however, is nonunique even if it is made to fit a variety of data, because different values of material properties and geometries within the model can result in similar predicted measurement values. In other words, the reservoir model has inherent uncertainties: each of the numerical parameters in the reservoir model (e.g., values of material properties within a layer) can take a range of values while the model remains in agreement with the data. This range in parameter values is the uncertainty associated with the reservoir model. The invention described herein is a method to integrate information from multiple measurements and to obtain a reservoir model with quantitative uncertainties in the model parameters. A model of the reservoir that fits the data and has quantified uncertainties can be used to assess the risk inherent in reservoir development decisions (e.g., deciding on the location of additional wells) and to demonstrate the value of additional measurements by showing how these measurements decrease uncertainties in model parameters of interest (e.g., the location of a drilling target or hazard).
A Shared Earth Model (SEM) is a geometrical and material property model of a subsurface area. The model is shared, in the sense that it integrates the work of several experts (geologists, geophysicists, well log analysts, reservoir engineers, etc.) who use information from a variety of measurements and interact with the model through different application programs. Ideally, the SEM contains all available information about a reservoir and thus is the basis to make forecasts and plan future actions.
Yet, in any practical case, the information in the measurements is not sufficient to uniquely constrain the parameters (geometries and material properties) of a SEM. As noted above, any SEM has an associated uncertainty, defined here as the range that model parameters can take while fitting available measurements.
The invention has two primary aspects. The first aspect is a method to quantify and update model parameter uncertainties based on available measurements. One embodiment of this method is based on Bayes' rule, with SEM uncertainty quantified by a posterior probability density function (PDF) of the model parameters, conditioned on the measurements used to constrain the model. This posterior PDF may be approximated by a multivariate normal distribution, which is fully described by the posterior mean and covariance matrix of the SEM parameters. Alternatively, one can use a Monte Carlo method to obtain a sample of models drawn from the posterior PDF. This sample of models spans the uncertainty implied by the measurements.
The second aspect is how such a measure of uncertainty acts as a ‘memory’ of the SEM and can be used for consistent model updating. Quantified uncertainties provide a mechanism to ensure that updates of the SEM based on new data (e.g., well data) are consistent with information provided by data examined previously (e.g., surface seismic data). In particular, we show through a simple example how the effects of a local update of the model can be propagated using the posterior covariance matrix of the SEM parameters. We also show how to update a sample of models obtained by the Monte Carlo method to include new information.
The ideal of a SEM is that all specialists should be able to interact with a common geometry and material property model of the reservoir, incorporating changes into the model using measurements from their own domain of expertise, while maintaining model consistency with previous measurements. This SEM representation would always be consistent with all available information and should be easy to update as soon as new measurements become available (e.g., from additional wells). Model building would not be a task done episodically, but instead the reservoir model would evolve incrementally as more and more information becomes available during development and production.
While acquiring more measurements can reduce uncertainty, it is important to weigh the cost of data acquisition against the benefits of reducing uncertainty. This can be done using the tools of decision theory, where different decisions are compared given their associated gains/costs and current uncertainties. A consistent quantification of uncertainties can assist oil companies in making data acquisition, drilling, or development decisions.
Currently, reservoir models are simply modified to fit new data, and confirming that the modification is consistent with the previously obtained measurement data is left to the discretion of the user. The reservoir model may be the result of years of effort and may incorporate measurement data from a wide variety of sources. A user will often only confirm that the change made is consistent with the measurement data within his or her area of expertise (a well log analyst may confirm, for instance, that the change made is consistent with the other well logging data, but may not determine whether the change has introduced an inconsistency with the seismic or geologic data from the area). Many reservoir simulations rely heavily on production data from wells and only four types of geological or geophysical reservoir information: structure of the top of the reservoir, reservoir thickness, porosity, and the ratio of net pay to gross pay. These maps are often constructed from seismic and well log data alone. Incorporating all available data, such as core analyses, seismic-guided reservoir property distributions, and fluid analyses, and making certain that the reservoir model is consistent with these different types of data, is a cost-effective way to strengthen and validate reservoir models across disciplines.
An iterative method to obtain a model that fits some of the data has been described by George W. Celniker in commonly-assigned U.S. Pat. No. 5,905,657, issued May 18, 1999 and incorporated herein by reference. In the Celniker method, the user examines the difference between predicted and measured data, modifies the model attempting to decrease this difference, and repeats the procedure until the fit is satisfactory. This procedure may be adequate if all data sets are considered simultaneously, which may be impractical for diverse and large data sets. If instead the model is modified to fit each of the N data sets in turn (say, from d_1 to d_N), there is no guarantee that the changes made to fit a given data set will preserve the agreement achieved earlier with the data sets already examined.
The invention comprises a parametric subsurface modeling method, apparatus, and article of manufacture that use measurement data to create a model of a subsurface area. The method includes the steps of: creating a parameterized model having an initial estimate of model parameter uncertainties; considering measurement data from the subsurface area; updating the model to fit the measurement data, the updated model having an updated estimate of model parameter uncertainties; and repeating the considering and updating steps with additional measurement data. A computer-based apparatus and article of manufacture for implementing the method are also disclosed. The method, apparatus, and article of manufacture are particularly useful in assisting oil companies in making hydrocarbon reservoir data acquisition and field development decisions. Features of the invention, preferred embodiments and variants thereof, possible applications and their advantages will become appreciated and understood by those skilled in the art from the following detailed description and drawings.
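The iterative structure of the method can be summarized in a short sketch (Python; all names are illustrative placeholders, and the update function stands in for the optimization or Monte Carlo machinery described in the detailed description below):

```python
def build_subsurface_model(model, uncertainty, data_sets, update):
    """Refine a subsurface model by considering each measurement data
    set in turn and updating both the model and the estimate of model
    parameter uncertainties."""
    for data in data_sets:                                  # Consider Measurement Data step
        model, uncertainty = update(model, uncertainty, data)  # Update step
    return model, uncertainty
```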
The model of the subsurface area will typically have geometrical model parameters representing geological boundaries and material parameters representing properties of distributed subsurface materials. The model of the subsurface area may be, for instance, a layered medium representing a layered earth with material properties that are constant or variable within each layer; a geocellular model having material property values defined on a regular or irregular three-dimensional grid; or a geometry-based model having material property values defined on a plurality of discrete geometrical sub-regions within the subsurface area.
The initial information may consist of prior knowledge of the spatial distribution of material properties in the subsurface, e.g., the increase of seismic velocity with depth. The initial information may come from physical laws or measurements made in subsurface areas other than the one being modeled. The initial measurement data may consist of seismic data, drilling data, well logging data, well test data, production history data, permanent monitoring data, ground penetrating radar data, gravity measurements, etc. or various combinations of these types of data.
The initial estimate of model parameter uncertainties will typically consist of probability density functions, and preferably consist of multivariate normal/lognormal probability density functions definable by mean vectors and covariance matrices.
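As an illustration, an initial uncertainty estimate of this multivariate normal form can be represented by a mean vector and a covariance matrix; the parameterization and numerical values below are assumed for illustration only:

```python
import numpy as np

# Assumed three-layer parameterization: [h1, h2, v1, v2, v3]
# (two thicknesses in meters, three velocities in m/s).
prior_mean = np.array([200.0, 150.0, 2000.0, 2500.0, 3000.0])
prior_cov = np.diag([50.0, 50.0, 300.0, 300.0, 300.0]) ** 2  # std. devs squared

# Realizations drawn from this PDF span the initial uncertainty.
rng = np.random.default_rng(0)
realizations = rng.multivariate_normal(prior_mean, prior_cov, size=100)
```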
In the Consider Measurement Data step, measurement data from the subsurface area are compared with the corresponding data predicted by the current model. In the Update Model and Uncertainty Estimate step, the model parameters are modified to fit the measurement data and the estimate of model parameter uncertainties is updated to reflect the information provided by the new data. The Consider Measurement Data step and the Update Model and Uncertainty Estimate step are then repeated as additional measurement data become available. The measurement data considered in these steps may be any of the types identified above, alone or in combination.
At least two additional alternative methods for managing the model update process are possible. In some cases, it may be preferable to allow the model parameters to be changed only when the change is consistent with the earlier estimate of model parameter uncertainties. In other cases, it may be preferable to identify inconsistencies between a proposed change and the earlier estimate of model parameter uncertainties and bring them to the user's attention before the change is accepted.
A preferred embodiment of the inventive method will now be described in substantially more detail. The inventive method addresses two primary issues: how to quantify uncertainties in a SEM given measurements and how to use these uncertainties to ensure consistent model updating. The latter is an important issue because in a SEM environment one should be able to continuously update the model; however, model updates based on a set of new data must be consistent with the information provided by data examined previously. We will now show how to generally address these issues using two simple examples where model uncertainties are calculated and updated using seismic and well data.
The first example will be used to illustrate the quantification of uncertainty by a multivariate normal distribution and the consistent updating of the model, and will use a simple two-dimensional SEM containing three layers (see the accompanying diagrams).
We suppose that at the outset there are measured surface seismic data in the form of four traces recorded at different locations.
The second example, discussed below in connection with the Monte Carlo approach, uses a one-dimensional layered medium.
Quantifying SEM Uncertainty from Measurements
To quantify uncertainty, we use the Bayesian approach widely adopted in the statistical and geophysical inversion literature. Denote the SEM parameters with a vector m = [m_1, m_2, …, m_M] and the measurement data with a vector d. Bayes' rule then gives

p(m|d, I) ∝ p(m|I) L(m|d, I),
where p(m|d, I) is the posterior PDF (the PDF of the SEM parameters m given the data d), p(m|I) is the prior PDF, L(m|d, I) is the likelihood function, and ∝ denotes proportionality. Writing an expression for the model PDF might seem an impossible task given the poor state of knowledge of the reservoir. It becomes possible by making the PDFs conditional on some prior information I, which includes the parametric form of the model, the noise model, and the accuracy of the physics and simulators used to predict the measurements for a given value of the model parameters.
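A minimal numerical illustration of this rule for a single model parameter follows; the Gaussian shapes and values are assumptions chosen for illustration:

```python
import numpy as np

m = np.linspace(1500.0, 3500.0, 2001)                    # candidate velocities, m/s
prior = np.exp(-0.5 * ((m - 2500.0) / 400.0) ** 2)       # p(m|I)
d_obs, sigma = 2650.0, 100.0                             # one noisy measurement of m
likelihood = np.exp(-0.5 * ((d_obs - m) / sigma) ** 2)   # L(m|d, I)
posterior = prior * likelihood                           # Bayes' rule, unnormalized
posterior /= np.sum(posterior) * (m[1] - m[0])           # fix the proportionality constant
```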
The prior PDF quantifies what is known about the model parameters from the prior information only, i.e., independently of the information provided by the data. For the initial SEM of the first example, the prior PDF is taken to be uniform over the layer thicknesses and normal over the layer velocities.
This prior PDF represents an initial state of information where the layer thicknesses are unknown, while the prior PDF of the velocities reflects what is expected for sedimentary rocks.
Information provided by measurements is quantified by the likelihood function, which is formally the probability of observing the data d when the model parameters equal m. For the seismic traces of the first example, assuming the measurement noise to be normally distributed with covariance matrix C_d, the likelihood function is

L(m|d, I) ∝ exp[-(1/2) (d - g(m))^T C_d^-1 (d - g(m))],
where g(m) is a forward modeling (i.e. simulation, prediction) operator that returns the value of the data calculated for a given value of the SEM parameters. In our case, this operator gives the data computed by convolving a seismic wavelet (assumed known) with the reflection coefficient sequence corresponding to the parameters in the SEM. The combination of the prior and likelihood tells us what we know about the model parameters a posteriori.
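A sketch of such a convolutional forward operator is given below; the sampling interval, trace length, and constant-density assumption are illustrative choices, not the specific implementation of this disclosure:

```python
import numpy as np

def g(velocities, thicknesses, wavelet, dt=0.004, n_samples=256):
    """Convolutional forward model: place a reflection coefficient at the
    two-way travel time of each layer interface and convolve with a known
    wavelet. With density assumed constant, impedance contrasts reduce to
    velocity contrasts."""
    reflectivity = np.zeros(n_samples)
    t = 0.0
    for i, h in enumerate(thicknesses):      # one interface below each buried layer
        t += 2.0 * h / velocities[i]         # two-way time through layer i
        k = int(round(t / dt))
        if k < n_samples:
            v1, v2 = velocities[i], velocities[i + 1]
            reflectivity[k] = (v2 - v1) / (v2 + v1)
    return np.convolve(reflectivity, wavelet)[:n_samples]
```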
The accompanying diagrams show the posterior PDF obtained by combining this prior and likelihood for the first example, together with its approximation by a multivariate normal distribution.
In this approximation, the posterior PDF is fully described by the mean value of the model parameters μ and by the covariance matrix C. The mean (indicated by the white triangles in the diagrams) provides the best estimate of the model parameters, while the covariance matrix quantifies the spread around that estimate.
A general practical recipe to evaluate μ and C uses nonlinear optimization of an objective function equal, up to an additive constant, to the logarithm of the posterior PDF:

log p(m|I) + log L(m|d, I).
A generic optimizer (e.g., quasi-Newton) can be used to find the maximum of the objective function, and the value of m at the maximum can be taken to correspond to the posterior mean μ. The posterior covariance matrix C can be computed as the inverse of the Hessian matrix (the matrix of second derivatives) of the objective function evaluated by finite differences around μ.
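A sketch of this recipe, assuming a smooth user-supplied negative log posterior, is given below; minimizing the negative log posterior is equivalent to maximizing the objective function described above:

```python
import numpy as np
from scipy.optimize import minimize

def posterior_mean_and_covariance(neg_log_posterior, m0, h=1e-4):
    """Find the posterior mean with a quasi-Newton optimizer, then
    estimate the posterior covariance as the inverse of the Hessian of
    the negative log posterior, evaluated by central finite differences
    at the optimum."""
    mu = minimize(neg_log_posterior, m0, method="BFGS").x
    n = mu.size
    H = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (neg_log_posterior(mu + ei + ej)
                       - neg_log_posterior(mu + ei - ej)
                       - neg_log_posterior(mu - ei + ej)
                       + neg_log_posterior(mu - ei - ej)) / (4.0 * h * h)
    return mu, np.linalg.inv(H)
```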
The result of applying the nonlinear optimization procedure to the SEM and data of the first example is shown in the accompanying diagrams.
Generic optimization algorithms are typically ‘local’, in the sense that they find a maximum by moving toward higher values of the objective function from a starting point. Therefore, if the objective function has multiple maxima (as often is the case for band-limited seismic data), the optimizer may converge to a meaningless local maximum. If there are multiple local maxima, these optimizers will converge to the global maximum only if they start from a value of m that is close to (in the sense of being downhill from) that maximum. For these optimizers to be useful in practice, the user should have the capability to search for a reasonably good starting point by trial-and-error interactions with the SEM.
The uncertainties computed from the Hessian matrix are also ‘local’ because they are obtained from the local curvature of the objective function near its maximum. The uncertainties computed in this fashion will be accurate only if the objective function is well approximated by a quadratic, i.e., if the posterior PDF is well approximated by a multivariate normal distribution. An alternative is to use a Monte Carlo sampling strategy where values of the model parameter vector m are sampled from the posterior PDF. While uncertainties computed from the Hessian matrix are likely to be useful in many instances, there may be cases where a Monte Carlo approach is necessary to obtain a sufficiently accurate uncertainty quantification.
The Monte Carlo approach is illustrated in the second example, where the subsurface is represented by a layered medium described by the parameter vector

m = [n, t, v],
where n is the number of layers, t is a vector of travel times to the layer interfaces, and v is a vector of compressional wave velocities or acoustic impedances in each of the layers.
The three profiles in the accompanying diagram show the data that constrain the layered medium in this example.
A method that may be used to obtain a sample from the posterior PDF is the Metropolis-Hastings algorithm. Each step of the algorithm consists of choosing a “candidate” layered medium by perturbing the current one (e.g., by adding a layer, deleting a layer, or changing the travel time to a layer interface). This amounts to choosing a candidate parameter vector m′ from a “candidate PDF” q(m′|m), that is, the distribution of possible candidates given the current value m of the parameter vector. In the Metropolis-Hastings algorithm, the candidate is accepted with probability

α = min{1, [p(m′|d, I) q(m|m′)] / [p(m|d, I) q(m′|m)]}.
It can be shown that a sequence of models obtained with this simple algorithm will eventually sample the posterior PDF p(m|d, I). Once a sample of layered media is obtained, it is easy to convert travel time to depth by starting from a known travel time-depth tie and computing the thickness of the i-th layer as

h_i = v_i Δt_i / 2,

where Δt_i is the two-way travel time through the i-th layer and v_i is the compressional wave velocity in that layer.
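A fixed-dimension sketch of this sampler and of the travel time-to-depth conversion follows; the symmetric Gaussian proposal (which makes the q terms in the acceptance ratio cancel) is a simplifying assumption, and the moves that add or delete layers are omitted:

```python
import numpy as np

def metropolis_hastings(log_posterior, m0, step, n_steps=20000, seed=0):
    """Sample the posterior PDF by perturbing the current model and
    accepting each candidate with probability min(1, ratio)."""
    rng = np.random.default_rng(seed)
    m = np.array(m0, dtype=float)
    lp = log_posterior(m)
    sample = []
    for _ in range(n_steps):
        cand = m + step * rng.standard_normal(m.size)  # candidate model m'
        lp_cand = log_posterior(cand)
        if np.log(rng.random()) < lp_cand - lp:        # accept/reject
            m, lp = cand, lp_cand
        sample.append(m.copy())
    return np.array(sample)

def interface_depths(two_way_times, velocities, z0=0.0):
    """Convert cumulative two-way travel times to interface depths using
    h_i = v_i * delta_t_i / 2, starting from a known tie at depth z0."""
    dt = np.diff(np.concatenate(([0.0], two_way_times)))  # time in each layer
    return z0 + np.cumsum(np.asarray(velocities) * dt / 2.0)
```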
The Monte Carlo approach provides a more detailed and accurate quantification of uncertainty compared to the multivariate normal distribution method described above. A Monte Carlo approach such as that shown in this example is, however, substantially more computationally intensive.
Using SEM Uncertainty for Consistent Model Updating
The uncertainties computed and displayed in the preceding examples act as a ‘memory’ of the information contained in the data already examined, and this memory is what makes consistent updating of the SEM possible.
The most general way to update the model and its uncertainty when new data become available is to repeat the Bayesian calculation described above, treating the current posterior PDF as the prior PDF for the new data. Quicker updating methods, however, are often preferable in practice.
An important use of quantified uncertainty is in propagating model updates to ensure consistency of the SEM with previously examined data. We illustrate the first method for consistent model updating, which is applicable when the uncertainty is quantified in a multivariate normal PDF form, by considering what happens when new information from a well becomes available. The simplest update occurs when the well directly measures a layer thickness that was previously constrained only by the seismic data.
If layer thicknesses in the model are simply reset to the values observed in the well, the layer velocities must be adjusted correspondingly so that the travel times measured by the seismic data remain matched.
In the case of time-depth conversion this is not unrealistic, and the consequences of changes in layer thicknesses can be calculated in a straightforward way. But in the general case where a SEM has been constrained by a variety of data, there is no obvious way to compute how the effects of a local update should be propagated to maintain consistency.
We now describe a simple mechanism where quantified uncertainties are used to propagate the effects of a local model update to ensure consistency of a SEM with data examined previously. The mechanism uses the transmission of information provided by Bayes' rule, as encoded in the posterior covariance matrix of the SEM parameters.
At the limit where the layer thickness is fixed with absolute certainty by the well observations (i.e., its remaining uncertainty shrinks to zero), the consistent update can be computed from the standard properties of the multivariate normal distribution. Partition the parameter vector, its posterior mean, and its posterior covariance matrix as

m = [m_1, m_2], μ = [μ_1, μ_2], C = [[C_11, C_12], [C_21, C_22]],

where the vector m_1 contains the parameters fixed by the new information (here, the layer thickness measured in the well) and the vector m_2 contains the remaining parameters. It can be shown that the mean of the parameters in m_2, conditional on m_1 taking the fixed value m_1*, is

μ_2|1 = μ_2 + C_21 C_11^-1 (m_1* - μ_1),

and the conditional covariance of the parameters in m_2 is

C_2|1 = C_22 - C_21 C_11^-1 C_12.
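These formulas translate directly into a short routine; the sketch below (illustrative names, with no claim to this disclosure's implementation) conditions a multivariate normal SEM uncertainty on a subset of parameters fixed by new information:

```python
import numpy as np

def condition_on_fixed(mu, C, fixed_idx, fixed_values):
    """Fix the parameters m_1 (e.g., a layer thickness observed in a
    well) and propagate the effect to the remaining parameters m_2
    using the conditional mean and covariance formulas above."""
    idx1 = np.asarray(fixed_idx)
    idx2 = np.setdiff1d(np.arange(mu.size), idx1)
    C11 = C[np.ix_(idx1, idx1)]
    C12 = C[np.ix_(idx1, idx2)]
    C21 = C[np.ix_(idx2, idx1)]
    gain = C21 @ np.linalg.inv(C11)                   # C_21 C_11^-1
    mu2 = mu[idx2] + gain @ (np.asarray(fixed_values) - mu[idx1])
    C22 = C[np.ix_(idx2, idx2)] - gain @ C12
    return idx2, mu2, C22
```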
This approach can be easily applied to the first example described above.
The accompanying diagrams show the consistently updated model and the seismic data it predicts.
The limitations of this mechanism follow from the limitations on the quantification of uncertainty noted in the previous section. If the multivariate normal representation we use here does not approximate the posterior PDF closely, the computed consistent update will suffer. In our example, while the seismic data predicted by the consistently updated SEM are close to the measurements, the fit can clearly still be improved.
On the other hand, the updated model should be close to the best value and thus an automated optimization applied at this stage should have a good chance of succeeding. Starting from the consistently updated model, the nonlinear optimization procedure described above converges to a model that fits both the seismic data and the well observation.
In the previous section we also described a Monte Carlo approach to quantify uncertainty. If the model uncertainty is quantified in a multivariate normal distribution whose mean μ and covariance matrix C are evaluated from the result of Monte Carlo sampling, the consistent update method we have illustrated above can be used to compute a quick update, i.e. without requiring the Monte Carlo sampling process to be repeated. We now illustrate a second method that uses additional information to directly update the sample from the posterior PDF obtained by a Monte Carlo method.
The method we describe can be applied when the additional information that becomes available directly controls the properties of the posterior PDF sample. In the layered-medium example, suppose that a well is drilled and that the depth of a layer interface is measured: the new information directly constrains the interface depths of the sampled layered media.
The accompanying diagrams compare the sample of layered media before and after this update, showing how the well information narrows the range of models consistent with all of the available data.
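One simple realization of such a sample update, assuming the new information is an interface depth measured in a well with a known tolerance (the names and the tolerance handling are illustrative assumptions), is to retain only the sampled models that honor the observation:

```python
def retain_consistent_models(sample, predict_depth, observed_depth, tol):
    """Keep only the Monte Carlo models whose predicted interface depth
    matches the new well observation to within its uncertainty; the
    surviving subsample represents the updated posterior PDF."""
    return [m for m in sample
            if abs(predict_depth(m) - observed_depth) <= tol]
```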
Uncertainty quantification and consistent model updating can improve significantly the efficiency of constructing and modifying a SEM. Because industry-standard interpretation workflows do not account for uncertainty, model consistency is maintained by making elements of the model ‘rigid’ as the interpretation progresses down the workflow. The term ‘rigid’ means here that once a domain expert has set the optimal values for parameters under his control (e.g., a geophysicist interpreting the model framework), these values are not changed by later experts (e.g., a reservoir engineer) for fear that the model will no longer be consistent with the previous data. For example, once the model framework is fixed, the reservoir engineer is left with only the flow parameters to adjust during history matching. Explicitly accounting for uncertainty would allow that same reservoir engineer to adjust all parameters of the model within their acceptable ranges to obtain the best history match. The remaining uncertainty in the model then provides input to decision-making tools which, for example, can be used to estimate a PDF of the net present value of a reservoir. Thus, uncertainty quantification allows us to approach the ideal of a SEM that is constrained by as many data as possible, can be easily updated, and ties directly into decision-making tools.
The foregoing description of the preferred and alternate embodiments of the present invention has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise examples described. Many modifications and variations will be apparent to those skilled in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, thereby enabling others skilled in the art to understand the invention, its various embodiments, and the various modifications that are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the accompanying claims and their equivalents.