Filling with petrophysical properties

Academic year: 2022



One of the main challenges of geostatistics in reservoir characterization is to populate a portion of a 3-D earth model with petrophysical properties. This operation must be carried out while honoring the information available along well logs: in our jargon, we say that it must be conditional. No restriction is placed on this conditioning information, either in number or in geometry (vertical, deviated or even horizontal wells are allowed).

As geostatistics takes full advantage of any spatial correlation between variables, it is essential that the data first be transformed into the depositional framework, where horizontal correlation makes sense. In this initial system, we can even assume that faulting is not relevant. This geometrical transformation should be performed beforehand by your package on the well conditioning information. Our software then populates a regular grid with petrophysical parameters, which should ultimately be brought back to the structural position.
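The back-and-forth between structural and depositional coordinates can be sketched as a proportional flattening between two picked horizons. The helper names below and the conformable-layering assumption are illustrative only, not the API of any specific package:

```python
def to_depositional(z, z_top, z_base):
    """Map a structural depth z (between the picked top and base horizons
    at a given (x, y) location) to a relative stratigraphic coordinate in
    [0, 1], so that samples line up in the depositional framework.
    Assumes conformable (proportional) layering between the two horizons."""
    return (z - z_top) / (z_base - z_top)

def to_structural(t, z_top, z_base):
    """Inverse mapping: bring a depositional coordinate back to depth."""
    return z_top + t * (z_base - z_top)

# Hypothetical well sample: depth 1520 m, horizons picked at 1500 m and 1580 m.
t = to_depositional(1520.0, 1500.0, 1580.0)   # 0.25
z = to_structural(t, 1500.0, 1580.0)          # back to 1520.0
```

The same pair of mappings is applied once to the well data (forward) and once to the populated grid (inverse).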

3-D block model in structural position painted in petrophysical parameters

The standard geostatistical operations consist in propagating information known only at a few conditioning samples onto a large number of grid nodes or cells, by linearly combining these data. Hence these techniques are limited to variables for which linear combinations make sense (we say that the variables are additive). This is the case for most variables. However, let us mention the permeability tensor, which does not belong to this category and should be processed only after an adequate transformation.
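As a minimal illustration of the additivity issue, consider averaging two permeability samples. The correct upscaled value depends on the flow configuration, and working in log space is one common transformation that makes linear combinations meaningful; the numbers are hypothetical:

```python
import numpy as np

# Two permeability samples in mD.  A plain arithmetic mean is dominated by
# the high value and is generally not a meaningful "average" permeability.
k = np.array([10.0, 1000.0])

arithmetic = k.mean()                    # 505.0
geometric = np.exp(np.log(k).mean())     # 100.0, i.e. the mean in log space

# Kriging or simulating log(k), then exponentiating the result, is one
# common way to restore additivity before populating the grid.
```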



Estimation

The first geostatistical technique, called kriging, provides the estimation of a given variable conditioned by data measured at scattered locations. At any point in space (and in particular at the nodes of the regular grid), it produces the most probable interpolated value, constrained by the information. Its only prerequisite is the knowledge of the variogram function, which measures the spatial continuity of the variable of interest: this is usually referred to as the (geostatistical) model.

Experimental variogram with the number of pairs per lag (thin line) and the fitted model (thick line)
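A minimal 1-D sketch of the kriging system, assuming a known mean (simple kriging) and an exponential variogram model; the data values and parameters are illustrative:

```python
import numpy as np

def exp_variogram(h, sill=1.0, rng=100.0):
    """Exponential variogram model gamma(h) with practical range `rng`."""
    return sill * (1.0 - np.exp(-3.0 * h / rng))

def simple_kriging(x_data, z_data, x_target, mean=0.0, sill=1.0, rng=100.0):
    """Simple kriging of a 1-D variable at one target location.

    Solves C w = c0, where C holds the data-to-data covariances and c0 the
    data-to-target covariances (covariance = sill - variogram).  Returns
    the estimate and the kriging (error) variance."""
    h = np.abs(x_data[:, None] - x_data[None, :])
    C = sill - exp_variogram(h, sill, rng)
    c0 = sill - exp_variogram(np.abs(x_data - x_target), sill, rng)
    w = np.linalg.solve(C, c0)
    estimate = mean + w @ (z_data - mean)
    variance = sill - w @ c0
    return estimate, variance

x = np.array([0.0, 50.0, 120.0])      # hypothetical sample locations
z = np.array([0.3, 0.7, 0.4])         # hypothetical porosity values
est, var = simple_kriging(x, z, x_target=60.0, mean=z.mean())
```

Note that kriging is an exact interpolator: at a data location the estimate reproduces the data value and the kriging variance drops to zero.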

This technique has been gradually improved in order to account for various properties of the variable, such as the presence of measurement errors, the evidence of a drift in a given direction or any other particularity.

Moreover, this same technique has been extended to study simultaneously several variables which are spatially correlated, introducing the cokriging technique, where the model involves not only the variogram of each variable but also the cross-variograms of the variables considered two by two.
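On a regular 1-D grid, the experimental cross-variogram can be sketched as half the average product of the increments of the two variables at a given lag; the helper and the sample values below are illustrative:

```python
import numpy as np

def cross_variogram(z1, z2, lag):
    """Experimental cross-variogram of two variables sampled on a regular
    1-D grid: half the mean product of their increments at the given lag."""
    d1 = np.asarray(z1)[lag:] - np.asarray(z1)[:-lag]
    d2 = np.asarray(z2)[lag:] - np.asarray(z2)[:-lag]
    return 0.5 * np.mean(d1 * d2)

# With z2 = z1 this reduces to the ordinary experimental variogram; with
# anti-correlated variables the cross-variogram comes out negative.
poro = np.array([0.10, 0.20, 0.12, 0.25, 0.15])   # hypothetical porosity
vshale = 0.3 - poro                               # hypothetical shale fraction
gamma12 = cross_variogram(poro, vshale, lag=1)    # negative value
```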

A direct application of this multivariate technique is the estimation of a main variable informed at a few control points, guided by the knowledge of a densely sampled auxiliary variable. Depending on the nature of the link between the two variables, specific methods have been developed, ranging from the external drift algorithm (for a weak dependency, where the auxiliary variable only provides the global shape) to collocated cokriging (when the link is established in the model).


Estimation of a variable in a monovariate case (Top) or using an auxiliary variable (Bottom) Model (Left) – Estimation map by (co-)kriging (Right)

This estimation technique carries a smoothing property inherent to its optimization procedure, although still controlled by the model. In parallel, it provides a measure of the accuracy of the result through the variance of the estimation error.

Simulations

Rather than this smoothed estimation, one may instead be interested in simulations, which reproduce the variability of the variable of interest as described by the variogram model. This technique produces several plausible scenarios (or outcomes), each of them compatible with the data, hence the name conditional simulations.
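One classic recipe for conditioning (a sketch, not necessarily the implementation referred to here) adds to an unconditional field the kriged difference between the real data and the simulated values at the data locations; kriging being exact, the result honors the data. The dense Cholesky factorization below is only for this small 1-D illustration:

```python
import numpy as np

def cov(h, sill=1.0, rng=30.0):
    """Exponential covariance (sill minus the variogram)."""
    return sill * np.exp(-3.0 * np.abs(h) / rng)

def simple_krige(xd, zd, xt):
    """Simple kriging (zero mean) of values zd at xd, evaluated at xt."""
    C = cov(xd[:, None] - xd[None, :])
    c0 = cov(xd[:, None] - xt[None, :])
    return np.linalg.solve(C, c0).T @ zd

# Hypothetical conditioning data (already in gaussian scale) and target grid.
x_grid = np.arange(0.0, 100.0)
x_data = np.array([10.5, 55.3, 80.2])
z_data = np.array([0.2, 1.1, -0.4])

# 1) Unconditional simulation on grid and data locations together
#    (dense Cholesky: fine for a sketch, not for a full 3-D model).
g = np.random.default_rng(0)
x_all = np.concatenate([x_grid, x_data])
C_all = cov(x_all[:, None] - x_all[None, :]) + 1e-10 * np.eye(x_all.size)
sim_all = np.linalg.cholesky(C_all) @ g.standard_normal(x_all.size)
sim_grid, sim_at_data = sim_all[:x_grid.size], sim_all[x_grid.size:]

# 2) Conditioning step: krige the mismatch at the wells onto the grid.
z_cond = sim_grid + simple_krige(x_data, z_data - sim_at_data, x_grid)
```

Repeating the two steps with different random seeds yields the different plausible outcomes, all agreeing at the wells.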


Nowadays the emphasis is placed on the different simulation techniques, their performance and ability to reproduce the correct variability.

Gaussian simulations

The traditional simulation consists in generating a gaussian random function (GRF) that obeys the variogram model. Several techniques have been implemented, leading to different compromises between the quality of the results and the computing time. Let us name, among others, the Turning Bands algorithm (which stands as our favorite choice), the Sequential Gaussian algorithm (related to a nice kriging property), and the Spectral algorithm (using an FFT tool).

Simulation outcomes according to the variogram model: hole effect (top), exponential (bottom) Model (left) – Random function (middle) – Random set (right)
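The spectral idea can be sketched in 1-D via circulant embedding, one possible FFT-based construction (the turning bands and sequential algorithms named above work differently); the exponential model and parameter values are illustrative:

```python
import numpy as np

def spectral_sim(n, dx=1.0, sill=1.0, rng=20.0, seed=0):
    """Unconditional 1-D gaussian simulation by circulant embedding.

    The exponential covariance is wrapped onto a circle of 2*n points; the
    eigenvalues of the resulting circulant matrix are given by an FFT, and
    a realization is synthesized from independent complex gaussian weights."""
    m = 2 * n                                   # embedding size
    lags = np.minimum(np.arange(m), m - np.arange(m)) * dx
    c = sill * np.exp(-3.0 * lags / rng)        # wrapped covariance sequence
    lam = np.maximum(np.fft.fft(c).real, 0.0)   # eigenvalues (clip roundoff)
    g = np.random.default_rng(seed)
    eps = g.standard_normal(m) + 1j * g.standard_normal(m)
    field = np.sqrt(m) * np.fft.ifft(np.sqrt(lam) * eps).real
    return field[:n]                            # keep the first n nodes

z = spectral_sim(256)    # one outcome; change `seed` for another scenario
```

Each call costs a pair of FFTs, which is what makes the spectral approach attractive on large regular grids.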

This simulation method has the drawback of generating gaussian outcomes. Therefore, in order to reproduce the information at the conditioning points, we first need to convert the data from the raw to the gaussian space, and finally to transform the conditional simulations from the gaussian scale back to the raw scale. This operation involves an invertible function called the gaussian transform.


Experimental cumulative distribution function (violet) fitted using the gaussian transform (in blue: thick for the valid portion; thin for the extrapolation)
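The gaussian transform can be sketched as a quantile-matching (normal-score) table between the raw and standard gaussian scales; the plotting-position formula below is one common convention, and the porosity values are hypothetical:

```python
import numpy as np
from scipy.stats import norm

def normal_score(raw):
    """Gaussian transform (normal scores): each raw value is replaced by
    the standard-normal quantile of its empirical cumulative frequency."""
    raw = np.asarray(raw, dtype=float)
    ranks = np.empty(raw.size)
    ranks[np.argsort(raw)] = np.arange(1, raw.size + 1)
    freq = (ranks - 0.5) / raw.size       # plotting-position convention
    return norm.ppf(freq)

def back_transform(gauss, raw):
    """Back-transform by interpolating the quantile/quantile table built
    from the data (the tails would need an extrapolation model)."""
    return np.interp(gauss, np.sort(normal_score(raw)), np.sort(raw))

poro = np.array([0.12, 0.18, 0.07, 0.25, 0.15])   # hypothetical porosities
scores = normal_score(poro)               # symmetric; the median maps to 0
restored = back_transform(scores, poro)   # recovers the raw data
```

The inside-the-data portion of this table is the "valid portion" of the fitted curve; how the tails are extrapolated is a modeling choice.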

This simulation method has been applied repeatedly for populating homogeneous reservoirs with various petrophysical properties. But it fails to reproduce the case where a property undergoes abrupt changes in value over short distances, as we move from one lithofacies to another in heterogeneous reservoirs. To that end, we developed a new technique for simulating this non-numeric lithofacies variable first (we say categorical). Afterwards, petrophysical characteristics are attributed to each lithofacies: they can be considered as constant, randomized, or even estimated or simulated.

Gaussian transformed techniques

The technique for simulating a categorical variable can be considered as an extension of the gaussian simulation technique. It simply consists in simulating one or two GRFs (controlled by their variograms) and converting the results into lithofacies by thresholding the GRFs. This method is referred to as the Truncated Plurigaussian method.

This conversion is controlled by an additional parameter: the proportions of each lithofacies. This function usually describes the variations of the proportions along the vertical axis and is represented by a vertical proportion curve.

After the initial horizontalisation, this function describes the sedimentary sequences. It may also account for lateral variations, and must then be described by a proportion matrix.


Construction of a vertical proportion curve from information along wells
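The construction of the curve amounts to counting facies proportions level by level across the flattened wells; a sketch with hypothetical facies codes:

```python
import numpy as np

def vertical_proportion_curve(wells, n_facies):
    """Vertical proportion curve: for every (flattened) level, the fraction
    of well samples falling in each lithofacies.

    `wells` is a (n_wells, n_levels) integer array of facies codes, already
    brought into the depositional framework so that levels are comparable."""
    return np.stack([(wells == k).mean(axis=0) for k in range(n_facies)],
                    axis=1)

# Three hypothetical wells, four levels; facies 0 = shale, 1 = sand.
wells = np.array([[0, 0, 1, 1],
                  [0, 1, 1, 1],
                  [0, 0, 0, 1]])
vpc = vertical_proportion_curve(wells, 2)
# vpc[0] -> [1.0, 0.0] (all shale at the base level); vpc[3] -> [0.0, 1.0]
```

A proportion matrix is the same computation repeated per lateral cell instead of pooling all wells together.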

An additional input parameter controls the vicinity of the different lithofacies and is summarized by the truncation scheme. The lithofacies assignment depends on the value of the first GRF (along the horizontal axis) and of the second GRF (along the vertical axis). The thresholds (expressed in the gaussian scale) which mark the change of lithofacies are calculated according to the proportions.

Plurigaussian truncation scheme

Usually the two GRFs are considered as independent, but the model can be enriched in assuming their correlation.
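The thresholding step can be sketched for a three-facies bigaussian scheme (grey where G1 < s; black where G1 > s and G2 < t; white otherwise), with thresholds derived from the target proportions as gaussian quantiles; the white-noise GRFs below are for illustration only, since in practice each GRF carries its own variogram model:

```python
import numpy as np
from scipy.stats import norm

def truncate_bigaussian(g1, g2, p_grey, p_black):
    """Bigaussian truncation with two independent GRFs:
         0 (grey)  where G1 <  s
         1 (black) where G1 >= s and G2 <  t
         2 (white) where G1 >= s and G2 >= t
    The thresholds are gaussian quantiles chosen to honor the target
    proportions (black takes its share of the 1 - p_grey slab)."""
    s = norm.ppf(p_grey)
    t = norm.ppf(p_black / (1.0 - p_grey))
    return np.where(g1 < s, 0, np.where(g2 < t, 1, 2))

g = np.random.default_rng(0)
g1, g2 = g.standard_normal(100_000), g.standard_normal(100_000)
facies = truncate_bigaussian(g1, g2, p_grey=0.5, p_black=0.3)
# the empirical proportions come out close to 0.5 / 0.3 / 0.2
```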

Bigaussian truncation scheme: grey where G1 < s; black where G1 > s and G2 < t; white where G1 > s and G2 > t (threshold s on the G1 axis, t on the G2 axis)


Several transition schemes:

a) Strict sequential order: prograding sequence (thresholding of a single GRF)

b) Sequential order with structural constraint: draping of mud mounds (two correlated GRFs)

c) Two successive independent episodes: weathering of a sequence limit (independent GRFs)

The real difficulty of this method lies in the conditioning technique, as the conditioning data need to be converted into gaussian values that are consistent with the lithofacies information and still compatible with the models of the two GRFs.

The previous simulation technique, based on the transcription of underlying gaussian random functions, is not meant to generate individualized bodies (such as channels, crevasse splays …). Such methods are sometimes classified as pixel-based simulations, in opposition to object-based ones.

Object based simulations

This technique relies on the basic process of drawing objects at random, according to a Poisson process, within the simulated area. The shape, dimensions and orientation of the objects are defined by the user based on geological knowledge. Their number is ruled by the intensity of the Poisson process.


The most famous technique is the Boolean model, which serves to generate a random set (only two lithofacies). Once the characteristics of the objects are defined, the intensity of the Poisson process can be deduced from the lithofacies proportion.

Relationship between the proportion (left) and the intensity of the Poisson process which governs the number of channels in a given level of the 3D simulation (right)
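For a stationary Boolean model, the proportion p, the Poisson intensity theta and the mean object area a are linked by p = 1 - exp(-theta * a), so theta can be deduced from the target proportion. A sketch with fixed-radius disks (edge effects ignored; all names and parameter values are illustrative):

```python
import numpy as np

def boolean_intensity(p, mean_area):
    """Poisson intensity theta reaching a target proportion p for a Boolean
    model of objects with mean area `mean_area`: p = 1 - exp(-theta * a)."""
    return -np.log(1.0 - p) / mean_area

def boolean_simulation(p, radius=5.0, size=200, seed=0):
    """Boolean random set of fixed-radius disks on a size x size window
    (germs outside the window are ignored, so edges are slightly starved)."""
    g = np.random.default_rng(seed)
    theta = boolean_intensity(p, np.pi * radius**2)
    n = g.poisson(theta * size * size)            # number of objects (germs)
    cx, cy = g.uniform(0, size, n), g.uniform(0, size, n)
    yy, xx = np.mgrid[0:size, 0:size]
    cover = np.zeros((size, size), dtype=bool)
    for x0, y0 in zip(cx, cy):                    # union of the objects
        cover |= (xx - x0)**2 + (yy - y0)**2 <= radius**2
    return cover

sand = boolean_simulation(p=0.3)
# sand.mean() fluctuates around the target proportion 0.3
```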

A subtle algorithm has been designed in order to allow the conditioning to well information.

The method has been improved for simulating several lithofacies and is currently extended to handle relationships between the different families of objects.

Boolean simulation with two families of objects
