A CROSS SECTION THEOREM. II
PETER GRAY

Several applications of a one-parameter stochastic process $(X_t)_{t\in\mathbb{R}_+}$ feature the predictable projection ${}^pX_t$ and the dual predictable projection $X_t^p$ of $X_t$. Certain results surrounding these projections are derived using the Cross Section Theorem.

(For a statement of the Cross Section Theorem, see [D-M, p. 137].) The treatment of a two-parameter stochastic process $(X_{s,t})_{s,t\in\mathbb{R}_+}$ is similar: first a cross section theorem is established, after which it is used to derive uniqueness results about the projections of $X_{s,t}$. (For other results pertaining to two-parameter processes, see [C-W], [Dz], [L], and [W].) In our previous paper [G] we presented a cross section theorem based on the double filtration $\mathcal{F}_{s,t} = \mathcal{F}_{s-}$ for $s, t \ge 0$. The theorem states that for any predictable subset $A \subset \Omega\times\mathbb{R}_+^2$ and for any $\varepsilon > 0$ there exists a predictable stopping time $Z : \Omega \to [0,\infty]\times[0,\infty]$ for the double filtration $\mathcal{F}_{s,t} = \mathcal{F}_{s-}$ such that $[Z] \subset A$ and $P(\pi[A]) - P(\pi[\,[Z]\,]) \le \varepsilon$, where $[Z]$ denotes the graph of $Z$, $\pi$ is the projection onto the set $\Omega$, and $P$ the probability measure on $\Omega$.

In this paper we use this result to derive a cross section theorem for the more general double filtration $\mathcal{F}_{s,t} = \mathcal{F}_{s\vee t-}$ with $s, t \ge 0$. Under this double filtration, desirably, information from the second time parameter $t$ is not lost. We will prove that for any predictable subset $A \subset \Omega\times\mathbb{R}_+^2$ and for any $\varepsilon > 0$ there exists a predictable stopping time $Z : \Omega \to [0,\infty]\times[0,\infty]$ for the double filtration $\mathcal{F}_{s,t} = \mathcal{F}_{s\vee t-}$ such that $[Z] \subset A$ and $P(\pi[A]) - P(\pi[\,[Z]\,]) \le \varepsilon$.

AMS 2000 Subject Classification: Primary 60G20; Secondary 60G40.

Key words: two-parameter stochastic process, cross section.

1. PRELIMINARIES

The framework for this paper consists of

• a probability space $(\Omega, \mathcal{F}, P)$;

• a filtration $(\mathcal{F}_t)_{t\in\mathbb{R}_+}$ of sub-$\sigma$-algebras of $\mathcal{F}$ satisfying the usual conditions, and such that for the filtration $(\mathcal{G}_s)_{s\in\mathbb{R}_+}$ defined by $\mathcal{G}_s = \mathcal{F}_{s-}$ for every $s\in\mathbb{R}_+$ we have $\mathcal{G}_S = \mathcal{F}_{S-}$ for every predictable stopping time $S$ for $(\mathcal{G}_s)_{s\in\mathbb{R}_+}$; and

• a double filtration $(\mathcal{F}''_{s,t})_{s,t\in\mathbb{R}_+}$ such that $\mathcal{F}''_{s,t} = \mathcal{G}_{s\vee t}$ for every $s, t\in\mathbb{R}_+$. (See [D, p. 323] for more on double filtrations.)

Rev. Roumaine Math. Pures Appl., 54 (2009), 1, 25–32


2. THE PREDICTABLE $\sigma$-ALGEBRA $\wp''$

In this section we characterize the $\sigma$-algebra $\wp''$ of subsets of $\Omega\times\mathbb{R}_+^2$ in terms of $\sigma$-algebras which have already been studied. This will enable us in ensuing sections to readily provide results pertaining to $\wp''$-measurable subsets $A$ of the set $\Omega\times\mathbb{R}_+^2$.

Definition 2.1. Let $X : \Omega\times\mathbb{R}_+^2 \to \mathbb{R}$ be a two-parameter process.

2.1a $X$ is left continuous if for each $s_0, t_0 \in \mathbb{R}_+$ and $\omega\in\Omega$ we have
$$\lim_{\substack{(s,t)\to(s_0,t_0)\\ s\le s_0,\ t\le t_0}} X_{s,t}(\omega) = X_{s_0,t_0}(\omega).$$

2.1b $X$ is adapted to the filtration $(\mathcal{F}''_{s,t})$ if for each $s, t\in\mathbb{R}_+$ the function $X_{s,t}$ is $\mathcal{F}''_{s,t}$-measurable.

2.1c The predictable $\sigma$-algebra of subsets of $\Omega\times\mathbb{R}_+^2$ is the $\sigma$-algebra generated by left continuous, two-parameter processes $X$ which are adapted to $(\mathcal{F}''_{s,t})$. We denote this $\sigma$-algebra by $\wp''$.

Remark. Let $B : \Omega\times\mathbb{R}_+ \to \mathbb{R}$ be a left continuous, one-parameter process that is adapted to the filtration $(\mathcal{F}_t)_{t\in\mathbb{R}_+}$ (for example, Brownian motion; see [O, p. 72]). Then the two-parameter process $X : \Omega\times\mathbb{R}_+^2 \to \mathbb{R}$ given by $X_{s,t}(\omega) = B_s(\omega)B_t(\omega)$ for every $(\omega, s, t)\in\Omega\times\mathbb{R}_+^2$ is $\wp''$-measurable.
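As a quick illustration of this remark, the product construction can be checked pointwise. The path used below is a hypothetical left continuous stand-in for $B$ at one fixed $\omega$, not the Brownian path of the remark; it is a sketch only.

```python
# Illustrative sketch: a two-parameter process X_{s,t} = B_s * B_t built
# from a one-parameter path. B below is a hypothetical sample path (any
# left continuous function of t would do).
def B(t):
    return min(t, 2.0)  # continuous, hence left continuous

def X(s, t):
    return B(s) * B(t)  # the product process of the remark

# X is determined coordinatewise by B
assert X(1.0, 3.0) == B(1.0) * B(3.0)
assert X(0.0, 5.0) == 0.0  # B_0 = 0 kills the product
```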

Definition 2.2. The $\sigma$-algebra $\wp$ of subsets of $\Omega\times\mathbb{R}_+^2$ is the $\sigma$-algebra generated by left continuous, two-parameter processes $X$ which are adapted to the double filtration $(\mathcal{F}_{s,t})_{s,t\in\mathbb{R}_+}$ given by $\mathcal{F}_{s,t} = \mathcal{G}_s$ for every $s, t\in\mathbb{R}_+$. The $\sigma$-algebra $\wp'$ of subsets of $\Omega\times\mathbb{R}_+^2$ is the $\sigma$-algebra generated by left continuous, two-parameter processes $X$ which are adapted to the double filtration $(\mathcal{F}'_{s,t})_{s,t\in\mathbb{R}_+}$ given by $\mathcal{F}'_{s,t} = \mathcal{G}_t$ for every $s, t\in\mathbb{R}_+$.

Remark. For results surrounding the $\sigma$-algebra $\wp$ (and by analogy, $\wp'$), see [G].

Notation 2.3. Let $U$ denote the set $\Omega\times\{(s,t)\in\mathbb{R}_+^2 \mid s\le t\}$, let $L$ denote the set $\Omega\times\{(s,t)\in\mathbb{R}_+^2 \mid s>t\}$, and for each $n\in\mathbb{N}$ let $U_n$ denote the set $\Omega\times\{(s,t)\in\mathbb{R}_+^2 \mid s=t=0 \text{ or } (1-\frac{1}{n})s<t\}$.
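The role of the sets $U_n$ is that their indicators converge pointwise to the indicator of $U$ (the sets decrease to $U$), a fact used repeatedly below. A small numeric sketch, with a large fixed $n$ standing in for the limit and sample points chosen away from the boundary case $(1-\frac{1}{n})s\approx t$, illustrates this:

```python
# Pointwise limit 1_{U_n} -> 1_U: membership tests for the fiber sets
# {(s,t): s <= t} and {(s,t): s = t = 0 or (1 - 1/n) s < t}.

def in_U(s, t):
    return s <= t

def in_Un(s, t, n):
    return (s == 0 and t == 0) or (1 - 1 / n) * s < t

N = 10**9  # a large n as a stand-in for the limit n -> infinity
samples = [(0, 0), (0, 1), (1, 2), (2, 2), (2, 1), (0.5, 0.4), (3, 0)]
for s, t in samples:
    assert in_Un(s, t, N) == in_U(s, t)

# the sets decrease: membership in U_{n+1} implies membership in U_n
for s, t in samples:
    for n in range(1, 6):
        assert (not in_Un(s, t, n + 1)) or in_Un(s, t, n)
```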

Notation 2.4. Let $\mathcal{A}$ and $\mathcal{B}$ be collections of sets. We denote by $\mathcal{A}\vee\mathcal{B}$ the collection $\{A\cup B \mid A\in\mathcal{A} \text{ and } B\in\mathcal{B}\}$.

Theorem 2.5. We have $\wp'' = (\wp\cap L)\vee(\wp'\cap U)$.

Proof. Set $\mathcal{Q} := (\wp\cap L)\vee(\wp'\cap U)$. First, we show that $\mathcal{Q}\subset\wp''$. The generators of the $\sigma$-algebra $\wp\cap L$ are the processes of the form $X1_L$, where $X : \Omega\times\mathbb{R}_+^2 \to \mathbb{R}$ is left continuous and adapted to the double filtration $(\mathcal{F}_{s,t})_{s,t\in\mathbb{R}_+}$. Each such generator $X1_L$ is left continuous and adapted to the double filtration $(\mathcal{F}''_{s,t})_{s,t\in\mathbb{R}_+}$. In fact, $X$ and $1_L$ are both left continuous, and for any $s, t\in\mathbb{R}_+$ we have
$$X_{s,t},\ (1_L)_{s,t} \in \mathcal{G}_s \subset \mathcal{G}_{s\vee t} = \mathcal{F}''_{s,t}.$$
We conclude that $\wp\cap L\subset\wp''$.

Next, the generators of the $\sigma$-algebra $\wp'\cap U$ are processes of the form $Y1_U$, where $Y : \Omega\times\mathbb{R}_+^2 \to \mathbb{R}$ is left continuous and adapted to the double filtration $(\mathcal{F}'_{s,t})_{s,t\in\mathbb{R}_+}$. Let the process $Y1_U$ be such a generator. We have
$$Y1_U = \lim_n Y1_{U_n},$$
and for each index $n$ we have $Y1_{U_n}\in\wp''$, because both $Y$ and $1_{U_n}$ are left continuous, and for any $s, t\in\mathbb{R}_+$ we have
$$Y_{s,t},\ (1_{U_n})_{s,t} \in \mathcal{G}_t \subset \mathcal{G}_{s\vee t} = \mathcal{F}''_{s,t}.$$
We conclude that the process $Y1_U$ is $\wp''$-measurable. Therefore, we have $\wp'\cap U\subset\wp''$. It follows that $\mathcal{Q}\subset\wp''$.

Lastly, we show that $\wp''\subset\mathcal{Q}$. Let $X : \Omega\times\mathbb{R}_+^2 \to \mathbb{R}$ be a generator of $\wp''$; that is, let the process $X$ be left continuous and adapted to the double filtration $(\mathcal{F}''_{s,t})_{s,t\in\mathbb{R}_+}$. Note that the process $X1_L$ is $\wp\cap L$-measurable. In fact, $X$ and $1_L$ are both left continuous, and so $X1_L$ is left continuous. Further, for any $s, t\in\mathbb{R}_+$ we have
$$(X1_L)_{s,t} = \begin{cases} 0 \in \mathcal{F}_{s,t} & \text{if } s\le t,\\ X_{s,t} \in \mathcal{F}''_{s,t} = \mathcal{G}_{s\vee t} = \mathcal{G}_s = \mathcal{F}_{s,t} & \text{if } s>t. \end{cases}$$
So, $X1_L\in\wp$. Then we have $(X1_L)1_L\in\wp\cap L$; that is, $X1_L$ is $\wp\cap L$-measurable. Further, the process $X1_U$ is $\wp'\cap U$-measurable. In fact, let $Y : \Omega\times\mathbb{R}_+^2 \to \mathbb{R}$ be the process given by
$$Y_{s,t}(\omega) = \begin{cases} X_{s,t}(\omega) & \text{if } \omega\in\Omega \text{ and } 0\le s\le t,\\ X_{t,t}(\omega) & \text{if } \omega\in\Omega \text{ and } s>t. \end{cases}$$
Then $Y$ is left continuous, and for each $s, t\in\mathbb{R}_+$ the random variable $Y_{s,t}$ is $\mathcal{G}_t$-measurable. Hence $Y$ is $\wp'$-measurable, and so the process $X1_U = Y1_U$ is $\wp'\cap U$-measurable. We conclude that the process $X = X1_L + X1_U$ is $\mathcal{Q}$-measurable. It follows that $\wp''\subset\mathcal{Q}$.
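The pivot of the last step is that the $\wp'$-measurable process $Y$ coincides with $X$ on $U$, so that $X1_U = Y1_U$. A minimal sketch, with arbitrary hypothetical sample values at one fixed $\omega$, checks this identity pointwise:

```python
# Check X * 1_U == Y * 1_U, where Y_{s,t} = X_{s,t} if s <= t, else X_{t,t}.

def make_Y(X):
    def Y(s, t):
        return X(s, t) if s <= t else X(t, t)
    return Y

X = lambda s, t: s + 2 * t + 1  # arbitrary sample values at one fixed omega
Y = make_Y(X)

for s, t in [(0, 0), (0, 1), (2, 2), (3, 1), (0.5, 4), (5, 0)]:
    ind_U = 1 if s <= t else 0  # indicator of the fiber of U
    assert X(s, t) * ind_U == Y(s, t) * ind_U
```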

Corollary 2.6. Let $X : \Omega\times\mathbb{R}_+^2 \to \mathbb{R}$ be a two-parameter process. Then $X$ is $\wp''$-measurable iff $X1_L$ is $\wp$-measurable and $X1_U$ is $\wp'$-measurable.


Proof. Assume first that $X$ is $\wp''$-measurable. Then the process $X1_L$ is $\wp''\cap L$-measurable. But we have
$$\wp''\cap L = ((\wp\cap L)\vee(\wp'\cap U))\cap L \qquad\text{by Theorem 2.5}$$
$$= \wp\cap L \qquad\text{since } U\cap L=\emptyset$$
$$\subset \wp \qquad\text{since } L\in\wp.$$
Hence the process $X1_L$ is $\wp$-measurable. Also, the process $X1_U$ is $\wp''\cap U$-measurable. But we have
$$\wp''\cap U = ((\wp\cap L)\vee(\wp'\cap U))\cap U = \wp'\cap U \subset \wp' \qquad\text{since } U\in\wp'$$
(because $1_U = \lim_n 1_{U_n}$, and each process $1_{U_n}$ is $\wp'$-measurable). Therefore, the process $X1_U$ is $\wp'$-measurable. Next, assume that $X1_L$ is $\wp$-measurable and $X1_U$ is $\wp'$-measurable. Then the process $X1_L = (X1_L)1_L$ is $\wp\cap L$-measurable, and the process $X1_U = (X1_U)1_U$ is $\wp'\cap U$-measurable. Hence the process $X1_L + X1_U$ is $(\wp\cap L)\vee(\wp'\cap U)$-measurable; that is, the process $X$ is $\wp''$-measurable.

3. STOPPING TIMES

In this section we define what is meant by a stopping time and a predictable stopping time, for each of the three double filtrations of interest.

Definition 2.7. Let $Z : \Omega \to [0,\infty]\times[0,\infty]$ be a function.

2.7a We say that $Z$ is a stopping time for the filtration $(\mathcal{F}''_{s,t})_{s,t\in\mathbb{R}_+}$ (respectively $(\mathcal{F}_{s,t})_{s,t\in\mathbb{R}_+}$, $(\mathcal{F}'_{s,t})_{s,t\in\mathbb{R}_+}$) if the set $\{Z\le z\}$ is an element of $\mathcal{F}''_z$ (respectively $\mathcal{F}_z$, $\mathcal{F}'_z$) for every $z\in\mathbb{R}_+^2$.

2.7b Let $Z$ be a stopping time for $(\mathcal{F}_{s,t})_{s,t\in\mathbb{R}_+}$. We define the $\sigma$-algebra $\mathcal{F}_Z\subset\mathcal{F}$ by $\mathcal{F}_Z = \{A\in\mathcal{F} \mid A\cap\{Z\le z\}\in\mathcal{F}_z\ \forall z\in\mathbb{R}_+^2\}$. Let $Z$ be a stopping time for $(\mathcal{F}'_{s,t})_{s,t\in\mathbb{R}_+}$. We define the $\sigma$-algebra $\mathcal{F}'_Z\subset\mathcal{F}$ by $\mathcal{F}'_Z = \{A\in\mathcal{F} \mid A\cap\{Z\le z\}\in\mathcal{F}'_z\ \forall z\in\mathbb{R}_+^2\}$. Let $Z$ be a stopping time for $(\mathcal{F}''_{s,t})_{s,t\in\mathbb{R}_+}$. We define the $\sigma$-algebra $\mathcal{F}''_Z\subset\mathcal{F}$ by $\mathcal{F}''_Z = \{A\in\mathcal{F} \mid A\cap\{Z\le z\}\in\mathcal{F}''_z\ \forall z\in\mathbb{R}_+^2\}$.

2.7c We say that the stopping time $Z = (S, T)$ for $(\mathcal{F}_{s,t})_{s,t\in\mathbb{R}_+}$ is a predictable stopping time for $(\mathcal{F}_{s,t})_{s,t\in\mathbb{R}_+}$ if

• $S$ is a predictable stopping time for the filtration $(\mathcal{G}_s)_{s\in\mathbb{R}_+}$, and

• $\{S<\infty\}\subset\{T<\infty\}$.

Note, for any predictable stopping time $Z = (S, T)$ for $(\mathcal{F}_{s,t})_{s,t\in\mathbb{R}_+}$ we have $\mathcal{F}_Z = \mathcal{F}_{S-}$.

2.7d We say that the stopping time $Z = (S, T)$ for $(\mathcal{F}'_{s,t})_{s,t\in\mathbb{R}_+}$ is a predictable stopping time for $(\mathcal{F}'_{s,t})_{s,t\in\mathbb{R}_+}$ if

• $T$ is a predictable stopping time for the filtration $(\mathcal{G}_s)_{s\in\mathbb{R}_+}$, and

• $\{T<\infty\}\subset\{S<\infty\}$.

Note, for any predictable stopping time $Z = (S, T)$ for $(\mathcal{F}'_{s,t})_{s,t\in\mathbb{R}_+}$ we have $\mathcal{F}'_Z = \mathcal{F}_{T-}$.

2.7e We say that the stopping time $Z = (S, T)$ for $(\mathcal{F}''_{s,t})_{s,t\in\mathbb{R}_+}$ is a predictable stopping time for $(\mathcal{F}''_{s,t})_{s,t\in\mathbb{R}_+}$ if the following three conditions hold.

• $Z_{\{S>T\}} := (S_{\{S>T\}}, T_{\{S>T\}})$ is a predictable stopping time for $(\mathcal{F}_{s,t})_{s,t\in\mathbb{R}_+}$.

• $Z_{\{S\le T\}} := (S_{\{S\le T\}}, T_{\{S\le T\}})$ is a predictable stopping time for $(\mathcal{F}'_{s,t})_{s,t\in\mathbb{R}_+}$.

• The supremum $S\vee T$ of $S$ and $T$ is a predictable stopping time for $(\mathcal{G}_s)_{s\in\mathbb{R}_+}$.

Note, it can be shown that for any predictable stopping time $Z = (S, T)$ for $(\mathcal{F}''_{s,t})_{s,t\in\mathbb{R}_+}$ we have $\mathcal{F}''_Z = \mathcal{G}_{S\vee T}$. Accordingly, for any predictable stopping time $Z = (S, T)$ for $(\mathcal{F}''_{s,t})_{s,t\in\mathbb{R}_+}$ we have $\mathcal{F}''_Z = \mathcal{F}_{S\vee T-}$.

Here are three examples of predictable stopping times for $(\mathcal{F}''_{s,t})_{s,t\in\mathbb{R}_+}$.

• $Z = (S, T)$ such that $S, T$ are predictable stopping times for $(\mathcal{G}_s)_{s\in\mathbb{R}_+}$. In fact, the stopping time $S_{\{S>T\}}$ for $(\mathcal{G}_s)_{s\in\mathbb{R}_+}$ is predictable because $\{S>T\}\in\mathcal{F}_{S-}$, and further we have
$$\{S_{\{S>T\}}<\infty\} = \{S<\infty\}\cap\{S>T\} = \{S<\infty\}\cap\{S>T\}\cap\{T<\infty\}$$
$$\subset \{T<\infty\}\cap\{S>T\} = \{T_{\{S>T\}}<\infty\},$$
thereby confirming that $Z_{\{S>T\}}$ is a predictable stopping time for $(\mathcal{F}_{s,t})_{s,t\in\mathbb{R}_+}$. Similarly, $Z_{\{S\le T\}}$ is a predictable stopping time for $(\mathcal{F}'_{s,t})_{s,t\in\mathbb{R}_+}$. Finally, note that $S\vee T$ is a predictable stopping time for $(\mathcal{G}_s)_{s\in\mathbb{R}_+}$ since both $S$ and $T$ are predictable stopping times for $(\mathcal{G}_s)_{s\in\mathbb{R}_+}$.

• $Z = (S, T)$ such that $Z$ is a predictable stopping time for $(\mathcal{F}_{s,t})_{s,t\in\mathbb{R}_+}$, and $S>T$ or $S=T=\infty$. In fact, we have $Z_{\{S>T\}} = Z$, which is a predictable stopping time for $(\mathcal{F}_{s,t})_{s,t\in\mathbb{R}_+}$; $Z_{\{S\le T\}} = (\infty,\infty)$, which is a predictable stopping time for $(\mathcal{F}'_{s,t})_{s,t\in\mathbb{R}_+}$; and $S\vee T = S$, which is a predictable stopping time for $(\mathcal{G}_s)_{s\in\mathbb{R}_+}$.

• $Z = (S, T)$ such that $Z$ is a predictable stopping time for $(\mathcal{F}'_{s,t})_{s,t\in\mathbb{R}_+}$, and $S\le T$.
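The inclusion $\{S_{\{S>T\}}<\infty\}\subset\{T_{\{S>T\}}<\infty\}$ verified in the first example above can be sanity-checked numerically, taking the usual convention that a restriction $S_A$ equals $S$ on $A$ and $\infty$ off $A$. The sample values below are hypothetical:

```python
import numpy as np

inf = np.inf
# hypothetical sample values of S, T over five outcomes omega
S = np.array([3.0, inf, 2.0, inf, 5.0])
T = np.array([1.0, 4.0, inf, inf, 7.0])

A = S > T                  # the event {S > T}
S_A = np.where(A, S, inf)  # restriction S_{S>T}
T_A = np.where(A, T, inf)  # restriction T_{S>T}

# {S_A < inf} is contained in {T_A < inf}: where S > T and S < inf,
# necessarily T < S < inf
assert np.all(~(S_A < inf) | (T_A < inf))
```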

4. THE CROSS SECTION THEOREM

We now present the main result of this paper, a Cross Section Theorem for $\wp''$-measurable subsets $A$ of $\Omega\times\mathbb{R}_+^2$.

Theorem 2.8. Let $A\subset\Omega\times\mathbb{R}_+^2$ be $\wp''$-measurable, and let $\varepsilon>0$. There exists a predictable stopping time $Z : \Omega \to [0,\infty]\times[0,\infty]$ for the double filtration $(\mathcal{F}''_{s,t})_{s,t\in\mathbb{R}_+}$ such that

2.8a $[Z]\subset A$ and

2.8b $P(\pi[A]) - P(\pi[\,[Z]\,])\le\varepsilon$.

Proof. By Corollary 2.6, the set $A\cap L$ is $\wp$-measurable, and the set $A\cap U$ is $\wp'$-measurable. By [G, Theorem 1.21], we have the following results.

• There is a predictable stopping time $Z_1 = (S_1, T_1)$ for the double filtration $(\mathcal{F}_{s,t})_{s,t\in\mathbb{R}_+}$ such that $[Z_1]\subset A\cap L$ and $P(\pi[A\cap L]) - P(\pi[\,[Z_1]\,])\le\frac{\varepsilon}{2}$.

• There is a predictable stopping time $Z_2 = (S_2, T_2)$ for the double filtration $(\mathcal{F}'_{s,t})_{s,t\in\mathbb{R}_+}$ such that $[Z_2]\subset A\cap U$ and $P(\pi[A\cap U]) - P(\pi[\,[Z_2]\,])\le\frac{\varepsilon}{2}$.

Define the function $Z : \Omega \to [0,\infty]\times[0,\infty]$ by
$$Z = Z_1 1_{\{T_2>S_1\}} + Z_2 1_{\{T_2\le S_1\}} = (S_1 1_{\{T_2>S_1\}} + S_2 1_{\{T_2\le S_1\}},\ T_1 1_{\{T_2>S_1\}} + T_2 1_{\{T_2\le S_1\}}) =: (S, T).$$

First, we show that $Z$ is a stopping time for $(\mathcal{F}''_{s,t})_{s,t\in\mathbb{R}_+}$. Let $z = (s,t)\in\mathbb{R}_+^2$. We must show that $\{Z\le z\}\in\mathcal{F}''_z$. We have
$$\{Z\le z\} = (\{Z_1\le z\}\cap\{T_2>S_1\}) \cup (\{Z_2\le z\}\cap\{T_2\le S_1\})$$
$$= (\{Z_1\le z\}\cap\{S_1\le s\}\cap\{T_2>S_1\}) \cup (\{Z_2\le z\}\cap\{T_2\le t\}\cap\{T_2\le S_1\})$$
$$= (\{Z_1\le z\}\cap\{S_1\le s\}\cap(\{T_2>s\}\cup(\{T_2\le s\}\cap\{T_2>S_1\})))$$
$$\cup\ (\{Z_2\le z\}\cap\{T_2\le t\}\cap(\{S_1>t\}\cup(\{S_1\le t\}\cap\{T_2\le S_1\}))) \in \mathcal{G}_{s\vee t} = \mathcal{F}''_z,$$
since $Z_1$ is a stopping time for $(\mathcal{F}_{s,t})_{s,t\in\mathbb{R}_+}$, $Z_2$ is a stopping time for $(\mathcal{F}'_{s,t})_{s,t\in\mathbb{R}_+}$, and $S_1, T_2$ are stopping times for $(\mathcal{G}_s)_{s\in\mathbb{R}_+}$.
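The rewritings of $\{T_2>S_1\}\cap\{S_1\le s\}$ and of $\{T_2\le S_1\}\cap\{T_2\le t\}$ used in the display above are pure set identities; they can be verified exhaustively, pointwise, on a small grid of hypothetical values of $S_1$ and $T_2$:

```python
import itertools

# Verify the two set identities behind the measurability argument, pointwise
# over a grid of sample values for S1 and T2 (infinity included).
vals = [0.0, 0.5, 1.0, 1.5, 2.0, float("inf")]
s, t = 1.0, 1.5  # a fixed z = (s, t)

for S1, T2 in itertools.product(vals, repeat=2):
    # {T2 > S1} & {S1 <= s} == {S1 <= s} & ({T2 > s} | ({T2 <= s} & {T2 > S1}))
    lhs1 = (T2 > S1) and (S1 <= s)
    rhs1 = (S1 <= s) and ((T2 > s) or (T2 <= s and T2 > S1))
    assert lhs1 == rhs1
    # {T2 <= S1} & {T2 <= t} == {T2 <= t} & ({S1 > t} | ({S1 <= t} & {T2 <= S1}))
    lhs2 = (T2 <= S1) and (T2 <= t)
    rhs2 = (T2 <= t) and ((S1 > t) or (S1 <= t and T2 <= S1))
    assert lhs2 == rhs2
```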

Next, we show that $Z$ is a predictable stopping time for $(\mathcal{F}''_{s,t})_{s,t\in\mathbb{R}_+}$. To do this, we must show the following:

2.8c The function $Z_{\{S>T\}}$ is a predictable stopping time for $(\mathcal{F}_{s,t})_{s,t\in\mathbb{R}_+}$.

2.8d The function $Z_{\{S\le T\}}$ is a predictable stopping time for $(\mathcal{F}'_{s,t})_{s,t\in\mathbb{R}_+}$.

2.8e The function $S\vee T$ is a predictable stopping time for $(\mathcal{G}_s)_{s\in\mathbb{R}_+}$.

First, we establish requirement 2.8c. We have

$$\{S>T\} = (\{S_1>T_1\}\cap\{T_2>S_1\}) \cup (\{S_2>T_2\}\cap\{T_2\le S_1\})$$
$$= (\{S_1<\infty\}\cap\{T_2>S_1\}) \cup (\emptyset\cap\{T_2\le S_1\}) \qquad\text{since } [Z_1]\subset L \text{ and } [Z_2]\subset U;$$
$$= \{T_2>S_1\}.$$
So, we have
$$Z_{\{S>T\}} = Z_{\{T_2>S_1\}} = (Z_1)_{\{T_2>S_1\}}.$$
We must show that

2.8f The function $(Z_1)_{\{T_2>S_1\}}$ is a stopping time for $(\mathcal{F}_{s,t})_{s,t\in\mathbb{R}_+}$.

2.8g The function $(S_1)_{\{T_2>S_1\}}$ is a predictable stopping time for $(\mathcal{G}_s)_{s\in\mathbb{R}_+}$, and

2.8h $\{(S_1)_{\{T_2>S_1\}}<\infty\} \subset \{(T_1)_{\{T_2>S_1\}}<\infty\}$.


To prove 2.8f, let $z = (s,t)\in\mathbb{R}_+^2$. We have
$$\{(Z_1)_{\{T_2>S_1\}}\le z\} = \{Z_1\le z\}\cap\{T_2>S_1\} = \{Z_1\le z\}\cap\{S_1\le s\}\cap\{T_2>S_1\}$$
$$= \{Z_1\le z\}\cap\{S_1\le s\}\cap(\{T_2>s\}\cup(\{T_2\le s\}\cap\{T_2>S_1\})) \in \mathcal{G}_s = \mathcal{F}_z.$$
Next, we verify that condition 2.8g is satisfied. Note, since both $S_1$ and $T_2$ are predictable stopping times for $(\mathcal{G}_s)_{s\in\mathbb{R}_+}$, the set $\{T_2>S_1\}$ is an element of $\mathcal{G}_{S_1}$ (see [D-M, p. 128]). Therefore, the function $(S_1)_{\{T_2>S_1\}}$ is a predictable stopping time for $(\mathcal{G}_s)_{s\in\mathbb{R}_+}$. Lastly, we show that condition 2.8h is met. We have
$$\{(S_1)_{\{T_2>S_1\}}<\infty\} = \{S_1<\infty\}\cap\{T_2>S_1\} \subset \{T_1<\infty\}\cap\{T_2>S_1\} = \{(T_1)_{\{T_2>S_1\}}<\infty\},$$
where the inclusion holds since $Z_1$ is a predictable stopping time for $(\mathcal{F}_{s,t})_{s,t\in\mathbb{R}_+}$, and so $\{S_1<\infty\}\subset\{T_1<\infty\}$.

This concludes our verification that requirement 2.8c holds true. Using a similar proof, we are able to establish requirement 2.8d.

Lastly, we establish requirement 2.8e. We have
$$S\vee T = (S_1\vee T_1)1_{\{T_2>S_1\}} + (S_2\vee T_2)1_{\{T_2\le S_1\}} = S_1 1_{\{T_2>S_1\}} + T_2 1_{\{T_2\le S_1\}}$$
since $S_1\ge T_1$ and $S_2\le T_2$, because $[Z_1]\subset L$ and $[Z_2]\subset U$;
$$= S_1\wedge T_2,$$
which is a predictable stopping time for $(\mathcal{G}_s)_{s\in\mathbb{R}_+}$, since both $S_1$ and $T_2$ are predictable stopping times for $(\mathcal{G}_s)_{s\in\mathbb{R}_+}$.
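The identity $S\vee T = S_1\wedge T_2$ established for requirement 2.8e can be checked on hypothetical sample values of $Z_1$ and $Z_2$ respecting the graph constraints ($S_1>T_1$ where finite, since $[Z_1]\subset L$; $S_2\le T_2$, since $[Z_2]\subset U$); this is an illustration only:

```python
import numpy as np

inf = np.inf
# hypothetical values over four outcomes; Z1 has S1 > T1 where finite
# (graph in L), Z2 has S2 <= T2 (graph in U)
S1 = np.array([4.0, inf, 6.0, inf]); T1 = np.array([2.0, inf, 1.0, inf])
S2 = np.array([1.0, 3.0, inf, inf]); T2 = np.array([5.0, 3.0, inf, inf])

on1 = T2 > S1              # take Z1 on {T2 > S1}, Z2 elsewhere
S = np.where(on1, S1, S2)
T = np.where(on1, T1, T2)

# the claim of requirement 2.8e: S v T = S1 ^ T2
assert np.array_equal(np.maximum(S, T), np.minimum(S1, T2))
```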

To complete the proof of the theorem, we must show that assertions 2.8a and 2.8b are true. We have $[Z]\subset[Z_1]\cup[Z_2]\subset A$, and so assertion 2.8a is true. Next, we verify that assertion 2.8b is true. As preparation, we first show the following:

2.8i $\{T_2>S_1\}\subset\pi[Z_1]\subset\pi[A]$.

2.8j $\{T_2<\infty\} = \pi[Z_2]\subset\pi[A]$.

2.8k $\{T_2=S_1=\infty\}\subset(\pi[Z_1]\cup\pi[Z_2])^c$.

Regarding 2.8i, we have
$$\{T_2>S_1\} \subset \{S_1<\infty\} = \{S_1<\infty\}\cap\{T_1<\infty\} \qquad\text{since } S_1>T_1 \text{ on } \Omega;$$
$$= \pi[Z_1] \subset \pi[A\cap L] \subset \pi[A].$$
Next, regarding 2.8j, we have
$$\{T_2<\infty\} = \{T_2<\infty\}\cap\{S_2<\infty\} \qquad\text{since } S_2\le T_2 \text{ on } \Omega;$$
$$= \pi[Z_2] \subset \pi[A].$$
Lastly, with respect to 2.8k, we have
$$\{T_2=S_1=\infty\} = \{T_2=\infty\}\cap\{S_1=\infty\} \subset (\pi[Z_2])^c\cap(\pi[Z_1])^c = (\pi[Z_1]\cup\pi[Z_2])^c.$$

We are now ready to prove assertion 2.8b. We have
$$P(\pi[A]) - P(\pi[Z]) = P(\pi[A]\setminus\pi[Z]) \qquad\text{since } \pi[Z]\subset\pi[A];$$
$$= P\Big( \big( (\pi[A]\cap\{T_2>S_1\}) \cup (\pi[A]\cap\{T_2\le S_1\}\cap\{T_2<\infty\}) \cup (\pi[A]\cap\{T_2\le S_1\}\cap\{T_2=\infty\}) \big)$$
$$\setminus \big( (\pi[Z_1]\cap\{T_2>S_1\}) \cup (\pi[Z_2]\cap\{T_2\le S_1\}) \big) \Big)$$
$$\le P\Big( \big( \pi[A]\cap\{T_2>S_1\} \setminus (\pi[Z_1]\cap\{T_2>S_1\}) \big) \cup \big( \pi[A]\cap\{T_2\le S_1\}\cap\{T_2<\infty\} \setminus (\pi[Z_2]\cap\{T_2\le S_1\}) \big)$$
$$\cup\ \big( \pi[A]\cap\{T_2\le S_1\}\cap\{T_2=\infty\} \big) \Big)$$
$$= P\Big( \big( \{T_2>S_1\}\setminus\{T_2>S_1\} \big) \cup \big( (\pi[A]\cap\{T_2<\infty\}\setminus\pi[Z_2])\cap\{T_2\le S_1\} \big) \cup \big( \pi[A]\cap\{T_2=S_1=\infty\} \big) \Big) \qquad\text{by 2.8i};$$
$$= P\Big( \emptyset \cup \big( \{T_2<\infty\}\setminus\{T_2<\infty\} \big) \cup \big( \pi[A]\cap\{T_2=S_1=\infty\} \big) \Big) \qquad\text{by 2.8j};$$
$$= P\Big( \emptyset \cup \emptyset \cup \big( \pi[A]\cap\{T_2=S_1=\infty\} \big) \Big) \le P\big( \pi[A]\cap(\pi[Z_1]\cup\pi[Z_2])^c \big) \qquad\text{by 2.8k};$$
$$= P\big( (\pi[A\cap L]\cup\pi[A\cap U]) \cap (\pi[Z_1]\cup\pi[Z_2])^c \big)$$
$$\le P\big( (\pi[A\cap L]\setminus\pi[Z_1]) \cup (\pi[A\cap U]\setminus\pi[Z_2]) \big)$$
$$\le P(\pi[A\cap L]\setminus\pi[Z_1]) + P(\pi[A\cap U]\setminus\pi[Z_2]) \le \frac{\varepsilon}{2} + \frac{\varepsilon}{2} = \varepsilon.$$

REFERENCES

[C-W] R. Cairoli and J. Walsh, Stochastic integration in the plane. Acta Math. 134 (1975), 111–183.

[D-M] C. Dellacherie and P.A. Meyer, Probabilités et Potentiel. Hermann, Paris, 1975–1980.

[D] N. Dinculeanu, Vector Integration and Stochastic Integration in Banach Spaces. Wiley, New York, 2000.

[Dz] M. Dozzi, On the decomposition and integration of two-parameter stochastic processes. In: Lecture Notes in Math. 863, pp. 162–171. Springer, Berlin–Heidelberg, 1980.

[G] P. Gray, A cross section theorem, I. Rev. Roumaine Math. Pures Appl. 53 (2008), 297–316.

[L] Ch. Lindsey, Two-Parameter Stochastic Processes with Finite Variation. PhD Dissertation, University of Florida, 1988.

[O] B. Oksendal, Stochastic Differential Equations. Springer, Berlin–Heidelberg, 2003.

[W] J. Walsh, Martingales with Multidimensional Parameter and Stochastic Integrals in the Plane. Cours de 3e Cycle, Univ. Paris VI, 1977.

Received 7 December 2007

University of Florida
College of Journalism and Communication
Gainesville, FL 32611, USA
pgray@ufl.edu
