
Stopping times

In the document Introduction to random processes (Page 71-75)

Stopping times are random times which play an important role in Markov process theory and martingale theory.

Definition 4.1. Let N̄ = N ∪ {+∞}. An N̄-valued random variable τ is an F-stopping time if {τ ≤ n} ∈ Fn for all n ∈ N.

From the definition above, notice that if τ is an F-stopping time, then {τ = ∞} = ⋂_{n∈N} {τ ≤ n}^c belongs to F.

When there is no ambiguity on the filtration F, we shall simply write stopping time instead of F-stopping time. It is clear that deterministic integer times are stopping times.

Example 4.2. For the simple random walk X = (Xn, n ∈ N), see Example 3.4, and F the natural filtration of X, it is easy to check that the return time to 0, T0 = inf{n ≥ 1; Xn = 0}, with the convention that inf ∅ = +∞, is a stopping time. It is also easy to check that T0 − 1 is not a stopping time: deciding whether T0 − 1 ≤ n requires knowing Xn+1. ♦
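As a quick computational illustration (the helper name and the simulation horizon are our own choices), the key point is that the event {T0 ≤ n} is decided by the path up to time n alone — a minimal sketch:

```python
import random

def first_return_to_zero(path):
    """Return inf{n >= 1 : path[n] == 0}, or None when the walk has not
    returned to 0 within the observed path (the empty-infimum case)."""
    for n in range(1, len(path)):
        if path[n] == 0:
            return n
    return None

# One trajectory of the simple random walk started at 0.
random.seed(0)
path = [0]
for _ in range(50):
    path.append(path[-1] + random.choice([-1, 1]))

T0 = first_return_to_zero(path)

# {T0 <= n} is decided by (X_0, ..., X_n) alone: recomputing the return
# time from the prefix of length n + 1 gives the same value on that event.
# (Deciding {T0 - 1 <= n} would instead require looking at X_{n+1}.)
for n in range(len(path)):
    if T0 is not None and T0 <= n:
        assert first_return_to_zero(path[: n + 1]) == T0
```

The loop checks exactly the stopping-time property of Definition 4.1 on one trajectory: whether T0 has already occurred by time n can be read off from the first n + 1 positions.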

In the next lemma, we give equivalent characterizations of stopping times.


Lemma 4.3. Let τ be an N̄-valued random variable.

(i) τ is a stopping time if and only if {τ > n} ∈ Fn for all n ∈ N.

(ii) τ is a stopping time if and only if {τ = n} ∈ Fn for all n ∈ N.

Proof. Use that {τ > n} = {τ ≤ n}^c to get (i). Use that {τ = n} = {τ ≤ n} ∩ {τ ≤ n−1}^c and that {τ ≤ n} = ⋃_{k=0}^{n} {τ = k} to get (ii).

We give in the following proposition some properties of stopping times.

Proposition 4.4. Let (τn, n ∈ N) be a sequence of stopping times. The random variables sup_{n∈N} τn, inf_{n∈N} τn, lim sup_{n→∞} τn and lim inf_{n→∞} τn are stopping times.

Proof. We have that {sup_{k∈N} τk ≤ n} = ⋂_{k∈N} {τk ≤ n}, which belongs to Fn for all n ∈ N as the τk are stopping times for k ∈ N. This proves that sup_{k∈N} τk is a stopping time. Similarly, use that {inf_{k∈N} τk ≤ n} = ⋃_{k∈N} {τk ≤ n} to deduce that inf_{k∈N} τk is a stopping time.

Since stopping times are N̄-valued random variables, we get that {lim sup_{k→∞} τk ≤ n} = ⋃_{m∈N} ⋂_{k≥m} {τk ≤ n} for n ∈ N. This last event belongs to Fn as the τk are stopping times for k ∈ N. We deduce that lim sup_{k→∞} τk is a stopping time. Similarly, use that {lim inf_{k→∞} τk ≤ n} = ⋂_{m∈N} ⋃_{k≥m} {τk ≤ n} for n ∈ N to deduce that lim inf_{k→∞} τk is a stopping time.

It is left to the reader to check that the σ-field Fτ in the next definition is indeed a σ-field and a subset of F.

Definition 4.5. Let τ be an F-stopping time. The σ-field Fτ of the events prior to the stopping time τ is defined by:

Fτ = {B ∈ F; B ∩ {τ = n} ∈ Fn for all n ∈ N̄},

with the convention F∞ = F. Clearly, we have that τ is Fτ-measurable.

Remark 4.6. Consider X = (Xn, n ∈ N) a Markov chain on a discrete state space E with its natural filtration F = (Fn, n ∈ N). Recall the return time to x ∈ E defined by Tx = inf{n ≥ 1; Xn = x} and the excursion Y1 = (Tx, X0, . . . , XTx) defined in Section 3.4.3. It is easy to check that Tx is an F-stopping time and that FTx is equal to σ(Y1). Roughly speaking, the σ-field FTx contains all the information on X prior to Tx. ♦

We give an elementary characterization of the Fτ-measurable random variables.

Lemma 4.7. Let Y be an F-measurable real-valued random variable and τ a stopping time.

(i) The random variable Y is Fτ-measurable if and only if Y 1{τ=n} is Fn-measurable for all n ∈ N̄.

(ii) If E[Y] is well defined, then we have that a.s.:

E[Y | Fτ] = ∑_{n∈N̄} 1{τ=n} E[Y | Fn]. (4.1)

Proof. We prove (i). Set Yn = Y 1{τ=n}. We first assume that Y is Fτ-measurable and we prove that Yn is Fn-measurable for all n ∈ N̄. If Y = 1B with B ∈ Fτ, we clearly get that Yn is Fn-measurable for all n ∈ N̄ by definition of Fτ. It is then easy to extend this result to any Fτ-measurable random variable which takes finitely many values in R, and then to any Fτ-measurable real-valued random variable Y by considering a sequence of random variables (Yk, k ∈ N) which converges to Y and such that Yk is Fτ-measurable and takes finitely many values in R (for example take Yk = 2^{−k} ⌊2^k Y⌋ 1{|Y|≤k} + Y 1{|Y|=+∞}).

We now assume that Yn is Fn-measurable for all n ∈ N̄ and we prove that Y is Fτ-measurable. Let A ∈ B(R) and set B = Y^{−1}(A). Notice that B belongs to F as Y is F-measurable. First assume that 0 ∉ A. In this case, we get B ∩ {τ = n} = Yn^{−1}(A) and thus B ∩ {τ = n} ∈ Fn for all n ∈ N̄. This gives B ∈ Fτ. If 0 ∈ A, then use that B = Y^{−1}(A) = (Y^{−1}(A^c))^c to also get that B ∈ Fτ. This implies that Y is Fτ-measurable. This ends the proof of (i).

We now prove (ii). Assume first that Y ≥ 0 and set:

Z = ∑_{n∈N̄} E[Y | Fn] 1{τ=n}.

Since Y is F-measurable, we also get that Y 1{τ=∞} is F-measurable. Thus, we deduce from (i) that Z is Fτ-measurable. For B ∈ Fτ, we have:

E[Z 1B] = ∑_{n∈N̄} E[ E[Y | Fn] 1_{B∩{τ=n}} ] = ∑_{n∈N̄} E[ Y 1_{B∩{τ=n}} ] = E[Y 1B],

where we used monotone convergence for the first equality; the fact that B ∩ {τ = n} belongs to Fn together with (2.1) for the second; and monotone convergence for the last. As Z is Fτ-measurable, we deduce from (2.1) that a.s. Z = E[Y | Fτ].

Then consider Y an F-measurable real-valued random variable. Subtracting (4.1) with Y replaced by Y− from (4.1) with Y replaced by Y+ gives that (4.1) holds as soon as E[Y] is well defined.

Definition 4.8. Let X = (Xn, n ∈ N̄) be an F-adapted process and τ an F-stopping time. The random variable Xτ is defined by:

Xτ = ∑_{n∈N̄} Xn 1{τ=n}.

This definition is extended in an obvious way when τ is an a.s. finite stopping time and X a process indexed on N instead of N̄. By construction, the random variable Xτ from Definition 4.8 is Fτ-measurable. We can now give an extension of the Markov property, see Definition 3.2, to random times. Compare the next proposition with Corollary 3.12.
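As a minimal illustration of this definition on one simulated trajectory (the level-2 hitting time and the finite cap are our own choices for this sketch), Xτ is the single surviving term of the sum:

```python
import random

# One trajectory of the simple random walk started at 0.
random.seed(1)
path = [0]
for _ in range(30):
    path.append(path[-1] + random.choice([-1, 1]))

# tau: first hitting time of level 2, capped at the simulation horizon
# so that the illustration stays finite (an assumption of this sketch).
tau = next((n for n, x in enumerate(path) if x == 2), len(path) - 1)

# X_tau = sum over n of X_n 1_{tau = n}: only the term n = tau survives,
# so the sum reduces to reading the path at the random time tau.
x_tau = sum(x * (1 if n == tau else 0) for n, x in enumerate(path))
assert x_tau == path[tau]
```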

Proposition 4.9 (Strong Markov property). Let X = (Xn, n ∈ N) be a Markov chain with respect to the filtration F = (Fn, n ∈ N), taking values in a discrete state space E and with transition matrix P. Let τ be an a.s. finite F-stopping time and define a.s. the shifted process X̃ = (X̃k = Xτ+k, k ∈ N). Conditionally on Xτ, we have that Fτ and X̃ are independent and that X̃ is a Markov chain with transition matrix P, which means that a.s. for all k ∈ N, all x0, . . . , xk ∈ E:

P(X̃0 = x0, . . . , X̃k = xk | Fτ) = P(X̃0 = x0, . . . , X̃k = xk | Xτ). (4.2)

Proof. For x ∈ E, set:

H(x) = 1{x=x0} P(x0, x1) ⋯ P(xk−1, xk). (4.3)

Let B ∈ Fτ. We get from Lemma 4.7 and Definition 4.8 that:

E[1B 1{X̃0=x0,...,X̃k=xk} | Fτ] = 1B H(Xτ).

Then, taking the expectation conditionally on Xτ, we deduce that:

E[1B 1{X̃0=x0,...,X̃k=xk} | Xτ] = E[1B | Xτ] H(Xτ).

This gives that, conditionally on Xτ, Fτ and X̃ are independent; and use the definition (4.3) of H to conclude that X̃ is, conditionally on Xτ, a Markov chain with transition matrix P. Take B = Ω in the previous computations to get (4.2).
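The statement can be checked empirically on a toy example. Below is a hedged sketch (the two-state chain, its matrix P, and the helper names are our own, not from the text): we run the chain up to the hitting time τ of state 1 and look at the law of the next step, which by the strong Markov property should be P(1, ·) whatever happened before τ:

```python
import random

# A toy two-state chain on E = {0, 1} with transition matrix P:
# P(0, 0) = 0.7, P(0, 1) = 0.3, P(1, 0) = 0.6, P(1, 1) = 0.4.
P = {0: [(0, 0.7), (1, 0.3)], 1: [(0, 0.6), (1, 0.4)]}

def step(state, rng):
    """Draw one transition of the chain from `state`."""
    u = rng.random()
    acc = 0.0
    for nxt, p in P[state]:
        acc += p
        if u < acc:
            return nxt
    return P[state][-1][0]

rng = random.Random(42)
trials = 10000
post_tau_zeros = 0
for _ in range(trials):
    x = 0
    while x != 1:            # run up to tau, the hitting time of state 1
        x = step(x, rng)     # (a.s. finite for this chain)
    if step(x, rng) == 0:    # one step of the shifted chain from X_tau = 1
        post_tau_zeros += 1

freq = post_tau_zeros / trials   # empirical estimate of P(1, 0) = 0.6
```

The estimate `freq` concentrates around P(1, 0), illustrating that the chain restarted at the random time τ moves with the same transition matrix P, independently of the pre-τ history.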

Using the strong Markov property, it is immediate that the excursions of a recurrent irreducible Markov chain out of a given state are independent and, but for the first one, with the same distribution; see the key Lemma 3.36.

We end this section with the following lemma.

Lemma 4.10. Let τ and τ′ be two stopping times.

(i) The events {τ < τ′}, {τ = τ′} and {τ ≥ τ′} belong to Fτ and Fτ′.

(ii) If B ∈ Fτ, then we have that B ∩ {τ ≤ τ′} belongs to Fτ′.

(iii) If τ ≤ τ′, then we have Fτ ⊂ Fτ′.

Proof. We have {τ < τ′} ∩ {τ = n} = {τ = n} ∩ {τ′ > n}, which belongs to Fn as {τ = n} and {τ′ > n} already belong to Fn. Since this holds for all n ∈ N, we deduce that {τ < τ′} ∈ Fτ. The other results of property (i) can be proved similarly.

Let B ∈ Fτ. This implies that B ∩ {τ ≤ n} belongs to Fn. We deduce that B ∩ {τ ≤ τ′} ∩ {τ′ = n} = B ∩ {τ ≤ n} ∩ {τ′ = n} belongs to Fn. Since this holds for all n ∈ N, we get that B ∩ {τ ≤ τ′} ∈ Fτ′. This gives property (ii).

Property (iii) is a direct consequence of property (ii), as {τ ≤ τ′} = Ω.

Remark 4.11. In some cases, it can be convenient to assume that F0 contains at least all the P-null sets. Under this condition, if an N̄-valued random variable is a.s. constant, then it is a stopping time. And, more importantly, under this condition, property (iii) of Lemma 4.10 holds if a.s. τ ≤ τ′. ♦
