
The growing tree-valued process


Figure 3.1: The pruning process, starting from the explosion time A defined in (3.32).

Consider the ascension time (or explosion time):

A = inf{θ ∈ Θψ, σθ < ∞},  (3.32)

where we use the convention inf ∅ = θ∞. The following theorem gives the distribution of the ascension time A and the distribution of the tree at this random time. Recall that θ̄ = ψ⁻¹(ψ(θ)) is defined in (3.10).

Theorem 3.24 ([AD12a]). Let ψ be a critical branching mechanism satisfying Assumptions 1 and 2.

1. For all θ ∈ Θψ, we have Nψ[A > θ] = θ̄ − θ.

2. If θ∞ < θ < 0, under Nψ, we have, for any non-negative measurable functional F,

Nψ[F(T_{A+θ′}, θ′ ≥ 0) | A = θ] = ψ′(θ̄) Nψ[F(T_{θ′}, θ′ ≥ 0) σ0 e^{−ψ(θ)σ0}].

3. For all θ ∈ Θψ, we have Nψ[σ_A < +∞ | A = θ] = 1.

In other words, at the ascension time the tree can be seen as a size-biased critical Lévy tree. A precise description of T_A is given in [AD12a]. Notice that in the setting of [AD12a], Assumption 2 is not needed.
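Point 1 pins down the law of A under Nψ. As a quick consistency check (a sketch, assuming θ ↦ θ̄ is differentiable on (θ∞, 0), not an argument taken from the source), differentiating the tail gives an explicit density:

```latex
% Tail of the ascension time under N^psi (Theorem 3.24, point 1):
%   N^psi[A > theta] = \bar\theta - \theta .
% Differentiating psi(\bar\theta) = psi(theta) in theta gives
%   \psi'(\bar\theta)\,\frac{d\bar\theta}{d\theta} = \psi'(\theta),
% hence, for theta in (theta_infty, 0),
\[
  N^{\psi}[A \in d\theta]
  = \Bigl(1 - \frac{\psi'(\theta)}{\psi'(\bar\theta)}\Bigr)\, d\theta .
\]
% Since psi is critical and convex, psi'(theta) <= 0 for theta < 0 while
% psi'(\bar\theta) >= 0, so this density is indeed non-negative.
```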

3.3 The growing tree-valued process

Special Markov Property of pruning

In [ADV10], the authors prove a formula describing the structure of a Lévy tree, conditionally on the θ-pruned tree obtained from it, in the (sub)critical case. We will give a general version of this result. From the measure of marks M in (3.30), we define a measure of increasing marks by:

M^↑(dx, dθ′) = Σ_{i∈I^↑} δ_{(x_i, θ_i)}(dx, dθ′),  (3.33)

with

I^↑ = {i ∈ I_ske ∪ I_nod ; M([[∅, x_i]] × [0, θ_i]) = 1}.

The atoms (x_i, θ_i), i ∈ I^↑, correspond to marks such that there is no mark of M on [[∅, x_i]] with a θ-component smaller than θ_i. In the case of multiple θ_j for a given node x_i ∈ Br(T), we only keep the smallest one. In the case Π = 0, the measure M^↑ describes the jumps of a record process on the tree; see [AD11] for further work in this direction. The θ-pruned tree can alternatively be defined using M^↑ instead of M, as for θ ≥ 0:

Λθ(T, M^↑) = {x ∈ T, M^↑([[∅, x[[ × [0, θ]) = 0}.

We set:

I^↑_θ = {i ∈ I^↑, x_i ∈ Lf(Λθ(T, M^↑))} = {i ∈ I^↑, θ_i ≤ θ and M^↑([[∅, x_i[[ × [0, θ]) = 0},

and for i ∈ I^↑_θ:

T_i = T \ T^{∅,x_i} = {x ∈ T, x_i ∈ [[∅, x]]},

where T^{y,x} denotes the connected component of T \ {x} containing y. For i ∈ I^↑_θ, T_i is a real tree, and we will consider x_i as its root. The metric and mass measure on T_i are the restrictions to T_i of the metric and mass measure of T. By construction, we have:

T = Λθ(T, M^↑) ⊛_{i∈I^↑_θ} (T_i, x_i).  (3.34)
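The two prunings indeed coincide: every atom of M^↑ is an atom of M and, conversely, any mark of M below x dominates a minimal mark which belongs to M^↑. A sketch of this elementary verification (not spelled out in the source; it ignores ties of θ-components, handled by the convention on Br(T) above, and uses that a.s. only finitely many marks of M on [[∅, x[[ have θ-component in [0, θ]):

```latex
% Claim: Lambda_theta(T, M) = Lambda_theta(T, M^\uparrow) for every theta >= 0.
% (i) The atoms of M^\uparrow form a subset of the atoms of M, hence
%     M([[0,x[[ x [0,theta]) = 0  implies  M^\uparrow([[0,x[[ x [0,theta]) = 0.
% (ii) Conversely, if M([[0,x[[ x [0,theta]) >= 1, pick an atom (x_i, theta_i)
%     of M on [[0,x[[ x [0,theta] with minimal theta-component. Any further
%     atom on [[0,x_i]] x [0,theta_i] would contradict this minimality, so
\[
  M\bigl(\llbracket \varnothing, x_i \rrbracket \times [0,\theta_i]\bigr) = 1
  % \llbracket, \rrbracket from the stmaryrd package
\]
% i.e. i belongs to I^\uparrow, and M^\uparrow charges [[0,x[[ x [0,theta] too.
% Both prunings therefore discard exactly the same points x.
```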

Now we can state the general special Markov property.

Theorem 3.25 (Special Markov Property). Let ψ be a branching mechanism satisfying Assumptions 1 and 2. Let θ > 0. Conditionally on Λθ(T, M), the point measure:

M^↑_θ(dx, dT′, dθ′) = Σ_{i∈I^↑_θ} δ_{(x_i, T_i, θ_i)}(dx, dT′, dθ′)

under P^ψ_r (or under Nψ) is a Poisson point measure on Λθ(T, M) × T × (0, θ] with intensity:

m^{Λθ(T,M)}(dx) (2β Nψ[dT′] + ∫_{(0,+∞)} Π(dr) r e^{−θ′r} P^ψ_r(dT′)) 1_{(0,θ]}(θ′) dθ′.  (3.35)

Proof. It is not difficult to adapt the proof of the special Markov property in [ADV10] to get Theorem 3.25 in the (sub)critical case, by taking into account the pruning times θ_i and the w-tree setting; we omit this proof, which can be found in [ADH12b]. We show how to extend the result to super-critical Lévy trees using the Girsanov transform of Definition 3.18.
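The identification of the conditional law below goes through Laplace functionals; recall the exponential formula for Poisson point measures, the standard identity behind the computation of the quantity B:

```latex
% Exponential formula: if N is a Poisson point measure with sigma-finite
% intensity mu and Phi >= 0 is measurable, then
\[
  \mathbb{E}\Bigl[\exp\bigl(-\langle N,\Phi\rangle\bigr)\Bigr]
  = \exp\Bigl(-\int \bigl(1-\mathrm{e}^{-\Phi(y)}\bigr)\,\mu(dy)\Bigr).
\]
% Conversely, a random point measure whose (conditional) Laplace functional
% has this form for all Phi is a (conditional) Poisson point measure with
% intensity mu; this is how the intensity of M_a is identified in the proof.
```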

Assume that ψ is super-critical. For a > 0, we shall write Λ_{θ,a}(T, M) = π_a(Λθ(T, M)) for short. According to (3.34) and the definition of super-critical Lévy trees, we have that, for any a > 0, the truncated tree π_a(T) can be written as:

π_a(T) = Λ_{θ,a}(T, M) ⊛_{i∈I^↑_θ, H_{x_i}≤a} (π_{a−H_{x_i}}(T_i), x_i),

and we have to prove that Σ_{i∈I^↑_θ} δ_{(x_i, T_i, θ_i)}(dx, dT′, dθ′) is, conditionally on Λθ(T, M), a Poisson point measure with intensity (3.35). Since a is arbitrary, it is enough to prove that the point measure M_a, defined by

M_a(dx, dT′, dθ′) = Σ_{i∈I^↑_θ, H_{x_i}≤a} δ_{(x_i, π_{a−H_{x_i}}(T_i), θ_i)}(dx, dT′, dθ′),

is, conditionally on Λ_{θ,a}(T, M), a Poisson point measure with intensity (3.36). Let Φ be a non-negative measurable functional on T, and let

B = Nψ[F(Λ_{θ,a}(T, M)) exp(−⟨M_a, Φ⟩)].

Thanks to the Girsanov formula (3.22) and to the special Markov property for critical branching mechanisms, then to the Girsanov formula again together with (3.29), and finally to (3.7), we can compute B; the computation produces a function G(h, x, θ) and, through it, a functional R of the pruned tree Λ_{θ,a}(T, M). Using (3.39) with F replaced by F R gives:

Nψ[exp(−⟨M_a, Φ⟩) F(Λ_{θ,a}(T, M))] = B = Nψ[F(Λ_{θ,a}(T, M)) R(Λ_{θ,a}(T, M))].

This implies that M_a is, conditionally on Λ_{θ,a}(T, M), a Poisson point measure with intensity (3.36). This ends the proof.

An explicit construction of the growing process

In this section, we will construct the growth process using a family of Poisson point measures.

Let ψ be a branching mechanism satisfying Assumptions 1 and 2. Let θ ∈ Θψ. According to (3.20) and (3.7), we have:

N^ψ_θ[T ∈ •] = 2β N^{ψθ}[T ∈ •] + ∫_{(0,+∞)} Π(dr) r e^{−θr} P^{ψθ}_r(T ∈ •).  (3.40)

Let T^(0) ∈ T with root ∅. For q ∈ Θψ and q ≤ θ, we set:

T^(0)_q = T^(0) and m^(0)_q = m_{T^(0)}.

We define the w-trees grafted on T^(0) by recursion on their generation. We suppose that all the random point measures used in the construction below are defined under a probability measure Q^{T^(0)}(dω).

Suppose that we have constructed the family ((T^(k)_q, m^(k)_q), 0 ≤ k ≤ n, q ∈ Θψ ∩ (−∞, θ]). We write:

T^(n) = ⋃_{q∈Θψ, q≤θ} T^(n)_q.

We define the (n+1)-th generation as follows. Conditionally on all trees from generations up to n, (T^(k)_q, 0 ≤ k ≤ n, q ∈ Θψ ∩ (−∞, θ]), let

N^{n+1}_θ(dx, dT, dq) = Σ_{j∈J^(n+1)} δ_{(x_j, T_j, θ_j)}(dx, dT, dq)

be a Poisson point measure on T^(n) × T × Θψ with intensity:

μ^{n+1}_θ(dx, dT, dq) = m^(n)_q(dx) N^ψ_q[dT] 1_{{q≤θ}} dq.

For q ∈ Θψ and q ≤ θ, we set

J^(n+1)_q = {j ∈ J^(n+1), q < θ_j},

and we define the tree T^(n+1)_q and the mass measure m^(n+1)_q by:

T^(n+1)_q = T^(n)_q ⊛_{j∈J^(n+1)_q} (T_j, x_j) and m^(n+1)_q(dx) = Σ_{j∈J^(n+1)_q} m_{T_j}(dx).

Notice that, by construction, (T^(n)_q, n ∈ N) is a non-decreasing sequence of trees. We let T_q be the completion of ∪_{n∈N} T^(n)_q, which is a real tree with root ∅ and obvious metric d_{T_q}, and we define a mass measure on T_q by m_{T_q} = Σ_{n∈N} m^(n)_q.
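Unwinding the recursion gives an informal global description of the limit (this reformulation is not part of the source computation): a tree T_j from generation n+1 is present in T^(n+1)_q exactly when q < θ_j, so the limit tree collects every grafted tree whose mark exceeds q:

```latex
% Informal unwinding of the recursion: writing J = \cup_{n \ge 1} J^{(n)} for
% the collection of all grafted trees (graft points x_j, marks theta_j),
\[
  \mathcal{T}_q
  \;=\; \overline{\;\mathcal{T}^{(0)} \circledast_{\,j \in J,\; q < \theta_j} (T_j, x_j)\;},
  \qquad q \in \Theta_\psi \cap (-\infty, \theta],
\]
% where the bar denotes metric completion. In particular, the tree grows as q
% decreases, and the jump at q = theta_j grafts exactly the trees with mark
% theta_j, consistently with the backward-Markov description below.
```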

For q ∈ Θψ and q ≤ θ, we consider the σ-field F_q generated by T^(0) and the sequence of random point measures (1_{{q′∈[q,θ]}} N^(n)_θ(dx, dT, dq′), n ∈ N). We set N_θ = Σ_{n∈N} N^n_θ. The backward random point process q ↦ 1_{{q≤q′}} N_θ(dx, dT, dq′) is by construction adapted to the backward filtration (F_q, q ∈ Θψ ∩ (−∞, θ]).

The proof of the following result is postponed to the end of Section 3.3.

Theorem 3.26. Let ψ be a branching mechanism satisfying Assumptions 1 and 2. Under Q^ψ_θ := N^ψ_θ[dT^(0)] Q^{T^(0)}(dω), the process

((T_q, d_{T_q}, ∅, m_{T_q}), q ∈ Θψ ∩ (−∞, θ])

is a T-valued backward Markov process with respect to the backward filtration F^θ = (F_q, q ∈ Θψ ∩ (−∞, θ]). It is distributed as ((T_q, m_{T_q}), q ∈ Θψ ∩ (−∞, θ]) under Nψ.

Notice that the theorem in particular entails that (T_q, d_{T_q}, ∅, m_{T_q}) is a w-tree for all q. We shall use the following lemma.

Lemma 3.27. Let ψ be a branching mechanism satisfying Assumptions 1 and 2. Let K be a measurable non-negative process (as a function of q) defined on R+ × T × T which is predictable with respect to the backward filtration F^θ. We have:

Q^ψ_θ[∫ N_θ(dx, dT, dq) K(q, T_q, T_{q−})] = Q^ψ_θ[∫ μ_θ(dx, dT, dq) K(q, T_q, T_q ⊛ (T, x))].

This means that the predictable compensator of N_θ is given by:

μ_θ(dx, dT, dq) = m_{T_q}(dx) N^ψ_q[dT] 1_{{q∈Θψ, q≤θ}} dq.

Notice that this construction does not fit in the usual framework of random point measures, as the support at time q of the predictable compensator is the (predictable, backward-in-time) random set T_q × T × Θψ.

Proof. This follows from the recursive construction, by computing the compensator of each of the point measures N^n_θ. Now, by construction, we have that:

T_q = T^(n)_q ⊛_{j∈J^(n)_q} (T̃_j, x_j).


It can be noticed that the map q ↦ T_q is non-decreasing càdlàg (backwards in time) and that we have, for j ∈ ∪_{n∈N} J^(n) with x_j ∈ T_{θ_j}: T_{θ_j−} = T_{θ_j} ⊛ (T_j, x_j). In particular, we can recover the random measure N_θ from the jumps of the process (T_q, q ∈ Θψ ∩ (−∞, θ]). This, together with the natural compatibility relation of N_θ with respect to θ, gives the next corollary.

Corollary 3.28. Let ψ be a branching mechanism satisfying Assumptions 1 and 2. Let (T_θ, θ ∈ Θψ) be defined under Nψ. Let

N = Σ_{j∈J} δ_{(x_j, T_j, θ_j)}

be the random point measure defined as follows:

• The set {θ_j ; j ∈ J} is the set of jumping times of the process (T_θ, θ ∈ Θψ): for j ∈ J, T_{θ_j} ≠ T_{θ_j−}.

• For j ∈ J, the point x_j ∈ T_{θ_j} and the tree T_j are given by the jump at time θ_j: T_{θ_j−} = T_{θ_j} ⊛ (T_j, x_j).

Then N is a random point measure admitting the predictable compensator:

μ(dx, dT, dq) = m_{T_q}(dx) N^ψ_q[dT] 1_{{q∈Θψ}} dq

with respect to the backward left-continuous filtration F = (F_θ, θ ∈ Θψ) defined by:

F_θ = σ((x_j, T_j, θ_j); θ ≤ θ_j) = σ(T_q; θ ≤ q).

More precisely, for any non-negative process K, predictable with respect to the backward filtration F, we have:

Nψ[∫ N(dx, dT, dq) K(q, T_q, T_{q−})] = Nψ[∫ μ(dx, dT, dq) K(q, T_q, T_q ⊛ (T, x))].  (3.41)

Remark 5. Notice that Assumption 2 is only assumed for technical measurability conditions, see Remark 2. We conjecture that this result also holds if Assumption 2 is not in force.

As a consequence, thanks to property 3 of Theorem 3.24, we get, with the convention sup ∅ = θ∞, that:

A = sup{θ_j, j ∈ J and σ_j = +∞}, with σ_j = m_{T_j}(T_j).
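This identity can be justified as follows (a heuristic sketch, not spelled out in the source): the mass σ_θ can only drop, as θ increases, through the trees that are pruned away, and property 3 of Theorem 3.24 locates the infinite-mass ones.

```latex
% If sigma_j = +infinity and theta < theta_j, the tree T_j is still grafted
% in T_theta, so sigma_theta = +infinity; hence
\[
  A \;\ge\; \sup\{\theta_j,\ j \in J \text{ and } \sigma_j = +\infty\}.
\]
% Conversely, property 3 of Theorem 3.24 gives sigma_A < +infinity, while
% sigma_theta = +infinity for every theta < A. The drop of mass at theta = A
% is carried by the trees grafted with mark theta_j = A, so one of them must
% satisfy sigma_j = +infinity; this yields the converse inequality and the
% stated formula for A.
```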

Proof of Theorem 3.26

By construction, it is clear that the process (T_q, q ∈ Θψ ∩ (−∞, θ]) is a backward Markov process with respect to the backward filtration (F_q, q ∈ Θψ ∩ (−∞, θ]), and that it is càglàd in backward time. Since the process (T_q, q ∈ Θψ) is a forward càdlàg Markov process, it is enough to check that, for θ′ ∈ Θψ such that θ′ < θ, the pair (T_{θ′}, T_θ) given by the construction and the pair (T_{θ′}, T_θ) defined under Nψ have the same distribution.

Replacing ψ by ψ_{θ′}, we can assume that θ′ = 0 and 0 < θ. We shall decompose the big tree T_0 conditionally on the small tree T_θ, by iteration. This decomposition is similar to the one that appears in [AD07] or [Voi10] for the fragmentation of the (sub)critical Lévy tree, except that, roughly speaking, the fragmentation is here frozen for every fragment but the one containing the root.

We set T^(0) = T_θ and m̃^(0) = m_{T_θ}, so that (T^(0), m^(0)) and (T^(0), m̃^(0)) have the same distribution. Recall notation M^↑ from (3.33), as well as (3.34): T_0 = T^(0) ⊛_{i∈I^{↑,1}_θ} (T_i, x_i), where we write I^{↑,1}_θ = I^↑_θ and where P_1 = Σ_{i∈I^{↑,1}_θ} δ_{(x_i, T_i, θ_i)} is, conditionally on T^(0), a Poisson point measure with intensity:

ν_1(dx, dT′, dq) = m̃^(0)(dx) (2β Nψ[dT′] + ∫_{(0,+∞)} Π(dr) r e^{−qr} P^ψ_r(dT′)) 1_{(0,θ]}(q) dq.

For i ∈ I^{↑,1}_θ, we define the sub-tree T̃_i of T_i:

T̃_i = {x ∈ T_i ; M([[x_i, x[[ × [0, θ_i]) = 0}.

Since T_i is distributed according to Nψ (or to P^ψ_{r_i} for some r_i > 0), using the properties of Poisson point measures, we have that, conditionally on T^(0) and θ_i, the tree T̃_i is distributed as Λ_{θ_i}(T, M) under Nψ (or under P^ψ_{r_i}); that is, the distribution of T̃_i is N^{ψθ_i}[dT] (or P^{ψθ_i}_{r_i}(dT)), thanks to the special Markov property. Furthermore, we have T_i = T̃_i ⊛_{i′∈I^{↑,2}_{θ,i}} (T_{i′}, x_{i′}), where

Σ_{i′∈I^{↑,2}_{θ,i}} δ_{(x_{i′}, T_{i′}, θ_{i′})}

is, conditionally on T^(0) and T̃_i, a Poisson point measure on T̃_i × T × (0, θ] with intensity:

m_{T̃_i}(dx) (2β Nψ(dT′) + ∫_{(0,+∞)} Π(dr) r e^{−qr} P^ψ_r(dT′)) 1_{[0,θ_i)}(q) dq.

Thus we deduce, using again the special Markov property, that:

Ñ^1_θ(dx, dT, dq) = Σ_{i∈I^{↑,1}_θ} δ_{(x_i, T̃_i, θ_i)}(dx, dT, dq)

is, conditionally on T^(0), a Poisson point measure on T^(0) × T × Θψ with intensity:

μ̃^1(dx, dT, dq) = m̃^(0)_q(dx) N^ψ_q[dT] 1_{[0,θ)}(q) dq, with m̃^(0)_q(dx) = m̃^(0)(dx).

We set T^(1) = T^(0) ⊛_{i∈I^{↑,1}_θ} (T̃_i, x_i) for the first generation tree and, for q ∈ [0, θ]:

m̃^(1)_q(dx) = Σ_{i∈I^{↑,1}_θ} m_{T̃_i}(dx) 1_{[0,θ_i)}(q).

See Figure 3.2 for a simplified representation. We get that (T^(1), (m^(1)_q, q ∈ [0, θ]), T^(0), m_{T^(0)}) and (T^(1), (m̃^(1)_q, q ∈ [0, θ]), T^(0), m̃^(0)) have the same distribution.

Furthermore, by collecting all the trees grafted on T^(1), we get that

T_0 = T^(1) ⊛_{i′∈I^{↑,2}_θ} (T_{i′}, x_{i′}),

where I^{↑,2}_θ = ∪_{i∈I^{↑,1}_θ} I^{↑,2}_{θ,i} and where

P_2 = Σ_{i′∈I^{↑,2}_θ} δ_{(x_{i′}, T_{i′}, θ_{i′})}

is, conditionally on (T^(1), (m̃^(1)_q, q ∈ [0, θ]), T^(0), m̃^(0)), a Poisson point measure on T^(1) × T × (0, θ] with intensity:

ν_2(dx, dT′, dq) = m̃^(1)_q(dx) (2β Nψ(dT′) + ∫_{(0,+∞)} Π(dr) r e^{−qr} P^ψ_r(dT′)) 1_{[0,θ]}(q) dq.

Notice that:

T^(1) = {x ∈ T_0 ; M^↑([[∅, x[[ × [0, θ]) ≤ 1} and m̃^(1)_0(dx) + m̃^(0)(dx) = 1_{T^(1)}(x) m_{T_0}(dx).  (3.42)

Figure 3.2: The tree T_0, the tree T^(0), and a tree T_i together with its sub-tree T̃_i belonging to the first generation tree T^(1)\T^(0).

Then we can iterate this construction and, by taking increasing limits, we obtain that the pair ((∪_{n∈N} T^(n)_0, Σ_{n∈N} m^(n)_0), T^(0)) has the same distribution as ((T′_0, m̃′_0), T^(0)), where:

T′_0 = {x ∈ T_0 ; M^↑([[∅, x[[ × [0, θ]) < +∞} and m̃′_0(dx) = 1_{T′_0}(x) m_{T_0}(dx).

To conclude, we need to check first that the completion of T′_0 is T_0 or, since T_0 is complete, that the closure of T′_0 as a subset of T_0 is exactly T_0; and then that m_{T_0}((T′_0)^c) = 0.

Notice that M^↑ has fewer marks than M. Then Proposition 1.2 in [AD07] in the case β = 0, or an elementary adaptation of it in the general framework of [Voi10], gives that there is no loss of mass in the fragmentation process. This implies that, if ψ is (sub)critical, then:

m_{T_0}({x ∈ T_0 ; M^↑([[∅, x[[ × [0, θ]) = ∞}) = 0.  (3.43)

Then, if ψ is super-critical, by considering the restriction π_a(T_0) of T_0 up to level a and using a Girsanov transformation from Definition 3.18 together with (3.43), we deduce that (3.43) holds for π_a(T_0). Since a is arbitrary, we deduce by monotone convergence that (3.43)

