
80-646-08 Stochastic Calculus I


Martingales

Geneviève Gauthier
HEC Montréal

Outline: Definition, Example, Stopped process, Optional Stopping Theorem, Markovian process

Definition

On the filtered probability space $(\Omega, \mathcal{F}, \mathbb{F}, P)$, where $\mathbb{F}$ is the filtration $\{\mathcal{F}_t : t \in \{0, 1, 2, \ldots\}\}$, the stochastic process $M = \{M_t : t \in \{0, 1, 2, \ldots\}\}$ is a discrete-time martingale if

(M1) $\forall t \in \{0, 1, 2, \ldots\}$, $E^P[\,|M_t|\,] < \infty$;

(M2) $\forall t \in \{0, 1, 2, \ldots\}$, $M_t$ is $\mathcal{F}_t$-measurable;

(M3) $\forall s, t \in \{0, 1, 2, \ldots\}$ such that $s < t$, $E^P[M_t \mid \mathcal{F}_s] = M_s$.
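The following Python sketch is not part of the original slides; it is a minimal illustration of conditions (M1)-(M3) by brute-force enumeration on a two-period fair coin-flip space. The names (`outcomes`, `prob`, `M`) and the choice of $\pm 1$ increments are illustrative assumptions.

```python
from itertools import product

# Toy space: two independent fair coin flips, each +1 or -1, P uniform.
outcomes = list(product([1, -1], repeat=2))      # omega = (xi_1, xi_2)
prob = {w: 0.25 for w in outcomes}

def M(t, w):
    """Partial-sum process: M_0 = 0, M_t = xi_1 + ... + xi_t."""
    return sum(w[:t])

# (M1) is automatic on a finite space, and (M2) holds because M_t only
# depends on the first t coordinates.  Check (M3) via E[M_2 | F_1] = M_1:
for a in (1, -1):                                # atoms of F_1: {xi_1 = a}
    atom = [w for w in outcomes if w[0] == a]
    p_atom = sum(prob[w] for w in atom)
    cond_exp = sum(M(2, w) * prob[w] for w in atom) / p_atom
    assert cond_exp == M(1, atom[0])

# and E[M_1 | F_0] = E[M_1] = 0 = M_0.
assert sum(M(1, w) * prob[w] for w in outcomes) == 0
print("(M1)-(M3) hold on this toy space")
```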

Martingale: constant expectation process

Lemma. Let $M = \{M_t : t \in \{0, 1, 2, \ldots\}\}$ be a martingale built on the filtered probability space $(\Omega, \mathcal{F}, \mathbb{F}, P)$. Then

$$\forall t \in \{1, 2, \ldots\}, \quad E^P[M_t] = E^P[M_0].$$

Proof of the lemma. $\forall t \in \{1, 2, \ldots\}$,

$$E^P[M_t] = E^P\big[E^P[M_t \mid \mathcal{F}_0]\big] \quad \text{by (EC3)}
= E^P[M_0] \quad \text{by (M3)}.$$

Interpretation. A martingale is a stochastic process that, on average, is constant. This does not mean, however, that such a process varies little: the variance $\mathrm{Var}^P[M_t]$ at a given time can even be infinite.

Example I

Example. Let $\{\xi_t : t \in \{1, 2, \ldots\}\}$ be a sequence of random variables on $(\Omega, \mathcal{F})$, independent and identically distributed with respect to the measure $P$ and such that $E^P[\xi_t] = 0$ and $E^P[\xi_t^2] < \infty$. Let's define

$\mathcal{F}_0 = \{\varnothing, \Omega\}$;
$\forall t \in \{1, 2, \ldots\}$, $\mathcal{F}_t = \sigma\{\xi_s : s \in \{1, \ldots, t\}\}$; and
$M_0 = 0$, $M_t = \sum_{s=1}^{t} \xi_s$.

The stochastic process $M$ is a martingale on the space $(\Omega, \mathcal{F}, \mathbb{F}, P)$.

Example II

Indeed,

$$E^P[\,|M_t|\,] = E^P\Big[\Big|\sum_{s=1}^{t} \xi_s\Big|\Big] \le \sum_{s=1}^{t} E^P[\,|\xi_s|\,] \le \sum_{s=1}^{t} \sqrt{E^P[\xi_s^2]} < \infty,$$

where the second inequality comes from the fact that, for any random variable $X$,

$$0 \le \mathrm{Var}[\,|X|\,] = E\big[|X|^2\big] - \big(E[\,|X|\,]\big)^2 \;\Rightarrow\; E[\,|X|\,] \le \sqrt{E\big[|X|^2\big]}.$$

Given the selected filtration, $M$ is adapted (which is to say that $\forall t \in \{0, 1, 2, \ldots\}$, $M_t$ is $\mathcal{F}_t$-measurable).

Example III

Lastly, $\forall s, t \in \{0, 1, 2, \ldots\}$ such that $s < t$,

$$E^P[M_t \mid \mathcal{F}_s] = E^P\Big[M_s + \sum_{u=s+1}^{t} \xi_u \,\Big|\, \mathcal{F}_s\Big]
= E^P[M_s \mid \mathcal{F}_s] + \sum_{u=s+1}^{t} E^P[\xi_u \mid \mathcal{F}_s]
= M_s + \sum_{u=s+1}^{t} E^P[\xi_u]
= M_s,$$

where $E^P[M_s \mid \mathcal{F}_s] = M_s$ from (EC1) since $M_s$ is $\mathcal{F}_s$-measurable, and $E^P[\xi_u \mid \mathcal{F}_s] = E^P[\xi_u] = 0$ from (EC7) since $\xi_u$ is independent of $\xi_1, \ldots, \xi_s$.
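As a hedged numerical companion to this example (not from the original slides), the sketch below simulates the partial-sum process for one arbitrary choice of centred increments and checks that the sample mean of $M_t$ stays near $E^P[M_0] = 0$ at every date while the variance keeps growing, echoing the interpretation given after Lemma 1. The increment distribution, the seed and the array names are assumptions made for the illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# i.i.d. increments with mean 0 and finite second moment
# (uniform on [-1, 1] is an arbitrary choice).
n_paths, horizon = 100_000, 50
xi = rng.uniform(-1.0, 1.0, size=(n_paths, horizon))

# M_t = xi_1 + ... + xi_t along each simulated path (M_0 = 0 is omitted).
M = np.cumsum(xi, axis=1)

# Sample mean of M_t stays close to E[M_t] = E[M_0] = 0 for every t ...
print(np.abs(M.mean(axis=0)).max())        # small, of order 1e-2
# ... while the sample variance grows like t * Var(xi_1) = t / 3.
print(M.var(axis=0)[-1], horizon / 3)      # both close to 16.7
```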

Martingale I

Lemma. In the definition of a martingale, the condition (M3) is equivalent to

(M3*) $\forall t \in \{1, 2, \ldots\}$, $E^P[M_t \mid \mathcal{F}_{t-1}] = M_{t-1}$.

Proof of the lemma. Clearly, (M3) $\Rightarrow$ (M3*), since (M3*) is only a special case of (M3): it is sufficient to take $s = t - 1$.

So we must show that (M3*) $\Rightarrow$ (M3). This can be proved by induction. Intuitively, if $s < t$ then

$$E^P[M_t \mid \mathcal{F}_s] = E^P\big[E^P[M_t \mid \mathcal{F}_{t-1}] \mid \mathcal{F}_s\big] \quad \text{from (EC3)}
= E^P[M_{t-1} \mid \mathcal{F}_s] \quad \text{from (M3*)}.$$

Martingale II

But if $s < t - 1$, then we can use the same argument again, and we get

$$E^P[M_{t-1} \mid \mathcal{F}_s] = E^P\big[E^P[M_{t-1} \mid \mathcal{F}_{t-2}] \mid \mathcal{F}_s\big] \quad \text{from (EC3)}
= E^P[M_{t-2} \mid \mathcal{F}_s] \quad \text{from (M3*)}.$$

Now just substitute this result into the first equation:

$$E^P[M_t \mid \mathcal{F}_s] = E^P[M_{t-2} \mid \mathcal{F}_s].$$

By iterating this argument, we eventually obtain

$$E^P[M_t \mid \mathcal{F}_s] = E^P[M_{s+1} \mid \mathcal{F}_s] = M_s \quad \text{from (M3*)}.$$

Example I

ω     ξ1    ξ2    ξ3    P     Q
ω1    +1    +1    +1    1/8   1/2
ω2    +1    +1    -1    1/8   1/14
ω3    +1    -1    +1    1/8   1/14
ω4    +1    -1    -1    1/8   1/14
ω5    -1    +1    +1    1/8   1/14
ω6    -1    +1    -1    1/8   1/14
ω7    -1    -1    +1    1/8   1/14
ω8    -1    -1    -1    1/8   1/14

On the sample space $\Omega = \{\omega_1, \ldots, \omega_8\}$, we will use the σ-algebra $\mathcal{F}$ = the set of all events in $\Omega$. The filtration $\mathbb{F}$ is made up of the σ-subalgebras

$\mathcal{F}_0 = \{\varnothing, \Omega\}$,
$\mathcal{F}_1 = \sigma\{\xi_1\} = \sigma\{\{\omega_1, \omega_2, \omega_3, \omega_4\}, \{\omega_5, \omega_6, \omega_7, \omega_8\}\}$,
$\mathcal{F}_2 = \sigma\{\xi_1, \xi_2\} = \sigma\{\{\omega_1, \omega_2\}, \{\omega_3, \omega_4\}, \{\omega_5, \omega_6\}, \{\omega_7, \omega_8\}\}$,
$\mathcal{F}_3 = \sigma\{\xi_1, \xi_2, \xi_3\} = \mathcal{F}$.

Example II

The stochastic process $M$, built on the filtered measurable space $(\Omega, \mathcal{F}, \mathbb{F})$, is defined as follows:

$M_0 = 0$, $M_1 = \xi_1$, $M_2 = \xi_1 + \xi_2$ and $M_3 = \xi_1 + \xi_2 + \xi_3$.

By construction, $M$ is adapted to the filtration $\mathbb{F}$.

Example III

$M = \{M_t : t \in \{0, 1, 2, 3\}\}$ is a martingale on $(\Omega, \mathcal{F}, \mathbb{F}, P)$. Indeed, condition (M2) is already verified since $M$ is $\mathbb{F}$-adapted.

Condition (M1) is also satisfied since $\forall t \in \{0, 1, 2, 3\}$,

$$E^P[\,|M_t|\,] \le E^P[\,|\xi_1|\,] + E^P[\,|\xi_2|\,] + E^P[\,|\xi_3|\,] = 3.$$

Let's verify condition (M3).

Example IV

$$E^P[M_1 \mid \mathcal{F}_0] = E^P[M_1] \quad \text{from (EC4)}
= 0 = M_0.$$

$\forall \omega \in \{\omega_1, \omega_2, \omega_3, \omega_4\}$,

$$E^P[M_2 \mid \mathcal{F}_1](\omega) = \frac{1}{1/2}\Big(2 \cdot \tfrac{1}{8} + 2 \cdot \tfrac{1}{8} + 0 \cdot \tfrac{1}{8} + 0 \cdot \tfrac{1}{8}\Big) = 1 = M_1(\omega);$$

$\forall \omega \in \{\omega_5, \omega_6, \omega_7, \omega_8\}$,

$$E^P[M_2 \mid \mathcal{F}_1](\omega) = \frac{1}{1/2}\Big(0 \cdot \tfrac{1}{8} + 0 \cdot \tfrac{1}{8} - 2 \cdot \tfrac{1}{8} - 2 \cdot \tfrac{1}{8}\Big) = -1 = M_1(\omega).$$

Example V

$\forall \omega \in \{\omega_1, \omega_2\}$, $E^P[M_3 \mid \mathcal{F}_2](\omega) = \frac{1}{1/4}\big(3 \cdot \tfrac{1}{8} + 1 \cdot \tfrac{1}{8}\big) = 2$ and $M_2(\omega) = 2$;

$\forall \omega \in \{\omega_3, \omega_4\}$, $E^P[M_3 \mid \mathcal{F}_2](\omega) = \frac{1}{1/4}\big(1 \cdot \tfrac{1}{8} - 1 \cdot \tfrac{1}{8}\big) = 0$ and $M_2(\omega) = 0$;

$\forall \omega \in \{\omega_5, \omega_6\}$, $E^P[M_3 \mid \mathcal{F}_2](\omega) = \frac{1}{1/4}\big(1 \cdot \tfrac{1}{8} - 1 \cdot \tfrac{1}{8}\big) = 0$ and $M_2(\omega) = 0$;

$\forall \omega \in \{\omega_7, \omega_8\}$, $E^P[M_3 \mid \mathcal{F}_2](\omega) = \frac{1}{1/4}\big(-1 \cdot \tfrac{1}{8} - 3 \cdot \tfrac{1}{8}\big) = -2$ and $M_2(\omega) = -2$.

Example VI

By contrast, $M = \{M_t : t \in \{0, 1, 2, 3\}\}$ is not a martingale on $(\Omega, \mathcal{F}, \mathbb{F}, Q)$. Indeed,

$$E^Q[M_1 \mid \mathcal{F}_0] = E^Q[M_1] \quad \text{from (EC4)}
= 1 \cdot \tfrac{1}{2} + 1 \cdot \tfrac{1}{14} + 1 \cdot \tfrac{1}{14} + 1 \cdot \tfrac{1}{14} - \tfrac{1}{14} - \tfrac{1}{14} - \tfrac{1}{14} - \tfrac{1}{14}
= \tfrac{6}{14} \ne 0 = M_0.$$

Example VII

Conclusion. For a stochastic process, the property of being a martingale depends on the filtration and on the measure. That is why the notation $(\mathbb{F}, P)$-martingale may sometimes be seen.
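A hedged enumeration of the eight-outcome example above, written in Python for illustration only: it recomputes $E[M_t \mid \mathcal{F}_{t-1}]$ atom by atom and checks that the martingale property holds under $P$ but already fails at time 1 under $Q$. The tuples encode the sign pattern of $(\xi_1, \xi_2, \xi_3)$ read off from the table and the computations above; the helper names are illustrative.

```python
from fractions import Fraction as Fr

# The eight outcomes (xi_1, xi_2, xi_3) and the two measures of the example.
omegas = [( 1, 1, 1), ( 1, 1, -1), ( 1, -1, 1), ( 1, -1, -1),
          (-1, 1, 1), (-1, 1, -1), (-1, -1, 1), (-1, -1, -1)]
P = {w: Fr(1, 8) for w in omegas}
Q = {w: (Fr(1, 2) if w == (1, 1, 1) else Fr(1, 14)) for w in omegas}

def M(t, w):
    return sum(w[:t])                       # M_0 = 0, M_t = xi_1 + ... + xi_t

def cond_exp(measure, t, s):
    """E[M_t | F_s] as a function of omega; F_s is generated by (xi_1, ..., xi_s)."""
    out = {}
    for w in omegas:
        atom = [v for v in omegas if v[:s] == w[:s]]
        p = sum(measure[v] for v in atom)
        out[w] = sum(M(t, v) * measure[v] for v in atom) / p
    return out

# Under P, E[M_t | F_{t-1}] = M_{t-1} for t = 1, 2, 3.
for t in (1, 2, 3):
    ce = cond_exp(P, t, t - 1)
    assert all(ce[w] == M(t - 1, w) for w in omegas)

# Under Q it already fails at time 1: E^Q[M_1] = 6/14, not M_0 = 0.
print(cond_exp(Q, 1, 0)[omegas[0]])         # 3/7
```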

Stopped process: definition

Definition. The stochastic process $X$ and the stopping time $\tau$ are built on the same filtered measurable space $(\Omega, \mathcal{F}, \mathbb{F})$. The stochastic process $X^{\tau}$ defined by

$$X_t^{\tau}(\omega) = X_{t \wedge \tau(\omega)}(\omega) \tag{1}$$

is called a stopped process with stopping time $\tau$.

Stopped process: example

ω     X_0   X_1   X_2   X_3   τ     X_0^τ   X_1^τ   X_2^τ   X_3^τ
ω1    1     1/2   1     1/2   0     1       1       1       1
ω2    1     1/2   1     1/2   3     1       1/2     1       1/2
ω3    1     2     1     1     1     1       2       2       2
ω4    1     2     2     1     1     1       2       2       2
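The stopping operation itself is purely mechanical; the short Python sketch below (not part of the slides) freezes a path at the value it has at time $\tau$. The example path and the stopping value are hypothetical choices made for the illustration.

```python
def stopped_path(path, tau):
    """Path of the stopped process: X^tau_t = X_{min(t, tau)}."""
    return [path[min(t, tau)] for t in range(len(path))]

# Hypothetical four-step path (X_0, X_1, X_2, X_3) stopped at tau = 1:
# from time 1 on, the stopped process is frozen at X_1.
print(stopped_path([1, 2, 1, 1], 1))   # [1, 2, 2, 2]
```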

Stopped martingale I

Theorem. If the martingale $M$ and the stopping time $\tau$ are built on the same filtered probability space $(\Omega, \mathcal{F}, \mathbb{F}, P)$, then the stopped process $M^{\tau}$ is also a martingale on that space.

Stopped martingale II

Proof of the theorem. The key to the proof is to express $M_t^{\tau}$ in terms of the components of the process $M$:

$M_0^{\tau} = M_0$ and $\forall t \in \{1, 2, \ldots\}$,

$$M_t^{\tau} = M_t^{\tau}\Big(\sum_{k=0}^{t-1} \mathbb{I}_{\{\tau = k\}} + \mathbb{I}_{\{\tau \ge t\}}\Big)
= \sum_{k=0}^{t-1} \mathbb{I}_{\{\tau = k\}} M_t^{\tau} + \mathbb{I}_{\{\tau \ge t\}} M_t^{\tau}
= \sum_{k=0}^{t-1} \mathbb{I}_{\{\tau = k\}} M_k + \mathbb{I}_{\{\tau \ge t\}} M_t.$$
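Before continuing with the proof, here is a hedged numerical check (illustration only) of the decomposition just derived, $M_t^{\tau} = \sum_{k=0}^{t-1} \mathbb{I}_{\{\tau = k\}} M_k + \mathbb{I}_{\{\tau \ge t\}} M_t$, on one simulated $\pm 1$ walk with an arbitrary hitting-time rule; the walk, the barrier and the horizon are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

# One simulated path of a symmetric +/-1 walk, with M_0 = 0.
steps = rng.choice([-1, 1], size=20)
M = np.concatenate(([0], np.cumsum(steps)))            # M_0, ..., M_20

# A bounded stopping time: first time the walk hits +3, capped at 20.
hits = np.flatnonzero(M == 3)
tau = int(hits[0]) if hits.size else 20

for t in range(1, len(M)):
    stopped = M[min(t, tau)]                           # M^tau_t
    decomposition = (sum((tau == k) * M[k] for k in range(t))
                     + (tau >= t) * M[t])
    assert stopped == decomposition
print("decomposition verified on this path")
```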

Stopped martingale III

Verifying condition (M1):

$E^P[\,|M_0^{\tau}|\,] = E^P[\,|M_0|\,] < \infty$ and $\forall t \in \{1, 2, \ldots\}$,

$$E^P[\,|M_t^{\tau}|\,] = E^P\Big[\Big|\sum_{k=0}^{t-1} \mathbb{I}_{\{\tau = k\}} M_k + \mathbb{I}_{\{\tau \ge t\}} M_t\Big|\Big]
\le \sum_{k=0}^{t-1} E^P\big[\mathbb{I}_{\{\tau = k\}} |M_k|\big] + E^P\big[\mathbb{I}_{\{\tau \ge t\}} |M_t|\big]
\le \sum_{k=0}^{t-1} E^P[\,|M_k|\,] + E^P[\,|M_t|\,] < \infty$$

since, $M$ being a martingale, we have that $\forall t \in \{0, 1, 2, \ldots\}$, $E^P[\,|M_t|\,] < \infty$.

Stopped martingale IV

Verifying condition (M2):

$M_0^{\tau} = M_0$ is $\mathcal{F}_0$-measurable. (2)

Now, $\forall t \in \{1, 2, \ldots\}$,

$$M_t^{\tau} = \sum_{k=0}^{t-1} \mathbb{I}_{\{\tau = k\}} M_k + \mathbb{I}_{\{\tau \ge t\}} M_t$$

is $\mathcal{F}_t$-measurable: each term of the sum is the product of $\mathbb{I}_{\{\tau = k\}}$, which is $\mathcal{F}_k$-measurable since $\{\tau = k\} \in \mathcal{F}_k$, and $M_k$, which is $\mathcal{F}_k$-measurable since $M$ is adapted, hence it is $\mathcal{F}_t$-measurable because $k < t \Rightarrow \mathcal{F}_k \subseteq \mathcal{F}_t$; the last term is the product of $\mathbb{I}_{\{\tau \ge t\}}$, which is $\mathcal{F}_{t-1}$-measurable since $\{\tau \ge t\} = \{\tau \le t - 1\}^c \in \mathcal{F}_{t-1}$, and $M_t$, which is $\mathcal{F}_t$-measurable since $M$ is adapted.

Stopped martingale V

Verifying condition (M3): $\forall t \in \{1, 2, \ldots\}$,

$$M_t^{\tau} - M_{t-1}^{\tau}
= \Big(\sum_{k=0}^{t-1} \mathbb{I}_{\{\tau = k\}} M_k + \mathbb{I}_{\{\tau \ge t\}} M_t\Big) - \Big(\sum_{k=0}^{t-2} \mathbb{I}_{\{\tau = k\}} M_k + \mathbb{I}_{\{\tau \ge t-1\}} M_{t-1}\Big)
= \mathbb{I}_{\{\tau = t-1\}} M_{t-1} + \mathbb{I}_{\{\tau \ge t\}} M_t - \mathbb{I}_{\{\tau \ge t-1\}} M_{t-1}
= \mathbb{I}_{\{\tau \ge t\}} M_t - \big(\mathbb{I}_{\{\tau \ge t-1\}} - \mathbb{I}_{\{\tau = t-1\}}\big) M_{t-1}
= \mathbb{I}_{\{\tau \ge t\}} M_t - \mathbb{I}_{\{\tau \ge t\}} M_{t-1}
= \mathbb{I}_{\{\tau \ge t\}} (M_t - M_{t-1}),$$

since, $\{\tau = t-1\}$ and $\{\tau \ge t\}$ being disjoint, $\mathbb{I}_{\{\tau = t-1\}} + \mathbb{I}_{\{\tau \ge t\}} = \mathbb{I}_{\{\tau = t-1\} \cup \{\tau \ge t\}} = \mathbb{I}_{\{\tau \ge t-1\}}$.

Stopped martingale VI

As a consequence, since $\mathbb{I}_{\{\tau \ge t\}}$ is $\mathcal{F}_{t-1}$-measurable,

$$E^P[M_t^{\tau} \mid \mathcal{F}_{t-1}] - M_{t-1}^{\tau}
= E^P[M_t^{\tau} - M_{t-1}^{\tau} \mid \mathcal{F}_{t-1}]
= E^P\big[\mathbb{I}_{\{\tau \ge t\}} (M_t - M_{t-1}) \mid \mathcal{F}_{t-1}\big]
= \mathbb{I}_{\{\tau \ge t\}}\, E^P[M_t - M_{t-1} \mid \mathcal{F}_{t-1}]
= \mathbb{I}_{\{\tau \ge t\}}\, \big(E^P[M_t \mid \mathcal{F}_{t-1}] - E^P[M_{t-1} \mid \mathcal{F}_{t-1}]\big)
= \mathbb{I}_{\{\tau \ge t\}}\, (M_{t-1} - M_{t-1}) = 0,$$

hence $E^P[M_t^{\tau} \mid \mathcal{F}_{t-1}] = M_{t-1}^{\tau}$.

Optional Stopping Theorem

Theorem (Optional Stopping Theorem). Let $X = \{X_t : t \in \{0, 1, 2, \ldots\}\}$ be a process built on the filtered probability space $(\Omega, \mathcal{F}, \mathbb{F}, P)$, where $\mathbb{F}$ is the filtration $\{\mathcal{F}_t : t \in \{0, 1, 2, \ldots\}\}$. Let's assume that the stochastic process $X$ is $\mathbb{F}$-adapted and that it is integrable, i.e. $\forall t \in \{0, 1, 2, \ldots\}$, $E^P[\,|X_t|\,] < \infty$. Then $X$ is a martingale if and only if

$$E^P[X_{\tau}] = E^P[X_0]$$

for any bounded stopping time $\tau$, i.e. for any stopping time $\tau$ such that there exists a constant $b$ with $\forall \omega \in \Omega$, $0 \le \tau(\omega) \le b$.
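A hedged Monte Carlo sanity check of the "only if" direction (not part of the original slides): for a symmetric $\pm 1$ random walk, which is a martingale, and the bounded stopping time $\tau = \min(\text{first exit time of } (-a, a),\, b)$, the sample average of $X_{\tau}$ should be close to $E^P[X_0] = 0$. The barrier $a$, the bound $b$, the number of paths and the seed are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, b, a = 200_000, 30, 3          # b bounds the stopping time, a is the barrier

steps = rng.choice([-1, 1], size=(n_paths, b))
X = np.concatenate([np.zeros((n_paths, 1), dtype=int),
                    np.cumsum(steps, axis=1)], axis=1)   # X_0, ..., X_b

# tau = first time |X_t| reaches a, capped at b, so tau is a bounded stopping time.
hit = np.abs(X) >= a
tau = np.where(hit.any(axis=1), hit.argmax(axis=1), b)

X_tau = X[np.arange(n_paths), tau]
print(X_tau.mean())                     # close to E[X_0] = 0
```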

Optional Stopping Theorem I
Proof of the theorem

First part. Let's assume that $X$ is a martingale and let's show that, in such a case, $E^P[X_{\tau}] = E^P[X_0]$ for any bounded stopping time.

Let $\tau$ be any bounded stopping time. Then there exists a constant $b$ such that $\forall \omega \in \Omega$, $0 \le \tau(\omega) \le b$.

Optional Stopping Theorem II
Proof of the theorem

As a consequence,

$$E^P[X_{\tau}] = E^P\Big[\sum_{k=0}^{b} X_k\, \mathbb{I}_{\{\tau = k\}}\Big]
= E^P\Big[\sum_{k=0}^{b} X_k\, \big(\mathbb{I}_{\{\tau \ge k\}} - \mathbb{I}_{\{\tau \ge k+1\}}\big)\Big]
= \sum_{k=0}^{b} E^P\big[X_k\, \mathbb{I}_{\{\tau \ge k\}}\big] - \sum_{k=0}^{b} E^P\big[X_k\, \mathbb{I}_{\{\tau \ge k+1\}}\big]
= E^P[X_0] + \sum_{k=1}^{b} E^P\big[X_k\, \mathbb{I}_{\{\tau \ge k\}}\big] - \sum_{k=0}^{b-1} E^P\big[X_k\, \mathbb{I}_{\{\tau \ge k+1\}}\big]$$

since $\mathbb{I}_{\{\tau \ge 0\}} = \mathbb{I}_{\Omega} = 1$ and $\mathbb{I}_{\{\tau \ge b+1\}} = \mathbb{I}_{\varnothing} = 0$,

$$= E^P[X_0] + \sum_{k=1}^{b} E^P\big[X_k\, \mathbb{I}_{\{\tau \ge k\}}\big] - \sum_{k=1}^{b} E^P\big[X_{k-1}\, \mathbb{I}_{\{\tau \ge k\}}\big]$$

Optional Stopping Theorem III
Proof of the theorem

$$= E^P[X_0] + \sum_{k=1}^{b} E^P\big[(X_k - X_{k-1})\, \mathbb{I}_{\{\tau \ge k\}}\big]
= E^P[X_0] + \sum_{k=1}^{b} E^P\Big[E^P\big[(X_k - X_{k-1})\, \mathbb{I}_{\{\tau \ge k\}} \mid \mathcal{F}_{k-1}\big]\Big] \quad \text{from (EC3)},
= E^P[X_0] + \sum_{k=1}^{b} E^P\Big[\mathbb{I}_{\{\tau \ge k\}}\, E^P[X_k - X_{k-1} \mid \mathcal{F}_{k-1}]\Big] \quad \text{from (EC6)},
= E^P[X_0]$$

since, $X$ being a martingale,

$$E^P[X_k - X_{k-1} \mid \mathcal{F}_{k-1}] = E^P[X_k \mid \mathcal{F}_{k-1}] - E^P[X_{k-1} \mid \mathcal{F}_{k-1}] = X_{k-1} - X_{k-1} = 0.$$

Optional Stopping Theorem IV
Proof of the theorem

Second part. Let's now assume that, for any bounded stopping time $\tau$, $E^P[X_{\tau}] = E^P[X_0]$, and let's show that, in such a case, the adapted and integrable stochastic process $X$ is a martingale.

By hypothesis, $X$ already satisfies conditions (M1) and (M2). The only thing left to verify is that $\forall s, t \in \{0, 1, 2, \ldots\}$ such that $s < t$, $E^P[X_t \mid \mathcal{F}_s] = X_s$.

So, let's fix $s$ and $t \in \{0, 1, 2, \ldots\}$ such that $s < t$. We denote by $\mathcal{P}_s = \big\{A_1^{(s)}, \ldots, A_{n_s}^{(s)}\big\}$ the finite partition generating $\mathcal{F}_s$.

Optional Stopping Theorem V
Proof of the theorem

For any $i \in \{1, \ldots, n_s\}$ we build a random time:

$$S_i(\omega) = \begin{cases} s & \text{if } \omega \in A_i^{(s)} \\ t & \text{if } \omega \notin A_i^{(s)}. \end{cases}$$

$S_i$ is a stopping time (obviously bounded) since $\forall u \in \{0, 1, 2, \ldots\}$,

$$\{\omega \in \Omega : S_i(\omega) = u\} = \begin{cases} A_i^{(s)} \in \mathcal{F}_s & \text{if } u = s \\ \big(A_i^{(s)}\big)^c \in \mathcal{F}_s \subseteq \mathcal{F}_t & \text{if } u = t \\ \varnothing \in \mathcal{F}_0 \subseteq \mathcal{F}_u & \text{otherwise.} \end{cases}$$

Optional Stopping Theorem VI
Proof of the theorem

So, by hypothesis, we have that

$$E^P[X_{S_i}] = E^P[X_0].$$

Besides, since the random time $\tau_t$ defined by $\forall \omega \in \Omega$, $\tau_t(\omega) = t$ is also a bounded stopping time, we have, again by hypothesis, that

$$E^P[X_t] = E^P[X_{\tau_t}] = E^P[X_0],$$

hence $E^P[X_{S_i}] = E^P[X_t]$.

Optional Stopping Theorem VII
Proof of the theorem

As a consequence, $\forall i \in \{1, \ldots, n_s\}$,

$$0 = E^P[X_t] - E^P[X_{S_i}]
= E^P[X_t - X_{S_i}]
= E^P\Big[(X_t - X_{S_i})\, \mathbb{I}_{A_i^{(s)}} + (X_t - X_{S_i})\, \mathbb{I}_{(A_i^{(s)})^c}\Big]
= E^P\Big[(X_t - X_s)\, \mathbb{I}_{A_i^{(s)}} + (X_t - X_t)\, \mathbb{I}_{(A_i^{(s)})^c}\Big]
= E^P\Big[(X_t - X_s)\, \mathbb{I}_{A_i^{(s)}}\Big]
= \sum_{\omega \in A_i^{(s)}} \big(X_t(\omega) - X_s(\omega)\big)\, P(\omega),$$

hence

$$\sum_{\omega \in A_i^{(s)}} X_t(\omega)\, P(\omega) = \sum_{\omega \in A_i^{(s)}} X_s(\omega)\, P(\omega).$$

Optional Stopping Theorem VIII
Proof of the theorem

Now we can conclude the proof, since

$$E^P[X_t \mid \mathcal{F}_s] = \sum_{i=1}^{n_s} \frac{\mathbb{I}_{A_i^{(s)}}}{P\big(A_i^{(s)}\big)} \sum_{\omega \in A_i^{(s)}} X_t(\omega)\, P(\omega)
= \sum_{i=1}^{n_s} \frac{\mathbb{I}_{A_i^{(s)}}}{P\big(A_i^{(s)}\big)} \sum_{\omega \in A_i^{(s)}} X_s(\omega)\, P(\omega)
= E^P[X_s \mid \mathcal{F}_s]
= X_s.$$

Markovian process I
Definition

Definition. A stochastic process $X = \{X_t : t \in \mathcal{T}\}$, where $\mathcal{T}$ is a set of indices*, is said to be Markovian if, for any $t_1 < t_2 < \ldots < t_n \in \mathcal{T}$, the conditional distribution of $X_{t_n}$ given $X_{t_1}, \ldots, X_{t_{n-1}}$ is equal to the conditional distribution of $X_{t_n}$ given $X_{t_{n-1}}$, i.e. for any $x_1, \ldots, x_n \in \mathbb{R}$,

$$P\big[X_{t_n} \le x_n \mid X_{t_1} = x_1, \ldots, X_{t_{n-1}} = x_{n-1}\big] = P\big[X_{t_n} \le x_n \mid X_{t_{n-1}} = x_{n-1}\big].$$

* Examples: $\mathcal{T} = \{0, 1, 2, \ldots\}$; $\mathcal{T} = \{0, 1, 2, \ldots, T\}$ where $T$ is a positive integer; $\mathcal{T} = [0, T]$ where $T$ is a positive real number; $\mathcal{T} = [0, \infty)$; etc.

Markovian process II
Definition

Intuitively, if we interpret $t$ as a time index, the process $X$ is Markovian if its distribution in the future, given the present and the past, depends only on the present.

"A Markov chain is then a memoryless random phenomenon: the distribution of an observation to come, given our present knowledge of the system and its whole history, is the same when only its present state is known." (Jean Vaillancourt)

The set of values that the process may take is called the state space of $X$ and we denote it by $E_X$.

Markovian process I
Example

Example. We throw a die repeatedly.

The random variable $\xi_n$ represents the number of points obtained on the $n$-th throw.

The stochastic process $X$ represents the total cumulative number of points obtained up to any time, i.e. for any natural integer $t$,

$$X_t = \sum_{n=1}^{t} \xi_n.$$

Markovian process II
Example

$X$ is a Markovian process. Indeed,

$$X_t = \sum_{n=1}^{t} \xi_n = \sum_{n=1}^{t-1} \xi_n + \xi_t = X_{t-1} + \xi_t.$$

But the outcome of the $t$-th throw of the die, $\xi_t$, is independent of the results obtained on the first $t-1$ throws, i.e. of $\sigma\{\xi_n : n \in \{1, \ldots, t-1\}\}$. As a consequence, the distribution of $X_t$ depends on the past of the stochastic process, $\sigma\{\xi_n : n \in \{1, \ldots, t-1\}\}$, through $\sigma\{X_{t-1}\}$ only.
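As a hedged illustration of this argument (not from the original slides), the Python sketch below enumerates all outcomes of three throws exactly and verifies that the conditional law of $X_3$ given $(X_1, X_2)$ coincides with the one given $X_2$ alone; the helper names are illustrative.

```python
from itertools import product
from fractions import Fraction as Fr

throws = list(product(range(1, 7), repeat=3))      # (xi_1, xi_2, xi_3), all equally likely

def X(t, w):
    return sum(w[:t])                              # cumulative number of points

def cond_prob(x, conditions):
    """P[X_3 = x | X_t = v for every (t, v) in conditions], by direct enumeration."""
    sel = [w for w in throws if all(X(t, w) == v for t, v in conditions)]
    return Fr(sum(1 for w in sel if X(3, w) == x), len(sel))

# The conditional law of X_3 given (X_1, X_2) matches the one given X_2 alone.
for a, b in {(X(1, w), X(2, w)) for w in throws}:
    for x in range(3, 19):
        assert cond_prob(x, [(1, a), (2, b)]) == cond_prob(x, [(2, b)])
print("Markov property verified by enumeration")
```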

Markovian process
Remark

Question. Why do we need a probability space? Wouldn't a measurable space have been sufficient?

Answer. A probability measure is required to ensure that the independence property is satisfied.

Markovian process
Random walk

The stochastic process $X$, built on the probability space $(\Omega, \mathcal{F}, P)$, is a random walk if it admits the representation

$$X_0 = 0 \quad \text{and} \quad \forall t \in \{1, 2, \ldots\},\; X_t = \sum_{n=1}^{t} \xi_n,$$

where the sequence $\{\xi_t : t \in \{1, 2, \ldots\}\}$ is made up of independent and identically distributed random variables.

Random walks are Markovian processes.
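A minimal simulation sketch of a random walk, added here for illustration only; the choice of standard normal increments, the horizon and the seed are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# X_0 = 0 and X_t = xi_1 + ... + xi_t with i.i.d. increments
# (standard normal increments are an arbitrary illustrative choice).
xi = rng.standard_normal(100)
X = np.concatenate(([0.0], np.cumsum(xi)))
print(X[:5])
```

Since these increments are centred with finite variance, the same partial sums also form a martingale, as in the first example of the lecture.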
