HAL Id: jpa-00211113
https://hal.archives-ouvertes.fr/jpa-00211113
Submitted on 1 Jan 1989

HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.
Entropy ultrametric for dynamical and disordered systems

Sergio Caracciolo and Luigi A. Radicati

Scuola Normale Superiore, Piazza dei Cavalieri, Pisa 56100, Italy

(Received 6 April 1989, accepted 7 June 1989)
Abstract. - The purpose of this paper is to show: 1) that the set of all histories of a dynamical system at any given time has an ultrametric structure; 2) that for a dynamical system (X, F, m, T) a distance D_m on the set of histories can be defined in terms of the von Neumann-Shannon entropy; 3) that a one-parameter set of ultrametric distances D_m^(α) can be defined by using, instead of the von Neumann-Shannon entropy, the information measures of degree α for α ∈ [0, 1]; 4) that the parameter Y used in the theory of disordered systems is a function of the information measure of degree 2; 5) that a similar construction holds also for disordered systems where ultrametricity is satisfied only in a probabilistic sense.

Classification
Physics Abstracts: 02.50 - 05.20

1. Introduction.
Ultrametric spaces, long known to mathematicians (1), have recently found applications in physics in connection with Parisi's solution of the Sherrington-Kirkpatrick model for spin glasses [2]. A recent review on the importance of ultrametricity for physics and other sciences is given in [3].

The purpose of this paper is to show that a similar ultrametric structure can be defined on the set of all histories of a general dynamical system at a given time. Such a set is a partition of the phase space that becomes more and more refined as time goes on. Therefore the entropy of the partition increases with time, and we shall show that it is possible to define an ultrametric distance between histories from the entropy distance between partitions. Such a definition is quite natural because the complexity of a set of histories is naturally related to its entropy.

(1) For a recent discussion of ultrametricity in mathematics see for example [1].
The paper is organized as follows. In section 2 we shall define a distance between the elements of a partition, a distance that will turn out to be ultrametric. In section 3 we will introduce a new ultrametric distance between two histories at a given time that depends upon the entropy produced during the evolution from the common ancestor (to be defined) of the two histories. Thus this distance depends upon the choice of the probability measure. In section 4 we shall generalize the distance defined in section 3 by considering, instead of the usual entropy, the set of α-entropies [4] for α ∈ [0, 1]. In section 5 we shall show that in the case of disordered systems the analogue of the time evolution is the resolution process of clusters of ergodic states into their components. The order parameter of these systems is related to the 2-entropy, thus showing its true connection with the notion of complexity. Finally we shall give the relation between the geometric distance on phase space and the probabilistic distance provided by the von Neumann-Shannon entropy.
2. The space of histories and its ultrametric structure.

Let us consider a dynamical system (X, F, m, T) where (X, F, m) is a probability space and T is a measurable automorphism of (X, F). Here X denotes the Cartesian product of the phase spaces of a physical system at times t = 1, 2, ...; F is a σ-algebra of measurable subsets (events) of X, and m a probability measure (a state) not necessarily preserved by T. An experiment with a finite number of outcomes performed at time t = 1 on (X, F, m) is represented by a measurable partition y = {A_0, ..., A_{k-1}} of X. The same experiment performed at t = n (n an integer) is represented by the partition T^{-(n-1)} y = {T^{-(n-1)} A_0, ..., T^{-(n-1)} A_{k-1}}, and the sequence of experiments repeated n times from t = 0 to t = n is represented by the partition

    y^(n) = y ∨ T^{-1} y ∨ ... ∨ T^{-(n-1)} y.

{y^(j)}_{j=1}^∞ is a sequence of increasingly refined partitions, i.e. y^(j) > y^(j-1), indexed by time. The number of elements of y^(n) is k^n. Each element of y^(n),

    A(i_1, ..., i_n) = A(i_1) ∩ T^{-1} A(i_2) ∩ ... ∩ T^{-(n-1)} A(i_n),   (2.4)

with i_j ∈ {0, ..., k-1} for any j, represents a sequence of outcomes of the experiments y, T^{-1} y, ..., T^{-(n-1)} y. We call A(i_1, ..., i_n) an n-history, or history of length n, originating at time t = 1 from A(i_1). Similarly, T^{-j} A(i_1, ..., i_n) is an n-history originating at time t = j + 1 from A(i_1). Equation (2.4) shows that each n-history can be regarded as the intersection of an ℓ-history (ℓ ≤ n) starting at time t = 1 with an (n - ℓ)-history starting at t = ℓ + 1. We shall call A(i_1, ..., i_ℓ) the common ℓ-prehistory of all the k^{n-ℓ} histories that branch off from it. The (n - ℓ)-histories are the elements of T^{-ℓ} y^(n-ℓ), so that:

    A(i_1, ..., i_n) = A(i_1, ..., i_ℓ) ∩ T^{-ℓ} A(i_{ℓ+1}, ..., i_n).

Thus the collection of all the histories can be arranged in a tree, indexed by time, as shown in figure 1. Each point represents a history; those that belong to the same partition occupy the same horizontal line. Each point of y^(n) is connected downwards to a single point of y^(n-1) and upwards to k points of y^(n+1) that represent the k possible extensions of a history.
Fig. 1. - A tree of histories corresponding to a partition with k = 2. The common closest ancestor of E and G, i.e. P(E, G), belongs to the partition y^(2) and has a prehistory P(E, F), which belongs to y and is the common closest ancestor of E and F.
It is easy to see that the collection of all points on a horizontal line, i.e. the elements of a partition, y^(n) say, is an ultrametric set. Indeed, let us consider two events E and F of y^(n), and let y^(ℓ), with ℓ ≤ n, be the most refined partition in the collection {y^(j)}_{j=1}^∞ such that one of its elements, denoted by P(E, F), contains both E and F. We call P(E, F) the common closest ancestor of E and F. One can easily verify that the non-negative number

    D(E, F) = n - ℓ   (2.6)

is an acceptable definition of distance between E and F. In addition to the properties of a distance, D satisfies the more restrictive ultrametric condition

    D(E, F) ≤ max (D(E, G), D(G, F)),   (2.7)

where E, F, G are generic elements of y^(n). The maximum distance between two elements of y^(n) is n, which obtains when they have no common prehistory.

The distance n - ℓ is the same for all the n-histories E, F ∈ y^(n) with common closest ancestor P(E, F) ∈ y^(ℓ). Hence, summing over the ancestors,

    Σ_{G ∈ y^(ℓ)} k^{-ℓ} D(E, F) = n - ℓ,   (2.8)

where E, F are any pair in y^(n) such that P(E, F) = G; D(E, F) is the same for all such pairs. Thus the sum is the time difference between the lengths of the histories and that of their prehistory.
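The dynamical distance of this section is easy to experiment with numerically. The sketch below (our own notation, not from the paper) encodes n-histories over a k-letter outcome alphabet as tuples, recovers the common closest ancestor as the longest common prefix, and checks that D(E, F) = n - ℓ satisfies the ultrametric condition (2.7).

```python
from itertools import product

def prehistory_length(e, f):
    """Length l of the longest common prefix: P(E, F) belongs to y^(l)."""
    l = 0
    for a, b in zip(e, f):
        if a != b:
            break
        l += 1
    return l

def D(e, f):
    """Dynamical distance D(E, F) = n - l between two n-histories."""
    assert len(e) == len(f)
    return len(e) - prehistory_length(e, f)

# all k^n histories of length n = 3 for a binary generating partition (k = 2)
histories = list(product(range(2), repeat=3))

# the ultrametric condition D(E, F) <= max(D(E, G), D(G, F)) holds for all triples
assert all(D(e, f) <= max(D(e, g), D(g, f))
           for e in histories for f in histories for g in histories)

# two histories with no common prehistory realize the maximum distance n
print(D((0, 0, 0), (1, 1, 1)))  # -> 3
```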
3. Entropy ultrametric.

The distance (2.6) is purely dynamical and does not take into account the probabilistic nature of the histories, which depends on their probabilities and on that of their common closest ancestor. We shall remove from the tree histories with zero measure and, in this way, we include the possibility of obtaining irregular trees.

We begin by noticing that the distance between two partitions y^(n) and y^(ℓ), with ℓ ≤ n, can be measured either by their time separation n - ℓ or by their entropy (information) distance, which is given by (2):

    d_m(y^(n), y^(ℓ)) = H(y^(n), m) - H(y^(ℓ), m),   (3.1)

where H(y^(j), m) is the entropy of the partition y^(j) with respect to the probability measure m. If we keep ℓ fixed, d_m(y^(n), y^(ℓ)) increases with n, not linearly as n - ℓ, but in a way that depends upon the rate of information produced by the mapping T on the partition y.

Let E and F be two histories of y^(n) with common closest ancestor P(E, F) ∈ y^(ℓ). We can define their distance by

    D_m(E, F) = H(y^(n), m_P),   (3.2)

where for any event G

    m_P(G) = m(G ∩ P)/m(P)

(m_P is the measure conditioned by P). Obviously only those elements of y^(n) that belong to the subtree arising from P(E, F) contribute to H(y^(n), m_P). Since the entropy of the partition y^(ℓ) in the measure m_P, P ∈ y^(ℓ), vanishes, we can write the distance (3.2) in the form

    D_m(E, F) = d_{m_P}(y^(n), y^(ℓ)),

where d_{m_P} is defined in (3.1). The distance D_m(E, F) represents the uncertainty about which element of y^(n) the transformation T generates from the element P during the time interval Δt = n - ℓ. In particular, if all the histories arising from P (i.e. all the n-histories having a common ℓ-prehistory) have the same probability, 1/ν say, then

    D_m(E, F) = log ν.

The distance increases with the number of descendants of P. We shall now prove the following:
Proposition 1. Let y^(n) = y ∨ T^{-1} y ∨ ... ∨ T^{-(n-1)} y be a finite partition of the dynamical system (X, F, m, T). The distance D_m defined by (3.2) between elements of y^(n) with non-zero probability is ultrametric.

Proof: D_m(E, F) is obviously symmetric in E and F and non-negative. It vanishes if and only if E = F = P(E, F), because E and F have non-zero measure. Let E, F and G be three histories in the partition y^(n), and suppose that P(E, G), the common closest ancestor of E and G, be contained in P(E, F). Then P(G, F) = P(E, F). If we set L = P(E, F)\P(E, G), then the probability measure m_{P(E, F)} can be decomposed as the convex combination of the conditional measures m_{P(E, G)} and m_L. Owing to the concavity of the entropy, it follows that D_m(E, G) ≤ max (D_m(E, F), D_m(G, F)). If instead P(E, G) contains P(E, F), we obtain the analogous bound by a similar argument. From these two inequalities the ultrametric inequality follows for an arbitrary triple, which proves the statement.
If μ is the uniform measure, i.e. μ(G) = k^{-j} for any history G ∈ y^(j), then

    D_μ(E, F) = D(E, F) log k,

where D(E, F) is defined in (2.6). The sum over G ∈ y^(ℓ) of the distances D_m(E, F), where E, F is any pair in y^(n) with ancestor G = P(E, F), weighted with m(G), is

    Σ_{G ∈ y^(ℓ)} m(G) H(y^(n), m_G) = d_m(y^(n), y^(ℓ)).

This relation for the sum of the entropy distances D_m generalises equation (2.8).
Remarks:

(i) the above results are valid also for partitions y with a countable infinity of elements, provided that for each n the entropy of the partition y^(n) be finite, because this condition ensures that all the distances are well defined. In this case the number of branches at each event in the tree is no longer limited;

(ii) the maximum distance between two histories of y^(n) is H(y^(n), m), which is the mean information necessary to reconstruct the tree. The difference

    s(n) = H(y^(n+1), m) - H(y^(n), m)   (3.14)

is the slope of the tree's silhouette at time t = n;

(iii) if m is preserved by T, the limit of s(n) for n → ∞ exists (it is called in Ref. [9] the tree's silhouette) and coincides with the rate of entropy production [5-7] by T on y, defined by

    h_m(T, y) = lim_{n→∞} (1/n) H(y^(n), m).

If m is not preserved by T, neither the existence of the rate of entropy production nor, a fortiori, that of the tree's silhouette can be proved.
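Remark (iii) can be illustrated with a measure preserved by T, for instance a Bernoulli measure, where H(y^(n), m) is additive in n. The toy check below (our notation, not the authors' code) shows that the silhouette slope s(n) = H(y^(n+1), m) - H(y^(n), m) is then constant and equal to the entropy of the generating partition, which is the rate of entropy production.

```python
from itertools import product
from math import log

p = (0.7, 0.3)  # outcome probabilities of the generating partition y (k = 2)

def prob(h):
    """Product (Bernoulli) probability of the n-history h."""
    r = 1.0
    for i in h:
        r *= p[i]
    return r

def H(n):
    """Shannon entropy H(y^(n), m) of the n-times refined partition."""
    return -sum(prob(h) * log(prob(h))
                for h in product(range(len(p)), repeat=n))

h_rate = -sum(q * log(q) for q in p)  # entropy of y itself

# the silhouette slope s(n) = H(y^(n+1), m) - H(y^(n), m) is constant and
# equal to the rate of entropy production of the shift on y
for n in range(1, 5):
    assert abs((H(n + 1) - H(n)) - h_rate) < 1e-9
```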
4. α-distances and α-overlaps.

In the definition (3.2) of the ultrametric distance D_m(E, F) we have used the von Neumann-Shannon entropy of the partition y^(n) to which E and F belong, relative to the measure m, namely H(y^(n), m). As is well known, H(y^(n), m) is the limit for α → 1 of the information measures of degree α, defined by

    H_α(y^(n), m) = (1 - α)^{-1} log Σ_i m(A_i)^α,

where the sum runs over the elements A_i of y^(n) (see for example [4]; the case α = 2 was first considered in [10]). For any positive real α, H_α is a positive function whose value increases with the refinement of the partition, and which takes values in the interval [0, n log k]. It can be verified that for α ∈ [0, 1], H_α is a concave function on the space of probability measures m. Therefore, by the same argument used in the proof of Proposition 1, we can prove:

Proposition 2. Let y^(n) = y ∨ T^{-1} y ∨ ... ∨ T^{-(n-1)} y be a finite partition of the dynamical system (X, F, m, T). The distance D_m^(α) between the elements of y^(n) with non-zero probability, defined by

    D_m^(α)(E, F) = H_α(y^(n), m_P),

is ultrametric for every choice of α in the interval [0, 1].

We shall call D_m^(α) an α-distance. To each ultrametric distance we can associate an α-overlap by the relation

    Q^(α)(E, F) = exp (- D_m^(α)(E, F)).

Clearly Q^(α)(E, E) = 1. Moreover, from the ultrametricity of D_m^(α) it follows that

    Q^(α)(E, F) ≥ min (Q^(α)(E, G), Q^(α)(G, F)).

Since the maximum α-distance between two histories of y^(n) is H_α(y^(n), m), it follows that the minimum value of the α-overlap on the partition y^(n) is

    Y_n^(α) = exp (- H_α(y^(n), m)).   (4.6)

In terms of Y_n^(α), the α-silhouette slope at time t = n, defined (see (3.14)) as the difference of the α-entropies of y^(n+1) and y^(n), reads

    s_α(n) = log (Y_n^(α) / Y_{n+1}^(α)).

The limit, if it exists, of s_α(n) for n → ∞ is the α-silhouette of the tree (the usual proof of the convergence of s_1(n) fails when α ≠ 1, because α-entropies are not in general subadditive). If m is a measure for which T is a Bernoulli shift we have H_α(y^(n), m) = n H_α(y, m). In this case the α-silhouette is H_α(y, m).

The quantity Y_n^(α) is defined by (4.6) for any positive value of α, also for α > 1 when no entropic distance is defined. As H_α increases with refinement (and therefore with time), Y_n^(α) decreases with n, i.e.

    Y_{n+1}^(α) ≤ Y_n^(α).   (4.11)

In the case α = 2, Y_n^(2) represents the probability that two histories at time t > n have a common n-prehistory.
5. Disordered systems.

The ultrametric structure of the partition y^(n) depends upon the existence of a sequence of partitions {y^(i)}, with i = 0, ..., n, ..., such that y^(i) is more refined than y^(i-1). In the theory of disordered systems there also appears an ultrametric structure [11], which however has its origin in the metric properties of the phase space and which can only be defined in a probabilistic sense.

Consider a probability space (X, F, m) where the phase space X is assumed to be equipped with a metric g(x, y), x, y ∈ X, and where m is an equilibrium measure, that is a convex combination of ergodic measures [12] p_i:

    m = Σ_i a_i p_i,   a_i ≥ 0,   Σ_i a_i = 1.

Let E^i be the non-empty subset invariant for the ergodic state p_i, i.e. such that p_i(E^i) = 1. Then for any ergodic state p_j with j ≠ i we have p_j(E^i) = 0, because for i ≠ j the measures p_i and p_j are mutually singular. Therefore m(E^i) = a_i. If Z is the subspace of X with zero measure for all the equilibrium states m, the set η = {E^i}_{i ∈ I} is a partition of the space X\Z of equilibrium configurations.

The partition is non-trivial only when there is more than one ergodic state, that is when the system undergoes a phase transition. In the mean field solution of spin glasses there is an infinity of ergodic states not related to each other by a symmetry (this is a consequence of what is called frustration [13]). The entropy of such a partition has been considered previously by Palmer [14].

From the metric g(x, y) in X we can define the Hausdorff distance between the elements of η:

    d(E^i, E^j) = max ( sup_{x ∈ E^i} inf_{y ∈ E^j} g(x, y), sup_{y ∈ E^j} inf_{x ∈ E^i} g(x, y) ).
Let {d_0 > d_1 > ... > d_n > ... > 0} be a sequence of different decreasing distances between the elements of η, and let us consider a sequence of partitions {X\Z, ∅} = η_0 ≺ η_1 ≺ ... ≺ η_n ≺ ..., ordered according to refinement, whose elements are subsets of X\Z with decreasing diameters, respectively d_0, d_1, ..., d_n, .... In reference [2] the elements E_n are referred to as clusters.

The sequence {η_n} is not uniquely determined by the sequence {d_n}, as several subsets E_n with the same diameter can in general exist. Moreover, it should be remarked that the distance d(E^i, E^j) between two elements E^i and E^j lying in different subsets of η_n may well be smaller than d_n. For example, consider four points at the corners of a square. If d is equal to the side of the square, there are two ways of partitioning the set of the four corners into sets of diameter d: uniting the top and the bottom pairs, or uniting the left and the right pairs. Clearly in both cases there are pairs of points belonging to different subsets which are at distance d (see Fig. 2).

Fig. 2. - The two ways in which the corners of a square can be united to form subsets with a diameter equal to the side of the square.
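The square example can be made concrete with a few lines of code (ours, using the Euclidean distance as the metric g): both groupings of the four corners into pairs yield subsets of diameter equal to the side, and in each grouping some points in different subsets are also at that distance.

```python
from math import dist

corners = {"a": (0, 0), "b": (1, 0), "c": (1, 1), "d": (0, 1)}

def diameter(S):
    """Largest pairwise Euclidean distance within the subset S of corners."""
    return max(dist(corners[p], corners[q]) for p in S for q in S)

bottom_top = [{"a", "b"}, {"c", "d"}]   # unite the bottom and the top pairs
left_right = [{"a", "d"}, {"b", "c"}]   # unite the left and the right pairs

# both partitions have all their subsets of diameter d = 1 (the square's side)
assert all(diameter(S) == 1.0 for S in bottom_top + left_right)

# yet pairs split by one partition are themselves at distance d, so the
# sequence of partitions is not uniquely fixed by the diameters alone
assert dist(corners["a"], corners["d"]) == 1.0   # split by bottom_top
assert dist(corners["a"], corners["b"]) == 1.0   # split by left_right
```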
The situation is much simpler when η is ultrametric for the distances d_ij. In this case it is easy to verify that the sequence {η_n} is determined uniquely. Furthermore, ultrametricity implies that the distance between two elements E^i and E^j lying in different subsets of η_n is always greater than d_n and does not depend upon i, j. In this case we have that if η is ultrametric, each η_n is also ultrametric.

Remark that, if E_{n+1} ⊂ E_n and E'_{n+1} ⊂ E'_n, we get for the entropic distance that we introduced in section 3 a distance determined by the subtree below the common ancestor P of E_{n+1} and E'_{n+1}; that is, we find the silhouette of the subtree which originates from P.
from P.Instead of
ordering
thepartitions {’Tln}
according
todecreasing
diameters it iscustomary
in thetheory
of disorderedsystems
to order themaccording
toincreasing
maximumoverlaps
of the subsets within their elements :0 = qo -- q, « - - - ...: -- qn - - « ...
qMax. The exact relation between distances andoverlaps
is notimportant.
It isenough
to assume that theoverlap
qij
= q(Ei , Ej)
between two elementsEi
andEi
ouf 17
is adecreasing
function of their distance. The distance and theoverlap
considered in this section are of a puregeometrical
nature incontrast with that discussed in section 3
(and
generalized
in Sect. 4 for a e[0,1))
whichdepends
upon theprobability
measure. To establish the relation between these two distanceswe shall
introduce,
following
Parisi,
theprobability
distribution of theoverlaps
qij
in thepartition q :
P
(q )
is a measure of thecomplexity
of thepartition q
which Parisi has called orderparameter
for the
glassy
transition[15].
A measure of the departure from ultrametricity of the partition η_n is given by the quantity Δ_n of equation (5.15). Indeed, we can prove the following:

Proposition 3. Δ_n is non-negative, and it vanishes if η is ultrametric. Conversely, if Δ_n vanishes for all choices of q_n, then the elements of η with non-vanishing probability form an ultrametric set.

Proof: In the expansion of Δ_n the first term arises from the case r = s. If η is ultrametric, the sum vanishes because elements E^i, E^j with overlaps greater than q_n are contained in the same element of η_n. This concludes the first part of the proposition.

To prove the second statement, suppose that the elements of η with non-vanishing probability are not an ultrametric set. Then there exist at least three elements E^i, E^j, E^k with non-vanishing probability for which q_ij > min(q_ik, q_kj) = q_jk. Choose q_n = q_ij. Then the three elements E^i, E^j, E^k cannot be contained in the same element of η_n, because this would have a diameter larger than d_n. But if E^j is not contained in the same element of η_n that contains E^i or E^k, there is a positive contribution to the r.h.s. of (5.15). Q.E.D.
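The content of Proposition 3 can be probed numerically. The sketch below (our code; it tests the bare ultrametricity inequality for overlaps, leaving out the weighting by the coefficients a_i that enters Δ_n) checks q_ij ≥ min(q_ik, q_kj) over all triples of a small overlap matrix.

```python
def is_ultrametric(q):
    """q: symmetric overlap matrix with maximal diagonal entries."""
    n = len(q)
    return all(q[i][j] >= min(q[i][k], q[k][j])
               for i in range(n) for j in range(n) for k in range(n))

# overlaps generated from an ultrametric tree: two clusters of two states each
q_tree = [[1.0, 0.8, 0.3, 0.3],
          [0.8, 1.0, 0.3, 0.3],
          [0.3, 0.3, 1.0, 0.6],
          [0.3, 0.3, 0.6, 1.0]]
assert is_ultrametric(q_tree)

# perturbing one cross-cluster overlap breaks the inequality
q_bad = [row[:] for row in q_tree]
q_bad[0][2] = q_bad[2][0] = 0.1
assert not is_ultrametric(q_bad)
```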
In the theory of spin glasses, ultrametricity occurs only after the average over disorder. Indeed, disorder is, together with frustration, a necessary ingredient of the theory of spin glasses, and it is taken into account by a probability distribution of the couplings between spins. In our description, disorder is instead represented by a probability distribution for the coefficients a_i. Usually the average over the equilibrium measures is denoted by a bar (3), and one can consider the averaged versions of the previous relations. The statement of ultrametricity in the mean field theory of spin glasses is expressed by the weaker relation (5.16), which means that the probability of finding three elements of η_n with non-zero measure that do not satisfy the ultrametricity inequality is zero.

Specifically, let f_n(W) be the probability that an event of the partition η_n has probability W; W f_n(W) is then the probability for the weight W, and as a consequence of (5.16) the averaged overlap distribution can be expressed in terms of f_n. Similarly, the average of the von Neumann-Shannon entropy is given by an integral over W f_n(W).

For example, a Poisson distribution of exponential weights [16], which is typical of the random energy model [17], will give rise to

    f_n(W) = W^{-1-x_n} (1 - W)^{x_n - 1} / (Γ(x_n) Γ(1 - x_n)),

with x_n a parameter characterizing the partition. Such a distribution also applies to spin glasses [11, 18] and can be interpreted as a distribution of random free energies. It follows that

    Ȳ_n^(2) = 1 - x_n,

where, as a consequence of equation (4.11), 0 ≤ x_{n-1} ≤ x_n ≤ 1. The average value of the von Neumann-Shannon entropy is instead expressed through a function of x_n which is increasing in the interval (0, 1) and diverges to minus infinity when its argument approaches zero. All the entropy distances we have introduced are well defined when q_n < q_Max, or equivalently x_n < 1. Equations (5.18), (5.21) and (5.22) provide the connection between the order parameter and the average value of the complexity of the system at the scale q_n.

Acknowledgment.
at the scale qn.Acknowledgment.
One of us
(L.
A.R.)
wishes to thank thehospitality
at the Institute des HautesÉtudes
Scientifiques,
Bures-sur-Yvette,
wherepart
of this work has been done.References
[1] KOBLITZ N., p-adic numbers, p-adic analysis and zeta functions (Springer-Verlag, New York) 1977.
[2] MÉZARD M., PARISI G. and VIRASORO M. A., Spin glass theory and beyond (World Scientific, Singapore) 1988.
[4] RÉNYI A., Wahrscheinlichkeitsrechnung (VEB Deutscher Verlag der Wissenschaften, Berlin) 1962.