HAL Id: jpa-00212491
https://hal.archives-ouvertes.fr/jpa-00212491
Submitted on 1 Jan 1990
Short Communication

Pattern recognition in Hopfield type networks with a finite range of connections

Eva Koscielny-Bunde (1)

(1) Fachbereich Informatik, Universität Hamburg, D-2000 Hamburg 50, F.R.G.

(Received on June 15, 1990, accepted on June 20, 1990)
Abstract. — We study pattern recognition in linear Hopfield type networks of N neurons where each neuron is connected to the z subsequent neurons, such that the state of the i-th neuron at time t + 1 is determined by the states of neurons i + 1, ..., i + z at time t. We find that for small values of z/N the retrieval behavior differs considerably from the behavior of diluted Hopfield networks. The maximum number of random patterns that can be retrieved increases in a non-linear way with z, and the asymptotic mean overlap between input and output patterns decreases sharply as z is decreased and reaches zero at a finite value of z.

Classification — Physics Abstracts: 87.30 - 75.10H - 64.60

In recent years, Hopfield neural networks [1] have been studied extensively (for recent reviews see e.g. [2-4]). The Hopfield network provides a standard for neural nets since its long-time retrieval behavior could be treated analytically [5] by solving for the equilibrium statistical mechanics of the net. The solution could be achieved since all neurons in the net are interconnected.
In this paper we report on numerical studies of linear Hopfield type nets of (usually N = 400) neurons where each neuron is connected only to a certain number of other neurons, and periodic boundary conditions are employed. Previous work in this direction was mainly devoted to nearest-neighbor and next-nearest-neighbor connections, or to dilute (damaged) networks where a certain fraction of connections was randomly cut [6-10].
The network consists of N neurons, each of which can be in one of two states S_i = ±1. The coupling strength J_ij between pairs of connected neurons is determined by the M random patterns {ξ^μ} = (ξ_1^μ, ..., ξ_N^μ), μ = 1, 2, ..., M, which are to be stored in the network:

    J_ij = Σ_{μ=1}^{M} ξ_i^μ ξ_j^μ.   (1)

The state of neuron i at a given time t + 1 is obtained from the states of the z subsequent neurons at time t by

    S_i(t + 1) = sign( Σ_{j=i+1}^{i+z} J_ij S_j(t) ).   (2)
For z = N - 1, (2) is identical to the retrieval process in Hopfield nets where each neuron receives input signals from all other neurons and sends output signals to all other neurons. For z < N, each neuron i receives input signals only from the z subsequent neurons i + 1, ..., i + z. The set of neurons influenced by neuron i and the set of neurons influencing neuron i partially overlap for z > N/2. For z < N/2, the neurons in both sets are all different.

A particular pattern {ξ^μ} is called stored by the network if an initial configuration {S(0)} = (S_1(0), ..., S_N(0)) near {ξ^μ} develops under the dynamic rule (2) to a state whose overlap with {ξ^μ},

    q^μ = (1/N) Σ_{i=1}^{N} ξ_i^μ S_i,   (3)

is large.
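As a concrete illustration of the coupling rule (1) and the dynamic rule (2), the following sketch (not the author's original code; the parameter values and the unnormalized form of J are illustrative choices) builds the couplings for M random patterns and performs one parallel update with periodic boundary conditions:

```python
# Illustrative sketch (not the paper's code) of the finite-range
# Hopfield-type net: Hebbian couplings and an update rule in which
# neuron i sees only the z subsequent neurons i+1, ..., i+z,
# with periodic boundary conditions.
import numpy as np

rng = np.random.default_rng(0)
N, z, M = 400, 200, 20                        # example values; here M/z = 0.1

patterns = rng.choice([-1, 1], size=(M, N))   # random patterns xi^mu = +-1

# Hebb couplings J_ij = sum_mu xi_i^mu xi_j^mu (a positive prefactor
# would not change the sign dynamics below).
J = patterns.T @ patterns

def update(S):
    """One parallel step: S_i(t+1) = sign(sum_{j=i+1}^{i+z} J_ij S_j(t))."""
    S_new = np.empty_like(S)
    for i in range(N):
        js = np.arange(i + 1, i + z + 1) % N  # periodic boundary conditions
        S_new[i] = 1 if J[i, js] @ S[js] >= 0 else -1
    return S_new

def overlap(S, xi):
    """Overlap q = (1/N) sum_i xi_i S_i between network state and pattern."""
    return float(S @ xi) / N
```

Starting from a stored pattern (pure input, p_n = 0) and iterating `update`, the overlap with that pattern stays close to one as long as M/z is well below the capacity.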
The maximum number of patterns M_c that can be stored depends on the size of the network, the difference between input and output patterns (initial noise), and the number of connections z per neuron. According to Amit et al. [5], we have in the limit of large Hopfield networks (z = N - 1 → ∞)

    M_c ≈ 0.14 N.   (4)

In this work, we study how equation (4) is modified for z < N - 1, and how the retrieval rate decays in the non-retrieval regime.
In the numerical study, first M random patterns {ξ^μ} are generated and the connection strengths J_ij are calculated. Then one of the nominated patterns is chosen and a noisy input pattern is generated where the state of each neuron is changed with probability p_n. The dynamic rule (2) is applied and the time-dependent overlap q^μ(t) is calculated. The run stops when q^μ(t) stays constant for six time steps or oscillations of period 2 with constant amplitude occur. This procedure is repeated for typically 101 initial configurations, which are chosen from different nominated patterns. By averaging the results we obtained, for a given number M of patterns and fixed initial noise p_n, the mean retrieval overlap <q> as a function of z.

Figure 1 shows the dependence of <q> on z/N, for pure input patterns (p_n = 0) and networks with N = 120, 200 and 400 neurons. In the different networks, the ratio M/N has been kept constant at 0.075. One can distinguish between 3 regimes: (1) the retrieval regime for large z, where M/z < 0.14 and <q> is close to one; (2) an intermediate transient regime where <q> is well below one but greater than zero; (3) a disordered regime below some threshold value z_c, where <q> = 0 and the net behaves as if all connections were cut; z_c seems to increase slightly with increasing size of the network. The width of the transient region (2) shrinks if the size of the network is increased.

Figure
2 shows, again for pure input patterns, the variation of <q> with the number of nominated patterns M, for N = 400 fixed and three values of z: z = 399, 240, and 80; z = 399 corresponds to the conventional Hopfield net. The result is plotted versus M/z rather than M/N. For M/z not too large, the data approximately collapse onto a single line. At intermediate values of M/z, the data split into three curves. For z = N - 1, a strong decrease in the mean overlap is followed by some smooth behavior where <q> varies only little with M/z and seems to reach a plateau > 0 at large values of M/z (see also [5, 11]). For z = 240, again the curve drops sharply and then decreases less strongly with increasing M/z. For z = 80, finally, the mean overlap shows a sharp decrease with M/z and reaches zero at a finite value of M/z, roughly at M/z ≈ 0.3. We have compared this behavior with the case where the connections between neurons have been randomly destroyed [6, 10]. In this case, z can be referred to as the mean number of connections per neuron, which is related to the concentration p of damaged bonds by z = (1 - p)N. In figure 2, the results for such diluted networks with z = 80 and z = 240, i.e. p = 0.8 and p = 0.4, respectively, are included for comparison.

Fig. 1. — Plot of the asymptotic mean overlap as a function of z/N for N = 120, 200, and 400. The number of stored patterns per neuron is kept fixed, M/N = 0.075, and p_n = 0.

Fig. 2. — Plot of the asymptotic mean overlap as a function of the number of nominated patterns per connection, M/z, for N = 400 and p_n = 0. Different symbols represent different z-values: z = 80, 240, and 399. The results for the damaged networks are indicated by the filled symbols (corresponding to z = 80 and z = 240, i.e. p = 0.8 and 0.4, respectively).
The curves differ strongly from the corresponding results for z/N = 0.6 and 0.2. In the diluted network, the mean retrieval overlap does not scale with M/z and varies comparatively weakly with M/z.
Figure 3, finally, shows the maximum number of patterns per connecting bond, M_c/z, that can be stored, as a function of z/N, for different values of the initial noise p_n. For pure input patterns (p_n = 0) and z/N > 1/4, M_c/z is approximately the same as for the Hopfield net, which also agrees with the situation in diluted Hopfield nets [10]. Below z/N ≈ 1/4, M_c(z)/z decreases almost linearly with z/N, with the exception M_c/z = 1 for M = 1. Almost the same curve is obtained for p_n = 0.2, except for M = 1 and at large values of z/N, where M_c/z is considerably smaller than for pure input patterns. For very noisy input patterns (p_n = 0.4), the curve again is rather flat and, except at small values of z/N, depends rather weakly on z. It is interesting to note that in all cases M_c/z does not show any pronounced features around the "critical" value z/N = 1/2, where the sets of input and output neurons start to have no overlap.

The behavior shown in figure 3 differs clearly from the behavior in diluted networks, where M_c/z is described by a simple crossover behavior. Below some crossover value z_r(p_n), which decreases with increasing noise level p_n, M_c/z is roughly constant and follows (5), while above z_r, M_c/z bends down [10]. In particular for large noise, M_c/z decreases as 1/z for z approaching one, which is very different from the behavior observed here.
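The run control of the numerical procedure described above (flip each input state with probability p_n; stop when q(t) stays constant for six time steps or settles into a period-2 oscillation with constant amplitude) can be sketched as follows. This is an illustrative sketch, not the author's code; `step` and `overlap_fn` stand for any update rule and overlap measure:

```python
# Sketch (illustrative) of the run control used in the numerical study:
# generate a noisy input, then iterate until the overlap is constant for
# six time steps or oscillates with period 2 and constant amplitude.
import numpy as np

def noisy_input(pattern, p_n, rng):
    """Flip each state of `pattern` independently with probability p_n."""
    flip = rng.random(pattern.size) < p_n
    return np.where(flip, -pattern, pattern)

def run_until_settled(S, step, overlap_fn, max_steps=1000):
    """Iterate S -> step(S); return the final state and asymptotic overlap."""
    q = [overlap_fn(S)]
    for _ in range(max_steps):
        S = step(S)
        q.append(overlap_fn(S))
        if len(q) >= 6:
            last = q[-6:]
            if all(v == last[0] for v in last):
                break                      # constant for six time steps
            if (last[0] == last[2] == last[4] and
                    last[1] == last[3] == last[5] and
                    last[0] != last[1]):
                break                      # period-2 oscillation
    return S, q[-1]
```

In the study, this loop is repeated over many initial configurations drawn from different nominated patterns, and the final overlaps are averaged to obtain <q>.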
Fig. 3. — Maximum fraction M_c/z of learned random patterns in a network of 400 neurons that are recognized with less than 0.5 percent error, as a function of the number of connections per neuron z, for the initial noise p_n = 0, 0.2, and 0.4 (different symbols).
It is a pleasure for me to thank Greg Kohring and Dietrich Stauffer for valuable discussions and critical reading of the manuscript.
References

[1] HOPFIELD J.J., Proc. Natl. Acad. Sci. USA 79 (1982) 2554.
[2] ECKMILLER R. and VON DER MALSBURG Ch., Eds., Neural Computers (Springer Verlag, Berlin, Heidelberg) 1988; SOMPOLINSKY H., Physics Today 41 (1988) 70.
[3] DOMANY E., J. Stat. Phys. 51 (1988) 743.
[4] AMIT D.J., Modelling brain function: The world of attractor neural networks (Cambridge University Press).
[5] AMIT D.J., GUTFREUND H. and SOMPOLINSKY H., Ann. Phys. 173 (1987) 30.
[6] DERRIDA B., GARDNER E. and ZIPPELIUS A., Europhys. Lett. 4 (1987) 167; CANNING A. and GARDNER E., J. Phys. A 21 (1988) 3275.
[7] KÜRTEN K.E., J. Phys. France 50 (1989) 2313, and in press.
[8] FORREST B.M., J. Phys. A 21 (1988) 245 and J. Phys. France 50 (1989) 2003.
[9] NOEST A., Phys. Rev. Lett. 63 (1989) 1739.
[10] CLARK J.W., RAFELSKI J. and WINSTEN J.V., Phys. Rep. 123 (1985) 205; KOSCIELNY-BUNDE E., J. Stat. Phys. 58 (1990) 1257 and in: Parallel Processing in Neural Systems and Computers, R. Eckmiller, G.