RENDICONTI
del
SEMINARIO MATEMATICO
della
UNIVERSITÀ DI PADOVA

CHARLES A. GREENHALL, Signal and noise in nonlinear devices
Rendiconti del Seminario Matematico della Università di Padova, tome 42 (1969), p. 1-14.
<http://www.numdam.org/item?id=RSMUP_1969__42__1_0>
SIGNAL AND NOISE IN NONLINEAR DEVICES

CHARLES A. GREENHALL *)

ABSTRACT - Independent signal and noise are presented to the input of a nonlinear device, and we ask how the output time function is to be decomposed into an output signal and output noise. On the basis of two requirements on this decomposition we determine that the signal output is just the conditional expectation of the output with respect to the original signal. This makes the output signal and noise uncorrelated. The decomposition is invariant to linear filtering. The signal output of a zero-memory device is given by another zero-memory device acting on the input signal. The output signal of a bandpass nonlinearity is written down in terms of an integral. For the bandpass hard limiter in Gaussian noise this gives the output signal amplitude very quickly in terms of Bessel functions. The autocorrelation of the output noise of a zero-memory device in Gaussian noise is derived. Another possible definition of signal output is investigated and rejected.
1. Introduction.
When independent signal and noise are components of the input to a nonlinear device such as a detector or bandpass limiter, these two components are inextricably mixed in the output. Shutterly [11] (and see also Campbell [6]) expresses the output of a zero-memory device as a sum of products of signal and noise functions. However, the concept of « output signal-to-noise ratio » is applied to nonlinear

*) This work was done during the tenure of a NASA Research Associateship at Jet Propulsion Laboratory, Pasadena, California.
Author's address: University of Southern California, University Park, Los Angeles, California 90007.
devices, which suggests that the output might be decomposed into two uncorrelated components, to be called output signal and output noise. Davenport [7] made such a decomposition for the hard limiter, but in the autocorrelation or spectral domain. The purpose of this paper is to make the same decomposition in the time domain on the basis of general and simple requirements on such a decomposition. This would be of help in tracing the progress of signal and noise through a receiver containing nonlinear devices. We will define the output signal and noise for a very broad class of black-box devices, and then specialize the result to zero-memory devices and bandpass zero-memory devices, including as a special case the bandpass limiter.

2. Definition of Output Signal and Noise.

Suppose the « nonlinear device » has two time-varying real inputs s(t) and n(t) (−∞ < t < ∞), and one output y(t), where the signal s and noise n are sample functions of independent real stationary processes. The sample functions s and n are points of independent sample spaces Ωs and Ωn which have probability measures Ps and Pn. Assume that these processes have finite variance, E(s²(t)) < ∞, E(n²(t)) < ∞ for all t. For convenience we will take E(n(t)) = 0.
The output y(t) (−∞ < t < ∞) of the device is determined by the input functions s and n. Therefore we can consider y to be a sample function of a random process Y(t, s, n) on the product probability space Ωs × Ωn with the product probability Ps · Pn. We require that the device be time-invariant; in other words, if the inputs are s(t − δ) and n(t − δ), then the output is y(t − δ). This guarantees that Y is a stationary process. We also ask that E(y²(t)) < ∞.
Our aim is to determine what is meant by the « signal » and « noise » portions of y(t). In Davenport's treatment of the bandpass limiter [7] [8] the autocorrelation or spectrum of the output was calculated. These broke up naturally into a part due to signal only, and a part due to noise and intermodulation between signal and noise. What we want is a decomposition of the output in the time domain, into output signal and noise portions,

(2.1)  y(t) = sy(t) + ny(t).

We will place the following conditions on the decomposition (2.1):

i) For each t (−∞ < t < ∞), sy(t) is a random variable of finite variance on the original signal sample space Ωs. The sy(t) form a stationary process.
ii) For each t, the random variable ny(t) = y(t) − sy(t) is uncorrelated with all random variables in the space S = L2(Ωs, Ps) of signal random variables of finite variance, i.e., with all random variables f(s) on Ωs such that E(f²) < ∞. Thus

(2.2)  E[ny(t) f(s)] = E[ny(t)] E[f(s)]

for all f in S. It is enough to require (2.2) for all f of the form f(s) = F(s(t1), ..., s(tk)), where t1, ..., tk are distinct times and F is a function of k variables such that E[f²(s)] < ∞.
It is too much to require that ny be independent of the original signal process. Thus ny will in general depend both on input signal and input noise.

Conditions (i) and (ii) imply that

(2.3)  E[sy(t') ny(t)] = E[sy(t')] E[ny(t)]  for all t, t',

since for fixed t', sy(t') is a signal random variable which can replace f in (2.2). Hence the power spectrum of y is, except possibly for a dc component, the sum of the power spectra
of sy and ny.

To see how far conditions (i) and (ii) determine sy and ny, we write (2.2) as

(2.4)  E[(ny(t) − E(ny)) f] = 0

for all f in S. Then write y(t) as

y(t) = [sy(t) + E(ny)] + [ny(t) − E(ny)].

Since E(ny) is just a constant, it belongs to S. Therefore sy(t) + E(ny) is in S by (i), and ny(t) − E(ny) is orthogonal to S by (2.4). Thus for fixed t, sy(t) + E(ny) is the projection p(t) = P(t, s) of the random variable y(t) onto S, the « signal space ». Furthermore, the random variables

(2.5)  sy(t) = p(t) − c,  ny(t) = y(t) − p(t) + c,

where c is any constant, satisfy (i) and (ii). (It may be shown that the random variables p(t) form a stationary process.) Thus these conditions determine the output signal and noise within a constant.

The
projection p(t) has another significance. We know that p(t) is an integrable random variable on Ωs satisfying E(p(t) f) = E(y(t) f) for all bounded measurable f on Ωs. This implies that p(t) is just the conditional expectation of the random variable y(t) with respect to all the random variables s(t') (−∞ < t' < ∞): p(t) = E(y(t) | s(t'), all t'). Since (Ωs, Ps) and (Ωn, Pn) are independent probability spaces, it may easily be verified that this projection or conditional expectation may be written

(2.6)  p(t) = ∫Ωn Y(t, s, n) dPn(n).

We now set c = 0 in (2.5) and adopt as our definitions of output signal and noise

(2.7)  sy(t) = p(t),  ny(t) = y(t) − p(t).

Then
E(sy(t)) = E(y(t)), E(ny(t)) = 0. Equation (2.6) shows that the signal portion of the output at time t is obtained by fixing the input signal and averaging the output at time t over all possible noise inputs belonging to the noise sample space Ωn. (For causal devices the output is determined by the past s(t'), n(t') (t' ≤ t), but this does not falsify our statements.)

The definition (2.7) may also be of use for non-stationary input signals. The conditions (i) and (ii), with stationarity removed, yield that sy(t) = p(t) + c(t), where c(t) is an arbitrary deterministic function of time. Some further condition (like c = constant) is needed to define sy well enough.
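The decomposition can be sketched numerically. The square-law device y = (s + n)² below is a hypothetical example, chosen only because its conditional expectation E[y | s] = s² + E(n²) is known in closed form; the check that ny is uncorrelated with signal random variables f(s) is condition (ii).

```python
# Monte Carlo sketch of the decomposition (2.6)-(2.7) for a hypothetical
# square-law device y = (s + n)^2 with independent Gaussian signal and noise.
import numpy as np

rng = np.random.default_rng(0)
M = 1_000_000
s = rng.normal(0.0, 1.0, M)        # input signal samples
n = rng.normal(0.0, 0.5, M)        # independent input noise, E(n^2) = 0.25
y = (s + n) ** 2                   # device output

sy = s ** 2 + 0.25                 # (2.6): average y over the noise, s held fixed
ny = y - sy                        # (2.7): output noise

# Condition (ii): ny is uncorrelated with every signal random variable f(s).
for f in (s, s ** 2, np.sin(s)):
    assert abs(np.corrcoef(ny, f)[0, 1]) < 0.02
print("E(ny) =", round(ny.mean(), 4))
```

Note that ny = 2sn + n² − E(n²) here depends on both the input signal and the input noise, exactly as remarked above.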
3. Linear Filters and Zero-Memory Devices.

The conditional expectation in (2.7) is of course in no convenient form for calculation, being an average over a whole function space. However, in the special case of a zero-memory device followed by a linear filter (acting on the sum of the input signal and noise), the conditional expectation reduces to ordinary integrals over real variables.

a) Linear Filter. Let

(3.1)  y(t) = (Hx)(t) = ∫ h(τ) x(t − τ) dτ,

where x = s + n and the impulse response h satisfies

(3.2)  ∫ |h(τ)| dτ < ∞.

The condition (3.2) on h ensures that with probability one the integral (3.1) exists for all t and that E(y²) < ∞, given that E(x²) < ∞. We will show that

(3.3)  sy(t) = (Hs)(t),  ny(t) = (Hn)(t),

where sy and ny are defined by (2.7). The condition (3.2) and E(s²) < ∞ ensure that (Hs)(t) is a random variable on Ωs with finite variance, so all we have to do is verify the projection property

(3.4)  E[(Hs)(t) f] = E[(Hx)(t) f]

for all random variables f on Ωs of finite variance. Thus

(3.5)  E[(Hx)(t) f] = ∫ h(τ) E[(s(t − τ) + n(t − τ)) f] dτ = ∫ h(τ) E[s(t − τ) f] dτ = E[(Hs)(t) f],

since n is independent of f and E(n) = 0; which is (3.4).
With the definition (2.7), then, a linear filter does not mix the signal and noise.

This property extends further. Suppose we follow the general non-linear device of section 2 by a linear filter H. Thus the output of the composite device is z(t) = (Hy)(t) = (Hsy)(t) + (Hny)(t). If we replace s and n in (3.5) by sy and ny, and note that E(f ny(t)) = 0 by (ii) and E(ny) = 0, we see that the analog of (3.4),

E[(Hsy)(t) f] = E[(Hy)(t) f],

holds, and hence

(3.6)  sHy(t) = (Hsy)(t),  nHy(t) = (Hny)(t).

We emphasize that sHy(t) is the projection of (Hy)(t) onto the original signal space. The result (3.6) says that decomposition (2.7) is not affected by passage through a linear filter. The signal and noise outputs of a device followed by a linear filter are obtained by letting the filter act on the signal and noise outputs of the device. In particular, this will allow us to treat bandpass
non-linearities.

b) Zero-Memory Device. Here we let

y(t) = F(s(t), n(t)),

where F, the characteristic of the device, is a real-valued function of two variables such that E[F²(s(t), n(t))] < ∞.

For fixed t we are dealing with only the two random variables s(t) and n(t). Hence (2.6) and (2.7) become

(3.7)  sy(t) = G(s(t)),  G(s) = ∫ F(s, n) p(n) dn,

where p is the probability density of n(t). As far as the signal is concerned, the device acts like another device of characteristic G, which of course depends on the noise density p(n). We will call G the signal characteristic of the device. In the next section this will be illustrated by the hard limiter. Blachman [1] [2] has considered this idea that the signal output is a zero-memory function of the signal input, obtained by averaging the output over the noise.

Now suppose the
input signal and noise to this device F are narrow-band about a center frequency ω0. Let the signal have the representation s(t) = V(t) sin (ω0t + θ(t)), V(t) ≥ 0, where V sin θ and V cos θ are narrow-band about zero frequency with bandwidths small compared with ω0. Also, assume that the random variables V(t) and θ(t), t fixed, are independent, that θ(t) is uniformly distributed in [0, 2π], and that the distribution of V(t) is independent of t. Then

(3.8)  sy(t) = G(V(t) sin (ω0t + θ(t))).

Hence with probability one, we can expand G(V(t) sin Φ) in a Fourier series on 0 ≤ Φ ≤ 2π:

G(V(t) sin Φ) = Σk ck(V(t)) e^{ikΦ},

convergent in L2(0, 2π), where the Fourier coefficients ck are given by

(3.9)  ck(V) = (1/2π) ∫_0^{2π} G(V sin Φ) e^{−ikΦ} dΦ.

The same change of variables as was used in (3.8) will give that

(3.10)  sy(t) = Σk ck(V(t)) e^{ik(ω0t + θ(t))}.

For fixed t this converges in the mean. The terms in (3.10) are the signal components in the narrow frequency zones about each ±kω0 (k = 0, 1, 2, ...).
We can find a (nonrealizable) filter H satisfying (3.2) whose complex transfer function is 1 in the kth zone and 0 in all other zones (by making the transfer function smooth enough). If such a filter passes the kth harmonic unchanged and annihilates all the others, then by (3.6) the signal output of the device consisting of the zero-memory device followed by H is the kth term of (3.10).

In the first zone, k = 1, there are in-phase and quadrature components sharing the original phase modulation θ(t). The amplitude modulation is distorted by zero-memory characteristics Im ck and Re ck.

For zero-memory devices of form
F(s(t) + n(t)), Blachman [1] [3] and Doyle [9] obtain the output signal amplitude in the first zone by averaging over the noise the component of the total bandpass output in phase with the input signal. The resulting integral may be transformed into (3.9) (k = 1) under the condition that n(t) = A(t) sin ψ(t), where A(t) and ψ(t) are independent random variables and ψ(t) is uniformly distributed.
4. Bandpass Limiter.

An example of a zero-memory device is the ideal limiter,

(4.1)  F(s, n) = sgn (s + n).

Henceforth the input noise n(t) will be a stationary Gaussian process, with E(n) = 0, E(n²) = σ². We easily calculate from (3.7) that the signal at the output of the hard limiter is

(4.2)  sy(t) = G(s(t)),  G(s) = erf (s/(σ√2)).

The signal characteristic G is a smooth limiter. For more general noise densities, G has the shape of the distribution function of n(t).
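This signal characteristic can be checked by direct averaging over the noise, as prescribed by (3.7); the closed form erf(s/(σ√2)) follows because G(s) = P(n > −s) − P(n < −s) for Gaussian n.

```python
# Numerical sketch of (4.2): averaging the hard-limiter output over the
# Gaussian noise, as in (3.7), reproduces G(s) = erf(s/(sigma*sqrt(2))).
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)
sigma = 0.8
n = rng.normal(0.0, sigma, 500_000)      # Gaussian noise, E(n) = 0, E(n^2) = sigma^2

for s in (-1.0, 0.3, 2.0):               # a few fixed signal amplitudes
    g_mc = np.sign(s + n).mean()         # G(s) as a noise average
    g_cf = erf(s / (sigma * sqrt(2.0)))  # closed form
    assert abs(g_mc - g_cf) < 0.01
print("hard-limiter signal characteristic matches erf")
```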
For large signal-to-noise ratios the signal characteristic G is itself like a hard limiter. For small signal-to-noise ratios, G is almost linear, i.e.

G(s) ≈ (2/π)^{1/2} s/σ.

Consider now the case of narrow-band signal and noise inputs. The harmonic expansion (3.10) of sy(t) can be written

(4.3)  sy(t) = Σk bk(V(t)) sin k(ω0t + θ(t)),  bk(V) = (1/π) ∫_0^{2π} G(V sin Φ) sin kΦ dΦ;

if k is even then bk = 0. Tausworthe [12] and Blachman [2] obtained the expression (4.3) for the signal amplitude in the kth zone. We can avoid the usual hypergeometric functions and express (4.3) directly in terms of Bessel functions. Integrate (4.3) by parts to obtain

(4.4)  bk(V) = (√2 V/(kσ√π)) e^{−V²/4σ²} [I_{(k−1)/2}(V²/4σ²) + I_{(k+1)/2}(V²/4σ²)]  (k odd),

where the Im are modified Bessel functions of the first kind.

The signal power in the kth zone is (1/2) E[bk²(V(t))]. From this we can compute the signal-to-noise ratio in this zone, since the total power there is 8/(πk)², and the signal and noise portions are uncorrelated by our assumption (ii) on the signal-noise decomposition.
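The first-zone case can be verified numerically: the Fourier integral of (4.3), with G(s) = erf(s/(σ√2)) from (4.2), is compared against the Bessel-function form b1(V) = √(2/π)(V/σ) e^{−x}[I0(x) + I1(x)], x = V²/4σ². This closed form is the standard one and is restated here as an assumption, since the displayed equation (4.4) is taken from the derivation above.

```python
# Check the first-zone amplitude (k = 1) of the bandpass hard limiter:
# Fourier integral of (4.3) with G = erf vs. the Bessel-function form.
import numpy as np
from scipy.special import erf, iv
from scipy.integrate import quad

sigma = 1.0
for V in (0.1, 1.0, 3.0):
    integrand = lambda p: erf(V * np.sin(p) / (sigma * np.sqrt(2.0))) * np.sin(p)
    b1_int = quad(integrand, 0.0, 2.0 * np.pi)[0] / np.pi   # (4.3), k = 1
    x = V**2 / (4.0 * sigma**2)
    b1_bes = np.sqrt(2.0 / np.pi) * (V / sigma) * np.exp(-x) * (iv(0, x) + iv(1, x))
    assert abs(b1_int - b1_bes) < 1e-7
print("first-zone amplitude agrees with the Bessel form")
```

For small V the formula reduces to √(2/π) V/σ, matching the small-signal slope of G noted above.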
5. Noise and Intermodulation Output.

In the case of a zero-memory device in Gaussian noise with mean 0, variance σ², we will expand the noise portion ny(t) of the output in a Hermite series. This is a convenient form for computing the output noise autocorrelation and spectrum [5] [13]. We write

(5.1)  y(t) = F(s(t), n(t)) = Σ_{r=0}^∞ (1/r!) a_r(s(t)) H_r(n(t)/σ),

where

(5.2)  a_r(s) = E[F(s, n) H_r(n/σ)],

and the expansion in (5.1) converges in the mean. (The situation is analogous to (3.9).) But the r = 0 term in (5.1) is just the signal portion sy(t), since H0 = 1. Thus

(5.3)  ny(t) = Σ_{r=1}^∞ (1/r!) a_r(s(t)) H_r(n(t)/σ).

Because

E[H_r(n(t)/σ) H_m(n(t+τ)/σ)] = δ_{rm} r! ρ^r(τ)

(Mehler's formula), the autocorrelation of ny is

E[ny(t) ny(t+τ)] = Σ_{r=1}^∞ (ρ^r(τ)/r!) E[a_r(s(t)) a_r(s(t+τ))],

where ρ(τ) = σ^{−2} E[n(t) n(t+τ)].
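The Mehler-formula consequence invoked here can be checked numerically: for jointly Gaussian variables with unit variances and correlation ρ, the Hermite cross-moments are diagonal with value r! ρ^r.

```python
# Numerical check of E[Hr(X) Hm(Y)] = delta_rm * r! * rho^r for jointly
# Gaussian X, Y with unit variances and correlation rho (probabilists'
# Hermite polynomials Hr, H0 = 1).
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from math import factorial

rng = np.random.default_rng(2)
rho, M = 0.6, 2_000_000
x = rng.normal(size=M)
y = rho * x + np.sqrt(1.0 - rho**2) * rng.normal(size=M)   # corr(x, y) = rho

def H(r, z):
    c = np.zeros(r + 1); c[r] = 1.0
    return hermeval(z, c)          # probabilists' Hermite polynomial He_r

for r in (1, 2):
    for m in (1, 2):
        est = np.mean(H(r, x) * H(m, y))
        exact = factorial(r) * rho**r if r == m else 0.0
        assert abs(est - exact) < 0.05
print("Hermite cross-moments match delta_rm * r! * rho^r")
```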
Let s(t) have the stationary narrow-band form s(t) = V(t) sin Φ(t), Φ(t) = ω0t + θ(t) + α, where α is a uniformly distributed constant phase. Write

(5.4)  a_r(s(t)) = Σ_k a_{rk}(V(t)) e^{ikΦ(t)}.

Let V1 = V(t), V2 = V(t+τ), Φ1 = Φ(t), Φ2 = Φ(t+τ), etc. It can be shown that the terms in (5.4) are uncorrelated in the sense that

E[a_{rk}(V1) e^{ikΦ1} a_{rm}*(V2) e^{−imΦ2}] = 0  (k ≠ m).

Hence

(5.5)  E[ny(t) ny(t+τ)] = Σ_{r=1}^∞ Σ_k (ρ^r(τ)/r!) E[a_{rk}(V1) a_{rk}*(V2) e^{ik(Φ1−Φ2)}].

This displays the autocorrelation of ny as a series of intermodulation terms in the usual way.
Let us calculate the coefficients a_{rk} more explicitly for the hard limiter, F(s, n) = sgn (s + n). For convenience let E(n²) = 1. Then

a_r(s) = E[sgn (s + n) H_r(n)] = 2 φ(s) H_{r−1}(−s)  (r ≥ 1),  φ(s) = (2π)^{−1/2} e^{−s²/2},

where s is to be replaced by s/σ. Then

a_{rk}(V) = (1/2π) ∫_0^{2π} a_r(V sin Φ) e^{−ikΦ} dΦ.

This procedure was used by Tikhonov and Amiantov [13], and of course other expressions for the a_{rk} exist [4] [7] [8] [10] [11] [13]. We display the expansions (5.3) and (5.5) to show the connection between the existing theory and our decomposition.
6. Investigation of Another Definition of Signal Output.

An alternative definition of signal output might have been the corresponding « wide sense » conditional expectation w(t), the projection of y(t) onto the subspace generated by linear combinations of the random variables s(t') (or t' ≤ t). Then y(t1) − w(t1) is orthogonal to w(t2), just as y(t1) − p(t1) is orthogonal to p(t2), so this gives a decomposition in the spectral domain as before. In general, for a zero-memory device, w(t) is not given by a zero-memory transformation of the signal, even in the absence of noise. A sufficient condition that w(t) be a zero-memory function of s(t), and thus w(t) = const · s(t), is that

(6.1)  E[s(t') | s(t)] = f(t', t) s(t)

for all t, t', where f(t', t) is not a random variable. We will proceed to prove this. We can evaluate f by taking the expectation of (6.1) multiplied by s(t):

f(t', t) = E[s(t) s(t')] / E(s²) = ρ(t' − t),

the normalized autocorrelation. If G is a function such that E[G²(s(t))] < ∞, then

E[G(s(t)) s(t')] = ρ(t' − t) E[s(t) G(s(t))].
It should be noted that (6.1) is weaker than Barrett and Lampard's condition [14], being merely a geometric condition on the joint distribution of s(t), s(t').
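The point of this section can be sketched with a hypothetical cubic device y = s³ and no input noise: the wide-sense projection is w(t) = A s(t) with A = E[s y]/E(s²) = 3 for unit-variance Gaussian s, and the residual, which is pure harmonic distortion, is classified as « noise » by this definition.

```python
# Wide-sense projection w(t) = A*s(t) for a hypothetical cubic device
# y = s^3 with Gaussian signal and no input noise; the residual is
# uncorrelated with s yet carries substantial power.
import numpy as np

rng = np.random.default_rng(3)
s = rng.normal(size=1_000_000)     # Gaussian signal, E(s) = 0, E(s^2) = 1
y = s**3                           # zero-memory device output, no noise

A = np.mean(y * s) / np.mean(s * s)   # least-squares (wide-sense) gain
resid = y - A * s                     # orthogonal to the linear signal space

assert abs(A - 3.0) < 0.05                        # E(s^4)/E(s^2) = 3
assert abs(np.corrcoef(resid, s)[0, 1]) < 0.01    # uncorrelated with s
print("residual power with zero input noise:", round(np.mean(resid**2), 2))
```

The residual is essentially the third Hermite term of y, with power E[(s³ − 3s)²] = 6, even though no noise entered the device.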
Given a zero-memory device y(t) = F(s(t), n(t)) and the noise process, let G be the signal characteristic, defined by (3.7). For all t, t',

E[y(t) s(t')] = E[G(s(t)) s(t')] = A E[s(t) s(t')],  where A = E[s(t) G(s(t))] / E(s²).

This demonstrates that A s(t) is the projection w(t) of y(t) onto the linear signal space. We see that under the condition (6.1), which is satisfied by Gaussian and sine-wave signals, the « linear » definition w(t) of the signal output does not include the distortion (higher harmonics). These parts are classified as « noise » even in the case of no input noise. For this reason we would not want to consider w(t) the entire signal
output.

REFERENCES
[1] N. M. BLACHMAN: The Output Signal-to-Noise Ratio of a Power-Law Device, J. Appl. Phys., 24 (1953), 783-5.
[2] N. M. BLACHMAN: The Effect of a Limiter Upon Signals in the Presence of Noise, IRE Trans. Info. Thy., IT-6 (1960), 52.
[3] N. M. BLACHMAN: Band-pass Nonlinearities, IEEE Trans. Info. Thy., IT-10 (1964), 162-4.
[4] G. BONNET: Nonlinear, Zero-Memory Transformations of Random Signals (in French), Annales de Télécomm., 19 (1964), 203-20.
[5] L. L. CAMPBELL: On the Use of Hermite Expansions in Noise Problems, SIAM J., 5 (1967), 244-9.
[6] L. L. CAMPBELL: On a Class of Polynomials Useful in Probability Calculations, IEEE Trans. Info. Thy., IT-10 (1964), 255-6.
[7] W. B. DAVENPORT, JR.: Signal-to-Noise Ratios in Band-Pass Limiters, J. Appl. Phys., 29 (1955), 720-7.
[8] W. B. DAVENPORT, JR. and W. L. ROOT: Random Signals and Noise, McGraw-Hill, 1958.
[9] W. DOYLE: Elementary Derivation for Band-Pass Limiter Signal-to-Noise Ratio, IRE Trans. Info. Thy., IT-8 (1962), 259.
[10] W. J. HURD: Correlation Function of Quantized Sine-Wave Plus Gaussian Noise, IEEE Trans. Info. Thy., IT-13 (1967), 65-8.
[11] H. B. SHUTTERLY: General Results in the Mathematical Theory of Random Signals and Noise in Nonlinear Devices, IEEE Trans. Info. Thy., IT-9 (1963), 74-84.
[12] R. C. TAUSWORTHE: An Integral Formula for Limiter Suppression Factor, Jet Propulsion Laboratory Space Programs Summary, No. 37-35, Vol. IV (1963), p. 307-9.
[13] V. I. TIKHONOV and I. N. AMIANTOV: The Effect of Signal and Noise on Nonlinear Elements (the direct method), (1956), p. 187-203 in P. I. KUZNETSOV, R. L. STRATONOVICH, and V. I. TIKHONOV: Nonlinear Transformations of Stochastic Processes, Pergamon (1965).
[14] J. F. BARRETT and D. G. LAMPARD: An Expansion for Some Second-order Probability Distributions and its Applications to Noise Problems, IRE Trans. on Info. Thy., IT-1, No. 10 (1965).
[15] C. A. GREENHALL: Signal and Noise in Nonlinear Devices, Jet Propulsion Laboratory Space Programs Summary 37-44, Vol. IV (April 1967), p. 303-7.
Manuscript received by the editorial office on February 8, 1968.