
This is an author-deposited version published in:

http://oatao.univ-toulouse.fr/

Eprints ID: 9579

To link to this article: DOI: 10.1109/LSP.2013.2281914

URL: http://dx.doi.org/10.1109/LSP.2013.2281914

To cite this version: Besson, Olivier and Abramovich, Yuri. On the Fisher information matrix for multivariate elliptically contoured distributions. (2013) IEEE Signal Processing Letters, vol. 20 (n° 11), pp. 1130-1133. ISSN 1070-9908

Open Archive Toulouse Archive Ouverte (OATAO)

OATAO is an open access repository that collects the work of Toulouse researchers and

makes it freely available over the web where possible.

Any correspondence concerning this service should be sent to the repository

administrator: staff-oatao@inp-toulouse.fr


On the Fisher Information Matrix for Multivariate

Elliptically Contoured Distributions

Olivier Besson, Senior Member, IEEE, and Yuri I. Abramovich, Fellow, IEEE

Abstract—The Slepian-Bangs formula provides a very

convenient way to compute the Fisher information matrix (FIM) for Gaussian distributed data. The aim of this letter is to extend it to a larger family of distributions, namely elliptically contoured (EC) distributions. More precisely, we derive a closed-form expression of the FIM in this case. This new expression involves the usual term of the Gaussian FIM plus some corrective factors that depend only on the expectations of some functions of the so-called modular variate. Hence, for most distributions in the EC family, obtaining the FIM from its Gaussian counterpart requires only slight additional derivations. We show that the new formula reduces to the Slepian-Bangs formula in the Gaussian case, and we provide an illustrative example with Student distributions of how it can be used.

Index Terms—Cramér-Rao bound, elliptically contoured

distributions, Fisher information matrix.

I. INTRODUCTION

THE CRAMÉR-RAO BOUND (CRB) provides a lower bound on the variance of any unbiased estimator and is thus the ubiquitous reference against which the performance of a given estimator is compared [1]. The CRB is usually computed as the inverse of the Fisher information matrix (FIM), whose entries involve the derivatives of the log-likelihood function of the observation matrix $\mathbf{X} = [\mathbf{x}_{1}, \ldots, \mathbf{x}_{N}]$, where $\mathbf{x}_{t}$ stands for the $t$-th snapshot. When the latter are independent and identically distributed (i.i.d.) vectors drawn from a complex Gaussian distribution, i.e., when $\mathbf{x}_{t} \sim \mathcal{CN}\big(\boldsymbol{\mu}(\boldsymbol{\theta}), \mathbf{R}(\boldsymbol{\theta})\big)$, where $\boldsymbol{\theta}$ denotes the set of unknown real-valued parameters that describe the distribution, the Slepian-Bangs formula [2], [3] provides a general expression for the FIM as

$$\begin{aligned} \mathbf{F}_{ij}(\boldsymbol{\theta}) &= -\mathrm{E}\left\{ \frac{\partial^{2} \ln p(\mathbf{X};\boldsymbol{\theta})}{\partial\theta_{i}\,\partial\theta_{j}} \right\} \\ &= N \operatorname{Tr}\left\{ \mathbf{R}^{-1} \frac{\partial\mathbf{R}}{\partial\theta_{i}} \mathbf{R}^{-1} \frac{\partial\mathbf{R}}{\partial\theta_{j}} \right\} + 2N \operatorname{Re}\left\{ \frac{\partial\boldsymbol{\mu}^{H}}{\partial\theta_{i}} \mathbf{R}^{-1} \frac{\partial\boldsymbol{\mu}}{\partial\theta_{j}} \right\} \end{aligned} \qquad (1)$$

where, for the sake of simplicity, we have omitted the dependence of $\boldsymbol{\mu}$ and $\mathbf{R}$ on $\boldsymbol{\theta}$ in the second line of (1). The convenience of this formula has been thoroughly exploited for a myriad of statistical data models, at least under the Gaussian framework. However, in many applications, non-Gaussianity of the data has been evidenced and hence, for each non-Gaussian distribution, the FIM must be specifically computed from the first line of (1). In this letter, we provide an extension of the Slepian-Bangs formula to a very general class of distributions, namely multivariate elliptically contoured (EC) distributions.
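To fix ideas, the following minimal sketch evaluates the Gaussian Slepian-Bangs formula (1) for a small hypothetical model: a single complex exponential with unknown amplitude and frequency in white noise of known power. The model, parameter values and variable names are illustrative assumptions rather than material from the letter.

```python
import numpy as np

# Hypothetical toy model (not from the letter): M = 3 sensors,
# mu(theta) = alpha * [1, e^{j w}, e^{j 2w}]^T with theta = (alpha, w),
# and a known covariance R = sigma2 * I. Since R does not depend on theta,
# only the second (mean) term of the Slepian-Bangs formula (1) contributes.
N, sigma2 = 100, 2.0          # number of snapshots, noise power (assumed values)
alpha, w = 1.5, 0.3           # assumed true parameter values
m = np.arange(3)

dmu_dalpha = np.exp(1j * w * m)                 # d mu / d alpha
dmu_dw = alpha * 1j * m * np.exp(1j * w * m)    # d mu / d w
D = np.column_stack([dmu_dalpha, dmu_dw])       # M x 2 matrix of mean derivatives

R_inv = np.eye(3) / sigma2
F = 2 * N * np.real(D.conj().T @ R_inv @ D)     # Gaussian FIM, cf. (1)
crb = np.diag(np.linalg.inv(F))                 # CRB on (alpha, w)
print(F, crb)
```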

II. A BRIEF REVIEW OF EC DISTRIBUTIONS

Multivariate elliptically contoured distributions [4], [5] constitute a large family of distributions which have been used in a variety of applications, including array processing. In this section, we briefly summarize their definitions and properties so as to provide the necessary background for the derivation of the FIM in the next sections. A very detailed presentation of EC distributions can be found in the book [4]. We would also like to point to the recent paper [6], where a very comprehensive review of complex elliptically symmetric distributions is given. A vector $\mathbf{x} \in \mathbb{C}^{M}$ follows an EC distribution if it admits the following stochastic representation

$$\mathbf{x} \overset{d}{=} \boldsymbol{\mu} + \sqrt{\mathcal{Q}}\, \mathbf{A}\mathbf{u} \qquad (2)$$

where the non-negative real random variable $\mathcal{Q}$, called the modular variate, is independent of the complex random vector $\mathbf{u}$, which possesses a uniform distribution on the complex sphere $\{\mathbf{u} \in \mathbb{C}^{M} : \mathbf{u}^{H}\mathbf{u} = 1\}$. In (2), $\overset{d}{=}$ means "has the same distribution as". The full-rank matrix $\mathbf{A}$ is such that $\mathbf{A}\mathbf{A}^{H} = \mathbf{R}$, where $\mathbf{R}$ is the so-called scatter matrix. In this paper, we consider the absolutely continuous case where $\mathbf{R}$ is non-singular. In such a case, the probability density function (p.d.f.) of $\mathbf{x}$ can be defined and is given by

$$p(\mathbf{x}) \propto |\mathbf{R}|^{-1}\, g\!\left( (\mathbf{x}-\boldsymbol{\mu})^{H} \mathbf{R}^{-1} (\mathbf{x}-\boldsymbol{\mu}) \right) \qquad (3)$$

for some function $g(\cdot)$, called the density generator, that satisfies the finite moment condition $\int_{0}^{\infty} t^{M-1} g(t)\, dt < \infty$. In (3), $\propto$ stands for "proportional to". The density generator is related to the p.d.f. of the modular variate by

$$p(\mathcal{Q}) \propto \mathcal{Q}^{M-1} g(\mathcal{Q}). \qquad (4)$$

The complex Gaussian distribution is obtained for the particular choice $g(t) = e^{-t}$. While there is essentially a unique way to define an elliptically contoured distribution for a vector, several possibilities exist when it comes to extending it to the matrix-variate case $\mathbf{X} = [\mathbf{x}_{1}, \ldots, \mathbf{x}_{N}]$. We focus on the two main matrix-variate distributions encountered in the array processing literature, namely [4], [5]:

1) the multivariate elliptical distributions, where essentially all snapshots are i.i.d. and drawn from (3). We refer to this type of distribution as EMS.

2) the vector elliptical distributions, where the vectorized data matrix $\mathrm{vec}(\mathbf{X})$ follows a (vector) EC distribution, i.e., with mean $\mathbf{1}_{N} \otimes \boldsymbol{\mu}$ and scatter matrix $\mathbf{I}_{N} \otimes \mathbf{R}$. We refer to this distribution as EVS.
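To make the EMS model concrete, here is a minimal sampling sketch based on the stochastic representation (2), applied independently to each snapshot; the helper name `sample_ems` and the example scatter matrix are illustrative assumptions. In the Gaussian special case $g(t) = e^{-t}$, the modular variate is Gamma$(M,1)$ distributed, which provides a simple sanity check.

```python
import numpy as np

def sample_ems(mu, R, draw_q, N, rng):
    """Draw N i.i.d. snapshots x_t = mu + sqrt(Q_t) * A @ u_t, cf. (2),
    with u_t uniform on the complex unit sphere and A A^H = R."""
    M = mu.size
    A = np.linalg.cholesky(R)
    X = np.empty((M, N), dtype=complex)
    for t in range(N):
        z = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)
        u = z / np.linalg.norm(z)              # uniform on the complex sphere
        X[:, t] = mu + np.sqrt(draw_q(rng)) * (A @ u)
    return X

rng = np.random.default_rng(0)
M, N = 4, 1000
mu = np.zeros(M, dtype=complex)
R = np.eye(M) + 0.5 * np.ones((M, M))          # an arbitrary (assumed) scatter matrix
# Gaussian special case: g(t) = exp(-t) corresponds to Q ~ Gamma(M, 1)
X = sample_ems(mu, R, lambda g: g.gamma(M, 1.0), N, rng)
```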

III. THE FIM FOR EMS DISTRIBUTIONS

Let us first investigate the EMS type of distributions. The latter include the so-called compound-Gaussian models, which have been widely considered in radar applications in order to model clutter [7]. As said before, we assume that the snapshots $\mathbf{x}_{t}$, $t = 1, \ldots, N$, are i.i.d. random vectors drawn from (3). Therefore, the p.d.f. of $\mathbf{X}$ is given by

$$p(\mathbf{X}) \propto |\mathbf{R}|^{-N} \prod_{t=1}^{N} g\!\left( (\mathbf{x}_{t}-\boldsymbol{\mu})^{H} \mathbf{R}^{-1} (\mathbf{x}_{t}-\boldsymbol{\mu}) \right). \qquad (5)$$

In the sequel, we assume that $\boldsymbol{\mu}$ and $\mathbf{R}$ depend on an unknown parameter vector $\boldsymbol{\theta}$, which we wish to estimate from $\mathbf{X}$, and we look for an expression of the FIM under this statistical model. For the sake of convenience, we rewrite the likelihood function in (5) in terms of $\mathcal{Q}_{t} = (\mathbf{x}_{t}-\boldsymbol{\mu})^{H} \mathbf{R}^{-1} (\mathbf{x}_{t}-\boldsymbol{\mu})$, and we will omit the explicit dependence of $\boldsymbol{\mu}$ and $\mathbf{R}$ on $\boldsymbol{\theta}$.
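As an illustration of (5), the sketch below evaluates the corresponding log-likelihood, up to an additive constant, for a user-supplied log density generator; the helper name `ems_loglik` and the data layout (snapshots stored column-wise) are assumptions made for illustration.

```python
import numpy as np

def ems_loglik(X, mu, R, log_g):
    """Log-likelihood associated with (5), up to an additive constant:
    -N * log|R| + sum_t log g(Q_t), with Q_t = (x_t - mu)^H R^{-1} (x_t - mu)."""
    N = X.shape[1]
    Xc = X - mu[:, None]
    Q = np.real(np.sum(np.conj(Xc) * np.linalg.solve(R, Xc), axis=0))
    _, logdet = np.linalg.slogdet(R)
    return -N * logdet + np.sum(log_g(Q))

# Gaussian density generator g(t) = exp(-t), i.e., log g(t) = -t:
# ll = ems_loglik(X, mu, R, lambda t: -t)
```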

In order to obtain the FIM, we must first compute the first-order derivative of the log-likelihood function

(6)

where . Differentiating (6) with

respect to (w.r.t.) , we obtain

(7)

where . Now,

(8) Let us first prove that

$$\mathrm{E}\left\{ \frac{\partial \ln p(\mathbf{X};\boldsymbol{\theta})}{\partial \theta_{i}} \right\} = 0 \qquad (9)$$

which is a necessary condition for the CRB theory to apply.

Making use of , one can observe that

and

with . Therefore,

(10) and are independent. Moreover : hence

and . Thus

(11) (12) It ensues that

(13) Reporting this equation in (7) proves (9).

Let us now turn to the derivation of the $(i,j)$ entry of the FIM, as given by

(14)

Since , we have

(15) Therefore, we need to evaluate the expected value of the three different terms in (15). Let us first consider

(16) Next, we have


Indeed, let us consider for some vector

and Hermitian matrix . Let and note

that where , i.e.,

, and are independent. Since is

Gaussian distributed, one has .

However,

and . Therefore, which

proves (17).

It remains to derive the last term in (15). Similarly to what was done before, let us consider with . It is well known that

(18a) (18b) Consequently

(19) Using (15), (16), (17) and (19) in the expression (14) of the FIM, we finally obtain the following extension of the Slepian-Bangs formula to EMS distributions:

(20) It is remarkable that despite the high generality of EC distributions, the formula for the FIM remains quite simple. Indeed, it is reminiscent of the FIM for Gaussian distributions (one recognizes the two terms of the Slepian-Bangs formula), but for different scaling factors. The latter depend only on the expected values of some functions of $\mathcal{Q}$, and deviation from the Gaussian distribution manifests itself only through these terms. Overall, this means that any Fisher information matrix derived under the Gaussian assumption needs to be modified only slightly to obtain the FIM for EMS distributions: indeed, only the computation of a few scalar expectations of functions of $\mathcal{Q}$ is necessary. In some cases, see below, one can obtain an analytic expression for them. Should that not be the case, numerical integration or stochastic simulation methods can be used to compute these expectations. This property paves the way to the extension of many FIMs derived so far under the Gaussian umbrella. We also observe that if $\mathbf{R}$ is known, then the FIM for EMS distributions is directly proportional to the Gaussian FIM: hence, non-Gaussianity results in a scaling of the CRB. Accordingly, if $\boldsymbol{\mu}$ depends only on a subset $\boldsymbol{\theta}_{1}$ of the parameters and $\mathbf{R}$ depends only on the complementary subset $\boldsymbol{\theta}_{2}$, then the FIM is block-diagonal. Moreover, the FIM for estimation of $\boldsymbol{\theta}_{1}$ only is proportional to the Gaussian FIM, a fact that was already discovered in [8], [9]. In contrast, when $\mathbf{R}$ depends on the unknown parameters, the two FIMs are no longer proportional, due to the additional corrective term in (20).
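When the expectations of functions of $\mathcal{Q}$ entering (20) are not available in closed form, a plain Monte Carlo estimate is often sufficient. The sketch below illustrates this with assumed helper names; the test functions $\mathcal{Q}$ and $\mathcal{Q}^{2}$ are placeholders, since the exact corrective factors depend on the density generator at hand.

```python
import numpy as np

def mc_expectation(h, draw_q, n_samples=200_000, seed=0):
    """Monte Carlo estimate of E[h(Q)] for a scalar function h of the modular variate."""
    rng = np.random.default_rng(seed)
    q = draw_q(rng, n_samples)
    return np.mean(h(q))

M = 4
# Gaussian case: g(t) = exp(-t), so Q ~ Gamma(M, 1)
draw_q = lambda rng, n: rng.gamma(M, 1.0, size=n)
print(mc_expectation(lambda q: q, draw_q))        # ~ M
print(mc_expectation(lambda q: q ** 2, draw_q))   # ~ M * (M + 1)
```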

We now provide illustrative examples of how this formula can be used. Of course, we start with the Gaussian assumption, for which $g(t) = e^{-t}$ and, from (4), $\mathcal{Q}$ follows a Gamma distribution with shape parameter $M$ and unit scale. In this case, the required expectations are easily evaluated, and reporting their values in (20) yields the Slepian-Bangs formula (1).
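For the reader's convenience, here is how such expectations are obtained in the Gaussian case: (4) with $g(t) = e^{-t}$ identifies the modular variate as Gamma$(M,1)$, so its moments follow at once. This short derivation only illustrates the computation and is not meant to reproduce the exact corrective factors of (20).

```latex
% Gaussian case: (4) with g(t) = e^{-t} gives p(Q) \propto Q^{M-1} e^{-Q},
% i.e., Q ~ Gamma(M, 1), so that
\begin{align*}
  \mathrm{E}\{\mathcal{Q}^{k}\}
     &= \frac{\int_{0}^{\infty} q^{\,M-1+k} e^{-q}\,\mathrm{d}q}
             {\int_{0}^{\infty} q^{\,M-1} e^{-q}\,\mathrm{d}q}
      = \frac{\Gamma(M+k)}{\Gamma(M)}, \\
  \mathrm{E}\{\mathcal{Q}\} &= M,
  \qquad \mathrm{E}\{\mathcal{Q}^{2}\} = M(M+1).
\end{align*}
```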

Let us now consider the well-known Student distribution with $\nu$ degrees of freedom, which corresponds to a particular choice of the density generator $g$. In this case, $\mathcal{Q}$ follows a scaled $F$-distribution:

(21)

Some straightforward calculations provide closed-form expressions for the required expectations in this case. Consequently, in the Student case, the FIM has the following expression

(22)

One can verify that, as expected, the Gaussian FIM (1) is recovered in the limit $\nu \rightarrow \infty$.
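As a numerical sanity check of the scaled $F$-distribution statement (21), the sketch below draws complex Student samples under one common construction, a complex Gaussian vector divided by the square root of an independent scaled chi-square variable; this construction, and the resulting relation $\mathcal{Q}/M \sim F(2M,\nu)$, are stated as assumptions since the letter's exact parameterization is not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
M, nu, n = 4, 7, 100_000

# One common construction of complex Student samples with scatter R = I_M
# (an assumption, not taken from the letter): x = z / sqrt(s / nu),
# z ~ CN(0, I_M), s ~ chi^2_nu independent of z.
z = (rng.standard_normal((n, M)) + 1j * rng.standard_normal((n, M))) / np.sqrt(2)
s = rng.chisquare(nu, size=n)
x = z / np.sqrt(s / nu)[:, None]

q = np.sum(np.abs(x) ** 2, axis=1)   # modular variate Q = x^H R^{-1} x with R = I_M
# Under this construction, Q / M should follow an F(2M, nu) distribution, cf. (21)
print(stats.kstest(q / M, stats.f(dfn=2 * M, dfd=nu).cdf))
```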

IV. THE FIM FOR EVS DISTRIBUTIONS

Let us now study the case where

(23) with . This model has been used in the array processing context, e.g., in [10], where Christ Richmond investigated the extension of Kelly's generalized likelihood ratio test in Gaussian settings to EVS distributions. We now have the p.d.f. of $\mathbf{X}$ as

(24) Similarly to the previous section, we let

and . The log-likelihood

function is now


Differentiating (25) with respect to $\theta_{i}$ yields

(26) where we have used (8). Let us again prove that (9) holds. Since

, it follows that

(27) which, when reported in (26), proves (9). The $(i,j)$ entry of the FIM can thus be written as

(28)

Now, we have

(29) Proceeding along the same lines as in the previous section, it is straightforward to show that

(30)

(31)

(32)

Gathering the previous equations, we finally obtain the FIM for

EVS distributions:

(33) Again, let us prove that when the data are Gaussian distributed, we recover the Slepian-Bangs formula. For Gaussian distributed data, one has $g(t) = e^{-t}$. In this case, the corresponding expectations are easily evaluated, and (33) reduces to (1).
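To contrast the EVS model with the EMS model of Section III, the following sketch draws a whole data matrix using a single modular variate shared by all snapshots; the centered construction with scatter $\mathbf{I}_{N}\otimes\mathbf{R}$, the helper name `sample_evs` and the Gamma$(NM,1)$ Gaussian check are assumptions made for illustration.

```python
import numpy as np

def sample_evs(R, draw_q, N, rng):
    """Draw one M x N data matrix whose vectorization follows a (centered)
    vector EC distribution with scatter I_N kron R: a single modular variate
    Q scales the whole matrix, unlike the per-snapshot Q_t of the EMS model."""
    M = R.shape[0]
    A = np.linalg.cholesky(R)
    Z = (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2)
    U = Z / np.linalg.norm(Z)                  # vec(U) uniform on the complex NM-sphere
    return np.sqrt(draw_q(rng)) * (A @ U)

rng = np.random.default_rng(2)
M, N = 4, 16
R = np.eye(M) + 0.5 * np.ones((M, M))          # an arbitrary (assumed) scatter matrix
# Gaussian special case: Q ~ Gamma(N * M, 1), for which EVS and EMS coincide
X = sample_evs(R, lambda g: g.gamma(N * M, 1.0), N, rng)
```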

V. CONCLUSION

In this letter, we extended the well-known Slepian-Bangs formula of the FIM for Gaussian distributed data to the larger family of elliptically contoured distributions. Surprisingly enough, the new expression is rather simple and involves only slight modifications compared to its Gaussian counterpart. Only the expectations of some functions of the (scalar) modular variate need to be derived, which can often be done analytically, and otherwise numerically. This result paves the way to the derivation of new Cramér-Rao bounds, e.g., for structured covariance matrix estimation or for clutter and/or external noise parameter estimation in non-Gaussian environments.

REFERENCES

[1] S. M. Kay, Fundamentals of Statistical Signal Processing: Estimation Theory. Englewood Cliffs, NJ, USA: Prentice-Hall, 1993.
[2] D. Slepian, "Estimation of signal parameters in the presence of noise," Trans. IRE Professional Group Inf. Theory, vol. 3, no. 3, pp. 68-89, 1954.
[3] G. W. Bangs, "Array Processing With Generalized Beamformers," Ph.D. dissertation, Yale Univ., New Haven, CT, USA, 1971.
[4] K. T. Fang and Y. T. Zhang, Generalized Multivariate Analysis. Berlin, Germany: Springer-Verlag, 1990.
[5] T. W. Anderson and K.-T. Fang, Theory and Applications of Elliptically Contoured and Related Distributions, Dept. Statist., Stanford Univ., Stanford, CA, USA, Tech. Rep. 24, Sep. 1990.
[6] E. Ollila, D. Tyler, V. Koivunen, and H. Poor, "Complex elliptically symmetric distributions: Survey, new results and applications," IEEE Trans. Signal Process., vol. 60, no. 11, pp. 5597-5625, Nov. 2012.
[7] E. Conte, M. Lops, and G. Ricci, "Asymptotically optimum radar detection in compound-Gaussian clutter," IEEE Trans. Aerosp. Electron. Syst., vol. 31, no. 2, pp. 617-625, Apr. 1995.
[8] A. Swami, "Cramér-Rao bounds for deterministic signals in additive and multiplicative noise," Signal Process., vol. 53, no. 2-3, pp. 231-244, Sep. 1996.
[9] B. M. Sadler, R. J. Kozick, and T. Moore, "Performance analysis for direction finding in non-Gaussian noise," in Proc. ICASSP, Phoenix, AZ, USA, May 15-19, 1999, vol. 5, pp. 2857-2860.
[10] C. D. Richmond, "A note on non-Gaussian adaptive array detection and signal parameter estimation," IEEE Signal Process. Lett., vol. 3, no. 8, pp. 251-252, Aug. 1996.
