Memory parameter


Detection of non-constant long memory parameter

On the other hand, some regression-based Lagrange Multiplier procedures have recently been discussed in Hassler and Meller (2009) and Martins and Rodrigues (2010). The series is first filtered by (1 − L)^d, where L is the lag operator and d is the long memory parameter under the null hypothesis; the resulting series is then subjected to an (augmented) Lagrange Multiplier test for fractional integration, following the pioneering work of Robinson (1991, 1994). In practice the filtering step can only be done approximately and involves an estimate of d. This is very likely the main reason for the size distortion visible in the simulation study reported in Martins and Rodrigues (2010). In a nonparametric setup, following Kim (2000), Kim et al. (2002) proposed several tests (hereafter referred to as Kim's tests), based on the ratio
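
The filtering step can be made concrete with the truncated binomial expansion of (1 − L)^d. The sketch below is a generic illustration (not the implementation of Hassler and Meller, 2009, or Martins and Rodrigues, 2010); the value d = 0.3 and the placeholder series are hypothetical.

```python
import numpy as np

def frac_diff(x, d):
    """Apply the fractional-differencing filter (1 - L)^d to a series x.

    Uses the binomial expansion (1 - L)^d = sum_k pi_k L^k with
    pi_0 = 1 and pi_k = pi_{k-1} * (k - 1 - d) / k, truncated at the
    sample size, which is why the filtering is only approximate.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    pi = np.empty(n)
    pi[0] = 1.0
    for k in range(1, n):
        pi[k] = pi[k - 1] * (k - 1 - d) / k
    return np.array([np.dot(pi[: t + 1], x[t::-1]) for t in range(n)])

# Filter an observed series under a hypothetical null value d = 0.3; the
# filtered series would then be passed to an (augmented) LM test for
# fractional integration.
x = np.random.default_rng(0).standard_normal(500)   # placeholder for the data
filtered = frac_diff(x, d=0.3)
```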

Asymptotic normality of wavelet estimators of the memory parameter for linear processes

M(d) processes encompass both stationary and non-stationary processes, depending on the value of the memory parameter d. The function f(λ) = |1 − e^{−iλ}|^{−2d} f*(λ) is called the generalized spectral density of X. It is a proper spectral density function when d < 1/2; in this case, the process X is covariance stationary with spectral density f. The process X is said to have long memory if 0 < d < 1/2, short memory if d = 0, and negative memory if d < 0; the process is not invertible if d < −1/2. The factor f* is a nuisance function which determines the "short-range" dependence.
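
The generalized spectral density is easy to evaluate numerically; since |1 − e^{−iλ}| = 2|sin(λ/2)|, f behaves like λ^{−2d} near the origin. A minimal sketch, with the nuisance factor f* set to a constant purely for illustration:

```python
import numpy as np

def generalized_spectral_density(lam, d, f_star=lambda lam: np.ones_like(lam)):
    """Evaluate f(lambda) = |1 - exp(-i*lambda)|**(-2d) * f_star(lambda).

    |1 - exp(-i*lambda)| = 2*|sin(lambda/2)|, so f has a pole at 0 when
    d > 0 (long memory), is bounded for d = 0 (short memory) and vanishes
    at 0 for d < 0 (negative memory).
    """
    lam = np.asarray(lam, dtype=float)
    return (2.0 * np.abs(np.sin(lam / 2.0))) ** (-2.0 * d) * f_star(lam)

lam = np.linspace(1e-3, np.pi, 1000)
f_long = generalized_spectral_density(lam, d=0.3)    # diverges as lam -> 0
f_neg = generalized_spectral_density(lam, d=-0.2)    # tends to 0 as lam -> 0
```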

Application of Malliavin calculus and analysis on Wiener space to long-memory parameter estimation for non-Gaussian processes

A stochastic process {X_t : t ∈ [0, 1]} is called self-similar with self-similarity parameter H ∈ (0, 1) when typical sample paths look qualitatively the same irrespective of the distance from which we look at them, i.e. for any fixed time-scaling constant c > 0, the processes c^{−H} X_{ct} and X_t have the same distribution. Self-similar stochastic processes are well suited to model physical phenomena that exhibit long memory. The most popular among these processes is fractional Brownian motion (fBm), because it generalizes standard Brownian motion and its self-similarity parameter can be interpreted as the long memory parameter.
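
The scaling property can be checked numerically. The sketch below simulates fBm from its covariance by a Cholesky factorisation (practical only for short grids) and verifies that Var(X_{ct}) = c^{2H} Var(X_t); the grid, H and c values are arbitrary choices for illustration.

```python
import numpy as np

def fbm_paths(H, times, n_paths, rng):
    """Simulate fBm at the given (positive) times via a Cholesky factorisation.

    Uses the fBm covariance Cov(X_s, X_t) = 0.5*(s^{2H} + t^{2H} - |t-s|^{2H});
    the O(n^3) factorisation restricts this to short time grids.
    """
    s, t = times[:, None], times[None, :]
    cov = 0.5 * (s ** (2 * H) + t ** (2 * H) - np.abs(t - s) ** (2 * H))
    chol = np.linalg.cholesky(cov + 1e-12 * np.eye(len(times)))
    return rng.standard_normal((n_paths, len(times))) @ chol.T

# Self-similarity: c^{-H} X_{ct} and X_t have the same law, hence
# Var(X_{ct}) should equal c^{2H} * Var(X_t).
rng = np.random.default_rng(1)
H, c = 0.7, 4.0
times = np.linspace(0.05, 1.0, 20)
var_t = fbm_paths(H, times, 20000, rng).var(axis=0)
var_ct = fbm_paths(H, c * times, 20000, rng).var(axis=0)
print(np.allclose(var_ct, c ** (2 * H) * var_t, rtol=0.1))   # True up to noise
```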

On the spectral density of the wavelet coefficients of long memory time series with application to the log-regression estimation of the memory parameter

Abstract. In recent years, methods to estimate the memory parameter using wavelet analysis have gained popularity in many areas of science. Despite their widespread use, a rigorous semi-parametric asymptotic theory, comparable to the one developed for Fourier methods, is still missing. In this contribution, we adapt to the wavelet setting the classical semi-parametric framework introduced by Robinson and his co-authors for estimating the memory parameter of a (possibly) non-stationary process. Our results apply to a class of wavelets with bounded support, which includes but is not limited to Daubechies wavelets. We derive an explicit expression for the spectral density of the wavelet coefficients and show that it can be approximated, at large scales, by the spectral density of the continuous-time wavelet coefficients of fractional Brownian motion. We derive an explicit bound on the difference between the two spectral densities. As an application, we obtain minimax upper bounds for the log-scale regression estimator of the memory parameter of a Gaussian process and derive an explicit expression for its asymptotic variance.

A Wavelet Whittle estimator of the memory parameter of a non-stationary Gaussian time series

By E. Moulines, F. Roueff and M. S. Taqqu (Télécom Paris/CNRS LTCI and Boston University). We consider a time series X = {X_k, k ∈ Z} with memory parameter d_0 ∈ R. This time series is either stationary or can be made stationary after differencing a finite number of times. We study the "local Whittle wavelet estimator" of the memory parameter d_0, a wavelet-based semiparametric pseudo-maximum-likelihood estimator. The estimator may depend on a given finite range of scales or on a range which becomes infinite with the sample size. We show that the estimator is consistent and rate optimal if X is a linear process, and is asymptotically normal if X is Gaussian.
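
A schematic version of such a wavelet Whittle contrast is sketched below: if the wavelet coefficients at scale j are treated as centred Gaussian with variance σ² 2^{2jd}, profiling out σ² from the Gaussian pseudo-likelihood leaves a one-dimensional criterion in d. This is only an illustration of the idea, using the pywt package for the transform; the wavelet, the scale range and the exact normalizations of the estimator studied in the paper differ.

```python
import numpy as np
import pywt
from scipy.optimize import minimize_scalar

def wavelet_whittle_d(x, wavelet="db4", j_min=2, j_max=6):
    """Schematic local Whittle wavelet estimate of the memory parameter d."""
    coeffs = pywt.wavedec(np.asarray(x, dtype=float), wavelet, level=j_max)
    # pywt returns [cA_J, cD_J, ..., cD_1]; cD_j sits at index len(coeffs) - j.
    details = {j: coeffs[len(coeffs) - j] for j in range(j_min, j_max + 1)}
    n_j = {j: len(w) for j, w in details.items()}
    n = sum(n_j.values())

    def contrast(d):
        # Profiled Gaussian pseudo-likelihood under Var(W_{j,k}) = sigma^2 * 2^{2 j d}.
        s2 = sum(np.sum(w ** 2) * 2.0 ** (-2.0 * d * j) for j, w in details.items()) / n
        return np.log(s2) + 2.0 * np.log(2.0) * d * sum(j * m for j, m in n_j.items()) / n

    return minimize_scalar(contrast, bounds=(-0.5, 1.5), method="bounded").x
```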

Wavelet estimation of the long memory parameter for Hermite polynomial of Gaussian processes

…the scalogram ∑_k W_{j,k}², adequately normalized, as the number of wavelet coefficients n and the scale j = j(n) tend to infinity. This is a necessary and important step in developing methods for estimating the underlying long memory parameter d; see the references mentioned at the beginning of this section. Indeed, using the wavelet scalogram, there is a standard way to construct an estimator of the memory parameter, and the asymptotic behavior of the scalogram gives the convergence rate of this estimator. We provide more details in Section 5.
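
In discrete form, the scalogram at scale j is simply the sum of the squared wavelet coefficients at that scale. A minimal sketch, assuming the pywt package and an arbitrary wavelet choice:

```python
import numpy as np
import pywt

def scalogram(x, wavelet="db2", level=None):
    """Return {scale j: sum_k W_{j,k}^2} from a discrete wavelet transform of x."""
    coeffs = pywt.wavedec(np.asarray(x, dtype=float), wavelet, level=level)
    n_levels = len(coeffs) - 1            # coeffs = [cA_J, cD_J, ..., cD_1]
    return {j: float(np.sum(coeffs[n_levels + 1 - j] ** 2))
            for j in range(1, n_levels + 1)}

# Dividing each entry by the number of coefficients at that scale gives the
# per-scale variance estimates whose asymptotics drive the convergence rate
# of the memory-parameter estimator discussed above.
```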


Adaptive estimator of the memory parameter and goodness-of-fit test using a multidimensional increment ratio statistic

Keywords: Long-memory Gaussian processes; goodness-of-fit test; estimation of the memory parameter; minimax adaptive estimator. 1 Introduction. After almost thirty years of intensive study, long-memory processes now form an important topic in time series analysis (see for instance the book edited by Doukhan et al., 2003). The most famous long-memory stationary time series are the fractional Gaussian noises (fGn) with Hurst parameter H and the FARIMA(p, d, q) processes. For both of these time series, the spectral density f follows a power law at the origin: f(λ) ∼ C λ^{−2d} as λ → 0, where H = d + 1/2 in the case of the fGn. In the case of a long-memory process, d ∈ (0, 1/2).
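
The relation H = d + 1/2 quoted above follows from matching spectral exponents near the origin; a short, standard derivation (not specific to this paper):

```latex
\begin{align*}
  f_{\mathrm{fGn}}(\lambda) &\sim c_H\,|\lambda|^{\,1-2H}, \quad \lambda \to 0
    && \text{(fGn with Hurst parameter } H\text{)}\\
  f(\lambda) &\sim C\,|\lambda|^{-2d}, \quad \lambda \to 0
    && \text{(long-memory parametrization)}\\
  1 - 2H &= -2d \;\Longleftrightarrow\; H = d + \tfrac12,
    && \text{so } d \in (0,\tfrac12) \iff H \in (\tfrac12,1).
\end{align*}
```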

Adaptive wavelet based estimator of the memory parameter for stationary Gaussian processes

Conclusion: which estimator among those studied above should be chosen in practice, i.e. for an observed time series? We propose the following procedure for estimating a possible long memory parameter: 1. First, since this step is computationally cheap and applicable to processes with smooth trends, draw the log-log regression of the wavelet coefficients' variances onto the scales. If a linear zone appears in this graph, consider the estimator D̂_N̂ (or D̂_ATV) of D.
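
The log-log regression step can be sketched as follows, assuming the pywt package; the wavelet and the scale range are illustrative, and the slope estimates the memory parameter in the convention f(λ) ∼ C|λ|^{−D} near zero (D = 2d), not the authors' exact estimators D̂_N̂ or D̂_ATV.

```python
import numpy as np
import pywt

def log_regression_memory(x, wavelet="db4", j_min=3, j_max=7):
    """Regress log2 of the per-scale wavelet-coefficient variance on the scale.

    Under f(lambda) ~ C * |lambda|**(-D) near 0, the coefficient variance at
    scale j grows roughly like 2**(j*D), so the regression slope estimates D.
    """
    coeffs = pywt.wavedec(np.asarray(x, dtype=float), wavelet, level=j_max)
    scales, log_vars = [], []
    for j in range(j_min, j_max + 1):
        w = coeffs[len(coeffs) - j]        # detail coefficients at scale j
        scales.append(j)
        log_vars.append(np.log2(np.mean(w ** 2)))
    slope, _ = np.polyfit(scales, log_vars, 1)
    return slope                            # estimate of D; d_hat = slope / 2

# Plotting log_vars against scales first, as suggested above, shows whether a
# linear zone exists before the regression slope is trusted.
```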


Central limit theorem for the robust log-regression wavelet estimation of the memory parameter in the Gaussian semi-parametric context

By O. Kouamo, C. Lévy-Leduc, and E. Moulines. Abstract. In this paper, we study robust estimators of the memory parameter d of a (possibly) non-stationary Gaussian time series with generalized spectral density f. This generalized spectral density is characterized by the memory parameter d and by a function f* which specifies the short-range dependence structure of the process. Our setting is semi-parametric since both f* and d are unknown, and d is the only parameter of interest. The memory parameter d is estimated by regressing the logarithm of the estimated variance of the wavelet coefficients at different scales. The two estimators of d that we consider are based on robust estimators of the variance of the wavelet coefficients, namely the square of the scale estimator proposed in [27] and the median of the squares of the wavelet coefficients. We establish a central limit theorem for these robust estimators, as well as for the estimator of d based on the classical variance estimator proposed in [19]. Monte Carlo experiments are presented to illustrate our claims and compare the performance of the different estimators. The properties of the three estimators are also compared on the Nile River data and on Internet traffic packet-count data. The theoretical results and the empirical evidence strongly suggest using the robust estimators as an alternative for estimating the memory parameter d of Gaussian time series.
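
The median-of-squares variant can be sketched as follows: for centred Gaussian coefficients, the median of the squared coefficients divided by the median of a χ²₁ variable is a consistent estimate of their variance and can replace the classical sample variance in the log-regression. This is an illustration, not the authors' code.

```python
import numpy as np
from scipy.stats import chi2

# Median of a chi-square(1) variable (about 0.4549); dividing by it makes the
# median of squares consistent for the variance of centred Gaussian data.
CHI2_1_MEDIAN = chi2.median(1)

def robust_wavelet_variance(w):
    """Robust variance estimate of the wavelet coefficients at one scale."""
    w = np.asarray(w, dtype=float)
    return np.median(w ** 2) / CHI2_1_MEDIAN

# The robust log-regression estimator of d then proceeds as the classical one,
# with log(robust_wavelet_variance(W_j)) regressed on the scale index j.
```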

Log-average periodogram estimator of the memory parameter

In this contribution, we study the averaged periodogram spectral estimator, based on dividing the series into epochs, to estimate the memory parameter of a long-memory process. The estimation method follows the GPH procedure, with the periodogram replaced by the averaged periodogram in the regression equation. Some desirable asymptotic properties of the proposed estimator are derived, and an empirical investigation supports the use of the procedure as an alternative to GPH that reduces the variance of the fractional memory parameter estimate.
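
A hedged sketch of the idea: split the series into epochs, average the epoch periodograms (as in Bartlett's method), then run a GPH-style log-periodogram regression on the averaged periodogram. The number of epochs and the bandwidth below are illustrative choices, not the paper's exact construction.

```python
import numpy as np

def averaged_periodogram_gph(x, n_epochs=8, m=None):
    """GPH-style estimate of d built on a Bartlett-type averaged periodogram."""
    x = np.asarray(x, dtype=float)
    epoch_len = len(x) // n_epochs
    epochs = x[: n_epochs * epoch_len].reshape(n_epochs, epoch_len)

    # Periodogram of each (demeaned) epoch at frequencies 2*pi*k/epoch_len.
    dft = np.fft.rfft(epochs - epochs.mean(axis=1, keepdims=True), axis=1)
    avg_periodogram = (np.abs(dft) ** 2 / (2.0 * np.pi * epoch_len)).mean(axis=0)

    freqs = 2.0 * np.pi * np.arange(avg_periodogram.shape[0]) / epoch_len
    m = m or int(np.sqrt(epoch_len))       # illustrative bandwidth choice
    y = np.log(avg_periodogram[1 : m + 1])
    xreg = np.log(4.0 * np.sin(freqs[1 : m + 1] / 2.0) ** 2)
    slope, _ = np.polyfit(xreg, y, 1)
    return -slope                           # GPH regression: slope equals -d
```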

Estimation of the Memory Parameter of the Infinite Source Poisson Process

…and setting J_0 = J/(2γ + α) yields the second claim of Theorem 4.2. 5. Concluding remarks. In this work, we have proved the validity of a wavelet method for the estimation of the long-memory parameter of an infinite-source Poisson traffic model, either in a stable or in an unstable state, that is, when it does or does not converge to a stationary process. We have shown that a suitable choice of the scales in the estimator (see Remark 4.1) yields a consistent estimator in both situations, and checked that the estimator is robust to discrete sampling of the data.

On ordered normally distributed vector parameter estimates



Spectral Clustering: interpretation and Gaussian parameter

In this paper, we propose a fully theoretical interpretation of spectral clustering, whose first steps were introduced by Mouysset et al. (2010). From this, we define a new clustering property in the embedding space at each step of the study and new results showing the role of the Gaussian affinity parameter. After recalling the spectral clustering method and the role of the affinity parameter in Sect. 2.1, we propose a continuous version of spectral clustering based on partial differential equations (PDE). To do so, we consider a sampling of connected components and, from it, recover the original shapes. This leads us to formulate spectral clustering as an eigenvalue problem in which data points correspond to nodes of a finite element discretization, and to consider the Gaussian affinity matrix A as a representation of the heat kernel and the affinity parameter as the heat parameter t. Hence, the first step is to introduce an eigenvalue problem based on the heat equation with a Dirichlet boundary condition. From this, in Sect. 2.2, we deduce an "almost" eigenvalue problem which can be associated with the Gaussian values. Identifying connected components thus appears to be linked to these eigenfunctions. Then, by introducing the finite element approximation and mass lumping, we prove in Sect. 2.3 that this property is preserved, under conditions on t, when looking at the eigenvectors given by the spectral clustering algorithm. Finally, in Sect. 3, we study numerically, on a geometrical example, the difference between the eigenvectors from the spectral clustering algorithm and their associated discretized eigenfunctions from the heat equation, as a function of the affinity parameter t.
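
The role of the affinity parameter can be illustrated with a standard spectral embedding sketch (not the authors' PDE-based construction): the Gaussian affinity matrix plays the part of a discretized heat kernel, and the clustering is read off the leading eigenvectors of its symmetric normalization. The parameter sigma below corresponds to the heat parameter t discussed above.

```python
import numpy as np

def gaussian_affinity(points, sigma):
    """A_ij = exp(-||x_i - x_j||^2 / sigma^2), with a zero diagonal."""
    diff = points[:, None, :] - points[None, :, :]
    a = np.exp(-np.sum(diff ** 2, axis=-1) / sigma ** 2)
    np.fill_diagonal(a, 0.0)
    return a

def spectral_embedding(points, sigma, k):
    """Leading k eigenvectors of the symmetrically normalized affinity matrix.

    The choice of sigma controls how sharply the embedding separates the
    connected components of the data.
    """
    a = gaussian_affinity(points, sigma)
    d_inv_sqrt = 1.0 / np.sqrt(a.sum(axis=1) + 1e-12)
    l_sym = d_inv_sqrt[:, None] * a * d_inv_sqrt[None, :]
    _, eigvecs = np.linalg.eigh(l_sym)      # eigenvalues in ascending order
    # Running k-means on the rows of this embedding completes the algorithm.
    return eigvecs[:, -k:]
```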

Dynamic DAG Scheduling Under Memory Constraints for Shared-Memory Platforms

…application. Otherwise, the traversal will require the use of swap mechanisms or out-of-core execution, which will dramatically (and negatively) impact the achieved makespan [34, 1]. Consider a task graph whose internal nodes require a large volume of temporary data, such as the graphs arising from multifrontal solvers [3]. Improper scheduling decisions may lead dynamic schedules to hit a memory wall at some step even though everything was going fine in the previous steps: the dynamic schedule suddenly reaches a state where any further decision (any choice of the next task to execute) will exceed the amount of available memory. This unfortunate scenario arises because dynamic schedules usually consider only tasks that are ready for execution, and thus have very limited insight into the fraction of the task graph that is yet to be discovered and processed. To avoid such a pitfall, some global information on the task graph is required to guide the dynamic schedule and enforce safe execution paths.
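
A minimal sketch of one conservative remedy (illustrative only, not the paper's algorithm, with a hypothetical Task structure): a dynamic scheduler that starts a ready task only when its peak memory fits into the remaining budget, and otherwise lets running tasks finish and release memory first.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    peak_mem: int                    # memory needed while the task runs
    deps: list = field(default_factory=list)

def schedulable_tasks(tasks, finished, running_mem, mem_budget):
    """Return ready tasks that currently fit into the remaining memory budget.

    A purely greedy scheduler would return every ready task; filtering on the
    remaining budget is the simplest way to avoid the memory wall described
    above, at the price of possibly idling some workers.
    """
    ready = [t for t in tasks
             if t.name not in finished and all(d in finished for d in t.deps)]
    return [t for t in ready if running_mem + t.peak_mem <= mem_budget]
```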

Bayesian estimation for the multifractality parameter

ABSTRACT. Multifractal analysis has matured into a widely used signal and image processing tool. Due to the statistical nature of multifractal processes (strongly non-Gaussian, with intricate dependence), the accurate estimation of multifractal parameters is very challenging in situations where the sample size is small (notably including a range of biomedical applications), and currently available estimators need to be improved. To overcome such limitations, the present contribution proposes a Bayesian estimation procedure for the multifractality (or intermittency) parameter. Its originality is threefold: first, the use of wavelet leaders, a recently introduced multiresolution quantity that has been shown to yield significant benefits for multifractal analysis; second, the construction of a simple yet generic semi-parametric model for the marginals and covariance structure of the wavelet leaders for the large class of multiplicative-cascade-based multifractal processes; third, the construction of original Bayesian estimators associated with the model and with the constraints imposed by multifractal theory. Performance is numerically assessed and illustrated on synthetic multifractal processes for a range of multifractality parameter values. The proposed procedure yields significantly improved estimation performance for small sample sizes.

Does long-term memory affect refreshing in verbal working memory?

Examining the impact of the concreteness effect in an immediate serial recall task, Campoy, Castellà, Provencio, Hitch, and Baddeley (2015) varied the availability of attentional resources by asking participants to concurrently perform either a simple tapping task or a highly demanding random tapping task. Although the introduction of a higher concurrent attentional demand reduced recall performance, it did not moderate the concreteness effect. This absence of interaction between the concreteness effect and the variation in concurrent attentional demand was replicated in another experiment using different concurrent tasks. This absence of interaction could suggest that refreshing, which depends on the availability of attention, does not rely on semantic LTM. However, another study suggests that LTM knowledge does influence refreshing. Ricker and Cowan (2010) reported that blocking refreshing has an increasingly negative impact across longer retention intervals for English letters, but not for unfamiliar visual stimuli. Similarly, using a paradigm that assesses refreshing speed (see below for a description of this paradigm), Vergauwe et al. (2014) showed that maintenance of letters or words resulted in a postponement of an attentionally demanding task proportional to the memory load, while maintaining unknown fonts did not. Such results indicate that memoranda that are not represented in LTM are less likely to be refreshed in WM. Finally, if one agrees that complex span tasks provide more refreshing opportunities than simple span tasks (see McCabe, 2008), comparing LTM effects between these two types of tasks might indicate whether LTM is implicated in refreshing. Both Engle et al. (1990) and Loaiza et al. (2015a) found that LTM effects were enhanced in the simple span task compared with the operation span task, which suggests that WM recall benefits from LTM knowledge when refreshing opportunities are reduced. To summarize, there are currently no studies that directly address the question of the relationship between refreshing and semantic LTM, and the few relevant studies have yielded divergent findings that do not allow a straightforward conclusion on the issue. The present study directly tested how the efficiency of refreshing is moderated by retrieval from semantic LTM.

Associative memory in aging: The effect of unitization on source memory

…unitization. In two experiments, groups of 20 young and 20 older participants learned new associations between a word and a background color under two conditions. In the item detail condition, they had to imagine that the item was the same color as the background, an instruction promoting unitization of the associations. In the context detail condition, which did not promote unitization, they had to imagine that the item interacted with another colored object. At test, they had to retrieve the color that was associated with each word (source memory). In both experiments, the results showed an age-related decrement in source memory performance in the context detail condition but not in the item detail condition. Moreover, Experiment 2 examined receiver operating characteristics in older participants and indicated that familiarity contributed more to source memory performance in the item detail condition than in the context detail condition. These findings suggest that unitization of new associations can overcome the associative memory deficit observed in aging, at least for item-color associations.

Memory Analysis and Optimized Allocation of Dataflow Applications on Shared-Memory MPSoCs

7.2 Memory study of the stereo matching algorithm. Table 2 shows the memory characteristics resulting from the application of the techniques presented in this paper to the SDF graph of the stereo matching algorithm. The memory characteristics of the application are presented for 4 scenarios, each corresponding to a different implementation stage of the algorithm. The |V| and δ(G) columns respectively give the number of memory objects and the exclusion density of the MEG. The next two columns present the upper and lower allocation bounds for each scenario. Finally, the last two columns present the actual amount of memory allocated for each target architecture; the allocation results are expressed as the additional amount of memory allocated relative to the lower bound. These results were obtained with NbOffsets = 5, NbDisparities = 60 and a resolution of 450×375 pixels. 7.2.1 Effect of broadcast merging

Parameter estimation for peaky altimetric waveforms

Fig. 11 (caption): REs for 150 waveforms from Class 1 with (top) the NM and (bottom) the NR methods.
…samples, resulting in poor estimation. This corresponds to the range SWH ∈ [0, 2] m in Fig. 10(b). For larger values of SWH, the CRB of SWH is an increasing function of this parameter (as shown in Fig. 10(b) for SWH > 2 m), since the absolute error is directly related to the value of the parameter. Fig. 10(c) and (d) show the BAGP, BGP, and Brown CRBs as a function of P_u. Increasing P_u implies a larger impact of the Brown echo with respect to the Gaussian peak; thus, CRB(τ) and CRB(SWH) are decreasing functions of P_u. Fig. 10(e) and (f) show the slight influence of τ on the CRBs of τ and SWH. Note that the square roots of the CRBs (RCRBs) have been displayed in order to compare them with the corresponding RMSEs. Note finally that the CRBs of the BGP and BAGP model parameters are larger than those of the Brown model, because the BGP and BAGP models involve additional unknown parameters (see [22] for more results about the CRBs).

Optimal Trading with Online Parameter Revisions

We model the driving noise by a d-dimensional Brownian motion W (defined on the canonical space C([0, T], R^d) endowed with the Wiener measure P). One could also consider jump-type processes, such as compound Poisson processes; the same analysis would apply. The unknown parameter υ takes values in a (Polish) space (U, B(U)), and our initial prior on υ is assumed to belong to a locally compact subset M of the set of Borel probability measures on U (endowed with the topology of weak convergence). In the applications of Section 4, the collection of possible priors can be identified with a subset of a finite-dimensional space (e.g. the parameters of a Gaussian distribution, the weights of a law with finite support, etc.). Then M can simply be viewed as a finite-dimensional space.
