Correlations in the Monte Carlo Glauber model

All sources of correlations studied in this paper, namely the autocorrelation, nuclear correlations, and the twin correlations, involve scales much shorter than the nuclear radius. The eccentricity and size fluctuations in the Glauber model therefore appear to be created by uncorrelated, small-scale fluctuations in the transverse plane. Subnucleonic fluctuations, which are not incorporated in the Glauber model, are also intrinsically small-scale phenomena. They typically increase the magnitude of the local fluctuations, but do not give rise to any large-distance correlation. To a good approximation, the Monte Carlo Glauber model provides a picture of energy deposition at RHIC and LHC energies in terms of independent sources, which captures the main features of these fluctuations and their correlations.
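
A minimal sketch of this independent-source picture (illustrative source count and width, not the paper's code): sample uncorrelated source positions in the transverse plane and compute the event-by-event eccentricity ε_2, whose fluctuations then arise purely from the finite number of sources.

    import numpy as np

    rng = np.random.default_rng(0)

    def eccentricity(n_sources=50, width=3.0, n_events=2000):
        """Event-by-event epsilon_2 from independent 2D Gaussian sources."""
        eps2 = np.empty(n_events)
        for k in range(n_events):
            x = rng.normal(0.0, width, n_sources)
            y = rng.normal(0.0, width, n_sources)
            x -= x.mean()          # recenter each event
            y -= y.mean()
            r2 = x**2 + y**2
            phi = np.arctan2(y, x)
            num = np.hypot((r2 * np.cos(2*phi)).mean(),
                           (r2 * np.sin(2*phi)).mean())
            eps2[k] = num / r2.mean()
        return eps2

    # A round source still shows a nonzero mean epsilon_2, driven
    # entirely by the uncorrelated fluctuations of the sources.
    print(eccentricity().mean())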

Using Monte Carlo ray tracing simulations to model the quantum harmonic oscillator modes observed in uranium nitride

V. CONCLUSIONS

We have used Monte Carlo ray tracing simulations with a variety of sample kernels to accurately model the total scattering response observed in a single crystal of uranium nitride on the SEQUOIA and ARCS time-of-flight chopper spectrometers. The simulations have verified that multiple scattering creates an essentially Q-independent background at the oscillator mode positions. The simulations have also shown that the measured scattering for uranium nitride can be reproduced extremely well by including QHO single-scattering events, acoustic phonon scattering events (both single and multiphonon), and multiple-scattering events that create any combination of oscillator excitations or acoustic phonons. Finally, the temperature dependence of the oscillator modes has been investigated on SEQUOIA from T = 8 to 300 K. Monte Carlo ray tracing simulations, incorporating intrinsic broadening of the oscillator modes according to the binary solid model, agree extremely well with the experimental data. This work shows that Monte Carlo ray tracing simulations can be an extremely effective tool for the accurate modeling of complex neutron scattering spectra.

Usefulness of the reversible jump Markov chain Monte Carlo model in regional flood frequency analysis.

On the other hand, the duality of the prior distribution and the fixed regional shape parameter allows the proposed estimator to be at least as efficient as the index flood model. Thus [r]

Validation of the Small Animal Biospace Gamma Imager Model Using GATE Monte Carlo Simulations on the Grid

patients. In consequence, we can find many dedicated small field of view scanners for SPECT (Single Photon Emission Computed Tomography) [2, 3] and PET (Positron Emission Tomography) [4] which have been designed in the last decade for these purposes. SPECT images have very poor quality because commonly used models do not incorporate all physical interactions, such as scattering (30% of photons in SPECT images are scattered). Monte Carlo Simulations (MCS) have greatly contributed to these developments thanks to their accuracy and utility in the field of nuclear medicine. The main limitation of MCS is computing time. Different strategies have been suggested to reduce the computing time, such as acceleration techniques [5]. Another solution to speed up MCS is to combine Monte Carlo and non-Monte Carlo modelling [5]. A third option that has recently been explored is the deployment of computing grids, that is, the parallelization of the MCS [6]. The parallelization consists of subdividing a long simulation into short ones. Each MCS uses a Random Number Stream (RNS) to produce the physical interactions in question. The distribution of MCS on multiple computing resources requires that the generated streams are independent. In this work, we report the validation of the Biospace small field of view dedicated SPECT scanner using the GATE MC simulation toolkit on the CIMENT Grid, a grid deployed on the university campuses in Grenoble.
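
A minimal illustration of the independent-stream requirement (using NumPy's SeedSequence spawning; the pi-estimation toy and the function run_short_simulation are stand-ins, not the paper's RNS mechanism): each sub-simulation receives a statistically independent stream, so partial results can be merged as if they came from one long run.

    import numpy as np

    def run_short_simulation(rng, n=100_000):
        """Toy stand-in for one short MCS: estimate pi by rejection sampling."""
        x, y = rng.random(n), rng.random(n)
        return 4.0 * np.mean(x**2 + y**2 < 1.0)

    # Spawn independent child streams from one root seed; each short
    # simulation can then run on a different grid node.
    children = np.random.SeedSequence(42).spawn(8)
    estimates = [run_short_simulation(np.random.default_rng(s)) for s in children]

    print(np.mean(estimates))  # merged estimate from the 8 sub-simulations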

The Monte Carlo Transformer: a stochastic self-attention model for sequence prediction

1 Introduction

Many critical applications (e.g. medical diagnosis or autonomous driving) require accurate forecasts while detecting unreliable predictions, which may arise from anomalies, missing information, or unknown situations. While neural networks excel at predictive tasks, they often output only a single-point estimate, lacking uncertainty measures to assess their confidence in their predictions. To overcome this limitation, an open research question is the design of neural generative models able to output a predictive distribution instead of single point-estimates. First, such distributions would naturally provide the desired uncertainty measures over the model predictions. Secondly, learning algorithms can build upon such uncertainty measurements to improve their predictive performance, as in active learning [Campbell et al., 2000] or exploration in reinforcement learning [Geist and Pietquin, 2010, Fortunato et al., 2018]. Thirdly, they may better model incoming sources of variability, such as observation noise, missing information, or model misspecification.

Population Monte Carlo

Summary. Importance sampling methods can be iterated like MCMC algorithms, while being more robust against dependence and starting values, as shown in this paper. The population Monte Carlo principle we describe here consists of iterated generations of importance samples, with importance functions depending on the previously generated importance samples. The advantage over MCMC algorithms is that the scheme is unbiased at any iteration and can thus be stopped at any time, while the iterations improve the performance of the importance function, thus leading to adaptive importance sampling. We first illustrate this method on a toy mixture example with multiscale importance functions. A second example reanalyses the ion channel model of Hodgson (1999), using an importance sampling scheme based on a hidden Markov representation, and compares population Monte Carlo with a corresponding MCMC algorithm.

Keywords: Adaptive algorithm, degeneracy, hidden semi-Markov model, importance sampling, ion channel model, MCMC algorithms, mixture model, multiple scales, particle system, random walk, unbiasedness.
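
A compact sketch of the population Monte Carlo principle described above (a generic Gaussian-kernel variant with an illustrative mixture target and kernel scale, not the paper's exact scheme): at each iteration, importance samples are drawn from kernels centered on the previous population, weighted by target over proposal, then resampled.

    import numpy as np

    rng = np.random.default_rng(1)

    def target(x):
        """Unnormalized target: two-component Gaussian mixture."""
        return 0.3*np.exp(-0.5*(x+2)**2) + 0.7*np.exp(-0.5*(x-3)**2)

    def pmc(n=2000, iters=10, scale=1.0):
        pop = rng.normal(0.0, 5.0, n)              # initial population
        for _ in range(iters):
            prop = rng.normal(pop, scale)          # kernels centered on population
            # Proposal density: equal-weight mixture of the n kernels
            # (constant factors cancel in the self-normalized weights).
            dens = np.exp(-0.5*((prop[:, None] - pop[None, :])/scale)**2).mean(axis=1)
            w = target(prop) / dens
            w /= w.sum()
            pop = rng.choice(prop, size=n, p=w)    # multinomial resampling
        return pop

    sample = pmc()
    print(sample.mean())   # estimate of E[X]; here ~= 0.3*(-2) + 0.7*3 = 1.5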

Monte Carlo Method Applied to the ABV Model of an Interconnect Alloy

clusters tend to increase in size and become localized. If ε_AV = -0.6 E_0 < ε_BV = -0.4 E_0, then the voids drive the A atoms spatially, with A at the borders of the vacancy clusters. Cycling the temperature between 300 K and 600 K does not have a marked effect on the distribution of the vacancy clusters. But when local temperature rises are taken into account in the algorithm, to mimic the trapping of heat by the vacancy clusters, a 1D spreading effect of the clusters is obtained, which may be interpreted as crack propagation. The results obtained show that the ABV model can be used in conjunction with FEM methods to study the thermo-mechanical evolution of an interconnect alloy or material during the operation of a power mechatronic device, and to reduce by numerical simulation the uncertainties on the coupling between thermal and mechanical parameters of materials assembled in such devices. The RBDO method can then be applied for reliability issues.
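
For readers unfamiliar with ABV-type lattice models, here is a minimal Metropolis sketch (lattice size, temperature, and all pair energies except eps_AV and eps_BV are hypothetical): sites hold A, B, or a vacancy V, and a randomly chosen vacancy swaps with a random neighbor when the Boltzmann test on the pair-energy change succeeds.

    import numpy as np

    rng = np.random.default_rng(2)
    L, E0, kT = 32, 1.0, 0.05
    # Pair energies in units of E0 (eps_AV and eps_BV follow the excerpt;
    # the others are illustrative assumptions):
    EPS = {("A", "A"): -1.0, ("B", "B"): -1.0, ("A", "B"): -0.5,
           ("A", "V"): -0.6, ("B", "V"): -0.4, ("V", "V"): 0.0}
    NEIGHBORS = ((1, 0), (-1, 0), (0, 1), (0, -1))

    def bond(a, b):
        return E0 * EPS[tuple(sorted((a, b)))]

    def local_energy(g, i, j):
        """Sum of bond energies between site (i, j) and its 4 neighbors."""
        return sum(bond(g[i, j], g[(i+di) % L, (j+dj) % L])
                   for di, dj in NEIGHBORS)

    def vacancy_swap(g):
        """Metropolis move: swap a random site with a random neighbor
        when one of the two is a vacancy."""
        i, j = rng.integers(L), rng.integers(L)
        di, dj = NEIGHBORS[rng.integers(4)]
        k, m = (i + di) % L, (j + dj) % L
        if g[i, j] != "V" and g[k, m] != "V":
            return
        e_old = local_energy(g, i, j) + local_energy(g, k, m)
        g[i, j], g[k, m] = g[k, m], g[i, j]
        dE = local_energy(g, i, j) + local_energy(g, k, m) - e_old
        if dE > 0 and rng.random() >= np.exp(-dE / kT):
            g[i, j], g[k, m] = g[k, m], g[i, j]   # reject: undo the swap

    grid = rng.choice(["A", "B", "V"], size=(L, L), p=[0.45, 0.45, 0.10])
    for _ in range(200_000):
        vacancy_swap(grid)   # vacancies cluster as annealing proceeds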

Molecular mobility with respect to accessible volume in Monte Carlo lattice model for polymers

Figure 5: Monte Carlo simulation algorithm from 10^5 to 10^8 steps, for ratios of locked sites ranging from 0 (every site of the lattice that is not occupied by the polymer is accessible) to 0.6 (where, in practice, no lattice site remains accessible to the polymer).

Figure 6 shows the ratio of visited lattice sites with respect to the ratio of locked sites. For every tested duration, when the ratio of locked sites reaches around 38%, the molecular motion is close to null. Above this value, increasing the simulation duration does not increase the polymer mobility, which is in a frozen-like state. The 38% value thus appears as a threshold below which molecular mobility is activated and increases quickly. Remarkably, as the simulation duration increases, the change of mobility from null to full occurs over a very narrow span of the ratio of locked sites. Another way to present this result is to look at the duration required to visit a large part of the free sites of the lattice as a function of the ratio of locked sites. Figure 7 shows the duration needed to visit 95% of the unoccupied sites of the lattice as a function of the ratio of locked sites for the reference system. As the figure shows, this duration grows exponentially past 35% of locked sites.
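
A toy version of the underlying measurement (a single random walker on a square lattice with randomly locked sites; the lattice size and step counts are illustrative, not the paper's polymer kernel): count the fraction of accessible sites visited within a fixed number of steps.

    import numpy as np

    rng = np.random.default_rng(3)

    def visited_fraction(L=50, locked_ratio=0.3, steps=200_000):
        """Fraction of accessible sites visited by a blocked random walk."""
        locked = rng.random((L, L)) < locked_ratio
        locked[0, 0] = False                    # start site must be free
        visited = np.zeros((L, L), dtype=bool)
        visited[0, 0] = True
        i = j = 0
        moves = ((1, 0), (-1, 0), (0, 1), (0, -1))
        for _ in range(steps):
            di, dj = moves[rng.integers(4)]
            ni, nj = (i + di) % L, (j + dj) % L
            if not locked[ni, nj]:              # locked sites reject the move
                i, j, visited[ni, nj] = ni, nj, True
        return visited.sum() / (~locked).sum()

    # Mobility drops sharply as the locked ratio grows:
    for ratio in (0.1, 0.3, 0.38, 0.5):
        print(ratio, round(visited_fraction(locked_ratio=ratio), 3))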

Extension of the GATE Monte-Carlo simulation package to model bioluminescence and fluorescence imaging

Monte-Carlo (MC) simulations play an increasing role in medical imaging techniques involving radiation: positron emission tomography (PET), single photon emission computed tomography (SPECT), and computed tomography (CT). For these applications, simulations are used to help design and assess new imaging devices, and to optimize the acquisition and data processing protocols. The Geant4 Application for Emission Tomography (GATE) [1, 2] open-source simulation platform, based on the Geant4 toolkit [3, 4], has been developed since 2002 by the OpenGATE collaboration (www.opengatecollaboration.org) and is currently widely used by the research community involved in SPECT and PET molecular imaging. Moreover, the recent rise of optogenetics has opened the possibility to address and trigger action potentials in specific cells with light [5]. In these experimental paradigms, it is of great interest to study the light path and fluence distribution in tissues, as the penetration depth and deposited power per millimeter determine the spatial extension of cell activation. Yet, only coarse experimental measurements and simplified two-dimensional (2-D) modeling are currently used in this rapidly growing field [6]. The MC simulations might be a great...

Parallel Nested Monte-Carlo search

All experiments use the Morpion Solitaire disjoint model. All timing results are means over multiple runs of each algorithm, except for the results in parentheses, which were run only once. The standard deviation is given in parentheses after the time results. The algorithms were tested on playing only the first move of a game, and on playing an entire game. All experiments consist of testing the algorithms at levels 3 and 4 of nesting. Each rollout takes a slightly different time, since the random games inside a rollout can have different lengths; the standard deviations reflect these time variations.
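
For context, nested Monte-Carlo search (the algorithm being parallelized here) can be sketched as follows. The toy game, and the names legal_moves/play/score, are invented for the example; the published algorithm additionally memorizes the best sequence found so far, which this simplified sketch omits.

    import random

    # Toy single-player game standing in for Morpion Solitaire: build a
    # 10-bit sequence; one point per bit that differs from its predecessor.
    def legal_moves(state):
        return [0, 1] if len(state) < 10 else []

    def play(state, move):
        return state + [move]

    def score(state):
        return sum(a != b for a, b in zip(state, state[1:]))

    def rollout(state):
        """Level-0 search: finish the game with uniformly random moves."""
        while legal_moves(state):
            state = play(state, random.choice(legal_moves(state)))
        return score(state)

    def nested(state, level):
        """Level-n search: at each step, evaluate every legal move with a
        level-(n-1) search and play the best one."""
        while legal_moves(state):
            best, best_move = -1, None
            for move in legal_moves(state):
                child = play(state, move)
                s = rollout(child) if level == 1 else nested(child, level - 1)
                if s > best:
                    best, best_move = s, move
            state = play(state, best_move)
        return score(state)

    print(nested([], level=2))   # high-scoring sequence; the maximum is 9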

Contributions to Monte Carlo Search

What makes MCS so interesting in contrast to classic search algorithms is that it does not rely on prior knowledge of the problem. Before MCS, algorithms designed to decide the best possible move to execute mostly relied on an objective function, that is, a function designed by an expert in the field that decides which move is best. Obviously, the main problem with this approach is that such a function seldom covers every situation encountered, and the quality of the decision depends on the quality of the expert. Moreover, it is very difficult to tackle new problems, or problems where little information is available. MCS algorithms do not suffer from either of these drawbacks. All they require is a model of the problem at hand upon which they can execute (a lot of) simulations. Here lies the strength of these algorithms: not in the ability to abstract like the human brain, but in the raw computational power at which computers excel. Computers can run many simulations very quickly, and if needed, simulations are amenable to massive parallelization. More importantly, from an optimization point of view, most MCS algorithms can theoretically converge to the optimal decision given enough time. Before describing a well-known MCS algorithm, Section 1.2 first introduces the underlying framework needed to define the algorithms.
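
The simplest instance of this simulation-based idea is flat Monte Carlo move selection, sketched below on an invented push-your-luck dice game (the game and all names are illustrative): each candidate action is evaluated by averaging many random playouts, and the best average wins.

    import random

    def playout(total, first_action):
        """One random playout: 'roll' adds a d6 but a 1 busts to zero;
        'stop' banks the current total."""
        action = first_action
        while action == "roll":
            d = random.randint(1, 6)
            if d == 1:
                return 0
            total += d
            action = random.choice(["roll", "stop"])
        return total

    def flat_monte_carlo(total, n=20_000):
        """Pick the action with the best mean over n random playouts."""
        means = {a: sum(playout(total, a) for _ in range(n)) / n
                 for a in ("roll", "stop")}
        return max(means, key=means.get), means

    print(flat_monte_carlo(total=12))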

Improving Cloud Simulation using the Monte-Carlo Method

simulation. Such DES-based simulators require at least a platform specification and an application description. The available cloud DESs can be divided into two categories. In the first category are the simulators dedicated to studying clouds from the provider's point of view, whose purpose is to help evaluate the design decisions of the datacenter. An example of such a simulator is MDCSim [6], which offers specific and precise models for low-level components including the network (e.g. InfiniBand or Gigabit Ethernet), operating system kernel, and disks. It also offers a model for energy consumption. However, the cloud client activity that can be modeled is restricted to web-server, application-server, or database applications. GreenCloud [7] follows the same purpose with a strong focus on the energy consumption of the cloud's network apparatus, using packet-level simulation for network communications (NS2). In the second category (which we focus on) are the simulators targeting the whole cloud ecosystem, including client activity. In this category, CloudSim [8] is the most broadly used simulator in academic research. It offers simplified models of network communications, CPU, and disks. However, it is easily extensible and serves as the underlying simulation engine in a number of projects. Simgrid [3] is the other long-standing project, which, when used in conjunction with the SchIaaS cloud interface, provides similar functionalities to CloudSim. Among the other related projects, iCanCloud [9] was proposed to address scalability issues encountered with CloudSim (written in Java) for the simulation of large use-cases. Most recently, PICS [10] has been proposed to evaluate specifically the simulation of public clouds. The configuration of the simulator uses only parameters that can be measured by the cloud client, namely inbound and outbound network bandwidths, average CPU power, VM boot times, and scale-in/scale-out policies. The datacenter is therefore seen as a black box, for which no detailed description of the hardware setting is required. The validation study of PICS under a variety of use-cases has nonetheless shown accurate predictions.

Optimized Population Monte Carlo

magnitude and/or present strong correlations. Some families of AIS methods use geometric information about the target for the adaptation of the location parameters, yielding optimization-based adaptation schemes. For example, the GAPIS algorithm [18] is an AIS method that exploits the gradient and the Hessian of the logarithm of the target, and also introduces an artificial repulsion among proposals to promote diversity (without any resampling step). Other methods such as [19], [20] adapt the location parameters by performing, for each sample, several steps of the unadjusted Langevin algorithm (ULA) [21], which can also be seen as an instance of a stochastic gradient descent method. The covariance is also adapted in those methods, by either computing the sample autocorrelation [19] or using second-order information [18], [20]. Covariance adaptation has also been explored via robust moment-matching mechanisms in [22], [23]. We refer the interested reader to the survey [6]. The use of optimization techniques within the PMC framework, however, remains unexplored. It is worth mentioning that optimization-inspired schemes have also been shown to be an efficient strategy to improve the practical convergence rate of MCMC algorithms (see the survey paper [24] and references therein). In particular, the works [25], [26], [27], [28], [29] fall within the framework of the so-called Metropolis-adjusted Langevin algorithms (MALA), where the ULA scheme is combined with a Metropolis-Hastings step. The Langevin-based strategy yields proposed samples that are more likely drawn from a highly probable region, with the consequence of a larger acceptance probability. MALA can be further improved by rescaling the drift term by a preconditioning matrix encoding local curvature information about the target density, through the Fisher metric [30], the Hessian matrix [31], [32], [28], or a tractable approximation of it [33], [34], [35], [36]. Optimization-based methods for accelerating MCMC sampling of non-differentiable targets have also been considered, for instance in [26], [37].
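
The ULA and MALA updates referred to above have a compact generic form, sketched here for a standard Gaussian target (the target, the step size gamma, and the function names are illustrative choices, not those of any cited work):

    import numpy as np

    rng = np.random.default_rng(4)

    def log_target(x):
        return -0.5 * np.sum(x**2)      # standard Gaussian (illustrative)

    def grad_log_target(x):
        return -x

    def ula_step(x, gamma):
        """Unadjusted Langevin step: gradient drift plus Gaussian noise."""
        return (x + gamma * grad_log_target(x)
                + np.sqrt(2 * gamma) * rng.normal(size=x.shape))

    def mala_step(x, gamma):
        """ULA proposal corrected by a Metropolis-Hastings accept/reject."""
        y = ula_step(x, gamma)
        def log_q(b, a):                # log density of proposing b from a
            return -np.sum((b - a - gamma * grad_log_target(a))**2) / (4 * gamma)
        log_alpha = log_target(y) + log_q(x, y) - log_target(x) - log_q(y, x)
        return y if np.log(rng.random()) < log_alpha else x

    x = np.zeros(2)
    for _ in range(1000):
        x = mala_step(x, gamma=0.1)
    print(x)   # a (correlated) draw from the 2-D Gaussian target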

Clock Monte Carlo methods

RKKY-type interactions. We then consider the 2D and 3D Heisenberg models with oscillatory Ruderman-Kittel-Kasuya-Yosida (RKKY) interactions J_ij = J_0 [cos(2 k_F r_ij) / r_ij^d] exp(-r_ij / λ), where d is the spatial dimension, k_F is the Fermi vector (k_F ≈ 4.91 for the spin-glass system CuMn), and λ is the characteristic length in the damping term [37, 39-43]. Due to their approximate description of real materials, rich behaviors, and important roles in bridging the experimental study of glassy materials and the spin-glass theory of short-range interactions [37, 39], these systems are under extensive study. For simplicity, we set J_0 = 1 and k_F = π, and take λ = 3 for 3D and λ = ∞ for 2D, so that the system is in the class of strict (3D) and marginal (2D) extensivity. The simulations are at β(2D) = 1 and β(3D) = 0.693, close to the critical temperature β_c = 0.693003(2) for the 3D pure Heisenberg model [44]. Box sizes are set to 1, and the achieved acceleration is again A ∼ O(N) for strict extensivity and A ∼ O(N/ln N) for marginal extensivity, as illustrated in Fig. 4.
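
The coupling above is straightforward to evaluate; a small helper with the excerpt's simplified parameter values (the function name is ours):

    import numpy as np

    def rkky_coupling(r, d=3, J0=1.0, kF=np.pi, lam=3.0):
        """RKKY coupling J(r) = J0 * cos(2*kF*r) / r**d * exp(-r/lam)."""
        return J0 * np.cos(2.0 * kF * r) / r**d * np.exp(-r / lam)

    # Oscillating, rapidly decaying couplings at a few distances:
    for r in (0.5, 1.0, 2.0, 4.0):
        print(r, rkky_coupling(r))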

Validation of a Monte Carlo prediction model for portal images using PENELOPE

- Online verification of dose delivery during radiotherapy treatments has become essential in order to ensure that the dose planned by the treatment planning system (TPS) is delivered as accurately as possible, and to detect possible deviations.
- One strategy using EPIDs for dosimetric verification consists in comparing predicted dose images with acquired portal images before and/or during treatment [1].
- Our goal is twofold:

Analysis of correlations and their impact on convergence rates in Monte Carlo eigenvalue simulations

Abstract

This paper provides an analysis of the generation-to-generation correlations observed when solving full-core eigenvalue problems on PWR systems. Many past studies have looked at the impact of these correlations on the reported variance; this paper extends the analysis to the observed convergence rate of the tallies, the effect of tally size, and the effect of generation size. Since performing meaningful analysis on such a large problem is inherently difficult, a simple homogeneous reflective cube problem with an analytical solution was developed that exhibits behavior similar to the full-core PWR benchmark. The data in this problem were selected to match the dimensionality of the reactor problem and preserve the migration length traveled by neutrons. Results demonstrate that the variance deviates significantly from the 1/N convergence rate (N being the number of simulated particles) associated with truly independent generations, but eventually asymptotes to 1/N after thousands of generations, regardless of the number of neutrons per generation. This indicates that optimal run strategies should emphasize a lower number of active generations with a greater number of neutrons per generation to produce the most accurate tally results. This paper also describes and compares three techniques to evaluate suitable confidence intervals in the presence of correlations: one based on history statistics, one using generation statistics, and one batching generations to reduce batch-to-batch correlation.
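
Of the three confidence-interval techniques mentioned, generation batching is the simplest to sketch. The snippet below uses synthetic AR(1)-correlated "generation tallies" (an assumption for illustration, not the paper's reactor data): adjacent generations are grouped into batches so that batch means are nearly independent.

    import numpy as np

    rng = np.random.default_rng(5)

    # Synthetic generation-averaged tallies with AR(1) correlation,
    # mimicking generation-to-generation correlation in eigenvalue runs.
    n_gen, rho = 4000, 0.8
    noise = rng.normal(size=n_gen)
    tally = np.empty(n_gen)
    tally[0] = noise[0]
    for g in range(1, n_gen):
        tally[g] = rho * tally[g-1] + np.sqrt(1 - rho**2) * noise[g]

    def batched_ci(x, batch):
        """Standard error of the mean from `batch`-generation batch means."""
        m = x[: len(x) // batch * batch].reshape(-1, batch).mean(axis=1)
        return m.std(ddof=1) / np.sqrt(len(m))

    naive = tally.std(ddof=1) / np.sqrt(n_gen)     # assumes independence
    print("naive:", naive, "batched:", batched_ci(tally, batch=50))
    # The naive estimate understates the true uncertainty when rho > 0.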

Monte Carlo Methods in Statistics

∫ h(θ) π(θ|x) dθ (and Bayes tests also involve integration, as mentioned earlier with the Bayes factors), and optimisation difficulties with the likelihood perspective, this classification is by no means tight (as for instance when likelihoods involve unmanageable integrals), and all fields of Statistics, from design to econometrics, from genomics to psychometry and environmics, now have to rely on Monte Carlo approximations. A whole new range of statistical methodologies has entirely integrated the simulation aspects. Examples include the bootstrap methodology (Efron 1982), where multilevel resampling is not conceivable without a computer; indirect inference (Gouriéroux et al. 1993), which constructs a pseudo-likelihood from simulations; MCEM (Cappé and Moulines 2009), where the E-step of the EM algorithm is replaced with a Monte Carlo approximation; and the more recent approximate Bayesian computation (ABC) used in phylogenetics (Beaumont et al. 2002), where the likelihood is not manageable but the underlying model can be simulated from.
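
The integral at the start of this excerpt is the canonical Monte Carlo target; here is a minimal illustration (a conjugate normal model chosen because the answer can be checked exactly, not an example from the paper):

    import numpy as np

    rng = np.random.default_rng(6)

    # Posterior expectation E[h(theta) | x] = integral of h(theta) pi(theta|x),
    # approximated by averaging h over posterior draws.
    # Model (illustrative): theta ~ N(0, 1), x | theta ~ N(theta, 1), x = 2.
    # Conjugacy gives the exact posterior N(1, 1/2).
    x = 2.0
    post_mean, post_var = x / 2.0, 0.5

    h = lambda t: t**2
    draws = rng.normal(post_mean, np.sqrt(post_var), size=100_000)
    print("MC estimate:", h(draws).mean())          # ~ E[theta^2 | x]
    print("exact      :", post_var + post_mean**2)  # Var + mean^2 = 1.5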

Correlations in Monte Carlo eigenvalue simulations : uncertainty quantification, prediction and reduction

correlation coefficients also explain the particular features of the variance convergence rates observed numerically. Chapter 3 focuses on predicting correlation coefficients before starting the active generations of the simulation. This not only provides a correction to the underestimated variance, but also provides an estimate of when a target accuracy will be reached. A Markov chain Monte Carlo model is developed to show that the dependence of the neutron source bank between consecutive generations contributes a significant fraction of the generation-to-generation correlation of tallies, but misses an important part caused by the multiplying effect of fission. The method of multitype branching processes (MBP) is developed to capture both the source-bank dependence and the correlation due to multiplication. The MBP model is exact in predicting correlations for neutrons transported in a discrete phase space, and provides acceptable accuracy in continuous problems by constructing an approximate discrete problem from tallies. Since the MBP model is a discrete approximation, it can accurately predict correlations and variances on tally meshes coarser than the discretized model mesh.

Monte Carlo power iteration: Entropy and spatial correlations

Figure 4: Homogeneous cube reactor. The behaviour of the measured Shannon entropy S(g) during Monte Carlo power iteration as a function of the generation g, for different initial population sizes N and fixed L = 400 cm. The guess source at g = 0 consists of N neutrons located at the center of the cube. Power iteration is run for 1000 generations. Upper red curve: N = 10^5; central green curve: N = 10^4; lower blue curve: N = 10^3. The dashed lines represent the expected entropy value S_N as in Eq. (6) (red: N = 10^5, green: N = 10^4, blue: N = 10^3, respectively), and the solid black line is the ideal expected entropy S_∞ for an infinite number of particles per generation, as in Eq. (5).
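
The Shannon entropy tracked in this figure is computed from the binned source distribution. A minimal version is sketched below (the mesh size, the flat toy source, and the function name are assumptions; Eqs. (5)-(6) of the paper are not reproduced here):

    import numpy as np

    rng = np.random.default_rng(7)

    def shannon_entropy(positions, L=400.0, bins=8):
        """S = -sum_i p_i log2 p_i over a bins^3 spatial mesh of the cube."""
        hist, _ = np.histogramdd(positions, bins=(bins,)*3, range=[(0, L)]*3)
        p = hist.ravel() / hist.sum()
        p = p[p > 0]                      # convention: 0 * log(0) = 0
        return -(p * np.log2(p)).sum()

    # Undersampling bias: small N gives entropy below the converged value.
    for N in (10**3, 10**4, 10**5):
        pos = rng.uniform(0, 400.0, size=(N, 3))   # toy converged (flat) source
        print(N, round(shannon_entropy(pos), 3))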

Addressing nonlinearities in Monte Carlo

Monte Carlo is famous for accepting model extensions and model refinements up to infinite dimension. However, this powerful incremental design is based on a premise which has severely limited its application so far: a state-variable can only be recursively defined as a function of underlying state-variables if this function is linear. Here we show that this premise can be alleviated by projecting nonlinearities onto a polynomial basis and increasing the configuration-space dimension. Considering phytoplankton growth in light-limited environments, radiative transfer in planetary atmospheres, electromagnetic scattering by particles, and concentrated solar power plant production, we prove the real-world usability of this advance in four test cases which were previously regarded as impracticable using Monte Carlo approaches. We also illustrate an outstanding feature of our method when applied to acute problems with interacting particles: handling rare events is now straightforward. Overall, our extension preserves the features that made the method popular: addressing nonlinearities does not compromise on model refinement or system complexity, and convergence rates remain independent of dimension.
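
A minimal illustration of why polynomial nonlinearities are Monte Carlo-compatible (a generic textbook toy, not the paper's algorithm): a polynomial of an expectation, here (E[Y])^2, can be estimated without bias by multiplying independent copies of Y, whereas squaring the estimator itself introduces bias.

    import numpy as np

    rng = np.random.default_rng(8)

    # Target: (E[Y])**2 with Y ~ Uniform(0, 1), so the exact value is 0.25.
    n = 100_000
    y1, y2 = rng.random(n), rng.random(n)   # two independent copies of Y

    biased = (y1.mean())**2                 # squaring the estimator is biased
    unbiased = (y1 * y2).mean()             # E[Y1*Y2] = E[Y]**2 (independence)

    print("biased:", biased, "unbiased:", unbiased, "exact:", 0.25)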
