Abstract
This paper discusses aspects of risk and uncertainty relevant to an interdisciplinary assessment of climate change policy. It contrasts not only the objective approach with the subjective approach, but also situations in which precise probabilities are well founded with situations involving broader forms of error such as Knightian or deep uncertainty, incompleteness, and vagueness. Additional human and social dimensions of ignorance are discussed: strategic uncertainties, surprises, value diversity, and taboos. We argue that the broader forms of error affect all sciences, including those studying Nature. For these aspects, the IPCC guidance notes provide an interdisciplinary unified approach to risk and uncertainty. This is a significant advance over a simple multidisciplinary juxtaposition of approaches. However, these guidance notes are not universal: they mostly omit the human and social dimensions of ignorance.

2, by relaxing the equality constraints into inequality constraints. This way, it is possible to quantify the sensitivity of the maximal quantile to the moment values.
7. SUMMARY
Metamodels are widely used in industry to perform uncertainty propagation, in particular to evaluate measures of risk such as high quantiles. In this work, we successfully increased the robustness of the quantile evaluation by removing the main sources of uncertainty tainting the inputs of the computer code. We evaluated the maximum measure of risk over a class of distributions. We focused on sets of measures known only through some of their moments, and adapted the theory of canonical moments into an improved methodology for solving OUQ problems. Our objective function is parameterized with the canonical moments, which allows the natural integration of the constraints. The optimization can therefore be performed free of constraints, thus drastically increasing its efficiency. The restriction to moment constraints suits most practical engineering cases. We also provide an algorithm to deal with inequality constraints on the moments, when an uncertainty lies in their values. Our algorithm shows very good performance and great adaptability to constraints of any order. However, the optimization is subject to the curse of dimensionality, and the number of input parameters should therefore be kept under 10.
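The idea of maximizing a quantile over a class of distributions constrained by moments can be illustrated on a toy case. The sketch below restricts the search to two-point distributions (the extreme points of the mean-variance moment class) and compares a plain grid-search maximum of the 95% quantile with the closed-form optimum given by the one-sided Chebyshev (Cantelli) bound; it uses brute force rather than the canonical-moments parameterization developed here.

```python
import numpy as np

# Moment class: all probability measures with mean m and variance v.
# Its extreme points are two-point distributions: mass w at x1, 1-w at x2.
m, v, p = 0.0, 1.0, 0.95

w = np.linspace(1e-4, 1 - 1e-4, 200_000)
x1 = m - np.sqrt(v * (1 - w) / w)   # lower support point
x2 = m + np.sqrt(v * w / (1 - w))   # upper support point
# p-quantile of each two-point distribution: x1 if the CDF reaches p at x1.
q_p = np.where(w >= p, x1, x2)

q_max = q_p.max()
# Closed-form optimum over the whole class (one-sided Chebyshev/Cantelli):
q_bound = m + np.sqrt(v * p / (1 - p))
print(q_max, q_bound)  # both ~4.359
```

The grid maximum approaches the bound as the mass at the lower point approaches the quantile level p, consistent with the extreme points of the moment class being discrete measures.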

these two different spatial datasets. To the best of our knowledge, this source of uncertainty has never been carefully discussed in any natural risk assessment.
Our paper is an attempt in this direction. We try to answer the following questions: how much can the outputs of a natural risk assessment vary depending on the HASO technique used? Is this variability important with respect to other sources of uncertainty in natural risk assessments? To narrow the scope of our work, we focus on a single case study, which investigates flood damage estimates on the Orb Delta, France.

Depending on the nature of the spatial datasets used, various techniques can be considered to perform the HASO spatial overlay analysis, and important variations may be observed from one technique to another. Unfortunately, this problem is often overlooked in the natural hazard literature. To the best of our knowledge, this source of uncertainty has never been carefully discussed in any natural risk assessment.

Abstract: This article analyses the management of hydropower dams under monopolistic and oligopolistic competition, when hydroelectricity producers are risk averse and face demand uncertainty. For each type of market structure we analytically determine the water release path in closed-loop equilibrium. We show how a monopoly can manage its hydropower dams through additional pumping or storage, depending on the relative abundance of water between regions, to smooth the effect of uncertainty on electricity prices. In the oligopolistic case with a symmetric risk aversion coefficient, we determine the conditions under which the relative scarcity (abundance) of water in the dam of a hydroelectric operator can favor additional strategic pumping (storage) in its competitor's dams. When the risk aversion coefficients are asymmetric, the firm's hydroelectricity production increases as its competitor's risk aversion increases, if and only if the average recharge speed of the competitor's dam exceeds a certain threshold, which is an increasing function of its average water inflows.

Chapter 1 Introduction
Space programs are increasingly complex and suffer from uncertainty in many quantities of interest, leading to schedule and cost overruns. NASA's "Faster, Better, Cheaper" approach worked for some programs (for example, Stardust as described in Atkins, 2003 [3]), but for several high-profile Mars missions this approach led to failure, and the aerospace industry abandoned it [17]. While many papers have been written giving statistics on cost and schedule overruns in aerospace systems, we can build on this literature by identifying practical next steps for implementing the knowledge from these statistics to efficiently manage uncertainty in space systems. As summarized by Collopy (2011), leading industry systems engineers agreed during a series of NASA and NSF workshops that uncertainty management is one of the most needed areas of research in space systems engineering [8]. According to Collopy, "Systems engineering's first line of defense against uncertainty is a semi-quantitative risk management process with no rigorous foundation in the theory or calculus of probability." [8]


3.4. Uncertainty assessment
The uncertainty characterizes the dispersion of the values that could reasonably be attributed to the measurand [28]. Several uncertainty results were generated and are presented in Table 2: the uncertainty of the bias of the method at each concentration level of the validation standard; the combined uncertainty, which merges the uncertainty of the bias with the uncertainty of the method obtained during the validation step, i.e. the intermediate precision standard deviation; and the expanded uncertainty, which equals the combined uncertainty multiplied by a coverage factor k = 2, representing an interval around the results within which the unknown true value can be found with a confidence level of 95% [28]. In addition, the relative expanded uncertainties (%) for serum and urine, obtained by dividing the corresponding expanded uncertainties by the corresponding introduced concentrations (Table 2), are not higher than 10%, which means that, with a confidence level of 95%, the unknown true value is located at a maximum of ±10% around the measured result. Only for the first concentration level of iohexol in serum (8.63 μg/ml) is the relative expanded uncertainty extremely high, about 36%.
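As a worked illustration of these definitions (with hypothetical numbers, not the values reported in Table 2), the expanded and relative expanded uncertainties follow directly from a combined standard uncertainty and a coverage factor:

```python
u_c = 0.35            # hypothetical combined standard uncertainty, ug/ml
k = 2                 # coverage factor for ~95% confidence level
U = k * u_c           # expanded uncertainty, ug/ml
concentration = 15.0  # hypothetical introduced concentration, ug/ml
relative_U = 100 * U / concentration  # relative expanded uncertainty, %
print(relative_U)     # ~4.7%, below the 10% acceptance criterion
```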

2.3 Plackett-Burman design
Monte Carlo simulations require knowledge of the distribution function (probability distribution) of the values of the relevant variables in the techno-economic model. Information about these probabilities is often absent, and the best one can do is to assign probabilities on the basis of one's own opinion and experience. Because the probabilities used in the Monte Carlo simulations are estimated on a subjective basis expressing our degrees of belief, Van Groenendaal and Kleijnen (1997) doubt the usefulness of Monte Carlo simulations. They propose methods from design of experiments (DOE), often used in industrial research, as an alternative to Monte Carlo simulations, to provide information on which factors or independent variables can make an investment project "go wrong", without requiring knowledge of probability distributions. These independent variables are the uncertain variables identified in step 3 of the unifying approach expressing economic risk, as explained in section 2.2. Hence, the independent variables of the experimental design are the same as the uncertain variables for which probability distributions have been defined in the Monte Carlo simulations. Because Van Groenendaal (1998) expects that decision makers are mainly interested in information on what can go wrong, he suggests analysing changes in the values of independent variables that have a negative impact on the dependent variable. The latter is the NPV, i.e. the overall system performance measure determined in step 1 of the unifying approach expressing economic risk. To determine these negative effects, the first step is to apply a one-factor-at-a-time sensitivity analysis. It is assumed that every factor or independent variable takes on one of two values: -1 if the independent variable is "off" and +1 if the independent variable is "on".
In other words, +1 corresponds to the base-case value of the corresponding independent variable, whereas -1 stands for the value that has a negative influence on the dependent variable. In DOE, the effect of changes in the values of the uncertain independent variables on the NPV, i.e. the dependent variable, is thus obtained by simulating the extreme points of the value ranges and estimating a linear regression meta-model to detect which independent variables are important (Van Groenendaal and Kleijnen 1997).
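The two-level coding and regression meta-model can be sketched as follows. For simplicity the sketch uses a 2^3 full factorial design rather than a Plackett-Burman design, and an invented linear NPV model; the factor names and coefficients are purely illustrative.

```python
import numpy as np
from itertools import product

# Hypothetical NPV model with three uncertain factors, coded
# +1 (base-case value) / -1 (unfavourable value). Illustrative only.
def npv(price, cost, volume):
    return 100 + 30 * price + 5 * cost - 2 * volume

# Two-level full factorial design: all 8 combinations of -1/+1.
# A Plackett-Burman design gives the same first-order estimates
# with fewer runs when there are many factors.
X = np.array(list(product([-1.0, 1.0], repeat=3)))
y = np.array([npv(*row) for row in X])

# First-order regression meta-model: y ~ b0 + b1*price + b2*cost + b3*volume
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)  # [100. 30. 5. -2.] -> price is the dominant factor
```

The estimated coefficients recover the true effects exactly here because the underlying model is linear; with a real techno-economic model, the regression instead gives first-order approximations of each factor's influence on the NPV.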

easily be counterbalanced by permits reallocation from other installations within the group. This situation is confirmed by visual inspection of the data in Figures 1 to 3. Overall, the parent company is a net seller on the permits market. In Table 4, Dalkia also exhibits a large surplus of 2.4M EUAs in 2007. 4 out of 125 installations are net short, which suggests similarly that their deficit may be compensated internally by the parent company, thereby covering the risk of a permits shortage for its subsidiaries. From Figures 4 to 6, one may remark that the distribution of installations is very heterogeneous, with two installations above 1M of allocated allowances holding substantial surpluses. On a smaller scale, Eesti Energia displays in Table 5 a net long position of 0.27M EUAs in 2007, for the one installation reported in the Reuters Carbon Market Database. Without commenting further on the possibility of pooling risks, Figures 7 to 9 reveal that this permits surplus increased from 2005 to 2007. This second sub-sample of firms confirms the liquidity of the permits market in terms of extra allowances available for trading during each compliance period. Given this high level of heterogeneity between firms, if parent companies are still in a net short position after pooling allowances internally, they may buy allowances on the market to be globally in compliance.

Chapter 1
Introduction
1.1 General introduction
With most commodity prices near all-time lows in the last several years and exploration expenditures kept to a minimum, mining companies have had to rely on breakthroughs in technology to lower their operating costs and find new deposits. Among the new technologies developed, we can mention the global positioning system (GPS), the haul truck dispatch system, the drill navigation system, heap leaching, bio-leaching and a series of geophysical methods such as induced polarisation (IP), magnetic resistivity and so on. Also included in this group are techniques used to model and estimate mineral deposits. Recently developed techniques comprise indicator-based algorithms for kriging and conditional simulation. During the last 20 years, mining companies realized that in order to stay competitive and maintain their profit margins, they not only had to embrace those new technologies, but also had to invest in research and development. People at Inmet Mining understood this and decided to fund the present project. The objective of this project was to estimate the mineral reserves of the Troilus gold deposit with a non-linear interpolation method and to assess the uncertainty of the mineralization through conditional simulation.


support bounded on one side by the mode). Consequently, the extreme points are parametric measures, which is convenient for the optimization of the QoI. This result is what makes these two measure spaces and the OUQ framework so appealing.
Beyond the measure space, the other theoretical advantage of our framework is that it allows a wide choice of optimization functions. Indeed, quasi-convexity embeds a very large class of functions. The lower semicontinuity assumption can also be replaced by upper semicontinuity associated with a compactness assumption on the measure space. If so, the reduction theorem is known as the Bauer maximum principle. One can also dispense with any regularity assumption when studying measure affine functions or ratios of measure affine functions. In addition to the theoretical aspects of the optimization function, we studied in Chapter 5 some useful practical QoIs. Most of them are classical quantities, but we consider them here as functions of the input measure of the computer model. In this context, we have shown that a failure probability is a measure affine function, and that a quantile is a quasi-convex lower semicontinuous function. We also studied the optimization of superquantiles and Sobol' indices. Moreover, we embedded robust Bayesian analysis into our theoretical framework, showing that it is a particular case of an OUQ problem. Nevertheless, we emphasize that the literature on robust Bayesian analysis also explores prior measure spaces that are different from the moment class and the unimodal moment class.


1 INTRODUCTION
Interest has increased in recent years in the implementation of new generations of biorefineries using a variety of biomass residues and non-food crops. As a consequence of this interest, the agricultural sector in Canada has looked closely at integrating biorefinery activities into its current processes as a strategy to potentially enhance the competitiveness of the future industry. One of the biomass crops that may be used as biorefinery feedstock is triticale (X Triticosecale Wittmack). This crop has attracted much interest, especially in Alberta, Canada, because it has characteristics similar to wheat while not interfering with the food chain. This energy crop is a hybrid of wheat and rye and brings together the advantages of both crops: the high yield potential and grain quality of wheat, and the environmental tolerance of rye (http://en.wikipedia.org/wiki/Triticale, 2013). The unique advantages of triticale, including its ability to grow on marginal land, higher yields compared to wheat, and non-competition with food-based crops, position it as a promising energy crop for the biorefinery industry 1 . The Canadian Triticale Biorefinery Initiative (CTBI) Network is a research and development program which has focused on developing triticale as an industrial biorefining crop for Canada (http://www.ctbi.ca, 2013). Its achievements have shown that a variety of possible triticale-based product-process combinations exist. However, not all of these options are necessarily sustainable, and no study to date has sought to identify the most promising triticale-based biorefinery strategies.


their counterparts. Lachaud (2007) provides a detailed study of the determinants of HIV infection in Burkina Faso. Using the Demographic and Health Survey collected in 2003, he shows that the probability of being HIV-infected increases with non-monetary welfare, proxied by an index of physical assets. Distinguishing men and women, de Walque (2006) does not confirm the positive relation found in the case of Burkina Faso, but he validates this pattern in Cameroon for both males and females, and in Ghana, Kenya and Tanzania, where rich women are found to be more likely to be infected than the poor. Kazianga (2004) and Luke (2006) give some insights into two channels through which wealth might be related to a higher risk of HIV infection. It is documented that in Sub-Saharan Africa, beyond transactions with commercial sex workers, money and gifts drive most extramarital or casual sex. A rich man seems more vulnerable due to his potentially extended sexual network. The role of transfers is all the more crucial considering that Luke (2006) points out a negative relationship between transfers and condom use in informal relationships in urban Kenya. On the other hand, Kazianga (2004) demonstrates that the demand for casual sex increases with wealth for urban men in Burkina Faso and rural men in Guinea and Mali.

the volatility in yields as the independent variable of interest. The evidence shows a positive association between the variance of the fluctuations in yields and the prevalence in most estimations. Countries in which yields are highly volatile are much more affected by the epidemic than their counterparts. High and frequent crop shocks discourage people from investing in self-protective behaviors, all the more so since in most countries the vast majority of the population works in the agricultural sector, has no outside option to avoid crop shocks, and sees its livelihood directly affected by these fluctuations in yields. In columns 1 and 2, GDP per capita is still significantly and positively related to the prevalence. Contrary to previous estimations, the effect of GDP per capita is not offset by the effect of economic volatility, because here the volatility in yields is used instead of its own volatility. However, in Column 1, the standardized regression coefficient for volatility in yields exceeds that for income, meaning that a standard-deviation change in yields instability has a greater impact on prevalence than a standard-deviation change in GDP growth. An increase of one standard deviation in yields instability and in GDP growth leads to a rise in the prevalence of 0.46% and 0.15%, respectively. To get the same change in prevalence as that resulting from a one-standard-deviation change in yields instability, GDP per capita growth must increase by three standard deviations.
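The "three standard deviations" statement follows directly from the ratio of the two standardized effects quoted above:

```python
effect_yields = 0.46  # prevalence rise per 1 s.d. of yields instability (%)
effect_gdp = 0.15     # prevalence rise per 1 s.d. of GDP growth (%)
ratio = effect_yields / effect_gdp
print(round(ratio, 2))  # ~3.07 -> roughly three standard deviations
```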

4. Global sensitivity analysis of functional risk curves
The objective of sensitivity analysis is to determine the input variables that most influence the model response [9, 29, 30]. Global sensitivity analysis methods take into account the overall uncertainty of the input parameters. Previous works in the POD [31] and seismic fragility [32] contexts have only considered sensitivity analysis on the variance and distribution of the model output variable Y. In this section, we propose two global sensitivity indices attached to the whole FRC: one already introduced in [13] (aggregated Sobol' indices) and a new one based on a completely different idea (perturbed-law based indices). The parameter of interest a, being the FRC abscissa, will therefore not be treated as an input variable whose sensitivity index has to be computed.
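For readers unfamiliar with Sobol' indices, the sketch below estimates first-order indices with a standard pick-freeze Monte Carlo estimator on a deliberately simple toy model; it is not the aggregated-FRC index proposed here, only the basic building block.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model with two independent standard normal inputs (illustrative only):
# Y = X1 + 0.5*X2, so Var(Y) = 1.25, S1 = 1/1.25 = 0.8, S2 = 0.25/1.25 = 0.2.
def model(X):
    return X[:, 0] + 0.5 * X[:, 1]

n, d = 200_000, 2
A = rng.standard_normal((n, d))
B = rng.standard_normal((n, d))
yA = model(A)

S = []
for i in range(d):
    C = B.copy()
    C[:, i] = A[:, i]  # "freeze" input i: same value in both samples
    yC = model(C)
    # Pick-freeze estimator of the first-order index S_i = Cov(yA, yC)/Var(yA)
    S.append((np.mean(yA * yC) - np.mean(yA) * np.mean(yC)) / np.var(yA))

print(np.round(S, 2))  # ~[0.8, 0.2]
```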

wind generation prediction system should be seen as a multi-objective problem, defined from the end-user's specific objectives (cost minimization on a specific power market, control of the large forecasting errors, etc.). A model performance evaluation is done on a "global" basis, i.e. over a long period of time. However, this performance is highly variable from one season to another, from one month to another, or even from one day to another. Therefore, in contrast to that "global" performance evaluation, it is necessary to provide end-users with the possibility to assess the prediction accuracy in a more dynamic way. The two main features we aim to introduce here are the uncertainty and prediction

4.1. INTRODUCTION
It is well known that an asset price is theoretically the discounted value of expected future cash flows, which are random processes and must be estimated. Thus, the value of an asset is an estimate obtained under uncertainty and may therefore be represented by a random variable. In other words, at any fixed time, the value of an asset is not observable but is a random variable whose law, or at least whose mean value and variance, may be characterized. In line with this point of view, the presence of many sell and buy order prices can be explained by the different risk aversions of market participants. In order to clarify this idea, we consider a "representative" price-setting market participant who has to place both a buy and a sell limit order, i.e. the prices and the numbers of shares he is willing to buy and sell. Prior to setting the buy and sell orders, he obtains the distribution of possible asset values from the market information, but has no possibility to observe the asset's realized values. A rational decision is to send a limit buy (sell) order with a price lower (higher) than the asset's mean value, such that the difference justifies the risk taken. Of course, he adjusts those prices by increasing them if he runs short of stock, or cutting them if he starts accumulating excessive stocks.
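The quoting logic described here can be made concrete with a small numerical sketch. The symmetric spread formula and the linear inventory skew below are our own illustrative assumptions (in the spirit of inventory-based market-making models), not the model developed in this chapter; all numbers are hypothetical.

```python
# Hypothetical quote-setting around the asset's estimated mean value.
mu, sigma = 100.0, 2.0      # assumed mean and std. dev. of the asset value
risk_aversion = 0.5         # hypothetical risk-aversion coefficient
inventory, target = 30, 50  # shares held vs. desired stock level

# Spread proportional to risk aversion: buy below / sell above the mean.
half_spread = risk_aversion * sigma
# Short of stock -> raise both quotes; excess stock -> cut them.
skew = 0.01 * (target - inventory)
bid = mu - half_spread + skew
ask = mu + half_spread + skew
print(bid, ask)  # 99.2 101.2
```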


At the initial design stage, engineers often rely on low-fidelity models that have high epistemic uncertainty. Traditional safety-margin-based deterministic design resorts to testing (e.g. prototype experiments, evaluation of high-fidelity simulations, etc.) to reduce epistemic uncertainty and achieve targeted levels of safety. Testing is used to calibrate models and to prescribe redesign when tests are not passed. After calibration, reduced epistemic model uncertainty can be leveraged through redesign to restore safety or improve design performance; however, redesign may be associated with substantial costs or delays. In this paper, a methodology is described for optimizing the safety-margin-based design, testing, and redesign process to allow the designer to trade off between the risk of future redesign and the possible performance and reliability benefits. The proposed methodology represents the epistemic model uncertainty with a Kriging surrogate and is applicable to a wide range of design problems. The method is illustrated on a cantilever beam bending example and then on a sounding rocket example. It is shown that redesign acts as a type of quality control measure, preventing an undesirable initial design from being accepted as the final design. It is found that the optimal design/redesign strategy for maximizing expected design performance includes not only redesign to correct an initial design that is later revealed to be unsafe, but also redesign to improve performance when an initial design is later revealed to be too conservative (e.g. too heavy).

Moreover, Conconi et al. (2016) use data on Belgian firms to conduct their analysis. We use two sources of industry-level FDI data: EUROSTAT data for the European Union Member States and BEA (Bureau of Economic Analysis) data for the United States. These data are collected for several industries, including the food processing, chemical and transport equipment industries. Our database is unique because we are able to collect bilateral data (allowing us to exploit the dimension of the countries of origin and destination), sectoral data, as well as annual data on FDI. We also collected data on FDI in European Union Member States and the United States. The choice of countries forming our database may be rationalized by the fact that the major global actors of FDI in the food industry are European Union Member States and the United States. We focus our analysis on manufacturing industries, similarly to Ramondo et al. (2014). The interest in studying this question for different industries is motivated by the fact that comparative advantages are heterogeneous across sectors (Antràs and Yeaple, 2014). Thus, we do not expect all industries to be impacted by risk to the same extent. We will see how the effect of risk on FDI timing changes across different manufacturing industries. We use several variables to capture the degree of uncertainty multinational firms face in our empirical analysis. The computation of the volatility measure is discussed in Section 1.3.2. We consider the volatility of output, the volatility of consumption, the volatility of exports and the volatility of imports. The goal is to determine the consequences of these risks on FDI timing. To understand the roles that consumption, production, import and export risk play in FDI timing, we assert that FDI exists under different forms. We distinguish between domestic-oriented FDI and export-oriented FDI 1 . In the first case, the main purpose of FDI is to satisfy


1 Introduction
Fibre-reinforced composites are known to be lightweight and to have high specific strength and excellent weathering stability. These properties make them ideal for very critical structural applications such as pressure vessels and components for aerospace applications [2, 3]. These two applications represent about 40% of the composite market by volume [4]. In-service safety and reliability assessment are key challenges for these load-bearing applications and require great care to be taken during their design. Although using high design safety factors on the influential parameters makes it possible to secure the structures in a deterministic manner, it conceals the extent of the risk that is being taken in doing so. Structural reliability analysis based on the omnipresent uncertainties can provide engineers with the required risk information, provided it is appropriately conducted. To do that, the inherent variability in the properties of the constituent materials should be determined or predicted as accurately as possible [5, 6].
