
3. THE APPLICATION OF JUDGEMENT AGGREGATION AND UNCERTAINTY

3.3. Sensitivity analysis and uncertainty treatment

There is a difference between an uncertainty analysis and a sensitivity analysis. An uncertainty analysis is performed in order to describe the range of possible outcomes for a given set of inputs (where each input has some uncertainty). A sensitivity analysis is performed in order to describe how sensitive the outcome variables are to the variation of individual input parameters. Since there may be multiple input parameters, a sensitivity analysis can help to determine which ones drive the majority of the variations in the outcome (for more details, see Annex III).
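To make the distinction concrete, the following sketch (with a purely illustrative cost model and input ranges, not taken from this publication) first propagates input uncertainty to a range of outcomes, and then varies one input at a time to see which one drives the output:

```python
import random

def luec(capital, fuel, om):
    """Toy levelized-cost model: the output depends on three uncertain inputs."""
    return 0.6 * capital + 0.25 * fuel + 0.15 * om

# Uncertainty analysis: propagate input uncertainty to a range of outcomes.
random.seed(1)
samples = [luec(random.uniform(90, 110), random.uniform(18, 22), random.uniform(9, 11))
           for _ in range(5000)]
outcome_range = (min(samples), max(samples))

# Sensitivity analysis: vary one input at a time around a base case and
# compare the induced swings in the output.
base = dict(capital=100.0, fuel=20.0, om=10.0)
swings = {}
for name, (lo, hi) in {"capital": (90, 110), "fuel": (18, 22), "om": (9, 11)}.items():
    low = luec(**{**base, name: lo})
    high = luec(**{**base, name: hi})
    swings[name] = abs(high - low)

# Here the capital term dominates the output variation (swing 12.0 vs 1.0 and 0.3).
```

The swing comparison identifies which parameter drives the majority of the variation, which is exactly the question a sensitivity analysis answers.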

3.3.1. General comments on uncertainty treatment

Uncertainties in input data need to be accounted for in the KIND framework, in particular when the focus is on less mature technologies characterized by a lack of detailed data in some areas relevant to the design, operation and costs. There is no universal guidance on uncertainty treatment, but the widely implemented steps are: the identification and estimation of sources of uncertainty, and the evaluation of uncertainty in the results.

The sources of input uncertainties in the KIND approach can be objective, associated with indicator values, and subjective, associated with indicator weights. Additional uncertainty may be related to specific parameters used in a particular MCDA tool, for example, uncertainties associated with the shape of single-attribute value/utility functions (in MAVT/MAUT) or preference functions (PROMETHEE).

FIG. 3.4. Presentation of results. (a) Value path; (b) radar chart; (c) bar chart; (d) pie chart.

While evaluating the impact of uncertainty on the weighting factor vector w = (w1, ..., wm), the fact that w satisfies the restrictions 0 ≤ wi ≤ 1 and w1 + ... + wm = 1 needs to be respected. In this regard, special procedures for generating weights have to be included within MCDA tools so as to allow a treatment of the weights' uncertainty under these constraints.
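As an illustration of such a constrained procedure, random weight vectors can be drawn uniformly from the simplex, for example by sorting uniform variates; this is a generic sketch, not the procedure of any particular MCDA tool:

```python
import random

def random_weights(m, rng=random):
    """Draw a weight vector uniformly from the simplex, so that
    0 <= w_i <= 1 for each i and the weights sum to 1."""
    cuts = sorted(rng.random() for _ in range(m - 1))
    bounds = [0.0] + cuts + [1.0]
    # Successive differences of the sorted cut points partition [0, 1].
    return [b - a for a, b in zip(bounds, bounds[1:])]

random.seed(0)
w = random_weights(4)
```

Because the differences of the sorted cut points always telescope to 1, every sampled vector satisfies both constraints by construction, without any rejection step.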

The uncertainty in indicators needs to be evaluated correctly both in cases in which they are objectively calculated and in cases in which the evaluations are based on subjective information elicited from experts. Large uncertainties in the initial data (for instance, NFC unit costs) will not always lead to large uncertainties in the indicators (for instance, LUEC); thus, an accurate evaluation of uncertainties is needed for a correct uncertainty treatment.

Sensitivity and uncertainty analyses are useful to evaluate the impact of uncertainty in input data on alternative ranking. Such analyses are used to increase the clarity of alternative selection; they enable decision makers to reach a conclusion regarding the stability and robustness of results and estimate risk. The purpose of a sensitivity analysis is to examine the change in model output values (ranking order) that results from modest changes in model input values (indicators, weights, value function). An uncertainty analysis is aimed at incorporating multiple uncertainty sources into comparative evaluations to provide overall ranking results with uncertainty.

The most widely known methods for evaluating the impact of uncertainty on the results in MCDA based studies can be subdivided into two groups: deterministic and probabilistic sensitivity analyses. Both of them have their advantages and disadvantages. Some of them are more universal for application in MCDA tools than others, and they require different amounts of time and prerequisite knowledge for implementation. The deterministic approach is, in most cases, sufficient for the majority of decisions because of its low complexity and straightforward implementation.

The main advantages of a deterministic sensitivity analysis are: (1) it may be easily applied to uncertainty in both indicators and weights, because a corresponding model parameter (weight or indicator value) can be varied separately, and (2) little time is needed and additional information is not required when implementing such an analysis.

The main disadvantages of deterministic sensitivity analysis are: (1) the range over which weights or indicator values are varied is usually chosen arbitrarily and it is assumed that all parameter values in the range are equally probable, and (2) a large number of uncertain model parameters cannot be taken into account simultaneously, so it does not provide an evaluation of the cumulative impact of uncertainty on multiple model parameters.

Probabilistic sensitivity analysis requires the specification of probability distributions for model parameters of interest (for instance, based on objective statistics or by eliciting information from subject matter experts) and takes uncertainty from multiple model parameters into account.

It is important to note that the impact analysis for one variable at a time may mislead in the presence of dependences and correlations between input variables. Probabilistic analyses (e.g. techniques such as Monte Carlo simulations) could help to fix this problem.
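The following sketch, using a toy model and illustrative distributions, shows why: when two inputs are strongly positively correlated, their errors largely cancel in the output, so sampling them independently (as a one-at-a-time view implicitly does) overstates the outcome spread:

```python
import random

def model(x, y):
    """Toy outcome that depends on the difference of two inputs."""
    return x - y

random.seed(42)
N = 20000

# Independent view: varying x and y separately suggests a wide outcome range.
independent = [model(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N)]

# In reality x and y share a common driver z, so their errors mostly cancel.
correlated = []
for _ in range(N):
    z = random.gauss(0, 1)
    x = z + 0.1 * random.gauss(0, 1)
    y = z + 0.1 * random.gauss(0, 1)
    correlated.append(model(x, y))

def std(xs):
    m = sum(xs) / len(xs)
    return (sum((v - m) ** 2 for v in xs) / len(xs)) ** 0.5

# The correlated case has a much smaller spread than the independent view implies.
```

A Monte Carlo simulation that samples the joint (correlated) distribution of the inputs therefore gives a more faithful picture of the output uncertainty than per-variable analyses.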

Many other approaches and frameworks (such as fuzzy set theory, interval judgements, percentile uncertainty estimates, grey theory) may be used for uncertainty analysis within the MCDA, but they are not widely applied for decision support in the field of engineering and there is limited scope for their implementation in decision supporting tools. An extended description of the above mentioned approaches is given in Annex III and in Refs [3.18–3.20]. Examination of the impact of uncertainty on the ranking of the results has not been considered within this publication. It is planned to extend the KIND approach by providing guidance on uncertainty analysis within the follow-up collaborative project titled Comparative Evaluation of Nuclear Energy System Options (CENESO), which runs from 2017 to 2020.

3.3.2. Sensitivity to weights

Weight sensitivity analysis is a tool for understanding the influence of the assigned weights on alternative ranking. This analysis evaluates the impact of the weights' values on the outcomes (scores and ranks of alternatives).

At the start, an expert may assign the appropriate weights to a base/reference case and then change any weight and compare the ranking results.

3.3.2.1. Direct approach

A direct approach to weight sensitivity analysis is a simple form of deterministic sensitivity analysis in which alternative ranking results are calculated for different weighting factor options. A possible weight sensitivity analysis within the direct approach may be realized in the following way: each weight is changed using an appropriate factor (for example, ±10% [3.21]) while maintaining the sum of the weights constant (equal to 1 or, equivalently, 100%). If there is no impact on alternative rankings, the decision support analysis is considered to be a stable and robust one. If an alternative ranking order changes owing to the weight variation, it is necessary to collect additional information or to explain to the decision maker the impacts that the potential errors in weighting factors have on the alternative selection.

An important condition to be satisfied while analysing the sensitivity to weight values is that the sum of all weights is to be equal to 1. Subject to this condition, it is possible to implement different methods of analysing the sensitivity to weight values, of which the most common one is modifying one of the selected weight values, provided that other weights vary proportionally (this approach is known as a ‘walking weights’ approach).
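A minimal sketch of the 'walking weights' procedure, using a hypothetical performance table and reference weights: each weight is perturbed by ±10% in turn, the remaining weights are rescaled proportionally so that the sum stays equal to 1, and the ranking is checked for stability:

```python
def scores(weights, table):
    """Additive aggregation: score each alternative as sum(w_i * v_i)."""
    return {alt: sum(w * v for w, v in zip(weights, vals))
            for alt, vals in table.items()}

def ranking(weights, table):
    s = scores(weights, table)
    return sorted(s, key=lambda a: -s[a])

# Hypothetical normalized performance table: rows = alternatives, columns = indicators.
table = {"NES-A": [0.9, 0.4, 0.6], "NES-B": [0.5, 0.8, 0.7], "NES-C": [0.3, 0.6, 0.9]}
base_w = [0.5, 0.3, 0.2]
base_rank = ranking(base_w, table)

# Walking weights: perturb each weight by +/-10%, renormalize, re-rank.
stable = True
for i in range(len(base_w)):
    for factor in (0.9, 1.1):
        w = base_w[:]
        w[i] *= factor
        total = sum(w)
        w = [wi / total for wi in w]   # renormalize so the weights sum to 1
        if ranking(w, table) != base_rank:
            stable = False
```

With these illustrative numbers the ranking survives every ±10% perturbation, so the result would be reported as stable; a flip at this stage would instead trigger the additional data collection discussed above.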

Another, more detailed, possibility for demonstrating the alternative ranking sensitivity to weighting factor values may be realized by using the 'linear weight' approach. In the linear weight approach, the expert can choose an indicator for which a weight sensitivity analysis will be performed and investigate how the ranking of alternatives will change while its weighting factor varies from 0 to 1 (in this procedure, the other weights are automatically adjusted proportionally so as to hold the weight sum equal to unity).

The linear weight approach is very effective. The graphs in Fig. 3.5 show, for each alternative, the variation of its overall score as a function of the selected weighting factor (with the other weighting factors adjusted proportionally). Based on this information, the ranks of alternatives may be identified for different weighting factor values, and the ranges of weighting factor values that deliver the same ranking result may be obtained.
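A sketch of the linear weight approach, with hypothetical scores for two alternatives: the selected weight is swept from 0 to 1, the other weights are rescaled proportionally, and the weight value at which the leading alternative changes is recorded:

```python
def overall(w1, base_rest, vals):
    """Overall score when the selected weight is w1 and the remaining
    weights are scaled proportionally so that all weights sum to 1."""
    scale = (1.0 - w1) / sum(base_rest)
    weights = [w1] + [w * scale for w in base_rest]
    return sum(w * v for w, v in zip(weights, vals))

# Hypothetical normalized scores on three indicators for two alternatives.
A = [0.9, 0.3, 0.4]     # strong on the selected indicator
B = [0.2, 0.8, 0.9]     # strong on the others
base_rest = [0.3, 0.2]  # reference weights of the non-selected indicators

# Sweep the selected weight from 0 to 1 and record where the leader changes.
crossovers = []
prev = overall(0.0, base_rest, A) > overall(0.0, base_rest, B)
for k in range(1, 101):
    w1 = k / 100.0
    lead = overall(w1, base_rest, A) > overall(w1, base_rest, B)
    if lead != prev:
        crossovers.append(w1)
        prev = lead
```

The recorded crossover weight separates the two regions of weighting factor values that deliver different ranking results, which is the information plotted in graphs of the Fig. 3.5 kind.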

The uncertainty of decision makers’ preferences (uncertainty of weights) may be taken into account using, for example, the concepts of fuzzy numbers, probability theory or interval algebra. The applications of the MCDA methods allowing incorporation of uncertainties are realized in special versions of MCDA software (see Annex III and Refs [3.10–3.12]).

3.3.3. Sensitivity to single-attribute value functions

A single-attribute value function sensitivity analysis evaluates the impact on the final results (ranks of alternatives) with respect to changes in single-attribute value functions. A value function sensitivity analysis may only be implemented for value-based MCDA methods, such as MAVT or MAUT.

FIG. 3.5. Linear weight approach to weight sensitivity analysis.

FIG. 3.6. NES rank distribution: illustration of statistical approach to value function sensitivity analysis.

3.3.3.1. Direct approach

A direct approach used to determine the sensitivity of the ranking results with respect to a value function type involves direct observation of how the ranking results are affected by a change in one or more value functions, the type of which varies within certain limits. This approach provides many opportunities for analysing sensitivity to a value function type, for example by simultaneous variation of several value functions or by a change in the value function parameters defining it.

Generally, this analysis is applied for a qualitative (often visual) check: an expert modifies a value function type and observes the changes in the order of the alternatives’ ranking, as well as the values of the multi-attribute value function. In this way, a posteriori knowledge can be gathered that reveals the main components affecting the ranking results. However, in this case, a smart strategy is required to determine how to modify value function types in order to find a quantitative description of the observed regularities.

Given the multi-factorial character of the evaluation of the sensitivity to value function types in the decision support systems, implementing the direct approach to sensitivity analysis is a common practice. However, other approaches are potentially realizable, including those based on the statistical approach, as well as improved analytical methods which suggest incorporating uncertainties caused by value function types directly into the analysis (i.e. design of the decision rule).

3.3.3.2. Statistical approach

The statistical approach to value function sensitivity supplements the direct approach. It is reasonable to implement this approach in situations where the form of single-attribute value functions is not well understood.

This approach assumes the random generation of a set of single-attribute value functions from a certain set of functions, and the identification of alternative ranks while applying them. Based on this information, the rank distribution of alternatives may be evaluated for each alternative (Fig. 3.6).

The probability rank distributions of each alternative can be constructed based on this analysis in order to determine the most probable rank values as well as their mean values, variance, and so on. As a whole, this information characterizes the degree of alternative rank sensitivity to a value function type. Based on this information, it is possible to make quantitative judgements on the attractiveness of an alternative with due account of uncertainties in the form of value functions.
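A sketch of this statistical approach, assuming an illustrative family of exponential value functions and a hypothetical performance table: function shapes are sampled at random, the alternatives are ranked under each sample, and the rank counts form distributions of the kind shown in Fig. 3.6:

```python
import math
import random

def value(x, c):
    """Exponential single-attribute value function on [0, 1]; the shape
    parameter c is sampled randomly to represent uncertainty in its form."""
    if abs(c) < 1e-9:
        return x  # linear limit
    return (1.0 - math.exp(-c * x)) / (1.0 - math.exp(-c))

# Hypothetical normalized performance table and fixed weights.
table = {"NES-A": [0.9, 0.4], "NES-B": [0.6, 0.7], "NES-C": [0.2, 0.9]}
weights = [0.6, 0.4]

random.seed(7)
runs = 2000
counts = {alt: [0, 0, 0] for alt in table}  # counts[alt][r]: times ranked (r+1)-th
for _ in range(runs):
    cs = [random.uniform(-3.0, 3.0) for _ in weights]  # one random shape per indicator
    score = {alt: sum(w * value(x, c) for w, x, c in zip(weights, vals, cs))
             for alt, vals in table.items()}
    for r, alt in enumerate(sorted(table, key=lambda a: -score[a])):
        counts[alt][r] += 1
```

Normalizing each row of `counts` by the number of runs yields the rank probability distribution of each alternative, from which the most probable rank, mean rank and variance can be read off.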

It should be noted that using this approach may cause the alternatives to be seen as indistinguishable. This is due to the variation of a value function type over an unnecessarily wide range, which will lead to the alternative rank probability distribution being close to equally probable. Therefore, such an approach requires special attention when determining the boundaries of a value function shape variation; they should be neither too narrow nor too wide.


3.3.4. Sensitivity to and uncertainty of KIs

As a rule, the exact values of indicators are unknown; instead, the indicators are characterized by a certain range of values. This statistical dispersion may be caused, for example, by errors or bias in measured values. In cases when an indicator is evaluated qualitatively, for example, based on expert judgements, the uncertainty in the indicator value may be caused by the ambiguity of mapping expert qualitative judgements onto a score scale. Thus, analysing sensitivity to the scatter of possible indicator values might be important.

It should be noted that the evaluation of the impact of indicator uncertainties may be reduced to analysing sensitivity to other model parameters when implementing one or another MCDA method (for example, within the MAVT method, indicator uncertainty may be reduced to the uncertainty of a value function type). However, it is necessary to clearly identify the cause of uncertainty (the objective or subjective nature of the uncertainty) to avoid double counting its impacts when performing an analysis.

3.3.4.1. Direct approach

A direct approach to determine the sensitivity of ranking results to indicator values may be a direct observation of the impact of indicator value changes within certain limits. The direct approach may be based on techniques such as the simultaneous variation of several indicators within specified boundaries or the generation of indicator values in accordance with a specific rule.

This approach is used for a qualitative (often visual) analysis: an expert modifies the indicator values and directly observes changes in the ranking results and in the values of multi-attribute value functions. Based on this procedure, a posteriori experience is gained that enhances understanding of the sensitivity of ranking results to indicator values. In order to capture quantitative regularities, a strategy for modifying indicator values needs to be set up so as to enable a quantitative description of the observed effects.

In decision support systems, a sensitivity analysis with respect to indicator values is, as a rule, not realized as a separate functional module. If necessary, an analysis of this kind can be carried out by modifying the basic calculation model, making a series of variant calculations with different indicator values and then analysing the results.
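Such a series of variant calculations can be sketched as follows, using a hypothetical table and weights: each indicator value is perturbed by ±10% in turn and the perturbations that flip the base ranking are recorded:

```python
def rank(table, weights):
    """Rank alternatives by their weighted additive scores."""
    score = {a: sum(w * v for w, v in zip(weights, vals)) for a, vals in table.items()}
    return sorted(score, key=lambda a: -score[a])

weights = [0.5, 0.5]
# Hypothetical performance table with deliberately close base scores.
table = {"NES-A": [0.70, 0.60], "NES-B": [0.65, 0.64]}
base = rank(table, weights)

# Variant calculations: perturb each indicator value by +/-10% in turn
# and record which perturbations flip the base ranking.
flips = []
for alt in table:
    for j in range(len(table[alt])):
        for f in (0.9, 1.1):
            variant = {a: list(v) for a, v in table.items()}
            variant[alt][j] *= f
            if rank(variant, weights) != base:
                flips.append((alt, j, f))
```

With these close base scores, several single-indicator perturbations are enough to flip the ranking, signalling that the result is fragile with respect to the scatter of indicator values.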

3.3.4.2. Advanced MCDA methods

Another possible option for considering uncertainties in indicator values is to apply advanced MCDA methods that provide such a capability, for instance, MAUT or methods from the fuzzy MCDA toolkit.

As discussed in Section 3.1.2, MAUT is an extension of MAVT that allows the indicator value uncertainty represented by a random variable with a given probability density function to be accounted for. The single-attribute and overall utility functions for each alternative are also, as a result, random variables with corresponding probability distributions. The alternatives’ ranks are determined within MAUT based on a comparison of the expected overall utilities of the considered options that allow incorporation of the uncertainties in the indicator values.
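A sketch of this expected utility comparison, assuming a hypothetical square root utility function and uniform distributions for the uncertain indicator values:

```python
import random

def utility(x):
    """Hypothetical increasing single-attribute utility on [0, 1]."""
    return x ** 0.5

# Uncertain indicator values represented as random variables: here, uniform
# ranges given as (nominal, half-width) per indicator; weights are fixed.
alternatives = {
    "NES-A": [(0.80, 0.05), (0.50, 0.20)],
    "NES-B": [(0.70, 0.02), (0.60, 0.05)],
}
weights = [0.6, 0.4]

random.seed(3)
n = 10000
expected = {}
for alt, specs in alternatives.items():
    total = 0.0
    for _ in range(n):
        # Sample each uncertain indicator and aggregate the utilities.
        total += sum(w * utility(random.uniform(m - h, m + h))
                     for w, (m, h) in zip(weights, specs))
    expected[alt] = total / n  # Monte Carlo estimate of the expected overall utility

best = max(expected, key=expected.get)
```

The comparison is made on expected overall utilities, so the wider uncertainty of NES-A's second indicator is folded directly into its score rather than handled in a separate sensitivity step.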

Fuzzy MCDA methods involve considering uncertainties in the indicator values by using fuzzy numbers (singleton, triangular, trapezoidal and piecewise are the most commonly used). Indicators, like other elements of fuzzy sets, have a given degree of membership, in contrast to the binary (yes/no) membership used in regular sets with point estimates. It is deemed that a fuzzy approach corresponds better to individual judgements regarding indicator values characterized by uncertainties and allows the relevant uncertainties to be incorporated within decision models. In such models, the conventional crisp judgement scale is replaced with fuzzy numbers to indicate the fuzziness of judgements regarding indicator values.
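A minimal sketch of a fuzzy evaluation with triangular numbers, using hypothetical fuzzy scores and crisp weights: the fuzzy indicator scores are aggregated additively and then defuzzified by the centroid to compare alternatives:

```python
def tri_add(a, b):
    """Add two triangular fuzzy numbers (lower, modal, upper) component-wise."""
    return tuple(x + y for x, y in zip(a, b))

def tri_scale(k, a):
    """Multiply a triangular fuzzy number by a crisp non-negative scalar."""
    return tuple(k * x for x in a)

def centroid(a):
    """Defuzzify a triangular number by its centroid, (l + m + u) / 3."""
    return sum(a) / 3.0

# Hypothetical fuzzy indicator scores (lower, modal, upper) and crisp weights.
scores = {
    "NES-A": [(0.6, 0.8, 0.9), (0.3, 0.4, 0.6)],
    "NES-B": [(0.5, 0.6, 0.7), (0.6, 0.7, 0.8)],
}
weights = [0.5, 0.5]

fuzzy_overall = {}
for alt, inds in scores.items():
    total = (0.0, 0.0, 0.0)
    for w, tri in zip(weights, inds):
        total = tri_add(total, tri_scale(w, tri))
    fuzzy_overall[alt] = total

crisp = {alt: centroid(t) for alt, t in fuzzy_overall.items()}
```

The triangular spreads carry the judgement uncertainty through the aggregation, and the defuzzified centroids provide the crisp values needed for the final comparison.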

3.3.5. Comparison with other MCDA methods

Comparing the results obtained using different MCDA methods is another possible option to examine the overall stability and robustness of the ranking results, significantly increasing the confidence level.

In the general case, the ranks of alternatives may be different when using different MCDA methods.

Differences in the rank order may occur owing to different representations of a performance table demonstrating the performance of the alternatives in terms of the chosen criteria; the results from evaluating weights using different procedures (e.g. swing and pairwise) and realized via different MCDA methods may vary significantly; and the decision rules implemented in the MCDA methods may have a significant impact on the ranking order. There are no specific rules for conversion among the different variants of the model parameters used in different methods.

Notwithstanding the different theoretical frameworks applied, different MCDA methods typically lead to well coordinated and similar outcomes, which allows the overall stability and robustness of the results to be examined.

3.3.6. Summary

Uncertainty and sensitivity analysis approaches need not be limited to those considered in this section. The selection of the most suitable approach depends on the scope of the specific case study and the related audiences and expert preferences, which should be mentioned in the case study reports.

The uncertainty treatment for both the indicators and the weights and their consideration in the framework of the decision making model related to the comparative evaluation of NESs is a significant problem because there are no universal ready-made recommendations. In this regard, it is helpful to thoroughly analyse the application feasibility of existing approaches within a case study in order to treat uncertainties.

The MCDA methods applied to a multi-criteria comparative evaluation of NESs’ performance need to include uncertainty analyses with respect to weights, indicators and other model specific parameters (for example, single-attribute value functions within the MAVT method). Uncertainty treatment evaluates the impact of model parameter uncertainties on the overall scores. A balance needs to be reached between elimination of the uncertainty sources and overestimation of the uncertainties, which may lead to the alternatives becoming indistinguishable.

For most problems, simple approaches within a deterministic sensitivity analysis are sufficient to examine the impact of uncertainties, owing to the advantages of straightforward implementation, intuitive appeal and ability to be implemented within different MCDA methods. With these approaches, the weights or indicators are varied one at a time. Within the MAVT method, the sensitivity analysis explores the impact of changes in indicators, weights and value functions on ranking results.

At the same time, more sophisticated methods may be required in cases where multiple sources of uncertainty are to be taken into account simultaneously, where dependence relations exist among the input data and where there are no time constraints for uncertainty modelling. Such approaches seem most appropriate if only the lower and upper bounds are known (interval or grey approaches) or if experts’ opinions are characterized by specific distributions (probabilistic or fuzzy set approaches) in a group decision process.

In the framework of the KIND approach, it is reasonable to examine, as the basic option, the impact of uncertainties in indicators, weights and method parameters through a deterministic sensitivity analysis. For more sophisticated users, additional options may be interesting in order to provide robust judgement aggregation in which the weights are not determined by their average values, but are distributed within certain intervals or characterized by distributions. This option has not yet been performed at full scale.

REFERENCES TO SECTION 3

[3.1] INTERNATIONAL ATOMIC ENERGY AGENCY, Guidance for the Application of an Assessment Methodology for Innovative Nuclear Energy Systems: INPRO Manual — Overview of the Methodology, IAEA-TECDOC-1575 (Rev. 1), IAEA, Vienna (2008).

[3.2] INTERNATIONAL ATOMIC ENERGY AGENCY, Framework for Assessing Dynamic Nuclear Energy Systems for Sustainability: Final Report of the INPRO Collaborative Project GAINS, IAEA Nuclear Energy Series No. NP-T-1.14, IAEA, Vienna (2013).

[3.3] FIGUEIRA, J., GRECO, S., EHRGOTT, M., Multiple Criteria Decision Analysis: State of the Art Surveys, Springer, Boston, MA (2005).

[3.4] BELTON, V., STEWART, T., Multiple Criteria Decision Analysis: An Integrated Approach, Kluwer Academic Publishers, Dordrecht (2002).

[3.5] KEENEY, R., RAIFFA, H., Decisions with Multiple Objectives, John Wiley & Sons, New York (1976).

[3.6] VON WINTERFELDT, D., EDWARDS, W., Decision Analysis and Behavioral Research, Cambridge University Press, Cambridge (1986).

[3.7] HWANG, C.L., YOON, K., Multiple Attribute Decision Making: Methods and Applications, Springer, Berlin (1981).

[3.8] BRANS, J.P., VINCKE, P., A preference ranking organisation method: (The PROMETHEE method for multiple criteria decision-making), Manage. Sci. 31 6 (1985) 647–656.

[3.9] SAATY, T.L., The Analytic Hierarchy Process, McGraw-Hill, New York (1980).

[3.10] DYER, J., BUTLER, J., EDMUNDS, T., JIA, J., A multivariate utility analysis of the alternatives for the disposition of surplus weapons-grade plutonium, Oper. Res. 46 6 (1998) 749–762.

[3.11] KEENEY, R., NAIR, K., Evaluating Potential Nuclear Power Plant Sites in the Pacific Northwest Using Decision Analysis,
