Mobiquitous systems are playing an increasingly important role in our daily lives, becoming a reality everywhere from our homes and workplaces to our leisure. The use of Location-Based Services (LBS) in these systems is increasingly demanded by users. Yet, while on the one hand they enable people to be more “connected”, on the other hand they may expose people to serious privacy issues. The design and deployment of Privacy-Enhancing Technologies (PETs) for LBS has been widely addressed in recent years. Strikingly, however, there is still a lack of methodologies to assess the risk that using LBS may pose to users’ privacy (even when PETs are considered). This paper presents the first steps towards a privacy risk assessment methodology to (i) identify, (ii) analyse, and (iii) evaluate the potential privacy issues affecting mobiquitous systems.
Jing Li, Marcus Barkowsky, Patrick Le Callet
Subjective assessment methodology for 3DTV
May the Absolute Category Rating (ACR) be used with 3D stereoscopic content? Experts agree that as long as the degradations are on one single perceptual scale, notably image degradations such as coding artifacts, the previously employed assessment methods such as ACR or DSCQS may be suitable.
ADVANCED DAMAGE TOLERANCE AND RISK ASSESSMENT METHODOLOGY AND TOOL FOR AIRCRAFT STRUCTURES CONTAINING MSD/MED
For built-up structures, a FE-based global load reduction factor is also used in the NSY criterion of the cracked component. For PZL, the flow stress, i.e. the average of ultimate strength and yield strength of the material, is used to calculate link-up between two adjacent cracks, or between a crack and a hole or an edge. The combined RS curve is obtained by taking the minimum of the RS values from the multiple criteria, including the possible link-ups, and by applying a set of guidelines to make the final RS curve continuous and monotonically decreasing. An example of this process is illustrated in Fig. 9 for two cracking scenarios obtained for the CW-1 location (Section 4.1). In this case, an operational limit stress was also applied and the RS was normalized to this limit stress.
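The combination of criteria described above (flow stress for link-up, pointwise minimum over the individual residual-strength curves, then a monotonically decreasing envelope) can be sketched in Python. The function names and the simple running-minimum used here to enforce monotonicity are illustrative assumptions, not the tool's actual guidelines:

```python
import numpy as np

def flow_stress(ultimate_strength, yield_strength):
    """Flow stress as the average of ultimate and yield strength,
    as used in the PZL link-up criterion."""
    return 0.5 * (ultimate_strength + yield_strength)

def combined_rs_curve(rs_curves):
    """Combine several residual-strength (RS) curves, sampled at the same
    crack sizes, by taking their pointwise minimum, then enforce a
    monotonically non-increasing envelope with a running minimum
    (a simple stand-in for the paper's continuity/monotonicity guidelines)."""
    rs_min = np.min(np.vstack(rs_curves), axis=0)
    return np.minimum.accumulate(rs_min)
```

For example, two criterion curves [10, 9, 8, 9] and [9.5, 9, 7, 8] combine to the envelope [9.5, 9, 7, 7].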
transition probabilities for prognostics by using hidden Markov models (HMMs). In the same domain, a data-driven point MHA methodology was proposed in a previous research 32 for failure prognostics. In this work, the point machine degradation levels were extracted and the remaining useful life (RUL) was calculated using the state transition time values. The authors of a previous study 33 proposed a systematic health assessment methodology based on self-organizing maps (SOMs) and PCA techniques for point machine fault diagnostics. In this work, different statistical features were extracted from the segmented power signals. The extracted features were further used in point machine degradation-level assessment and incipient fault detection. An air leakage detection and failure prediction approach for the train braking system was proposed in another study 34 based on regression and clustering. The regression classifier was used in failure severity prediction, and density-based clustering was utilized to detect leakage anomalies. However, clustering-based degradation-level detection approaches, i.e., health state or fault severity detection, may not guarantee that changes in health state transitions are due to machine degradation: the clusters, i.e., health states, found by these tools may reflect variations in operational conditions rather than variation due to degradation, which is one of the disadvantages of using unsupervised learning for fault detection and severity evaluation. In addition, the clusters extracted by different clustering algorithms can differ, 35 so clustering may not be a consistent approach for machine health state detection.
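The idea of computing RUL from state transition times admits a very simple sketch: given expected sojourn times per degradation state, the RUL is the time left in the current state plus the expected sojourn times of all remaining states. The names and the deterministic sojourn-time assumption are illustrative; the studies above estimate these quantities with HMMs and SOM/PCA pipelines not reproduced here:

```python
def remaining_useful_life(expected_sojourn_hours, current_state, elapsed_in_state=0.0):
    """RUL as the expected time left in the current degradation state plus
    the expected sojourn times of all subsequent states before failure."""
    remaining_current = max(expected_sojourn_hours[current_state] - elapsed_in_state, 0.0)
    return remaining_current + sum(expected_sojourn_hours[current_state + 1:])
```

With sojourn times [100, 50, 20] hours and 30 hours already spent in state 0, the sketch yields 70 + 50 + 20 = 140 hours of RUL.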
1 Department of Chemical Engineering, University of Liège, Belgium.
The aim of this paper is to present the teaching approach that has been followed for the Life Cycle Assessment (LCA) methodology. This course was launched three years ago by the Chemical Engineering Department, which was aware of the potential of this methodology but also of the pressing demand for LCA in industry.
Figure 4 - Risk abacus for buildings with 100 occupants in seismic zone 1 (PGA=0.04g), 2 (PGA=0.07g), 3 (PGA=0.11g) and 4 (PGA=0.16g).
2.2 Step 2: Conformity assessment
Step 2 aims to highlight buildings with significant seismic risk identified in Step 1 (unacceptable or ALARP) according to their conformity to the target resistance required by regulation. Conformity is quantified by a dimensionless conformity factor (α), defined as the ratio of the acceleration that can generate damage (resistance acceleration) to the reference acceleration (from regulation) at the building's location.
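As a minimal illustration of the factor defined above (assuming only the ratio definition, with hypothetical acceleration values):

```python
def conformity_factor(resistance_acceleration, reference_acceleration):
    """Dimensionless conformity factor: α = a_resistance / a_reference.
    α >= 1 means the building meets the code-target resistance at its location;
    α < 1 flags a conformity deficit."""
    return resistance_acceleration / reference_acceleration
```

For example, a building resisting 0.08g at a site where regulation requires 0.16g has α = 0.5, i.e., half the target resistance.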
Studies have shown that data-quality (DQ) and information-quality (IQ) assessment are essential activities in organizations that want to improve the efficiency of communication and information systems. So far, research on the evaluation of DQ and IQ has focused on approaches, models or classification of attributes. However, context-specific DQ and IQ assessment methodologies are difficult to find in the literature. While assessment methodologies do exist for office document processing in general, there are none for forms. The focus of this thesis is the need for a context-specific tool with which to assess the DQ input and the IQ output in communication and information systems. The channel analysed for this purpose is the form. This thesis proposes a novel methodology based on: 1) an adaptation of the “manufacturing of information” approach, which adopts the communication-system point of view; 2) an existing DQ classification system that classifies attributes as intrinsic, contextual, representational or accessible; and 3) a new conceptual model which provides the guidelines for assessment of forms. This evaluation only takes into consideration established contextual attributes, such as completeness, appropriate amount of data (here called “sufficiency”), relevance (which emphasises content), timeliness (which emphasises process) and actual value. To present the applicability of the contextual-information quality assessment (CIQA) methodology, two representative forms were used as case studies. The main results suggest that a novel data representation allows data to be classified by type (indispensable or verification) and composition (simple or composite). In one of the two case studies, the data quantity was reduced by 50%, resulting in a 15% improvement of IQ and a more efficient document processing system. The streamlining and new structure of the form led not only to a reduction in data quantity but also to increased information quality. 
This suggests that data quantity is not directly correlated to IQ, as IQ may increase in the absence of a corresponding increase in data quantity. In addition, the design of the forms requires particular attention to content, not simply aesthetics. Furthermore, in data processing, there could be great benefits in combining IQ assessment and computerization processes, in order to avoid problems such as data overload; of course, data security would need to be considered as well.
Since punching operations may reduce fatigue strength, this study proposes a methodology for designing punched parts against high cycle fatigue crack initiation. To reach this goal, high cycle fatigue tests are performed on different specimen configurations with either punched or polished edges. Due to punching effects, the fatigue strength of punched specimens is significantly decreased. Fracture surface observations reveal that crack initiation always occurs at a punch defect. Additional investigations are combined to characterize how the edges are altered by the punching operations. High tensile residual stress levels along the loading direction are quantified using X-ray diffraction techniques. Furthermore, micro-hardness measurements and X-ray diffraction results reveal a strong hardness gradient due to the punching operation. For a better understanding of crack initiation mechanisms, the edge geometries were scanned with 3D optical microscopy, allowing the most critical defect (and its real geometry) to be identified by comparing the edges before and after fatigue failure. Finally, FEA are performed on the identified defects. A non-local high cycle multiaxial fatigue strength criterion is used as post-processing of the FEA to take into account the effect of defects, and of the strong stress-strain gradients around them, on the HCF strength.
PoE is used to specify the outcome of a subjective test using pair comparison. It represents the observers' preference of QoE, since observers express a preference between each pair of videos rather than an absolute scale value for each video sequence. The aim of this paper is to introduce assessment methods for PoE and the corresponding statistical analysis methods.
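One widely used statistical analysis for pair-comparison data is the Bradley-Terry model, which converts pairwise preference counts into latent quality scores. The following sketch is a generic illustration of that model (standard MM iteration), not necessarily the exact analysis method of the paper:

```python
import numpy as np

def bradley_terry(wins, n_iter=200):
    """Estimate latent quality scores from a pairwise preference matrix,
    where wins[i, j] is the number of observers preferring video i over
    video j, using the standard Bradley-Terry MM iteration."""
    n = wins.shape[0]
    p = np.ones(n)
    for _ in range(n_iter):
        for i in range(n):
            den = sum((wins[i, j] + wins[j, i]) / (p[i] + p[j])
                      for j in range(n) if j != i)
            p[i] = wins[i].sum() / den
        p = p / p.sum()  # normalize scores to sum to 1
    return p
```

For two videos where video 0 is preferred 8 times out of 10 comparisons, the estimated scores converge to roughly (0.8, 0.2), recovering the preference probability.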
The second methodology is the Subjective Assessment Methodology for Video Quality (SAMVIQ). This is a multiple-stimuli assessment methodology using a continuous quality scale, shown on the left side of Figure 1. Two reference sequences are used in a session. The first one is explicit, defined as the high-quality anchor for the rest of the current presentation. The second one is hidden, randomly included among the processed sequences. The observer is allowed to choose the viewing order of the sequences. He/she can modify ratings and repeat viewings at will, but every sequence has to be assessed. Several contents, each processed several times, are assessed in a session. SAMVIQ can assess only 48 sequences in a session of around 35 minutes. However, the possibility of refining the judgment through multiple viewings increases the measurement precision and decreases the number of observers needed. The EBU thus recommends using at least 15 observers.
crack tip. Finally, Sauvage proposed a normal stress criterion for the initiation of adhesive debonding. Nonetheless, his two approaches were not coupled into a single criterion.
Thus, exploiting the identifiable initiation zone, this work relies on finite fracture mechanics, using the coupled energy and stress criterion (CC) proposed by Leguillon. The assessment of the adhesive-to-adherend interface helps to overcome the previous deficiency and to consider an instantaneous and finite crack debonding. The CC has been applied successfully to several types of stress concentration cases, such as interface debonding by Martin et al. (2016), Weißgraeber and Becker (2013), and Carrère et al. (2015), on notched strength and bond strength.
A new methodology is developed to assess concrete cover resistivity using the instantaneous response of the polarization of a metal rebar (galvanostatic pulse method). The instantaneous ohmic drop is linked only with the concrete resistance, which depends on the concrete cover, the resistivity, and the rebar diameter. A numerical model was developed in Comsol Multiphysics® in order to create a graph linking concrete resistivity to concrete resistance for concrete covers ranging from 1 to 160 mm. This graph and the measured ohmic drop can be used to determine concrete resistivity for any rebar diameter/concrete cover configuration. The theory developed numerically was then confirmed using an experimental setup with controlled water resistivity, and generalized for counter electrode (CE) diameters ranging from 20 to 70 mm. Finally, the study reveals that the graph developed for a single rebar can be used for any rebar framework density.
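The two-step procedure described above (ohmic drop to concrete resistance, then calibration graph to resistivity) could be sketched as follows. The function names, the use of linear interpolation, and the calibration values are assumptions for illustration; in the paper the calibration curve comes from the Comsol model for a given cover/rebar-diameter configuration:

```python
import numpy as np

def concrete_resistance(ohmic_drop_volts, pulse_current_amps):
    """The instantaneous ohmic drop of a galvanostatic pulse gives the
    concrete resistance directly: R = ΔV_ohm / I."""
    return ohmic_drop_volts / pulse_current_amps

def resistivity_from_resistance(resistance, calib_resistances, calib_resistivities):
    """Invert a numerically derived resistance-vs-resistivity calibration
    curve (one per configuration) by linear interpolation."""
    return np.interp(resistance, calib_resistances, calib_resistivities)
```

For instance, a 50 mV ohmic drop under a 100 µA pulse corresponds to a 500 Ω concrete resistance, which is then looked up on the calibration graph.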
The goal of this study is to assess the environmental impacts of chicken meat production from cradle to PPP gate by coupling the LCA methodology with simulation and artificial intelligence techniques to overcome its limitations. Process simulation allows the inputs and outputs of the process to be quantified according to the real system conditions and parameters, instead of creating a black box (complex processes modeled using literature data). Monte Carlo simulation makes it possible to quantify and propagate variability and uncertainty into the LCA results. The classical mass allocation method and alternative impact allocation procedures were compared and yielded similar results. Finally, a multiobjective optimization model was used to generate alternatives of optimal process parameters that reduce environmental impacts in the system per functional unit (FU). The model considers three criteria based on technical, economic and environmental aspects, and a Genetic Algorithm (GA) is used to generate optimal alternatives. The GA handles both the non-linear nature of the system and the multiple-criteria assessment. The GA results were then evaluated through a multi-criteria decision-making (MCDM) method to find the best solution.
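The Monte Carlo propagation step could be sketched as below. The normal distributions, the relative standard deviation, and the characterization-factor vector are illustrative assumptions (real LCA practice often uses lognormal distributions from pedigree matrices), not the study's actual parameterization:

```python
import numpy as np

rng = np.random.default_rng(42)

def monte_carlo_impact(mean_inventory, rel_sd, char_factors, n_runs=10_000):
    """Propagate inventory variability into an LCA impact score: each run
    draws the inventory flows from normal distributions and multiplies them
    by the characterization factors, yielding a distribution of impact
    scores per functional unit."""
    flows = rng.normal(mean_inventory, np.abs(mean_inventory) * rel_sd,
                       size=(n_runs, len(mean_inventory)))
    impacts = flows @ char_factors  # one impact score per run
    return impacts.mean(), impacts.std()
```

The resulting mean and spread give the decision-maker an uncertainty band around the point-estimate impact, rather than a single deterministic number.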
Figure 2. (a) The measurements of temperature and deflection; (b) The updated probability of failure.
In timber structures subject to cracking, variations in temperature and deflection have an important influence on serviceability and safety. This research proposes a coupled mechanics-probabilistic methodology with a Bayesian network (BN) to facilitate the representation of structural performance. The results obtained show that this approach is useful for the reliability assessment of timber structures from experimental data.
Lignite 16.8 17.8
Natural gas 45.4 (36.3 MJ/m^3) 50.4 (40.3 MJ/m^3)
The lower and upper limits of the time horizon are chosen to be consistent with the IPCC's time horizons for climate change impact assessment (IPCC, 2006): 100 and 500 years were chosen as the lower and upper limits, respectively. One hundred years is roughly the time that our immediate next generation will still be present, so any resource that depletes in less than 100 years is deemed scarce. Five hundred years is considered so far into the future that humans are assumed to have fully relinquished their dependence on fossil resources, or to have discovered technologies to overcome the problem of depletion. The choice of a time horizon is a crucial value choice in evaluating the importance of reduced availability. A sensitivity analysis is conducted on the choice of the upper and lower limits of the time horizon.
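The scarcity reasoning above amounts to a simple static depletion-time check against the two horizon limits. This sketch uses the stated 100- and 500-year limits; the reserve and extraction figures in the example are hypothetical:

```python
def depletion_time_years(reserves, annual_extraction):
    """Static depletion time: how long current reserves last at the
    current extraction rate."""
    return reserves / annual_extraction

def scarcity_class(depletion_time, lower=100, upper=500):
    """Classify a resource against the 100- and 500-year time-horizon
    limits discussed above."""
    if depletion_time < lower:
        return "scarce"
    if depletion_time > upper:
        return "not a concern"
    return "intermediate"
```

A hypothetical resource with 5000 units of reserves extracted at 100 units per year depletes in 50 years and is therefore classed as scarce under the 100-year lower limit.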