A uniform approach to type theory


Knowledge and its Game-Theoretical Foundations: The Challenges of the Dialogical Approach to Constructive Type Theory

However, the epistemic perspectives did not all reduce to the proof-theoretical framework: epistemic features were also implemented via game-theoretical approaches. On the one hand, by the 1960s, dialogical logic had been developed by Paul Lorenzen and Kuno Lorenz as a solution to some of the problems that arose in Lorenzen’s Operative Logik. Here, the epistemic turn initiated by proof theory was tackled with a notion of games that provided the dynamic features of traditional dialectical reasoning. Inspired by Wittgenstein’s “meaning as use”, the basic idea of the dialogical approach to logic is that the meaning of the logical constants is given by the norms or rules for their use. On the other hand, a bit later, still in the sixties, Jaakko Hintikka developed game-theoretical semantics (GTS). GTS is an approach to formal semantics that, like the dialogical framework, grounds the concepts of truth and validity on game-theoretical notions, such as the existence of a winning strategy for a player, though, unlike the dialogical framework, it is built on the notion of a model. Furthermore, Hintikka combined the model-theoretical, epistemic, and game-based traditions through the development of what is now known as explicit epistemic logic, where epistemic content is introduced into the object language as an operator (of some specific modal kind) that yields propositions from propositions, rather than as meaning conditions on the notions of proposition and inference. These operators were rapidly generalized to cover several propositional attitudes, notably knowledge and belief.

A Coordination-theory Approach to Exploring Process Alternatives for Designing Differentiated Products

Lee and Tang have proposed a production process model in which one serialized chain of operations is divided in two at a specific point, and have explained differentiation approaches and their effects by shifting the chain-dividing point in that process [Lee & Tang 1997]. However, they do not address product design processes; their model does not describe the process alternatives of a design process. There is also excellent research on how to create process structures [Crowston 1997; Davenport 1993; Grover et al. 1995; Kettinger et al. 1997; Malone et al. 1999; Nadler & Tushman 1997; Pentland et al. 1999; Sterman 2000; von Hippel 1990] and on how to differentiate products [Kotler 1999; Nobeoka 1996; Porter 1990; Ulrich & Eppinger 2000]. However, this research has not given a clear theoretical explanation associating a type of differentiated product with the structure of a product design process. Thus, one cannot systematically explore design process alternatives that fit a specific differentiation approach. Without an easy way to choose the right product design process, a firm risks delay (and therefore cost), or a lower-quality product design resulting from an inferior design process.

A Principal-Agent Theory Approach to Public Expenditure Management Systems in Developing Countries

As already stated, the MoF has a number of instruments and strategies at its disposal to limit agency problems. First, it can use incentive schemes designed solely on observable information and promise to grant the line ministry a transfer equal to the sum of a suitable compensation for the line ministry’s effort and an informational rent (which depends on the incentive compatibility constraints) in case of high productivity. If such a contract exists, it prevents the line ministry from exerting little effort, but at the expense (for the MoF) of a loss equal to the informational rent, in addition to a distortion created by requiring a lower level of effort in some occurrences of the productivity factor. Although commonly applied to models of the corporate world (e.g. granting board members bonuses or shares), this strategy is not always directly applicable in the public sector. Alternatively, the MoF can supervise the line ministry using a number of instruments and threaten it with appropriate sanctions if cheating is detected. The design of an appropriate control system must take a number of factors into account, for instance the choice between ex ante and ex post (or internal and external) controls, the type of variables to be monitored (input versus result indicators), and the choice between systematic and random audits. In our model, there are two unobservable variables (effort and the state of nature). Supervision could thus target either the exogenous productivity factor, from whose observation the agent’s behaviour could be inferred (this relates, for instance, to public sector reforms aimed at improving economic statistical data collection, or to audits assessing programme design), or the agent’s effort directly. In this article, we assume that the MoF audits the line ministry’s effort. The timing of the game is the same as in all principal-agent models (see Leruth and Paul, 2006, for more details).
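The incentive-scheme logic above (compensation plus informational rent, with incentive compatibility binding for the productive type) can be sketched numerically. The following toy two-type screening menu is our own illustration, not the paper’s model; the cost function and all parameter values are invented for the example:

```python
# Toy adverse-selection menu: a low- and a high-productivity line ministry
# (theta_lo < theta_hi). The low type is compensated exactly for its effort
# cost; the high type receives, on top of its cost, the informational rent
# it could earn by mimicking the low type's contract.

def cost(effort, theta):
    """Effort cost, decreasing in productivity theta (invented form)."""
    return effort ** 2 / (2 * theta)

def menu(theta_lo=1.0, theta_hi=2.0, e_lo=0.8, e_hi=2.0):
    t_lo = cost(e_lo, theta_lo)                          # low type: no rent
    rent = cost(e_lo, theta_lo) - cost(e_lo, theta_hi)   # gain from mimicking
    t_hi = cost(e_hi, theta_hi) + rent                   # compensation + rent
    return (e_lo, t_lo), (e_hi, t_hi), rent
```

With these numbers the high type is exactly indifferent between its own contract and mimicking the low type (incentive compatibility binds), and the rent is the MoF’s loss relative to full information.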

AN ELEMENTARY APPROACH TO UNIFORM IN TIME PROPAGATION OF CHAOS

Remark 5. In [7, 22, 11], it has been shown that, for a fixed center of mass, exponential contractivity of the nonlinear SDE and uniform propagation of chaos hold if V + 2W is, for example, strictly uniformly convex. This suggests that convexity of the interaction potential W can make up for non-convexity of V, a fact that is not visible from our current approach. The problem is that a symmetrization trick is used to benefit from the convexity of the interaction potential, and this trick does not carry over in the same form to the ℓ1-type distances considered here. We will address this challenging […]

The decoupling approach to quantum information theory

the reference. We will consider a very general operation: a fixed unitary transformation followed by an arbitrary completely positive superoperator T A→E. We will show that if we choose the unitary transformation randomly according to the Haar measure (which can essentially be viewed as the uniform distribution over all unitaries), then the resulting protocol will on average perform well. This generalizes all of the decoupling theorems in the literature that the author is aware of, including the Fully Quantum Slepian-Wolf theorem [ADHW06], which corresponds to the special case in which T traces out part of the system, as well as the state merging theorem [HOW07], in which T A→EX corresponds to making a rank-|E| measurement and then storing the measurement result in the classical register X and the residual quantum state in E. One advantage of this generalization is that it allows us to choose T to be a very complex operation; one especially interesting example is to pick T to be the complementary channel (the channel to the environment) of a channel we are interested in coding for. Another advantage is the use of (smooth) conditional 2-entropies rather than purities and dimension bounds, as was done in all of these theorems (although, in the case of state merging, this was already done in [Ber08] and [BCR09], and, in the case of FQSW, by Hayden in [Hay06]). This theorem allows us to show directly that the environment is decoupled from any system of interest, which is usually what we need to show.
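The two ingredients named above (Haar-random unitaries, followed by a simple superoperator that traces out part of the system) can be sketched numerically. The toy below is our own illustration, not the thesis’s construction: it samples Haar-random unitaries on a two-qubit system via the phase-corrected QR decomposition and checks that, averaged over the unitary, tracing out one qubit of the rotated pure state leaves the other near maximally mixed, i.e. decoupled:

```python
import numpy as np

def haar_unitary(n, rng):
    """Sample from the Haar (uniform) measure on U(n): QR-decompose a
    complex Ginibre matrix and absorb the phases of R's diagonal so that
    the resulting Q is Haar-distributed."""
    z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    return q * (d / np.abs(d))

def trace_out_second_qubit(rho):
    """Partial trace over the second qubit of a 2-qubit density matrix."""
    return np.einsum('ikjk->ij', rho.reshape(2, 2, 2, 2))

rng = np.random.default_rng(0)
psi = np.zeros(4, dtype=complex)
psi[0] = 1.0                       # fixed pure input state on 2 qubits
avg = np.zeros((2, 2), dtype=complex)
n_samples = 500
for _ in range(n_samples):
    u = haar_unitary(4, rng)
    phi = u @ psi                  # Haar-random rotation of the input
    avg += trace_out_second_qubit(np.outer(phi, phi.conj()))
avg /= n_samples                   # approaches the maximally mixed state I/2
```

The exact Haar average of the reduced state is I/2; the empirical average converges to it at the usual Monte Carlo rate.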

Non-classical expected utility theory with application to type indeterminacy

7.2 Playing with a non-classical opponent. We consider a situation where our decision-maker faces uncertainty about the type (or preferences) of the agent that he is interacting with; that is, what we earlier called “Nature” is another decision-maker. The idea that agents (represented by their preferences and beliefs) may be viewed as non-classical systems was first proposed in Lambert-Mogiliansky, S. Zamir and H. Zwirn (2003) and further developed in e.g. Busemeyer et al. (2006a, 2006b) and Danilov and Lambert-Mogiliansky (2007). The motivation for this approach is that a variety of empirical phenomena, so-called behavioral anomalies, can be explained by representing uncertainty about the type (preferences) of a decision-maker with a non-Boolean ortholattice. In that context the term “type” is equivalent to the term “state” when talking about arbitrary systems. A decision situation (DS) is an ODU that measures a type characteristic.

A Multivariate Extreme Value Theory Approach to Anomaly Clustering and Visualization

representation of the extremal dependence structure is obtained when only a few such groups of variables can be exhibited (compared to 2^d − 1) and/or when these groups involve a small number of variables (with respect to d). Here we develop this framework further, in order to propose a (soft) clustering technique in the region of extremes and derive effective 2-d visual displays, shedding light on the structure of anomalies/extremes in sparse situations. This is achieved by modelling the distribution of extremes as a specific mixture model, where each component generates a different type α of extremes. In this respect, the present paper may be seen as an extension of Boldi and Davison (2007) and Sabourin and Naveau (2014), where a Bayesian inference framework is designed for moderate dimensions (d ≤ 10, say) and for situations where the sole group of variables with the potential of being simultaneously large is {1, . . . , d} itself. In the context of mixture modelling (see e.g. Frühwirth-Schnatter et al. (2018)), the Expectation-Maximization (EM) algorithm permits partitioning/clustering the set of extremal data through the statistical recovery of latent observations, as well as posterior probability distributions (inducing a soft clustering of the data in a straightforward manner) and, as a by-product, a similarity measure on the set of extremes: the higher the probability that their latent variables are equal, the more similar two extreme observations X and X′ are considered. The similarity matrix thus obtained naturally defines a weighted graph, whose vertices are the observed anomalies/extremes, paving the way for the use of powerful graph-mining techniques for community detection and visualization; see e.g. Schaeffer (2007), Hu and Shi (2015) and the references therein.
Beyond its detailed description, the proposed methodology is applied to a real fleet monitoring dataset in the aeronautics domain and is shown to provide useful tools for analyzing and interpreting abnormal data.
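The EM-based soft clustering and the induced similarity measure can be sketched in a few lines. The following is our own simplified stand-in: a plain two-component 1-D Gaussian mixture replaces the paper’s angular mixture for extremes, but the mechanics are the same, i.e. posterior responsibilities give a soft clustering, and S[i, j] = P(latent_i = latent_j) gives the similarity matrix of the weighted graph:

```python
import numpy as np

def em_gmm_1d(x, iters=60):
    """Two-component 1-D Gaussian-mixture EM. Returns responsibilities
    r[i, k] = posterior probability that x[i] came from component k."""
    mu = np.array([x.min(), x.max()], dtype=float)   # deterministic init
    sig = np.full(2, x.std() + 1e-6)
    w = np.full(2, 0.5)
    for _ in range(iters):
        # E-step: posterior responsibilities under current parameters
        dens = np.exp(-0.5 * ((x[:, None] - mu) / sig) ** 2) / sig
        r = w * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and scales
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sig = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-9
    return r

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(8.0, 1.0, 100)])
r = em_gmm_1d(x)
S = r @ r.T  # S[i, j] = P(same latent component): the similarity matrix
```

Thresholding or graph-clustering S then recovers communities of similar extremes, as described in the excerpt.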

A Reasonably Exceptional Type Theory

An alternative, and much lower-level, way to address the issue is to represent the effectful fragment of the type theory as a deep embedding of the syntax of this fragment inside the theory. This happens commonly in the implementation of compilers in some flavor of type theory, as in e.g. CompCert [Leroy et al. 2016] or CakeML [Kumar et al. 2014]. While this approach is extremely simple and readily available even in weak theories such as LF, it is completely oblivious to the advantages of dependent types. That is, the equational rules of the embedded language have to be applied explicitly in the proof, which in turn also requires proving that the properties of the host language are stable under these rules. As such, handling advanced features like higher-order functions is painful, let alone the preservation of typing of the various programs being considered.
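The flavor of a deep embedding can be conveyed with a toy, which is our own illustration and not the paper’s construction: the syntax of a tiny effectful language (arithmetic plus an exception) is ordinary data in the host language, and every equational rule of the embedded language, such as the propagation of the exception through addition, must be spelled out explicitly in the evaluator:

```python
from dataclasses import dataclass

class Expr:
    """Deeply embedded syntax: each constructor is a node, not host code."""

@dataclass
class Lit(Expr):
    value: int

@dataclass
class Add(Expr):
    left: Expr
    right: Expr

@dataclass
class Raise(Expr):
    pass                            # the "exceptional" effect

@dataclass
class Try(Expr):
    body: Expr
    handler: Expr

def eval_expr(e):
    """Big-step evaluator; None models the raised exception. Note how the
    effect's propagation rules are coded by hand at every constructor."""
    if isinstance(e, Lit):
        return e.value
    if isinstance(e, Add):
        l = eval_expr(e.left)
        if l is None:
            return None             # exception propagates through the left arm
        r = eval_expr(e.right)
        return None if r is None else l + r
    if isinstance(e, Raise):
        return None
    if isinstance(e, Try):
        v = eval_expr(e.body)
        return eval_expr(e.handler) if v is None else v
```

The burden the excerpt describes is visible even at this scale: nothing about the host language helps reason about the embedded effect, so every rule lives in the evaluator and in proofs about it.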


Normalisation & Equivalence in Proof Theory & Type Theory

Starting from the admissibility of the cut-rule in G3ii, we use our framework with terms called λG3 to relate inductive proofs of term-irrelevant admissibility to rewrite systems that eliminate the cut-constructor. These systems in fact make sense even without the notion of typing for λG3, although we do need typing to prove their strong normalisation. We identify the structure of such rewrite systems that perform cut-elimination in a typed framework, in that they are made of a kernel that reduces principal cuts/cut constructors and propagation systems which may vary. In this generic framework we show the critical pairs of these systems, which can be solved in two canonical ways leading to the introduction of a generic notion of CBN and CBV sub-systems. We present three kinds of propagation system with a comparative approach, essentially by investigating their ability to simulate β-reduction through Gentzen’s or Prawitz’s encodings described in Chapter 2. We also compare the CBN and CBV equational theories that these propagation systems produce.

Mapping walkability. A subjective value theory approach

in Figure 3) were selected considering both location and walkability features, and their images, from Google Street View, were inserted in the questionnaire. We then investigated which path the interviewees would prefer and choose to walk to a particular destination in the city. Our final aim was to order the streets according to citizens’ preferences. For this purpose, we asked citizens to judge the walkability of a list of 10 streets, represented by pictures, on a Likert-type scale from 0 to 3. The reader should note that for this work we did not really use the “Likert” property of these scales, which were treated as just ordinal scales (with 4 values or grades). The reason is that we actually constructed value functions upon these scales (an interval scale of differences of preferences among the possible grades). We chose to use a Likert scale in order to facilitate the responses of the interviewees. For each street picture the question posed was: “Do you think this road is suitable for walking?” (give a rating from 0 to 3, with 0 corresponding to the negative answer “not suitable for walking […]
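The distinction the excerpt draws between ordinal grades and a constructed value function can be sketched as follows. This is a hypothetical illustration: the mapping from grades to values, with deliberately unequal steps, and the aggregation rule are invented for the example, not taken from the paper:

```python
# Ordinal grades 0-3 are mapped to an interval scale of preference
# differences; the unequal steps are what distinguishes this constructed
# value function from naively treating the Likert grades as numbers.
GRADE_VALUE = {0: 0.0, 1: 0.2, 2: 0.6, 3: 1.0}   # invented weights

def street_score(ratings):
    """Aggregate one street's citizen ratings on the constructed scale."""
    return sum(GRADE_VALUE[g] for g in ratings) / len(ratings)

def rank_streets(survey):
    """survey: {street: [grades 0-3]} -> streets ordered by walkability."""
    return sorted(survey, key=lambda s: street_score(survey[s]), reverse=True)
```

With, say, survey = {"A": [3, 3, 2], "B": [1, 1, 0], "C": [2, 2, 2]}, the ordering reflects value differences rather than raw grade sums.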

A Uniform Approach to Analogies, Synonyms, Antonyms, and Associations

As far as we know, this is the first time a standard supervised learning algorithm has been applied to any of these four problems. The advantage of being able to cast these problems in the framework of standard supervised learning problems is that we can now exploit the huge literature on supervised learning. Past work on these problems has required implicitly coding our knowledge of the nature of the task into the structure of the algorithm. For example, the structure of the algorithm for latent semantic analysis (LSA) implicitly contains a theory of synonymy (Landauer and Dumais, 1997). The problem with this approach is that it can be very difficult to work out how to modify the algorithm if it does not behave the way we want. On the other hand, with a supervised learning algorithm, we can put our knowledge into the labeling of the feature vectors, instead of putting it directly into the algorithm. This makes it easier to guide the system to the desired behaviour.
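The point about putting knowledge into labeled feature vectors rather than into algorithm structure can be sketched with a toy classifier. Everything here is invented for illustration, including the two-dimensional pair features and the choice of nearest-centroid as the generic learner; the paper’s actual features and learner are not reproduced:

```python
import numpy as np

def fit_centroids(X, y):
    """Generic supervised learner: one centroid per class label."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(centroids, x):
    """Assign a word-pair feature vector to the nearest class centroid."""
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

# Toy word-pair feature vectors: the task knowledge lives in the labels,
# not in the structure of the learning algorithm.
X = np.array([[0.9, 0.1], [0.8, 0.2],    # synonym-like pairs
              [0.1, 0.9], [0.2, 0.8]])   # antonym-like pairs
y = np.array(["syn", "syn", "ant", "ant"])
model = fit_centroids(X, y)
```

Swapping the labels, or relabeling the same vectors for a different relation, retrains the same generic algorithm for a different task, which is exactly the flexibility the excerpt argues for.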

Reference transmission network: A game theory approach

The existing transmission network has been designed, planned, and built in a vertically integrated environment, following the traditional planning criteria. Since the system was operated by a single entity, transmission planning was required to achieve a certain level of efficiency in a framework where only centralized and coordinated decisions were made. In [3], an extensive classification of transmission expansion planning models and of different synthesis algorithms for transmission planning is presented. The synthesis planning models can be classified as mathematical optimization or heuristic methods. The former type includes linear programming, dynamic programming, nonlinear programming, mixed integer programming, and optimization techniques such as Benders decomposition and hierarchical decomposition [4]–[6]. The latter includes simulated annealing, tabu search, expert systems, fuzzy set theory, and greedy randomized adaptive search procedures [7]–[10].

A Possibility Theory-based Approach to Desire Change

The paper is organized as follows. In Section 2, we highlight the main intuitions behind the concept of desire change, in contrast with the concept of belief change, from a philosophical and AI perspective. Section 3 introduces the idea of a hedonic entrenchment relation that rank-orders desires, and provides axioms for such a relation, whose unique numerical counterpart is a guaranteed possibility distribution, associated with a guaranteed possibility measure in the sense of possibility theory. In Section 4, desires are then represented in this setting. The guaranteed possibility distribution enables us to associate any set of desires with a level of unacceptability, which is the counterpart of the level of inconsistency for a set of beliefs represented in a possibilistic logic manner. Section 5 provides axioms for desire revision, and Section 6 presents the revision of sets of prioritized desires axiomatically, semantically, and syntactically using a special type of possibilistic logic. Expansion and contraction of desires are also characterized and discussed.
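The guaranteed possibility measure mentioned above has a simple standard form in possibility theory: Δ(A) = min over worlds ω in A of δ(ω), read as "every world in A is satisfactory to degree at least Δ(A)", which is the right modality for desires. The sketch below shows only this underlying measure with an invented distribution; the paper’s unacceptability level and revision operators are not reproduced:

```python
def guaranteed_possibility(delta, A):
    """Delta(A) = min_{w in A} delta(w), for delta: {world: degree in [0,1]}
    and A an iterable of worlds. Anti-monotone: enlarging A cannot raise it."""
    return min(delta[w] for w in A)

# Invented guaranteed possibility distribution over three worlds.
delta = {"w1": 0.9, "w2": 0.6, "w3": 0.1}
```

Note the contrast with ordinary possibility Π(A) = max over A: Δ decreases as the set grows, which matches the intuition that desiring a disjunction commits one to being satisfied by its worst disjunct.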


The Uniform geometrical Theory of Diffraction for elastodynamics: Plane wave scattering from a half-plane

I. INTRODUCTION. The scattering of elastic waves from an obstacle is of great interest in ultrasonic non-destructive evaluation (NDE), the main scattering phenomena being specular reflection and diffraction. When both the wavefront of an incident wave and the boundary of the scattering object can be modeled as locally plane, the scattered field is usually simulated using the Kirchhoff Approximation (KA), which represents the field as an integral over the scattering surface. KA gives a reliable and continuous description of the specular reflections and of the fictitious fields compensating the incident field in the obstacle’s shadow, which together form the so-called Geometrico-Elastodynamic (GE) field. However, the diffracted fields are not always well described. In the absence of interaction with other waves, the best description of diffracted fields is obtained via the Geometrical Theory of Diffraction (GTD). This postulates the existence of rays diffracted from structural irregularities such as edges or tips, in addition to the incident and reflected rays, and also gives a recipe for calculating the amplitudes carried by these rays. In elastodynamics the underlying canonical problem is the scattering of a plane wave by a stress-free half-plane. […]

Corporate Reputation and Social Media: A Game Theory Approach

1 Introduction. Five hundred million messages are sent every day on Twitter. If one considers a message from a Chinese fortune cookie to be as long as a tweet (140 symbols), then these tweets represent more than 2500 tons of crispy dough. Daily. Twitter is just one example, and we can add Facebook, Google+, LinkedIn, etc. The multiplication of social networks is at the origin of the often-cited “buzz”. The exponential replication and amplification of information is of paramount importance for firms in this day and age. Reputation is and will remain the most valuable asset of a firm. Paradoxically, a firm’s reputation is treated as an intangible asset, since we do not know how to measure it. In the digital age, a firm’s reputation is at once more and more important and more and more exposed through the buzz created on social media (De Marcellis-Warin and Teodoresco, 2012).
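The fortune-cookie comparison checks out as a back-of-the-envelope calculation if one assumes a cookie weighs about 5 g (our assumption; the text gives no per-cookie figure):

```python
# 500 million tweets/day, one fortune cookie per tweet at ~5 g each,
# converted to metric tons (1 t = 1,000,000 g).
tweets_per_day = 500_000_000
grams_per_cookie = 5          # assumed cookie mass, not from the text
tons = tweets_per_day * grams_per_cookie / 1_000_000
```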

A Principal-Agent Theory Approach to Public Expenditure Management Systems in Developing Countries

information. But the LM’s effort may also comprise some negative actions (such as corruption), and this leads us to assume that cheating entails some concealment costs. This allows us to make the link with the economic literature on collusion in organizations. The literature distinguishes two types of collusion costs, according to whether they are exogenous (e.g., negotiation costs, “physical” strategies to divert monies from their intended purposes) or endogenous (e.g., costs stemming from the risk of future detection; see Faure-Grimaud, Laffont, and Martimort, 1999; Khalil and Lawarrée, 2003). Ways in which the principal can avoid corruption include (i) creating incentive payments, (ii) decreasing the stakes of collusion, and (iii) increasing the transaction cost of collusion (Laffont and Rochet, 1997). In this section, we introduce an exogenous cost of cheating and explain how it affects the constraints of the MoF’s problem. In a second step, we interpret ex ante controls, undertaken by the MoF before the commitment and/or the payment of the LM’s expenditures, as increasing the cost of cheating. We then discuss the relative value of ex post and ex ante controls.

Counterexample to a Lyapunov Condition for Uniform Asymptotic Partial Stability

Jakub Orłowski, Antoine Chaillet, and Mario Sigalotti. Abstract: Partial stability characterizes dynamical systems for which only a part of the state variables exhibits a stable behavior. In his book on partial stability, V. I. Vorotnikov proposed a sufficient condition to establish this property through a Lyapunov-like function whose total derivative is upper-bounded by a negative definite function involving only the sub-state of interest. In this note, we show with a simple two-dimensional system that this statement is wrong in general. More precisely, we show that the convergence rate of the relevant state variables may not be uniform in the initial state. We also discuss the impact of this lack of uniformity on the connected issue of robustness with respect to exogenous disturbances.