
(1)

Normalisation and Deep Inference

Course at ESSLLI 2015 in Barcelona

Anupam Das and Alessio Guglielmi

École Normale Supérieure de Lyon and University of Bath
10–14 August 2015

These slides are available at http://cs.bath.ac.uk/ag/t/NDI.pdf
Course web page at http://cs.bath.ac.uk/ag/ESSLLI/
Deep inference web site at http://alessio.guglielmi.name/res/cos/

(2)

Outline

Introduction

Topics and objectives

The problem of proof identity

Lecture 1: Open deduction and the `experiments' method
Definition of open deduction
Properties of open deduction and deep inference
Locality and splitting
Atomic flows (informally)
Cut elimination by `experiments'

Lecture 2: Atomic flows and streamlining
Definition of atomic flow
Local transformations of atomic flows
Streamlining
The path breaker
Complexity and atomic flows

Lecture 3: Decomposition

(3)

Outline (cont.)

Separating contractions and modalities

Example: Building a logic for a Kleene algebra, called KV …
Example: … proving splitting for KV …
Example: … proving decomposition for KV by flows …
Example: … and finally getting cut elimination for KV

Lecture 4.0: Leftovers from Lecture 2: Some remarks on the complexity of atomic flows
Complexity of sequentialising atomic flows
Complexity of normalising cut-free proofs

Lecture 4.1: Quasipolynomial-time cut-elimination in deep inference
Motivation
Monotone threshold formulae
Basic counting arguments
Quasipolynomial-time normalisation
Conclusions

Lecture 4.2: A short survey of complexity results for KS

(4)

Outline (cont.)

Restating some basic simulations

Separations via the functional pigeonhole principle

Appendix: Some proofs for Lecture 4
Proof of the main lemma
Self-contained proof of simulation of truth tables
Proofs for the functional pigeonhole principle

Lecture 5.1: Quasipolynomial-size proofs via atomic flows
The unrestricted pigeonhole principle
Symmetry of counting formulae
A high-level proof of the pigeonhole principle
Conclusions

Lecture 5.2: Towards a solution to the proof identity problem
Substitutions and Formalism B

References

(5)

Topics

Proof theory and theoretical computer science.

Representation, manipulation, composition and complexity of proofs and algorithms.

Some of our objectives:

- attacking the problem of the identity of proofs (and algorithms);
- simplifying and extending proof theory;
- giving a reasonable normalisation theory to logics that need one;
- integrating process algebras into proof theory.

This course will illustrate all these points.

(6)

Prerequisites

An acquaintance with the basics of Gentzen proof theory and analyticity [37] is helpful.

An acquaintance with some linear logic [18] is also helpful but not necessary.

Unfortunately there is no textbook yet on deep inference. Everything is still pretty new and the notation is not well-established. What we show here requires a bit of maturity just to cope with the different notation styles in the various papers.

(8)

Moral objectives of this course

We want to change the way you look at proofs, by adopting a geometric perspective.

This way we get rid of most of proof bureaucracy and we can focus on the important factors that affect proofs, their complexity and their normalisation procedures.

(9)

Technical objectives of this course

We introduce atomic flows and their transformations and investigate the main proof normalisation techniques that benefit from them:

1. cut elimination by experiments;
2. streamlining by normalisers;
3. cut elimination by decomposition (+ splitting);
4. quasipolynomial-time cut elimination by threshold formulae.

These techniques are in order of increasing dependency on logical structure and each corresponds to one of the first four lectures.

We will show the impact of thinking about proof theory via atomic flows with two in-depth examples:

1. the development of a new logic and its normalisation theory;
2. improving the complexity of proofs and their normalisation.

(11)

The problem of proof identity

Hilbert himself thought about this problem, but it is very hard to formulate it in a technical way. Nonetheless there is progress.

(12)

Proof Systems

Proof = string of symbols.

Proof system = algorithm checking proofs in polytime.

Theorem 1 (Cook and Reckhow [12])
There is a super proof system iff NP = co-NP, where

super = with polysize proofs over each proved tautology.

Therefore proofs can be almost anything but they better be small.
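
To make the definition concrete, here is a minimal sketch, in Python, of a proof system in the Cook-Reckhow sense: a polynomial-time checker for resolution refutations of CNF formulas. It is only an illustration of the definition above, not a system used in the course; the clause encoding and the function name `check_resolution` are our own assumptions, not notation from the slides.

```python
# A minimal sketch of a proof system in the Cook-Reckhow sense:
# a polynomial-time checker for resolution refutations of a CNF.
# Clauses are frozensets of integer literals (negation = minus sign);
# a proof step is either ("axiom", clause) or ("resolve", i, j, pivot).
# All names here are illustrative, not notation from the course.

def check_resolution(cnf, proof):
    """Return True iff `proof` is a valid resolution refutation of `cnf`."""
    derived = []
    for step in proof:
        if step[0] == "axiom":
            clause = frozenset(step[1])
            if clause not in cnf:
                return False                 # may only copy input clauses
        elif step[0] == "resolve":
            _, i, j, pivot = step
            ci, cj = derived[i], derived[j]
            if pivot not in ci or -pivot not in cj:
                return False                 # pivot must occur with both signs
            clause = (ci - {pivot}) | (cj - {-pivot})
        else:
            return False
        derived.append(clause)
    return len(derived) > 0 and derived[-1] == frozenset()   # ends with the empty clause

# Example: refute the CNF {x} and {not x}, encoded as {{1}, {-1}}.
cnf = {frozenset({1}), frozenset({-1})}
proof = [("axiom", {1}), ("axiom", {-1}), ("resolve", 0, 1, 1)]
print(check_resolution(cnf, proof))   # True
```

Checking a candidate proof runs in time polynomial in its length; Theorem 1 is about whether some such checker can also admit polysize proofs for every tautology.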

(16)

The problem of proof identity: Questions

This is sometimes called `Hilbert's 24th problem' (see [38]).

CAN WE DECIDE WHETHER TWO PROOFS ARE THE SAME?

CAN WE DECIDE WHETHER TWO ALGORITHMS ARE THE SAME?

Questions:

- What is a good representation?
- What is a good decision procedure?

(19)

The problem of proof identity: State of the art

What we have:

- Representation: plenty of formalisms, i.e., syntax: e.g., Hilbert-Frege systems, Gentzen systems, resolution, logic and functional programming languages, …, C++.
  ← Mathematically poor.

- Mathematically powerful abstraction methods, i.e., semantics.
  ← Driven by mathematically poor computation models.

- Inadequate decision procedures.
  ← Especially for concurrent algorithms.

(23)

The problem of proof identity: Example 1

We could say that two proofs are the same if they normalise to the same proof. This `works' for intuitionistic logic (it is basically computation in the λ-calculus).

Perhaps OK! But:

- we equate proofs of wildly different sizes, because cut-elimination is at least exponential;
- the decision cost corresponds to the computation cost, i.e. it can be huge;
- this idea fails for classical logic, because cut-elimination is non-confluent. ← In Gentzen!

One big problem is that semantics usually does not take size into consideration (only results matter).

Another big problem (in AG's opinion) is that we should design languages starting from how we compute with them, not from what we compute. Gentzen was mostly interested in the `what' and his view has not been sufficiently challenged.

(31)

The problem of proof identity: Example 2

Lafont's counterexample [19]: the proof

\[
\dfrac{\;\dfrac{\begin{matrix}\pi_1\\ \vdash A\end{matrix}}{\vdash A,\,C}\;\mathsf{w}
\qquad
\dfrac{\begin{matrix}\pi_2\\ \vdash B\end{matrix}}{\vdash \bar{C},\,B}\;\mathsf{w}\;}
{\vdash A,\,B}\;\mathsf{cut}
\]

reduces to either

\[
\dfrac{\begin{matrix}\pi_1\\ \vdash A\end{matrix}}{\vdash A,\,B}\;\mathsf{w}
\qquad\text{or}\qquad
\dfrac{\begin{matrix}\pi_2\\ \vdash B\end{matrix}}{\vdash A,\,B}\;\mathsf{w}\,.
\]

The lack of confluence is just lack of compositional freedom and too strong a dependency on syntax. Semantics follows blindly, unfortunately. Just take

π1 ∨ π2 !

(34)

The problem of proof identity: Our path towards the solution

Slogan: good semantics = good notion of proof identity.

Normalisation is central.

Slogan: computation = normalisation, but not just in the Curry-Howard sense.

Normalisation strongly depends on composition (because normalising means shuffling proofs).

Syntax (bureaucracy) is an enemy, and (existing) semantics is not necessarily a friend.

(39)

Problem: getting rid of bureaucracy in proofs

[Embedded picture: several sequent-calculus derivations of the same conclusion mapped to the proof nets they collapse to. Caption: Figure 2: From sequent calculus to proof nets via coherence graphs.]

Picture taken from Lutz Straßburger's lecture [34] at ESSLLI 2006

Idea: in order to compare proofs

- from `different' Gentzen sequent proofs we get possibly isomorphic proof nets [18],
- but they are too small: for propositional logic, they probably do not form a proof system.

That said, geometry is a good way of capturing symmetries that syntax struggles with.
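
As a toy illustration of the identification criterion just mentioned (comparing proofs by checking whether their proof nets are isomorphic), here is a brute-force sketch in Python; the two hand-coded nets and the labelling scheme are our own assumptions, not structures from the slides.

```python
# A toy sketch of `two proofs are the same when their proof nets are
# isomorphic'.  The little nets below are hand-made illustrations:
# vertices are labelled atom occurrences, edges are the net's links.
from itertools import permutations

def isomorphic(g1, g2):
    """Brute-force isomorphism test for small labelled graphs.

    A graph is (labels, edges): labels maps vertex -> label,
    edges is a set of frozensets {u, v}.
    """
    labels1, edges1 = g1
    labels2, edges2 = g2
    v1, v2 = sorted(labels1), sorted(labels2)
    if len(v1) != len(v2):
        return False
    for perm in permutations(v2):
        phi = dict(zip(v1, perm))
        if all(labels1[v] == labels2[phi[v]] for v in v1) and \
           {frozenset({phi[u], phi[v]}) for (u, v) in map(tuple, edges1)} == edges2:
            return True
    return False

# Two nets for the same conclusion, obtained from sequent proofs that differ
# only by a rule permutation; they have the same axiom links, hence are isomorphic.
net_a = ({1: "a", 2: "a_bar", 3: "a", 4: "a_bar"},
         {frozenset({1, 2}), frozenset({3, 4})})
net_b = ({10: "a_bar", 20: "a", 30: "a_bar", 40: "a"},
         {frozenset({10, 20}), frozenset({30, 40})})
print(isomorphic(net_a, net_b))   # True
```

Brute force is fine for toy nets; in general, graph isomorphism is not known to be decidable in polynomial time, which already hints at the cost of such decision procedures.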
