Communication security: Formal models and proofs

Academic year: 2022


(1)

Communication security: Formal models and proofs

Hubert Comon

September 14, 2016

(2)

The context (I)

• credit cards

• contactless cards

• telephones

• online transactions

• cars, fridges, ... Internet of Things

• Big Brother: NSA

• biomedical applications

• ...

(3)

The context (II)

(4)

The context (III)

• Security protocols

• Testing is not very useful

• Hiding the code is not a good idea

• The scope of formal methods

(5)

A simple handshake protocol

A → B: νn,r. aenc(⟨A,n⟩, pk(skB), r)
B → A: νr'. aenc(n, pk(skA), r')
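For illustration only (not part of the lecture), the two protocol messages can be written down as symbolic terms; a minimal Python sketch where terms are nested tuples and all names (A, n, r, skA, skB) are plain strings:

```python
# Minimal sketch (assumption: terms as nested tuples, names as strings).
def pair(u, v):          # <u, v>
    return ("pair", u, v)

def aenc(u, key, r):     # aenc(u, key, r): randomized asymmetric encryption
    return ("aenc", u, key, r)

def pk(sk):              # pk(sk): public key of the secret key sk
    return ("pk", sk)

# A -> B: nu n, r. aenc(<A, n>, pk(skB), r)
msg1 = aenc(pair("A", "n"), pk("skB"), "r")
# B -> A: nu r'. aenc(n, pk(skA), r')
msg2 = aenc("n", pk("skA"), "r2")

print(msg1)  # ('aenc', ('pair', 'A', 'n'), ('pk', 'skB'), 'r')
```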

(6)

The formal verification problem

∀A. A ∥ P ⊨ φ

∀A. A ∥ P1 ∼ A ∥ P2

Universal quantification on A: we cannot directly apply model-checking techniques.

One important issue: what is the range of A?


(10)

Attacker models

The DY attacker

Messages are terms; the attacker is defined through an equational theory or an inference system.

The computational attacker

Messages are bitstrings; the attacker is a probabilistic polynomial-time Turing machine.

Other attackers


(14)

Goals of the lecture

Verification inputs

• Cryptographic libraries

• Protocol programs

• Attacker model

• Security property

Goals of the lecture

Show how to derive the proof obligations in a parametric way, abstracting from crypto libraries and attacker models.

Focus on the semantics of protocols, for arbitrary libraries and attacker models.


(16)

Roadmap

Four successive versions of the calculus, by increasing expressiveness (we could have considered the last case only...):

1. Simple case
2. Adding events: required for agreement properties
3. Adding replication
4. Adding channel generation: required for computational semantics

Then indistinguishability properties (privacy).

(17)

Cryptographic libraries

Syntax

• An arbitrary set of cryptographic primitives F: hash, public-key encryption(s), symmetric encryption(s), zkp, ..., represented by (typed) function symbols

• At least one random generation algorithm. Random numbers are represented by names n, n1, r, ... out of a set N

Terms are built over variables, function symbols, and names.

(18)

Cryptographic libraries

Semantics

M is an interpretation domain: typically ground or constructor terms (the DY semantics) or bitstrings (the computational semantics).

M includes error messages (exceptions) Err.

If σ is an environment (a mapping from variables to M) and u is a term, [[u]]^M_σ is the interpretation of u in M w.r.t. σ: M is a (partial) F-algebra.

The interpretation is strict:

u_i ∈ Err ⇒ [[f(u1, ..., un)]]^M_σ ∈ Err

(19)

Cryptographic libraries

A possible set of function symbols

• aenc(u, pk, r) is (supposed to be) the asymmetric encryption of u with the public key pk and random input r

• dec(u, sk) is (supposed to be) the decryption of u with the secret key sk

• pk(sk) is (supposed to be) the public key associated with the secret key sk

• ⟨u, v⟩ is the pairing of u and v

• π1(u), π2(u) are the projections

(20)

Cryptographic libraries

A DY model

M_DY (messages) is the least set of ground terms such that:

• N ⊆ M_DY
• if u, v ∈ M_DY then ⟨u, v⟩ ∈ M_DY
• if k ∈ N then pk(k) ∈ M_DY
• if u ∈ M_DY and k, r ∈ N, then aenc(u, pk(k), r) ∈ M_DY

M_DY also includes special error terms Err (which are not messages).

dec(aenc(u, pk(k), r), k) → u   for k, r ∈ N, u a message
π1(⟨u, v⟩) → u   if u, v are messages
π2(⟨u, v⟩) → v   if u, v are messages

[[u]]^{M_DY}_σ = uσ↓

Any irreducible ground term that is not a message is an error.
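The three rewrite rules can be prototyped directly; a sketch under the same assumptions as before (terms are nested Python tuples, names are strings):

```python
# Sketch of the DY rewrite rules; terms are nested tuples, names are strings.
def reduce_head(t):
    """Apply one rewrite rule at the root of t, or return None."""
    if not isinstance(t, tuple):
        return None
    # dec(aenc(u, pk(k), r), k) -> u
    if t[0] == "dec" and isinstance(t[1], tuple) and t[1][0] == "aenc":
        if t[1][2] == ("pk", t[2]):
            return t[1][1]
    # pi1(<u, v>) -> u   and   pi2(<u, v>) -> v
    if t[0] in ("pi1", "pi2") and isinstance(t[1], tuple) and t[1][0] == "pair":
        return t[1][1] if t[0] == "pi1" else t[1][2]
    return None

def normalize(t):
    """Compute the normal form (u-down): subterms first, then the root."""
    if isinstance(t, tuple):
        t = (t[0],) + tuple(normalize(s) for s in t[1:])
    r = reduce_head(t)
    return normalize(r) if r is not None else t

enc = ("aenc", ("pair", "A", "n"), ("pk", "skB"), "r")
print(normalize(("pi2", ("dec", enc, "skB"))))  # n
```

A term such as dec(enc, skA), irreducible but not a message, would be classified as an error in the lecture's model; this sketch simply leaves it untouched.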

(21)

Cryptographic libraries

Computational models

• η ∈ ℕ is a security parameter

• τ maps N to {0,1}^η

• Mc(τ,η) ⊆ {0,1}*

• [[n]]_{Mc(τ,η)} = τ(n)

• aenc(·,·,·), dec(·,·), pk(·) are interpreted as a public-key encryption scheme

• with an interpretation of pairing/projections, Mc(τ,η) is an F-algebra

(22)

A simple process calculus

Syntax

P ::= 0                          null process (stalled)
    | in(x).P                    input of x (binds x)
    | out(t).P                   output of t
    | if EQ(u,v) then P else P   conditional branching
    | let y = u in P             evaluation (binds y)
    | νn.P                       random generation
    | P ∥ P                      parallel composition

All variable occurrences are bound.
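The grammar can be mirrored as an abstract syntax tree; a Python sketch (the constructor names are my own, not the lecture's):

```python
# Sketch: one dataclass per production of the grammar above.
from dataclasses import dataclass
from typing import Any

@dataclass
class Zero:                       # 0 : null process (stalled)
    pass

@dataclass
class In:                         # in(x).P : input of x (binds x)
    x: str
    P: Any

@dataclass
class Out:                        # out(t).P : output of t
    t: Any
    P: Any

@dataclass
class IfEq:                       # if EQ(u,v) then P else Q
    u: Any
    v: Any
    P: Any
    Q: Any

@dataclass
class Let:                        # let y = u in P : evaluation (binds y)
    y: str
    u: Any
    P: Any

@dataclass
class New:                        # nu n. P : random generation
    n: str
    P: Any

@dataclass
class Par:                        # P || Q : parallel composition
    P: Any
    Q: Any

# nu n. out(n). 0
example = New("n", Out("n", Zero()))
print(example)
```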

(23)

Example

The simple handshake protocol

A → B: νn,r. aenc(⟨A,n⟩, pk(skB), r)
B → A: νr'. aenc(n, pk(skA), r')

A(ska, pk(skb)) =
  νn,r. out(aenc(⟨pk(ska), n⟩, pk(skb), r)).
  in(z). let z1 = dec(z, ska) in
  if EQ(z1, n) then 0 (Success) else 0 (Fail)

B(skb) =
  νr'. in(x). let y = dec(x, skb) in
  let y1 = π1(y) in let y2 = π2(y) in
  out(aenc(y2, y1, r')). 0

νska,skb. out(⟨pk(ska), pk(skb)⟩). (A(ska, pk(skb)) ∥ B(skb))


(30)

Structural equivalence

0 ∥ P ≡ P    P ∥ Q ≡ Q ∥ P    P ∥ (Q ∥ R) ≡ (P ∥ Q) ∥ R

νn.P ≡ νn'.P{n ↦ n'}    in(x).P ≡ in(x').P{x ↦ x'}    let x = u in P ≡ let x' = u in P{x ↦ x'}

(νn.P) ∥ Q ≡ νn.(P ∥ Q)   if n ∉ freenames(Q)

(31)

Operational semantics

States of the network are tuples (φ, σ, P), where

• φ is a frame of the form ν n̄. m1, ..., mk, where n̄ is a set of names (used so far) and m1, ..., mk is a sequence of values in M (that have been sent out so far)

• σ is an environment: an assignment of the free variables to values in M

• P is a process

The semantics is a labeled transition system, whose labels are the inputs provided by the attacker (sometimes an empty input).

(32)

Operational semantics

The transition system (I)

(φ, σ, in(x).P)  --u-->  (φ, σ ⊎ {x ↦ u}, P)

If (φ, σ, P) --u--> (φ', σ', P') and [[s]]^M_σ = [[t]]^M_σ ∉ Err:
(φ, σ, if EQ(s,t) then P else Q)  --u-->  (φ', σ', P')

If (φ, σ, Q) --u--> (φ', σ', P') and [[s]]^M_σ ≠ [[t]]^M_σ, or [[s]]^M_σ ∈ Err, or [[t]]^M_σ ∈ Err:
(φ, σ, if EQ(s,t) then P else Q)  --u-->  (φ', σ', P')
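A sketch of the first rule (the input step) over a toy state representation: (frame, environment, process), with the process encoded as nested tuples. Everything here is illustrative, not the lecture's notation:

```python
# Sketch of (phi, sigma, in(x).P) --u--> (phi, sigma + {x -> u}, P).
def step_input(state, u):
    """The attacker supplies u; it is bound to x and the process advances."""
    frame, env, proc = state
    assert proc[0] == "in", "rule only applies to an input process"
    _, x, P = proc
    assert x not in env, "the disjoint union requires x to be fresh"
    return (frame, {**env, x: u}, P)

s0 = ((), {}, ("in", "z", ("out", "z", ("nil",))))
s1 = step_input(s0, "attacker-input")
print(s1)  # ((), {'z': 'attacker-input'}, ('out', 'z', ('nil',)))
```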


(35)

Operational semantics

The transition system (II)

If [[u]]^M_σ = w ∉ Err:
(φ, σ, let x = u in P)  -->  (φ, σ ⊎ {x ↦ w}, P)

(ν n̄.θ, σ, out(s).P)  -->  (ν n̄.θ·[[s]]^M_σ, σ, P)

If (φ, σ, P) --u--> (φ', σ', P'):
(φ, σ, P ∥ Q)  --u-->  (φ', σ', P' ∥ Q)

If n ∉ n̄ ∪ freenames(θ):
(ν n̄.θ, σ, νn.P)  -->  (ν n̄⊎{n}.θ, σ, P)


(39)

Example

On the blackboard.

(40)

Restricting the feasible transitions

1, σ1,P1) −→ · · ·u1 −−−→uk−1k, σk,Pk)

is possible w.r.t. modelMand an attacker Aif , for every i, A([[φi]]Mσi ,Pi) = [[ui]]Mσi

Note: could include a state inA.

(41)

Example DY

There is a DY attacker A such that A(φ) = [[u]]^{M_DY}_σ iff φ ⊢_I uσ↓,

where I is defined by:

φ ⊢ u1   ···   φ ⊢ un
─────────────────────   for every f ∈ F
φ ⊢ f(u1, ..., un)↓

ν n̄. u1, ..., un ⊢ ui

ν n̄.θ ⊢ n'   if n' ∈ N \ n̄
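The decomposition direction of DY deduction can be prototyped as a fixed-point computation; a partial sketch (it saturates a frame under projections and decryption with known keys only; composition with f ∈ F is omitted, and terms are nested tuples as in the earlier examples):

```python
# Partial sketch of DY deduction: saturate the frame under decompositions.
# Terms are nested tuples ("pair", u, v) / ("aenc", u, ("pk", k), r);
# names are strings.
def deducible(frame, target):
    know = set(frame)
    while True:
        new = set()
        for t in know:
            if isinstance(t, tuple) and t[0] == "pair":
                new |= {t[1], t[2]}           # projections
            if (isinstance(t, tuple) and t[0] == "aenc"
                    and t[2] in {("pk", k) for k in know}):
                new.add(t[1])                 # decryption with a known key
        if new <= know:
            return target in know
        know |= new

enc = ("aenc", ("pair", "A", "n"), ("pk", "skB"), "r")
print(deducible({enc}, "n"))          # False: the key skB is unknown
print(deducible({enc, "skB"}, "n"))   # True: decrypt, then project
```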

(42)

Exercise

In the simple handshake example, describe all feasible transition sequences in the DY model (assume that name extrusion, lets, conditionals, and outputs are always performed before inputs).

Is the nonce n secret?

(43)

Example computational

A is a probabilistic polynomial-time Turing machine (PPT).

Some inputs that were not possible in the DY model might now be possible.

A typical example

A might be able to compute (with a significant probability) [[aenc(u, pk(k1), r1)]]_{Mc(τ,η)} from [[aenc(v, pk(k1), r1)]]_{Mc(τ,η)}:

∃A, Prob{τ, ρ : A([[aenc(v, pk(k1), r1)]]_{Mc(τ,η)}) = [[aenc(u, pk(k1), r1)]]_{Mc(τ,η)}} > ε(η)

where ε is non-negligible: there is a polynomial Pol such that lim inf_{η→+∞} ε(η) × Pol(η) > 1.


(45)

Confidentiality

In the DY case

Is there a DY attacker A and a feasible transition sequence (∅, ∅, P) →* (φ, σ, Q) such that A(φ, Q) = s?

This problem is in NP.

In the computational case

Is there a PPT A such that, for every computational model Mc(τ,η), the probability that there is a feasible sequence (∅, ∅, P) →* (φ, σ, Q) such that A(φ, Q) = s is non-negligible in η?

In general, this requires assumptions on the libraries.


(48)

Exercises

In the following cases, give reasonable processes A, B and either give an attack on the confidentiality of s or prove that there is no such attack in the DY model.

1.

A → B: νs,νr. ⟨pk(skA), aenc(s, pk(skB), r)⟩
B → A: νr'. ⟨pk(skB), aenc(s, pk(skA), r')⟩

P = νska,νskb. out(⟨pk(skA), pk(skB)⟩)·(A(ska, pk(skB)) ∥ B(skb))

2.

A → B: νs,r1,r2. aenc(⟨pk(skA), aenc(s, pk(skB), r1)⟩, pk(skB), r2)
B → A: νr3,r4. aenc(⟨pk(skB), aenc(s, pk(skA), r3)⟩, pk(skA), r4)

P = νska,νskb. out(⟨pk(skA), pk(skB)⟩)·(A(ska, pk(skB)) ∥ B(skb) ∥ B(skb))

(49)

Gathering feasibility conditions

States of the network are tuples (φ, σ, P, θ), where

• φ, σ, P are as before

• θ is a constraint: equalities, disequalities, and computational constraints of the form φ ▷ u

(φ, σ, in(x).P, θ)  -->  (φ, σ, P, θ ∧ φ ▷ x)

(φ, σ, if EQ(s,t) then P else Q, θ)  -->  (φ, σ, P, θ ∧ EQ(s,t))

(φ, σ, if EQ(s,t) then P else Q, θ)  -->  (φ, σ, Q, θ ∧ ¬EQ(s,t))


(53)

Consequences

Advantages

• A finite transition system (regardless of the model)

• Confidentiality reduces to constraint satisfaction: θ ∧ φ_f ▷ s, which is in NP in the DY model

(54)

Consequences

Computational case

Specify the assumptions on the libraries as impossibility conditions:

⋫ n

S, aenc(n, pk(k), r) ▷ n  ⇒  S ▷ n
S1 ▷ x ∧ S2, x ▷ y  ⇒  S1, S2 ▷ y
S1 ▷ x1 ∧ ... ∧ Sn ▷ xn  ⇒  S1, ..., Sn ▷ f(x1, ..., xn)

S, S1, S2 are finite sets of terms.

Check the constraint satisfiability, together with φ ▷ s and the above axioms (in PTIME!).


(56)

Exercise

Back to the simple handshake protocol. Study its security in the computational model, assuming the properties of the cryptographic libraries that are described in the lecture.

(57)

Adding events to the process calculus

syntax

Eva, Evb, ... is a set of event symbols.

P ::= 0                          null process (stalled)
    | in(x).P                    input of x (binds x)
    | out(t).P                   output of t
    | if EQ(u,v) then P else P   conditional branching
    | let y = u in P             evaluation
    | νn.P                       random generation
    | P ∥ P                      parallel composition
    | Eve(u)·P                   the event Eve is raised with a sequence of values u

(58)

Adding events to the process calculus

semantics

The states now have an event component E: a sequence of event symbols together with values.

E is only modified when an event occurs in the process:

(φ, σ, Eve(u)·P, E)  -->  (φ, σ, P, E·Eve([[u]]^M_σ))

(59)

Example

A(ska, pk(skb)) =
  νn,r. out(aenc(⟨pk(ska), n⟩, pk(skb), r)).
  in(z). let z1 = dec(z, ska) in
  if EQ(z1, n) then Eva(n, pk(skb), pk(ska)) else 0

B(skb) =
  νr'. in(x). let y = dec(x, skb) in
  let y1 = π1(y) in let y2 = π2(y) in
  out(aenc(y2, y1, r')). Evb(y2, pk(skb), y1). 0

∀x,y,z. Eva(x,y,z) ⇒ Evb(x,y,z)


(61)

Agreement property

∀x,y,z. Eva(x,y,z) ⇒ Evb(x,y,z)

More generally:

∀x̄. Eva1(u1), ..., Evak(uk) ⇒ Eva(u)

For every feasible trace ending with an event set E, and for any assignment σ: if Eva1(u1σ), ..., Evak(ukσ) ∈ E, then Eva(uσ) ∈ E.

(62)

Exercise

In the DY model, does the simple handshake protocol satisfy the agreement property?

What happens if

1. we move the event Eva forward?

2. B replies with y2 (instead of the encrypted version)?

(63)

Adding replication in the calculus

P ::= 0                          null process
    | in(x).P                    input of x (binds x)
    | out(t).P                   output of t
    | if EQ(u,v) then P else P   conditional branching
    | let y = u in P             evaluation
    | νn.P                       random generation
    | P ∥ P                      parallel composition
    | Eve(u)·P                   the event Eve is raised with a sequence of values u
    | !P                         replication

Operational semantics:

!P ≡ P ∥ !P


(65)

Example

The handshake protocol

!(νska,skb. out(⟨pk(ska), pk(skb)⟩). (!A(ska, pk(skb)) ∥ !B(skb)))

(66)

Verification in the DY-model

• This is undecidable

• Most popular approach: approximation and translation into Horn clauses

(67)

Translation into Horn clauses

Attacker clauses

Att(a) ⇐
Att(⟨x1, x2⟩) ⇐ Att(x1), Att(x2)
Att(pk(x)) ⇐ Att(x)
Att(aenc(x1, x2, x3)) ⇐ Att(x1), Att(x2), Att(x3)
Att(x1) ⇐ Att(aenc(x1, pk(x2), x3)), Att(x2)
Att(x1) ⇐ Att(⟨x1, x2⟩)
Att(x2) ⇐ Att(⟨x1, x2⟩)

a is a free name.

(68)

Translation into Horn clauses

Protocol clauses

T(P) = [[P]]_{∅,∅}, where [[P]]_{ρ,H} is defined as follows:

[[0]]_{ρ,H} = ∅
[[P ∥ Q]]_{ρ,H} = [[P]]_{ρ,H} ∪ [[Q]]_{ρ,H}
[[!P]]_{ρ,H} = [[P]]_{ρ,H}
[[νn.P]]_{ρ,H} = [[P]]_{ρ⊎{n↦n(H)},H}
[[in(x).P]]_{ρ,H} = [[P]]_{ρ,H⊎{x}}
[[out(s).P]]_{ρ,H} = [[P]]_{ρ,H} ∪ {Att(sρ) ⇐ Att(s1ρ), ..., Att(snρ)}   if H = {s1, ..., sn}
[[if EQ(s,t) then P else Q]]_{ρ,H} = [[P]]_{ρσ,Hσ} ∪ [[Q]]_{ρ,H}   if σ = mgu(s,t)
[[let x = s in P]]_{ρ,H} = [[P]]_{ρσ_{x,s},Hσ_{x,s}}   if σ_{x,s} is defined
[[let x = s in P]]_{ρ,H} = ∅   if σ_{x,s} is undefined

(69)

Examples

B = νr. in(x). let y = dec(x, skb) in let y1 = π1(y) in let y2 = π2(y) in out(aenc(y2, y1, r))

With x = aenc(⟨y1, y2⟩, pk(skB), z):

B = νr. in(aenc(⟨y1, y2⟩, pk(skB), z)). out(aenc(y2, y1, r))

Att(aenc(y2, y1, r(···))) ⇐ Att(aenc(⟨y1, y2⟩, pk(skB), z))

Att(aenc(⟨pk(skA), n⟩, pk(skB), r')) ⇐


(73)

ProVerif

A tool available online, whose main developer is Bruno Blanchet.

• Translate the specification of primitives into a rewriting system + attacker clauses

• Translate the protocol into clauses

• Negate the security goal

• Check for satisfiability (ordered resolution strategy)

Three possible outputs:

1. Unsatisfiable: the protocol is secure

2. Satisfiable: there might be an attack (but the protocol might also be secure)

3. Non-termination

A very successful tool: it works well in practice.

(74)

False attacks

Using twice the same protocol clause (rigid variables!)

A → B: νn1,n2,r1,r2. ⟨aenc(n1, pk(skB), r1), aenc(n2, pk(skB), r2)⟩
B → A: νn,r3. aenc(n, pk(skA), r3)
A → B: νs. ⟨n, senc(s, ⟨n1, n2⟩)⟩

Approximating names

A → B: νn. aenc(n, pk(skB), r)
B → A: νn'. aenc(n', pk(skA), r')
A → B: νs. in(aenc(x, pk(skA), y)). if x = n then out(s) else out(n)


(76)

The determinacy issue

Currently the calculus is non-deterministic, while computational security is probabilistic.

Since security is an asymptotic property, this is harmless when the (symbolic) transition system is finite (for a fixed attacker).

Instead of non-determinism, we let the attacker choose the transition.

Communications use distinct communication channels, which resolve the non-determinism and are chosen by the attacker.


(79)

Extending the calculus with channels

c, c1, ... are channel names.

P ::= 0                          null process (stalled)
    | in(c,x).P                  input of x (binds x)
    | out(c,t).P                 output of t
    | if EQ(u,v) then P else P   conditional branching
    | let y = u in P             evaluation
    | νn.P                       random generation
    | P ∥ P                      parallel composition
    | Eve(u)·P                   the event Eve is raised with a sequence of values u
    | !P                         replication
    | νc.P                       channel generation

(80)

Operational semantics

Only known channels can be observed:

(φ, σ, out(c,u)·P, E, θ)  -->  (φ·[[u]]^M_σ, σ, P, E, θ ∧ φ ▷ c)

Inputs are only possible on the designated channels:

(φ, σ, in(c,x)·P, E, θ)  -->  (φ, σ ⊎ {x ↦ [[u]]^M_σ}, P, E, θ)   if A(φ, γ) = [[c]]^M (γ is the global control state)


(82)

Internal synchronizations

(φ, σ, in(c,x)·P ∥ out(c,u)·Q, E, θ)  -->  (φ, σ ⊎ {x ↦ [[u]]^M_σ}, P ∥ Q, E, θ)

(83)

Restrictions on the processes

We assume that, for any reachable process

in(c1,x1).P1 ∥ ··· ∥ in(ck,xk).Pk ∥ Q

the channels c1, ..., ck are pairwise distinct.

Lemma

Any process P without channels can be translated into a process P' such that P' satisfies the above restriction, and P has the same (DY) operational semantics as P' after forgetting the channels.


(85)

Example

P = in(x).out(t)    Q = in(y).out(u)

Trans(P ∥ Q) = in(cP, x).out(cP, t) ∥ in(cQ, y).out(cQ, u)

Trans(!(P ∥ Q)) = ! in(c_{!(P∥Q)}, ?). νcP,cQ. out(c_{!(P∥Q)}, ⟨cP, cQ⟩). Trans(P ∥ Q)


(87)

Indistinguishability

Tr(M, A, P) = m1, ···, mk, ···

is the (unique) sequence of outputs of P, with attacker A, in model M.

Informally: P1 is indistinguishable from P2, for a (family of) model(s) M, if, for every A1, A2, A3:

A2(Tr(M, A1, P1)) = A3(Tr(M, A1, P1))
iff
A2(Tr(M, A1, P2)) = A3(Tr(M, A1, P2))


(89)

Examples in the DY case (I)

A1 gives (deducible) inputs to the process, and A2, A3 observe some identities on the output.

Example 1

P1 = νn.out(c,n)    P2 = νn'.out(c,n')    P1 ∼ P2

A1 can't input anything, and A2(n) = A3(n) iff A2 = A3.

Example 2

P1 = out(c,ok)    P2 = νn.out(c,n)    P1 ≁ P2

A2 is the identity, A3 computes ok.
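Example 2's distinguishing test can be acted out concretely; a toy Python sketch (representing P1's trace as the constant ok and P2's trace as a fresh random value; both representations are assumptions of this sketch):

```python
# Toy sketch of the Example 2 distinguisher: A2 is the identity, A3 always
# outputs the constant ok; the test A2(tr) == A3(tr) separates P1 from P2.
import secrets

OK = "ok"

def trace_P1():
    return OK                       # P1 = out(c, ok)

def trace_P2():
    return secrets.token_hex(8)     # P2 = nu n. out(c, n): a fresh name

def A2(m):
    return m                        # the identity

def A3(m):
    return OK                       # computes ok, ignoring its input

assert A2(trace_P1()) == A3(trace_P1())   # the test passes on P1
assert A2(trace_P2()) != A3(trace_P2())   # and fails on P2
```

Here the second check even succeeds deterministically, since a 16-character random hex string can never equal the 2-character constant ok.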


(94)

Examples in the DY case (II)

Example 3

P1 = νk.out(pk(k)).out(c, aenc(n1, pk(k), r))
P2 = νk.out(pk(k)).out(c, aenc(n2, pk(k), r))
P1 ≁ P2

Example 4

P1 = in(c,x)·out(c,⟨x,n⟩)    P2 = in(c,x)·out(c,⟨n,x⟩)    P1 ≁ P2


(98)

Results in the DY case

Indistinguishability is decidable in the DY case (for processes without replication). The complexity is unknown.

Verification tool: Apte.

An approximated equivalence is considered in ProVerif (which works with replication).

(99)

Computational indistinguishability (I)

For any PPT A1, A2, A3,

Prob{τ, ρ : A2(Tr(Mc(τ,η), A1, P1)) = A3(Tr(Mc(τ,η), A1, P1))}
− Prob{τ, ρ : A2(Tr(Mc(τ,η), A1, P2)) = A3(Tr(Mc(τ,η), A1, P2))}

is negligible.

Example 1

P1 = νn.out(c,n)    P2 = νn'.out(c,n')    P1 ∼ P2

The difference of probabilities is 0!


(102)

Computational indistinguishability (II)

Example 2

P1 = νk1,r1,r2. out(c, aenc(ok, pk(k1), r1)). out(c, aenc(ok, pk(k1), r2))
P2 = νk1,k2,r1,r2. out(c, aenc(ok, pk(k1), r1)). out(c, aenc(ok, pk(k2), r2))

Whether P1 ∼ P2 depends on assumptions on the crypto libraries.

Example 3

P1 = in(c,x). if x = 0 then out(c,0) else out(c,x)    P2 = in(c,x). out(c,x)

P1 ∼ P2
