
(1)

Probabilistic modeling: a natural way to treat data

Hussein Rappel

University of Luxembourg h.rappel@gmail.com

February 12, 2019

(2)

[Figure: data points in the (x, y) plane, both axes from 1 to 4, with a candidate line of slope a; y = ax + b]

Introduction to Gaussian Processes, Neil Lawrence

(3)

[Figure: data points in the (x, y) plane; both axes from 1 to 4]

(4)

[Figure: data points in the (x, y) plane; both axes from 1 to 4]

(5)

[Figure: data points in the (x, y) plane; both axes from 1 to 4]


(7)

Each point can be written as the model + a corruption:

y₁ = ax₁ + b + ω₁,  y₂ = ax₂ + b + ω₂,  y₃ = ax₃ + b + ω₃

ω is the difference between the real world and the model, and it can be represented by a probability distribution.

We call ω noise!
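As a minimal sketch of this "observation = model + corruption" view, the Python snippet below generates three noisy observations from an assumed line; the slope, intercept and noise scale are illustrative assumptions, not values from the slides.

```python
# Sketch of "observation = model + corruption": y_i = a*x_i + b + omega_i.
# The line parameters and noise scale are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
a, b, s_omega = 1.5, 0.5, 0.3               # assumed slope, intercept, noise std

x = np.array([1.0, 2.0, 3.0])
omega = rng.normal(0.0, s_omega, x.shape)   # the corruption ("noise")
y = a * x + b + omega                       # three noisy observations
print(y)
```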

(8)

What if we have fewer observations than model parameters?

An underdetermined system.

(9)

[Figure: a single data point in the (x, y) plane; both axes from 1 to 4]

How can we fit the line y = ax + b with only one point?

Introduction to Gaussian Processes, Neil Lawrence

(10)

[Figure: a single data point in the (x, y) plane; both axes from 1 to 4]

If b is fixed ⟹ a = (y − b)/x

(11)

[Figure: a single data point in the (x, y) plane; both axes from 1 to 4]

(12)

[Figure: a single data point in the (x, y) plane; both axes from 1 to 4]

b ∼ π₁ ⟹ a ∼ π₂

If b is drawn from a distribution π₁, the single observation induces a distribution π₂ on a.
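A small sketch of this induced distribution: sampling b from an assumed Gaussian π₁ and solving the single-point constraint y = ax + b for a yields samples from π₂. The observed point and the prior are illustrative assumptions.

```python
# Sketch: a prior pi_1 on the intercept b induces a distribution pi_2 on the
# slope a when a single point (x, y) must satisfy y = a*x + b.
# The observed point and the Gaussian prior are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
x_obs, y_obs = 2.0, 3.0                     # the single observed point (assumed)

b_samples = rng.normal(0.0, 1.0, 10_000)    # b ~ pi_1
a_samples = (y_obs - b_samples) / x_obs     # each b fixes one a, so a ~ pi_2
print(a_samples.mean(), a_samples.std())    # here pi_2 is Gaussian as well
```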

(13)

• This is called the Bayesian treatment.

• The model parameters are treated as random variables.

(14)

Bayesian perspective

Original belief + Observations → New belief

(15)

Bayesian formula (inverse probability)

π(x|y) = π(x) × π(y|x) / π(y)

posterior = prior × likelihood / evidence

y := observation
x := parameter
π(x) := original belief (prior)
π(y|x) := given by the mathematical model that relates y to x (likelihood)
π(y) := a constant (evidence)

(16)

Bayesian formula (inverse probability)

π(x |y ) ∝ π(x ) × π(y |x )

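A minimal numerical sketch of this proportionality: on a grid, multiplying the prior by the likelihood pointwise and normalizing gives the posterior without ever writing π(y) in closed form. The prior, the noise model y = x + noise, and the observed value are all assumed for illustration.

```python
# Grid sketch of pi(x|y) ∝ pi(x) × pi(y|x): multiply prior and likelihood
# pointwise, then normalize so the evidence pi(y) is never needed in closed
# form. Prior, noise model and the observed y are illustrative assumptions.
import numpy as np

x = np.linspace(-5.0, 5.0, 1001)            # grid over the parameter
prior = np.exp(-0.5 * x**2)                 # pi(x): standard normal (unnormalized)
y_obs, s = 1.2, 0.5                         # observation and noise std
like = np.exp(-0.5 * ((y_obs - x) / s)**2)  # pi(y|x) for the model y = x + noise
post = prior * like
post /= post.sum() * (x[1] - x[0])          # normalize: this sum plays pi(y)'s role
print(x[np.argmax(post)])                   # posterior mode, ~0.96 here
```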

(17)

Bayesian inference (BI) in computational mechanics

[Figure: stress–strain (σ–ε) axes]

(18)

Linear elasticity

σ = Eε

[Figure: linear stress–strain response with slope E]

(19)

Linear elasticity

y = Eε + ω,  Ω ∼ π_ω(ω)

Capital letters denote random variables.

(20)

Linear elasticity

π_ω(ω) = 1/(√(2π) s_ω) exp(−ω²/(2s_ω²))

[Figure: the bell-shaped noise PDF over ω]

The noise PDF is modeled through a calibration test.
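A sketch of what such a calibration test might look like, assuming repeated measurements of a specimen with a known response; the synthetic readings below stand in for real calibration data.

```python
# Sketch of a calibration test: measure a specimen with a known response
# many times and fit the noise std s_omega to the scatter of the readings.
# The synthetic readings below stand in for real calibration data.
import numpy as np

rng = np.random.default_rng(3)
true_response = 2.0                                   # known reference value
readings = true_response + rng.normal(0.0, 0.2, 50)   # repeated measurements
s_omega = (readings - true_response).std(ddof=1)      # fitted noise std
print(f"estimated s_omega: {s_omega:.3f}")
```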

(21)

Linear elasticity

Bayes’ formula:

π(E|y) = π(E) π(y|E) / π(y) = π(E) π(y|E) / k

π(E|y) ∝ π(E) π(y|E)

(22)

Linear elasticity

y = Eε + ω,  Ω ∼ N(0, s_ω²)

(23)

Linear elasticity

π(y|E) = 1/(√(2π) s_ω) exp(−(y − Eε)²/(2s_ω²))

(24)

Linear elasticity

Posterior:

π(E|y) ∝ exp(−(E − Ē)²/(2s_E²)) exp(−(y − Eε)²/(2s_ω²))
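Since both factors are Gaussian in E, completing the square shows the posterior is Gaussian as well. A worked version of that step, assuming the Gaussian prior E ∼ N(Ē, s_E²) implied above:

```latex
% Completing the square in E (prior mean \bar{E}, prior std s_E):
\begin{aligned}
\pi(E \mid y) &\propto
  \exp\!\Big(-\tfrac{(E-\bar{E})^2}{2 s_E^2}\Big)\,
  \exp\!\Big(-\tfrac{(y-E\epsilon)^2}{2 s_\omega^2}\Big)
  \;\propto\; \exp\!\Big(-\tfrac{(E-\mu_{\text{post}})^2}{2 s_{\text{post}}^2}\Big),\\[4pt]
s_{\text{post}}^2 &= \Big(\tfrac{1}{s_E^2} + \tfrac{\epsilon^2}{s_\omega^2}\Big)^{-1},
\qquad
\mu_{\text{post}} = s_{\text{post}}^2\Big(\tfrac{\bar{E}}{s_E^2} + \tfrac{\epsilon\, y}{s_\omega^2}\Big).
\end{aligned}
```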

(25)

Linear elasticity

• Prediction interval: an estimate of an interval in which a future observation will fall with a certain probability.

• Credible region: a region of the parameter distribution in which the random variable lies with a certain probability (see the sketch below).
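As a sketch, the closed-form Gaussian posterior above yields a 95% credible interval directly; the prior mean and std, strain level, noise std and measurement below are all illustrative assumptions.

```python
# Sketch: a 95% credible interval for E from the Gaussian posterior above.
# Prior (E_bar, s_E), strain eps, noise std s_w and the measurement y are
# all illustrative assumptions, not values from the slides.
import numpy as np

E_bar, s_E = 200.0, 50.0        # prior belief about E
eps, s_w = 0.01, 0.2            # applied strain, calibrated noise std
y = 2.1                         # one measured stress

s_post2 = 1.0 / (1.0 / s_E**2 + eps**2 / s_w**2)         # posterior variance
mu_post = s_post2 * (E_bar / s_E**2 + eps * y / s_w**2)  # posterior mean
half = 1.96 * np.sqrt(s_post2)                           # 95% normal quantile
print(f"95% credible interval for E: [{mu_post - half:.1f}, {mu_post + half:.1f}]")
```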

(26)

Linear elasticity

• Increasing the number of observations/measurements makes us more confident in the identification result.

(27)

Prior effect

• Increasing the number of observations/measurements decreases the effect of the prior, as shown in the sketch below.
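Both effects can be seen in a small sketch: with n independent measurements y_i = Eε_i + ω_i, the posterior precision is 1/s_E² + Σᵢ ε_i²/s_ω², so the posterior narrows and drifts away from a deliberately wrong prior mean as n grows. All numerical values are assumptions.

```python
# Sketch: posterior mean and std of E as measurements accumulate, assuming
# n observations y_i = E*eps_i + omega_i with known noise std s_w.
# All numerical values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
E_true, E_bar, s_E, s_w = 210.0, 150.0, 50.0, 0.2    # note the "wrong" prior mean

eps = np.full(100, 0.01)                             # applied strains
y = E_true * eps + rng.normal(0.0, s_w, eps.size)    # synthetic measurements

for n in (1, 10, 100):
    prec = 1.0 / s_E**2 + np.sum(eps[:n]**2) / s_w**2        # posterior precision
    mu = (E_bar / s_E**2 + eps[:n] @ y[:n] / s_w**2) / prec  # posterior mean
    print(f"n={n:3d}: mean={mu:7.1f}, std={prec**-0.5:5.2f}")
```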

(28)

Conclusion

• Probability is the natural way of dealing with uncertainties/unknowns (what Laplace called "our ignorance").

• From the Bayesian perspective (inverse probability), parameters are treated as random variables.

• The same logic can be used to model other kinds of uncertainties/unknowns, e.g. model uncertainty and material variability.

• In the Bayesian paradigm our assumptions are clearly stated (e.g. the prior and the model).

• As the number of observations/measurements increases, we become more confident in our identification results.


(33)

Acknowledgement

The DRIVEN project has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement No 811099.

(34)

Some references

• P.S. marquis de Laplace, 1902. A philosophical essay on probabilities. Translated by F.W. Truscott and F.L. Emory, Wiley.

• E.T. Jaynes, 2003. Probability theory: The logic of science. Cambridge University Press.

• D.J. MacKay, 2003. Information theory, inference and learning algorithms. Cambridge University Press.

• A. Gelman et al., 2013. Bayesian data analysis. Chapman and Hall/CRC.

• Courses by N. Lawrence. http://inverseprobability.com/

References by the Legato team

• H. Rappel et al., 2018. Bayesian inference to identify parameters in viscoelasticity. Mechanics of Time-Dependent Materials.

• H. Rappel et al., 2018. Identifying elastoplastic parameters with Bayes’ theorem considering output error, input error and model uncertainty. Probabilistic Engineering Mechanics.

• H. Rappel et al., 2019. A tutorial on Bayesian inference to identify material parameters in solid mechanics. Archives of Computational Methods in Engineering.

• H. Rappel and L.A.A. Beex, 2019. Estimating fibres’ material parameter distributions from limited data with the help of Bayesian inference. European Journal of Mechanics - A/Solids.
