
(1)

Low Complexity Regularization of Inverse Problems

Course #1: Inverse Problems

Gabriel Peyré
www.numerical-tours.com

(2)

Overview of the Course

• Course #1: Inverse Problems

• Course #2: Recovery Guarantees

• Course #3: Proximal Splitting Methods

(3)

Overview

• Inverse Problems

• Compressed Sensing

• Sparsity and L1 Regularization

(4)-(5)

Inverse Problems

Recovering $x_0 \in \mathbb{R}^N$ from noisy observations
$$ y = \Phi x_0 + w \in \mathbb{R}^P $$

Examples: inpainting, super-resolution, . . .
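A minimal numerical sketch of this forward model, assuming a generic random $\Phi$ and a sparse $x_0$; all names (`Phi`, `x0`, `w`) and sizes are illustrative, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 400, 100                       # signal size, number of measurements

Phi = rng.standard_normal((P, N))     # a generic linear operator (assumption)
x0 = np.zeros(N)
idx = rng.choice(N, 10, replace=False)
x0[idx] = rng.standard_normal(10)     # an unknown sparse signal to recover

w = 0.05 * rng.standard_normal(P)     # additive noise
y = Phi @ x0 + w                      # observations y = Phi x0 + w
```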

(6)-(8)

Inverse Problems in Medical Imaging

Tomography: $\Phi x = (p_k)_{1 \leq k \leq K}$ (projections along $K$ directions).

Magnetic resonance imaging (MRI): $\Phi x = (\hat{f}(\omega))_{\omega \in \Omega}$ (Fourier samples on a frequency set $\Omega$).

Other examples: MEG, EEG, . . .
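A sketch of the MRI-style operator, assuming a random sampling set $\Omega$ of Fourier frequencies (the set and the 30% ratio are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64
f = rng.standard_normal((n, n))       # stand-in for an image

# Sampling set Omega: keep ~30% of the 2-D Fourier coefficients (assumption)
Omega = rng.random((n, n)) < 0.3
f_hat = np.fft.fft2(f, norm="ortho")
y = f_hat[Omega]                      # observations (f_hat(omega))_{omega in Omega}
```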

(9)-(13)

Inverse Problem Regularization

Observations: $y = \Phi x_0 + w \in \mathbb{R}^P$.

Estimator: $x(y)$ depends only on the observations $y$ and a parameter $\lambda$.

Example: variational methods
$$ x(y) \in \underset{x \in \mathbb{R}^N}{\operatorname{argmin}} \; \frac{1}{2} \| y - \Phi x \|^2 + \lambda J(x) $$
(data fidelity + regularity)

Choice of $\lambda$: tradeoff between the noise level $\|w\|$ and the regularity $J(x_0)$ of $x_0$.

No noise: $\lambda \to 0^+$, minimize
$$ x^\star \in \underset{x \in \mathbb{R}^N, \; \Phi x = y}{\operatorname{argmin}} \; J(x) $$

This course: performance analysis, fast computational schemes.
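For intuition, a sketch with the quadratic prior $J(x) = \frac{1}{2}\|x\|^2$ (an illustrative choice, not the sparsity priors of this course), for which the variational estimator has a closed form:

```python
import numpy as np

def ridge_estimator(Phi, y, lam):
    """Solve min_x 0.5*||y - Phi x||^2 + lam * 0.5*||x||^2 via the normal equations."""
    N = Phi.shape[1]
    return np.linalg.solve(Phi.T @ Phi + lam * np.eye(N), Phi.T @ y)

# Usage: x_hat = ridge_estimator(Phi, y, lam=0.1)
```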

(14)

Overview

• Inverse Problems

• Compressed Sensing

• Sparsity and L1 Regularization

(15)-(17)

Compressed Sensing

Hardware [Rice Univ.]: $N$ micro-mirrors, $P$ measurements
$$ y[i] = \langle x_0, \varphi_i \rangle $$

[Figure: reconstructions $\tilde{x}_0$ at sampling ratios $P/N = 1$, $P/N = 0.16$, $P/N = 0.02$]

(18)-(19)

CS Acquisition Model

CS is about designing hardware: input signals $\tilde{x}_0 \in L^2(\mathbb{R}^2)$.

Physical hardware resolution limit: target resolution $x_0 \in \mathbb{R}^N$.

[Diagram: $\tilde{x}_0 \in L^2$ →(micro-mirrors array, resolution)→ $x_0 \in \mathbb{R}^N$ →(CS hardware)→ $y \in \mathbb{R}^P$]

Operator $\Phi$: the measurements $y[i] = \langle x_0, \varphi_i \rangle$ stack into $y = \Phi x_0$, with rows $\varphi_i$.
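A sketch of this acquisition, assuming random ±1 mirror patterns for the $\varphi_i$ (a common compressed-sensing choice, not necessarily the actual hardware's patterns):

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 1024, 160                           # mirrors, measurements (P/N ~= 0.16)

# Each row phi_i is one +/-1 mirror configuration (illustrative assumption)
Phi = rng.choice([-1.0, 1.0], size=(P, N))
x0 = rng.standard_normal(N)                # scene discretized at target resolution
y = Phi @ x0                               # y[i] = <x0, phi_i>
```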

(20)

Overview

• Inverse Problems

• Compressed Sensing

• Sparsity and L1 Regularization

(21)-(25)

Redundant Dictionaries

Dictionary $\Psi = (\psi_m)_m \in \mathbb{R}^{Q \times N}$, $N \geq Q$.

Fourier: $\psi_m = e^{i \omega_m \cdot}$, where $m$ indexes the frequency $\omega_m$.

Wavelets: $\psi_m = \psi(2^{-j} R_\theta x - n)$, $m = (j, \theta, n)$: scale $j$, orientation $\theta$, position $n$.
[Figure: wavelet atoms at orientations $\theta = 1$, $\theta = 2$]

Also: DCT, curvelets, bandlets, . . .

Synthesis: $f = \sum_m x_m \psi_m = \Psi x$.
[Figure: image $f = \Psi x$ and its coefficients $x$]
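A minimal synthesis example, using an orthogonal DCT as the dictionary $\Psi$ (so $Q = N$ here; a truly redundant dictionary would need an explicit matrix instead):

```python
import numpy as np
from scipy.fft import dct, idct

N = 256
x = np.zeros(N)
x[[3, 17, 42]] = [1.0, -0.5, 2.0]    # a few active coefficients x_m

f = idct(x, norm="ortho")            # synthesis f = sum_m x_m psi_m = Psi x
x_back = dct(f, norm="ortho")        # analysis; recovers x since Psi is orthogonal
assert np.allclose(x, x_back)
```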

(26)-(29)

Sparse Priors

Ideal sparsity: for most $m$, $x_m = 0$.
$$ J_0(x) = \# \{ m \;\backslash\; x_m \neq 0 \} $$
[Figure: image $f_0$ and its coefficients $x$]

Sparse approximation: $f = \Psi x$ where
$$ x \in \underset{x \in \mathbb{R}^N}{\operatorname{argmin}} \; \| f_0 - \Psi x \|^2 + T^2 J_0(x) $$

Orthogonal $\Psi$: $\Psi \Psi^* = \Psi^* \Psi = \mathrm{Id}_N$, and the solution is hard thresholding, $f = S_T(f_0)$:
$$ x_m = \begin{cases} \langle f_0, \psi_m \rangle & \text{if } |\langle f_0, \psi_m \rangle| > T, \\ 0 & \text{otherwise.} \end{cases} $$

Non-orthogonal $\Psi$: NP-hard.
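A sketch of $S_T$ in an orthogonal dictionary, again using the DCT for $\Psi$ (an assumption; any orthobasis works the same way):

```python
import numpy as np
from scipy.fft import dct, idct

def hard_threshold_approx(f0, T):
    """f = S_T(f0): keep only the DCT coefficients with |<f0, psi_m>| > T."""
    x = dct(f0, norm="ortho")        # analysis coefficients x_m = <f0, psi_m>
    x[np.abs(x) <= T] = 0.0          # hard thresholding at level T
    return idct(x, norm="ortho")     # synthesis f = Psi x
```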

(30)-(32)

Convex Relaxation: L1 Prior

Image with 2 pixels $(x_1, x_2)$:
$J_0(x) = 0$: null image; $J_0(x) = 1$: sparse image; $J_0(x) = 2$: non-sparse image.

$\ell^q$ priors (convex for $q \geq 1$):
$$ J_q(x) = \sum_m |x_m|^q $$
[Figure: level sets of $J_q$ for $q = 0,\ 1/2,\ 1,\ 3/2,\ 2$]

Sparse $\ell^1$ prior:
$$ J_1(x) = \sum_m |x_m| $$
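A direct transcription of these priors (with $J_0$ handled separately, since $|0|^0$ evaluates to 1 and would wrongly count zeros):

```python
import numpy as np

def J0(x):
    """Counting prior: number of nonzero coefficients."""
    return np.count_nonzero(x)

def Jq(x, q):
    """ell^q prior sum_m |x_m|^q (convex only for q >= 1)."""
    return np.sum(np.abs(x) ** q)

x = np.array([0.0, 3.0])              # a sparse 2-pixel image
print(J0(x), Jq(x, 1.0), Jq(x, 2.0))  # -> 1 3.0 9.0
```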

(33)-(37)

L1 Regularization

coefficients $x_0 \in \mathbb{R}^N$ $\xrightarrow{\Psi}$ image $f_0 = \Psi x_0 \in \mathbb{R}^Q$ $\xrightarrow{K}$ observations $y = K f_0 + w \in \mathbb{R}^P$

$$ \Phi = K \Psi \in \mathbb{R}^{P \times N} $$

Sparse recovery: $f^\star = \Psi x^\star$ where $x^\star$ solves
$$ \min_{x \in \mathbb{R}^N} \; \frac{1}{2} \| y - \Phi x \|^2 + \lambda \| x \|_1 $$
(data fidelity + regularization)
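A sketch of this pipeline, assuming an inpainting-style $K$ (restriction to a random half of the pixels) and a DCT $\Psi$; both choices are illustrative:

```python
import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(3)
Q = 128
kept = rng.random(Q) < 0.5                 # K: restriction to observed pixels (assumption)

Psi = lambda x: idct(x, norm="ortho")      # synthesis: coefficients -> image
Phi = lambda x: Psi(x)[kept]               # Phi = K Psi : R^N -> R^P

x0 = np.zeros(Q)
x0[[2, 9, 30]] = [1.5, -1.0, 0.7]          # sparse coefficients
y = Phi(x0) + 0.01 * rng.standard_normal(kept.sum())   # y = K f0 + w
```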

(38)-(40)

Noiseless Sparse Regularization

Noiseless measurements: $y = \Phi x_0$.

$\ell^1$ recovery:
$$ x^\star \in \underset{\Phi x = y}{\operatorname{argmin}} \; \sum_m |x_m| $$
[Figure: the $\ell^1$ ball meets the affine set $\Phi x = y$ at a sparse $x^\star$]

Compare $\ell^2$:
$$ x^\star \in \underset{\Phi x = y}{\operatorname{argmin}} \; \sum_m |x_m|^2 $$
[Figure: the $\ell^2$ ball meets $\Phi x = y$ away from the axes]

Convex problem, recast as a linear program.
Interior points, cf. [Chen, Donoho, Saunders] "basis pursuit".
Douglas-Rachford splitting, see [Combettes, Pesquet].
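A sketch of the linear-program reformulation (splitting $x = u - v$ with $u, v \geq 0$); `basis_pursuit` is a hypothetical helper name, and scipy's generic LP solver stands in for the interior-point methods cited above:

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(Phi, y):
    """min ||x||_1 s.t. Phi x = y, as an LP via the split x = u - v, u, v >= 0."""
    P, N = Phi.shape
    c = np.ones(2 * N)                  # sum(u) + sum(v) equals ||x||_1 at the optimum
    A_eq = np.hstack([Phi, -Phi])       # Phi u - Phi v = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
    return res.x[:N] - res.x[N:]
```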

(41)-(43)

Noisy Sparse Regularization

Noisy measurements: $y = \Phi x_0 + w$.

$$ x^\star \in \underset{x \in \mathbb{R}^N}{\operatorname{argmin}} \; \frac{1}{2} \| y - \Phi x \|^2 + \lambda \| x \|_1 $$
(data fidelity + regularization)

Equivalence with the constrained form: for a suitable correspondence $\lambda \leftrightarrow \varepsilon$,
$$ x^\star \in \underset{\| \Phi x - y \| \leq \varepsilon}{\operatorname{argmin}} \; \| x \|_1 $$
[Figure: the $\ell^1$ ball meets the tube $\| \Phi x - y \| \leq \varepsilon$]

Algorithms: iterative soft thresholding / forward-backward splitting, see [Daubechies et al], [Pesquet et al]; Nesterov multi-step schemes.
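A sketch of iterative soft thresholding for this problem (a plain forward-backward iteration with fixed step $1/L$; the accelerated Nesterov variants are not shown):

```python
import numpy as np

def ista(Phi, y, lam, n_iter=200):
    """Iterative soft thresholding for min_x 0.5*||y - Phi x||^2 + lam*||x||_1."""
    L = np.linalg.norm(Phi, 2) ** 2         # Lipschitz constant of x -> Phi^T(Phi x - y)
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        z = x - Phi.T @ (Phi @ x - y) / L   # forward (gradient) step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft thresholding
    return x
```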

(44)

Inpainting Problem

Measurements: $y = K f_0 + w$, with
$$ (K f)_i = \begin{cases} 0 & \text{if } i \in \Omega, \\ f_i & \text{if } i \notin \Omega. \end{cases} $$
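The masking operator transcribes directly; here $\Omega$ is assumed to be given as a boolean mask of the missing pixels:

```python
import numpy as np

def K(f, Omega):
    """Inpainting operator: zero out the pixels in the missing set Omega."""
    out = f.copy()
    out[Omega] = 0.0
    return out
```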
