Classification of Hyperspectral Data Using Spectral-Spatial Approaches

Yuliya Tarabalka, Jocelyn Chanussot and Jon Atli Benediktsson

GIPSA-Lab, Grenoble Institute of Technology, France
Department of Electrical and Computer Engineering, University of Iceland, Iceland

May 13, 2010

Hyperspectral image

Every pixel contains a detailed spectrum (>100 spectral bands).
+ More information per pixel → increased capability to distinguish objects
− Dimensionality increases → image analysis becomes more complex

Efficient algorithms for automatic processing are required!

[Figure: hyperspectral data cube (lines × samples × tens or hundreds of spectral bands); each pixel vector x_i is a spectrum over wavelength λ]
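The data structure above can be sketched as a NumPy array; the dimensions below are hypothetical (ROSIS-like), chosen only for illustration:

```python
import numpy as np

# Hypothetical dimensions: a small 100 x 100 scene with 103 bands (ROSIS-like)
lines, samples, bands = 100, 100, 103
cube = np.random.default_rng(0).random((lines, samples, bands))

# The pixel vector x_i at a given spatial position is its full spectrum
x_i = cube[42, 17, :]
print(x_i.shape)  # one value per spectral band
```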

Classification problem: ROSIS image

Spatial resolution: 1.3 m/pixel
Spectral resolution: 103 bands

Task (given ground-truth data): assign every pixel to one of nine classes:
asphalt, meadows, gravel, trees, metal sheets, bare soil, bitumen, bricks, shadows

Classification problem: AVIRIS image

Spatial resolution: 20 m/pixel
Spectral resolution: 200 bands

Task (given ground-truth data): assign every pixel to one of 16 classes:
corn-no till, corn-min till, corn, soybeans-no till, soybeans-min till, soybeans-clean till, alfalfa, grass/pasture, grass/trees, grass/pasture-mowed, hay-windrowed, oats, wheat, woods, bldg-grass-tree-drives, stone-steel towers

Classification approaches: only spectral information

The spectrum of each pixel is analyzed; it is directly accessible. A variety of methods exists; Support Vector Machines (SVM) give good classification results [Camps-Valls05].

[Figure: SVM optimal separating hyperplane between classes y_i = 1 and y_j = −1; margin = 2/∥w∥, support hyperplanes w·x + b = ±1]

[Figure: SVM classification map with the nine classes: asphalt, meadows, gravel, trees, metal sheets, bare soil, bitumen, bricks, shadows]
Overall accuracy = 81.01%
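A minimal sketch of pixelwise (spectral-only) SVM classification, not the authors' exact setup: scikit-learn's `SVC` trained on synthetic pixel spectra (the band count, class means, and image size here are all made up for illustration):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic training set: 200 pixel spectra (B = 50 bands), two classes
b = 50
class1 = rng.normal(0.2, 0.05, (100, b))
class2 = rng.normal(0.8, 0.05, (100, b))
X_train = np.vstack([class1, class2])
y_train = np.array([0] * 100 + [1] * 100)

# Gaussian-kernel SVM, a common choice for hyperspectral pixels
clf = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)

# Classify every pixel of a toy 10x10 image independently (spectral info only)
image = rng.normal(0.2, 0.05, (10, 10, b))
class_map = clf.predict(image.reshape(-1, b)).reshape(10, 10)
print(class_map.shape)
```

Each pixel is classified from its spectrum alone; no spatial context is used, which is exactly the limitation the spectral-spatial approaches address.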

Classification approaches: spectral + spatial information

Information about spatial structures is included, since neighboring pixels are related. Two questions arise: how to define spatial structures, and how to combine spectral and spatial information?

State of the art: approaches for extracting spatial info

1. Closest fixed neighborhoods
Markov Random Fields [Pony00, Jackson02, Farag05]; contextual features [Camps-Valls06]
+ Simplicity
− Imprecision at the borders of regions

2. Morphological and area filtering
Morphological profiles [Dell'Acqua04, Benediktsson05, Fauvel07]; area filtering [Fauvel07]
+ Neighborhoods are adapted to the structures
− Neighborhoods are scale dependent ⇒ imprecision in the spatial info

3. Segmentation map = exhaustive partitioning of the image into homogeneous regions
Extraction and Classification of Homogeneous Objects [Kettig76]
+ Has become a standard spectral-spatial classification technique
− Statistical approach ⇒ not well adapted to hyperspectral data

Recent works:
Multiscale segmentation + SVM classification [Linden07, Huang09]
− Computationally demanding
Marker selection by morphological filtering + watershed [Noyel08]
+ Minimized dependence on the thresholds

Proposed spectral-spatial classification approaches

[Diagram: how to combine spectral and spatial info. Three families of proposed approaches: (1) using closest fixed neighborhoods: SVM- and MRF-based classification; (2) using adaptive neighborhoods derived from unsupervised segmentation (watershed, partitional clustering, hierarchical segmentation), combined with pixelwise classification by majority voting, composite kernels, or object-based classification; (3) using marker-based region growing segmentation: marker selection (using probabilistic results, using ML discriminant, multiple classifier strategies 1-3), then marker-controlled classification (marker-controlled watershed, construction of a minimum spanning forest, high-performance implementations)]

Outline

1 Introduction
2 Classification using segmentation-derived adaptive neighborhoods
   Segmentation
   Spectral-spatial classification
   Concluding discussion
3 Segmentation and classification using automatically selected markers
   Marker selection
   Classification using marker-controlled region growing
   Concluding discussion
4 Conclusions and perspectives

Objective

1. Automatically segment a hyperspectral image into homogeneous regions.
2. Use spectral info + spatial info to classify the image: segmentation + pixelwise classification → classification map.


1. Watershed segmentation

Region growing method, applied to a gradient image:
- A minimum of the gradient = the core of a homogeneous region
- 1 region = the set of pixels connected to 1 local minimum of the gradient
- Watershed lines = edges between adjacent regions
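A minimal single-band sketch of this idea using SciPy's `watershed_ift`: the minima of a synthetic gradient image are labeled and used as seeds, and each seed grows into one region (the image and its two basins are invented for illustration):

```python
import numpy as np
from scipy import ndimage as ndi

# Toy gradient image: two low-valued basins separated by high values
gradient = np.full((40, 40), 100, dtype=np.uint8)
gradient[5:18, 5:35] = 10    # basin 1
gradient[22:35, 5:35] = 10   # basin 2

# Each local minimum of the gradient seeds one region
markers, n = ndi.label(gradient == gradient.min())

# Grow regions from the minima, following increasing gradient values
labels = ndi.watershed_ift(gradient, markers.astype(np.int32))
print(n)  # number of catchment basins found
```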

1. Watershed segmentation for hyperspectral images

From a B-band image to a 1-band segmentation map, several strategies are possible:
- Feature extraction (PCA, ICA, ...): B bands → 1 band, followed by a gradient and the watershed
- Vectorial gradient: B bands → 1 gradient band, followed by the watershed
- Compute one gradient per band and combine the B gradients into 1 band, followed by the watershed
- Compute B watershed segmentations and combine the regions into 1 map

Y. Tarabalka, J. Chanussot, and J. A. Benediktsson, "Segmentation and classification of hyperspectral images using watershed transformation," Pattern Recognition, vol. 43, no. 7, pp. 2367-2379, July 2010.

1. Watershed segmentation: results

Original image → Robust Color Morphological Gradient → watershed: 11802 regions. Edges (watershed lines) are assigned to adjacent regions.

Y. Tarabalka, J. Chanussot, and J. A. Benediktsson, "Segmentation and classification of hyperspectral images using watershed transformation," Pattern Recognition, vol. 43, no. 7, pp. 2367-2379, July 2010.

2. Partitional clustering (EM)

1. Clustering: pixels are grouped into C clusters; within each cluster, pixels are assumed to be drawn from a Gaussian distribution, whose parameters are estimated by the EM algorithm.
2. Labeling of connected components: 10 clusters ⇒ 21450 regions (pixels in the same cluster can belong to different spatial regions).
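The two steps above can be sketched with scikit-learn's `GaussianMixture` (EM for a Gaussian mixture) followed by connected-component labeling; the tiny two-class cube below is synthetic and the cluster count is reduced to 2 for illustration:

```python
import numpy as np
from scipy import ndimage as ndi
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Toy "image": 20x20 pixels, B = 5 bands, two spectral classes in two halves
h, w, b = 20, 20, 5
cube = rng.normal(0.0, 0.1, (h, w, b))
cube[:, 10:, :] += 2.0  # right half has a clearly different spectrum

# Step 1: EM clustering of the pixel spectra into C clusters
pixels = cube.reshape(-1, b)
gmm = GaussianMixture(n_components=2, random_state=0).fit(pixels)
cluster_map = gmm.predict(pixels).reshape(h, w)

# Step 2: label connected components within each cluster
regions = np.zeros((h, w), dtype=int)
n_regions = 0
for c in range(2):
    lab, n = ndi.label(cluster_map == c)
    regions[lab > 0] = lab[lab > 0] + n_regions
    n_regions += n

print(n_regions)  # here each cluster is one connected half, so 2 regions
```

In a real image a single cluster usually splits into many connected components, which is why 10 clusters can yield 21450 regions.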

3. Hierarchical SEGmentation (HSEG) [Tilton98]

Region growing + spectral clustering. Dissimilarity criterion (DC): the Spectral Angle Mapper (SAM) between the region mean vectors u_i and u_j,

SAM(u_i, u_j) = arccos( (u_i · u_j) / (‖u_i‖_2 ‖u_j‖_2) )

1. Initialize: each pixel is one region.
2. Find DC_min between adjacent regions.
3. Merge adjacent regions with DC = DC_min.
4. Merge non-adjacent regions with DC ≤ DC_min · SpectralClusterWeight (SCW).
5. If not converged, go to step 2.

Results: SCW = 0.0 → 7231 regions; SCW = 0.1 → 7575 regions.
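The SAM criterion above is straightforward to compute; a small sketch (the clip guards against floating-point rounding pushing the cosine slightly outside [−1, 1]):

```python
import numpy as np

def sam(u_i: np.ndarray, u_j: np.ndarray) -> float:
    """Spectral Angle Mapper between two region mean vectors, in radians."""
    cos = np.dot(u_i, u_j) / (np.linalg.norm(u_i) * np.linalg.norm(u_j))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

u = np.array([1.0, 2.0, 3.0])
print(sam(u, u))        # identical spectra: angle ~ 0
print(sam(u, 5.0 * u))  # SAM is invariant to scaling: still ~ 0
```

Scale invariance is the point of SAM as a dissimilarity criterion: two regions with the same spectral shape but different brightness are considered similar.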


Spectral-spatial classification: majority vote

The hyperspectral image (B bands) is processed by pixelwise classification and, in parallel, by unsupervised segmentation. Within every region of the segmentation map, all pixels are then assigned to the most frequent class of the pixelwise classification map (majority voting), giving the final classification map.

[Figure: pixelwise classification map (dark blue, white and light grey classes) + segmentation map (3 spatial regions) → classification map after majority vote within the 3 spatial regions]
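The majority-vote combination described above can be sketched in a few lines (the two small maps are invented for illustration):

```python
import numpy as np

def majority_vote(class_map: np.ndarray, seg_map: np.ndarray) -> np.ndarray:
    """Assign every pixel the most frequent class of its segmentation region."""
    out = class_map.copy()
    for region in np.unique(seg_map):
        mask = seg_map == region
        classes, counts = np.unique(class_map[mask], return_counts=True)
        out[mask] = classes[np.argmax(counts)]
    return out

class_map = np.array([[1, 1, 2],
                      [1, 2, 2],
                      [3, 3, 2]])
seg_map = np.array([[1, 1, 2],
                    [1, 1, 2],
                    [3, 3, 2]])
print(majority_vote(class_map, seg_map))
```

Region 1 contains classes {1, 1, 1, 2}, so its stray class-2 pixel is corrected to class 1; this is how segmentation regularizes noisy pixelwise labels.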

Spectral-spatial classification: results

SVM classification:           OA = 81.01%, AA = 88.25%
SVM + watershed:              OA = 85.42%, AA = 91.31%
SVM + partitional clustering: OA = 94.00%, AA = 93.13%
SVM + HSEG (SCW = 0.1):       OA = 93.85%, AA = 97.07%

Classification accuracies (%)

               SVM    +Watersh. +Part.Cl. +HSEG(SCW=0.0) +HSEG(SCW=0.1) EMP    ECHO
Overall Acc.   81.01  85.42     94.00     90.00          93.85          85.22  87.58
Average Acc.   88.25  91.31     93.13     94.15          97.07          90.76  92.16
Kappa Coef. κ  75.86  81.30     91.93     86.86          91.89          80.86  83.90
Asphalt        84.93  93.64     90.10     73.33          94.77          95.36  87.98
Meadows        70.79  75.09     95.99     88.73          89.32          80.33  81.64
Gravel         67.16  66.12     82.26     97.47          96.14          87.61  76.91
Trees          97.77  98.56     85.54     98.45          98.08          98.37  99.31
Metal sheets   99.46  99.91     100       99.10          99.82          99.48  99.91
Bare soil      92.83  97.35     96.72     98.43          99.76          63.72  93.96
Bitumen        90.42  96.23     91.85     95.92          100            98.87  92.97
Bricks         92.78  97.92     98.34     98.81          99.29          95.41  97.35
Shadows        98.11  96.98     97.36     97.11          96.48          97.68  99.37


Concluding discussion

1. Spectral-spatial classification improves accuracies compared to pixelwise classification.
2. Several segmentation techniques were investigated.
3. The HSEG segmentation map leads to the best classification.
4. The obtained classification accuracies exceed all previous results.

However...

Unsupervised segmentation

Unsupervised segmentation = exhaustive partitioning into homogeneous regions. How to define a measure of homogeneity?

Hierarchical SEGmentation (HSEG, [Tilton98]):
[Figure: the same image segmented into 2381, 1389 and 280 regions, ranging from oversegmentation to undersegmentation]

Oversegmentation was preferred in previous works, so as not to lose objects.

Watershed segmentation

Original image → Robust Color Morphological Gradient → watershed: 1277 regions. Severe oversegmentation: every local minimum of the gradient yields one region.

Marker-controlled segmentation

Oversegmentation can be reduced by incorporating additional knowledge into the segmentation. We propose to use markers: determine a marker for each region of interest, then segment the image so that one region in the segmentation map corresponds to one marker.

Objective

1. Determine markers automatically, using probabilistic classification results.
2. Use marker-controlled region growing to segment and classify the hyperspectral image.


Marker selection using probabilistic SVM

Input: a B-band hyperspectral image X = {x_j ∈ R^B, j = 1, 2, ..., n}, B ~ 100.

Pipeline: hyperspectral image (B bands) → probabilistic pixelwise classification → (classification map, probability map) → selection of the most reliably classified pixels → map of markers → marker-controlled region growing → segmentation map + classification map.

The SVM classifier* is well suited for hyperspectral images. Its output:
- a classification map
- a probability map: a probability estimate for each pixel to belong to the assigned class

Analysis of the classification and probability maps:
1. Perform connected-component labeling of the classification map.
2. Analyze each connected component:
- If it is large (> 20 pixels), use the P% (5%) of its pixels with the highest probabilities as a marker; a large component must contain a marker.
- If it is small, use its pixels with probabilities > T% (90%) as a marker; a small component has a marker only if it is classified very reliably.

Each connected component yields 1 or 0 markers (here, 2250 connected components → map of 107 markers). A marker is not necessarily a connected set of pixels. Each marker has a class label.

*C. Chang and C. Lin, "LIBSVM: A library for Support Vector Machines," software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm, 2001.
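The selection rule above can be sketched as follows; the thresholds (20 pixels, P = 5%, T = 90%) are the slide's values, while the maps, the function name, and the top-5% selection via a quantile are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage as ndi

def select_markers(class_map, prob_map, min_size=20, p=0.05, t=0.90):
    """Marker map: 0 = no marker, otherwise the class label of the marker."""
    markers = np.zeros_like(class_map)
    for cls in np.unique(class_map):
        comp, n = ndi.label(class_map == cls)  # connected components of this class
        for r in range(1, n + 1):
            mask = comp == r
            if mask.sum() > min_size:
                # large component: keep the p fraction of most reliable pixels
                thresh = np.quantile(prob_map[mask], 1.0 - p)
                sel = mask & (prob_map >= thresh)
            else:
                # small component: keep only very reliable pixels (possibly none)
                sel = mask & (prob_map > t)
            markers[sel] = cls
    return markers

rng = np.random.default_rng(1)
class_map = np.ones((10, 10), dtype=int)
class_map[0:2, 0:2] = 2                       # a small 4-pixel component
prob_map = rng.uniform(0.5, 0.89, (10, 10))   # no probability above T = 0.90

m = select_markers(class_map, prob_map)
print((m == 2).sum())  # small, unreliable component: no marker
```

Note that the selected pixels of a large component need not be spatially connected, matching the remark above.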

Multiple classifier approach for marker selection

The previous method depends strongly on the performance of the selected probabilistic classifier. Objective: mitigate this dependence by using multiple classifiers.

Multiple classifier marker selection:
1. Classify the image by several independent classifiers.
2. Pixels assigned by all the classifiers to the same class ⇒ map of markers.
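The agreement rule in step 2 is a one-liner over stacked classification maps; the three 3x3 maps below are hypothetical:

```python
import numpy as np

# Classification maps from three independent classifiers (synthetic example)
maps = np.stack([
    np.array([[1, 1, 2], [1, 2, 2], [3, 3, 2]]),
    np.array([[1, 1, 2], [2, 2, 2], [3, 3, 2]]),
    np.array([[1, 2, 2], [1, 2, 2], [3, 3, 2]]),
])

# Marker map: the class label where all classifiers agree, 0 elsewhere
agree = (maps == maps[0]).all(axis=0)
markers = np.where(agree, maps[0], 0)
print(markers)
```

Pixels where the classifiers disagree (here, two of the nine) are left unmarked and will be labeled later by region growing.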

Multiple spectral-spatial classifier marker selection

To take the spatial context into account, each classifier is itself spectral-spatial. The hyperspectral image (B bands) is classified pixelwise, and three segmentation maps are computed: watershed segmentation, segmentation by EM for Gaussian mixture resolving, and RHSEG segmentation. Majority voting within the regions of each segmentation map yields three spectral-spatial classification maps, which are then combined by marker selection into a map of markers.

(75)

Outline

1 Introduction

2 Classification using segmentation-derived adaptive neighborhoods
Segmentation
Spectral-spatial classification
Concluding discussion

3 Segmentation and classification using automatically selected markers
Marker selection
Classification using marker-controlled region growing
Concluding discussion

4 Conclusions and perspectives

(76)


Marker-controlled watershed

(77)

1. Marker-controlled watershed

[Flowchart: hyperspectral image (B bands) → gradient → gradient image; classification → marker selection → map of markers; marker-controlled watershed segmentation → segmentation map + classification map]

(78)

1. Marker-controlled watershed / Gradient

Original hyperspectral image → Robust Color Morphological Gradient*

*Y. Tarabalka, J. Chanussot, and J. A. Benediktsson, “Segmentation and classification of hyperspectral images using watershed transformation,” Pattern Recognition, vol. 43, no. 7, pp.

(79)
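The Robust Color Morphological Gradient discards the most dissimilar pixel pairs of the local window before taking the largest remaining pairwise distance. A small sketch under that reading, with r the number of discarded pairs and Euclidean distance; parameter names are mine, not from the cited paper:

```python
import numpy as np
from itertools import combinations

def rcmg(window_pixels, r=1):
    """Robust Color Morphological Gradient of one local window.
    window_pixels: sequence of B-band pixel vectors; r: pairs to discard."""
    pts = [np.asarray(p, dtype=float) for p in window_pixels]
    for _ in range(r):
        # locate the most dissimilar pair and drop both of its pixels
        i, j = max(combinations(range(len(pts)), 2),
                   key=lambda ij: np.linalg.norm(pts[ij[0]] - pts[ij[1]]))
        pts = [p for k, p in enumerate(pts) if k not in (i, j)]
    return max(np.linalg.norm(a - b) for a, b in combinations(pts, 2))
```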


1. Marker-controlled watershed

1. Transform the gradient f_g so that the markers are its only minima:

Create a marker image:
f_m(x) = 0 if x belongs to a marker, t_max otherwise

Compute the pointwise minimum (f_g + 1) ∧ f_m

Perform minima imposition: morphological reconstruction by erosion of (f_g + 1) ∧ f_m from f_m:
f_gmi = R^ε_{(f_g + 1) ∧ f_m}(f_m)

[Figure: a gradient profile with three local minima; after minima imposition, the marker region is the single minimum]

(81)
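The three operations above translate directly to NumPy; the reconstruction loop iterates erosion-then-maximum until stability. This is a didactic sketch with 4-connectivity assumed, not an optimized implementation:

```python
import numpy as np

def erode4(img):
    """Neighborhood minimum over the 4-connected cross, edges replicated."""
    p = np.pad(img, 1, mode='edge')
    return np.minimum.reduce([p[1:-1, 1:-1], p[:-2, 1:-1], p[2:, 1:-1],
                              p[1:-1, :-2], p[1:-1, 2:]])

def impose_minima(f_g, marker_mask):
    """Filter gradient f_g so the markers become its only regional minima."""
    t_max = float(f_g.max()) + 1.0
    f_m = np.where(marker_mask, 0.0, t_max)      # marker image
    mask = np.minimum(f_g + 1.0, f_m)            # (f_g + 1) ∧ f_m
    rec = f_m.copy()
    while True:                                  # reconstruction by erosion
        nxt = np.maximum(erode4(rec), mask)
        if np.array_equal(nxt, rec):
            return rec                           # f_gmi
        rec = nxt
```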


1. Marker-controlled watershed

1. Transform the gradient f_g → markers are the only minima
2. Apply watershed on the filtered gradient image f_gmi (Vincent and Soille, 1991)
3. Assign every watershed pixel to the spectrally most similar neighboring region
(several minima in the filtered gradient → several regions in the segmentation map)
4. Merge regions belonging to the same marker
5. Class of each marker → class of the corresponding region

(90)
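The watershed-pixel assignment can be sketched as follows, with `sam` any spectral dissimilarity (e.g. the SAM distance) and label 0 marking watershed-line pixels. An illustrative sketch of the rule, not the exact implementation:

```python
import numpy as np

def assign_watershed_pixels(labels, image, sam):
    """labels: 2-D region map with 0 on watershed lines;
    image: (rows, cols, B) cube; sam: dissimilarity between two spectra.
    Each line pixel joins the 4-neighboring region with the closest spectrum."""
    out = labels.copy()
    rows, cols = labels.shape
    for y, x in zip(*np.nonzero(labels == 0)):
        best, best_d = 0, np.inf
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < rows and 0 <= nx < cols and labels[ny, nx] != 0:
                d = sam(image[y, x], image[ny, nx])
                if d < best_d:
                    best, best_d = labels[ny, nx], d
        out[y, x] = best
    return out
```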


Construction of a Minimum Spanning Forest

(91)

2. Construction of a Minimum Spanning Forest (MSF)

[Flowchart: hyperspectral image (B bands) → classification → marker selection → map of markers; construction of a minimum spanning forest; majority voting within connected components → segmentation map + classification map]

(92)

2. Construction of a Minimum Spanning Forest (MSF)

[Figure: 3×3 pixel grid x_1, ..., x_9 of a B-band image with its map of markers]

1) Map an image onto a graph

The weight w_{i,j} indicates the degree of dissimilarity between pixels x_i and x_j. The Spectral Angle Mapper (SAM) distance can be used:

w_{i,j} = SAM(x_i, x_j) = arccos( ∑_{b=1}^{B} x_{ib} x_{jb} / ( [∑_{b=1}^{B} x_{ib}^2]^{1/2} [∑_{b=1}^{B} x_{jb}^2]^{1/2} ) )

(93)
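The SAM distance above translates directly; a minimal sketch, with the arccos argument clamped against floating-point rounding:

```python
import math

def sam(x, y):
    """Spectral Angle Mapper distance between two B-band pixel vectors."""
    dot = sum(a * b for a, b in zip(x, y))
    norm_x = math.sqrt(sum(a * a for a in x))
    norm_y = math.sqrt(sum(b * b for b in y))
    # clamp: rounding can push the ratio slightly outside [-1, 1]
    return math.acos(max(-1.0, min(1.0, dot / (norm_x * norm_y))))
```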


2. Construction of a Minimum Spanning Forest (MSF)

[Figure: image graph with edge weights and two marked vertex groups]

Given a graph G, an MSF F rooted on vertices {r_1, ..., r_m} is:
a non-connected graph without cycles
contains all the vertices of G
consists of connected subgraphs; each subgraph (tree) contains (is rooted on) one root r_i

(98)

2. Construction of a Minimum Spanning Forest (MSF)

2) Add m extra vertices r_i, i = 1, ..., m, corresponding to the m markers

[Figure: modified graph; each root r_i is connected by zero-weight edges to the pixels of its marker]

(99)

2. Construction of a Minimum Spanning Forest (MSF)

[Figure: modified graph rooted on r_1 and r_2, grown edge by edge]

3) Construct an MSF F = (V, E)

Initialization: V = {r_1, r_2, ..., r_m} (roots are in the forest)

1. Choose the edge e_{i,j} of the modified graph with minimal weight such that i ∈ V and j ∉ V
2. V = V ∪ {j}, E = E ∪ {e_{i,j}}

Repeat steps 1 and 2 until every vertex belongs to V

(100)
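Steps 1 and 2 above are a Prim-style growth from the roots; a compact sketch with a binary heap (the adjacency-list input format is my choice, not the slides'):

```python
import heapq

def minimum_spanning_forest(adj, roots):
    """adj: dict vertex -> list of (weight, neighbor); roots: marker vertices.
    Grows the forest edge by edge, always taking the cheapest edge leaving
    the current forest; returns, for each reached vertex, its tree's root."""
    root_of = {r: r for r in roots}
    heap = [(w, r, j) for r in roots for (w, j) in adj[r]]
    heapq.heapify(heap)
    while heap:
        w, i, j = heapq.heappop(heap)
        if j in root_of:
            continue                      # j is already in the forest
        root_of[j] = root_of[i]           # V = V ∪ {j}, E = E ∪ {e_{i,j}}
        for wj, k in adj[j]:
            if k not in root_of:
                heapq.heappush(heap, (wj, j, k))
    return root_of
```

With the zero-weight root edges of step 2, every marker pixel is claimed first, and the resulting trees form the connected components used for the final majority voting.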

