Soft Computing and Evolutionary Algorithms


(1)

Evolutionary Algorithms

Andrea G. B. Tettamanzi

(2)

Contents of the Lectures

• Taxonomy and History;

• Evolutionary Algorithms basics;

• Theoretical Background;

• Outline of the various techniques: plain genetic algorithms, evolutionary programming, evolution strategies, genetic programming;

• Practical implementation issues;

• Evolutionary algorithms and soft computing;

• Selected applications from the biological and medical area;

• Summary and Conclusions.

(3)

Bibliography

Th. Bäck. Evolutionary Algorithms in Theory and Practice. Oxford University Press, 1996

L. Davis. The Handbook of Genetic Algorithms. Van Nostrand Reinhold, 1991

D.B. Fogel. Evolutionary Computation. IEEE Press, 1995

D.E. Goldberg. Genetic Algorithms in Search, Optimization and Machine Learning. Addison-Wesley, 1989

J. Koza. Genetic Programming. MIT Press, 1992

Z. Michalewicz. Genetic Algorithms + Data Structures = Evolution Programs. Springer Verlag, 3rd ed., 1996

H.-P. Schwefel. Evolution and Optimum Seeking. Wiley & Sons, 1995

J. Holland. Adaptation in Natural and Artificial Systems. MIT Press 1995

(4)

Taxonomy

(1)

Stochastic optimization methods:

• Evolutionary Algorithms
  – Genetic Algorithms
  – Evolutionary Programming
  – Evolution Strategies
  – Genetic Programming
• Simulated Annealing
• Taboo Search
• Monte Carlo methods

(5)

Taxonomy

(2)

Distinctive features of Evolutionary Algorithms:

• operate on appropriate encoding of solutions;

• population search;

• no regularity conditions required;

• probabilistic transitions.

(6)

History

(1)

• I. Rechenberg, H.-P. Schwefel (TU Berlin, ’60s)

• John H. Holland (University of Michigan), L. Fogel

• John Koza (Stanford University, ’80s)

(7)

History

(2)

1859 Charles Darwin: inheritance, variation, natural selection

1957 G. E. P. Box: random mutation & selection for optimization

1958 Fraser, Bremermann: computer simulation of evolution

1964 Rechenberg, Schwefel: mutation & selection

1966 Fogel et al.: evolving automata - “evolutionary programming”

1975 Holland: crossover, mutation & selection - “reproductive plan”

1975 De Jong: parameter optimization - “genetic algorithm”

1989 Goldberg: first textbook

1991 Davis: first handbook

1993 Koza: evolving LISP programs - “genetic programming”

(8)

Evolutionary Algorithms Basics

• what an EA is (the Metaphor)

• object problem and fitness

• the Ingredients

• schemata

• implicit parallelism

• the Schema Theorem

• the building blocks hypothesis

• deception

(9)

The Metaphor

EVOLUTION            PROBLEM SOLVING
Environment     ↔    Object problem
Individual      ↔    Candidate solution
Fitness         ↔    Quality

(10)

Object problem and Fitness

genotype c   →M→   solution s = M(c) ∈ S

object problem:  min f(s),  s ∈ S,  with f: S → R

fitness: derived from the objective value f(M(c))

(11)

The Ingredients

[Figure: the population at generation t is transformed into the population at generation t + 1 by selection, recombination / reproduction, and mutation.]

(12)

The Evolutionary Cycle

[Figure: the evolutionary cycle: Population → Selection → Parents → Reproduction (Recombination, Mutation) → Offspring → Replacement → Population.]

(13)

Pseudocode

generation = 0;
SeedPopulation(popSize);        // at random or from a file
while (!TerminationCondition())
{
    generation = generation + 1;
    CalculateFitness();         // ... of new genotypes
    Selection();                // select genotypes that will reproduce
    Crossover(pcross);          // mate pcross of them on average
    Mutation(pmut);             // mutate all the offspring with Bernoulli
                                // probability pmut over genes
}

(14)

A Sample Genetic Algorithm

• The MAXONE problem

• Genotypes are bit strings

• Fitness-proportionate selection

• One-point crossover

• Flip mutation (transcription error)

(15)

The MAXONE Problem

Problem instance: a string of l binary cells, γ ∈ {0, 1}^l.

Objective: maximize the number of ones in the string.

Fitness: $f(\gamma) = \sum_{i=1}^{l} \gamma_i$

(16)

Fitness Proportionate Selection

Implementation: “Roulette Wheel”

Probability of γ being selected:

$P(\gamma) = \frac{f(\gamma)}{\sum_{\gamma'} f(\gamma')}$

(17)

One Point Crossover

[Figure: one-point crossover: two 10-bit parent strings are cut at the same crossover point and exchange their tails, producing two offspring.]

(18)

Mutation

[Figure: flip mutation: each gene of the offspring is flipped independently with probability pmut.]

independent Bernoulli transcription errors
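A minimal, self-contained Python sketch of this sample GA (MAXONE fitness, roulette-wheel selection, one-point crossover, Bernoulli flip mutation). It is an illustration of the operators described on the preceding slides, not code from the lecture; the function names and parameter values (pcross, pmut, population size) are arbitrary choices.

import random

def fitness(genotype):                      # MAXONE: count the ones
    return sum(genotype)

def roulette_selection(population, fitnesses):
    # fitness-proportionate ("roulette wheel") selection of one parent
    total = sum(fitnesses)
    r = random.uniform(0, total)
    cumulative = 0.0
    for individual, f in zip(population, fitnesses):
        cumulative += f
        if r <= cumulative:
            return individual
    return population[-1]

def one_point_crossover(p1, p2, pcross=0.6):
    # with probability pcross, cut both parents at the same point and swap tails
    if random.random() < pcross:
        point = random.randint(1, len(p1) - 1)
        return p1[:point] + p2[point:], p2[:point] + p1[point:]
    return p1[:], p2[:]

def mutate(genotype, pmut=0.01):
    # independent Bernoulli "transcription errors" on every gene
    return [1 - g if random.random() < pmut else g for g in genotype]

def maxone_ga(l=10, pop_size=10, generations=50):
    population = [[random.randint(0, 1) for _ in range(l)] for _ in range(pop_size)]
    for _ in range(generations):
        fitnesses = [fitness(g) for g in population]
        offspring = []
        while len(offspring) < pop_size:
            p1 = roulette_selection(population, fitnesses)
            p2 = roulette_selection(population, fitnesses)
            c1, c2 = one_point_crossover(p1, p2)
            offspring += [mutate(c1), mutate(c2)]
        population = offspring[:pop_size]
    return max(population, key=fitness)

print(maxone_ga())   # prints the best bit string found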

(19)

Example: Selection

0111011011  f = 7  Cf = 7   P = 0.125
1011011101  f = 7  Cf = 14  P = 0.125
1101100010  f = 5  Cf = 19  P = 0.089
0100101100  f = 4  Cf = 23  P = 0.071
1100110011  f = 6  Cf = 29  P = 0.107
1111001000  f = 5  Cf = 34  P = 0.089
0110001010  f = 4  Cf = 38  P = 0.071
1101011011  f = 7  Cf = 45  P = 0.125
0110110000  f = 4  Cf = 49  P = 0.071
0011111101  f = 7  Cf = 56  P = 0.125

Random sequence: 43, 1, 19, 35, 15, 22, 24, 38, 44, 2

(20)

Example: Recombination & Mutation

0111011011  →  0111011011  →  0111111011  f = 8
0111011011  →  0111011011  →  0111011011  f = 7
110|1100010 →  1100101100  →  1100101100  f = 5
010|0101100 →  0101100010  →  0101100010  f = 4
1|100110011 →  1100110011  →  1100110011  f = 6
1|100110011 →  1100110011  →  1000110011  f = 5
0110001010  →  0110001010  →  0110001010  f = 4
1101011011  →  1101011011  →  1101011011  f = 7
011000|1010 →  0110001011  →  0110001011  f = 5
110101|1011 →  1101011010  →  1101011010  f = 6

TOTAL = 57

(21)

Schemata

Don’t care symbol: *

Example schema: S = * 1 0 * 1 *

A schema S matches 2^(l − o(S)) strings.

A string of length l is matched by 2^l schemata.

Order of a schema: o(S) = # fixed positions

Defining length: δ(S) = distance between first and last fixed position
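A small Python sketch of the schema notions just defined (order, defining length, matching); the helper names are illustrative.

def order(schema):
    # o(S): number of fixed (non-*) positions
    return sum(1 for c in schema if c != '*')

def defining_length(schema):
    # delta(S): distance between first and last fixed position
    fixed = [i for i, c in enumerate(schema) if c != '*']
    return fixed[-1] - fixed[0] if fixed else 0

def matches(schema, string):
    # a schema matches a string if they agree on every fixed position
    return all(s == '*' or s == c for s, c in zip(schema, string))

S = "*10*1*"
print(order(S), defining_length(S))   # 3, 3
print(2 ** (len(S) - order(S)))       # strings matched by S: 2^(l - o(S)) = 8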

(22)

Implicit Parallelism

In a population of n individuals of length l:

2^l ≤ # schemata processed ≤ n · 2^l,

about n³ of which are processed usefully (Holland 1989), i.e. are not disrupted by crossover and mutation.

But see Bertoni & Dorigo (1993), “Implicit Parallelism in Genetic Algorithms”, Artificial Intelligence 61(2), pp. 307–314.

(23)

Fitness of a schema

$f(S; x) = \frac{1}{q_x(S)} \sum_{\gamma \in S} q_x(\gamma)\, f(\gamma)$

f(γ): fitness of string γ

q_x(γ): fraction of strings equal to γ in population x

q_x(S): fraction of strings matched by S in population x

(24)

The Schema Theorem

{X_t}, t = 0, 1, ...: populations at times t.

Suppose that $\frac{f(S; X_t)}{\bar{f}(X_t)} \geq c$, with c constant over time; then

$E[q_{X_t}(S) \mid X_0] \;\geq\; q_{X_0}(S) \left[ c \left( 1 - p_{cross} \frac{\delta(S)}{l - 1} \right) (1 - p_{mut})^{o(S)} \right]^{t}$

i.e. above-average individuals increase exponentially!

(25)

The Schema Theorem (proof)

$E[q_{X_t}(S) \mid X_{t-1}] \;\geq\; q_{X_{t-1}}(S)\, \frac{f(S; X_{t-1})}{\bar{f}(X_{t-1})}\, P_{surv}[S] \;\geq\; q_{X_{t-1}}(S)\, c\, P_{surv}[S]$

$P_{surv}[S] \;\geq\; \left( 1 - p_{cross} \frac{\delta(S)}{l - 1} \right) (1 - p_{mut})^{o(S)}$

(26)

The Building Blocks Hypothesis

“An evolutionary algorithm seeks near-optimal performance through the juxtaposition of short, low-order, high-performance schemata — the building blocks.”

(27)

Deception

i.e. when the building-block hypothesis does not hold: for some schema S,

γ* ∈ S   but   f(S) < f(S′)

Example:

γ* = 1111111111
S1 = 111*******
S2 = ********11
S  = 111*****11
S′ = 000*****00

(28)

Remedies to deception

• Prior knowledge of the objective function → non-deceptive encoding

• Inversion → semantics of genes not positional

• “Messy Genetic Algorithms” → underspecification & overspecification

(29)

Theoretical Background

• Theory of random processes;

• Convergence in probability;

• Open question: rate of convergence.

(30)

Events

[Figure: a sample space Ω containing events A, B and D.]

(31)

Random Variables

A random variable is a function X: Ω → R, ω ↦ X(ω); for example, (X ≥ 0) = {ω ∈ Ω : X(ω) ≥ 0} is an event.

(32)

Stochastic Processes

A sequence of r.v.’s X_1, X_2, ..., X_t, ..., each with its own probability distribution.

Notation: {X_t(ω)}, t = 0, 1, 2, ...

(33)

EAs as Random Processes

probability space (Ω, F, P)

a population of size n is a sample x ∈ Γ^n (an n-tuple of genotypes)

evolutionary process: {X_t(ω)}, t = 0, 1, 2, ...

ω ∈ Ω plays the role of the “random numbers”; each ω determines one trajectory of the evolutionary process.

(34)

Markov Chains

A stochastic process {X_t(ω)}, t = 0, 1, ... is a Markov chain iff, for all t,

$P[X_t = x \mid X_0, X_1, \ldots, X_{t-1}] = P[X_t = x \mid X_{t-1}]$

[Figure: a three-state Markov chain on states A, B, C, with transition probabilities 0.4, 0.6, 0.3, 0.7, 0.25, 0.75 on its arcs.]
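A short Python sketch of simulating a finite Markov chain such as the three-state example in the figure; the transition matrix below is an assumed arrangement of the probabilities shown, since the figure does not say which arc carries which value.

import random

# assumed transition probabilities between states A, B, C (each row sums to 1)
P = {
    "A": {"A": 0.0, "B": 0.4, "C": 0.6},
    "B": {"A": 0.3, "B": 0.0, "C": 0.7},
    "C": {"A": 0.25, "B": 0.75, "C": 0.0},
}

def step(state):
    # the next state depends only on the current state (Markov property)
    r = random.random()
    cumulative = 0.0
    for next_state, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return next_state
    return state

state = "A"
trajectory = [state]
for _ in range(10):
    state = step(state)
    trajectory.append(state)
print(trajectory)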

(35)

Abstract Evolutionary Algorithm

select: (n)

cross: 

mutate: 

mate: 

insert: 

Xt

Xt+1 select

select

cross mate

insert mutate

Stochastic functions:

 

X

t1

( )   T

t

( )  X

t

( ) 

Transition function:

(36)

Convergence to Optimum

Theorem: if {X_t(ω)}, t = 0, 1, ... is monotone and homogeneous, x_0 is given, and from every y ∈ reach(x_0) the set O ⊆ Γ^n of optimal populations is reachable, then

$\lim_{t \to \infty} P[X_t \in O \mid X_0 = x_0] = 1.$

Theorem: if select and mutate are generous, the neighborhood structure is connective, and the transition functions T_t(ω), t = 0, 1, ... are i.i.d. and elitist, then

$\lim_{t \to \infty} P[X_t \in O] = 1.$

(37)

Outline of various techniques

• Plain Genetic Algorithms

• Evolutionary Programming

• Evolution Strategies

• Genetic Programming

(38)

Plain Genetic Algorithms

• Individuals are bit strings

• Mutation as transcription error

• Recombination is crossover

• Fitness proportionate selection

(39)

Evolutionary Programming

• Individuals are finite-state automata

• Used to solve prediction tasks

• State-transition table modified by uniform random mutation

• No recombination

• Fitness depends on the number of correct predictions

• Truncation selection

(40)

Evolutionary Programming: Individuals

Finite-state automaton: (Q, q0, A, Σ, δ, λ)

• set of states Q;

• initial state q0;

• set of accepting states A;

• alphabet of symbols Σ;

• transition function δ: Q × Σ → Q;

• output mapping function λ: Q × Σ → Σ.

[Figure: an example automaton with states q0, q1, q2 over the alphabet {a, b, c}, shown both as a state diagram (arcs labelled input/output, e.g. b/c, a/b, c/a) and as its state-transition / output table.]
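A minimal Python sketch of how such an automaton can be represented and mutated in evolutionary programming; the three-state machine below is generated at random (not the one in the figure), and the mutation shown is a uniform random change of one transition- or output-table entry.

import random

STATES = ["q0", "q1", "q2"]
ALPHABET = ["a", "b", "c"]

def random_automaton():
    # transition table delta[(state, symbol)] -> state, output table lam[(state, symbol)] -> symbol
    delta = {(q, s): random.choice(STATES) for q in STATES for s in ALPHABET}
    lam = {(q, s): random.choice(ALPHABET) for q in STATES for s in ALPHABET}
    return {"delta": delta, "lambda": lam, "start": "q0"}

def run(automaton, sequence):
    # feed the observed sequence; collect the automaton's prediction after each symbol
    state, predictions = automaton["start"], []
    for symbol in sequence:
        predictions.append(automaton["lambda"][(state, symbol)])
        state = automaton["delta"][(state, symbol)]
    return predictions

def mutate(automaton):
    # uniform random mutation: change one randomly chosen entry of one table
    table = random.choice(["delta", "lambda"])
    key = random.choice(list(automaton[table]))
    automaton[table][key] = random.choice(STATES if table == "delta" else ALPHABET)
    return automaton

a = random_automaton()
print(run(a, "abcabcab"))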

(41)

Evolutionary Programming: Fitness

[Figure: the automaton reads the observed sequence a b c a b c a b ... and outputs a prediction of the next symbol; the prediction is compared with the actual next symbol (“= b?”) and, if it is correct, the individual’s fitness is incremented: f(individual) = f(individual) + 1.]

(42)

Evolutionary Programming: Selection

Variant of stochastic q-tournament selection:

for each individual γ, draw q opponents γ_1, γ_2, ..., γ_q at random;

score(γ) = #{ i | f(γ) > f(γ_i) }

Order individuals by decreasing score; select the first half (truncation selection).
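A Python sketch of this selection scheme, assuming the fitnesses (e.g. numbers of correct predictions) have already been computed; q opponents are drawn at random for each individual, the score counts how many of them it beats, and the best-scoring half survives.

import random

def truncation_by_tournament(population, fitnesses, q=10):
    scores = []
    for f in fitnesses:
        # draw q opponents at random and count how many this individual beats
        opponents = random.sample(fitnesses, min(q, len(fitnesses)))
        scores.append(sum(1 for g in opponents if f > g))
    # order individuals by decreasing score and keep the first half
    ranked = sorted(zip(scores, population), key=lambda pair: pair[0], reverse=True)
    return [individual for _, individual in ranked[: len(population) // 2]]

# usage: survivors = truncation_by_tournament(automata, [correct_predictions(a) for a in automata])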

(43)

Evolution Strategies

• Individuals are n-dimensional vectors of reals

• Fitness is the objective function

• Mutation distribution can be part of the genotype

(standard deviations and covariances evolve with solutions)

• Multi-parent recombination

• Deterministic selection (truncation selection)

(44)

Evolution Strategies: Individuals

An individual is a triple a = (x, σ, α):

x: candidate solution

σ: standard deviations

α: rotation angles, with

$\alpha_{ij} = \frac{1}{2} \arctan\!\left( \frac{2\,\mathrm{cov}(x_i, x_j)}{\sigma_i^2 - \sigma_j^2} \right)$

(45)

Evolution Strategies: Mutation

$\sigma_i' = \sigma_i \cdot \exp\big(\tau' N(0,1) + \tau N_i(0,1)\big)$

$\alpha_j' = \alpha_j + \beta N_j(0,1)$

$x' = x + N(0, \sigma', \alpha')$

Hans-Paul Schwefel suggests:

$\tau \propto \frac{1}{\sqrt{2\sqrt{n}}}$,   $\tau' \propto \frac{1}{\sqrt{2n}}$,   $\beta \approx 0.0873$ (≈ 5°)

self-adaptation
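A Python sketch of self-adaptive ES mutation restricted to the standard deviations (rotation angles omitted for brevity), using the Schwefel-style settings for τ and τ' given above; numpy is assumed.

import numpy as np

def es_mutate(x, sigma):
    n = len(x)
    tau = 1.0 / np.sqrt(2.0 * np.sqrt(n))    # per-component learning rate
    tau_prime = 1.0 / np.sqrt(2.0 * n)       # global learning rate
    # self-adaptation: mutate the strategy parameters first ...
    global_draw = tau_prime * np.random.normal()
    new_sigma = sigma * np.exp(global_draw + tau * np.random.normal(size=n))
    # ... then mutate the solution with the new standard deviations
    new_x = x + new_sigma * np.random.normal(size=n)
    return new_x, new_sigma

x = np.zeros(10)
sigma = np.ones(10)
x, sigma = es_mutate(x, sigma)
print(x, sigma)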

(46)

Genetic Programming

• Program induction

• LISP (historically), math expressions, machine language, ...

• Applications:

– optimal control;

– planning;

– sequence induction;

– symbolic regression;

– modelling and forecasting;

– symbolic integration and differentiation;

– inverse problems

(47)

Genetic Programming: The Individuals

subset of LISP S-expressions

(OR (AND (NOT d0) (NOT d1)) (AND d0 d1))

[Figure: the corresponding parse tree, with OR at the root, AND and NOT as internal nodes, and the terminals d0, d1 as leaves.]

(48)

Genetic Programming: Initialization

[Figure: examples of randomly generated initial trees of different shapes and depths, built from functions such as OR, AND and NOT.]

(49)

Genetic Programming: Crossover

[Figure: subtree crossover: a random subtree is chosen in each parent (trees built from OR, AND, NOT, d0, d1) and the two subtrees are swapped, yielding two offspring.]

(50)

Genetic Programming: Other Operators

• Mutation: replace a terminal with a subtree

• Permutation: change the order of arguments to a function

• Editing: simplify S-expressions, e.g. (AND X X) → X

• Encapsulation: define a new function using a subtree

• Decimation: throw away most of the population
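A compact Python sketch of GP individuals as nested tuples (mirroring the LISP S-expressions above) and of the subtree crossover of the previous slides; the representation and helper names are illustrative, not the original implementation.

import random

# an individual is a nested tuple, e.g. ("OR", ("AND", ("NOT", "d0"), ("NOT", "d1")), ("AND", "d0", "d1"))

def subtrees(tree, path=()):
    # enumerate (path, subtree) pairs; a path is a sequence of child indices
    yield path, tree
    if isinstance(tree, tuple):
        for i, child in enumerate(tree[1:], start=1):
            yield from subtrees(child, path + (i,))

def replace(tree, path, new_subtree):
    # rebuild the tree with the subtree at `path` replaced
    if not path:
        return new_subtree
    i = path[0]
    return tree[:i] + (replace(tree[i], path[1:], new_subtree),) + tree[i + 1:]

def crossover(parent1, parent2):
    # pick a random subtree in each parent and swap them
    path1, sub1 = random.choice(list(subtrees(parent1)))
    path2, sub2 = random.choice(list(subtrees(parent2)))
    return replace(parent1, path1, sub2), replace(parent2, path2, sub1)

p1 = ("OR", ("AND", ("NOT", "d0"), ("NOT", "d1")), ("AND", "d0", "d1"))
p2 = ("OR", ("OR", "d1", ("NOT", "d0")), ("AND", ("NOT", "d0"), ("NOT", "d1")))
print(crossover(p1, p2))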

(51)

Genetic Programming: Fitness

Fitness cases: j = 1, ..., Ne

“Raw” fitness: $r(\pi) = \sum_{j=1}^{N_e} \big| \mathrm{Output}(\pi, j) - C(j) \big|$

“Standardized” fitness: s(π) ∈ [0, +∞)

“Adjusted” fitness: $a(\pi) = \frac{1}{1 + s(\pi)}$

(52)

Sample Application: Myoelectric Prosthesis Control

• Control of an upper arm prosthesis

• Genetic Programming application

• Recognize thumb flection, extension and abduction patterns

(53)

Prosthesis Control: The Context

human arm → myoelectric signals → (measure) → raw myo-measurements → (preprocess) → myo-signal features → (deduce intentions) → human motion → (map into goal) → robot motion → (convert) → actuator commands → robot arm

2 electrodes; 150 ms

(54)

Prosthesis Control: Terminals

Features for electrodes 1, 2:

• Mean absolute value (MAV)

• Mean absolute value slope (MAVS)

• Number of zero crossings (ZC)

• Number of slope sign changes (SC)

• Waveform length (LEN)

• Average value (AVG)

• Up slope (UP)

• Down slope (DOWN)

Derived terminals: MAV1/MAV2, MAV2/MAV1

Constants: 0.7, 0.8, 0.9, 1.0, 1.1, 1.2, 1.3, 0.01, -1.0

(55)

Prosthesis Control: Function Set

Addition x + y

Subtraction x - y

Multiplication x * y

Division x / y (protected for y=0)

Square root sqrt(|x|)

Sine sin x

Cosine cos x

Tangent tan x (protected for x = π/2)

Natural logarithm ln |x| (protected for x=0)

Common logarithm log |x| (protected for x=0)

Exponential exp x

Power function x ^ y

Reciprocal 1/x (protected for x=0)

Absolute value |x|

Integer or truncate int(x)

Sign sign(x)

(56)

Prosthesis Control: Fitness

$r(\pi) = 100 \cdot \min\big(|\text{abduction} - \text{extension}|,\ |\text{abduction} - \text{flexion}|,\ |\text{extension} - \text{flexion}|\big)$

(the separation between the program outputs for the three motions, relative to their spread)

[Figure: result ranges for type 1, type 2 and type 3 motions, with undefined regions between them; 22 signals per motion.]

(57)

Myoelectric Prosthesis Control Reference

• Jaime J. Fernandez, Kristin A. Farry and John B. Cheatham. “Waveform Recognition Using Genetic Programming: The Myoelectric Signal Recognition Problem”. GP ’96, The MIT Press, pp. 63–71.

(58)

Classifier Systems

(Michigan approach)

individual: a single rule, e.g. IF X = A AND Y = B THEN Z = D

[Figure: the population is a set of such IF ... THEN ... rules, acting together as one classifier.]

The fitness f of a rule is updated at each step from the payoff r it receives when it proposes the correct class, and the payoff is scaled by (1 − g N_R), where N_R is the number of attributes in the antecedent part.

(59)

Practical Implementation Issues

• from elegant academic models to not-so-elegant but robust and efficient real-world applications: evolution programs

• handling constraints

• hybridization

• parallel and distributed algorithms

(60)

Evolution Programs

Slogan:

Genetic Algorithms + Data Structures = Evolution Programs

Key ideas:

• use a data structure as close as possible to the object problem

• write appropriate genetic operators

• ensure that all genotypes correspond to feasible solutions

• ensure that genetic operators preserve feasibility

(61)

Encodings: “Pie” Problems

Each of the genes W, X, Y, Z is an integer in the range 0–255; the share assigned to a variable is its gene value divided by the sum of all gene values.

Example: (W, X, Y, Z) = (128, 32, 90, 20), sum = 270  ⇒  X = 32/270 = 11.85%

(62)

Encodings: “Permutation” Problems

All the representations below encode the same tour, 1 → 2 → 4 → 3 → 8 → 5 → 9 → 6 → 7:

Path Representation: (1, 2, 4, 3, 8, 5, 9, 6, 7)

Adjacency Representation: (2, 4, 8, 3, 9, 7, 1, 5, 6)
(position i holds the city visited immediately after city i)

Ordinal Representation: (1, 1, 2, 1, 4, 1, 3, 1, 1)
(each gene is an index into a shrinking reference list of the cities not yet visited)

Matrix Representation: a binary precedence matrix (element (i, j) = 1 iff city i precedes city j in the tour)

Sorting Representation: (-23, -6, 2, 0, 19, 32, 85, 11, 25)
(cities are visited in order of increasing key)
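A Python sketch converting a tour between the path, adjacency and ordinal representations listed above; applied to the example tour it reproduces the vectors shown (cities are numbered from 1).

def path_to_adjacency(path):
    # adjacency[i-1] = city visited immediately after city i (the tour is a cycle)
    adjacency = [0] * len(path)
    for k, city in enumerate(path):
        adjacency[city - 1] = path[(k + 1) % len(path)]
    return adjacency

def path_to_ordinal(path):
    # ordinal genes are positions in a shrinking reference list (1, 2, ..., n)
    reference = list(range(1, len(path) + 1))
    ordinal = []
    for city in path:
        ordinal.append(reference.index(city) + 1)
        reference.remove(city)
    return ordinal

path = [1, 2, 4, 3, 8, 5, 9, 6, 7]
print(path_to_adjacency(path))   # [2, 4, 8, 3, 9, 7, 1, 5, 6]
print(path_to_ordinal(path))     # [1, 1, 2, 1, 4, 1, 3, 1, 1]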

(63)

Handling Constraints

• Penalty functions

Risk of spending most of the time evaluating unfeasible solutions, sticking with the first feasible solution found, or finding an unfeasible solution that scores better than feasible solutions

• Decoders or repair algorithms

Computationally intensive, tailored to the particular application

• Appropriate data structures and specialized genetic operators

All possible genotypes encode for feasible solutions

(64)

Penalty Functions

$\mathrm{Eval}(c) = f(M(c)) \pm P(z)$,  where z = M(c) and

$P(z) = w(t) \sum_i w_i \varphi_i(z)$

(a weighted, possibly time-varying, sum of constraint-violation measures φ_i)
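A Python sketch of penalized evaluation along the lines sketched above: the decoded solution is scored by the objective, minus a weighted sum of constraint-violation measures (for a maximization problem); the toy constraint, weights and helper names are hypothetical.

def penalized_eval(genotype, decode, objective, violations, weights):
    # Eval(c) = f(M(c)) - P(z), with P(z) a weighted sum of violation measures
    z = decode(genotype)
    penalty = sum(w * v(z) for w, v in zip(weights, violations))
    return objective(z) - penalty

# usage on a hypothetical problem: maximize sum(z) subject to sum(z) <= 5 on bit strings
decode = lambda c: c
objective = sum
violations = [lambda z: max(0, sum(z) - 5)]     # by how much the constraint is exceeded
print(penalized_eval([1] * 8, decode, objective, violations, weights=[10.0]))  # 8 - 10*3 = -22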

(65)

Decoders / Repair Algorithms

[Figure: recombination and mutation act on genotypes; a decoder or repair algorithm maps every genotype into a feasible solution in S.]

(66)

Hybridization

1) Seed the population with solutions provided by some heuristics:
   heuristics → initial population

2) Use local optimization algorithms as genetic operators (Lamarckian mutation)

3) Encode the parameters of a heuristic:
   genotype → heuristic → candidate solution

(67)

Sample Application: Unit Commitment

• Multiobjective optimization problem: cost vs. emission

• Many linear and non-linear constraints

• Traditionally approached with dynamic programming

• Hybrid evolutionary/knowledge-based approach

• A flexible decision support system for planners

• Solution time increases linearly with the problem size

(68)

The Unit Commitment Problem

Cost:

$C_i(P_i) = a_i + b_i P_i + c_i P_i^2$

$z_\$ = \sum_{i=1}^{n} \big[ C_i(P_i) + SU_i + SD_i + HS_i \big]$

Emissions:

$z_E = \sum_{i=1}^{n} E_i(P_i)$,  with  $E_i(P_i) = \sum_{j=1}^{m} E_{ij}(P_i)$  and  $E_{ij}(P_i) = \alpha_{ij} + \beta_{ij} P_i + \gamma_{ij} P_i^2$
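A small Python sketch of the cost and emission objectives just reconstructed, for a hypothetical unit; all coefficient values and the start-up/shut-down/hot-stand-by terms are placeholders, not data from the application.

def fuel_cost(a, b, c, P):
    # C_i(P_i) = a_i + b_i * P_i + c_i * P_i^2
    return a + b * P + c * P * P

def total_cost(units, schedule):
    # z_$ = sum_i [ C_i(P_i) + SU_i + SD_i + HS_i ] over the committed units
    total = 0.0
    for unit, P in zip(units, schedule):
        if P > 0:
            total += fuel_cost(unit["a"], unit["b"], unit["c"], P)
            total += unit["SU"] + unit["SD"] + unit["HS"]
    return total

def total_emission(units, schedule):
    # z_E = sum_i sum_j (alpha_ij + beta_ij * P_i + gamma_ij * P_i^2)
    return sum(
        sum(al + be * P + ga * P * P for al, be, ga in unit["emission_coeffs"])
        for unit, P in zip(units, schedule) if P > 0
    )

units = [{"a": 100, "b": 8.5, "c": 0.002, "SU": 50, "SD": 20, "HS": 10,
          "emission_coeffs": [(1.0, 0.02, 1e-4), (0.5, 0.01, 5e-5)]}]
print(total_cost(units, [200.0]), total_emission(units, [200.0]))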

(69)

Predicted Load Curve

[Figure: predicted load curve over the scheduling horizon, with the spinning-reserve requirement plotted above the load.]

(70)

Unit Commitment: Constraints

• Power balance requirement

• Spinning reserve requirement

• Unit maximum and minimum output limits

• Unit minimum up and down times

• Power rate limits

• Unit initial conditions

• Unit status restrictions

• Plant crew constraints

• ...

(71)

Unit Commitment: Encoding

[Figure: the genotype is a table of output levels (fractions of maximum output, 0.0–1.0) for each unit (Unit 1–4) at each hour (00:00–09:00), together with a Fuzzy Knowledge Base.]

(72)

Unit Commitment: Solution

[Figure: a decoded schedule showing, for each unit (Unit 1–4) and hour (00:00–09:00), its status: up, down, hot-stand-by, starting, or shutting down.]

(73)

Unit Commitment: Selection

competitive selection:

[Figure: two candidate schedules are compared objective by objective: cost ($507,762 vs. $516,511) and emission (213,489 vs. 60,080).]

(74)

Unit Commitment References

• D. Srinivasan, A. Tettamanzi. “An Integrated Framework for Devising Optimum Generation Schedules”. In Proceedings of the 1995 IEEE International Conference on Evolutionary Computing (ICEC ’95), vol. 1, pp. 1–4.

• D. Srinivasan, A. Tettamanzi. “A Heuristic-Guided Evolutionary Approach to Multiobjective Generation Scheduling”. IEE Proceedings Part C – Generation, Transmission, and Distribution, 143(6):553–559, November 1996.

• D. Srinivasan, A. Tettamanzi. “An Evolutionary Algorithm for Evaluation of Emission Compliance Options in View of the Clean Air Act Amendments”. IEEE Transactions on Power Systems, 12(1):336–341, February 1997.

(75)

Parallel Evolutionary Algorithms

• The standard evolutionary algorithm is stated as a sequential procedure...

• ... but evolutionary algorithms are intrinsically parallel

• Several models:

– cellular evolutionary algorithm

– fine-grained parallel evolutionary algorithm (grid)
– coarse-grained parallel evolutionary algorithm (islands)

– sequential evolutionary algorithm with parallel fitness computation (master–slave)

(76)

Terminology

• Panmictic

• Apomictic

(77)

Island Model

(78)

Selected Applications in Biology and Medical Science

• the protein folding problem, i.e. determining the tertiary structure of proteins using evolutionary algorithms;

• quantitative structure-activity relationship modeling for drug design;

• applications to medical diagnosis, like electroencephalogram (EEG) classification and automatic feature detection in medical imagery (PET, CAT, NMR, X-RAY, etc.);

• applications to radiotherapy treatment planning;

• applications to myoelectric prosthesis control.

(79)

Sample Application: Protein Folding

• Finding 3-D geometry of a protein to understand its functionality

• Very difficult: one of the “grand challenge problems”

• Standard GA approach

• Simplified protein model

(80)

Protein Folding: The Problem

• Much of a protein’s function may be derived from its conformation (3-D geometry or “tertiary” structure).

• Magnetic resonance & X-ray crystallography are currently used to view the conformation of a protein:

– expensive in terms of equipment, computation and time;

– require isolation, purification and crystallization of protein.

• Prediction of the final folded conformation of a protein chain has been shown to be NP-hard.

• Current approaches:

– molecular dynamics modelling (brute force simulation);

– statistical prediction;

– hill-climbing search techniques (simulated annealing).

(81)

Protein Folding: Simplified Model

• 90° lattice (6 degrees of freedom at each point);

• Peptides occupy intersections;

• No side chains;

• Hydrophobic or hydrophilic (no relative strengths) amino acids;

• Only hydrophobic/hydrophilic forces considered;

• Adjacency considered only in cardinal directions;

• Cross-chain hydrophobic contacts are the basis for evaluation.

(82)

Protein Folding: Representation

relative move encoding: the genotype is a sequence of moves, e.g. UP, DOWN, FORWARD, LEFT, UP, RIGHT, ..., each interpreted relative to the direction of the previous step;

preference order encoding: each gene is an ordering of the five possible moves, e.g.
(UP, LEFT, RIGHT, DOWN, FORWARD), (DOWN, LEFT, UP, FORWARD, RIGHT), (FORWARD, UP, DOWN, LEFT, RIGHT), (LEFT, DOWN, FORWARD, UP, RIGHT), ...
and at each step the first move in the ordering that does not cause a collision is taken.

(83)

Protein Folding: Fitness

Decode: plot the course encoded by the genotype.

Test each occupied cell:

• any collisions: -2;

• no collisions AND a hydrophobe in an adjacent cell: +1.

Notes:

• for each contact: +2;

• adjacent hydrophobes not discounted in the scoring;

• multiple collisions (>1 peptides in one cell): -2;

• hydrophobe collisions imply an additional penalty (no contacts are scored).
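A simplified Python sketch of decoding and scoring in this spirit: the chain is plotted on a 3-D lattice, cross-chain hydrophobic contacts score +2 and colliding cells score -2; for brevity the genotype here uses absolute lattice moves rather than the relative or preference-order encodings of the previous slide, so it is an assumption-laden illustration rather than the original fitness.

# hydrophobicity pattern: 1 = hydrophobic, 0 = hydrophilic
MOVES = {"U": (0, 1, 0), "D": (0, -1, 0), "L": (-1, 0, 0),
         "R": (1, 0, 0), "F": (0, 0, 1), "B": (0, 0, -1)}

def decode(moves):
    # plot the course encoded by the genotype, starting at the origin
    positions, (x, y, z) = [(0, 0, 0)], (0, 0, 0)
    for m in moves:
        dx, dy, dz = MOVES[m]
        x, y, z = x + dx, y + dy, z + dz
        positions.append((x, y, z))
    return positions

def fitness(moves, hydrophobic):
    positions = decode(moves)
    occupied = {}
    for i, p in enumerate(positions):
        occupied.setdefault(p, []).append(i)
    score = 0
    # -2 for every cell occupied by more than one peptide (collision)
    score -= 2 * sum(1 for residents in occupied.values() if len(residents) > 1)
    # +2 for every hydrophobic contact between non-consecutive residues
    # (the extra rule that colliding hydrophobes score no contacts is omitted here)
    for i, p in enumerate(positions):
        if not hydrophobic[i]:
            continue
        for dx, dy, dz in MOVES.values():
            q = (p[0] + dx, p[1] + dy, p[2] + dz)
            for j in occupied.get(q, []):
                if hydrophobic[j] and j > i + 1:
                    score += 2
    return score

print(fitness("RRUULLD", [1, 0, 1, 1, 0, 1, 1, 1]))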

(84)

Protein Folding: Experiments

• Preference ordering encoding;

• Two-point crossover with a rate of 95%;

• Bit mutation with a rate of 0.1%;

• Population size: 1000 individuals;

• crowding and incest reduction.

• Test sequences with known minimum configuration;

(85)

Protein Folding References

• S. Schulze-Kremer. “Genetic Algorithms for Protein Tertiary Structure Prediction”. PPSN 2, North-Holland 1992.

• R. Unger and J. Moult. “A Genetic Algorithm for 3D Protein Folding Simulations”. ICGA-5, 1993, pp. 581–588.

• Arnold L. Patton, W. F. Punch III and E. D. Goodman. “A Standard GA Approach to Native Protein Conformation Prediction”. ICGA 6, 1995, pp. 574–581.

(86)

Sample Application: Drug Design

Purpose: given a chemical specification (activity), design a tertiary structure complying with it.

Requirement: a quantitative structure-activity relationship model.

Example: design ligands that can bind targets specifically and selectively. Complementary peptides.

(87)

Drug Design: Implementation

individual: a sequence of amino acids (residues), e.g. N L H A F G L F K A

Each residue gene carries:
• a name
• a hydropathic value

Operators:

• Hill-climbing Crossover

• Hill-climbing Mutation

• Reordering

(no explicit selection: the hill-climbing operators perform implicit selection)

(88)

Drug Design: Fitness

a: moving-average hydropathy of the target; b: moving-average hydropathy of the complement:

$a_k = \sum_{i=k-s}^{k+s} h_i$    $b_k = \sum_{i=k-s}^{k+s} g_i$    for k = s, ..., n − s

h_i, g_i: hydropathy of the residues; n: number of residues in the target.

$Q = \frac{\sum_i (a_i + b_i)^2}{n - 2s}$    (lower Q = better complementarity)
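A Python sketch of the complementarity score under the formulas above: windowed hydropathy profiles are computed for the target and a candidate complement, and Q sums the squared deviations of their sum from zero (an ideal complement mirrors the target's profile); the hydropathy table is a partial Kyte-Doolittle-style table and the complement sequence is arbitrary, both for illustration only.

# illustrative (incomplete) hydropathy values
HYDROPATHY = {"A": 1.8, "F": 2.8, "G": -0.4, "I": 4.5, "K": -3.9,
              "L": 3.8, "N": -3.5, "S": -0.8, "V": 4.2, "Y": -1.3}

def windowed_profile(sequence, s):
    # a_k = sum of hydropathies over the window [k - s, k + s]
    h = [HYDROPATHY[r] for r in sequence]
    return [sum(h[k - s:k + s + 1]) for k in range(s, len(h) - s)]

def complementarity(target, complement, s=2):
    a = windowed_profile(target, s)
    b = windowed_profile(complement, s)
    # Q accumulates (a_i + b_i)^2 : lower Q = better complementarity
    return sum((ai + bi) ** 2 for ai, bi in zip(a, b)) / (len(target) - 2 * s)

print(complementarity("FANSGNVYFGIIAL", "LKKSYNAKLNNVAF"))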

(89)

Drug Design: Results

[Figure: hydropathic value (about -6 to 4) vs. amino acid position (0 to 16) for sequence FANSGNVYFGIIAL, comparing the Target, the Fassina complement and the GA-designed complement.]

(90)

Drug Design References

• T. S. Lim. A Genetic Algorithms Approach for Drug Design. MS Dissertation, Oxford University, Computing Laboratory, 1995.

• A. L. Parrill. Evolutionary and Genetic Methods in Drug Design.

Drug Discovery Today, Vol. 1, No. 12, Dec 1996, pp. 514–521.

(91)

Sample Application: Medical Diagnosis

• Classifier Systems application

• Learning by examples

• Lymphography

– 148 examples, 18 attributes, 4 diagnoses

– estimated performance of a human expert: 85% correct

• Prognosis of breast cancer recurrence

– 288 examples, 10 attributes, 2 diagnoses

– performance of human expert unknown

• Location of primary tumor

– 339 examples, 17 attributes, 22 diagnoses

– estimated performance of a human expert: 42% correct

(92)

Medical Diagnosis Results

• Performance indistinguishable from humans

• Performance for breast cancer: about 75%

• In primary tumor, patients with identical symptoms have different diagnoses

• Symbolic (= comprehensible) diagnosis rules

(93)

Medical Diagnosis References

Pierre Bonelli, Alexandre Parodi, “An Efficient Classifier System and its Experimental Comparison with two Representative learning methods on three medical domains”. ICGA 4, pp. 288–295.

Tod A. Sedbrook, Haviland Wright, Richard Wright. “Application of a Genetic Classifier for Patient Triage”. ICGA 4, pp. 334–338.

H. F. Gray, R. J. Maxwell, I. Martínez-Perez, C. Arús, S. Cerdán. “Genetic Programming Classification of Magnetic Resonance Data”. GP ’96, p. 424.

Alejandro Pazos, Julian Dorado, Antonio Santos. “Detection of Patterns in Radiographs using ANN Designed and Trained with GA”. GP ’96, p. 432.

(94)

Sample Application: Radiotherapy Treatment Planning

• X-rays or electron beams for cancer treatment

• Conformal therapy: uniform dose over cancerous regions, spare healthy tissues

• Constrained optimization, inverse problem

• From dose specification to beam intensities

• Constraints:

– beam intensities are positive

– rate of intensity change is limited

• Conflicting objectives: Pareto-optimal set of solutions

(95)

RTP: The Problem

[Figure: beams (x, y, z directions) in the plane of interest through the head, showing the treatment area and an organ at risk.]

TA: dose delivered to the treatment area (goal: TA = 100%)

OAR: dose delivered to organs at risk (constraint: OAR < 20%)

OHT: dose delivered to other healthy tissues (constraint: OHT < 30%)

(96)

RTP: Fitness and Solutions

[Figure: candidate plans A, B, C plotted against the objectives |TA − TA*| and |OAR − OAR*|; the non-dominated plans form the Pareto-optimal set.]

(97)

Radiotherapy Treatment Planning References

• O. C. L. Haas, K. J. Burnham, M. H. Fisher, J. A. Mills. “Genetic Algorithm Applied to Radiotherapy Treatment Planning”.

ICANNGA ‘95, pp. 432–435.

(98)

Evolutionary Algorithms and Soft Computing

[Figure: the soft computing (SC) triangle: EAs, fuzzy logic (FL) and neural networks (NNs), linked by optimization, monitoring and fitness relationships.]

(99)

Soft Computing

• Tolerant of imprecision, uncertainty, and partial truth

• Adaptive

• Methodologies:

– Evolutionary Algorithms
– Neural Networks
– Bayesian and Probabilistic Networks
– Fuzzy Logic

– Rough Sets

• Bio-inspired: Natural Computing

• A Scientific Discipline?

• Methodologies co-operate, do not compete (synergy)

(100)

Artificial Neural Networks

[Figure: a biological neuron (axon, dendrites, synapses) and its artificial counterpart: inputs x1, x2, ..., xn are weighted by w1, w2, ..., wn and combined into the output y.]

(101)

Fuzzy Logic

[Figure: a fuzzy membership function, taking values between 0 and 1.]

(102)

[Figure: the EA–NN edge of the soft computing triangle: optimization and fitness.]

(103)

Neural Network Design and Optimization

• Evolving weights for a network of predefined structure

• Evolving network structure

– direct encoding
– indirect encoding

• Evolving learning rules

• Input data selection

(104)

Evolving the Weights (predefined structure)

[Figure: a small network whose six connection weights 0.2, -0.3, 0.6, -0.5, 0.4, 0.7 are encoded as the real-valued genotype (0.2, -0.3, 0.6, -0.5, 0.4, 0.7).]

(105)

Evolving the Structure: Direct Encoding

[Figure: a network with input nodes 1, 2, 3, hidden nodes 4, 5 and output node 6.]

Connectivity matrix (entry (i, j) = 1 iff there is a connection from node i to node j):

    1 2 3 4 5 6
1   0 0 0 1 1 0
2   0 0 0 1 0 1
3   0 0 0 0 1 0
4   0 0 0 0 0 1
5   0 0 0 0 0 1
6   0 0 0 0 0 0
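A Python sketch of this direct encoding: the connectivity matrix is flattened into a bit-string genotype and decoded back into a (linear) feed-forward network, with random weights standing in for evolved ones; the input/hidden/output interpretation of nodes 1–6 follows the figure.

import random

N = 6  # nodes 1..3: inputs, 4..5: hidden, 6: output

def encode(matrix):
    # flatten the connectivity matrix row by row into a bit-string genotype
    return [bit for row in matrix for bit in row]

def decode(genotype):
    # rebuild the N x N connectivity matrix from the genotype
    return [genotype[i * N:(i + 1) * N] for i in range(N)]

def activate(matrix, weights, inputs):
    # forward pass in node order (linear units for brevity):
    # node j receives sum_i matrix[i][j] * weights[i][j] * out[i]
    out = list(inputs) + [0.0] * (N - len(inputs))
    for j in range(len(inputs), N):
        out[j] = sum(matrix[i][j] * weights[i][j] * out[i] for i in range(N))
    return out[-1]

matrix = [[0, 0, 0, 1, 1, 0],
          [0, 0, 0, 1, 0, 1],
          [0, 0, 0, 0, 1, 0],
          [0, 0, 0, 0, 0, 1],
          [0, 0, 0, 0, 0, 1],
          [0, 0, 0, 0, 0, 0]]
genotype = encode(matrix)
weights = [[random.uniform(-1, 1) for _ in range(N)] for _ in range(N)]
print(decode(genotype) == matrix, activate(decode(genotype), weights, [1.0, 0.5, -0.2]))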

(106)

Evolving Weights and Feed-Forward Structure: Direct Encoding

[Figure: a feed-forward network with layer sizes (3, 2, 3) is encoded directly by its weight matrices W0 (3×3), W1 (3×2), W2 (2×3), W3 (3×1).]

(107)

Evolving Weights and Feed-Forward Structure: Direct Encoding

• Mutation operator:

– neuron removal: delete a column of W_{i-1} and the corresponding row of W_i;
– neuron duplication: copy a column of W_{i-1} and the corresponding row of W_i;
– removal of a layer with a single neuron: collapse W_{i-1} and W_i into their product;
– layer duplication: insert an identity matrix;

• Simplification operator:

– remove neurons whose row in W_i has norm < ε;

• Crossover operator:

– choose two crossover points in the parents;

– swap the tails;

– join the pieces with a new random weight matrix

(108)

Structure Evolution: Direct Encoding

Graph-generating Grammar

[Figure: the start symbol S rewrites into a 2×2 matrix of non-terminals A, B, C, D; each non-terminal rewrites into a 2×2 matrix of terminals (a, ..., e); each terminal finally rewrites into a 2×2 block of 0s and 1s, so that repeated rewriting yields the network connectivity matrix.]

Genotype: (S: A, B, C, D || A: c, d, a, c || B: a, a, a, e || C: a, a, a, a || ... )

(109)

[Figure: the EA–FL edge of the soft computing triangle: optimization and monitoring.]

(110)

Evolutionary Algorithms and Fuzzy Logic

[Figure: three kinds of interaction between an Evolutionary Algorithm and a Fuzzy System: (1) the Evolutionary Algorithm designs/optimizes the Fuzzy System, (2) fuzzy fitness and fuzzy operators are used inside the Evolutionary Algorithm, (3) a Fuzzy Government steers the Evolutionary Algorithm.]

(111)

Fuzzy System Design and Optimization

• Representation

• Genetic operators

• Selection mechanism

• Example: Learning fuzzy classifiers

(112)

Fuzzy Rule-Based Systems

(113)

Representation of a Fuzzy Rulebase

The genotype is made of membership-function genes followed by rule genes:

• membership-function genes: totally overlapping membership functions, defined by the points c1, c2, c3, c4 and encoded as bit strings (e.g. 10011000 11011010);

• rule genes: R1, R2, ..., Rmax, with max = Ndom^Ninput · Noutput; each rule gene takes a value in (0 ... Ndom) (e.g. 00001010) and relates the input fuzzy attributes FA1, FA2, FA3 to the output fuzzy attributes FA1, FA2, FA3.

(114)

A richer representation

• Input membership functions

• Output membership functions

• Rules, e.g.:

IF x is A AND v is B THEN F is C
IF a is D THEN F is E
IF … is G AND x is H THEN F is C

(115)

Initialization

• Input variables: number of domains = 1 + exponential(3); membership functions are placed between min and max (centre C).

• Output variables: number of domains = 2 + exponential(3).

• Rules: number of rules = 2 + exponential(6); each rule has the form IF … is … AND … is … AND … is … AND … is … THEN … is …; for each input variable, flip a coin to decide whether to include it in the antecedent.

• Each membership function is defined by its four points a, b, c, d.

(116)

Recombination

[Figure: recombination exchanges whole rules between two parent rule bases; one offspring inherits IF x is A AND v is B THEN F is C, IF a is D THEN F is E and IF … is G AND x is H THEN F is C from one parent, and other rules (e.g. IF true THEN F is K) from the other.]

A rule takes with it all the referred domains, with their MFs.

(117)

Mutation

• {add, remove, change} domain to {input, output} variable;

• {duplicate, remove} a rule;

• change a rule:

{add, remove, change} a clause in the {antecedent, consequent}

input MF perturbation:

[Figure: the four defining points a, b, c, d of an input membership function are shifted slightly.]

(118)

Example: “Learning fuzzy classifiers”

(119)

Controlling the Evolutionary Process

• Motivation:

– EAs easy to implement

– little specific knowledge required
– long computing time

• Features:

– complex dynamics
– non-binary conditions

– “intuitive” knowledge available

(120)

Knowledge Acquisition

[Figure: the ALGORITHM produces statistics; their visualization yields KNOWLEDGE about the evolutionary process.]

(121)

Fuzzifying Evolutionary Algorithms

• Fuzzy fitness (objective function)

• Fuzzy encoding

• Fuzzy operators

– recombination
– mutation

• Population statistics

(122)

Fuzzy Fitness

• Faster calculation

• Less precision

• Specific selection

(123)

Fuzzy Government

A fuzzy rulebase for the dynamic control of an evolutionary algorithm:

[Figure: the Population feeds Statistics into the fuzzy rulebase, which sets the algorithm’s Parameters.]

If D(Xt) is LOW then pmut is HIGH

If f(Xt) is LOW and D(Xt) is HIGH then Emerg is NO
. . .

(124)

[Figure: the FL–NN edge of the soft computing triangle: integration.]

(125)

Neuro-Fuzzy Systems

• Fuzzy Neural Networks

– fuzzy neurons (OR, AND, OR/AND)

– learning algorithms (backpropagation-style)
– NEFPROX

– ANFIS

• Co-operative Neuro-Fuzzy Systems

– Adaptive FAMs: differential competitive learning
– Self-Organizing Feature Maps

– Fuzzy ART and Fuzzy ARTMAP

(126)

Fuzzy Neural Networks

[Figure: a fuzzy neural network: inputs x1, x2, ..., xn feed m AND neurons through weights w11, w12, ..., wmn; the AND neurons feed a single OR output neuron y through weights v1, ..., vm.]

(127)

FAM Systems

[Figure: a FAM system: the crisp input x is fuzzified, matched against the FAM rules (A1, B1), (A2, B2), ..., (Ak, Bk), and the combined result is defuzzified into the crisp output y.]

(128)

[Figure: the complete soft computing (SC) triangle: EAs, FL and NNs linked by optimization, monitoring, fitness and integration.]

A. Tettamanzi, M. Tomassini. Soft Computing. Springer-Verlag, 2001

(129)

Summary and Conclusions

A more promising ansatz for numerical evolutionary optimization algorithms is to place one or more virtual seekers, each representing one possible parameter combination, onto