HAL Id: inria-00399949
https://hal.inria.fr/inria-00399949
Submitted on 29 Jun 2009

HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not.
Efficient Hierarchical Optimization using an Algebraic Multilevel Approach
Abderrahmane Benzaoui, Régis Duvigneau

To cite this version:
Abderrahmane Benzaoui, Régis Duvigneau. Efficient Hierarchical Optimization using an Algebraic Multilevel Approach. [Research Report] RR-6974, INRIA. 2009, pp.31. inria-00399949
ISRN INRIA/RR--6974--FR+ENG
Thème NUM: Systèmes numériques
Projet OPALE

Efficient Hierarchical Optimization using an Algebraic Multilevel Approach

Abderrahmane Benzaoui, Régis Duvigneau

Rapport de recherche n° 6974, June 2009, 28 pages
Abstract: This paper presents an efficient method to reduce the optimization cost. In this method, the eigenvectors of the Hessian of the objective function are determined first. Then, the search for the optimum is carried out successively in subspaces defined by these vectors. For this purpose, the Multi-directional Search Algorithm is used in this study, but any other optimization algorithm can be employed. The method is validated on two test cases: an analytical function and a shape reconstruction problem. In both cases, this method shows very promising results.
Key-words: Multilevel optimization, Spectral Decomposition of the Hessian, Multi-directional Search Algorithm.
Résumé : This report presents an efficient method to reduce the cost of optimization algorithms. In this method, the eigenvectors of the Hessian of the functional are determined first. Then, the search for the optimum is carried out in subspaces defined by these vectors. For this purpose, the Multi-directional Search Algorithm is used in this study; however, any other optimization algorithm can be employed. The method is validated on two test cases: an analytical function and a shape reconstruction problem. In both cases, this method shows very promising results.
Mots-clés : Multilevel optimization, Spectral Decomposition of the Hessian, Multi-directional Search Algorithm.
1 Introduction

Optimization tools for engineering design problems very often require a high computational cost. This cost originates from two main sources: first, the evaluation of the objective function involved in such problems is in general very expensive; second, depending on the method employed and on the dimension of the design vector, the optimization procedure requires a high number of evaluations of the objective function to reach the final solution. Many studies have aimed at reducing the computational cost of these tools, either by reducing the cost of the evaluations or by using more efficient algorithms that require a lower number of calls to the objective function.
To improve the efficiency of the algorithm, some authors propose to use a multilevel approach instead of a direct one. This means that the optimization is carried out, at some steps, on a coarse level where not all the design parameters are considered. Their idea is inspired by the multigrid theory used to solve problems with differential equations. Among these studies, Jameson and Martinelli [11] used the multigrid concept to solve simultaneously the flow equations and the optimization problem. Lewis and Nash [14] used the same concept to solve optimization problems for systems governed by differential equations. Désidéri et al. [6, 5, 8] generalized this approach to elaborate a multilevel shape optimization algorithm where the shape is parameterized using Bézier curves or a Free-Form Deformation (FFD) technique [17]. In these studies, the coarse optimization is carried out on subspaces that depend on the parameterization of the geometry of interest. In our study, we propose a more general method that can accelerate the convergence of the optimization algorithm and can be employed for any kind of problem. This method uses the eigenvectors of the Hessian of the cost function to define the subspaces of optimization, which is a more efficient and abstract approach to multilevel optimization. In this article, we introduce our multilevel method and present the corresponding algorithm. The method is then validated on two test functions in order to be employed in further studies.
This article is organized as follows. We start by giving a short survey of optimization methods; in particular, we pay more attention to the Multi-directional Search Algorithm (MSA) [4], which is used in this study. Then, we describe our multilevel approach in detail. After that, we show the validation of this approach on two test cases: an analytical function and a shape reconstruction problem. Finally, the article ends with a general conclusion.
2 Survey of optimization methods

Optimizing a function $f$ (generally called a cost function) means finding its minimum. In many problems, this minimization is subject to some constraints. To make the problem easier, the constraints are integrated into the cost function by means of a penalty term or any other adequate way, depending on the nature of these constraints. There are several methods of optimization, which can be classified into two principal categories: deterministic methods and stochastic methods. Deterministic methods do not make use of random parameters when looking for the optimum, so the same routine, run twice under the same conditions, leads to the same results. In contrast, a stochastic method uses some random techniques to locate the optimum; the same routine, run twice under the same conditions, does not necessarily lead to the same results, even though, at convergence, the result of this routine must be unique.
The deterministic methods can be classified into gradient-based methods and gradient-free ones. Gradient-based methods are the most classical and the most intuitive optimization techniques: finding a minimum of a given convex function amounts to looking for a zero of its gradient. One of the most popular methods is the steepest-descent one, which uses the gradient vector to define a path of optimization; but this can have a low rate of convergence. The Newton method is an enhancement of the descent methods by means of the inverse of the Hessian matrix. Since the computation of the inverse of the Hessian is cumbersome, Quasi-Newton methods use an approximation of this matrix, which is updated at each iteration.
Many difficulties appear when using gradient-based methods. First, they need to compute the first and sometimes the second derivatives, which can be costly and sensitive to numerical noise [10], especially when the derivatives are evaluated using techniques such as the finite-difference method; these errors can generate false local minima. Moreover, gradient-based methods can easily be trapped in local minima and thus fail to locate the global optimum. Many design problems are multimodal (i.e. they have many local minima), so these methods are not suited to such problems.
Simplex methods are an alternative to gradient-based ones. These methods do not make use of the derivatives of the cost function. Their main idea is the displacement of a simplex (i.e. a set of $n+1$ design vectors, where $n$ is the dimension of these vectors) in the design space, so as to obtain a decrease of the cost function. The success of these methods is motivated by the development of parallel machines: the $n+1$ evaluations of the cost function can then be made simultaneously. Among the simplex methods, the Nelder-Mead Algorithm and the Multi-directional Search Algorithm (MSA) are the most popular [4, 16]. Nevertheless, like the gradient-based methods, the simplex methods can also be trapped in local minima and so are not very well suited to multimodal problems.

Stochastic methods are known to be more robust and able to avoid local minima. These methods generally mimic natural phenomena to obtain the optimum. Among the stochastic methods, Evolutionary Strategies and the Particle Swarm Optimization (PSO) method are the most commonly used. These techniques use a set of candidate solutions called a population. In Evolutionary Algorithms, the candidate solutions are treated as if they were biological species that evolve toward a best fitness. Using some operators that mimic biological behaviours (such as reproduction, selection, mutation), the optimum solution can be obtained after some generations (iterations). In Particle Swarm Optimization, the candidate solutions are treated as if they were animals that move toward a best location (to find food or to avoid a predator, for example). At each iteration, the new position of each particle is influenced by its own memory (local memory) and by the memory of its neighbours (global memory). The operators of the stochastic methods use some random parameters in order to search for the solution in the entire design space. For this reason, they are less likely to be trapped by local minima and are better able to locate the global optimum. The main drawback of these methods is the number of cost-function evaluations, which depends on the size of the population. Hence, stochastic methods are generally expensive if they do not use a coarse approximation of the cost function.
A survey of optimization techniques can be found in [7], for instance. The aim of our study is not to compare these techniques; we just need one of them to validate our multilevel optimization algorithm. Because of its simplicity, the MSA algorithm was employed. Below, we present this algorithm in detail.
3 Multi-directional Search Algorithm

This algorithm, developed by Dennis and Torczon [4], uses a simplex to find the optimum. This simplex is composed of $n+1$ design vectors $X_0, X_1, \ldots, X_n$. After being initialized, the simplex at each new iteration is obtained as follows.

Suppose that $X_0^{(k)}$ is the vertex that gives the smallest cost function among all the simplex vertices, where $k$ is the iteration number. Then the simplex is reflected with respect to $X_0^{(k)}$ according to the following equation:

$$\widetilde{X}_i^{(k)} = (1+\alpha)\,X_0^{(k)} - \alpha\,X_i^{(k)}, \qquad i = 0, \ldots, n \qquad (1)$$

where $\widetilde{X}_i^{(k)}$ designates the reflected vertices and $\alpha$ is a positive real, usually equal to $1$. If the reflection is successful, i.e. if one of the new vertices has a smaller cost function than that of $X_0^{(k)}$, this means that the solution perhaps lies in this direction, so the new simplex is expanded to pursue the search in this direction. The simplex of the new iteration is then obtained by:

$$X_i^{(k+1)} = (1-\gamma)\,\widetilde{X}_0^{(k)} + \gamma\,\widetilde{X}_i^{(k)}, \qquad i = 0, \ldots, n \qquad (2)$$

where $\gamma > 1$ is the expansion coefficient, usually set to $2$. In the other case, if the reflection fails, the simplex is contracted so that the vertices come closer to the best one. The simplex of the new iteration is then obtained by:

$$X_i^{(k+1)} = (1-\beta)\,\widetilde{X}_0^{(k)} + \beta\,\widetilde{X}_i^{(k)}, \qquad i = 0, \ldots, n \qquad (3)$$

where $\beta$ is the contraction coefficient, usually set to $-\tfrac{1}{2}$.

The algorithm is carried out until a given stopping criterion is satisfied. Several choices for this criterion are possible. In our study, we compute the distance (using the Euclidean norm) from the best vertex $X_0^{(k)}$. The stopping criterion is then given by:

$$\max_i \frac{\| X_i^{(k)} - X_0^{(k)} \|}{\| X_{\mathrm{ref}} \|} < \epsilon \qquad (4)$$

where $X_{\mathrm{ref}}$ is a given reference vector. While this condition is not satisfied and the number of iterations is below the maximum value for the selected level, the MSA algorithm pursues the iterations.
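A single iteration of this scheme, i.e. equations (1)-(3), can be sketched as follows (a minimal sketch: it uses a simple best-of acceptance rule, and the stopping test (4) and the per-level iteration caps are left out for brevity):

```python
import numpy as np

def msa_step(f, X, alpha=1.0, gamma=2.0, beta=-0.5):
    """One Multi-directional Search iteration: reflection (1), then
    expansion (2) on success or contraction (3) on failure."""
    X = X[np.argsort([f(x) for x in X])]    # X[0] is now the best vertex
    best = f(X[0])
    R = (1 + alpha) * X[0] - alpha * X      # reflected simplex, eq. (1)
    if min(f(r) for r in R) < best:         # successful reflection
        E = (1 - gamma) * R[0] + gamma * R  # expanded simplex, eq. (2)
        return E if min(f(e) for e in E) < min(f(r) for r in R) else R
    return (1 - beta) * R[0] + beta * R     # contracted simplex, eq. (3)
```

Note that $\widetilde{X}_0^{(k)} = X_0^{(k)}$ (reflecting the best vertex through itself leaves it unchanged), so `R[0]` plays the role of $\widetilde{X}_0^{(k)}$ in (2) and (3).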
4 Multilevel optimization
When studying an optimization problem in fluid dynamics, for instance, many difficulties occur. First, the physical problem is modeled by complex equations (such as the Navier-Stokes equations), so that the evaluation of the cost function is cumbersome. Then, the numerical computation of the physical problem is expensive. And finally, the fine parameterization of the optimized object leads to a high-dimensional design vector, which results in a stiff optimization search. Many authors have proposed hierarchical techniques to make the optimization algorithm cheaper. Among these techniques, we can cite the use of a simplified model of the physical problem (for example, the use of the Euler equations instead of the Navier-Stokes ones), the use of a metamodel instead of the exact model, or the use of a hierarchical parameterization instead of a single-level one. Here we do not describe nor give an exhaustive list of these hierarchical techniques; the interested reader can refer for instance to [9]. In our study, we are interested only in techniques concerning the parameterization level of the optimized object (multilevel algorithms).
The idea of the multilevel algorithms is to accelerate the fine-level optimization by looking for the solution on a coarse level when necessary. This is inspired by the multigrid method used to solve problems with partial differential equations. Indeed, it is well known in such problems that the computation of a fine solution is expensive not only because of the increased iteration cost (resolution of a high-dimensional linear system), but also because of the low convergence rate of the iterative algorithm. This is why the multigrid techniques use a coarse level to accelerate the fine-level resolution. Several strategies can be considered, ranging from a simple level increase to V-cycle or Full Multi-Grid approaches [6].
The idea is similar in optimization. Earlier studies successfully transposed the multigrid concept and strategies to optimization problems. In particular, Jameson and Martinelli [11] generalized a multigrid code to optimization in aerodynamic design. Lewis and Nash [14, 15] used the multigrid concept for the optimization of systems governed by differential equations. A survey of these methods can be found in [2]. Furthermore, Désidéri et al. used the multigrid approach to elaborate a multilevel shape optimization with a Bézier or a FFD parameterization [1, 6, 8]. In these studies, when the Bézier or FFD parameterization is employed, the transfer from a coarse level to a finer one is done using the Bézier degree-elevation process, a straightforward technique that permits adding some control points without any modification of the optimized shape. However, this is not necessarily the best way to use a multilevel strategy.
In the present study, a more efficient method is proposed by combining the multigrid concept with the spectral decomposition of the Hessian matrix. Indeed, for a descent method for instance, the smallest eigenvalues of the Hessian matrix correspond to directions where the convergence of the optimization algorithm is very slow, while the largest eigenvalues correspond to directions where the convergence is fast. Thus, instead of iterating on the entire design space, our algorithm looks for the solution in a selected subspace, in order to get a faster solution in the directions of low convergence rate. Then it pursues the search on the entire design space. This can be done following several strategies analogous to those of the multigrid method.
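The slow-direction argument can be checked on a tiny quadratic (the values below are illustrative, not taken from the report): for a fixed-step steepest descent on $f(x) = \frac{1}{2}x^T H x$, the error along each eigendirection of $H$ is contracted by a factor $|1 - \mathrm{step} \cdot \lambda|$ per iteration, so the smallest eigenvalue gives the slowest direction.

```python
import numpy as np

# Steepest descent on f(x) = 0.5 * x^T H x; the gradient is H x.
# Eigenvalues of H chosen for illustration: 100 (stiff) and 1 (slow).
H = np.diag([100.0, 1.0])
step = 1.0 / 100.0            # step sized for the largest eigenvalue
x = np.array([1.0, 1.0])
for _ in range(100):
    x = x - step * (H @ x)
# The stiff direction is annihilated, while the slow one has only decayed
# by a factor 0.99**100, i.e. it is still about 0.37 of its initial value.
```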
Let $E$ be the entire design space and $X_0 \in E$ a starting design vector. The multilevel algorithm starts by computing the Hessian matrix $H(X_0)$ and its eigenvectors $V = (v_1, \ldots, v_n)$, ranked from the smallest to the largest corresponding eigenvalue. Suppose that $X^{(l)} \in E$ is the design vector obtained at a level $l$. The multilevel algorithm then looks for the new design vector $X^{(l+1)} \in E$ at the new level $l+1$ by adding a correction term which minimizes the cost function in the current optimization subspace. This is done as follows.

Suppose that the new level is characterized by $m = m_{l+1}$ parameters ($m \le n$) and consider the basis $V_m$ and the subspace $E_m$ defined by:

$$V_m = (v_1, \ldots, v_m) \qquad (5)$$

$$E_m = \{\, z = V_m y \;|\; y \in \mathbb{R}^m \,\} \qquad (6)$$

The correction term must be a vector from $E_m$. Hence, to find the best correction, the optimization algorithm looks for a vector $y \in \mathbb{R}^m$ such that the cost function

$$g(y) = f(X^{(l)} + V_m y) \qquad (7)$$

is minimized. In this way, we can go either from a coarse to a finer level or from a fine to a coarser level without losing any information obtained at the former level. In our case, using the MSA algorithm, at each new level we start by initializing a simplex of $m+1$ vertices in $\mathbb{R}^m$ (one of the vertices is zero). Then the MSA algorithm finds the optimum value $y^*$ that minimizes (7). The design vector of the new level is then:

$$X^{(l+1)} = X^{(l)} + V_m y^* \qquad (8)$$

The resolution at any level can be full or incomplete. The algorithm can be carried out using several levels, ordered in some way called a cycle, by analogy with the multigrid terminology. A cycle is thus defined by a sequence of levels of dimensions $(m_1, m_2, \ldots, m_{nl})$ and by the corresponding numbers of iterations $(it_1, it_2, \ldots, it_{nl})$, where $nl$ is the number of levels in the cycle. The cycle can then be repeated as many times as necessary. The multilevel algorithm is summarized below:
1. Read the input data and initialize the design variable $X_0$;
2. initialize the number of cycles at $k = 0$;
3. while $k < nc$ and the stopping criterion is not satisfied, perform iterations on a multilevel cycle:
   (a) compute the Hessian matrix $H(X_k)$ (using an exact or a surrogate model) and evaluate its eigenvectors;
   (b) initialize the level number at $l = 1$;
   (c) let $X^{(l)} = X_k$;
   (d) while $l \le nl$ do:
      i. select a basis $V_m$ with $m = m_l$ eigenvectors;
      ii. use the MSA optimizer to find a correction vector $y^* \in \mathbb{R}^m$ that minimizes the cost function $g(y) = f(X^{(l)} + V_m y)$;
      iii. $X^{(l+1)} \leftarrow X^{(l)} + V_m y^*$;
      iv. $l \leftarrow l + 1$ and go to (3d);
   (e) $X_{k+1} \leftarrow X^{(l)}$;
   (f) $k \leftarrow k + 1$ and go to (3);
4. $X^* \leftarrow X^{(k)}$.

The stopping criterion in step (3) of the above algorithm is the relative decrease in the cost function: while this decrease is above a given value, the algorithm pursues the computation. Furthermore, in step (3(d)i), the $m_l$ eigenvectors correspond to the smallest eigenvalues of the Hessian. In this way, on a coarse level, the algorithm focuses only on directions with a low convergence rate.

5 Applications
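Before turning to the test cases, the multilevel cycle of Section 4 can be sketched in code (a minimal sketch: the inner MSA optimizer is replaced here by SciPy's Powell method, and the names `level_dims` / `level_iters` are illustrative, not from the report):

```python
import numpy as np
from scipy.optimize import minimize  # stand-in for the inner MSA optimizer

def multilevel_optimize(f, hessian, x0, level_dims, level_iters, n_cycles=10):
    """Algebraic multilevel cycle: at each level, search only in the span of
    the Hessian eigenvectors with the smallest eigenvalues (steps 3a-3d)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_cycles):
        # (3a) eigenvectors, ranked from smallest to largest eigenvalue
        _, V = np.linalg.eigh(hessian(x))
        for m, iters in zip(level_dims, level_iters):
            Vm = V[:, :m]                    # (3d-i) coarse basis V_m
            g = lambda y: f(x + Vm @ y)      # cost restricted to E_m, eq. (7)
            res = minimize(g, np.zeros(m), method="Powell",
                           options={"maxiter": iters})
            x = x + Vm @ res.x               # (3d-iii) correction, eq. (8)
    return x
```

On a convex quadratic this converges to the unique minimum for essentially any split of `level_dims`, since each correction is confined to an eigen-subspace of the Hessian.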
5.1 First test case: analytical function

This test case is a straightforward problem that allows us to validate our optimization algorithm and to find the best multilevel strategy. The function to be optimized is a quadratic function given by the following expression:

$$f(X) = a + B^T X + X^T C X \qquad (9)$$

where $X \in \mathbb{R}^n$ is the optimization parameter. In this function, $a \in \mathbb{R}$, $B \in \mathbb{R}^n$ and $C \in \mathbb{R}^{n \times n}$ are chosen arbitrarily; but to guarantee the existence of a minimum, $C$ is taken as a symmetric positive definite matrix. In this case, the Hessian of $f$ is simply equal to $2C$ and is also positive definite. Since we would like to use this function to test the multilevel optimization with a Hessian decomposition approach, $C$ is chosen in a manner that allows us to control its eigenvalues and eigenvectors. More precisely, for $n = 12$, we have chosen $a = 2$ and $B = (1, 3, 2, 4, -1, 5, 10, 8, 9, 0, -4, 12)^T$. The matrix $C$ is given in appendix (A).

This function has a unique minimum, which can be obtained by setting the gradient to zero. This is equivalent to solving the system:

$$2\,C X = -B \qquad (10)$$

In order to examine the effect of the condition number of the Hessian matrix on the optimization procedure, we have tested three functions of the form (9) where only the matrix $C$ is modified. Table 1 gives the condition number and the optimal value for the three tested functions.

Function   Condition number   Optimal value
$f_1$      10                 -39.55215
$f_2$      18000              -32.517335
$f_3$      10^11              -21.79586

Table 1: Condition numbers and optimal values for the three tested functions.
In the following sections, we try to find these values using the optimization algorithm. The efficiency of the multilevel strategy is measured by the number of evaluations of the function required to reach the optimum.
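The setup above and its exact minimum (obtained from the zero-gradient condition $2CX = -B$) can be sketched as follows (a minimal sketch: the matrix `C` below is an arbitrary symmetric positive definite stand-in, not the matrix of appendix A, so the optimal value differs from Table 1):

```python
import numpy as np

rng = np.random.default_rng(0)

# Quadratic cost (9): f(X) = a + B^T X + X^T C X, with n = 12.
n = 12
a = 2.0
B = np.array([1, 3, 2, 4, -1, 5, 10, 8, 9, 0, -4, 12], dtype=float)
M = rng.standard_normal((n, n))
C = M @ M.T + n * np.eye(n)      # symmetric positive definite (illustrative)

def f(X):
    return a + B @ X + X @ C @ X

# Setting the gradient B + 2 C X to zero gives the linear system 2 C X = -B.
X_star = np.linalg.solve(2 * C, -B)
```

Such an exact solution is what the iterative optimizers below are measured against.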
5.1.1 Single-level optimization

Before testing the multilevel algorithm, we try to find the optimum value of the above function using the MSA method without any multilevel or Hessian decomposition strategy. For instance, for the function $f_2$, we carry out 35000 iterations of the MSA optimizer in order to find a good approximation of the value given in Table 1. The optimum value obtained is $f_{min} = -32.516920$, i.e. a relative error of 0.001%. The corresponding number of evaluations is 910014, which is very high. This means that using the same algorithm on a problem with the same parameterization size, but where the evaluation of the cost function is expensive, would be very stiff, if possible at all; a limitation of the accuracy is then necessary. If we limit the iteration number to 5000, which is still a high number, the optimum value obtained is $-29.2075$ and the relative error is 10%.

This example shows the necessity of making the optimization procedure faster. In order to validate our multilevel algorithm and to prove its efficiency, we start by optimizing the above function using a single-level strategy with a spectral decomposition of the Hessian. This means that the optimized function is equivalent to that of equation (7) with $m = n$; it therefore consists in working in the eigenvector basis. This is done by setting $nl = 1$ and $m_1 = n$ in the algorithm described in Section 4. The evolution of the cost function with respect to the number of evaluations is presented in Figure 1. The result of this optimization is spectacular: the simple use of the Hessian eigenvector basis permits the MSA optimizer to converge very fast. Indeed, the optimum value $f_{min} = -32.517335$ was reached after only 130 iterations, with 3394 evaluations of the cost function. This method is thus 268 times faster than the first one! As we will see, this performance depends on the condition number of the Hessian matrix and can be further improved using an adequate multilevel strategy.
Figure 1: Evolution of the cost function $f_2$ for a single-level optimization (simple MSA optimization vs. MSA optimization using the Hessian decomposition).

5.1.2 Multilevel optimization
As mentioned above, the multilevel strategy can improve the performance of the optimization algorithm when it is used in an adequate manner. For the same function $f_2$, we carry out the algorithm described in Section 4 with two and three levels, using the strategy shown in Table 2. To be efficient, it is recommended to use a small number of iterations at each level and to repeat the multilevel cycle until convergence. However, a slight modification of this algorithm is introduced: it consists in dividing the initial simplex by two at each cycle. Without this modification, when the design variable $X$ is near the final solution, the first MSA iterations at each level become inefficient, because the simplex vertices are relatively far from the solution. In this case, no decrease of the cost function can be obtained if we use very few iterations; a smaller simplex is then more suitable.

Table 3 presents the multilevel tests for the function $f_2$ using the strategy described in Table 2, whereas Figure 2 shows the evolution of the cost function with respect to the number of evaluations for the best cases. In Table 3, case 1 corresponds to the single-level optimization using the spectral decomposition of the Hessian.

Cycles                    cycle 1   cycle 2   cycle 3
Fine-level resolution        •         •         •
                              ↘         ↘         ↘
Coarse-level resolution      •         •         •

Table 2: Schematic description of the two-level cycles.
Case   levels      iterations per cycle   cycles   evaluations   optimal value
1      12          130                    1        3394          -32.517335
2      12, 6       3 ↘ 3                  19       2661          -32.517335
3      12, 6       3 ↘ 2                  22       2773          -32.517307
4      12, 6       3 ↘ 4                  18       2773          -32.517335
5      12, 6       2 ↘ 4                  22       2817          -32.517302
6      12, 6       4 ↘ 2                  18       2737          -32.517335
7      12, 6, 3    3 ↘ 2 ↘ 2              17       2483          -32.517335

Table 3: Description of the multilevel optimization of the analytical function $f_2$.
As we can see in Table 3 and in Figure 2, the multilevel strategy allows a faster optimization than the single-level one. Indeed, if we compare cases 1 and 7 for instance, we can see that the multilevel strategy allows a reduction of the number of evaluations of about 27%. The best multilevel tests we obtained for this function are case 2 for the two-level optimization and case 7 for the three-level optimization. In these cases, the number of iterations per level is small but sufficient to obtain good performance. It is not necessary to use a higher number of iterations per level, because this would result in an increasing number of evaluations without any acceleration of the convergence. However, if we reduce the number of iterations, the convergence becomes slow, resulting in an increasing number of cycles and thus an increasing number of evaluations, while the accuracy of the optimum value is worse.

Figure 2: Evolution of the cost function $f_2$ for a multilevel optimization (cases 1, 2, 4, 6 and 7).
5.1.3 The effect of the condition number

Tables 4 and 5 present the multilevel tests for the functions $f_1$ and $f_3$, whereas Figures 3 and 4 show the evolution of the cost function with respect to the number of evaluations for the best cases. In these tables, case 0 refers to the single-level optimization using the simple MSA method, while case 1 refers to the single-level optimization using the spectral decomposition of the Hessian.

We can see from these tables and figures that, when the condition number of the Hessian is small (function $f_1$), the Hessian decomposition alone is not interesting, since the total number of evaluations at convergence is slightly higher in case 1 than in case 0 of Table 4. Even so, the multilevel approach is still interesting, and a very good reduction of the total number of evaluations can be obtained (about 46% between cases 2 and 0). Note that, using the MSA method, the optimization at the coarse level needs a smaller number of evaluations than the optimization at the fine level, because the number of vertices is smaller ($m+1$ instead of $n+1$). But this is not the only reason why the multilevel optimization is interesting: if we compare cases 2 and 0 of Table 4, we can see that not only the number of evaluations is reduced, but also the total number of iterations. This means that, even though its effect is not very strong, looking for the solution in subspaces generated by the eigenvectors accelerates the global convergence of the optimization algorithm.

Case   levels      iterations per cycle   cycles   evaluations   optimal value
0      12          180                    1        4694          -39.55215
1      12          210                    1        5474          -39.55215
2      12, 6       3 ↘ 3                  18       2521          -39.55215
3      12, 6       4 ↘ 3                  19       3155          -39.55215
4      12, 6, 3    3 ↘ 2 ↘ 2              18       2629          -39.55215

Table 4: Multilevel optimization of the analytical function $f_1$ with a V-cycle strategy.

Figure 3: Evolution of the cost function $f_1$ for a single-level and a multilevel optimization (left: $f(X)$; right: $\log_{10}(f(X) - f_{min})$, versus the number of evaluations; cases 0 to 4).

Nevertheless, when the condition number of the Hessian is high (function $f_3$), the Hessian decomposition appears to be necessary and very efficient. Indeed, after 1000 iterations, the single-level MSA optimizer without decomposition of the Hessian is unable to obtain any sensible decrease of the cost function, whereas when the Hessian decomposition is employed, only 82 iterations are needed to reach the optimal value (see cases 0 and 1 of Table 5). The multilevel approach is of mild interest for this function: in case 2, we gain about 22% in the number of evaluations compared to case 1.

Case   levels      iterations per cycle   cycles   evaluations   optimal value
0      12          1000                   1        26014         1.997
1      12          82                     1        2146          -21.735863
2      12, 6       3 ↘ 3                  12       1681          -21.735863
3      12, 6       4 ↘ 3                  12       1993          -21.735863
4      12, 6       3 ↘ 4                  12       1849          -21.735863
5      12, 6, 3    3 ↘ 2 ↘ 2              15       2191          -21.735862
6      12, 6, 3    3 ↘ 2 ↘ 3              12       1849          -21.735863

Table 5: Multilevel optimization of the analytical function $f_3$ with a V-cycle strategy.

Figure 4: Evolution of the cost function $f_3$ for a single-level and a multilevel optimization (left: $f(X)$; right: $\log_{10}(f(X) - f_{min})$, versus the number of evaluations).

From these tests, we can conclude that the multilevel strategy accelerates the convergence of the optimization algorithm and reduces its cost, whatever the condition number of the Hessian matrix. If this number is high, it becomes necessary to look for the solution in subspaces generated by the eigenvectors of the Hessian; in this case, the gain in optimization cost is significant.
5.2 Second test case: shape reconstruction problem

5.2.1 Test-case description

In this test case, we would like to approach a given target function $F_0(t)$ by a Bézier curve $F(t, X)$, where $X = (x_1, \ldots, x_n)^T$ is the design vector whose components are the control points of the Bézier curve. The approximated function is thus given by:

$$F(t, X) = \sum_{k=0}^{n-1} B_k^{n-1}(t)\, x_{k+1} \qquad (11)$$

for $t \in [0, 1]$. In equation (11), $n$ is the parameterization level and $B_k^{n-1}(t)$ are the Bernstein polynomials given by:

$$B_k^n(t) = C_n^k\, t^k (1-t)^{n-k} \qquad (12)$$

where

$$C_n^k = \frac{n!}{k!\,(n-k)!} \qquad (13)$$

The target function we select in this study is also a Bézier curve¹, of degree $n_0 > n$. It is given by:

$$F_0(t) = \sum_{k=0}^{n_0-1} B_k^{n_0-1}(t)\, x^0_{k+1} \qquad (14)$$

where $x^0_1, \ldots, x^0_{n_0}$ are the control points of the target function, whose values are given in appendix (B).

The squared error of the approximation of the target function $F_0(t)$ by the Bézier curve $F(t, X)$ is given by:

$$e(X) = \int_0^1 \left( F(t, X) - F_0(t) \right)^2 dt \qquad (15)$$

For a given parameterization level $n$, the approximation problem is equivalent to the search for a design vector $X$ that minimizes the squared error. This is hence an unconstrained optimization problem where the cost function is $f(X) \equiv e(X)$. A theoretical solution of this problem can be found easily by considering that the gradient of the cost function vanishes at the optimum. This leads to the following linear system:

$$A \cdot X = B \qquad (16)$$

where $A = (a_{kj})_{1 \le k \le n,\, 1 \le j \le n}$, $B = (b_k)_{1 \le k \le n}$ and:

$$a_{kj} = 2 \int_0^1 B_{k-1}^{n-1}(t)\, B_{j-1}^{n-1}(t)\, dt \qquad (17)$$

$$b_k = 2 \int_0^1 B_{k-1}^{n-1}(t)\, F_0(t)\, dt \qquad (18)$$

Note that the matrix $A$ is equal to the Hessian of the cost function and is symmetric positive definite. This quadratic problem has a unique solution, which can be obtained by solving the system (16).

Note also that the above problem is unconstrained, since we impose no condition on the control points. However, it is usual to impose conditions such as $x_1 = F_0(0)$ and $x_n = F_0(1)$. In this case, the resolution of the problem does not change, but its dimension is reduced to $n-2$, since the unknown control points become $x_2, \ldots, x_{n-1}$.

To evaluate the cost function, the interval $[0, 1]$ is subdivided into $n_p$ points. The squared error (15) is then approximated by:

$$e(X) \approx \frac{1}{n_p} \sum_{p=1}^{n_p} \left[ F\!\left(\frac{p-1}{n_p-1}, X\right) - F_0\!\left(\frac{p-1}{n_p-1}\right) \right]^2 \qquad (19)$$

¹ This is just an example, and we can select any kind of function: the resolution method and the behaviour of the optimization algorithm do not depend on the nature of the target shape.
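The Bernstein basis (12)-(13) and the discrete counterpart of the least-squares system (16)-(18) can be sketched as follows (a minimal sketch: the integrals are replaced by sums over the $n_p$ sample points of (19), and any control points used with it are illustrative, not those of appendix B):

```python
import numpy as np
from math import comb

# Bernstein polynomial B_k^n(t), equations (12)-(13).
def bernstein(k, n, t):
    return comb(n, k) * t**k * (1 - t)**(n - k)

# Bézier curve (11) with control points x (length n, degree n-1).
def bezier(x, t):
    n = len(x)
    return sum(bernstein(k, n - 1, t) * x[k] for k in range(n))

# Discrete least-squares fit of a target curve by a degree-(n-1) curve:
# assemble and solve the normal equations A X = B of (16)-(18), with the
# integrals approximated on the n_p sample points of (19).
def fit_bezier(x_target, n, n_p=20):
    t = np.linspace(0.0, 1.0, n_p)
    F0 = np.array([bezier(x_target, ti) for ti in t])
    # Design matrix: column j holds B_j^{n-1}(t) at the sample points.
    Phi = np.array([[bernstein(j, n - 1, ti) for j in range(n)] for ti in t])
    A = 2 * Phi.T @ Phi / n_p      # discrete analogue of (17)
    B = 2 * Phi.T @ F0 / n_p       # discrete analogue of (18)
    return np.linalg.solve(A, B)
```

When the fitting degree matches the target degree, the fit reproduces the target control points exactly, since the representation in the Bernstein basis is unique.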
A similar discretization is used to compute the Hessian matrix $A$, given by (17), and the right-hand-side term of equation (16), given by (18).

In order to test our optimization algorithm, we approximate two target functions $F_{01}$ and $F_{02}$, of parameterization levels 14 and 18, by functions $F_1$ and $F_2$ of levels 8 and 16 respectively. The corresponding cost functions are denoted $f_1$ and $f_2$ respectively. This allows us to test at the same time the ability to approximate different shapes and the influence of the condition number of the Hessian matrix on the algorithm. Table 6 gives the optimum value and the condition numbers corresponding to these cost functions for $n_p = 20$, whereas Figure 5 shows the shapes of the target and approximated functions.

Cost function   $n_0$   $n$   Condition number   Optimal value
$f_1$           14      8     4617.6             1.751264 × 10^-5
$f_2$           18      16    3.6 × 10^9         1.037 × 10^-9

Table 6: Condition numbers and optimal values for the two tested functions of the shape reconstruction problem.
Figure 5: Target and approximated curves for functions $F_{01}$ (left) and $F_{02}$ (right) of the shape reconstruction problem.

5.2.2 Experimentation and results
In this section, we try to obtain the approximated functions using the optimization algorithm. As for the first test case, we test many cases in order to find the best optimization strategy. Tables 7 and 8 describe some cases for the cost functions $f_1$ and $f_2$ respectively (corresponding to the target functions $F_{01}$ and $F_{02}$), whereas Figure 6 shows the evolution of these functions with respect to the number of evaluations for the studied cases.
×10
5
0
8
200
1
3610
2.303
1
8
135
1
2440
1.751264
2
8
4
3
ց
5
18
2125
1.751264
3
8
4
3
ց
4
19
2053
1.751264
4
8
4
3
ց
3
15
1471
1.751264
5
8
4
2
3
ց
2
ց
2
19
1958
1.751264
Table7: Thetested asesforthemultilveloptimizationofthe ostfun tions
f
1
oftheshape re onstru tionproblem.Fromtables(7)and(8)andgure(6)we anseethatthebehaviouroftheoptimization algorithm is the sameas that for the rst test ase. Formedium Hessian ondition, the simpleuseof theHessiande ompositionallowsthealgorithm to befaster. The multilevel strategypermitstoa elerate the onvergen e. Thebesta elerationisobtainedwithfew iterationsbylevelandby y le( ases
4
and5
forfun tionf
1
and2
and4
forfun tionf
2
). Thisa elerationismoreimportantforhigh onditionnumber.6 Con lusion
Inthispaperwepresentane ientapproa hofoptimizationbasedontheuseofanalgebrai multilevelmethod. Thismethodusestheeigenve torsoftheHessianoftheobje tivefun tion to dene subspa es of optimization. This approa h is des ribed in details and validated
Case | levels | number of iterations by cycle | number of cycles | number of evaluations | optimal value
0 | 16 | 200 | 1 | 6818 | 0.00041
1 | 16 | 140 | 1 | 4778 | 1.037 × 10^-9
2 | 16, 8 | 3↘3 | 20 | 3641 | 1.037 × 10^-9
3 | 16, 8 | 4↘3 | 20 | 4321 | 1.037 × 10^-9
4 | 16, 8, 4 | 3↘2↘2 | 20 | 3781 | 1.037 × 10^-9

Table 8: The tested cases for the multilevel optimization of the cost function f2 of the shape reconstruction problem.
[Figure 6: Evolution of the cost functions f1 (left) and f2 (right) of the shape reconstruction problem for several optimization strategies. Each panel plots the cost function versus the number of evaluations, for cases 0 to 5 (left) and cases 0 to 4 (right).]

on two test cases. It is shown with these test cases that the use of the eigenvectors of the Hessian as a basis of the design space accelerates the convergence of the optimization algorithm. This acceleration is significant for medium and high condition numbers of the Hessian. A slight additional acceleration is obtained when the Hessian decomposition is combined with an adequate multilevel strategy, similar to those of multigrid methods.
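As an illustration of the approach summarized above, the sketch below searches successively along the eigenvectors of the Hessian of a made-up quadratic; exact derivative-free line minimizations stand in for the Multi-directional-Search algorithm, and all names and values are illustrative, not the report's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up 4-D quadratic f(x) = a + b.x + x^T C x (illustrative values only).
n = 4
a, b = 2.0, np.array([1.0, 3.0, 2.0, 4.0])
M = rng.standard_normal((n, n))
C = M @ M.T + n * np.eye(n)            # symmetric positive definite

def f(x):
    return a + b @ x + x @ C @ x

def line_min(func, x, d, h=1e-2):
    """Derivative-free 1-D minimization along d (exact for a quadratic):
    fit a parabola through func(x - h d), func(x), func(x + h d)."""
    fm, f0, fp = func(x - h * d), func(x), func(x + h * d)
    t = h * (fm - fp) / (2.0 * (fm - 2.0 * f0 + fp))
    return x + t * d

# Search successively along the eigenvectors of the Hessian (here 2C): for a
# quadratic these directions are conjugate, so one sweep reaches the optimum.
_, V = np.linalg.eigh(C)
x = np.zeros(n)
for k in range(n):
    x = line_min(f, x, V[:, k])

x_star = -0.5 * np.linalg.solve(C, b)  # analytic minimizer, for comparison
print(np.allclose(x, x_star, atol=1e-8))  # True
```

For a multimodal function the single sweep would of course not suffice; this only illustrates why the eigen-directions are a natural basis for the subspace search.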
The evaluation of the Hessian matrix of the objective function in the test cases is straightforward; this is why it is computed exactly. Nevertheless, for more complicated problems, where no analytical expression for the Hessian is available, it can be very useful to approximate it by a surrogate model. Indeed, if the Hessian is evaluated by the finite-difference method for example, the total number of calls to the objective function increases by an order of n^2 calls for each Hessian evaluation, where n is the dimension of the design vector. So a surrogate model can reduce the cost of the Hessian evaluation significantly.

The test problems studied in this article are quadratic and convex, which means that they have a unique minimum and a constant positive definite Hessian. For these problems, optimization using the Multi-directional-Search method is easy. However, this is seldom the case in practice. Most engineering problems involve multimodal functions, and a method such as MSA is not suitable because it cannot avoid local minima. It would therefore be interesting to test our multilevel approach with a stochastic algorithm such as Particle Swarm Optimization, which is better suited to multimodal problems.
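The n^2 growth of the number of objective-function calls for a finite-difference Hessian can be illustrated with a short sketch (illustrative code with a made-up quadratic objective, not the report's implementation):

```python
import numpy as np

calls = 0

def f(x):
    """Toy quadratic objective; counts how often it is evaluated."""
    global calls
    calls += 1
    return x @ np.diag([1.0, 10.0, 100.0]) @ x

def hessian_fd(func, x, h=1e-4):
    """Central-difference Hessian: 4 evaluations per upper-triangle entry,
    i.e. 4 * n(n+1)/2 = O(n^2) calls per Hessian evaluation."""
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(i, n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = H[j, i] = (func(x + ei + ej) - func(x + ei - ej)
                                 - func(x - ei + ej) + func(x - ei - ej)) / (4.0 * h * h)
    return H

H = hessian_fd(f, np.zeros(3))
print(calls)   # 24 calls for n = 3
print(np.allclose(H, np.diag([2.0, 20.0, 200.0]), atol=1e-5))  # True
```

A surrogate model would replace most of these calls by cheap model evaluations, which is the saving argued for above.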
7 Acknowledgments

This study has been supported by the "OMD" project (Multi-Disciplinary Optimization) granted by ANR-RNTL.
Appendices

A Analytical function for testing the multilevel algorithm

The analytical functions used in this study to test the multilevel algorithm are given by the general expression:
f(X) = a + B^T X + X^T C X

where a = 2 and B = (1, 3, 2, 4, -1, 5, 10, 8, 9, 0, -4, 12)^T. The matrix C is constructed in such a way that permits us to control its eigenvalues and eigenvectors. Indeed, since we want to test a method based on the eigenvector subspaces, it is preferable that the eigenvectors of the matrix C be different from the unit vectors of the canonical basis. In practice, to get such a matrix, we start from a diagonal one which has the desired eigenvalues. Then, by means of successive rotations around all the unit vectors, we obtain a matrix that conserves the same eigenvalues but whose eigenvectors differ from the canonical basis. In order to test the effect of the condition number of the Hessian, three test functions f1, f2 and f3 were considered in this study. These functions have the same values of a and B but different matrices C1, C2 and C3 respectively. In the following, we give the details about these functions:

- eigenvalues of the matrix C1: 1, 2, 2.5, 3, 4, 4.5, 5, 6, 7.2, 8, 9.5, 10
- eigenvalues of the matrix C2: 1, 1000, 14000, 50, 70, 80, 90, 15, 2, 3, 40, 18000
- eigenvalues of the matrix C3: 1, 10, 100, 1000, 10^4, 10^5, 10^6, 10^7, 10^8, 10^9, 10^10, 10^11
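The construction described above (a diagonal matrix carrying the prescribed eigenvalues, then successive rotations) can be sketched as follows. The rotation angle is an arbitrary illustrative choice, so the resulting matrix is not C1 itself, but its spectrum, and hence its condition number, matches:

```python
import numpy as np

def rotated_matrix(eigenvalues, angle=0.3):
    """Start from diag(eigenvalues), then apply successive plane (Givens)
    rotations; each orthogonal similarity keeps the spectrum while moving
    the eigenvectors away from the canonical basis."""
    n = len(eigenvalues)
    C = np.diag(np.asarray(eigenvalues, dtype=float))
    c, s = np.cos(angle), np.sin(angle)
    for i in range(n - 1):
        G = np.eye(n)                     # rotation in the (i, i+1) plane
        G[i, i] = G[i + 1, i + 1] = c
        G[i, i + 1], G[i + 1, i] = -s, s
        C = G @ C @ G.T                   # orthogonal similarity transform
    return C

eigs_C1 = [1, 2, 2.5, 3, 4, 4.5, 5, 6, 7.2, 8, 9.5, 10]
C = rotated_matrix(eigs_C1)
print(np.allclose(np.sort(np.linalg.eigvalsh(C)), np.sort(eigs_C1)))  # True
print(round(np.linalg.cond(C)))           # 10, i.e. lambda_max / lambda_min
```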
- columns 1 to 4 of the matrix C1:

   9.7629155e+00   5.6208746e-01  -1.0626881e-01  -9.2818830e-02
   5.6208746e-01   8.1451627e+00   1.3770233e+00   1.3850436e-01
  -1.0626881e-01   1.3770233e+00   6.6849147e+00   1.8063792e+00
  -9.2818830e-02   1.3850436e-01   1.8063792e+00   5.4974392e+00
  -5.9758123e-02   2.0864099e-02   3.0582552e-01   1.9782906e+00
  -4.0956354e-02   2.9328205e-02   6.3251268e-02   3.8459803e-01
  -3.0950247e-02   4.9356755e-02   9.2080432e-03  -4.7742210e-02
  -2.3490984e-02   5.3687066e-02  -1.5039154e-02  -4.6279645e-02
  -1.8537947e-02   6.2408534e-02  -8.7547586e-02   9.1268011e-02
  -1.3495635e-02   4.8514758e-02  -5.2914361e-02   2.7131573e-03
  -7.6144911e-03   1.3452063e-02   3.1096754e-02  -4.4863302e-02
   2.7329545e-03   8.5515493e-04  -5.1350838e-02   1.4255185e-01
- columns 5 to 8 of the matrix C1:

  -5.9758123e-02  -4.0956354e-02  -3.0950247e-02  -2.3490984e-02
   2.0864099e-02   2.9328205e-02   4.9356755e-02   5.3687066e-02
   3.0582552e-01   6.3251268e-02   9.2080432e-03  -1.5039154e-02
   1.9782906e+00   3.8459803e-01  -4.7742210e-02  -4.6279645e-02
   4.5304527e+00   1.8118005e+00   6.9393908e-01   3.3417883e-02
   1.8118005e+00   4.2043160e+00   1.4764748e+00   8.8013774e-01
   6.9393908e-01   1.4764748e+00   4.0756228e+00   1.1484860e+00
   3.3417883e-02   8.8013774e-01   1.1484860e+00   4.0376518e+00
  -9.0394638e-02   1.0894871e-01   8.9514996e-01   8.0315921e-01
   1.3949754e-02  -3.1079111e-02   2.4563596e-01   8.5497151e-01
  -6.9448221e-02   3.1241715e-02   3.6622936e-02   3.0216088e-01
  -1.7933843e-01   1.0483966e-01  -1.8180862e-02  -2.3542808e-01
- columns 9 to 12 of the matrix C1:

  -1.8537947e-02  -1.3495635e-02  -7.6144911e-03   2.7329545e-03
   6.2408534e-02   4.8514758e-02   1.3452063e-02   8.5515493e-04
  -8.7547586e-02  -5.2914361e-02   3.1096754e-02  -5.1350838e-02
   9.1268011e-02   2.7131573e-03  -4.4863302e-02   1.4255185e-01
  -9.0394638e-02   1.3949754e-02  -6.9448221e-02  -1.7933843e-01
   1.0894871e-01  -3.1079111e-02   3.1241715e-02   1.0483966e-01
   8.9514996e-01   2.4563596e-01   3.6622936e-02  -1.8180862e-02
   8.0315921e-01   8.5497151e-01   3.0216088e-01  -2.3542808e-01
   4.4144362e+00   4.1567018e-01   6.5128412e-01  -2.9517762e-01
   4.1567018e-01   4.6760789e+00   3.0214106e-01  -3.3530113e-01
   6.5128412e-01   3.0214106e-01   5.2609029e+00  -1.1481623e+00
  -2.9517762e-01  -3.3530113e-01  -1.1481623e+00   1.4101066e+00
- columns 1 to 4 of the matrix C2:

   1.3507833e+04   6.7406566e+03   3.3702384e+03   1.6725857e+03
   6.7406566e+03   3.3967105e+03   1.6471350e+03   9.4732708e+02
   3.3702384e+03   1.6471350e+03   1.0770710e+03  -5.9885304e+01
   1.6725857e+03   9.4732708e+02  -5.9885304e+01   1.4180254e+03
   8.4377266e+02   3.5444149e+02   6.2354442e+02  -9.7398703e+02
   4.2575342e+02   1.2333644e+02   5.6811820e+02  -1.0036749e+03
   1.9190201e+02   3.3374780e+02  -1.1273555e+03   2.9278273e+03
   1.1160305e+02  -3.5811160e+01   4.7875127e+02  -1.0848726e+03
   5.9442740e+01  -6.5347562e+01   4.8219628e+02  -1.1242647e+03
   2.4762670e+01   3.2737633e+01  -1.0064212e+02   2.7931117e+02
   9.6358696e+00   4.5785273e+01  -1.8179770e+02   4.3287249e+02
  -8.6156343e+00   1.1937757e+01  -9.2227840e+01   2.4895354e+02
- columns 5 to 8 of the matrix C2:

   8.4377266e+02   4.2575342e+02   1.9190201e+02   1.1160305e+02
   3.5444149e+02   1.2333644e+02   3.3374780e+02  -3.5811160e+01
   6.2354442e+02   5.6811820e+02  -1.1273555e+03   4.7875127e+02
  -9.7398703e+02  -1.0036749e+03   2.9278273e+03  -1.0848726e+03
   1.1976560e+03   6.8622449e+02  -2.3758802e+03   9.9112334e+02
   6.8622449e+02   1.8212227e+03  -3.4531339e+03   1.0514963e+03
  -2.3758802e+03  -3.4531339e+03   8.1289247e+03  -2.8111963e+03
   9.9112334e+02   1.0514963e+03  -2.8111963e+03   1.0872175e+03
   9.3382078e+02   1.3416656e+03  -3.1661606e+03   1.0909808e+03
  -2.5810951e+02  -2.1640212e+02   6.3318913e+02  -2.5467525e+02
  -3.7001917e+02  -4.1243999e+02   1.0972433e+03  -4.2184793e+02
  -2.9896513e+02  -5.4769087e+01   4.5207472e+02  -2.1832809e+02
- columns 9 to 12 of the matrix C2:

   5.9442740e+01   2.4762670e+01   9.6358696e+00  -8.6156343e+00
  -6.5347562e+01   3.2737633e+01   4.5785273e+01   1.1937757e+01
   4.8219628e+02  -1.0064212e+02  -1.8179770e+02  -9.2227840e+01
  -1.1242647e+03   2.7931117e+02   4.3287249e+02   2.4895354e+02
   9.3382078e+02  -2.5810951e+02  -3.7001917e+02  -2.9896513e+02
   1.3416656e+03  -2.1640212e+02  -4.1243999e+02  -5.4769087e+01
  -3.1661606e+03   6.3318913e+02   1.0972433e+03   4.5207472e+02
   1.0909808e+03  -2.5467525e+02  -4.2184793e+02  -2.1832809e+02
   1.2906793e+03  -2.6413625e+02  -4.4901402e+02  -1.7266860e+02
  -2.6413625e+02   1.3032741e+02   7.7813382e+01   6.5401511e+01
  -4.4901402e+02   7.7813382e+01   2.0480385e+02   7.4193135e+01
  -1.7266860e+02   6.5401511e+01   7.4193135e+01   9.0528580e+01
- columns 1 to 4 of the matrix C3:

   7.6923077e+10   3.5502959e+10   1.6385981e+10   7.5627604e+09
   3.5502959e+10   2.0937642e+10   1.1764294e+10   6.3992588e+09
   1.6385981e+10   1.1764294e+10   7.6381725e+09   4.6689231e+09
   7.5627604e+09   6.3992588e+09   4.6689231e+09   3.1346717e+09
   3.4905048e+09   3.4010047e+09   2.7400807e+09   1.9931283e+09
   1.6110022e+09   1.7762332e+09   1.5612211e+09   1.2189181e+09
   7.4353947e+08   9.1512559e+08   8.6966246e+08   7.2377237e+08
   3.4317184e+08   4.6636302e+08   4.7583767e+08   4.1987082e+08
   1.5838253e+08   2.3556281e+08   2.5659013e+08   2.3899290e+08
   7.3010173e+07   1.1820512e+08   1.3689702e+08   1.3404524e+08
   3.1907869e+07   5.8434212e+07   7.4301607e+07   7.8876881e+07
  -2.1055961e+07  -3.3658883e+07  -3.9388215e+07  -3.9302344e+07
- columns 5 to 8 of the matrix C3:

   3.4905048e+09   1.6110022e+09   7.4353947e+08   3.4317184e+08
   3.4010047e+09   1.7762332e+09   9.1512559e+08   4.6636302e+08
   2.7400807e+09   1.5612211e+09   8.6966246e+08   4.7583767e+08
   1.9931283e+09   1.2189181e+09   7.2377237e+08   4.1987082e+08
   1.3575536e+09   8.8250564e+08   5.5383334e+08   3.3805229e+08
   8.8250564e+08   6.0576622e+08   3.9942974e+08   2.5517254e+08
   5.5383334e+08   3.9942974e+08   2.7550985e+08   1.8348065e+08
   3.3805229e+08   2.5517254e+08   1.8348065e+08   1.2698581e+08
   2.0171595e+08   1.5885399e+08   1.1874888e+08   8.5201704e+07
   1.1806298e+08   9.6486512e+07   7.4506549e+07   5.5014582e+07
   7.4787171e+07   6.5505299e+07   5.4060107e+07   4.2560333e+07
  -3.5445851e+07  -2.9762013e+07  -2.3675810e+07  -1.8044559e+07
- columns 9 to 12 of the matrix C3:

   1.5838253e+08   7.3010173e+07   3.1907869e+07  -2.1055961e+07
   2.3556281e+08   1.1820512e+08   5.8434212e+07  -3.3658883e+07
   2.5659013e+08   1.3689702e+08   7.4301607e+07  -3.9388215e+07
   2.3899290e+08   1.3404524e+08   7.8876881e+07  -3.9302344e+07
   2.0171595e+08   1.1806298e+08   7.4787171e+07  -3.5445851e+07
   1.5885399e+08   9.6486512e+07   6.5505299e+07  -2.9762013e+07
   1.1874888e+08   7.4506549e+07   5.4060107e+07  -2.3675810e+07
   8.5201704e+07   5.5014582e+07   4.2560333e+07  -1.8044559e+07
   5.9198495e+07   3.9143693e+07   3.2441978e+07  -1.3332763e+07
   3.9143693e+07   2.6454096e+07   2.2554175e+07  -9.1344475e+06
   3.2441978e+07   2.2554175e+07   2.2706317e+07  -8.6560406e+06
  -1.3332763e+07  -9.1344475e+06  -8.6560406e+06   3.3733471e+06
B Target functions for the shape reconstruction problem

The target functions used in the shape reconstruction test case are given by the general Bézier curve formulation:

F0(t) = Σ_{k=0}^{n0-1} B_k^{n0-1}(t) x0,k+1
Two functions F01(t) and F02(t) are considered in this study:
- the function F01(t) is given by:

F01(t) = Σ_{k=0}^{13} B_k^{13}(t) x0,k+1

where the vector of the control points X01 is:

X01 = (0.1, 0.5, 0.1, 0.1, 0.1, 0.1, 1.5, 0.1, 0.1, 0.1, 0.1, 0.1, 1, 0.1)^T
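As a minimal illustration of the formula above (a sketch, not the report's code), the Bernstein form of F01 can be evaluated directly from the control points X01:

```python
from math import comb

def bezier(t, ctrl):
    """Evaluate a Bézier curve in Bernstein form:
    F(t) = sum_k C(n,k) t^k (1-t)^(n-k) x_{k+1}, degree n = len(ctrl) - 1."""
    n = len(ctrl) - 1
    return sum(comb(n, k) * t**k * (1.0 - t)**(n - k) * ctrl[k]
               for k in range(n + 1))

X01 = [0.1, 0.5, 0.1, 0.1, 0.1, 0.1, 1.5, 0.1, 0.1, 0.1, 0.1, 0.1, 1.0, 0.1]
print(bezier(0.0, X01))   # 0.1: the curve interpolates the first control point
print(bezier(1.0, X01))   # 0.1: and the last one
```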
- the function F02(t) is given by:

F02(t) = Σ_{k=0}^{17} B_k^{17}(t) x0,k+1

where the vector of the control points X02 is:
Contents

1 Introduction
2 Survey of optimization methods
3 Multi-directional-Search Algorithm
4 Multilevel optimization
5 Applications
  5.1 First test case: analytical function
    5.1.1 Single level optimization
    5.1.2 Multilevel optimization
    5.1.3 The effect of the condition number
  5.2 Second test case: shape reconstruction problem
    5.2.1 Test-case description
    5.2.2 Experimentation and results
6 Conclusion
7 Acknowledgments
Appendices
A Analytical function for testing the multilevel algorithm