Contents
1 Introduction 1
1.1 Objective and methodology . . . . 3
1.2 Main contributions . . . . 4
1.3 Publications . . . . 5
1.3.1 Main contributed publications for the thesis . . 5
1.3.2 Related contributions on automatic algorithm configuration . . . . 6
1.3.3 Additional contributions . . . . 7
1.4 Structure of the thesis . . . . 8
2 Background 11
2.1 Optimization . . . . 11
2.1.1 Discrete optimization problems . . . . 11
2.1.2 Continuous optimization problems . . . . 13
2.2 Metaheuristic algorithms . . . . 14
2.2.1 Metaheuristics for discrete optimization problems . . . . 16
2.2.2 Metaheuristics for continuous optimization problems . . . . 23
2.3 The algorithm configuration problem . . . . 27
2.4 Types of parameters . . . . 30
2.5 Summary . . . . 31
3 Configuration algorithms 33
3.1 Evaluation method: Evaluation budget allocator for ranking and selection . . . . 34
3.1.1 Repeated evaluation . . . . 35
3.1.2 F-Race . . . . 35
3.2 Search method: Black-box optimizers . . . . 39
3.2.1 Searching the numerical parameter space . . . . 40
3.2.2 Searching the categorical parameter space . . . 41
3.3 Combining evaluation method and search method . . . 42
3.3.1 Iterated selection . . . . 42
3.3.2 Post-selection . . . . 43
3.4 Related works . . . . 44
3.4.1 Offline configuration versus online adaptation . . . . 44
3.4.2 Offline configuration algorithms . . . . 46
3.4.3 Applications of F-Race . . . . 51
3.5 Summary . . . . 54
4 Iterated selection using F-Race: Iterated F-Race and beyond 57
4.1 Previous strategies for generating configurations for F-Race . . . . 58
4.1.1 Full factorial design . . . . 58
4.1.2 Random sampling design . . . . 58
4.2 Iterated F-Race . . . . 59
4.2.1 The framework of iterated F-Race . . . . 59
4.2.2 I/F-Race: An example iterated F-Race algorithm . . . . 62
4.3 Case studies: I/F-Race for numerical parameters . . . . 64
4.3.1 Comparing I/F-Race with previous sampling mechanisms for F-Race . . . . 64
4.3.2 Fixed instance order and common random seed . . . . 65
4.4 Case studies: I/F-Race for categorical and conditional parameters . . . . 68
4.4.1 Case study 1, MMAS under four parameters . . . . 69
4.4.2 Case study 2, MMAS under seven parameters . . . . 70
4.4.3 Case study 3, ACOTSP under twelve parameters . . . . 71
4.5 MADS/F-Race: Hybridizing F-Race with MADS . . . . 73
4.5.1 Mesh Adaptive Direct Search . . . . 74
4.5.2 MADS/F-Race . . . . 75
4.5.3 Case study: MADS/F-Race vs. MADS(fixed) . . 76
4.5.4 Incumbent protection mechanism . . . . 82
4.6 Summary . . . . 83
5 Continuous optimizers for tuning numerical parameters 85
5.1 Tuning algorithms . . . . 86
5.1.1 Search methods . . . . 86
5.1.2 Evaluation methods . . . . 88
5.1.3 Combining search methods and evaluation methods . . . . 88
5.2 Benchmark tuning problems . . . . 89
5.2.1 MAX–MIN Ant System – Traveling Salesman Problem . . . . 89
5.2.2 Particle swarm optimization – Rastrigin functions . . . . 92
5.3 Experiments . . . . 95
5.3.1 Experimental setup . . . . 95
5.3.2 The settings of search algorithms . . . . 96
5.3.3 The settings of stochasticity handling in restart . . . . 100
5.3.4 Comparisons of search performance of all tuning algorithms . . . . 101
5.3.5 Further comparisons . . . . 105
5.4 Parameter landscape analysis . . . . 110
5.4.1 The parameter landscape of two parameters . . 110
5.4.2 The parameter landscape of all case studies . . 112
5.5 Summary . . . . 115
6 Post-selection mechanism for handling stochasticity in automatic configuration 117
6.1 Configuration algorithms . . . . 118
6.1.1 Black-box search methods . . . . 118
6.1.2 Evaluation method . . . . 119
6.1.3 Combination of search and evaluation . . . . 120
6.2 Basic settings of post-selection . . . . 121
6.2.1 Experimental setup . . . . 122
6.2.2 Repeated evaluation, iterated selection, and post-selection . . . . 124
6.2.3 Basic settings of post-selection . . . . 127
6.2.4 Post-selection vs. I/F-Race . . . . 130
6.2.5 Post-selection in ParamILS . . . . 131
6.2.6 Summary and outlook . . . . 133
6.3 Advanced settings of post-selection . . . . 134
6.3.1 Post-selection Nelder-Mead Simplex configurators . . . . 135
6.3.2 Post-selection BOBYQA settings . . . . 138
6.3.3 Post-selection CMA-ES settings . . . . 140
6.3.4 Population variation strategy for CMA-ES configurators . . . . 141
6.3.5 Comparison to existing automatic configuration software . . . . 142
6.3.6 Comparison of the best configurators . . . . 146
6.4 Comparisons on further tuning benchmarks . . . . 148
6.4.1 Configuring robust tabu search . . . . 148
6.4.2 Configuring MAX–MIN Ant System for QAP . . . . 149
6.4.3 Configuring iCMAES-ILS . . . . 152
6.5 Summary . . . . 152
7 Conclusions and future work 155
7.1 Conclusions . . . . 155
7.2 Future work . . . . 158
Appendices 161
A Automatically Tuned Iterated Greedy Algorithms for a Real-world Cyclic Train Scheduling Problem 163
A.1 Introduction . . . . 163
A.2 The freight train scheduling problem . . . . 164
A.2.1 Problem setting . . . . 165
A.2.2 Formulation of the problem . . . . 166
A.2.3 Benchmark instances . . . . 168
A.3 Greedy algorithms . . . . 168
A.3.1 The g-CVSPTW heuristic . . . . 169
A.3.2 Modified greedy heuristic . . . . 170
A.3.3 Locomotive type exchange heuristic . . . . 170
A.4 Iterated greedy algorithms . . . . 171
A.5 Experimental results for greedy and iterated greedy . . 172
A.5.1 Experimental setup . . . . 172
A.5.2 Experimental results . . . . 173
A.6 Iterated Ants . . . . 175
A.7 Discussion . . . . 176
A.8 Conclusions . . . . 178
B ScaLa: Scaling large instances 181
B.1 Experimental setup . . . . 182
B.2 Measuring instance similarity . . . . 183
B.3 Finding similarities between large and small instances . . . . 183
B.4 Solving large instances by tuning on small instances . . . . 184
B.5 Summary . . . . 186
C Offline Configuration Meets Online Adaptation: An Experimental Investigation in Operator Selection 189
C.1 Introduction . . . . 189
C.2 Related works . . . . 191
C.3 Target problem . . . . 193
C.4 Target algorithm . . . . 193
C.5 Operator selection strategies . . . . 194
C.5.1 Static operator strategy . . . . 194
C.5.2 Mixed operator strategy . . . . 195
C.5.3 Adaptive operator selection . . . . 195
C.6 Experimental setup . . . . 198
C.6.1 Instance setup . . . . 198
C.6.2 Target algorithm setup . . . . 199
C.6.3 Offline configuration setup . . . . 200
C.7 Experimental results . . . . 202
C.7.1 Result presentation . . . . 202
C.7.2 Static operator strategy . . . . 203
C.7.3 Online adaptive operator selection . . . . 204
C.7.4 Analysis on the effectiveness of online adaptation . . . . 211
C.7.5 Mixed operator strategy . . . . 213
C.7.6 Combining MOS and AOS . . . . 216
C.8 Conclusions and future work . . . . 217
D Automatic configuration of MIP solver: Case study in vertical flight planning 219
D.1 Introduction . . . . 219
D.2 Vertical flight planning . . . . 220
D.2.1 Mixed integer linear programming model . . . . 220
D.2.2 Problem instance . . . . 222
D.3 Performance variability of MIP solver . . . . 222
D.4 Automatic configuration of MIP solver . . . . 225
D.4.1 Automatic solver configuration . . . . 225
D.4.2 Formulation comparison with default setting . . 226
D.4.3 Automatically tuned setting versus default setting . . . . 227
D.4.4 Formulation comparison with tuned setting . . . . 228
D.4.5 Further analysis on the tuned configuration . . 228
D.5 Conclusions . . . . 230
Bibliography 231