Meta-Model Based EA for Complex Optimization

Evolutionary algorithms are population-based, stochastic search techniques widely used as efficient global optimizers. However, many real-life optimization problems require finding optimal solutions to complex, high-dimensional, multimodal problems involving computationally very expensive fitness function evaluations. The use of evolutionary algorithms in such problem domains is thus practically prohibitive. An attractive alternative is to build meta-models, i.e., approximations of the actual fitness functions, and evaluate those instead. These meta-models are orders of magnitude cheaper to evaluate than the actual function. Many regression and interpolation tools are available for building such meta-models. This paper briefly discusses the architectures and use of such meta-modeling tools in an evolutionary optimization context. We further present two evolutionary algorithm frameworks that use meta-models for fitness function evaluation. The first framework, the Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model [14], reduces computation time through controlled use of meta-models (in this case an approximate model generated by support vector machine regression) to partially replace actual function evaluations with approximate ones. However, the underlying assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model, which does not account for uncertain scenarios involving noisy fitness functions. The second model, DAFHEA-II, an enhanced version of the original DAFHEA framework, incorporates a multiple-model based learning approach for the support vector machine approximator to handle noisy functions [15]. Empirical results obtained by evaluating the frameworks on several benchmark functions demonstrate their efficiency.
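To make the controlled-use idea concrete, the following is a minimal, illustrative sketch (not the DAFHEA algorithm itself) of a surrogate-assisted EA: offspring are oversampled, pre-screened with a cheap meta-model, and only the most promising candidates receive true (expensive) evaluations. For self-containment, a simple inverse-distance-weighted predictor stands in for the SVM regression meta-model, and the names `sphere`, `surrogate`, and `surrogate_assisted_ea` are illustrative, not from the paper.

```python
import math
import random

def sphere(x):
    """Stand-in for an expensive black-box fitness: sum of squares, minimum 0."""
    return sum(xi * xi for xi in x)

def surrogate(x, samples):
    """Inverse-distance-weighted prediction from evaluated (point, fitness) pairs.
    Used here as a cheap stand-in for an SVM regression meta-model."""
    num = den = 0.0
    for p, f in samples:
        d = math.dist(x, p)
        if d < 1e-12:               # exact hit: return the known fitness
            return f
        w = 1.0 / (d * d)
        num += w * f
        den += w
    return num / den

def surrogate_assisted_ea(dim=5, pop=20, gens=30, oversample=4, seed=1):
    """Toy (mu+lambda) EA: offspring are pre-screened by the meta-model so
    only the most promising ones receive true (expensive) evaluations."""
    rng = random.Random(seed)
    init = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(pop)]
    archive = [(p, sphere(p)) for p in init]        # true evaluations
    init_best = min(f for _, f in archive)
    for _ in range(gens):
        parents = sorted(archive, key=lambda pf: pf[1])[:pop]
        # oversample offspring by Gaussian mutation ...
        cands = [[xi + rng.gauss(0.0, 0.3) for xi in p]
                 for p, _ in parents for _ in range(oversample)]
        # ... but rank them with the cheap surrogate first
        cands.sort(key=lambda c: surrogate(c, parents))
        # only the screened top `pop` candidates are truly evaluated
        archive = parents + [(c, sphere(c)) for c in cands[:pop]]
    return init_best, min(f for _, f in archive)
```

Because the sorted parents are carried into the next archive (elitism), the best truly-evaluated fitness never worsens, while the expensive function is called on only a quarter of the generated offspring.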

References:
[1] A. Ratle, "Accelerating the convergence of evolutionary algorithms by
fitness landscape approximation", Parallel Problem Solving from
Nature-PPSN V, Springer-Verlag, pp. 87-96, 1998.
[2] A. Smola and B. Schölkopf, "A Tutorial on Support Vector Regression",
NeuroCOLT Technical Report NC-TR-98-030, Royal Holloway
College, University of London, UK, 1998.
[3] B. Dunham, D. Fridshal, R. Fridshal and J. North, "Design by natural
selection", Synthese, 15, pp. 254-259, 1963.
[4] B. Schölkopf, C. Burges and A. Smola, eds., "Advances in Kernel
Methods: Support Vector Learning", MIT Press, 1999.
[5] C. Bishop, "Neural Networks for Pattern Recognition", Oxford Press,
1995.
[6] D. B├╝che., N. Schraudolph, and P. Koumoutsakos, "Accelerating
Evolutionary Algorithms Using Fitness Function Models", Proc.
Workshops Genetic and Evolutionary Computation Conference,
Chicago, 2003.
[7] H. D. Vekeria and I. C. Parmee, "The use of a co-operative multi-level
CHC GA for structural shape optimization", Fourth European Congress
on Intelligent Techniques and Soft Computing - EUFIT-96, 1996.
[8] H. S. Kim and S. B. Cho, "An efficient genetic algorithm with less
fitness evaluation by clustering", Proceedings of IEEE Congress on
Evolutionary Computation, pp. 887-894, 2001.
[9] J. Sacks, W. Welch, T. Mitchell and H. Wynn, "Design and analysis of
computer experiments", Statistical Science, 4(4), 1989.
[10] K. Rasheed, "An Incremental-Approximate-Clustering Approach for
Developing Dynamic Reduced Models for Design Optimization",
Proceedings of IEEE Congress on Evolutionary Computation, 2000.
[11] K. Rasheed, S. Vattam and X. Ni, "Comparison of Methods for Using
Reduced Models to Speed Up Design Optimization", The Genetic and
Evolutionary Computation Conference (GECCO'2002), 2002.
[12] K. Won, T. Roy and K. Tai, "A Framework for Optimization Using
Approximate Functions", Proceedings of the IEEE Congress on
Evolutionary Computation- 2003, Vol.3, IEEE Catalogue No.
03TH8674C, ISBN 0-7803-7805-9.
[13] M. A. El-Beltagy and A. J. Keane, "Evolutionary optimization for
computationally expensive problems using Gaussian processes", Proc.
Int. Conf. on Artificial Intelligence (IC-AI'2001), CSREA Press, Las
Vegas, pp. 708-714, 2001.
[14] M. Bhattacharya and G. Lu, "DAFHEA: A Dynamic Approximate
Fitness based Hybrid Evolutionary Algorithm", Proceedings of the IEEE
Congress on Evolutionary Computation- 2003, Vol.3, IEEE Catalogue
No. 03TH8674C, ISBN 0-7803-7805-9, pp. 1879-1886.
[15] M. Bhattacharya, "Surrogate based Evolutionary Algorithm for
Engineering Design Optimization", Proceedings of the Eighth
International Conference on Cybernetics, Informatics and Systemics
(ICCIS 2005), ISBN 975-98458-9-X, pp. 52-57.
[16] P. Hajela and A. Lee, "Topological optimization of rotorcraft subfloor
structures for crashworthiness considerations", Computers and
Structures, vol.64, pp. 65-76, 1997.
[17] R. Myers and D. Montgomery, "Response Surface Methodology", John
Wiley & Sons, 1985.
[18] S. Pierret, "Three-dimensional blade design by means of an artificial
neural network and Navier-Stokes solver", Proceedings of Fifth
Conference on Parallel Problem Solving from Nature, Amsterdam, 1999.
[19] S. R. Gunn, "Support Vector Machines for Classification and
Regression", Technical Report, School of Electronics and Computer
Science, University of Southampton, (Southampton, U.K.), 1998.
[20] T. Hastie, R. Tibshirani, J. Friedman, "The Elements of Statistical
Learning: Data Mining, Inference, and Prediction", Springer Series in
Statistics, ISBN 0-387-95284-5.
[21] V. Cherkassky and Y. Ma, "Multiple Model Estimation: A New
Formulation for Predictive Learning", under review in IEEE Transactions
on Neural Networks.
[22] V. Torczon and M. W. Trosset, "Using approximations to accelerate
engineering design optimisation", ICASE Report No. 98-33. Technical
report, NASA Langley Research Center Hampton, VA 23681-2199,
1998.
[23] V. V. Toropov, A. A. Filatov and A. A. Polykin, "Multiparameter
structural optimization using FEM and multipoint explicit
approximations", Structural Optimization, vol. 6, pp. 7-14, 1993.
[24] V. Vapnik, "The Nature of Statistical Learning Theory", Springer-
Verlag, NY, USA, 1999.
[25] Y. Jin, M. Olhofer and B. Sendhoff, "A Framework for Evolutionary
Optimization with Approximate Fitness Functions", IEEE Transactions
on Evolutionary Computation, 6(5), pp. 481-494, 2002 (ISSN: 1089-778X).
[26] Y. Jin, M. Olhofer and B. Sendhoff, "On Evolutionary Optimisation
with Approximate Fitness Functions", Proceedings of the Genetic and
Evolutionary Computation Conference GECCO, Las Vegas, Nevada,
USA. pp. 786- 793, July 10-12, 2000.
[27] Y. Jin, "A comprehensive survey of fitness approximation in
evolutionary computation", Soft Computing Journal, 2003 (in press).