Abstract
In general, biologically inspired multi-objective optimization algorithms have several parameters whose values must be selected before the algorithm is run. In this paper we describe a global sensitivity analysis framework that enables a better understanding of the effects of these parameters on algorithm performance. We tested NSGA-III and MOEA/D on multi-objective optimization testbeds, applying the proposed sensitivity analysis techniques to the relevant performance metrics, namely Generational Distance, Inverted Generational Distance, and Hypervolume. Experimental results show that both algorithms are most sensitive to the cardinality of the population. In all analyses, two clusters of parameters usually appear: (1) the population size (Pop) and (2) the Crossover Distribution Index, Crossover Probability, Mutation Distribution Index, and Mutation Probability; the first cluster, Pop, is the most important (most sensitive) parameter relative to all the others. Choosing the correct population size for the tested algorithms has a significant impact on solution accuracy and algorithm performance. The importance of the population in an evolutionary algorithm was already known, but its importance relative to the remaining parameters was not. The distance between the two clusters shows how crucial the population size is compared to the other parameters. Detailed analysis clearly reveals a hierarchy of parameters: on one side the population size, on the other the remaining parameters, which are always grouped together in a single cluster without any significant distinction among them. In fact, the other parameters all share the same secondary relevance for the performance of the algorithms, something that, to date, has not been observed in the evolutionary algorithm literature. The methodology designed in this paper can be adopted to evaluate the importance of the parameters of any algorithm.
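For reference, the two distance-based metrics named above are commonly defined as follows (standard formulations from the performance-assessment literature; the paper's exact normalizations may differ):

```latex
% Common definitions; normalization conventions vary between papers.
\mathrm{GD}(A)  = \frac{1}{|A|}\left(\sum_{a \in A} d(a, R)^{2}\right)^{1/2},
\qquad
\mathrm{IGD}(A) = \frac{1}{|R|}\sum_{r \in R} \min_{a \in A} \lVert r - a \rVert ,
```

where A is the approximation set produced by the algorithm, R is a reference set sampled from the true Pareto front, and d(a, R) is the Euclidean distance from a to its nearest neighbour in R. Hypervolume, in turn, measures the volume of objective space dominated by A with respect to a reference point.

To illustrate how a parameter-screening study of this kind can be set up, below is a minimal sketch using the Morris elementary-effects method as implemented in the SALib Python library. The parameter ranges and the evaluation function `run_nsga3_hypervolume` are hypothetical placeholders, not the paper's actual experimental configuration:

```python
import numpy as np
from SALib.sample import morris as morris_sample
from SALib.analyze import morris as morris_analyze

# The five parameters studied in the paper, with illustrative (assumed) ranges.
problem = {
    "num_vars": 5,
    "names": ["Pop", "CrossoverDistIdx", "CrossoverProb",
              "MutationDistIdx", "MutationProb"],
    "bounds": [[20, 200], [5, 50], [0.5, 1.0], [5, 50], [0.01, 0.3]],
}

def run_nsga3_hypervolume(x):
    # Hypothetical placeholder: in a real study this would run NSGA-III with
    # parameter vector x on a benchmark problem and return the achieved
    # Hypervolume. Replace with an actual optimizer run.
    pop, cdi, cp, mdi, mp = x
    return 0.0

# Morris trajectories: each trajectory perturbs one factor at a time,
# yielding (num_vars + 1) * N model evaluations in total.
X = morris_sample.sample(problem, N=50, num_levels=4)
Y = np.array([run_nsga3_hypervolume(x) for x in X])

# mu* ranks overall parameter importance; sigma flags interactions/nonlinearity.
Si = morris_analyze.analyze(problem, X, Y, num_levels=4)
for name, mu_star, sigma in zip(problem["names"], Si["mu_star"], Si["sigma"]):
    print(f"{name}: mu* = {mu_star:.3f}, sigma = {sigma:.3f}")
```

In such a screening, a clear gap between the mu* value of Pop and those of the other four parameters would correspond to the two-cluster structure described above.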