Assessing Algorithm Parameter Importance Using Global Sensitivity Analysis

  • Conference paper
Analysis of Experimental Algorithms (SEA 2019)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11544)

Abstract

In general, biologically inspired multi-objective optimization algorithms comprise several parameters whose values must be selected before running the algorithm. In this paper, we describe a global sensitivity analysis framework that enables a better understanding of the effects of these parameters on algorithm performance. We tested NSGA-III and MOEA/D on multi-objective optimization testbeds, applying our proposed sensitivity analysis techniques to the relevant performance metrics, namely Generational Distance, Inverted Generational Distance, and Hypervolume. Experimental results show that both algorithms are most sensitive to the cardinality of the population. In all analyses, two clusters of parameters usually appear: (1) the population size (Pop), and (2) the Crossover Distribution Index, Crossover Probability, Mutation Distribution Index, and Mutation Probability; the first cluster, Pop, is the most important (most sensitive) parameter relative to the others. Choosing the correct population size for the tested algorithms has a significant impact on solution accuracy and algorithm performance. The importance of the population in an evolutionary algorithm was already known, but its importance relative to the remaining parameters was not. The distance between the two clusters shows how crucial the population size is compared to the other parameters. Detailed analysis clearly reveals a hierarchy of parameters: on one side the population size, on the other the remaining parameters, which are always grouped together in a single cluster with no significant distinction among them. In fact, the other parameters all have the same secondary importance for algorithm performance, something which, to date, has not been observed in the evolutionary algorithm literature. The methodology designed in this paper can be adopted to evaluate the importance of the parameters of any algorithm.
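To make the framework concrete, the sketch below shows one standard global sensitivity technique, Morris elementary-effects screening, applied to the five parameters named above. This is a minimal illustration under stated assumptions, not the authors' exact pipeline: run_ea is a hypothetical stand-in for a full NSGA-III or MOEA/D run returning a scalar performance metric (e.g. Hypervolume), and the parameter bounds are invented for illustration.

```python
import numpy as np

# Hypothetical stand-in for a full NSGA-III / MOEA/D run: maps a parameter
# vector to a scalar performance metric such as Hypervolume. In a real study
# this would execute the EA on a benchmark instance and measure the metric.
def run_ea(params):
    pop, cx_dist, cx_prob, mut_dist, mut_prob = params
    # Toy response in which population size dominates, mirroring the paper's
    # finding; the real response surface would come from actual EA runs.
    return -1.0 / pop + 0.01 * cx_prob + 0.01 * mut_prob

# Parameter names and ranges (illustrative, not the paper's design).
names  = ["Pop", "CxDistIdx", "CxProb", "MutDistIdx", "MutProb"]
bounds = np.array([[20, 500], [1, 30], [0.6, 1.0], [1, 30], [0.01, 0.2]])

def morris_mu_star(f, bounds, n_trajectories=50, delta=0.1, seed=0):
    """Simplified Morris screening: mean absolute elementary effect (mu*)
    per input, using one-at-a-time trajectories in the unit hypercube."""
    rng = np.random.default_rng(seed)
    k = len(bounds)
    scale = lambda x: bounds[:, 0] + x * (bounds[:, 1] - bounds[:, 0])
    effects = np.zeros((n_trajectories, k))
    for t in range(n_trajectories):
        x = rng.uniform(0.0, 1.0 - delta, size=k)  # random base point
        y_prev = f(scale(x))
        for i in rng.permutation(k):               # perturb inputs one at a time
            x[i] += delta
            y_next = f(scale(x))
            effects[t, i] = abs(y_next - y_prev) / delta
            y_prev = y_next
    return effects.mean(axis=0)

mu_star = morris_mu_star(run_ea, bounds)
for name, mu in sorted(zip(names, mu_star), key=lambda p: -p[1]):
    print(f"{name:>10}: mu* = {mu:.4f}")
```

Ranking parameters by mu* is what would separate a dominant Pop cluster from the remaining operator parameters; a full study would average each metric over repeated EA runs and could swap this simplified screening design for variance-based Sobol indices.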

Author information

Corresponding author

Correspondence to Giuseppe Nicosia.

Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Greco, A., Riccio, S.D., Timmis, J., Nicosia, G. (2019). Assessing Algorithm Parameter Importance Using Global Sensitivity Analysis. In: Kotsireas, I., Pardalos, P., Parsopoulos, K., Souravlias, D., Tsokas, A. (eds) Analysis of Experimental Algorithms. SEA 2019. Lecture Notes in Computer Science, vol 11544. Springer, Cham. https://doi.org/10.1007/978-3-030-34029-2_26

  • DOI: https://doi.org/10.1007/978-3-030-34029-2_26

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-34028-5

  • Online ISBN: 978-3-030-34029-2

  • eBook Packages: Computer Science, Computer Science (R0)
