
Scriptless Testing at the GUI Level in an Industrial Setting

  • Conference paper
  • First Online:
Research Challenges in Information Science (RCIS 2020)

Abstract

TESTAR is a traversal-based, scriptless tool for test automation at the Graphical User Interface (GUI) level. It differs from existing test approaches in that no test cases need to be defined before testing; instead, the tests are generated on-the-fly during execution. This paper presents an empirical case study in a realistic industrial context in which we compare TESTAR to a manual test approach for a web-based application in the rail sector. Both qualitative and quantitative research methods are used to investigate learnability, effectiveness, efficiency, and satisfaction. The results show that TESTAR was able to detect more faults and achieve higher functional test coverage than the manual test approach used. As far as efficiency is concerned, the preparation time of both test approaches is identical, but TESTAR can carry out test execution without the use of human resources. Finally, TESTAR turns out to be a learnable test approach. As a result of the study described in this paper, the TESTAR technology was successfully transferred, and the company will use both test approaches in a complementary way in the future.
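The scriptless, on-the-fly approach the abstract describes can be sketched as a random traversal over the actions a GUI currently offers, with an implicit oracle (e.g. "the application must not crash") checked after every step. The sketch below is purely illustrative: the GUI model, state names, actions, and the `scriptless_test` function are hypothetical stand-ins for what a tool like TESTAR derives from the live GUI at run time, not its actual implementation.

```python
import random

# Hypothetical, hard-coded model of a small GUI: each state exposes a
# set of actions, and executing an action yields the next state. A real
# scriptless tool discovers this information from the running GUI.
GUI_MODEL = {
    "login":  {"type_credentials": "home", "help": "help"},
    "home":   {"open_report": "report", "logout": "login", "help": "help"},
    "help":   {"close": "home"},
    "report": {"back": "home", "export": "crash"},  # injected fault
}

def scriptless_test(start="login", max_actions=50, seed=0):
    """Traverse the GUI by selecting actions on the fly (no predefined
    test cases). Return the action sequence that triggered a failure,
    or None if no failure was observed within the action budget."""
    rng = random.Random(seed)
    state, trace = start, []
    for _ in range(max_actions):
        # Random action selection over the actions available right now.
        action = rng.choice(sorted(GUI_MODEL[state]))
        trace.append(action)
        state = GUI_MODEL[state][action]
        if state == "crash":  # implicit oracle: the app must not crash
            return trace
    return None

failing = scriptless_test()
if failing:
    print("fault reproduced by:", " -> ".join(failing))
```

Because the test sequence is generated during execution rather than scripted in advance, there is no test-case maintenance when the GUI changes; only the action-derivation and oracle logic need to keep working, which is the trade-off the paper's comparison with manual testing examines.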


Notes

  1. www.clavei.es/.

  2. www.primefaces.org.

  3. JSF (JavaServer Faces) is a Java-based component framework for building GUIs.

  4. https://www.tibco.com/products/tibco-enterprise-message-service.

  5. www.testar.org.


Acknowledgments

We would like to acknowledge the help of the case study companies. They have been supporting our research, answering our questions, and validating our analyses. We would also like to thank the universities involved. This work was partially funded by the ITEA3 TESTOMAT project and the H2020 DECODER project.

Author information


Corresponding author

Correspondence to Tanja E. J. Vos.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Chahim, H., Duran, M., Vos, T.E.J., Aho, P., Condori Fernandez, N. (2020). Scriptless Testing at the GUI Level in an Industrial Setting. In: Dalpiaz, F., Zdravkovic, J., Loucopoulos, P. (eds) Research Challenges in Information Science. RCIS 2020. Lecture Notes in Business Information Processing, vol 385. Springer, Cham. https://doi.org/10.1007/978-3-030-50316-1_16


  • DOI: https://doi.org/10.1007/978-3-030-50316-1_16

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-50315-4

  • Online ISBN: 978-3-030-50316-1

  • eBook Packages: Computer Science, Computer Science (R0)
