


Link to original content: https://doi.org/10.1007/978-3-030-30923-7_12

Conformance Testing of Schedulers for DSL-based Model Checking

  • Conference paper
  • In: Model Checking Software (SPIN 2019)
  • Part of the book series: Lecture Notes in Computer Science (LNCS, volume 11636)

Abstract

When we verify concurrent systems executed under a real operating system (OS), we must take the OS's scheduling policy into account. However, for a specific OS implementation, a description of the scheduling policy may not exist, or may be too unclear to capture the behaviors of the real scheduler. In such cases, we need to make assumptions in the specification ourselves. Checking the correctness of the specification of the scheduling policy is therefore important, because it affects the verification result. In this paper, we propose a method that uses testing techniques to validate the correspondence between the specification of the scheduling policy and the implementation of the scheduler; the overall approach can be regarded as conformance testing. With it, we can find inconsistencies between the implementation and the specification, which indicate that the specification is incorrect. To support testing, we propose a domain-specific language (DSL) for specifying test generation together with the scheduling policy. A search algorithm determines the executions of the processes, and the tests are generated automatically and exhaustively by applying model-based testing (MBT) techniques. Based on this method, we develop a tool for generating the tests and demonstrate it on the Linux FIFO scheduling policy. The experiments show that our method facilitates test generation and makes it easy to check the specification of the scheduling policy.
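The core idea in the abstract can be sketched in a few lines: a specification of the scheduling policy predicts an execution trace, and conformance testing compares that prediction against what the real scheduler actually does. The sketch below is a toy model of this idea only; the names `Task`, `fifo_schedule`, and `conforms` are illustrative assumptions, not the authors' tool, and the FIFO model is simplified (all tasks ready at the start, run-to-completion, ties broken by arrival order as in Linux SCHED_FIFO).

```python
# Hypothetical sketch of the conformance-testing idea: a policy
# specification (a toy FIFO model) predicts an execution trace, and an
# observed trace from the real scheduler is checked against it.
from collections import deque
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    priority: int   # higher value = higher priority, as in SCHED_FIFO
    actions: int    # each action takes one time unit

def fifo_schedule(tasks):
    """Predict the execution trace under a simplified FIFO policy:
    among ready tasks, the highest-priority one runs to completion;
    ties are broken by arrival order."""
    ready = deque(tasks)   # tasks listed in arrival order
    trace = []
    while ready:
        # max() returns the first maximum, so deque order breaks ties FIFO
        current = max(ready, key=lambda t: t.priority)
        ready.remove(current)
        trace.extend([current.name] * current.actions)
    return trace

def conforms(spec_trace, observed_trace):
    """Conformance check: the observed execution must match the trace
    predicted by the policy specification."""
    return spec_trace == observed_trace
```

For example, with tasks A (priority 1, 2 actions), B (priority 2, 1 action), and C (priority 1, 1 action) arriving in that order, the specification predicts the trace B, A, A, C; an observed trace that deviates from this reveals an inconsistency between specification and implementation.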



Notes

  1. SchDSL stands for ‘Scheduling DSL’.

  2. https://sites.google.com/site/trannhathoa/home/sspinja (accessed: 1-Jun-2019).

  3. We deal with the behaviors of the scheduler based on handling scheduling events.

  4. We consider that an action of the process takes one time unit.
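Notes 3 and 4 suggest an event-driven view of the scheduler: it reacts to scheduling events and, after each one, decides which task runs for the next time unit. A minimal sketch of this view, with assumed event names (`create`, `terminate`) and an illustrative FIFO pick function, might look like:

```python
# Minimal event-driven view of a scheduler, assuming two event kinds
# ("create" and "terminate") and a pluggable policy function. All names
# here are illustrative, not the paper's tool.
def run(scheduler_pick, events):
    """Replay a list of (kind, task) scheduling events and record which
    task the policy selects after each event."""
    ready = []
    trace = []
    for kind, task in events:
        if kind == "create":
            ready.append(task)
        elif kind == "terminate":
            ready.remove(task)
        if ready:
            trace.append(scheduler_pick(ready))
    return trace

# FIFO policy: the longest-waiting ready task runs next.
fifo_pick = lambda ready: ready[0]
```

Replaying the same event sequence against both the policy specification and the real scheduler, and comparing the resulting traces, is the conformance check in miniature.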


Author information

Corresponding author: Nhat-Hoa Tran.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Tran, N.H., Aoki, T. (2019). Conformance Testing of Schedulers for DSL-based Model Checking. In: Biondi, F., Given-Wilson, T., Legay, A. (eds.) Model Checking Software. SPIN 2019. Lecture Notes in Computer Science, vol. 11636. Springer, Cham. https://doi.org/10.1007/978-3-030-30923-7_12


  • DOI: https://doi.org/10.1007/978-3-030-30923-7_12

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-30922-0

  • Online ISBN: 978-3-030-30923-7

