Abstract
When verifying concurrent systems that run under a real operating system (OS), the scheduling policy of the OS must be taken into account. For a specific OS implementation, however, a description of the scheduling policy often does not exist, or is too unclear to capture the behavior of the real scheduler. In that case, we must make our own assumptions when writing the specification, so checking the correctness of the scheduling-policy specification is important: it affects the verification result. In this paper, we propose a method that validates the correspondence between the specification of the scheduling policy and the implementation of the scheduler using testing techniques; the overall approach can be regarded as conformance testing. It reveals inconsistencies between the implementation and the specification, and such inconsistencies indicate defects in the specification. To drive the testing, we propose a domain-specific language (DSL) for specifying test generation together with the scheduling policy, and we introduce a search algorithm that determines the executions of the processes. Tests are generated automatically and exhaustively by applying model-based testing (MBT) techniques. Based on this method, we develop a tool for generating the tests and demonstrate it on the Linux FIFO scheduling policy. The experiments show that our approach facilitates test generation and makes it easy to check the specification of the scheduling policy.
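To illustrate the kind of policy specification being checked, the following is a minimal sketch (not the paper's tool or DSL) of the expected dispatch order under Linux SCHED_FIFO as documented in sched(7): a higher static priority runs first, and processes of equal priority run in arrival (FIFO) order. Conformance testing would compare an order computed from such a specification against the order actually observed on the scheduler; the function and process names here are hypothetical.

```python
def fifo_dispatch_order(procs):
    """Expected run order under a SCHED_FIFO-like policy:
    higher static priority first; equal priorities keep arrival (FIFO) order."""
    # sorted() is stable, so arrival order is preserved within a priority level.
    return [name for name, prio in sorted(procs, key=lambda p: -p[1])]

# Ready processes as (name, static priority); list order is arrival order.
ready = [("A", 10), ("B", 20), ("C", 10), ("D", 20)]
print(fifo_dispatch_order(ready))  # ['B', 'D', 'A', 'C']
```

A conformance test would then assert that a trace recorded from the real scheduler matches this expected order; a mismatch points at an incorrect assumption in the specification rather than necessarily a scheduler bug.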
Notes
1. SchDSL stands for ‘Scheduling DSL’.
2. https://sites.google.com/site/trannhathoa/home/sspinja (accessed: 1-Jun-2019).
3. We deal with the behaviors of the scheduler based on handling scheduling events.
4. We consider that an action of the process takes one time unit.
© 2019 Springer Nature Switzerland AG
Cite this paper
Tran, NH., Aoki, T. (2019). Conformance Testing of Schedulers for DSL-based Model Checking. In: Biondi, F., Given-Wilson, T., Legay, A. (eds) Model Checking Software. SPIN 2019. Lecture Notes in Computer Science(), vol 11636. Springer, Cham. https://doi.org/10.1007/978-3-030-30923-7_12
Print ISBN: 978-3-030-30922-0
Online ISBN: 978-3-030-30923-7