Efficiently increasing the mutation score during model-based test suite generation
Abstract
The purpose of the method is to increase the sensitivity of an automatically generated test suite to mutations of a model. Unlike existing test-scenario generation methods, which apply the mutational approach only to assess the resulting test set, the proposed method analyzes the possibility of detecting mutations on the fly, while exploring the model’s behavior space, by adding special coverage goals. Two kinds of mutant manifestation are considered: a deviation in the behavior along a path (weak case) and a deviation in the observed output (strong case). A new algorithm is proposed for efficiently finding a path on which a mutation has an observable effect.
Problems in programming 2020; 2-3: 331-340
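To illustrate the distinction between the two cases, the sketch below runs a toy transition model and a mutated copy of it along the same input path: the mutant is weakly killed when some intermediate state on the path deviates, and strongly killed only when the deviation reaches the observed output. This is a minimal illustration under assumed semantics, not the proposed algorithm; the model, the injected mutation ('+' replaced by '-'), and all names are hypothetical.

```python
from typing import Callable, Dict, List, Tuple

State = Dict[str, int]                                   # model state: variable name -> value
Step = Callable[[State, int], Tuple[State, List[int]]]   # (state, input) -> (next state, outputs)

def run(step: Step, inputs: List[int]) -> Tuple[List[State], List[int]]:
    """Execute one path (input sequence), collecting intermediate states and outputs."""
    state: State = {"x": 0}
    states: List[State] = []
    outputs: List[int] = []
    for i in inputs:
        state, out = step(state, i)
        states.append(dict(state))
        outputs.extend(out)
    return states, outputs

def original(state: State, i: int) -> Tuple[State, List[int]]:
    x = state["x"] + i                        # original transition: accumulate the input
    return {"x": x}, ([1] if x > 10 else [])  # output is observable only when x > 10

def mutant(state: State, i: int) -> Tuple[State, List[int]]:
    x = state["x"] - i                        # injected mutation: '+' replaced by '-'
    return {"x": x}, ([1] if x > 10 else [])

def weakly_killed(inputs: List[int]) -> bool:
    """Weak case: some intermediate state along the path deviates from the original."""
    return run(original, inputs)[0] != run(mutant, inputs)[0]

def strongly_killed(inputs: List[int]) -> bool:
    """Strong case: the deviation propagates to the observed output."""
    return run(original, inputs)[1] != run(mutant, inputs)[1]

# The path [5, 6] kills the mutant both weakly and strongly; the path [0, 0]
# kills it neither way, so a generator would have to keep searching for a
# path with an observable effect of the mutation.
print(weakly_killed([5, 6]), strongly_killed([5, 6]))   # True True
print(weakly_killed([0, 0]), strongly_killed([0, 0]))   # False False
```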
DOI: https://doi.org/10.15407/pp2020.02-03.331