
Evaluating Maintainability Prejudices with a Large-Scale Study of Open-Source Projects

  • Conference paper
  • First Online:
Software Quality: The Complexity and Challenges of Software Engineering and Software Quality in the Cloud (SWQD 2019)

Abstract

In software engineering, relying on experience can turn maintainability expertise into prejudice over time. For example, based on their own experience, some consider JavaScript an inelegant language and hence of lowest maintainability. Such prejudice should not guide decisions without prior empirical validation.

Hence, we formulate 10 hypotheses about maintainability based on such prejudices and test them on a large set of open-source projects (6,897 GitHub repositories, 402 million lines of code, 5 programming languages). We operationalize maintainability with five static analysis metrics.

We found that JavaScript code is not worse than other code, that Java code shows higher maintainability than C# code, and that C code has longer methods than other code. The quality of interface documentation is better in Java code than in other code. Code developed by teams is not of higher maintainability, and large code bases are not of lower maintainability. Projects with high maintainability are not more popular or more often forked. Overall, most of the hypotheses are not supported by the open-source data.
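
The abstract does not spell out the five metrics or the statistical procedure, so the following is only an illustrative sketch of how such a prejudice could be checked on per-repository data: it compares a hypothetical indicator (share of methods longer than 30 source lines) between two languages with a Mann-Whitney U test. The metric, the threshold, the sample values, and the choice of test are assumptions for illustration, not the authors' actual operationalization.

```python
# Illustrative sketch only (not the paper's metrics or statistical setup):
# compare a hypothetical maintainability indicator between two languages to
# check a prejudice such as "JavaScript code is less maintainable than Java code".
from scipy.stats import mannwhitneyu

# Hypothetical per-repository values: fraction of methods longer than 30 SLOC.
long_method_share = {
    "Java":       [0.08, 0.11, 0.05, 0.09, 0.07],
    "JavaScript": [0.10, 0.06, 0.09, 0.08, 0.12],
}

# One-sided test of "JavaScript repositories have a larger share of long methods".
stat, p_value = mannwhitneyu(
    long_method_share["JavaScript"],
    long_method_share["Java"],
    alternative="greater",
)
print(f"U = {stat:.1f}, p = {p_value:.3f}")  # a large p-value would not support the prejudice
```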


Notes

  1. "Lines of code" denotes all lines in a file or method, "source lines of code" all lines while ignoring empty lines and comments; a short sketch after these notes illustrates the distinction.

  2. See https://github.com/Dan1ve/MSR17CodeQualityOnGitHub.

  3. Apart from personal experience and discussions based on blog posts such as http://live.julik.nl/2013/05/javascript-is-shit.
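
The following is a minimal, hypothetical counting helper illustrating the distinction drawn in Note 1 (it is not the tooling used in the study): blank lines and comment-only lines count towards lines of code but not towards source lines of code; block comments are ignored for simplicity.

```python
# Hypothetical helper illustrating Note 1 (not the study's tooling):
# "lines of code" counts every line, "source lines of code" skips blank
# lines and comment-only lines. Block comments are ignored for simplicity.
def loc_and_sloc(source: str, comment_prefixes=("//", "#")) -> tuple[int, int]:
    lines = source.splitlines()
    loc = len(lines)
    sloc = sum(
        1 for line in lines
        if line.strip() and not line.strip().startswith(comment_prefixes)
    )
    return loc, sloc

example = """int add(int a, int b) {
    // add the two arguments
    return a + b;
}"""
print(loc_and_sloc(example))  # (4, 3): the comment-only line counts for LOC but not for SLOC
```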


Author information

Correspondence to Tobias Roehm, Daniel Veihelmann, Stefan Wagner, or Elmar Juergens.

Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Roehm, T., Veihelmann, D., Wagner, S., Juergens, E. (2019). Evaluating Maintainability Prejudices with a Large-Scale Study of Open-Source Projects. In: Winkler, D., Biffl, S., Bergsmann, J. (eds) Software Quality: The Complexity and Challenges of Software Engineering and Software Quality in the Cloud. SWQD 2019. Lecture Notes in Business Information Processing, vol 338. Springer, Cham. https://doi.org/10.1007/978-3-030-05767-1_10

  • DOI: https://doi.org/10.1007/978-3-030-05767-1_10

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-05766-4

  • Online ISBN: 978-3-030-05767-1

  • eBook Packages: Computer Science, Computer Science (R0)
