Abstract
Software applications run our modern world. Software is now used to control systems with definite safety implications; it is used to collect, store and disseminate information of incredible variety; it is a vital component of communication devices, and is indispensable in almost all current entertainment and art forms. We often bemoan the fact that software-dependent systems are not more dependable, but that loses sight of the fact that software engineering is a relatively young discipline, that it has made incredible advances, and that it is often used because alternatives cannot cope as well with the complexity of modern systems. A common theme in software engineering, and in this book, is that high complexity adversely affects our ability to develop low-defect systems. Computer scientists and software engineers have evolved techniques to improve our ability to develop dependable and safe complex systems. However, there is an alternative to conquering complexity: avoid it wherever possible. As an example, Canadian nuclear regulation requires that safety systems in nuclear power plants be completely separated from the control systems in the plant, and isolated as much as possible from each other. Similar regulation is common in other countries as well. This is a special case of the general principle of separation of concerns. We propose a new principle, called the conservation of complexity, as a basis for addressing separation of concerns. This principle states that there is an inherent minimum complexity in any system, and that we cannot reduce it no matter what techniques we use. We can, of course, increase the complexity of our solution through inappropriate design. Hence, to address issues of dependability in today's critical systems, we must use separation of concerns as a way of reducing the complexity of the critical aspects of such systems.
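To make the separation concrete, the following is a minimal sketch (in Python, not taken from the chapter) of what the principle looks like in code: the safety shutdown function is a self-contained module that shares no state, data, or code with the control loop, so complexity on the control side cannot propagate into the safety side. The sensor reading, trip limit, and controller gain are hypothetical values chosen only for illustration.

    # Independent safety function: its only job is to decide trip / no trip.
    # It deliberately contains no control logic and imports nothing from it.
    TRIP_LIMIT_KPA = 12500.0  # hypothetical coolant-pressure trip limit

    def safety_trip(pressure_kpa: float) -> bool:
        """Trip (shut down) if the monitored pressure exceeds its limit."""
        return pressure_kpa > TRIP_LIMIT_KPA

    # Separate control function: it may grow arbitrarily complex without
    # affecting the safety function above, because the two share nothing.
    def control_step(setpoint_kpa: float, pressure_kpa: float) -> float:
        """Return a simple proportional valve adjustment toward the setpoint."""
        gain = 0.01  # hypothetical proportional gain
        return gain * (setpoint_kpa - pressure_kpa)

    if __name__ == "__main__":
        reading = 13000.0  # hypothetical sensor reading (kPa)
        if safety_trip(reading):
            print("SHUTDOWN")  # the safety system acts on its own
        else:
            print("valve adjustment:", control_step(12000.0, reading))

In a real plant the two functions would run on physically separate hardware; the point of the sketch is only that the safety decision depends on nothing the control system computes.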
Notes
1. The US Nuclear Regulatory Commission's Common-Cause Failure Data Base (CCFDB): http://nrcoe.inel.gov/results/index.cfm?fuseaction=CCFDB.showMenu
Acknowledgements
This work is supported by the Ontario Research Fund and the Natural Sciences and Engineering Research Council of Canada.
Copyright information
© 2012 Springer-Verlag London Limited
Cite this chapter
Wassyng, A., Lawford, M., Maibaum, T. (2012). Separating Safety and Control Systems to Reduce Complexity. In: Hinchey, M., Coyle, L. (eds) Conquering Complexity. Springer, London. https://doi.org/10.1007/978-1-4471-2297-5_4
DOI: https://doi.org/10.1007/978-1-4471-2297-5_4
Publisher Name: Springer, London
Print ISBN: 978-1-4471-2296-8
Online ISBN: 978-1-4471-2297-5