In computer science, robustness is the ability of a computer system to cope with errors during execution[1][2] and with erroneous input.[2] Robustness can encompass many areas of computer science, such as robust programming, robust machine learning, and Robust Security Network. Techniques such as fuzz testing are useful for showing robustness, since this type of testing involves invalid or unexpected inputs. Alternatively, fault injection can be used to test robustness. Various commercial products perform robustness testing of software.[3]
Introduction
In general, building robust systems that encompass every possible point of failure is difficult because of the vast quantity of possible inputs and input combinations.[4] Since testing all inputs and input combinations would require too much time, developers cannot run through all cases exhaustively. Instead, they try to generalize such cases.[5] For example, imagine inputting some integer values. Selected inputs might consist of a negative number, zero, and a positive number. When using these numbers to test software in this way, the developer generalizes the set of all integers into three representative cases. This method is more efficient and manageable, but more prone to failure. Generalizing test cases is just one technique for dealing with failure, specifically failure due to invalid user input. Systems may also fail for other reasons, such as disconnecting from a network.
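The integer example above can be sketched as a tiny test harness. Here `classify_sign` is a hypothetical function standing in for the software under test, and the three chosen values are representatives of the whole integer input space:

```python
def classify_sign(n: int) -> str:
    """Toy function under test: classify an integer's sign."""
    if n < 0:
        return "negative"
    if n == 0:
        return "zero"
    return "positive"

# Generalized test cases: one representative value from each
# equivalence class replaces exhaustive testing of all integers.
for value in (-7, 0, 42):
    print(value, "->", classify_sign(value))
```

The trade-off described above is visible here: the three representatives are efficient to run, but any bug that only manifests for an untested value (say, a very large integer) slips through.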
Regardless, complex systems should still handle any errors encountered gracefully. There are many examples of such successful systems. Some of the most robust systems are evolvable and can be easily adapted to new situations.[4]
Challenges
Programs and software are tools focused on a very specific task, and thus aren't generalized and flexible.[4] However, systems such as the Internet and biological systems demonstrate adaptation to their environments. One of the ways biological systems adapt is through redundancy.[4] Many organs in humans are redundant; the kidney is one example. Humans generally need only one kidney, but the second kidney allows room for failure. The same principle can be applied to software, though there are some challenges.

When applying the principle of redundancy to computer science, blindly adding code is not suggested: it introduces more errors, makes the system more complex, and renders it harder to understand.[6] Code that doesn't reinforce the existing code is unwanted. Instead, the new code must possess equivalent functionality, so that if a function breaks, another providing the same function can replace it, using manual or automated software diversity. To do so, the new code must know how and when to accommodate the failure point.[4] This means more logic must be added to the system. But as a system gains logic and components and grows in size, it becomes more complex. Thus, making a system more redundant also makes it more complex, and developers must consider balancing redundancy with complexity.
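The idea of redundancy through functionally equivalent code can be illustrated with a minimal sketch; the function names and the trivial summing task are hypothetical, chosen only to make the fallback logic visible:

```python
def sum_iterative(xs):
    """Primary implementation: explicit loop."""
    total = 0
    for x in xs:
        total += x
    return total

def sum_builtin(xs):
    """Diverse, functionally equivalent implementation."""
    return sum(xs)

def robust_sum(xs):
    """If the primary implementation fails, a redundant
    implementation providing the same function takes over."""
    try:
        return sum_iterative(xs)
    except Exception:
        return sum_builtin(xs)
```

Even in this toy case the cost appears: the extra implementation plus the logic deciding when to fall back is exactly the redundancy-versus-complexity trade-off described above.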
Currently, computer science practices do not focus on building robust systems.[4] Rather, they tend to focus on scalability and efficiency. One of the main reasons why there is no focus on robustness today is because it is hard to do in a general way.[4]
Areas
Robust programming
Robust programming is a style of programming that focuses on handling unexpected termination and unexpected actions.[7] It requires code to handle these terminations and actions gracefully by displaying accurate and unambiguous error messages. These error messages allow the user to more easily debug the program.
Principles
Paranoia - When building software, the programmer assumes users are out to break their code.[7] The programmer also assumes that their own code may fail or work incorrectly.[7]
Stupidity - The programmer assumes users will try incorrect, bogus and malformed inputs.[7] In response, the program returns an unambiguous, intuitive error message that does not require the user to look up error codes. The message should be as accurate as possible without being misleading, so that the problem can be fixed with ease.
Dangerous implements - Users should not gain access to libraries, data structures, or pointers to data structures.[7] These should be hidden from the user so that the user doesn't accidentally modify them and introduce a bug. When such interfaces are correctly built, users use them without seeking loopholes to modify them; the interface is already correctly implemented, so the user does not need to make modifications and can focus solely on their own code.
Can't happen - Code is often modified in ways that make a supposedly 'impossible' case possible. Impossible cases are therefore better treated as highly unlikely instead.[7] The developer considers how to handle each highly unlikely case and implements the handling accordingly.
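Several of these principles can be combined in one short sketch; `parse_age` and its bounds are hypothetical, chosen only to illustrate the style:

```python
def parse_age(text: str) -> int:
    """Parse a user-supplied age string defensively."""
    # Stupidity: assume the input may be bogus or malformed.
    try:
        age = int(text)
    except ValueError:
        # An unambiguous, intuitive message, not a bare error code.
        raise ValueError(f"expected a whole number for age, got {text!r}")
    # Paranoia: distrust even syntactically valid input; a negative
    # age "can't happen", so handle it as a highly unlikely case.
    if not 0 <= age <= 150:
        raise ValueError(f"age must be between 0 and 150, got {age}")
    return age
```

Note that both failure paths report exactly what was received and what was expected, which is what lets the user debug the program easily.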
Robust machine learning
Robust machine learning typically refers to the robustness of machine learning algorithms. For a machine learning algorithm to be considered robust, either the testing error must be consistent with the training error, or the performance must remain stable after some noise is added to the dataset.[8]
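The noise-stability criterion can be sketched as a simple check. Everything here is a hypothetical illustration: `noise_stability`, the identity model, and the Gaussian noise level are arbitrary choices, not a standard robustness metric:

```python
import random

def mean_squared_error(model_fn, xs, ys):
    """Average squared prediction error of model_fn on (xs, ys)."""
    return sum((model_fn(x) - y) ** 2 for x, y in zip(xs, ys)) / len(ys)

def noise_stability(model_fn, xs, ys, noise=0.1, trials=5):
    """Compare a model's error on clean inputs with its average
    error after Gaussian noise is added to the inputs."""
    base = mean_squared_error(model_fn, xs, ys)
    noisy_errors = []
    for _ in range(trials):
        noisy_xs = [x + random.gauss(0, noise) for x in xs]
        noisy_errors.append(mean_squared_error(model_fn, noisy_xs, ys))
    return base, sum(noisy_errors) / trials
```

Under this criterion, a model whose perturbed error stays close to its clean error would count as robust; a model whose error jumps sharply under small perturbations would not.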
Robust network design
Robust network design is the study of network design in the face of variable or uncertain demands.[9] In a sense, robustness in network design is broad, just as it is in software design, because of the vast range of possible changes and inputs.
Robust algorithms
There exist algorithms that tolerate errors in the input[10] or during the computation.[11] In that case, the computation eventually converges to the correct output. This phenomenon has been called 'correctness attraction'.[11]
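The flavour of this convergence can be shown with a hypothetical sketch (not the experimental protocol of the cited study): a Newton iteration for the square root whose intermediate state is deliberately perturbed mid-computation, yet which still converges to the correct output:

```python
def perturbed_sqrt(a, perturb_at=3, delta=0.5, iterations=20):
    """Newton's method for sqrt(a), with one intermediate value
    deliberately corrupted; the remaining iterations absorb the error."""
    x = a
    for i in range(iterations):
        x = 0.5 * (x + a / x)
        if i == perturb_at:
            x += delta  # inject a perturbation during the computation
    return x
```

Because the correct square root is an attracting fixed point of the iteration, the injected error is washed out by the subsequent steps rather than propagating to the output.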
References
- ^ 'A Model-Based Approach for Robustness Testing' (PDF). Dl.ifip.org. Retrieved 2016-11-13.
- ^ a b IEEE Standard Glossary of Software Engineering Terminology, IEEE Std 610.12-1990 (1990), defines robustness as 'The degree to which a system or component can function correctly in the presence of invalid inputs or stressful environmental conditions'.
- ^ Baker, Jack W.; Schubert, Matthias; Faber, Michael H. (2008). 'On the assessment of robustness' (PDF). Structural Safety. 30: 253–267. doi:10.1016/j.strusafe.2006.11.004. Retrieved 2016-11-13.
- ^ a b c d e f g Gerald Jay Sussman (January 13, 2007). 'Building Robust Systems: an essay' (PDF). Groups.csail.mit.edu. Retrieved 2016-11-13.
- ^ Joseph, Joby (2009-09-21). 'Importance of Making Generalized Testcases - Software Testing Club - An Online Software Testing Community'. Software Testing Club. Retrieved 2016-11-13.
- ^ Agents on the wEb: Robust Software. 'Building Robust Systems an essay' (PDF). Cse.sc.edu. Retrieved 2016-11-13.
- ^ a b c d e f 'Robust Programming'. Nob.cs.ucdavis.edu. Retrieved 2016-11-13.
- ^ El Sayed Mahmoud. 'What is the definition of the robustness of a machine learning algorithm?'. ResearchGate. Retrieved 2016-11-13.
- ^ 'Robust Network Design' (PDF). Math.mit.edu. Retrieved 2016-11-13.
- ^ Carbin, Michael; Rinard, Martin C. (12 July 2010). 'Automatically identifying critical input regions and code in applications' (PDF). Proceedings of the 19th International Symposium on Software Testing and Analysis - ISSTA '10. ACM. pp. 37–48. doi:10.1145/1831708.1831713. ISBN 9781605588230.
- ^ a b Danglot, Benjamin; Preux, Philippe; Baudry, Benoit; Monperrus, Martin (21 December 2017). 'Correctness attraction: a study of stability of software behavior under runtime perturbation'. Empirical Software Engineering. 23 (4): 2086–2119. arXiv:1611.09187. doi:10.1007/s10664-017-9571-8.
The strong programme or strong sociology is a variety of the sociology of scientific knowledge (SSK) particularly associated with David Bloor,[1] Barry Barnes, Harry Collins, Donald A. MacKenzie,[2] and John Henry. The strong programme's influence on Science and Technology Studies is credited as being unparalleled (Latour 1999). The largely Edinburgh-based school of thought has illustrated how the existence of a scientific community, bound together by allegiance to a shared paradigm, is a prerequisite for normal scientific activity.
The strong programme is a reaction against 'weak' sociologies of science, which restricted the application of sociology to 'failed' or 'false' theories, such as phrenology. Failed theories would be explained by citing the researchers' biases, such as covert political or economic interests. Sociology would be only marginally relevant to successful theories, which succeeded because they had revealed a fact of nature. The strong programme proposed that both 'true' and 'false' scientific theories should be treated the same way. Both are caused by social factors or conditions, such as cultural context and self-interest. All human knowledge, as something that exists in the human cognition, must contain some social components in its formation process.
Characteristics
As formulated by David Bloor,[3] the strong programme has four indispensable components:
- Causality: it examines the conditions (psychological, social, and cultural) that bring about claims to a certain kind of knowledge.
- Impartiality: it examines successful as well as unsuccessful knowledge claims.
- Symmetry: the same types of explanations are used for successful and unsuccessful knowledge claims alike.
- Reflexivity: it must be applicable to sociology itself.
History
Because the strong programme originated at the 'Science Studies Unit' of the University of Edinburgh, it is sometimes termed the Edinburgh School. However, there is also a Bath School associated with Harry Collins that makes similar proposals. In contrast to the Edinburgh School, which emphasizes historical approaches, the Bath School emphasizes microsocial studies of laboratories and experiments.[4] The Bath School, however, does depart from the strong programme on some fundamental issues. In the social construction of technology (SCOT) approach developed by Collins' student Trevor Pinch, as well as by the Dutch sociologist Wiebe Bijker, the strong programme was extended to technology. There are SSK-influenced scholars working in science and technology studies programs throughout the world.[5]
Criticism
In order to study scientific knowledge from a sociological point of view, the strong programme has adhered to a form of radical relativism. In other words, it argues that – in the social study of institutionalised beliefs about 'truth' – it would be unwise to use 'truth' as an explanatory resource. That would be to include the answer as part of the question (Barnes 1992), not to mention a thoroughly 'whiggish' approach towards the study of history – that is an approach seeing human history as an inevitable march towards truth and enlightenment. Alan Sokal has criticised radical relativism as part of the science wars, on the basis that such an understanding will lead inevitably towards solipsism and postmodernism. Markus Seidel attacks the main arguments – underdetermination and norm-circularity – provided by Strong Programme proponents for their relativism.[6] Strong programme scholars insist that their approach has been misunderstood by such a criticism and that its adherence to radical relativism is strictly methodological.
Notes
- ^ David Bloor, 'The strengths of the strong programme.' Scientific Rationality: The Sociological Turn (Springer Netherlands, 1984), pp. 75-94.
- ^ Donald MacKenzie, 'Notes on the science and social relations debate.' Capital & Class 5.2 (1981): 47-60.
- ^ David Bloor, Knowledge and Social Imagery (1976)
- ^ Harry M. Collins, 'Introduction: Stages in the empirical programme of relativism.' Social Studies of Science (1981): 3-10, in JSTOR
- ^ Wiebe E. Bijker, et al., The Social Construction of Technological Systems: New Directions in the Sociology and History of Technology (MIT Press, 2012)
- ^ Markus Seidel, Epistemic Relativism: A Constructive Critique (Palgrave Macmillan, 2014)
Bibliography
- Barnes, B. (1977). Interests and the Growth of Knowledge. London: Routledge & Kegan Paul.
- Barnes, B. (1982). T. S. Kuhn and Social Science. London: Macmillan.
- Barnes, B. (1985). About Science. Oxford: Blackwell.
- Barnes, B. (1987). 'Concept Application as Social Activity', Critica 19: 19-44.
- Barnes, B. (1992). 'Realism, relativism and finitism'. Pp. 131–147 in Cognitive Relativism and Social Science, eds. D. Raven, L. van Vucht Tijssen, and J. de Wolf.
- Barnes, B., D. Bloor, and J. Henry. (1996), Scientific Knowledge: A Sociological Analysis. University of Chicago Press. [An introduction and summary of strong sociology]
- Bijker, Wiebe E., et al. The social construction of technological systems: New directions in the sociology and history of technology (MIT press, 2012)
- Bloor, D. (1991 [1976]), Knowledge and Social Imagery, 2nd ed. Chicago: University of Chicago Press. [outlines the strong programme]
- Bloor, D. (1997). Wittgenstein, Rules and Institutions. London: Routledge.
- Bloor, D. (1999). 'Anti-Latour'. Studies in History and Philosophy of Science Part A 30 (1): 81–112.
- Collins, Harry, and Trevor Pinch. The Golem at large: What you should know about technology (Cambridge University Press, 2014)
- Latour, B. (1999). 'For David Bloor and Beyond ... a reply to David Bloor's 'Anti-Latour''. Studies in History & Philosophy of Science Part A 30 (1): 113-129.