Learning from the Anthropocene: Adaptive Epistemology and Complexity in Strategic Managerial Thinking
Turbulence experienced in the business and social realms resonates with turbulence unfolding
throughout the biosphere, as a process of accelerating change at the stratigraphic scale
termed the Anthropocene. The Anthropocene is understood as a multi-dimensional limit point, one
dimension of which concerns the limits to the lineal epistemology prevalent since the Age of the
Enlightenment. This paper argues that future conditions necessitate the updating of a lineal epistemology
through a transition towards resilience thinking that is both adaptive and ecosystemic. A
management paradigm informed by the recognition of multiple equilibria states distinguished by
thresholds, and incorporating adaptive and resilience thinking is considered. This paradigm is
thought to enhance flexibility and the capacity to absorb influences without crossing thresholds into
alternate stable, but less desirable, states. One consequence is that evaluations of success may
change, and these changes are considered and explored as likely on-going challenges businesses
must grapple with into the future.
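The multiple-equilibria picture sketched above can be made concrete with a toy bistable system (an illustrative sketch of the general idea, not a model from the paper): two stable states separated by a threshold, where small perturbations are absorbed and larger ones tip the system into the alternate, less desirable, state.

```python
# Illustrative sketch (not from the paper): a one-dimensional bistable
# system with stable equilibria at x = -1 and x = +1, separated by a
# threshold at x = 0.  Dynamics: dx/dt = x - x^3.
# Perturbations that leave the state on the same side of the threshold
# are absorbed; larger ones tip the system into the alternate state.

def relax(x0, dt=0.01, steps=2000):
    """Integrate dx/dt = x - x**3 with forward Euler and return the endpoint."""
    x = x0
    for _ in range(steps):
        x += dt * (x - x**3)
    return x

# Start at the "desirable" equilibrium x = +1.
small_shock = relax(1.0 - 0.8)   # stays right of the threshold -> returns to +1
large_shock = relax(1.0 - 1.2)   # crosses the threshold -> settles at -1

print(round(small_shock, 3), round(large_shock, 3))
```

On this reading, "evaluations of success" shift from maximising output near one equilibrium to monitoring the distance of the system state from the nearest threshold.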
Complex systems science: expert consultation report
Executive Summary

A new programme of research in Complex Systems Science must be initiated by FET. The science of complex systems (CS) is essential to establish rigorous scientific principles on which to develop the future ICT systems that are critical to the well-being, safety and prosperity of Europe and its citizens. As the "ICT incubator and pathfinder for new ideas and themes for long-term research in the area of information and communication technologies", FET must initiate a significant new programme of research in complex systems science to underpin research and development in ICT. Complex Systems Science is a "blue sky" research laboratory for R&D in ICT and their applications. In July 2009, ASSYST was given a set of probing questions concerning FET funding for ICT-related complex systems research. This document is based on the CS community's response.

Complex systems research has made considerable progress and is delivering new science. Since FET began supporting CS research, considerable progress has been made. Building on previous understanding of concepts such as emergence from interactions, far-from-equilibrium systems, the edge of chaos and self-organised criticality, recent CS research is now delivering rigorous theory through methods of statistical physics, network theory, and computer simulation. CS research increasingly demands high-throughput data streams and new ICT-based methods of observing and reconstructing, i.e. modelling, the dynamics from those data in areas as diverse as embryogenesis, neuroscience, transport, epidemics, linguistics, meteorology, and robotics.
CS research is also beginning to address the problem of engineering robust systems of systems of systems that can adapt to changing environments, including the perplexing problem that ICT systems are too often fragile and non-adaptive.

Recommendation: A Programme of Research in Complex Systems Science to Support ICT

Fundamental theory in Complex Systems Science is needed, but this can only be achieved through real-world applications involving large, heterogeneous, and messy data sets, including people and organisations. A long-term vision is needed, and realistic targets can be set. Fundamental research can be ensured by requiring that teams include mathematicians, computer scientists, physicists and computational social scientists.

One research priority is to develop a formalism for multilevel systems of systems of systems, applicable to all areas including biology, economics, security, transportation, robotics, health, agriculture, ecology, and climate change. A related research priority is a scientific perspective on the integration of the new science with policy and its implementation, including ethical problems related to privacy and equality.

A further priority is the need for education in complex systems science. Conventional education continues to be domain-dominated, producing scientists who for the most part still lack fundamental knowledge in core areas of mathematics, computation, statistical physics, and social systems. Therefore:

1. We recommend that FET fund a new programme of work in complex systems science as essential research for progress in the development of new kinds of ICT systems.
2. We have identified the dynamics of multilevel systems as the area in complex systems science requiring a major paradigm shift, beyond which significant scientific progress cannot be made.
3. We propose a call requiring: fundamental research in complex systems science; new mathematical and computational formalisms to be developed; the involvement of a large "guinea pig" organisation; research into policy and its meta-level information dynamics; and that all research staff have interdisciplinary knowledge through an education programme.

Tangible outcomes, potential users of the new science, its impact and measures of success

Users include (i) the private and public sectors using ICT to manage complex systems and (ii) researchers in ICT, CSS, and all complex domains. The tangible output of a call will be new knowledge of the nature of complex systems in general, new knowledge of the particular complex system(s) studied, and new knowledge of the fundamental role played by ICT in the research and implementation to create real systems addressing real-world problems. The impact of the call will be seen through new high added-value opportunities in the public and private sectors, new high added-value ICT technologies, and new high added-value science to support innovation in ICT research and development. Success will be measured by the delivery of these high added-value outcomes, and by new science to better understand failures.
Networked buffering: a basic mechanism for distributed robustness in complex adaptive systems
A generic mechanism - networked buffering - is proposed for the generation of robust traits in complex systems. It requires two basic conditions to be satisfied: 1) agents are versatile enough to perform more than one single functional role within a system and 2) agents are degenerate, i.e. there exists partial overlap in the functional capabilities of agents. Given these prerequisites, degenerate systems can readily produce a distributed systemic response to local perturbations. Reciprocally, excess resources related to a single function can indirectly support multiple unrelated functions within a degenerate system. In models of genome:proteome mappings for which localized decision-making and modularity of genetic functions are assumed, we verify that such distributed compensatory effects cause enhanced robustness of system traits. The conditions needed for networked buffering to occur are neither demanding nor rare, supporting the conjecture that degeneracy may fundamentally underpin distributed robustness within several biotic and abiotic systems. For instance, networked buffering offers new insights into systems engineering and planning activities that occur under high uncertainty. It may also help explain recent developments in understanding the origins of resilience within complex ecosystems.
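The two prerequisites can be sketched in a few lines (the agents and functions below are hypothetical constructions of my own, not the paper's genome:proteome model): versatile agents cover more than one function, overlapping capability sets make the system degenerate, and that degeneracy lets excess capacity in one role buffer the loss of an unrelated one.

```python
# Toy sketch of networked buffering (illustrative, not the paper's model):
# each agent has unit capacity and a set of functions it can perform.
from itertools import permutations

def can_cover(agents, functions):
    """True if some one-to-one assignment of agents to functions exists."""
    if len(functions) > len(agents):
        return False
    return any(all(f in caps for f, caps in zip(functions, perm))
               for perm in permutations(agents, len(functions)))

functions = ["A", "B", "C"]
degenerate = [{"A", "B"}, {"B", "C"}, {"C", "A"}]   # versatile + overlapping
specialist = [{"A"}, {"B"}, {"C"}]                  # one role per agent

# Knock out the first agent and add a spare copy of another: in the
# degenerate system, excess capacity in one role indirectly supports an
# unrelated function because the remaining agents can swap roles.
print(can_cover(degenerate[1:] + [{"B", "C"}], functions))  # buffered
print(can_cover(specialist[1:] + [{"B"}], functions))       # function "A" lost
```

The degenerate system re-routes function "A" through the surviving versatile agents, while the specialist system cannot, which is the distributed compensatory effect the abstract describes.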
"Going back to our roots": second generation biocomputing
Researchers in the field of biocomputing have, for many years, successfully
"harvested and exploited" the natural world for inspiration in developing
systems that are robust, adaptable and capable of generating novel and even
"creative" solutions to human-defined problems. However, in this position paper
we argue that the time has now come for a reassessment of how we exploit
biology to generate new computational systems. Previous solutions (the "first
generation" of biocomputing techniques), whilst reasonably effective, are crude
analogues of actual biological systems. We believe that a new, inherently
inter-disciplinary approach is needed for the development of the emerging
"second generation" of bio-inspired methods. This new modus operandi will
require much closer interaction between the engineering and life sciences
communities, as well as a bidirectional flow of concepts, applications and
expertise. We support our argument by examining, in this new light, three
existing areas of biocomputing (genetic programming, artificial immune systems
and evolvable hardware), as well as an emerging area (natural genetic
engineering) which may provide useful pointers as to the way forward.

Comment: Submitted to the International Journal of Unconventional Computing
The Coherence Problem: Mapping the Theory and Delivery of Infrastructure Resilience Across Concept, Form, Function, and Experienced Value
In this contribution we explore the interface between the functional characteristics of infrastructures as artefacts and
as suppliers of social need. Specifically, we are concerned with the ways in which infrastructure performance measures are
articulated and assessed, and whether there are incongruities between the technical and the broader social goals to which
infrastructure systems are intended to aspire. Our analysis involves comparing and contrasting system design
and performance metrics across the technical–social boundary, generating new insights for those tasked with the
design and operation of networked infrastructures. The assessment delivered in the following sections is inherently
interdisciplinary and cross-sectoral in nature, bringing thinking from the social and environmental sciences together
with contributions from mathematics and engineering to offer a commentary which is relevant to all types of physical
infrastructure.
Complexity, BioComplexity, the Connectionist Conjecture and Ontology of Complexity
This paper develops and integrates major ideas and concepts on complexity and biocomplexity: the Connectionist Conjecture, a universal ontology of complexity, the irreducible complexity of totality and inherent randomness, the perpetual evolution of information, the emergence of criticality, and the equivalence of symmetry and complexity. The Connectionist Conjecture states that the one and only representation of Totality is the connectionist one, i.e. in terms of nodes and edges. The paper also introduces the idea of a Universal Ontology of Complexity and develops concepts in that direction, along with ideas on the perpetual evolution of information and on the irreducibility and computability of totality, all in the context of the Connectionist Conjecture. It indicates that control and communication are the prime functionals responsible for the symmetry and complexity of complex phenomena. The paper takes the stand that the phenomenon of life (including its evolution) is probably the nearest to what we can describe with the term "complexity", and assumes that signaling and communication within the living world, and of the living world with the environment, create the connectionist structure of biocomplexity. With life and its evolution as the substrate, the paper develops ideas towards an ontology of complexity, introduces new complexity-theoretic interpretations of fundamental biomolecular parameters, and develops ideas on a methodology to determine the complexity of "true" complex phenomena.
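As a minimal illustration of the conjecture's representational claim (the node and edge names below are hypothetical, chosen to echo the paper's signalling example), any relational structure, such as signalling between living units and their environment, reduces to a set of nodes and a set of edges:

```python
# Minimal sketch (my illustration, not the paper's formalism): a
# "connectionist" representation is just nodes plus edges.
nodes = {"cell_1", "cell_2", "environment"}
edges = {("cell_1", "cell_2"),          # intercellular signalling
         ("cell_1", "environment"),     # exchange with the environment
         ("cell_2", "environment")}

def neighbours(node, edges):
    """Nodes directly connected to `node` (undirected reading of the edges)."""
    return ({b for a, b in edges if a == node}
            | {a for a, b in edges if b == node})

print(sorted(neighbours("cell_1", edges)))
```

Under the conjecture, questions about the complexity of a phenomenon become questions about the structure and dynamics of such a graph.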