
    Networked buffering: a basic mechanism for distributed robustness in complex adaptive systems

    A generic mechanism - networked buffering - is proposed for the generation of robust traits in complex systems. It requires two basic conditions to be satisfied: 1) agents are versatile enough to perform more than one functional role within a system and 2) agents are degenerate, i.e. there exists partial overlap in the functional capabilities of agents. Given these prerequisites, degenerate systems can readily produce a distributed systemic response to local perturbations. Reciprocally, excess resources related to a single function can indirectly support multiple unrelated functions within a degenerate system. In models of genome:proteome mappings for which localized decision-making and modularity of genetic functions are assumed, we verify that such distributed compensatory effects enhance the robustness of system traits. The conditions needed for networked buffering to occur are neither demanding nor rare, supporting the conjecture that degeneracy may fundamentally underpin distributed robustness within several biotic and abiotic systems. For instance, networked buffering offers new insights into systems engineering and planning activities that occur under high uncertainty. It may also help explain recent developments in understanding the origins of resilience within complex ecosystems.
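    To make the two prerequisites concrete, here is a minimal sketch (not the paper's code) of a toy agent population: each agent can perform two of several functions (versatility), capabilities partially overlap across agents (degeneracy), and after a local perturbation the surviving agents are greedily reassigned to the least-covered functions. All names and parameter values below are illustrative assumptions.

```python
# Toy sketch of networked buffering (hypothetical model, not from the paper).
# Each agent can perform two of n_funcs functions (versatility) and capabilities
# partially overlap across agents (degeneracy).  After a local perturbation that
# removes agents serving function 0, survivors are reassigned greedily and we
# check whether every function's demand is still met.
import random

def make_agents(n_agents, n_funcs, rng):
    return [tuple(rng.sample(range(n_funcs), 2)) for _ in range(n_agents)]

def trait_maintained(agents, demand):
    """Greedy reassignment: each agent supports the least-covered function
    (relative to its demand) among the functions it can perform."""
    load = {f: 0 for f in demand}
    for caps in agents:
        target = min(caps, key=lambda f: load[f] / demand[f])
        load[target] += 1
    return all(load[f] >= demand[f] for f in demand)

rng = random.Random(0)
n_funcs, n_agents = 5, 60
demand = {f: 8 for f in range(n_funcs)}      # units of work each function needs
agents = make_agents(n_agents, n_funcs, rng)

# Local perturbation: knock out a third of the agents able to perform function 0.
hit = [i for i, caps in enumerate(agents) if 0 in caps]
hit = set(hit[: len(hit) // 3])
survivors = [caps for i, caps in enumerate(agents) if i not in hit]

print("all functional demands met after perturbation:",
      trait_maintained(survivors, demand))
```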

    Emergence of Hierarchy on a Network of Complementary Agents

    Complementarity is one of the main features underlying the interactions in biological and biochemical systems. Inspired by those systems, we propose a model for the dynamical evolution of a system composed of agents that interact through their complementary attributes rather than their similarities. Each agent is represented by a bit-string and has an activity associated with it; the coupling among complementary peers depends on their activity. The connectivity of the system changes in time while respecting the constraint of complementarity. We observe the formation of a network of active agents whose stability depends on the rate at which activity diffuses in the system. The model exhibits a non-equilibrium phase transition between an ordered phase, where a stable network is generated, and a disordered phase characterized by the absence of correlation among the agents. The ordered phase exhibits multi-modal distributions of connectivity and activity, indicating a hierarchy of interaction among different populations characterized by different degrees of activity. This model may be used to study the hierarchy observed in social organizations as well as in business and other networks.
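    A minimal sketch of such a complementarity-driven network is given below, assuming bit-string agents whose complementarity is measured by Hamming distance, a linking threshold, and a simple diffusive update of activity; the parameter values and the diffusion rule are illustrative assumptions, not the paper's.

```python
# Sketch of a complementarity-driven network (hypothetical parameterisation).
# Agents are bit-strings; two agents may link only if they are sufficiently
# complementary (large Hamming distance), and activity diffuses over the links.
import numpy as np

rng = np.random.default_rng(1)
N, L = 100, 16          # number of agents, bit-string length
theta = 12              # minimum complementary bits required for a link (assumed)
diffusion = 0.1         # fraction of activity shared per step (assumed)

strings = rng.integers(0, 2, size=(N, L))
activity = rng.random(N)

# Complementarity = Hamming distance between bit-strings.
hamming = (strings[:, None, :] != strings[None, :, :]).sum(axis=2)
adj = hamming >= theta
np.fill_diagonal(adj, False)

for _ in range(200):    # activity diffuses towards the mean of each agent's neighbours
    degree = adj.sum(axis=1)
    neighbour_mean = np.where(degree > 0,
                              adj @ activity / np.maximum(degree, 1),
                              activity)
    activity = (1 - diffusion) * activity + diffusion * neighbour_mean

active = activity > activity.mean()
print("agents in the active population:", active.sum())
print("mean degree among active agents:", adj[active][:, active].sum(axis=1).mean())
```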

    Investigating biocomplexity through the agent-based paradigm.

    Capturing the dynamism that pervades biological systems requires a computational approach that can accommodate both the continuous features of the system environment and the flexible, heterogeneous nature of component interactions. This presents a serious challenge for the more traditional mathematical approaches that assume component homogeneity in order to relate system observables through mathematical equations. While the homogeneity condition does not lead to loss of accuracy when simulating various continua, it fails to offer detailed solutions when applied to systems with dynamically interacting heterogeneous components. As the functionality and architecture of most biological systems is a product of multi-faceted individual interactions at the sub-system level, continuum models rarely offer much beyond qualitative similarity. Agent-based modelling is a class of algorithmic computational approaches that rely on interactions between Turing-complete finite-state machines - or agents - to simulate, from the bottom up, macroscopic properties of a system. In recognizing the heterogeneity condition, they offer suitable ontologies for the system components being modelled, thereby succeeding where their continuum counterparts tend to struggle. Furthermore, being inherently hierarchical, they are quite amenable to coupling with other computational paradigms. The integration of an agent-based framework with continuum models is arguably the most elegant and precise way of representing biological systems. Although still in its nascence, agent-based modelling has been utilized to model biological complexity across a broad range of biological scales (from cells to societies). In this article, we explore the reasons that make agent-based modelling the most precise approach to modelling biological systems that tend to be non-linear and complex.
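    As a purely illustrative example of the bottom-up style described above, the sketch below builds a population of heterogeneous agents, each a small state machine with its own susceptibility, and lets a macroscopic cascade emerge from local interactions on a ring; the topology, update rule and parameters are assumptions made for this sketch, not the article's model.

```python
# Minimal agent-based sketch (illustrative only): heterogeneous agents update
# their state from local neighbour interactions, and a macroscopic quantity
# (the final fraction of active agents) emerges bottom-up.
import random

class Agent:
    def __init__(self, rng):
        self.state = "S"                                 # susceptible, becomes "I" when activated
        self.susceptibility = rng.uniform(0.05, 0.5)     # heterogeneity across agents

    def step(self, neighbours, rng):
        if self.state == "S" and any(n.state == "I" for n in neighbours):
            if rng.random() < self.susceptibility:
                self.state = "I"

def run(n=400, steps=50, seed=0):
    rng = random.Random(seed)
    agents = [Agent(rng) for _ in range(n)]
    agents[0].state = "I"                                # seed a single active agent
    for _ in range(steps):
        for i, a in enumerate(agents):                   # ring topology: two neighbours each
            a.step([agents[(i - 1) % n], agents[(i + 1) % n]], rng)
    return sum(a.state == "I" for a in agents) / n

print("active fraction after 50 steps:", run())
```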

    Data Mining by Soft Computing Methods for the Coronary Heart Disease Database

    To improve data mining technology, the advantages and disadvantages of different data mining methods should be compared under the same conditions. For this purpose, the Coronary Heart Disease database (CHD DB) was developed in 2004, and a data mining competition was held at the International Conference on Knowledge-Based Intelligent Information and Engineering Systems (KES). In the competition, two methods based on soft computing were presented. In this paper, we give an overview of the CHD DB and the soft computing methods, and discuss the features of each method by comparing the experimental results.

    Optimizing radiation therapy treatments by exploring tumour ecosystem dynamics in-silico

    In this contribution, we propose a system-level compartmental population dynamics model of tumour cells that interact with the patient's (innate) immune system under the impact of radiation therapy (RT). The resulting in silico model enables us to analyse the system-level impact of radiation on the tumour ecosystem. The Tumour Control Probability (TCP) was calculated for varying conditions concerning therapy fractionation schemes, radio-sensitivity of tumour sub-clones, tumour population doubling time, repair speed and immunological elimination parameters. The simulations exhibit a therapeutic benefit when the initial three fractions are applied at an interval of two days instead of as daily delivered fractions. This effect disappears for fast-growing tumours and in the case of incomplete repair. The results suggest some optimisation potential for combined hyperthermia-radiotherapy. Regarding the sensitivity of the proposed model, cellular repair of radiation-induced damage is a key factor for tumour control. In contrast, the radio-sensitivity of immune cells does not influence the TCP as long as it is higher than that of the tumour cells. The influence of the tumour sub-clone structure is small (if no competition is included). This work demonstrates the usefulness of in silico modelling for identifying optimisation potentials.
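    The following is a minimal sketch of the kind of fractionated-RT calculation described above, assuming exponential regrowth between fractions, linear-quadratic (LQ) cell kill per fraction, a constant immune elimination rate, and a Poisson-type TCP; all parameter values and the model form are illustrative assumptions rather than the paper's actual compartmental model.

```python
# Hypothetical sketch of a fractionated-RT tumour model with a Poisson TCP.
# Assumptions (not from the paper): exponential regrowth between fractions,
# LQ survival per fraction, and a constant immune elimination rate.
import math

def simulate(n0=1e9, d=2.0, n_fractions=30, gap_days=1.0,
             alpha=0.3, beta=0.03, doubling_time=60.0, immune_rate=0.01):
    growth = math.log(2) / doubling_time                 # per-day exponential growth rate
    survive = math.exp(-(alpha * d + beta * d ** 2))     # LQ surviving fraction per fraction
    n = n0
    for _ in range(n_fractions):
        n *= survive                                     # radiation-induced cell kill
        n *= math.exp((growth - immune_rate) * gap_days) # regrowth vs immune clearance
    tcp = math.exp(-n)                                   # Poisson tumour control probability
    return n, tcp

for gap in (1.0, 2.0):                                   # daily vs every-other-day fractions
    n_final, tcp = simulate(gap_days=gap)
    print(f"gap={gap:.0f} d  surviving cells={n_final:.3g}  TCP={tcp:.3f}")
```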