
    Networked buffering: a basic mechanism for distributed robustness in complex adaptive systems

    A generic mechanism - networked buffering - is proposed for the generation of robust traits in complex systems. It requires two basic conditions to be satisfied: 1) agents are versatile enough to perform more than one functional role within a system, and 2) agents are degenerate, i.e. there exists partial overlap in the functional capabilities of agents. Given these prerequisites, degenerate systems can readily produce a distributed systemic response to local perturbations. Reciprocally, excess resources related to a single function can indirectly support multiple unrelated functions within a degenerate system. In models of genome:proteome mappings for which localized decision-making and modularity of genetic functions are assumed, we verify that such distributed compensatory effects cause enhanced robustness of system traits. The conditions needed for networked buffering to occur are neither demanding nor rare, supporting the conjecture that degeneracy may fundamentally underpin distributed robustness within several biotic and abiotic systems. For instance, networked buffering offers new insights into systems engineering and planning activities that occur under high uncertainty. It may also help explain recent developments in understanding the origins of resilience within complex ecosystems.
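
    The two prerequisites lend themselves to a compact illustration. Below is a minimal, hypothetical sketch (not the paper's genome:proteome model): each agent is versatile (two roles) and the roles overlap partially across agents, so a greedy reassignment absorbs the loss of one agent through the overlap network.

```python
"""Minimal sketch of networked buffering (illustrative only; the agent
pool and demands are invented). Agents are versatile (two roles each)
and degenerate (roles overlap pairwise), so a local knockout can be
absorbed by a chain of reassignments."""

FUNCTIONS = ["A", "B", "C", "D"]
# Degenerate pool: each agent covers two adjacent functions, so
# capabilities overlap partially rather than being identical copies.
AGENTS = [{"can": {"A", "B"}}, {"can": {"B", "C"}},
          {"can": {"C", "D"}}, {"can": {"D", "A"}},
          {"can": {"A", "B"}}, {"can": {"C", "D"}}]

DEMAND = {f: 1 for f in FUNCTIONS}  # one unit of work per function

def assign(agents):
    """Greedy assignment: each function grabs any idle capable agent."""
    done, busy = {f: 0 for f in DEMAND}, set()
    for f in FUNCTIONS:
        for i, agent in enumerate(agents):
            if i not in busy and f in agent["can"] and done[f] < DEMAND[f]:
                busy.add(i)
                done[f] += 1
    return done

print(assign(AGENTS))       # baseline: {'A': 1, 'B': 1, 'C': 1, 'D': 1}
survivors = AGENTS[1:]      # local perturbation: agent 0 ({A, B}) fails
print(assign(survivors))    # coverage is restored through the overlaps
```

    A pool of identical clones of the same size would only buffer failures of the one function those clones perform; the partial overlap is what lets spare capacity flow between unrelated functions.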

    Neurodegeneration: Potential Causes, Prevention, and Future Treatment Options

    Here I advance the hypothesis that neurodegeneration is a natural process associated with aging, caused by the loss of genetic redundancy following the mathematical model R(t) = R0(1 − αe^((βC + γI + δE)t)), where calorie intake (C) and immune response (I) play critical roles. The early onset of neurodegenerative diseases such as Alzheimer’s disease is due to metabolic imbalance or chronic immune reactions to various infections. Potential treatment options for neurodegenerative diseases are therefore to modulate metabolism and the immune response.
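
    A short numerical sketch of the quoted model follows; all parameter values are invented purely for illustration, and the grouping of the exponent is assumed from the abstract.

```python
"""Sketch of the genetic-redundancy decay model quoted above,
R(t) = R0 * (1 - alpha * exp((beta*C + gamma*I + delta*E) * t)).
Every parameter value here is a made-up illustration, not the paper's."""

import math

def redundancy(t, R0=1.0, alpha=0.01, beta=0.02, gamma=0.02, delta=0.01,
               C=1.0, I=1.0, E=1.0):
    """Remaining genetic redundancy R(t); decays as the exponent grows."""
    return R0 * (1.0 - alpha * math.exp((beta * C + gamma * I + delta * E) * t))

# Raising calorie intake (C) or chronic immune activation (I) makes the
# exponential term grow faster, so R(t) reaches zero earlier -- the
# model's account of early-onset neurodegeneration.
for C, I in [(1.0, 1.0), (3.0, 1.0), (1.0, 3.0)]:
    t = next(t for t in range(0, 500) if redundancy(t, C=C, I=I) <= 0)
    print(f"C={C}, I={I}: redundancy exhausted near t={t}")
```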

    Why Catastrophic Organizational Failures Happen

    Excerpt from the introduction: The purpose of this chapter is to examine the major streams of research about catastrophic failures, describing what we have learned about why these failures occur as well as how they can be prevented. The chapter begins by describing the most prominent sociological school of thought with regard to catastrophic failures, namely normal accident theory. That body of thought examines the structure of the organizational systems that are most susceptible to catastrophic failures. We then turn to several behavioral perspectives on catastrophic failures, assessing a stream of research that has attempted to understand the cognitive, group, and organizational processes that develop and unfold over time, leading ultimately to a catastrophic failure. For an understanding of how to prevent such failures, we then assess the literature on high reliability organizations (HRO). These scholars have examined why some complex organizations operating in extremely hazardous conditions manage to remain nearly error-free. The chapter closes by assessing how scholars are trying to extend the HRO literature to develop more extensive prescriptions for managers trying to avoid catastrophic failures.

    Redundancy Calibration of Phased Array Stations

    Our aim is to assess the benefits and limitations of using redundant visibility information in regular phased array systems to improve calibration. Regular arrays offer the possibility of using redundant visibility information to constrain the calibration of the array independently of a sky model and beam models of the station elements. This requires a regular arrangement of the array elements and identical beam patterns. We revised a calibration method for phased array stations using the redundant visibility information in the system and applied it successfully to a LOFAR station. The performance and limitations of the method were demonstrated by comparing its use on real and simulated data. The main limitation is the mutual coupling between the station elements, which leads to non-identical beams and stronger baseline-dependent noise. Comparing the variance of the estimated complex gains with the Cramer-Rao Bound (CRB) indicates that redundancy is a stable and optimal method for calibrating the complex gains of the system. Our study shows that the use of redundant visibilities does improve the quality of the calibration in phased array systems; in addition, it provides a powerful tool for system diagnostics. Our results demonstrate that designing redundancy into both the station layout and the array configuration of future aperture arrays is strongly recommended, particularly for the Square Kilometre Array, whose dynamic-range requirement surpasses that of any existing array by an order of magnitude. Comment: 16 pages, 15 figures, accepted for publication in A&A, Section 13; acceptance date: 1 May 2012. NOTE: please contact the first author for high-resolution figures.
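
    The core idea can be sketched on a toy regular array: equal-length baselines observe the same true visibility, so per-element complex gains can be solved for without a sky model. The sketch below (a simple nonlinear least-squares fit, not the paper's LOFAR pipeline) uses an invented 1-D array and noise level.

```python
"""Toy redundant-baseline calibration on a regular 1-D array. Observed
visibilities obey V_ij = g_i * conj(g_j) * V_(j-i): one true visibility
per baseline spacing, one complex gain per element."""

import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
N = 8                                          # elements on a regular grid
pairs = [(i, j) for i in range(N) for j in range(i + 1, N)]

g_true = (1 + 0.1 * rng.normal(size=N)) * np.exp(1j * 0.2 * rng.normal(size=N))
v_true = rng.normal(size=N - 1) + 1j * rng.normal(size=N - 1)  # per spacing

v_obs = np.array([g_true[i] * np.conj(g_true[j]) * v_true[j - i - 1]
                  for i, j in pairs])
v_obs += 0.01 * (rng.normal(size=v_obs.size) + 1j * rng.normal(size=v_obs.size))

def unpack(x):
    """Real parameter vector -> complex gains and true visibilities."""
    g = x[:N] + 1j * x[N:2 * N]
    v = x[2 * N:2 * N + N - 1] + 1j * x[2 * N + N - 1:]
    return g, v

def residuals(x):
    g, v = unpack(x)
    model = np.array([g[i] * np.conj(g[j]) * v[j - i - 1] for i, j in pairs])
    r = model - v_obs
    return np.concatenate([r.real, r.imag])

x0 = np.concatenate([np.ones(N), np.zeros(N),            # gains start at 1
                     np.ones(N - 1), np.zeros(N - 1)])    # vis start at 1
sol = least_squares(residuals, x0)
g_est, _ = unpack(sol.x)

# Gains are recovered only up to the usual degeneracies (overall amplitude
# and a phase gradient), so check the spread of the gain ratio instead.
print("residual rms:", np.sqrt(np.mean(sol.fun ** 2)))
print("gain-ratio amplitude spread:", np.std(np.abs(g_est / g_true)))
```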

    Optimizing ISOCAM data processing using spatial redundancy

    We present new data processing techniques that allow us to correct the main instrumental effects degrading the images obtained by ISOCAM, the camera on board the Infrared Space Observatory (ISO). Our techniques take advantage of the fact that a given position on the sky has been observed by several pixels at different times. We use this information (1) to correct the long-term variation of the detector response, (2) to correct memory effects after glitches and point sources, and (3) to refine the deglitching process. Our new method allows the detection of faint extended emission with contrast smaller than 1% of the zodiacal background. The data reduction corrects instrumental effects to the point where the noise in the final map is dominated by readout and photon noise. All raster ISOCAM observations can benefit from the data processing described here. These techniques could also be applied to other raster-type observations (e.g. ISOPHOT or IRAC on SIRTF). Comment: 13 pages, 10 figures, to be published in Astronomy and Astrophysics Supplement Series.
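
    Item (1) can be illustrated with a toy raster: because each sky position is seen by several pixels at different times, a slow per-pixel response drift can be separated from the sky by alternating between the two estimates. This is a sketch of the general idea under invented parameters, not the actual ISOCAM pipeline.

```python
"""Toy separation of a slow multiplicative per-pixel response drift from
the sky, using the spatial redundancy of a 1-D raster scan."""

import numpy as np

rng = np.random.default_rng(0)
n_pix, n_steps = 8, 40
n_sky = n_pix + n_steps - 1            # pixel p at raster step s sees sky p + s
sky = 1 + 0.3 * rng.random(n_sky)

t = np.arange(n_steps) / n_steps
drift = 1 + 0.2 * rng.normal(size=(n_pix, 1)) * t        # slow response drift
pos = np.arange(n_pix)[:, None] + np.arange(n_steps)[None, :]  # sky index map

data = drift * sky[pos] + 0.005 * rng.normal(size=(n_pix, n_steps))

sky_est = np.ones(n_sky)
for _ in range(15):
    # Re-fit each pixel's drift as a line in time against the current sky.
    ratio = data / sky_est[pos]
    drift_est = np.array([np.polyval(np.polyfit(t, r, 1), t) for r in ratio])
    drift_est /= drift_est[:, :1]      # anchor each response to 1 at t = 0
    # Sky = median over all drift-corrected readings of each position.
    flat = data / drift_est
    sky_est = np.array([np.median(flat[pos == k]) for k in range(n_sky)])

print("rms relative sky error:", np.sqrt(np.mean(((sky_est - sky) / sky) ** 2)))
```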

    Weak nodes detection in urban transport systems: Planning for resilience in Singapore

    The availability of massive datasets describing human mobility offers the possibility of designing simulation tools to monitor and improve the resilience of transport systems in response to traumatic events such as natural and man-made disasters (e.g. floods, terrorist attacks). In this perspective, we propose ACHILLES, an application that models people's movements in a given transport system through a multiplex network representation based on mobility data. ACHILLES is a web-based application providing an easy-to-use interface to explore the mobility fluxes and the connectivity of every urban zone in a city, as well as to visualize changes in the transport system resulting from the addition or removal of transport modes, urban zones, and single stops. Notably, our application allows the user to assess the overall resilience of the transport network by identifying its weakest node, i.e. its urban Achilles' heel, in reference to ancient Greek mythology. To demonstrate the value of ACHILLES for humanitarian aid, we apply it to a real-world scenario by exploring human mobility in Singapore in the context of flood prevention. Comment: 9 pages, 6 figures, IEEE Data Science and Advanced Analytics.
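
    The weakest-node idea reduces to a simple node-removal experiment. The sketch below (a hypothetical toy network, not the ACHILLES application or its Singapore data) scores each stop by the drop in global efficiency when it is removed and reports the worst one.

```python
"""Minimal 'Achilles heel' sketch: rank stops by the connectivity loss
their removal causes. The network below is invented for illustration."""

import networkx as nx

G = nx.Graph()  # nodes are stops, edges are direct links between them
G.add_edges_from([("A", "B"), ("B", "C"), ("C", "D"), ("D", "E"),
                  ("B", "F"), ("F", "G"), ("C", "G"), ("E", "H")])

def efficiency_drop(G, node):
    """Fractional loss of global efficiency when `node` is removed."""
    base = nx.global_efficiency(G)
    H = G.copy()
    H.remove_node(node)
    return (base - nx.global_efficiency(H)) / base

scores = {n: efficiency_drop(G, n) for n in G.nodes}
heel = max(scores, key=scores.get)
print(f"Achilles heel: {heel} (efficiency drop {scores[heel]:.1%})")
```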

    Construct redundancy in process modelling grammars: Improving the explanatory power of ontological analysis

    Conceptual modelling supports developers and users of information systems in areas of documentation, analysis, and system redesign. The ongoing interest in the modelling of business processes has led to a variety of different grammars, raising the question of the quality of these grammars for modelling. An established way of evaluating the quality of a modelling grammar is by means of an ontological analysis, which can determine the extent to which grammars contain construct deficit, overload, excess, or redundancy. While several studies have shown the relevance of most of these criteria, predictions about construct redundancy have yielded inconsistent results in the past, with some studies suggesting that redundancy may even be beneficial for modelling in practice. In this paper we seek to clarify the concept of construct redundancy by introducing a revision to the ontological analysis method. Based on the concept of inheritance, we propose an approach that distinguishes between specialized and distinct construct redundancy. We demonstrate the potential explanatory power of the revised method by reviewing and clarifying previous results found in the literature.
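
    The proposed distinction can be sketched as a small check over construct mappings: two grammar constructs mapping to the same ontological construct are "specialized" redundant if one inherits from the other, and "distinct" redundant otherwise. The construct names and mappings below are invented for illustration, not taken from the paper.

```python
"""Illustrative redundancy classification (invented example data)."""

from itertools import combinations

# Hypothetical grammar constructs -> the ontological construct each maps to.
maps_to = {"Task": "Event", "UserTask": "Event",
           "Message": "Event", "DataObject": "Thing"}
# Hypothetical inheritance relation within the grammar.
parent = {"UserTask": "Task"}

def is_specialization(a, b):
    """True if a transitively inherits from b, or vice versa."""
    for x, y in ((a, b), (b, a)):
        while x in parent:
            x = parent[x]
            if x == y:
                return True
    return False

for a, b in combinations(maps_to, 2):
    if maps_to[a] == maps_to[b]:          # both represent the same concept
        kind = "specialized" if is_specialization(a, b) else "distinct"
        print(f"{a} / {b}: {kind} redundancy")
```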