26 research outputs found

    Antifragility Predicts the Robustness and Evolvability of Biological Networks through Multi-class Classification with a Convolutional Neural Network

    Robustness and evolvability are essential properties for the evolution of biological networks. To determine whether a biological network is robust and/or evolvable, its functions must be compared before and after mutations; however, this can carry a high computational cost as the network size grows. Here we develop a predictive method to estimate the robustness and evolvability of biological networks without an explicit comparison of functions. We measure antifragility in Boolean network models of biological systems and use it as the predictor; antifragility occurs when a system benefits from external perturbations. Using the differences in antifragility between the original and mutated biological networks, we train a convolutional neural network (CNN) and test it on classifying the properties of robustness and evolvability. We found that our CNN model successfully classified these properties. Thus, we conclude that our antifragility measure can be used as a predictor of the robustness and evolvability of biological networks. Comment: 22 pages, 10 figures
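
    As a rough illustration of the kind of pipeline described above (a hedged sketch, not the authors' code: the feature length, the four class labels, and the data are hypothetical), a small 1-D CNN can be trained on antifragility-difference vectors to predict robustness/evolvability classes:

```python
# Minimal sketch: classify robustness/evolvability classes from
# antifragility-difference profiles with a small 1-D CNN.
# Feature length, class labels, and data below are hypothetical.
import torch
import torch.nn as nn

N_FEATURES = 64   # assumed length of the antifragility-difference profile
N_CLASSES = 4     # e.g. robust / evolvable / both / neither (assumed)

class AntifragilityCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Flatten(),
            nn.Linear(32 * (N_FEATURES // 4), N_CLASSES),
        )

    def forward(self, x):  # x: (batch, 1, N_FEATURES)
        return self.net(x)

# One training step on synthetic data, for illustration only.
model = AntifragilityCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
x = torch.randn(8, 1, N_FEATURES)       # fake antifragility differences
y = torch.randint(0, N_CLASSES, (8,))   # fake class labels
optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
```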

    Stocks and Cryptocurrencies: Anti-fragile or Robust?

    Antifragility was recently defined as a property of complex systems that benefit from disorder. However, its original formal definition is difficult to apply, so our approach has been to define and test a much simpler measure of antifragility for complex systems. In this work we use our antifragility measure to analyze real data from stock-market and cryptocurrency prices. Results vary between different interpretations of antifragility and between systems. Our results suggest that the stock market favors robustness rather than antifragility, as in most cases the highest and lowest antifragility values are reached either by young agents or by constant ones. There are no clear correlations between antifragility and different measures of good performance, while the best performers seem to fall within a robust threshold. In the case of cryptocurrencies, there is an apparent correlation between high price and high antifragility. Comment: 11 pages, 5 figures
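
    The paper's antifragility measure itself is not reproduced here. As a hedged stand-in, the sketch below computes a simple Jensen-gap (convexity) proxy on a synthetic price series: a positive gap corresponds to a payoff that benefits from volatility (antifragile-like), a negative gap to one that suffers from it (fragile-like).

```python
# Illustrative convexity proxy only, NOT the measure defined in the paper.
import numpy as np

def jensen_gap(prices, payoff):
    """E[payoff(return)] - payoff(E[return]): positive for convex payoffs
    (antifragile-like), negative for concave ones (fragile-like)."""
    prices = np.asarray(prices, dtype=float)
    r = np.diff(prices) / prices[:-1]          # simple returns
    return payoff(r).mean() - payoff(r.mean())

# Fake data: a noisy random walk standing in for a stock or cryptocurrency.
rng = np.random.default_rng(0)
prices = 100 * np.exp(np.cumsum(rng.normal(0.0, 0.02, size=500)))

print(jensen_gap(prices, np.log1p))        # concave payoff -> negative gap
print(jensen_gap(prices, lambda r: r**2))  # convex payoff  -> positive gap
```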

    Emergence in artificial life

    Even though concepts similar to emergence have been used since antiquity, we still lack an agreed definition. Emergence has nevertheless been identified as one of the main features of complex systems, and most would agree with the statement "life is complex". Thus, understanding emergence and complexity should benefit the study of living systems. It can be said that life emerges from the interactions of complex molecules, but how useful is this for understanding living systems? Artificial life (ALife) has been developed in recent decades to study life using a synthetic approach: build it to understand it. ALife systems are not so complex, be they soft (simulations), hard (robots), or wet (protocells). We can therefore aim first at understanding emergence in ALife, and then use this knowledge in biology. I argue that to understand emergence and life, it is useful to use information as a framework. In a general sense, I define emergence as information that is not present at one scale but is present at another scale. This perspective avoids the problems of studying emergence from a materialist framework and can also be useful in the study of self-organization and complexity. Comment: 28 pages, 1 figure
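
    The scale-based reading of emergence can be made concrete with a toy calculation (my own illustration, not taken from the paper): the parity of a block of random bits carries one bit of information at the macro scale, yet shares essentially no information with any single micro-level bit.

```python
# Toy illustration of "information present at one scale but not another".
import numpy as np

def entropy(x):
    """Shannon entropy (bits) of a discrete sample."""
    _, counts = np.unique(x, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def mutual_information(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from binary samples."""
    joint = x.astype(int) * 2 + y.astype(int)
    return entropy(x) + entropy(y) - entropy(joint)

rng = np.random.default_rng(1)
bits = rng.integers(0, 2, size=(100_000, 8))   # micro scale: 8 random bits
parity = bits.sum(axis=1) % 2                  # macro scale: block parity

print(entropy(parity))                         # ~1 bit at the macro scale
print(mutual_information(parity, bits[:, 0]))  # ~0 bits with any single bit
```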

    A conceptual and architectural characterization of antifragile systems

    Antifragility is one of the terms that have recently emerged to indicate a direction worth pursuing toward the objective of designing Information and Communications Technology systems that remain trustworthy despite their dynamic and evolving operating context. We present a characterization of antifragility that aims to clarify, from a conceptual viewpoint, the implications of its adoption as a design guideline and its relationships with other approaches sharing a similar objective. To this end, we discuss the inclusion of antifragility (and related concepts) within the well-known dependability taxonomy, which was proposed a few decades ago to provide a reference framework for reasoning about the different facets of the general concern of designing dependable systems. From our conceptual characterization, we then derive a possible path toward the engineering of antifragile systems.
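
    One informal way to picture where antifragility could sit next to robustness and resilience (a hedged sketch, not the paper's treatment of the dependability taxonomy; the class names and thresholds below are assumptions) is to classify how a quality-of-service metric responds to a perturbation:

```python
# Hypothetical classification of a system's response to a perturbation.
from enum import Enum, auto

class PerturbationResponse(Enum):
    FRAGILE = auto()      # QoS degrades and stays degraded
    ROBUST = auto()       # QoS essentially unchanged throughout
    RESILIENT = auto()    # QoS dips during the perturbation but recovers
    ANTIFRAGILE = auto()  # QoS ends up better than before

def classify(qos_before, qos_during, qos_after, tol=0.05):
    """Compare QoS levels before, during, and after a perturbation."""
    if qos_after > qos_before * (1 + tol):
        return PerturbationResponse.ANTIFRAGILE
    if qos_after >= qos_before * (1 - tol):
        return (PerturbationResponse.ROBUST
                if qos_during >= qos_before * (1 - tol)
                else PerturbationResponse.RESILIENT)
    return PerturbationResponse.FRAGILE

print(classify(qos_before=0.99, qos_during=0.70, qos_after=0.99))  # RESILIENT
```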

    An Open Logic Approach to EPM

    EPM is a highly operative and didactically versatile tool, and new application areas are envisaged continuously. In turn, this new awareness has allowed us to enlarge our panorama for understanding neurocognitive system behavior, and to develop information conservation and regeneration systems in a numeric self-reflexive/reflective evolutive reference framework. Unfortunately, a logically closed model cannot cope with ontological uncertainty by itself; it needs a complementary logical-aperture operational support extension. To achieve this goal, it is possible to use two coupled irreducible information management subsystems, based on the following ideal coupled irreducible asymptotic dichotomy: "Information Reliable Predictability" and "Information Reliable Unpredictability" subsystems. To behave realistically, the overall system must guarantee both Logical Closure and Logical Aperture, both fed by environmental "noise" (or better, by what human beings call "noise"). In this way, a natural operating point can emerge as a new Trans-disciplinary Reality Level, out of the interaction of the two complementary irreducible information management subsystems within their environment. It thus becomes possible to extend the traditional EPM approach to profit from both the classic EPM intrinsic Self-Reflexive Functional Logical Closure and the new numeric CICT Self-Reflective Functional Logical Aperture. EPM can be thought of as a reliable starting subsystem to initialize a process of continuous self-organizing and self-logic learning refinement. (Fiorini, Rodolfo; Degiacomo, Piero)

    Quantification of Loss of Access to Critical Services during Floods in Greater Jakarta: Integrating Social, Geospatial, and Network Perspectives

    This work presents a framework for assessing the socio-physical disruption of critical infrastructure accessibility, using the example of Greater Jakarta, the metropolitan area surrounding the Indonesian capital. The first pillar of the framework is damage quantification based on the real flood event of 2020; within this pillar, the network statistics of the system before and shortly after the flood were compared. The results showed that the flood impeded access to facilities, distorted transport connectivity, and increased system vulnerability. Poverty was found to be negatively associated with surface elevation, suggesting that urbanization of flood-prone areas has occurred. The second pillar is a flood simulation. Our simulations identified the locations and clusters that are most vulnerable to loss of access during floods, and the entire framework can be applied to other cities and urban areas globally and adapted to account for other disasters that physically affect urban infrastructure. This work demonstrated the feasibility of damage quantification and vulnerability assessment relying solely on open and publicly available data and tools. The framework, which uses satellite data on the occurrence of floods made available by space agencies in a timely manner, will allow rapid ex post investigation of the socio-physical consequences of disasters and will save resources, since the analysis can be performed by a single person, as opposed to expensive and time-consuming ground surveys. Ex ante vulnerability assessment based on simulations will help communities, urban planners, and emergency personnel better prepare for future shocks.
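
    A minimal sketch of the accessibility comparison behind the first pillar (a hypothetical toy road network and flooded segments, using the networkx library rather than the authors' tooling): travel times to the nearest critical facility are computed before and after the flooded road segments are removed.

```python
# Compare accessibility of critical facilities before/after a flood (toy data).
import networkx as nx

def travel_time_to_facilities(G, facilities, weight="minutes"):
    """Shortest travel time from every reachable node to its nearest facility."""
    return nx.multi_source_dijkstra_path_length(G, facilities, weight=weight)

# Toy road network: nodes are intersections, edge weights are minutes of travel.
G = nx.Graph()
G.add_weighted_edges_from(
    [("a", "b", 4), ("b", "c", 3), ("c", "hospital", 2),
     ("a", "d", 6), ("d", "hospital", 5)],
    weight="minutes",
)
facilities = {"hospital"}

before = travel_time_to_facilities(G, facilities)

G_flood = G.copy()
G_flood.remove_edges_from([("c", "hospital")])   # segment cut by the flood
after = travel_time_to_facilities(G_flood, facilities)

for node in sorted(set(G.nodes) - facilities):
    print(node, before[node], "->", after.get(node, float("inf")))
```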

    Dynamical heterogeneity and universality of power-grids

    While weak, tuned asymmetry can improve synchronization in the electric power system, strong heterogeneity destroys it. We study the level of heterogeneity by comparing large high-voltage (HV) power grids of Europe and North America. We analyze the power capacities and loads of various energy sources from the databases and find heavy-tailed distributions with similar characteristics. Graph topological measures and community structures also exhibit strong similarities, while the cable admittance distributions can be well fitted with the same power laws (PL), related to the length distributions. By solving a set of swing equations, the community detection analysis shows the level of synchronization in different domains of the European HV power grids. We provide numerical evidence for frustrated synchronization and chimera states, and point out the relation between topology and the level of synchronization in the subsystems. We also provide an empirical analysis of the frequency heterogeneities within the Hungarian HV network and find q-Gaussian distributions, related to super-statistics of time-lagged fluctuations, which agree well with former results on the Nordic Grid. Comment: 15 pages, 15 figures
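
    For readers unfamiliar with them, the swing equations mentioned above correspond to a second-order Kuramoto model with damping; the sketch below integrates such a model on a small random graph and reports the synchronization order parameter. The topology and parameters are illustrative, not the European or North American grid data analyzed in the paper.

```python
# Second-order Kuramoto ("swing") model on a toy graph (illustrative only).
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(2)
N = 10
A = (rng.random((N, N)) < 0.3).astype(float)
A = np.triu(A, 1)
A = A + A.T                              # random symmetric adjacency matrix
P = rng.normal(0.0, 1.0, N)
P -= P.mean()                            # balanced generation and load
alpha, K = 0.4, 3.0                      # damping and coupling strength

def swing(t, y):
    theta, omega = y[:N], y[N:]
    coupling = (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    return np.concatenate([omega, P - alpha * omega + K * coupling])

y0 = np.concatenate([rng.uniform(-np.pi, np.pi, N), np.zeros(N)])
sol = solve_ivp(swing, (0.0, 50.0), y0, max_step=0.05)

theta_final = sol.y[:N, -1]
R = abs(np.exp(1j * theta_final).mean())  # Kuramoto order parameter
print(f"order parameter R = {R:.3f}")     # R close to 1 means synchronized
```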

    Is Soccer a lie or simply a complex system?

    To understand soccer as a complex system, we draw on nature and on the collective behavior of many organisms that "do calculations", seeking to generate solutions in a bio-inspired way. When soccer mysteries appear, complexity science emerges as a means to provide explanations. However, given the variety of interpretations that complexity and its associated properties can have, and of what can be understood as a complex system, it is convenient to provide some elements to understand how unpredictability in soccer gives way to hundreds of counterintuitive results and how the science of complexity could contribute to the understanding of many phenomena in this sport. In this context, the manuscript's objective is to synthetically address some of the most important aspects of complexity applied to soccer, to bring science and sport closer together. Comment: 15 pages, in Spanish, 6 figures