
    Diversity of graphs with highly variable connectivity

    A popular approach for describing the structure of many complex networks focuses on graph theoretic properties that characterize their large-scale connectivity. While it is generally recognized that such descriptions based on aggregate statistics do not uniquely characterize a particular graph and also that many such statistical features are interdependent, the relationship between competing descriptions is not entirely understood. This paper lends perspective on this problem by showing how the degree sequence and other constraints (e.g., connectedness, no self-loops or parallel edges) on a particular graph play a primary role in dictating many features, including its correlation structure. Building on recent work, we show how a simple structural metric characterizes key differences between graphs having the same degree sequence. More broadly, we show how the (often implicit) choice of a background set against which to measure graph features has serious implications for the interpretation and comparability of graph theoretic descriptions.
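The abstract does not name its structural metric, but one natural candidate for separating graphs with identical degree sequences is the s-metric, s(g) = sum of deg(i)*deg(j) over all edges (i, j). The sketch below treats this as an illustrative assumption rather than the paper's definition, and shows two graphs with the same degree sequence but different s values (one of them disconnected, echoing the role of the connectedness constraint):

```python
def s_metric(edges):
    """s(g) = sum over edges (i, j) of deg(i) * deg(j) for a simple graph."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    return sum(deg[u] * deg[v] for u, v in edges)

# Both graphs have degree sequence [2, 2, 2, 1, 1]:
path5 = [(0, 1), (1, 2), (2, 3), (3, 4)]      # 5-node path (connected)
tri_plus = [(0, 1), (1, 2), (2, 0), (3, 4)]   # triangle + lone edge (disconnected)
print(s_metric(path5), s_metric(tri_plus))    # the metric separates them
```

Because the degree sequence alone is shared, any difference in s(g) here is carried entirely by which stubs are wired together, which is the kind of distinction aggregate statistics miss.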

    Contrasting Views of Complexity and Their Implications For Network-Centric Infrastructures

    There exists a widely recognized need to better understand and manage complex “systems of systems,” ranging from biology, ecology, and medicine to network-centric technologies. This is motivating the search for universal laws of highly evolved systems and driving demand for new mathematics and methods that are consistent, integrative, and predictive. However, the theoretical frameworks available today are not merely fragmented but sometimes contradictory and incompatible. We argue that complexity arises in highly evolved biological and technological systems primarily to provide mechanisms to create robustness. However, this complexity itself can be a source of new fragility, leading to “robust yet fragile” tradeoffs in system design. We focus on the role of robustness and architecture in networked infrastructures, and we highlight recent advances in the theory of distributed control driven by network technologies. This view of complexity in highly organized technological and biological systems is fundamentally different from the dominant perspective in the mainstream sciences, which downplays function, constraints, and tradeoffs, and tends to minimize the role of organization and design.

    Optimal vaccination in a stochastic epidemic model of two non-interacting populations

    Developing robust, quantitative methods to optimize resource allocations in response to epidemics has the potential to save lives and minimize health care costs. In this paper, we develop and apply a computationally efficient algorithm that enables us to calculate the complete probability distribution for the final epidemic size in a stochastic Susceptible-Infected-Recovered (SIR) model. Based on these results, we determine the optimal allocations of a limited quantity of vaccine between two non-interacting populations. We compare the stochastic solution to results obtained for the traditional, deterministic SIR model. For intermediate quantities of vaccine, the deterministic model is a poor estimate of the optimal strategy for the more realistic, stochastic case. (21 pages, 7 figures)
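As a rough sketch of what computing the complete final-size distribution involves, the following dynamic program sums probabilities over the embedded jump chain of the stochastic SIR model, where from state (s, i) the next event is an infection with probability beta*s/(beta*s + gamma) and a recovery otherwise. This is a standard textbook construction, not necessarily the paper's (more efficient) algorithm:

```python
def final_size_dist(s0, i0, beta, gamma):
    """Exact distribution of the final epidemic size in a stochastic SIR model.

    From state (s, i) with i > 0, an infection (-> (s-1, i+1)) occurs with
    probability beta*s / (beta*s + gamma); otherwise a recovery (-> (s, i-1)).
    Returns {total ever infected: probability}.
    """
    prob = {(s0, i0): 1.0}
    dist = {}
    # Process states so every predecessor is visited first:
    # s descending, and within each s, i descending.
    for s in range(s0, -1, -1):
        max_i = i0 + (s0 - s)
        for i in range(max_i, 0, -1):
            p = prob.pop((s, i), 0.0)
            if p == 0.0:
                continue
            p_inf = beta * s / (beta * s + gamma) if s > 0 else 0.0
            if p_inf > 0.0:
                key = (s - 1, i + 1)
                prob[key] = prob.get(key, 0.0) + p * p_inf
            key = (s, i - 1)
            prob[key] = prob.get(key, 0.0) + p * (1.0 - p_inf)
        # (s, 0) is absorbing: final size = number of susceptibles ever infected
        # plus the initial infecteds.
        p0 = prob.pop((s, 0), 0.0)
        if p0:
            size = (s0 - s) + i0
            dist[size] = dist.get(size, 0.0) + p0
    return dist

dist = final_size_dist(10, 1, 0.3, 1.0)
print(sorted(dist.items()))
```

With the full distribution in hand, an allocation of vaccine between two non-interacting populations can be scored by convolving the two resulting final-size distributions, which is the kind of comparison the abstract describes.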

    The birth-death-suppression Markov process and wildfires

    Birth and death Markov processes can model stochastic physical systems from percolation to disease spread and, in particular, wildfires. We introduce and analyze a birth-death-suppression Markov process as a model of wildfire dynamics. Using analytic techniques, we characterize the probabilities and timescales of events like extinguishment of the fire and the probability of the burned area reaching a given size. The latter requires control over the embedded Markov chain. To solve this discrete process, we introduce a new set of orthogonal polynomials, the 'firewalk' polynomials, which are a continuous deformation of the Gegenbauer/ultraspherical polynomials. This allows analysis of processes with bounded cumulative population, corresponding to finite burnable substrate in the wildfire interpretation, with probabilities represented as spectral integrals. This technology is developed in order to construct a dynamic decision support framework. We devise real-time risk metrics and describe the structure of optimal fire suppression strategies. Finally, we sketch future directions including multi-event resource allocation problems and potential applications for reinforcement learning. (29 pages, 15 figures)
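A minimal Monte Carlo version of such a process can be sketched as follows, assuming per-capita birth rate lam, per-capita death rate mu, and suppression acting at a constant rate sigma while the fire is alive. This parameterization is an illustrative guess that may differ from the paper's, and the simulation stands in for (rather than reproduces) the paper's analytic spectral machinery:

```python
import random

def simulate(n0, lam, mu, sigma, max_births, rng):
    """One trajectory of a birth-death-suppression chain (assumed rates).

    Births at rate lam*n, deaths at rate mu*n, suppression removes one
    individual at constant rate sigma while n > 0. The cumulative birth count
    plays the role of burned area; max_births models finite burnable substrate.
    Returns (extinct_before_cap, cumulative_births).
    """
    n, births = n0, 0
    while n > 0:
        total = lam * n + mu * n + sigma
        u = rng.random() * total
        if u < lam * n:               # birth event (fire growth)
            n += 1
            births += 1
            if births >= max_births:  # substrate exhausted before extinction
                return False, births
        else:                         # death or suppression event
            n -= 1
    return True, births               # fire extinguished

rng = random.Random(42)
runs = [simulate(1, 2.0, 1.0, 0.5, 500, rng) for _ in range(2000)]
print("extinction probability ~", sum(e for e, _ in runs) / len(runs))
```

Sweeping sigma in such a simulation gives a crude empirical view of the extinguishment probabilities and burned-area distribution that the paper characterizes exactly.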

    Technological and Economic Drivers and Constraints in the Internet's "Last Mile"

    This paper investigates the physical topology of the Internet at the edge of the network, known as the "last mile", and considers the technological and economic features that drive and constrain its ongoing deployment and operation. In particular, by considering in detail the various technologies used to deliver network bandwidth to end-users, it shows that the need to aggregate traffic is a dominant design objective in the construction of edge networks and, furthermore, that the large-scale statistics of network topologies as a whole, including features such as the overall node degree distribution, are dominated by structural features at the network edge.

    Stability of a Giant Connected Component in a Complex Network

    We analyze the stability of a network's giant connected component under the impact of adverse events, which we model through link percolation. Specifically, we quantify the extent to which the largest connected component of a network consists of the same nodes, regardless of the specific set of deactivated links. Our results are intuitive in the case of single-layered systems: the presence of large-degree nodes in a single-layered network ensures both its robustness and stability. In contrast, we find that interdependent networks that are robust to adverse events have unstable connected components. Our results bring novel insights to the design of resilient network topologies and the reinforcement of existing networked systems.
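One concrete way to operationalize "stability" in this sense is to compare the node sets of the giant component across independent link-percolation realizations, e.g. by their mean Jaccard overlap. The sketch below uses that measure as an illustrative assumption (the paper's precise metric may differ), with a stdlib union-find for component extraction:

```python
import random

def giant_component(n, edges):
    """Node set of the largest connected component, via union-find."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
    comps = {}
    for v in range(n):
        comps.setdefault(find(v), set()).add(v)
    return max(comps.values(), key=len)

def stability(n, edges, keep_prob, trials, rng):
    """Mean pairwise Jaccard overlap of giant-component node sets across
    independent link-percolation realizations (trials >= 2)."""
    giants = []
    for _ in range(trials):
        kept = [e for e in edges if rng.random() < keep_prob]
        giants.append(giant_component(n, kept))
    overlaps = []
    for i in range(len(giants)):
        for j in range(i + 1, len(giants)):
            a, b = giants[i], giants[j]
            overlaps.append(len(a & b) / len(a | b))
    return sum(overlaps) / len(overlaps)

rng = random.Random(0)
n = 200
edges = [(i, j) for i in range(n) for j in range(i + 1, n) if rng.random() < 0.03]
print("mean giant-component overlap:", stability(n, edges, 0.8, 10, rng))
```

An overlap near 1 means the giant component is built from essentially the same nodes no matter which links fail, which is the stability notion the abstract distinguishes from mere robustness of the component's size.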

    Dynamic Resource Allocation in Disaster Response: Tradeoffs in Wildfire Suppression

    Challenges associated with the allocation of limited resources to mitigate the impact of natural disasters inspire fundamentally new theoretical questions for dynamic decision making in coupled human and natural systems. Wildfires are one of several types of disaster phenomena, including oil spills and disease epidemics, where (1) the disaster evolves on the same timescale as the response effort, and (2) delays in response can lead to increased disaster severity and thus greater demand for resources. We introduce a minimal stochastic process to represent wildfire progression that nonetheless accurately captures the heavy-tailed statistical distribution of fire sizes observed in nature. We then couple this model for fire spread to a series of response models that isolate fundamental tradeoffs in both the strength and timing of response and in the division of limited resources across multiple competing suppression efforts. Using this framework, we compute optimal strategies for decision-making scenarios that arise in fire response policy.
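As a deliberately simplified stand-in for the coupled models described above, suppose each fire is a linear birth-death process (per-capita birth rate lam, death rate mu), suppression adds to the per-capita death rate, and a fixed budget must be divided between two fires. For such a process started from n individuals, the ultimate extinction probability is min(1, (mu + x)/lam)**n, so the optimal split can be brute-forced. Every modeling choice here is an assumption for illustration, not the paper's:

```python
def extinction_prob(n, lam, mu, x):
    """Ultimate extinction probability of a linear birth-death process with
    per-capita birth rate lam and death rate mu + x (suppression modeled as
    added per-capita removal), started from n individuals."""
    q = min(1.0, (mu + x) / lam)
    return q ** n

def best_split(budget, fires, lam, mu, steps=100):
    """Brute-force division of a suppression budget between two fires of
    initial sizes fires[0] and fires[1], maximizing the probability that
    both are extinguished. Returns (allocation to fire 1, that probability)."""
    best = (None, -1.0)
    for k in range(steps + 1):
        x1 = budget * k / steps
        x2 = budget - x1
        p = (extinction_prob(fires[0], lam, mu, x1)
             * extinction_prob(fires[1], lam, mu, x2))
        if p > best[1]:
            best = (x1, p)
    return best

# Equal fires favor an even split; a much larger fire can pull in the
# entire budget, illustrating the competing-efforts tradeoff.
print(best_split(1.0, (3, 3), 2.0, 0.5))
print(best_split(1.0, (5, 1), 2.0, 0.5))
```

Even this toy version reproduces the qualitative tradeoff the abstract highlights: the optimal division depends nonlinearly on the relative severity of the competing events.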

    Solving Defender-Attacker-Defender Models for Infrastructure Defense

    In Operations Research, Computing, and Homeland Defense, R.K. Wood and R.F. Dell, editors, INFORMS, Hanover, MD, pp. 28-49. The article of record as published may be located at http://dx.doi.org/10.1287/ics.2011.0047. This paper (a) describes a defender-attacker-defender sequential game model (DAD) for planning defenses of an infrastructure system that will enhance that system's resilience against attacks by an intelligent adversary, (b) describes a realistic formulation of DAD for defending a transportation network, (c) develops a decomposition algorithm for solving this instance of DAD and others, and (d) demonstrates the solution of a small transportation-network example. A DAD model generally evaluates system operation through the solution of an optimization model, and the decomposition algorithm developed here requires only that this system-operation model be continuous and convex. For example, our transportation-network example incorporates a congestion model with a (convex) nonlinear objective function and linear constraints.
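The tri-level min-max-min structure of a DAD model can be made concrete with a toy example: the defender hardens edges (hardened edges cannot be attacked), the attacker then interdicts unhardened edges, and the operator finally routes over the surviving network via a shortest path. The sketch below solves this by exhaustive enumeration rather than the paper's decomposition algorithm, which is only feasible because the example is tiny:

```python
import heapq
from itertools import combinations

def shortest_path(nodes, edges, removed, s, t):
    """Dijkstra over the surviving edges; returns path cost (inf if cut)."""
    adj = {v: [] for v in nodes}
    for (u, v, w) in edges:
        if (u, v) not in removed:
            adj[u].append((v, w))
            adj[v].append((u, w))
    dist = {v: float('inf') for v in nodes}
    dist[s] = 0.0
    pq = [(0.0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue
        for v, w in adj[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(pq, (d + w, v))
    return dist[t]

def dad(nodes, edges, s, t, defenses=1, attacks=1):
    """min over defense sets of max over attack sets of the operator's
    shortest s-t path cost; defended edges cannot be attacked."""
    edge_ids = [(u, v) for (u, v, _) in edges]
    best = (None, float('inf'))
    for D in combinations(edge_ids, defenses):
        worst = (None, -1.0)
        for A in combinations([e for e in edge_ids if e not in D], attacks):
            cost = shortest_path(nodes, edges, set(A), s, t)
            if cost > worst[1]:
                worst = (A, cost)
        if worst[1] < best[1]:
            best = (D, worst[1])
    return best

# Two s-t paths; hardening both edges of the cheap path limits the
# attacker to hitting the expensive one.
nodes = [0, 1, 2, 3]
edges = [(0, 1, 1.0), (1, 3, 1.0), (0, 2, 1.0), (2, 3, 3.0)]
print(dad(nodes, edges, 0, 3, defenses=2, attacks=1))
```

The paper's decomposition algorithm replaces the inner enumeration with iteratively generated attack plans and only requires the operator's model (here, shortest path) to be continuous and convex.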

    Risk of transition to schizophrenia following first admission with substance-induced psychotic disorder: a population-based longitudinal cohort study

    The potential for drugs of abuse to induce acute psychotic symptoms is well recognised. However, the likelihood of transition from initial substance-induced psychotic disorder (SIPD) to chronic psychosis is much less well understood. This study investigated the rate of SIPD transition to schizophrenia (F20), the time to conversion and other possible related factors. Using data from the Scottish Morbidity Record, we examined all patients (n = 3486) since their first admission to psychiatric hospital with a diagnosis of SIPD [International Classification of Diseases, Tenth Revision (ICD-10) codes F10-F19, with third digit five] from January 1997 to July 2012. Patients were followed until first episode of schizophrenia (ICD-10 code F20, with any third digit) or July 2012. Any change in diagnosis was noted in the follow-up period, which ranged from 1 day to 15.5 years across the groups. The 15.5-year cumulative hazard rate was 17.3% (s.e. = 0.007) for a diagnosis of schizophrenia. Cannabis, stimulant, opiate and multiple drug-induced psychotic disorder were all associated with similar hazard rates. The mean time to transition to a diagnosis of schizophrenia was around 13 years, although over 50% did so within 2 years and over 80% of cases presented within 5 years of SIPD diagnosis. Risk factors included male gender, younger age and longer first admission. SIPD episodes requiring hospital admission for more than 2 weeks are more likely to be associated with later diagnosis of schizophrenia. Follow-up periods of more than 2 years are needed to detect the majority of those individuals who will ultimately develop schizophrenia.
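The cumulative hazard quoted above is typically estimated from right-censored follow-up data with a Nelson-Aalen-type estimator. The sketch below implements that generic estimator on synthetic data; it is an illustration of the quantity being reported, not the study's method or data:

```python
def nelson_aalen(times, events):
    """Nelson-Aalen estimate of the cumulative hazard from right-censored
    follow-up data.

    times  : follow-up duration for each patient
    events : True if the transition (here, a schizophrenia diagnosis) was
             observed; False if the patient was censored at that time
    Returns [(event time, cumulative hazard)] at each observed event time.
    """
    data = sorted(zip(times, events))
    n = len(data)
    H, curve = 0.0, []
    at_risk = n
    i = 0
    while i < n:
        t = data[i][0]
        d = 0
        j = i
        while j < n and data[j][0] == t:   # group ties at the same time
            if data[j][1]:
                d += 1
            j += 1
        if d:
            H += d / at_risk               # hazard increment d(t) / n(t)
            curve.append((t, H))
        at_risk -= (j - i)                 # events and censorings leave the risk set
        i = j
    return curve

# Synthetic example: 4 patients, 2 observed transitions, 2 censored.
print(nelson_aalen([1, 2, 3, 4], [True, False, True, False]))
```

Censoring is exactly why the abstract stresses follow-up length: patients censored before 2 years contribute little information about a transition that usually occurs within that window.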