    The robustness of stability under link and node failures

    In the area of communication systems, stability refers to the property of keeping the amount of traffic in the system bounded over time. Different communication system models have been proposed in order to capture the unpredictable behavior of some users and applications. Among those proposed models, the adversarial queueing theory (AQT) model has turned out to be the most adequate for analyzing an unpredictable network. Until now, most of the research done in this field has not considered the possibility of the adversary producing failures in the network structure. The adversarial models proposed in this work incorporate node and link failures provoked by the adversary. Such failures produce temporary disruptions of the connectivity of the system and increase the collisions of packets at the intermediate hosts of the network, and thus the average traffic load. Under such a scenario, the network is required to be equipped with some mechanism for dealing with those collisions.

    In addition to proposing adversarial models for faulty systems, we study the relation between the robustness of the stability of the system and the management of the queues affected by the failures. When the adversary produces link or node failures, the queues associated with the corresponding links can be affected in many different ways, depending on whether or not they can receive or serve packets. In most cases, protocols and networks with very simple topologies, which were known to be universally stable in the AQT model, turn out to be unstable under some of the newly proposed adversarial models. This shows that universal stability of networks is not a robust property in the presence of failures.
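    The effect the abstract describes is easy to see even in a toy setting. The following minimal Python sketch (purely illustrative, not the paper's formal AQT construction) simulates a single unit-capacity link under a fixed injection rate: while the link is up the backlog stays bounded, but an adversarially timed failure window lets the backlog grow linearly and leaves a long-lived excess after service resumes. All parameters here are invented for illustration.

```python
# Purely illustrative sketch of a single unit-capacity link under
# adversarial-style packet injections; NOT the paper's formal model.
# All parameters (rate, failure window) are invented for illustration.

def simulate(steps=200, rate=0.8, fail_window=(50, 120)):
    """Fluid approximation: `rate` packets arrive per step; the link
    serves one packet per step unless the adversary has failed it."""
    backlog = 0.0
    history = []
    for t in range(steps):
        backlog += rate                        # adversarial injections
        link_up = not (fail_window[0] <= t < fail_window[1])
        if link_up:
            backlog = max(0.0, backlog - 1.0)  # serve one packet
        history.append(backlog)
    return history

h = simulate()
print("backlog just before the failure:", round(h[49], 2))
print("backlog when the link recovers: ", round(h[119], 2))
print("backlog at the end of the run:  ", round(h[-1], 2))
```

    With an injection rate of 0.8, the backlog stays near zero until the failure, climbs to 56 packets while the link is down, and has only drained to 40 by the end of the run, illustrating how a temporary disruption inflates the traffic load well beyond the failure window itself.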

    Layering as Optimization Decomposition: A Mathematical Theory of Network Architectures


    Biodiversity conservation and non-governmental organisations in Oaxaca, Mexico

    The lack of local scale biodiversity assessment in Oaxacan conservation is examined. Biodiversity assessment is a prerequisite of systematic, scientifically directed conservation, and in Oaxaca, as in many other parts of the world, conservation is not planned according to scientific prescriptions. This thesis investigates the reasons for this in two ways. First, it considers the technical demands of biodiversity assessment from the point of view of local conservation NGOs. Second, it considers the institutional context in which the concept of biodiversity is translated from scientific discourses to Oaxacan NGOs. It is argued that tree diversity assessment techniques as currently promoted in scientific discourses are not necessarily appropriate to the needs of local NGOs and that biodiversity is itself a contested concept in Oaxaca. This explains the low priority given by Oaxaca's local conservation NGOs to biodiversity assessment. It is further shown that non-systematic conservation has made an important contribution to biodiversity conservation in Oaxaca, and it is argued that it is unrealistic to expect scientific prescriptions for biodiversity planning to be translated, without modification, to rural Oaxaca.

    ECOSYSTEM RESTORATION IN THE OUACHITA NATIONAL FOREST: EVALUATING THE PRAGMATISM OF PRE-EUROPEAN SETTLEMENT BENCHMARKS

    This paper looks at the intersections of nature and culture through a study of forest ecosystem restoration efforts in the Ouachita National Forest (Arkansas and Oklahoma). Ecosystem restoration goals are often informed by a pre-European settlement (PES) condition, with an implicit (and occasionally explicit) assertion that such conditions are both more natural than and preferable to the contemporary state. In many cases, resuming pre-suppression fire regimes remains a key mechanism for achieving this restored condition. This study's three main objectives are: (1) determining how PES benchmarks arose in restoration thought, (2) examining how the choice to use a PES benchmark is influenced by culture, and (3) evaluating the pragmatism of including a PES benchmark in restoration projects. The issues of the naturalness of PES conditions, along with the cultural implications of adopting a PES benchmark, are critically examined against the backdrop of historic legacies of fire suppression and paleoecological change. Normative balance-of-nature ideas are discussed in light of their influence on natural resource management paradigms. Linkages are drawn between PES conditions and forest health. Evidence supporting the ecological resilience associated with PES vegetation communities is considered alongside the anticipation of future forcing factors. The idea that restored forests represent an ecological archetype is addressed. Finally, an alternative explanation concerning the tendency of ecosystem restoration efforts to converge on a single historic reference condition – a point of equifinality – is weighed against notions of: (1) anthropic degradation, (2) a regional optimum, and (3) a socially constructed yearning for a frontier ideal. Because of the unique convergence between historical human activities and natural processes, contemporary culture has conceived of the PES time period as a sort of frontier ideal. The creation of PES benchmarks appears to be an unintentional consequence of attempts to restore forest health rigorously defined by biometric standards. This study offers restoration thinking a framework for critically evaluating the inclusion of historic reference conditions and a means of responding to criticism surrounding their use. This study's findings rest on evidence gathered from paleoecological and historical biogeography data, interviews, archival materials, cultural landscape interpretation, landscape and nature-based art, and complexity theory.

    Resource Allocation in Networked and Distributed Environments

    A central challenge in networked and distributed systems is resource management: how can we partition the available resources in the system across competing users, such that individual users are satisfied and certain system-wide objectives of interest are optimized? In this thesis, we deal with many such fundamental and practical resource allocation problems that arise in networked and distributed environments. We invoke two sophisticated paradigms -- linear programming and probabilistic methods -- and develop provably-good approximation algorithms for a diverse collection of applications. Our main contributions are as follows.

    Assignment problems: An assignment problem involves a collection of objects and locations, and a load value associated with each object-location pair. Our goal is to assign the objects to locations while minimizing various cost functions of the assignment. This setting models many applications in manufacturing, parallel processing, distributed storage, and wireless networks. We present a single algorithm for assignment which generalizes many classical assignment schemes known in the literature. Our scheme is derived through a fusion of linear algebra and randomization. In conjunction with other ideas, it leads to novel guarantees for multi-criteria parallel scheduling, broadcast scheduling, and social network modeling.

    Precedence constrained scheduling: We consider two precedence constrained scheduling problems, namely sweep scheduling and tree scheduling, which are inspired by emerging applications in high performance computing. Through a careful use of randomization, we devise the first approximation algorithms for these problems with near-optimal performance guarantees.

    Wireless communication: Wireless networks are prone to interference, which prohibits proximate network nodes from transmitting simultaneously and introduces fundamental challenges in the design of wireless communication protocols. We develop fresh geometric insights for characterizing wireless interference. We combine our geometric analysis with linear programming and randomization to derive near-optimal algorithms for latency minimization and throughput capacity estimation in wireless networks.

    In summary, the innovative use of linear programming and probabilistic techniques for resource allocation, and the novel ways of connecting them with application-specific ideas, are the pivotal theme and focal point of this thesis.
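    As an illustration of the "linear programming plus randomization" template the abstract invokes (a generic textbook-style sketch, not the thesis's actual algorithm), the following Python fragment solves the fractional relaxation of a small capacitated assignment instance with scipy and then rounds each object independently to a location with probability equal to its fractional assignment, so the expected rounded cost matches the LP optimum. The instance data and the cap T are invented.

```python
# Generic textbook-style sketch: LP relaxation + independent randomized
# rounding for a toy capacitated assignment instance. This is NOT the
# thesis's algorithm; the instance data and the cap T are invented.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m = 6, 3                                  # objects, locations
load = rng.uniform(1.0, 5.0, size=(n, m))    # load[i, j] of pair (i, j)

# Variables x[i, j] flattened row-major; minimize total assigned load.
c = load.ravel()

# Each object must be (fractionally) assigned: sum_j x[i, j] = 1.
A_eq = np.zeros((n, n * m))
for i in range(n):
    A_eq[i, i * m:(i + 1) * m] = 1.0
b_eq = np.ones(n)

# Each location's load stays under a cap: sum_i load[i, j] * x[i, j] <= T.
T = load.sum() / m                           # loose cap; always feasible
A_ub = np.zeros((m, n * m))
for j in range(m):
    for i in range(n):
        A_ub[j, i * m + j] = load[i, j]
b_ub = np.full(m, T)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=(0.0, 1.0))
x_frac = res.x.reshape(n, m)

# Round: each object independently picks location j with probability
# x[i, j]; the expected rounded cost equals the LP optimum.
choice = []
for row in x_frac:
    p = np.clip(row, 0.0, None)              # guard against tiny negatives
    choice.append(rng.choice(m, p=p / p.sum()))

print("LP (fractional) cost:", round(res.fun, 3))
print("rounded cost:        ",
      round(sum(load[i, j] for i, j in enumerate(choice)), 3))
```

    Chernoff-type concentration bounds then limit how far each location's realized load can exceed the cap, which is the flavor of provable guarantee such rounding schemes deliver.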

    Computer Aided Verification

    This open access two-volume set LNCS 11561 and 11562 constitutes the refereed proceedings of the 31st International Conference on Computer Aided Verification, CAV 2019, held in New York City, USA, in July 2019. The 52 full papers presented together with 13 tool papers and 2 case studies were carefully reviewed and selected from 258 submissions. The papers were organized in the following topical sections: Part I: automata and timed systems; security and hyperproperties; synthesis; model checking; cyber-physical systems and machine learning; probabilistic systems, runtime techniques; dynamical, hybrid, and reactive systems. Part II: logics, decision procedures, and solvers; numerical programs; verification; distributed systems and networks; verification and invariants; and concurrency.

    The evolution of nutritional co-endosymbionts in cicadas

    Symbiosis occurs between organisms in all domains of life. The evolution of obligate symbionts from free-living bacteria typically results in the loss of genes involved in metabolic independence and an overall reduction in genome size. Outside the organelles, the most extreme examples of genome reduction come from the intracellular symbionts of sap-feeding insects. The genomes of these bacteria encode very few genes other than those involved in translation, replication, and amino acid synthesis. Candidatus Hodgkinia cicadicola (Hodgkinia) and Candidatus Sulcia muelleri (Sulcia) live in specialized insect cells (bacteriocytes) of the cicada Diceroprocta semicincta and have undergone severe gene loss. Hodgkinia in particular retains one of the smallest gene sets of any bacterium, smaller even than those of many organelles. As a result, the Hodgkinia genome is left with a seemingly incomplete set of the genes required for cellular life, including core genes in the translational machinery. I analyzed a set of Hodgkinia genomes and performed several experiments to uncover the constraints guiding the evolution of Hodgkinia. What mutational and selective pressures are acting on the Hodgkinia genome? How do essential cellular enzymatic reactions occur in Hodgkinia cells? Does the cicada host complement Hodgkinia's limited genetic repertoire? How does the evolution of insect endosymbionts compare to the evolution of organelles? My work provides answers to many of these questions and deepens our understanding of intracellular symbioses.

    Novel Algorithm Development for ‘Next-Generation’ Sequencing Data Analysis

    In recent years, the decreasing cost of ‘next-generation’ sequencing has spawned numerous applications for interrogating whole genomes and transcriptomes in research, diagnostic, and forensic settings. While the innovations in sequencing have been explosive, the development of scalable and robust bioinformatics software and algorithms for the analysis of the new types of data generated by these technologies has struggled to keep up. As a result, large volumes of NGS data available in public repositories are severely underutilised, despite providing a rich resource for data mining applications. Indeed, the bottleneck in genome and transcriptome sequencing experiments has shifted from data generation to bioinformatics analysis and interpretation. This thesis focuses on the development of novel bioinformatics software to bridge the gap between data availability and interpretation. The work is split between two core topics – computational prioritisation/identification of disease gene variants and identification of RNA N6-adenosine methylation from sequencing data. The first chapter briefly discusses the emergence and establishment of NGS technology as a core tool in biology and its current applications and perspectives. Chapter 2 introduces the problem of variant prioritisation in the context of Mendelian disease, where tens of thousands of potential candidates are generated by a typical sequencing experiment. Novel software developed for candidate gene prioritisation is described that utilises data mining of tissue-specific gene expression profiles (Chapter 3); a toy illustration of this idea follows below. An alternative approach to candidate variant prioritisation, which leverages functional and phenotypic descriptions of genes and diseases from multiple biomedical domain ontologies, is then investigated (Chapter 4). Chapter 5 discusses N6-adenosine methylation, a recently re-discovered post-transcriptional modification of RNA. The core of the chapter describes novel software developed for transcriptome-wide detection of this epitranscriptomic mark from sequencing data. Chapter 6 presents a case study application of the software, reporting the previously uncharacterised RNA methylome of Kaposi’s Sarcoma-associated Herpesvirus. The chapter further discusses a putative novel N6-methyladenosine RNA-binding protein and its possible roles in the progression of viral infection.
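    To make the Chapter 3 idea concrete, here is a hypothetical toy sketch of tissue-expression-based candidate ranking in Python: genes whose expression is concentrated in the phenotype-relevant tissue float to the top of the candidate list. The gene names, expression values, and scoring rule are all invented and are not the thesis software.

```python
# Hypothetical toy sketch of tissue-expression-based candidate ranking,
# in the spirit of Chapter 3. Gene names, expression values, and the
# scoring rule are all invented; this is not the thesis software.
candidates = ["GENE_A", "GENE_B", "GENE_C"]

# Mean expression (e.g. TPM) per tissue; values are made up.
expression = {
    "GENE_A": {"brain": 90.0, "liver": 5.0, "blood": 3.0},
    "GENE_B": {"brain": 10.0, "liver": 80.0, "blood": 70.0},
    "GENE_C": {"brain": 40.0, "liver": 35.0, "blood": 30.0},
}

def specificity(gene, tissue):
    """Fraction of a gene's total expression seen in the target tissue."""
    profile = expression[gene]
    return profile[tissue] / sum(profile.values())

# For a brain-restricted phenotype, tissue-specific genes rank first.
ranked = sorted(candidates, key=lambda g: specificity(g, "brain"),
                reverse=True)
print(ranked)   # -> ['GENE_A', 'GENE_C', 'GENE_B']
```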