
    On the Existence of Spacetime Structure

    I examine the debate between substantivalists and relationalists about the ontological character of spacetime and conclude it is not well posed. I argue that the so-called Hole Argument does not bear on the debate, because it provides no clear criterion to distinguish the positions. I propose two such precise criteria and construct separate arguments based on each to yield contrary conclusions, one supportive of something like relationalism and the other of something like substantivalism. The lesson is that one must fix an investigative context in order to make such criteria precise, but different investigative contexts yield inconsistent results. I examine questions of existence about spacetime structures other than the spacetime manifold itself to argue that it is more fruitful to focus on pragmatic issues of physicality, a notion that lends itself to several different explications, all of philosophical interest, none privileged a priori over any of the others. I conclude by suggesting an extension of the lessons of my arguments to the broader debate between realists and instrumentalists.
    Comment: 42 pages, 2 figures; forthcoming (2015) in the British Journal for the Philosophy of Science.

    Adaptive Loss Inference Using Unicast End-to-End Measurements

    We address the problem of inferring link loss rates from unicast end-to-end measurements on the basis of network tomography. Because measurement probes incur additional traffic overhead, most tomography-based approaches perform the inference by collecting measurements only on selected paths to reduce the overhead. However, all previous approaches select paths offline, which inevitably misses many potentially identifiable links whose loss rates could be determined without bias. Furthermore, if element failures exist, an appreciable number of the selected paths may become unavailable. In this paper, we propose an adaptive loss inference approach in which paths are selected sequentially depending on previous measurement results. In each round, we compute the loss rates of links that can be unbiasedly determined from the current measurement results and remove them from the system. Meanwhile, we locate the most probable failures based on the current measurement outcomes to avoid selecting unavailable paths in subsequent rounds. In this way, all identifiable and potentially identifiable links can be determined without bias using only 20% of all available end-to-end measurements. Extensive simulations comparing our approach with a previous classical approach strongly confirm its promising performance.
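    The round-based elimination idea can be sketched in a few lines. The sketch below is a minimal illustration assuming independent link losses, so a path's end-to-end success rate is the product of its links' success rates; the abstract's actual path selection and failure localization are more involved.

```python
import math

def adaptive_loss_inference(paths, path_success):
    """Round-based sketch: repeatedly solve any link that is the only
    unresolved link on some path, divide its contribution out, and
    continue until no new link becomes identifiable.

    paths:        dict mapping path id -> set of link ids on that path
    path_success: dict mapping path id -> end-to-end success probability
    Returns a dict mapping solved link id -> inferred loss rate.
    """
    solved = {}  # link id -> success probability
    progress = True
    while progress:
        progress = False
        for pid, links in paths.items():
            unknown = [l for l in links if l not in solved]
            if len(unknown) == 1:
                # divide out the already-solved links' success rates
                known = math.prod(solved[l] for l in links if l in solved)
                solved[unknown[0]] = path_success[pid] / known
                progress = True
    return {l: 1.0 - p for l, p in solved.items()}
```

For example, a one-link path P1 = {a} with success rate 0.9 pins down link a, after which a two-link path P2 = {a, b} with success rate 0.72 yields a success rate of 0.8 for b, i.e. a loss rate of 0.2.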

    Modelling the genomic structure, and antiviral susceptibility of Human Cytomegalovirus

    Human Cytomegalovirus (HCMV) is found ubiquitously in humans worldwide, and once acquired, the infection persists within the host throughout their life. Although immunocompetent people are rarely affected by HCMV infection, HCMV-related diseases pose a major health problem worldwide for those with compromised or suppressed immune systems, such as transplant recipients. Additionally, congenital transmission of HCMV is the most common infectious cause of birth defects globally and is associated with a substantial economic burden. This thesis explores the application of statistical modelling and genomics to unpick three key areas of interest in HCMV research. First, a comparative genomics analysis of global HCMV strains was undertaken to delineate the molecular population structure of this highly variable virus. By including in-house-sequenced viruses of African origin and by developing a statistical framework to deconvolute highly variable regions of the genome, novel and important insights into the co-evolution of HCMV with its host were uncovered. Second, a rich database relating mutations to drug sensitivity was curated for all antiviral-treated herpesviruses. This structured information, along with the development of a mutation annotation pipeline, allowed the further development of statistical models that predict the phenotype of a virus from its sequence. The predictive power of these models was validated for HSV1 using external unseen mutation data provided in collaboration with the UK Health Security Agency. Finally, a nonlinear mixed-effects model, expanded to account for ganciclovir pharmacokinetics and pharmacodynamics, was developed by making use of rich temporal HCMV viral load data. This model allowed the estimation of the impact of immune clearance versus antiviral inhibition in controlling HCMV lytic replication in already established infections post-haematopoietic stem cell transplant.

    Inferring infection hazard in wildlife populations by linking data across individual and population scales

    Our ability to infer unobservable disease-dynamic processes such as force of infection (FOI; the infection hazard for susceptible hosts) has transformed our understanding of disease transmission mechanisms and our capacity to predict disease dynamics. Conventional methods for inferring FOI estimate a time-averaged value and are based on population-level processes. Because many pathogens exhibit epidemic cycling and FOI is the result of processes acting across the scales of individuals and populations, a flexible framework that extends to epidemic dynamics and links within-host processes to FOI is needed. Specifically, within-host antibody kinetics in wildlife hosts can be short-lived and produce patterns that are repeatable across individuals, suggesting individual-level antibody concentrations could be used to infer time since infection and hence FOI. Using simulations and case studies (influenza A in lesser snow geese and Yersinia pestis in coyotes), we argue that with careful experimental and surveillance design, the population-level FOI signal can be recovered from individual-level antibody kinetics, despite substantial individual-level variation. In addition to improving inference, the cross-scale quantitative antibody approach we describe can reveal insights into the drivers of individual-level variation in disease response, and into the role of poorly understood processes, such as secondary infections, in population-level disease dynamics.
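    As a toy illustration of the individual-to-population link: suppose antibody titers decay exponentially from a common peak; an observed titer then inverts to a time since infection, and the fraction of recently infected animals converts to a hazard. The decay parameters below are made up for illustration, not the case studies' fitted kinetics.

```python
import math

def time_since_infection(titer, peak_titer=8.0, decay_rate=0.1):
    """Invert a toy antibody-decay curve titer(t) = peak_titer *
    exp(-decay_rate * t) to estimate time since infection."""
    return math.log(peak_titer / titer) / decay_rate

def estimate_foi(titers, window, **kwargs):
    """Crude FOI estimate: the fraction of sampled hosts infected within
    `window` time units, converted to a constant per-unit-time hazard."""
    recent = sum(1 for t in titers if time_since_infection(t, **kwargs) <= window)
    frac = recent / len(titers)
    return -math.log(1.0 - frac) / window
```

In practice the abstract's point is that individual-level variation and secondary infections complicate this inversion, which is why careful surveillance design and the full kinetic model matter.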

    Sugarbeet Model Development for Soil and Water Quality Assessment

    Sugarbeet (Beta vulgaris) is considered one of the most viable alternatives to corn for biofuel production, as it may qualify as an “advanced” biofuel feedstock under EISA 2007. Production of deep-rooted sugarbeet may play a significant role in enhancing utilization of deeper-layer soil water and nutrients, and thus may significantly affect soil health and water quality through recycling of water and nutrients. A model can be useful in predicting sugarbeet growth and its effect on soil and water quality. A sugarbeet model was developed by adopting and modifying the Crop Environment and Resource Synthesis-Beet (CERES-Beet) model. It was linked to the Cropping System Model (CSM) of the Decision Support System for Agrotechnology Transfer (DSSAT) and termed CSM-CERES-Beet. The CSM-CERES-Beet model was then linked to the plant growth module of the Root Zone Water Quality Model (RZWQM2) to simulate crop growth and soil water and NO3-N transport in crop fields. For both DSSAT and RZWQM2, the parameter estimation (PEST) software was used for model calibration, evaluation, predictive uncertainty analysis, and sensitivity and identifiability analysis. The DSSAT model was evaluated with two sets of experimental data collected in two different regions under different environmental conditions, one in Bucharest, Romania and the other in Carrington, ND, USA, while RZWQM2 was evaluated only with the Carrington, ND experimental data. Both DSSAT and RZWQM2 performed well in simulating leaf area index, leaf or top weight, and root weight for the datasets used (d-statistic = 0.783-0.993, rRMSE = 0.006-1.014). RZWQM2 was also used to evaluate soil water and NO3-N contents and performed well (d-statistic = 0.709-0.992, rRMSE = 0.066-1.211). RZWQM2 was then applied to simulate the effects of crop rotation and tillage operations on sugarbeet production. Hypothetical crop rotation and tillage scenarios identified wheat as the most suitable previous-year crop for sugarbeet and the moldboard plow as the most suitable tillage method. Both DSSAT and RZWQM2 enhanced with CSM-CERES-Beet may be used to simulate sugarbeet production under different management scenarios, for different soils, and under different climatic conditions in the Red River Valley.
USDA National Institute of Food and Agriculture Foundational Program (Award No. 2013-67020-21366).
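    The goodness-of-fit measures quoted above can be reproduced directly. The sketch below assumes the d-statistic is Willmott's index of agreement and rRMSE is root-mean-square error normalized by the observed mean, the usual conventions in this literature.

```python
import math

def d_statistic(obs, pred):
    """Willmott's index of agreement: 1 is a perfect fit, 0 is no skill."""
    obar = sum(obs) / len(obs)
    num = sum((o - p) ** 2 for o, p in zip(obs, pred))
    den = sum((abs(p - obar) + abs(o - obar)) ** 2 for o, p in zip(obs, pred))
    return 1.0 - num / den

def rrmse(obs, pred):
    """Relative RMSE: root-mean-square error over the observed mean."""
    mse = sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs)
    return math.sqrt(mse) / (sum(obs) / len(obs))
```

A perfect simulation gives d = 1 and rRMSE = 0; values such as d = 0.98 for slightly perturbed predictions match the ranges reported above.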

    A traffic model adapted to the workload volatility of a Video on Demand service: identification, validation, and application to dynamic resource management

    Dynamic resource management has become an active area of research in the Cloud Computing paradigm. The cost of resources varies significantly depending on how they are configured and used, so efficient resource management is of prime interest to both cloud providers and cloud users. In this report we suggest a probabilistic resource provisioning approach that can be exploited as the input of a dynamic resource management scheme. Using a Video on Demand use case to justify our claims, we propose an analytical model, inspired by standard epidemic-spreading models, to represent sudden and intense workload variations. As an essential step, we also derive a heuristic identification procedure to calibrate all the model parameters and evaluate the performance of our estimator on synthetic time series. We show how well our model fits real workload traces, compared with the stationary case, in terms of steady-state probability and autocorrelation structure. We find that the resulting model verifies a Large Deviation Principle that statistically characterizes extreme rare events, such as those produced by "buzz effects" that may cause workload overflow in the VoD context. This analysis provides valuable insight into the abnormal behaviors that systems can be expected to exhibit. We exploit the information obtained from the Large Deviation Principle in the proposed Video on Demand use case to define Service Level Agreement policies. We believe these policies for elastic resource provisioning and usage may be of interest to all stakeholders in the emerging context of cloud networking.
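    A minimal discrete-time rendition of the epidemic-inspired idea: each connected user recruits another with some probability (the word-of-mouth "buzz" term), background arrivals trickle in, and users disconnect independently. The parameters are illustrative, not the report's calibrated values.

```python
import random

def simulate_workload(steps, p_arrival=0.5, p_spread=0.2, p_depart=0.25, seed=1):
    """Simulate the number of connected users over time. The recruitment
    term makes arrivals proportional to the current audience, which is
    what produces sudden, intense load surges."""
    rng = random.Random(seed)
    users, trace = 0, []
    for _ in range(steps):
        recruits = sum(rng.random() < p_spread for _ in range(users))
        background = int(rng.random() < p_arrival)
        departures = sum(rng.random() < p_depart for _ in range(users))
        users += recruits + background - departures
        trace.append(users)
    return trace
```

Because recruitment scales with the current audience while departures are per-user, the chain occasionally produces the heavy excursions that a stationary arrival model would miss, which is the behavior the Large Deviation analysis quantifies.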

    Model-Based Dynamic Resource Management for Service Oriented Clouds

    Cloud computing is a flexible platform for software as a service, and more and more applications are being deployed on the cloud. Major challenges in cloud computing include how to characterize the workload of the applications and how to manage cloud resources efficiently by sharing them among many applications. The current state of the art considers a simplified model of the system, either ignoring the software components altogether or ignoring the relationships between individual software services. This thesis considers the following resource management problems for cloud-based service providers: (i) how to estimate the parameters of the current workload, (ii) how to meet Quality of Service (QoS) targets while minimizing infrastructure cost, and (iii) how to allocate resources considering the performance costs of virtual machine reconfigurations. To address these problems, we propose a model-based feedback loop approach. The cloud infrastructure, the services, and the applications are modelled using Layered Queueing Models (LQMs). These models are then optimized. Mathematical techniques are used to reduce the complexity of the models and address scalability issues. The main contributions of this thesis are: (i) Extended Kalman Filter (EKF) based techniques, improved by dynamic clustering, for scalable estimation of workload parameters; (ii) a combination of adaptive empirical models (tuned at runtime) and stepwise optimizations for improving overall allocation performance; and (iii) dynamic service placement algorithms that consider the cost of virtual machine reconfiguration.
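    The EKF-based workload tracking can be illustrated with its scalar linear special case: a drifting parameter (say, an arrival rate) modeled as a random walk and corrected by noisy measurements. The noise variances here are illustrative stand-ins, not the thesis's tuned values.

```python
def kalman_update(x, P, z, q=0.01, r=0.25):
    """One scalar Kalman-filter step: x is the current estimate, P its
    variance, z a noisy new measurement; q and r are the process and
    measurement noise variances."""
    P = P + q            # predict: the parameter may have drifted
    K = P / (P + r)      # Kalman gain: trust placed in the new measurement
    x = x + K * (z - x)  # correct the estimate toward the measurement
    P = (1.0 - K) * P    # posterior uncertainty shrinks
    return x, P
```

Feeding a stream of measurements of a stable workload drives the estimate toward the true value while P settles at a small steady-state variance; the thesis's EKF extends this to nonlinear, multi-parameter layered queueing models.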

    Multi-Scale Modeling of the Innate Immune System: A Dynamic Investigation into Pathogenic Detection

    Having a well-functioning immune system can mean the difference between a mild ailment and a life-threatening infection; however, predicting how a disease will progress has proven to be a significant challenge. The dynamics driving the immune system are governed by a complex web of cell types, signaling proteins, and regulatory genes that have to strike a balance between disease elimination and rampant inflammation. An insufficient immune response will induce a prolonged disease state, but an excessive response will cause unnecessary cell death and extensive tissue damage. This balance is usually self-regulated, but medical intervention is often necessary to correct imbalances. Unfortunately, these therapies are imperfect and accompanied by mild to debilitating side effects caused by off-target activity. By developing a detailed understanding of the immune response, this dissertation aims to predict how the immune system will respond to infection and to determine how new potential therapies could overcome these threats. Computational modeling provides an opportunity to synthesize current immunological observations and predict response outcomes to pathogenic infections. When coupled with experimental data, these models can simulate the signaling pathway dynamics that drive the immune response, incorporate regulatory feedback mechanisms, and capture inherent biological noise. Taken together, computational modeling can explain emergent behavior that cannot be determined from experiment alone. This dissertation utilizes two computational modeling techniques: ordinary differential equations (ODEs) and agent-based models (ABMs). Ultimately, they are combined in a novel way to model cellular immune responses across multiple length scales, creating a more accurate representation of the pathogenic response.
TLR4 and cGAS signaling are prominent in a number of diseases and dysregulations including, but not limited to, autoimmunity, cancer, HIV, HSV, tuberculosis, and sepsis. These two signaling pathways are so prevalent because they are activated extremely early and help drive downstream immune signaling. Modeling how cells dynamically regulate these pathways is critical for understanding how diseases circumvent feedback mechanisms and how new therapies can restore immune function to combat disease progression. Using ODE and ABM techniques, these studies aim to incrementally expand our knowledge of innate immune signaling and to understand how feedback mechanisms control disease severity.
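    As a flavor of the ODE side, the sketch below Euler-integrates a generic negative-feedback motif of the kind described here: a signaling activity S driven by a stimulus and repressed by an inhibitor I that S itself induces. The rate constants are illustrative, not fitted TLR4 or cGAS parameters.

```python
def simulate_feedback(steps=2000, dt=0.01, stimulus=1.0,
                      k_act=1.0, k_inh=2.0, k_prod=0.5, k_deg=0.3):
    """Euler integration of a toy negative-feedback pathway:
        dS/dt = k_act*stimulus - k_inh*I*S - k_deg*S
        dI/dt = k_prod*S - k_deg*I
    Returns the trajectory of S, which peaks and then relaxes as the
    inhibitor accumulates; the hallmark of negative feedback."""
    S = I = 0.0
    trace = []
    for _ in range(steps):
        dS = k_act * stimulus - k_inh * I * S - k_deg * S
        dI = k_prod * S - k_deg * I
        S += dt * dS
        I += dt * dI
        trace.append(S)
    return trace
```

The transient overshoot followed by relaxation is exactly the kind of emergent behavior such models expose: disabling the feedback (k_inh = 0) would instead let S climb monotonically, mimicking unresolved inflammation.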