13 research outputs found

    Dynamic chain graph models for time series network data

    This paper introduces a new class of Bayesian dynamic models for inference and forecasting in high-dimensional time series observed on networks. The new model, called the dynamic chain graph model, is suitable for multivariate time series which exhibit symmetries within subsets of series and a causal drive mechanism between these subsets. The model can accommodate high-dimensional, non-linear and non-normal time series and enables local and parallel computation by decomposing the multivariate problem into separate, simpler sub-problems of lower dimension. The advantages of the new model are illustrated by forecasting traffic network flows and by modelling gene expression data from transcriptional networks.
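The local, parallel computation the abstract describes, splitting a multivariate series into subsets linked by a causal structure and fitting each subset separately, can be sketched in a toy form. The two-subset layout, the variable names, and the simple linear regression below are illustrative assumptions, not the paper's actual dynamic chain graph model:

```python
import numpy as np

# Toy sketch: an "upstream" subset drives a "downstream" subset, and each
# subset is fitted on its own lagged values plus its parent subset's lagged
# values -- the kind of local, parallelisable sub-problem the chain-graph
# decomposition enables.  All names and the linear form are assumptions.

rng = np.random.default_rng(0)
T = 200
upstream = rng.normal(size=T).cumsum()            # parent subset (random walk)
downstream = 0.5 * upstream + rng.normal(size=T)  # child subset driven by parent

def fit_local(y, parents):
    """Least-squares fit of one subset on its lagged self and lagged parent."""
    X = np.column_stack([np.ones(T - 1), y[:-1], parents[:-1]])
    coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    return coef

coef = fit_local(downstream, upstream)
print(coef)  # intercept, own-lag coefficient, parent-lag coefficient
```

In the full model each subset could be fitted in parallel in this fashion, conditioning only on its parent subsets.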

    Genetic differences in host infectivity affect disease spread and survival in epidemics

    Survival during an epidemic is partly determined by host genetics. While quantitative genetic studies typically consider survival as an indicator of disease resistance (an individual’s propensity to avoid becoming infected or diseased), mortality rates of populations undergoing an epidemic are also affected by endurance (the propensity of a diseased individual to survive the infection) and infectivity (i.e. the propensity of an infected individual to transmit disease). Few studies have demonstrated genetic variation in disease endurance, and no study has demonstrated genetic variation in host infectivity, despite strong evidence for considerable phenotypic variation in this trait. Here we propose an experimental design and statistical models for estimating genetic diversity in all three host traits. Using an infection model in fish, we provide, for the first time, direct evidence for genetic variation in host infectivity, in addition to variation in resistance and endurance. We also demonstrate how genetic differences in these three traits contribute to survival. Our results imply that animals can evolve different disease response types affecting epidemic survival rates, with important implications for understanding and controlling epidemics.
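The notion that host genotype can alter infectivity, and hence epidemic outcomes, can be illustrated with a toy transmission simulation. This is not the paper's statistical model; the genotype effect size, contact number, and transmission probability are all assumptions for demonstration:

```python
import random

# Illustrative toy: hosts carry a binary "genotype" that doubles their
# infectivity, and we count the secondary infections each infected host
# causes over a fixed number of contacts.  All parameters are assumptions.

random.seed(1)
BASE_RATE = 0.05   # per-contact transmission probability, low-infectivity type
CONTACTS = 40      # contacts made while infectious

def secondary_cases(high_infectivity: bool) -> int:
    p = BASE_RATE * (2.0 if high_infectivity else 1.0)
    return sum(random.random() < p for _ in range(CONTACTS))

low = [secondary_cases(False) for _ in range(500)]
high = [secondary_cases(True) for _ in range(500)]
mean_low = sum(low) / len(low)
mean_high = sum(high) / len(high)
print(mean_low, mean_high)  # high-infectivity hosts cause more secondary cases
```

Estimating such genetic differences from real epidemic data requires the dedicated experimental design and models the abstract refers to; this sketch only shows why the trait matters for spread.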

    Pathogen transmission from vaccinated hosts can cause dose-dependent reduction in virulence

    Many livestock and human vaccines are leaky because they block symptoms but do not prevent infection or onward transmission. This leakiness is concerning because it increases the vaccination coverage required to prevent disease spread and can promote the evolution of increased pathogen virulence. Despite leakiness, vaccination may reduce pathogen load, affecting disease transmission dynamics. However, the impacts on post-transmission disease development and infectiousness in contact individuals are unknown. Here, we use transmission experiments involving Marek's disease virus (MDV) in chickens to show that vaccination with a leaky vaccine substantially reduces viral load in both vaccinated individuals and the unvaccinated contact individuals they infect. Consequently, contact birds are less likely to develop disease symptoms or die, show less severe symptoms, and shed less infectious virus themselves when infected by vaccinated birds. These results highlight that even partial vaccination with a leaky vaccine can have unforeseen positive consequences in controlling the spread and symptoms of disease.

    Entropy analysis of high-definition transcranial electric stimulation effects on EEG dynamics

    A foundation of medical research is time series analysis: the behavior of variables of interest with respect to time. Time series data are often analyzed using the mean, with statistical tests applied to mean differences, under the assumption that the data are stationary. Although widely practiced, this method has limitations. Here we present an alternative statistical approach, with a sample analysis, that provides a summary statistic accounting for the non-stationary nature of time series data. This work discusses the use of entropy as a measure of the complexity of a time series, in the context of neuroscience, given the non-stationary character of the data. To elucidate our argument, we conducted entropy analysis on a sample of electroencephalographic (EEG) data from an interventional study using non-invasive electrical brain stimulation. We demonstrate that entropy analysis can identify intervention-related change in EEG data, supporting that entropy can be a useful “summary” statistic for non-linear dynamical systems.
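One widely used complexity measure for this kind of analysis is sample entropy. The abstract does not specify which entropy statistic the study used, so the following is a generic, hedged sketch rather than the study's method:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy of a 1-D series: -ln(A/B), where B counts Chebyshev-
    distance template matches of length m and A matches of length m+1,
    self-matches excluded.  r is the tolerance as a fraction of the
    series' standard deviation."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def count_matches(length):
        templates = np.lib.stride_tricks.sliding_window_view(x, length)
        count = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist <= tol))
        return count

    B = count_matches(m)
    A = count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(0)
noise_ent = sample_entropy(rng.normal(size=500))               # irregular signal
sine_ent = sample_entropy(np.sin(np.linspace(0, 20 * np.pi, 500)))  # regular signal
print(noise_ent, sine_ent)  # the regular sine scores much lower than noise
```

The point mirrors the abstract's argument: a single entropy value summarizes regularity without assuming stationarity of the mean.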

    Disentangling genetic variation for resistance and endurance to scuticociliatosis in turbot using pedigree and genomic information

    Selective breeding for improving host responses to infectious pathogens is a promising option for disease control. In fact, disease resilience, the ability of a host to survive or cope with infectious challenge, has become a highly desirable breeding goal. However, resilience is a complex trait composed of two different host defence mechanisms, namely resistance (the ability of a host to avoid becoming infected or diseased) and endurance (the ability of an infected host to survive the infection). While both could be targeted for genetic improvement, it is currently unknown how they contribute to survival, as reliable estimates of genetic parameters for both traits obtained simultaneously are scarce. A difficulty lies in obtaining endurance phenotypes for genetic analyses. In this study, we present the results of an innovative challenge test carried out in turbot whose design allowed disentangling the genetic basis of resistance and endurance to Philasterides dicentrarchi, a parasite causing scuticociliatosis, a disease that leads to substantial economic losses in the aquaculture industry. A noticeable characteristic of the parasite is that it causes visual signs that can be used to disentangle resistance and endurance. Our results showed the existence of genetic variation for both traits (heritability = 0.26 and 0.12 for resistance and endurance, respectively) and for the composite trait resilience (heritability = 0.15). The genetic correlation between resistance and resilience was very high (0.90), indicating that both are to a large extent the same trait, but no significant genetic correlation was found between resistance and endurance. A total of 18,125 SNPs obtained from 2b-RAD sequencing enabled genome-wide association analyses for detecting QTLs controlling the three traits. A candidate QTL region on linkage group 19 that explains 33% of the additive genetic variance was identified for resilience. The region contains relevant genes related to immune response and defence mechanisms. Although no significant associations were found for resistance, the pattern of association was the same as for resilience. For endurance, one significant association was found on linkage group 2. The accuracy of genomic breeding values was also explored for resilience, showing a 12% increase when compared with the accuracy of pedigree-based breeding values. To our knowledge, this is the first study in turbot disentangling the genetic basis of resistance and endurance to scuticociliatosis. This work was supported by the European Union’s Seventh Framework Programme (KBBE.2013.1.2-659 10) under Grant Agreement No. 613611, Ministerio de Ciencia, Innovación y Universidades, Spain (Grant CGL2016-75904-C2-2-P), and Fondos FEDER.
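The heritability estimates quoted above have the narrow-sense form h² = V_A / (V_A + V_E). A toy midparent-offspring regression, whose slope approximates h², sketches the idea; the simulated values are assumptions for demonstration only and are unrelated to the turbot data, with V_A chosen so h² matches the reported resistance heritability:

```python
import numpy as np

# Toy illustration of narrow-sense heritability, h^2 = V_A / (V_A + V_E),
# via midparent-offspring regression: the slope of offspring phenotype on
# midparent phenotype approximates h^2.  All values are assumptions.

rng = np.random.default_rng(42)
n = 2000
V_A, V_E = 0.26, 0.74   # additive-genetic and environmental variances (h^2 = 0.26)

bv_parents = rng.normal(0.0, np.sqrt(V_A), size=(n, 2))            # breeding values
pheno_parents = bv_parents + rng.normal(0.0, np.sqrt(V_E), size=(n, 2))
midparent = pheno_parents.mean(axis=1)

# offspring breeding value = mean parental breeding value + Mendelian sampling
bv_offspring = bv_parents.mean(axis=1) + rng.normal(0.0, np.sqrt(V_A / 2), size=n)
pheno_offspring = bv_offspring + rng.normal(0.0, np.sqrt(V_E), size=n)

slope = np.polyfit(midparent, pheno_offspring, 1)[0]
print(slope)  # should be close to h^2 = 0.26
```

Real estimates such as those in the study come from pedigree- or genomic-relationship mixed models rather than this simple regression, but the variance-ratio interpretation is the same.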

    Recurrent events; Bootstrap; Asymptotic theory; Coverage probability

    Experiments involving recurrent events provide information about the number of events, the time to their occurrence, and their costs. A typical example is warranty data for manufactured equipment, where the basic goal is to estimate the mean cumulative cost and the mean cumulative number of events. Nelson (1995) presents a methodology, based on asymptotic theory, for obtaining confidence intervals for the cumulative cost and the cumulative number of events. Alternatively, confidence intervals can be constructed via computer-intensive methods, of which the bootstrap is a particular case. In this dissertation we present these two procedures and compare them, checking the coverage probability and the influence of sample size on the precision of the intervals provided by the two methods. Among the advantages of the methodology presented here are its applicability in several areas of knowledge and its easy computational implementation. An example from engineering illustrates the methodology. (Funded by Financiadora de Estudos e Projetos.)
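The bootstrap alternative mentioned above can be sketched as follows: resample units with replacement, recompute the summary of interest, and take percentile confidence limits. The toy data and the simple "cost accumulated by a fixed time" summary are illustrative assumptions, not Nelson's (1995) estimator:

```python
import numpy as np

# Hedged sketch of a percentile bootstrap confidence interval for a mean
# cumulative cost.  The gamma-distributed toy data stand in for the cost
# each unit has accumulated by some fixed time t; they are assumptions.

rng = np.random.default_rng(7)
costs = rng.gamma(shape=2.0, scale=50.0, size=30)  # 30 units' accumulated costs

def percentile_ci(data, stat=np.mean, n_boot=2000, alpha=0.05):
    """Percentile bootstrap CI: resample with replacement, recompute the
    statistic, and return the alpha/2 and 1-alpha/2 quantiles."""
    boot = [stat(rng.choice(data, size=len(data), replace=True))
            for _ in range(n_boot)]
    return np.percentile(boot, [100 * alpha / 2, 100 * (1 - alpha / 2)])

lo, hi = percentile_ci(costs)
print(lo, hi)  # 95% interval for the mean cumulative cost
```

Comparing the coverage probability of such intervals against the asymptotic-theory intervals, across sample sizes, is exactly the kind of simulation study the dissertation describes.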

    Poly-bagging predictors for classification modelling for credit scoring

    Credit scoring modelling comprises one of the leading formal tools for supporting the granting of credit. Its core objective is the generation of a score by means of which potential clients can be ranked in order of their probability of default. A critical factor is whether a credit scoring model is accurate enough to classify clients correctly as good or bad payers. In this context the concept of bootstrap aggregating (bagging) arises. The basic idea is to generate multiple classifiers, by obtaining predicted values from models fitted to several replicated datasets, and then to combine them into a single predictive classification in order to improve classification accuracy. In this paper we propose a new bagging-type variant procedure, which we call poly-bagging, consisting of combining predictors over a succession of resamplings. The study is motivated by credit scoring modelling. The proposed poly-bagging procedure was applied to several artificial datasets and to a real credit-granting dataset, with up to three successions of resamplings. We observed better classification accuracy for the two-bagged and three-bagged models in all considered setups. These results strongly indicate that the poly-bagging approach may improve modelling performance measures while keeping a flexible and straightforward bagging-type structure that is easy to implement.
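The single-level bagging step that poly-bagging iterates can be sketched in a few lines. The nearest-centroid base learner and the two-cluster "good/bad payer" data below are assumptions for illustration, not the paper's actual classifiers or credit data; a second poly-bagging level would repeat the resample-and-aggregate step on the bagged predictor itself:

```python
import numpy as np

# Minimal bagging sketch: fit a weak base classifier to each bootstrap
# resample and combine the fitted models by majority vote.

rng = np.random.default_rng(0)

def fit_centroid(X, y):
    """Nearest-centroid base learner (illustrative assumption)."""
    c0, c1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    return lambda Z: (np.linalg.norm(Z - c1, axis=1)
                      < np.linalg.norm(Z - c0, axis=1)).astype(int)

def bag(X, y, n_models=25):
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, len(X), size=len(X))   # bootstrap resample
        models.append(fit_centroid(X[idx], y[idx]))
    # majority vote across the resampled models
    return lambda Z: (np.mean([m(Z) for m in models], axis=0) >= 0.5).astype(int)

# toy "good/bad payer" data: two Gaussian clusters of applicants
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
predict = bag(X, y)
acc = (predict(X) == y).mean()
print(acc)  # training accuracy of the bagged majority vote
```

Poly-bagging's succession of resamplings amounts to feeding a predictor like `predict` back into another `bag`-style aggregation layer.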