
    Modelling the economic efficiency of using different strategies to control Porcine Reproductive & Respiratory Syndrome at herd level

    PRRS is among the diseases with the highest economic impact in pig production worldwide. Different strategies have been developed and applied to combat PRRS at farm level. The broad variety of available intervention strategies makes it difficult to decide on the most cost-efficient strategy for a given farm situation, as this depends on many farm-specific factors such as disease severity, prices and farm structure. The aim of this study was to create a simulation tool to estimate the cost-efficiency of different control strategies at individual farm level. The baseline is a model that estimates the costs of PRRS, based on changes in health and productivity, in a specific farm setting (e.g. farm type, herd size, type of batch farrowing). The model evaluates different intervention scenarios: depopulation/repopulation (D/R), close & roll-over (C&R), mass vaccination of sows (MS), mass vaccination of sows and vaccination of piglets (MS + piglets), improvements in internal biosecurity (BSM), and combinations of vaccinations with BSM. Data on improvements in health and productivity parameters for each intervention were obtained through a literature review and from expert opinion. The economic efficiency of the different strategies was assessed over 5 years through investment appraisals: the resulting expected value (EV) indicated the most cost-effective strategy. Calculations were performed for 5 example scenarios with varying farm type (farrow-to-finish or breeding herd), disease severity (slightly, moderately or severely affected) and PRRSV detection (yes or no). The assumed herd size was 1000 sows, with a farm and price structure as commonly found in Germany. In a moderately affected (moderate deviations in health and productivity parameters from what could be expected in an average negative herd), unstable farrow-to-finish herd, the most cost-efficient strategies according to their median EV were C&R (€1,126,807) and MS + piglets (€1,114,649). In a slightly affected farrow-to-finish herd with no virus detected, the highest median EVs were for MS + piglets (€721,745) and MS (€664,111). Results indicate that the expected benefits of interventions and the most efficient strategy depend on the individual farm situation, e.g. disease severity. The model provides new insights into the cost-efficiency of various PRRSV intervention strategies at farm level. It is a valuable tool for farmers and veterinarians to estimate the expected economic consequences of an intervention for a specific farm setting and thus enables better-informed decisions.
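
The investment-appraisal logic described above can be illustrated with a small sketch: for each strategy, the expected value is the discounted sum of annual losses avoided minus intervention costs over the 5-year horizon. All figures, the discount rate, and the cost/benefit structure below are hypothetical placeholders and are not taken from the published model.

```python
# Minimal sketch of a 5-year investment appraisal comparing PRRS control
# strategies by expected value (EV). All figures are hypothetical placeholders,
# not the parameters used in the published model.

ANNUAL_DISCOUNT_RATE = 0.03
HORIZON_YEARS = 5

def expected_value(annual_loss_avoided, annual_cost, one_off_cost,
                   rate=ANNUAL_DISCOUNT_RATE, years=HORIZON_YEARS):
    """Discounted net benefit of an intervention over the appraisal horizon."""
    ev = -one_off_cost
    for t in range(1, years + 1):
        ev += (annual_loss_avoided - annual_cost) / (1 + rate) ** t
    return ev

# Hypothetical scenario: moderately affected 1000-sow farrow-to-finish herd.
strategies = {
    "D/R":          {"annual_loss_avoided": 450_000, "annual_cost": 30_000, "one_off_cost": 600_000},
    "C&R":          {"annual_loss_avoided": 400_000, "annual_cost": 25_000, "one_off_cost": 300_000},
    "MS + piglets": {"annual_loss_avoided": 330_000, "annual_cost": 60_000, "one_off_cost": 0},
}

if __name__ == "__main__":
    ranked = sorted(strategies.items(), key=lambda kv: expected_value(**kv[1]), reverse=True)
    for name, params in ranked:
        print(f"{name:>12}: EV over {HORIZON_YEARS} years = €{expected_value(**params):,.0f}")
```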

    Satellite Derived Forest Phenology and Its Relation with Nephropathia Epidemica in Belgium

    The connection between nephropathia epidemica (NE) and vegetation dynamics has been emphasized in recent studies. A changing climate has been suggested as a triggering factor of recently observed epidemiological peaks in reported NE cases. We have investigated whether there is a connection between the NE occurrence pattern in Belgium and specific trends in remotely sensed phenology parameters of broad-leaved forests. The analysis of time series of the MODIS Enhanced Vegetation Index revealed that changes in forest phenology, considered in the literature as an effect of climate change, may affect the mechanics of NE transmission.
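
As a rough illustration of the kind of remotely sensed phenology metric involved, the sketch below estimates a start-of-season date from an EVI time series using a simple amplitude-threshold rule; the data are synthetic and the threshold rule is an assumption, not the method used in the paper.

```python
import numpy as np

# Minimal sketch of deriving a simple phenology metric (start of season, SOS)
# from a MODIS EVI time series, assuming 16-day composites and a fixed
# amplitude threshold. The values below are synthetic, for illustration only.

def start_of_season(evi, threshold_fraction=0.5):
    """Approximate day of year when EVI first exceeds a fraction of its seasonal amplitude."""
    evi = np.asarray(evi, dtype=float)
    amplitude = evi.max() - evi.min()
    threshold = evi.min() + threshold_fraction * amplitude
    composite_index = int(np.argmax(evi >= threshold))  # first composite above threshold
    return composite_index * 16 + 1  # convert composite index to day of year

# Synthetic one-year EVI curve for a broad-leaved forest pixel (23 composites).
doy = np.arange(23) * 16 + 1
evi_series = 0.25 + 0.35 * np.exp(-((doy - 200) / 60.0) ** 2)

print("Estimated start of season (DOY):", start_of_season(evi_series))
```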

    Model-based prediction of outbreak dynamics of nephropathia epidemica using climate and vegetation data

    Wildlife-originated zoonotic diseases in general are a major contributor to emerging infectious diseases. Fifteen emerging zoonotic or vector-borne infections with increasing impact on humans in Europe were identified during the period 2000-2006. Global climate change may be a major contributor to the spread of these zoonotic diseases. Rodent-borne hantavirus infections are part of this list. Puumala virus (PUUV), hosted by the bank vole (Myodes glareolus), is such a hantavirus. It is common over vast areas of Europe and causes a generally mild form of hemorrhagic fever with renal syndrome (HFRS) called nephropathia epidemica (NE). It is well established that climate is an important determinant of the spatial and temporal distribution of vectors (in epidemiology, a vector is any agent, whether person, animal or microorganism, that carries and transmits an infectious pathogen into another living organism) and pathogens. Therefore, a change in climate is expected to cause changes in the geographical range, the seasonality (inter-annual variability) and the incidence rate (with or without changes in geographical or seasonal patterns) of NE outbreaks. The main aim of this dissertation is to develop modelling approaches for monitoring and predicting NE outbreaks, using compact model structures that account for the most significant environmental factors, in the form of climate and vegetation data, affecting the temporal and spatial pattern of NE cases. In chapters 2 to 6 of this dissertation we discussed in detail how data-based (mechanistic) models can be used to model and predict outbreaks of nephropathia epidemica (NE) as a basis for the development of disease prevention and control strategies. In contrast with the mechanistic modelling approach, data-based modelling techniques identify the dynamic characteristics of processes from measured data and are as such (initially) not based on a priori process knowledge. In this dissertation, we discussed how knowledge obtained from mechanistic epidemiological population models can be used to improve the data-based model structures. In chapter 2, we discussed the importance of the carrying capacity for modelling NE prevalence. Furthermore, we discussed the link between carrying capacity and forest phenology, which explains the possibility of predicting NE outbreaks based only on climatological and vegetation data, without any knowledge of the bank vole's population dynamics (chapter 3). In the second part of this thesis we described a modelling approach to predict NE outbreaks using only the measured population dynamics of the bank voles together with knowledge from a mechanistic epidemiological model (chapters 3 and 4). Human hantavirus epidemics have often been explained by bank vole abundance. Therefore, in order to control and prevent the occurrence of NE cases (as an example of a zoonotic disease), it is important to detect and monitor the environmental factors that affect the spatial and temporal variations of the bank vole population. A method was described to produce maps of the potential geographical distribution of bank voles in Western Europe based on occurrence data points of bank voles, climate information and land cover maps (chapter 5), and in chapter 6 we modelled the bank vole population dynamics in Belgium and Finland using a data-based modelling approach. The results of the current study help to identify the environmental factors with a significant effect on the spread of the disease.
Developing a dynamic data-based mechanistic modelling approach for NE may form the basis of an expert tool to predict and prevent the incidence of NE cases by making use of remote sensing tools for measuring broad-leaved forest phenology and monitoring vegetation dynamics together with climatological data.
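
As a simplified illustration of the data-based modelling idea, the sketch below regresses monthly NE case counts on lagged vegetation and temperature series; the data, predictors and lags are synthetic placeholders rather than those identified in the dissertation.

```python
import numpy as np

# Minimal sketch of a data-based modelling step: regress monthly NE case counts
# on lagged climate and vegetation predictors. Data, lags and predictors are
# synthetic placeholders, not those used in the dissertation.

rng = np.random.default_rng(0)
n_months = 120
temperature = 10 + 8 * np.sin(2 * np.pi * np.arange(n_months) / 12) + rng.normal(0, 1, n_months)
evi = 0.4 + 0.2 * np.sin(2 * np.pi * (np.arange(n_months) - 2) / 12) + rng.normal(0, 0.02, n_months)
cases = 5 + 16 * np.roll(evi, 6) + 0.3 * np.roll(temperature, 3) + rng.normal(0, 2, n_months)

LAG_EVI, LAG_TEMP = 6, 3  # assumed lags (in months) between drivers and reported cases
start = max(LAG_EVI, LAG_TEMP)
X = np.column_stack([
    np.ones(n_months - start),                      # intercept
    evi[start - LAG_EVI:n_months - LAG_EVI],        # EVI lagged by 6 months
    temperature[start - LAG_TEMP:n_months - LAG_TEMP],  # temperature lagged by 3 months
])
y = cases[start:]

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
predicted = X @ coef
print("Coefficients (intercept, EVI lag-6, temperature lag-3):", np.round(coef, 2))
print("In-sample correlation with observed cases:", np.round(np.corrcoef(predicted, y)[0, 1], 2))
```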

    Exploiting Censored Information in Self-Training for Time-to-Event Prediction

    A common problem in medical applications is predicting the time until an event of interest, such as the onset of a disease, time to tumor recurrence, or time to mortality. Traditionally, classical survival analysis techniques have been used to address this problem. However, these techniques are of limited use when considering nonlinear and interaction effects among biomarkers and high-dimensional survival datasets. Although supervised machine learning techniques have shown some advantages over standard statistical methods in handling high-dimensional datasets, their application to survival analysis, particularly in the context of feature-based approaches, is at best limited. A major reason behind this is the difficulty of processing censored data, which is a common component of survival analysis. In this paper, we have transformed the time-to-event prediction problem into a semi-supervised regression problem. We utilize a self-training wrapper approach, where an outer layer guides the iterative refinement of predictions. This approach enhances the performance of our model by leveraging confident predictions from censored instances. The self-training wrapper is applied in conjunction with random survival forests as the base learner. In this approach, censored observations are introduced as partially labeled observations, since their predicted time (target value) should exceed the censoring time. First, the algorithm builds a base model over the observed instances and then augments them iteratively with highly confident predictions over the censored set, using a smart stopping criterion based on the censoring time. The proposed approach has been evaluated and compared on fifteen real-world survival analysis datasets, including clinical and high-dimensional data. The ability of our proposed approach to integrate partial supervision information within a semi-supervised learning strategy has enabled it to achieve competitive performance compared to baseline models, particularly in the high-dimensional regime.
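
The self-training loop described above can be sketched as follows. This is a minimal illustration that uses a generic random-forest regressor as a stand-in for the random survival forest base learner named in the paper; the confidence margin, stopping rule details and synthetic data are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Minimal sketch of self-training with censored data: censored instances whose
# predicted event time comfortably exceeds their censoring time are pseudo-labeled
# and added to the training set, then the model is retrained.

def self_train_survival(X_obs, t_obs, X_cens, t_cens, margin=1.1, max_rounds=10):
    """Iteratively pseudo-label confident censored instances and retrain."""
    X_lab, y_lab = X_obs.copy(), t_obs.copy()
    X_pool, c_pool = X_cens.copy(), t_cens.copy()
    model = RandomForestRegressor(n_estimators=200, random_state=0)

    for _ in range(max_rounds):
        model.fit(X_lab, y_lab)               # base model, initially on observed instances only
        if len(X_pool) == 0:
            break
        pred = model.predict(X_pool)
        confident = pred > margin * c_pool    # prediction exceeds censoring time by a margin
        if not confident.any():
            break                             # stopping criterion based on censoring times
        X_lab = np.vstack([X_lab, X_pool[confident]])
        y_lab = np.concatenate([y_lab, pred[confident]])
        X_pool, c_pool = X_pool[~confident], c_pool[~confident]
    return model

# Synthetic example: 200 observed and 100 censored instances with 5 features.
rng = np.random.default_rng(1)
X_o, X_c = rng.normal(size=(200, 5)), rng.normal(size=(100, 5))
t_o = np.exp(1 + X_o[:, 0]) + rng.exponential(1, 200)          # observed event times
t_c = 0.5 * (np.exp(1 + X_c[:, 0]) + rng.exponential(1, 100))  # censoring times
model = self_train_survival(X_o, t_o, X_c, t_c)
print("Trained on", model.n_features_in_, "features")
```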

    The effect of different consensus definitions on diagnosing acute kidney injury events and their association with in-hospital mortality.

    BACKGROUND: Due to the existence of different acute kidney injury (AKI) definitions, analyzing AKI incidence and associated outcomes is challenging. We investigated the incidence of AKI events defined by 4 different definitions (standard AKIN and KDIGO, and modified AKIN-4 and KDIGO-4) and their association with in-hospital mortality. METHODS: A total of 7242 adult Greek subjects were investigated. To find the association between AKI stages and in-hospital mortality, we considered both the number of AKI events and the most severe stage of AKI reached by each patient, adjusted for age, sex, and AKI staging, using multivariable logistic regression. To predict mortality in AKI patients, as defined by the four definitions, a classification task with two prediction models (random forest and logistic regression) was also conducted. RESULTS: The incidence of AKI using KDIGO-4 was 6.72% for stage 1a, 15.71% for stage 1b, 8.06% for stage 2, and 2.97% for stage 3; the corresponding percentages for AKIN-4 were 11%, 5.83%, 1.75%, and 0.33% for stage 1a, stage 1b, stage 2, and stage 3, respectively. Results showed that KDIGO-4 is more sensitive in detecting AKI events. In-hospital mortality increased as the stage of AKI increased for both KDIGO-4 and AKIN-4; however, KDIGO-4 (KDIGO) had a higher odds ratio at a higher stage of AKI compared to AKIN-4 (AKIN). Lastly, when using KDIGO, the random forest and logistic regression models performed almost equally, with c-statistics of 0.825 and 0.854, respectively. CONCLUSION: The present study confirms that within KDIGO AKI stage 1, there are two sub-populations with different clinical outcomes (mortality).
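
As an illustration of the analysis pipeline described above (multivariable logistic regression of in-hospital mortality adjusted for age, sex and AKI stage, evaluated by the c-statistic), here is a minimal sketch on synthetic data; the coding of AKI stage as a single ordinal covariate and all numbers are assumptions, not the study's actual data or model specification.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Minimal sketch: logistic regression of in-hospital mortality on age, sex and
# AKI stage, evaluated by the c-statistic (ROC AUC). Synthetic placeholder data.

rng = np.random.default_rng(42)
n = 5000
age = rng.normal(65, 15, n)
sex = rng.integers(0, 2, n)  # 0 = female, 1 = male
aki_stage = rng.choice([0, 1, 2, 3, 4], n, p=[0.67, 0.07, 0.15, 0.08, 0.03])  # 0 = no AKI, 1 = 1a, 2 = 1b, ...
logit = -5 + 0.03 * age + 0.2 * sex + 0.6 * aki_stage
mortality = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([age, sex, aki_stage])
X_train, X_test, y_train, y_test = train_test_split(X, mortality, test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Adjusted odds ratios (age, sex, AKI stage):", np.round(np.exp(model.coef_[0]), 2))
print("c-statistic:", round(roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]), 3))
```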