
    Multi-criteria decision analysis tools for prioritising emerging or re-emerging infectious diseases associated with climate change in Canada

    Global climate change is known to result in the emergence or re-emergence of some infectious diseases. Reliable methods are therefore required to identify the infectious diseases of humans and animals that are most likely to be influenced by climate. Since different priorities will affect the decision to address a particular pathogen threat, decision makers need a standardised method of prioritisation. Ranking methods and Multi-Criteria Decision approaches provide such a standardised method and were employed here to design two different pathogen prioritisation tools. The opinion of 64 experts was elicited to assess the importance of 40 criteria that could be used to prioritise emerging infectious diseases of humans and animals in Canada. A weight was calculated for each criterion according to the expert opinion. Attributes were defined for each criterion as a transparent and repeatable method of measurement. Two different Multi-Criteria Decision Analysis tools were tested, both of which used an additive aggregation approach: an Excel spreadsheet tool and a tool developed in the software 'M-MACBETH'. The tools were trialled on nine 'test' pathogens. Two different methods of criteria weighting were compared, one using fixed weighting values, the other using probability distributions to account for uncertainty and variation in expert opinion. The ranking of the nine pathogens varied according to the weighting method used. In both tools, using both weighting methods, the diseases that tended to rank highest were West Nile virus, giardiasis and Chagas disease, while coccidioidomycosis tended to rank lowest. Both tools offer a simple and user-friendly approach to prioritising pathogens in the context of climate change, combining explicit scoring of 40 criteria with weighting methods based on expert opinion. They provide a dynamic, interactive method that can help to identify pathogens for which a full risk assessment should be pursued.
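    The additive aggregation used by both tools amounts to a weighted sum of criterion scores, with the second weighting method propagating uncertainty by drawing weights from distributions. A minimal sketch of that idea is given below; the criterion names, weights, spreads, and pathogen scores are invented for illustration and do not come from the study.

```python
# Minimal sketch of additive aggregation with fixed versus distribution-based weights.
import numpy as np

rng = np.random.default_rng(0)

criteria = ["case_fatality_rate", "climate_tolerance", "economic_impact"]  # illustrative
fixed_weights = np.array([0.40, 0.35, 0.25])     # fixed weighting method
weight_sd = np.array([0.05, 0.05, 0.05])         # spread reflecting expert disagreement

# Attribute scores per pathogen on each criterion, scaled 0-1 (hypothetical).
scores = {
    "pathogen_A": np.array([0.8, 0.6, 0.7]),
    "pathogen_B": np.array([0.3, 0.9, 0.5]),
}

def additive_score(attr, weights):
    """Additive aggregation: weighted sum of criterion scores."""
    return float(weights @ attr)

# Method 1: fixed weights.
fixed_rank = {p: additive_score(s, fixed_weights) for p, s in scores.items()}

# Method 2: weights drawn from distributions to propagate expert uncertainty.
def sampled_score(attr, n=5000):
    w = rng.normal(fixed_weights, weight_sd, size=(n, len(criteria))).clip(min=0)
    w /= w.sum(axis=1, keepdims=True)            # renormalise each draw
    return float((w @ attr).mean())

prob_rank = {p: sampled_score(s) for p, s in scores.items()}
print(fixed_rank, prob_rank)
```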

    The use of expert opinion to assess the risk of emergence or re-emergence of infectious diseases in Canada associated with climate change

    Global climate change is predicted to lead to an increase in infectious disease outbreaks. Reliable surveillance for diseases that are most likely to emerge is required and, given limited resources, policy decision makers need rational methods with which to prioritise pathogen threats. Here, expert opinion was collected to determine which criteria could be used to prioritise diseases according to their likelihood of emergence in response to climate change and according to their impact. We identified a total of 40 criteria that might be used for this purpose in the Canadian context. The opinion of 64 experts from academic, government and independent backgrounds was collected to determine the importance of these criteria. A weight was calculated for each criterion based on the expert opinion. The five criteria considered most influential on disease emergence or impact were: potential economic impact, severity of disease in the general human population, human case fatality rate, the type of climate that the pathogen can tolerate and the current climatic conditions in Canada. There was effective consensus among participants about the influence of some criteria, while for others there was considerable variation. The specific climate criteria most likely to influence disease emergence were: an annual increase in temperature, an increase in summer temperature, an increase in summer precipitation and, to a lesser extent, an increase in winter temperature. These climate variables were considered to be most influential on vector-borne diseases and on food- and water-borne diseases. Opinion about the influence of climate on air-borne diseases and diseases spread by direct or indirect contact was more variable. The impact of emerging diseases on the human population was deemed more important than the impact on animal populations.
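    The abstract does not state exactly how criterion weights were derived from the 64 experts' opinions, so the sketch below simply normalises hypothetical mean importance ratings into weights and records their spread as a measure of expert disagreement; the rating scale and numbers are assumptions for illustration only.

```python
# Illustrative derivation of criterion weights from expert importance ratings.
import numpy as np

# rows = experts, columns = criteria (hypothetical 1-5 importance ratings)
ratings = np.array([
    [5, 3, 2],
    [4, 4, 1],
    [5, 2, 3],
    [4, 3, 2],
])

mean_importance = ratings.mean(axis=0)
weights = mean_importance / mean_importance.sum()   # normalise so weights sum to 1
spread = ratings.std(axis=0, ddof=1)                # variation in expert opinion
print(weights, spread)
```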

    Diagnosis of cattle diseases endemic to sub-Saharan Africa: evaluating a low-cost decision support tool in use by veterinary personnel

    Background: Diagnosis is key to the control and prevention of livestock diseases. In areas of sub-Saharan Africa where private practitioners have rarely replaced government veterinary services reduced in effectiveness by structural adjustment programmes, the personnel who remain lack resources for diagnosis and might benefit from decision support. Methodology/Principal Findings: We evaluated whether a low-cost diagnostic decision support tool would lead to changes in clinical diagnostic practice by fifteen veterinary and animal health officers undertaking primary animal healthcare in Uganda. The eight diseases covered by the tool accounted for 98% of all bovine diagnoses made before or after its introduction. The tool may therefore inform proportional morbidity in the area; breed, age and geographic location effects were consistent with current epidemiological understanding. Trypanosomosis, theileriosis, anaplasmosis and parasitic gastroenteritis were the most common conditions among 713 bovine clinical cases diagnosed prior to introduction of the tool. Thereafter, in 747 bovine clinical cases, the estimated proportional morbidity of fasciolosis doubled, while theileriosis and parasitic gastroenteritis were diagnosed less commonly, and the average number of clinical signs recorded increased from 3.5 to 4.9 per case, with 28% of cases reporting six or more signs compared with 3% beforehand. Anaemia/pallor, weakness and staring coat contributed most to this increase, approximately doubling in number and being recorded in over half of all cases. Finally, although the lack of a gold standard hindered objective assessment of whether the tool improved the reliability of diagnosis, concordance and misclassification matrices yielded useful insights into its role in the diagnostic process. Conclusions/Significance: The diagnostic decision support tool covered the majority of diagnoses made before or after its introduction and led to a significant increase in the number of clinical signs recorded, suggesting this as a key beneficial consequence of its use. It may also inform approximate proportional morbidity and represent a useful epidemiological tool in poorly resourced areas.
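    As an illustration of the misclassification matrices mentioned above, the sketch below cross-tabulates hypothetical paired diagnoses (a clinician's call against the tool's suggestion) and computes overall agreement; the disease labels and data are invented, and the matrices in the study may have been constructed differently.

```python
# Illustrative misclassification matrix and overall agreement for paired diagnoses.
import pandas as pd

paired = pd.DataFrame({
    "clinician_diagnosis": ["trypanosomosis", "theileriosis", "fasciolosis",
                            "theileriosis", "anaplasmosis", "trypanosomosis"],
    "tool_diagnosis":      ["trypanosomosis", "fasciolosis", "fasciolosis",
                            "theileriosis", "anaplasmosis", "anaplasmosis"],
})

matrix = pd.crosstab(paired["clinician_diagnosis"], paired["tool_diagnosis"])
agreement = (paired["clinician_diagnosis"] == paired["tool_diagnosis"]).mean()
print(matrix)
print(f"overall agreement: {agreement:.2f}")
```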

    Imperfect estimation of Lepeophtheirus salmonis abundance and its impact on salmon lice treatment on Atlantic salmon farms

    Accurate monitoring of sea lice levels on salmon farms is critical to the efficient management of louse infestation, as decisions around whether and when to apply treatment depend on an estimation of abundance. However, as with all sampling, the abundance of salmon lice estimated by sampling salmon cannot perfectly represent the abundance on a given farm. While suggestions to improve the accuracy of lice abundance estimates have previously been made, the significance of the accuracy of such estimation has been poorly understood. Understanding the extent of error or bias in sample estimates can show how influential this "imperfect" information is likely to be on management decisions, and can support methods to mitigate the negative outcomes associated with such imperfect estimates. Here, we built a model of a hypothetical Atlantic salmon farm using ordinary differential equations and simulated salmon lice (Lepeophtheirus salmonis) abundance over an entire production cycle, during which salmon were periodically sampled using Monte Carlo approaches that adopted a variety of sample sizes, treatment thresholds, and sampling intervals. The model could thus track two versions of salmon lice abundance: true abundance (based on the underlying model) and monitored abundance (based on the values that could be estimated under different simulated sampling protocols). Treatments, which depend on monitored abundance, could be characterized as early, timely, or late, resulting from over-estimation, appropriate estimation, and under-estimation, respectively. To achieve timely treatment, treatment should be withheld until true abundance reaches the treatment threshold and then applied as soon as that threshold is reached. Adopting larger sample sizes increased the frequency of timely treatments, largely by reducing the incidence of early treatments through lower variance in the monitored abundance. Changes in sampling interval and treatment threshold also influenced the accuracy of abundance estimates and thus the frequency of timely treatments. This study has implications for the manner in which fish should be sampled on salmon farms to ensure accurate salmon lice abundance estimates and consequently the effective application of treatment.
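    A much-simplified sketch of the monitoring problem described above: assuming lice counts per sampled fish are Poisson-distributed around the true farm-level abundance (the study itself used an ODE model and richer sampling schemes), it classifies each simulated monitoring event by whether the sample-based treatment decision matches the decision the true abundance would imply. The threshold and sample size are illustrative.

```python
# Simplified sketch: Poisson sampling error around a true abundance, classified by
# whether the sample-based treatment decision matches the true-abundance decision.
import numpy as np

rng = np.random.default_rng(1)
threshold = 3.0       # treatment threshold (lice per fish), illustrative
sample_size = 20      # fish sampled per monitoring event, illustrative

def classify(true_abundance):
    counts = rng.poisson(true_abundance, size=sample_size)
    monitored = counts.mean()
    if monitored >= threshold and true_abundance < threshold:
        return "early"   # over-estimation triggers treatment before it is needed
    if monitored < threshold and true_abundance >= threshold:
        return "late"    # under-estimation delays a treatment that is due
    return "timely"      # sample-based decision matches the true-abundance decision

# Monitoring events spread across true abundances near the threshold.
outcomes = [classify(a) for a in np.linspace(1.5, 4.5, 2000)]
print({k: outcomes.count(k) for k in ("early", "timely", "late")})
```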

    Rearing and handling injuries in broiler chickens and risk factors for wing injuries during loading

    Some injuries to broilers occur during rearing, but most injuries occur during handling before slaughter. Records provided by a processing plant for loads transported over a 19-month period during 2009 and 2010 were examined. The median percentage of wing injuries per load was 5.7%, whereas injuries to the legs, breast, or shoulders were all less than 1% per load. Risk factors for wing injuries were examined by considering the data from each load by handling event (i.e., loads originating from the same producer on the same date). A multilevel model with three levels, producer (n = 86), handling event (n = 1694), and load (n = 4219), was fitted. The final model included weight, sex, season, catching team, time of day at which loading began, speed of loading, and an interaction between speed of loading and time of day. Factors that reduced the risk of wing injuries were loading lighter birds, loads containing only cockerels, and loading in the fall. The predicted percentage of wing injuries was relatively constant at slower loading speeds, but increased significantly when faster loading speeds were adopted during daytime (0700–1700). Identification of these risk factors can be used to adjust loading practices.
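    A hedged sketch of how a three-level model of this kind might be specified with statsmodels MixedLM, using a producer random intercept and a handling-event variance component; the synthetic data, variable names, Gaussian response, and reduced set of predictors are assumptions for illustration, not the published model.

```python
# Hedged sketch: three-level structure (producer / handling event / load) via a
# producer random intercept plus a handling-event variance component.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_loads = 400
producer = rng.integers(0, 20, n_loads)
event = [f"{p}_{e}" for p, e in zip(producer, rng.integers(0, 5, n_loads))]

df = pd.DataFrame({
    "producer": producer.astype(str),
    "event": event,                                   # handling event within producer
    "wing_injury_pct": rng.gamma(2.0, 2.5, n_loads),  # synthetic outcome per load
    "mean_weight": rng.normal(2.4, 0.2, n_loads),
    "loading_speed": rng.normal(1.0, 0.2, n_loads),
    "daytime": rng.integers(0, 2, n_loads),
})

model = smf.mixedlm(
    "wing_injury_pct ~ mean_weight + loading_speed * daytime",  # interaction as in the study
    df,
    groups="producer",
    vc_formula={"event": "0 + C(event)"},
)
print(model.fit().summary())
```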

    Targets and measures: challenges associated with reporting low sea lice levels on Atlantic salmon farms

    A popular framing of Goodhart's Law states, "When a measure becomes a target, it ceases to be a good measure". The extent to which this may be the case in the reporting of sea louse infestation on salmon farms is explored here. Due to the importance of controlling sea louse infestation on salmon farms, monitoring programmes are active in most salmon-producing regions and, in many, a maximum allowable sea louse level is specified. Using publicly accessible data from Norway and BC, Canada, this study investigated the extent to which the framing of these programmes, in particular the specification of low threshold levels, may be affecting the veracity of the reported sea louse infestation data. In BC, where the threshold level is set at 3 mobile Lepeophtheirus salmonis per fish, there is little evidence of anomalous patterns in the data, and the overall proportion of females within the adult sea lice population is around 0.43. By contrast, in Norway, where lower sea louse limits are in place (at either 0.5 or 0.2 adult female L. salmonis per fish), there is evidence of unexpected and sharp reductions in the reported abundance of adult females around these threshold values. In addition, the average proportion of females is estimated to be only around 0.20 of the total adult L. salmonis population. These unexpected observations were much more evident for farms in the southern areas of Norway and in the most recent years. The findings support the case that the measurement of sea lice on salmon farms can be significantly influenced by targets, particularly those that are highly demanding, and that researchers and fish health professionals should therefore be aware of potential biases within these data. Regulators should also carefully consider the unintended consequences of setting certain sea louse thresholds and how the choice of which sea louse stage(s) are reported affects the ability to review data quality and accuracy.
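    One simple way to screen reported counts for the kind of "pile-up" just below a limit described above is to compare how many records fall in a narrow band just under versus just over the threshold. The sketch below does this on invented numbers; the limit, band width, and data are illustrative, and this is not the analysis used in the study.

```python
# Illustrative screen for a "pile-up" of reported adult-female counts just below a limit.
import numpy as np

limit = 0.5       # adult females per fish (illustrative regulatory limit)
band = 0.05       # width of the comparison band around the limit
reported = np.array([0.10, 0.45, 0.48, 0.49, 0.47, 0.52, 0.55, 0.30, 0.49, 0.46])

just_below = ((reported >= limit - band) & (reported < limit)).sum()
just_above = ((reported >= limit) & (reported < limit + band)).sum()

# Under smooth reporting the two bands should be roughly balanced; a large excess
# just below the limit is the kind of anomaly discussed in the abstract.
print(f"just below limit: {just_below}, just above limit: {just_above}")
```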

    Effectiveness of emamectin benzoate for treatment of Lepeophtheirus salmonis on farmed Atlantic salmon Salmo salar in the Bay of Fundy, Canada

    Emamectin benzoate (an avermectin chemotherapeutant administered to fish as an in-feed treatment) has been used to treat infestations of sea lice Lepeophtheirus salmonis on farmed Atlantic salmon Salmo salar in the Bay of Fundy, New Brunswick, Canada, since 1999. This retrospective study examined the effectiveness of 114 emamectin benzoate treatment episodes from 2004 to 2008 across 54 farms. The study objectives were to establish whether the effectiveness of emamectin benzoate changed over this period, to examine factors associated with treatment outcome, and to determine which variables influenced differences in L. salmonis abundance after treatment. The analysis was carried out in two parts: first, trends in treatment effectiveness and L. salmonis abundance were explored; second, statistical modelling (linear and logistic regression) was used to examine the effects of multiple variables on post-treatment abundance and treatment outcome. Post-treatment sea lice abundance increased in the later years examined. Mean abundance differed between locations in the Bay of Fundy, with higher numbers at farms closer to the mainland and lower levels in the areas around Grand Manan Island. Treatment effectiveness varied by geographical region and decreased over time. There was an increased risk of unsuccessful treatment in 2008, and treatments applied during autumn months were more likely to be ineffective than those applied during summer months.
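    A hedged sketch of the kind of logistic regression named above, modelling the odds of an unsuccessful treatment from year and season on simulated data; the variable names, effect sizes, and data are invented for illustration and do not reproduce the published model.

```python
# Hedged sketch: logistic regression for the odds of an unsuccessful treatment.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 300
df = pd.DataFrame({
    "year": rng.integers(2004, 2009, n),
    "autumn": rng.integers(0, 2, n),
})
# Simulate a higher failure risk in later years and in autumn (illustrative effects).
lin = -3.0 + 0.5 * (df["year"] - 2004) + 0.8 * df["autumn"]
df["unsuccessful"] = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

fit = smf.logit("unsuccessful ~ C(year) + autumn", data=df).fit()
print(np.exp(fit.params))   # odds ratios relative to the 2004, non-autumn baseline
```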

    Syndromic surveillance using veterinary laboratory data: algorithm combination and customization of alerts

    Background: Syndromic surveillance research has focused on two main themes: the search for data sources that can provide early disease detection, and the development of efficient algorithms that can detect potential outbreak signals. Methods: This work combines three algorithms that have demonstrated solid performance in detecting simulated outbreak signals of varying shapes in time series of laboratory submission counts. These are: Shewhart control charts, designed to detect sudden spikes in counts; EWMA control charts, developed to detect slowly increasing outbreaks; and Holt-Winters exponential smoothing, which can explicitly account for temporal effects in the monitored data stream. A scoring system to detect and report alarms using these algorithms in a complementary way is proposed. Results: The use of multiple algorithms in parallel increased system sensitivity. Specificity decreased in simulated data, but the number of false alarms per year when the approach was applied to real data was considered manageable (between 1 and 3 per year for each of the ten syndromic groups monitored). The automated implementation of this approach, including a method for on-line filtering of potential outbreak signals, is described. Conclusion: The developed system provides high sensitivity for detection of potential outbreak signals while also providing robustness and flexibility in establishing which signals constitute an alarm. This flexibility allows an analyst to customize the system for different syndromes.
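    The sketch below runs three detectors of the kinds named above in parallel on a simulated daily count series and combines their alarms into a simple agreement score; the baselines, control limits, and smoothing parameters are illustrative, and a plain exponentially smoothed forecast stands in for the full Holt-Winters component.

```python
# Minimal sketch: Shewhart, EWMA, and a smoothed-forecast detector run in parallel,
# with their alarms combined into a simple agreement score.
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
counts = pd.Series(rng.poisson(10, 120).astype(float))   # simulated daily submissions
counts.iloc[100:] += 8                                    # injected outbreak-like rise

baseline = counts.iloc[:90]                               # history used to set limits
mu, sigma = baseline.mean(), baseline.std(ddof=1)

# Shewhart-style chart: flag single-day spikes beyond mean + 3 SD.
shewhart_alarm = counts > mu + 3 * sigma

# EWMA chart: flag slow, sustained increases in the smoothed series.
ewma = counts.ewm(alpha=0.3, adjust=False).mean()
ewma_alarm = ewma > mu + 2 * sigma

# Stand-in for the Holt-Winters component: compare each day's count with an
# exponentially smoothed forecast built from previous days only.
forecast = counts.shift(1).ewm(alpha=0.2, adjust=False).mean()
forecast_alarm = (counts - forecast) > 3 * sigma

# Score each day by how many detectors agree (0-3); report days scoring >= 2.
score = shewhart_alarm.astype(int) + ewma_alarm.astype(int) + forecast_alarm.astype(int)
print(score[score >= 2].index.tolist())
```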

    Veterinary syndromic surveillance : current initiatives and potential for development

    This paper reviews recent progress in the development of syndromic surveillance systems for veterinary medicine. Peer-reviewed and grey literature were searched in order to identify surveillance systems, in any phase of implementation, that explicitly address outbreak detection based on systematic monitoring of animal population data. The review found that developments in veterinary syndromic surveillance are focused not only on animal health, but also on the use of animals as sentinels for public health, representing a further step towards One Medicine. The main sources of information are clinical data from practitioners and laboratory data, but a number of other sources are being explored. Due to limitations inherent in the way animal health data are collected, the development of veterinary syndromic surveillance initially focused on data collection strategies, analysing historical data for their potential to support systematic monitoring, or solving problems of data classification and integration. Systems based on passive notification or data transfers are now dealing with sustainability issues. Given the ongoing barriers to data availability, diagnostic laboratories appear to provide the most readily available data sources for syndromic surveillance in animal health. As the bottlenecks around data source availability are overcome, the next challenge is consolidating data standards for data classification, promoting the integration of different animal health surveillance systems, and integrating animal health surveillance with public health surveillance. Moreover, the outputs of systems for systematic monitoring of animal health data must be directly connected to real-time decision support systems, which are increasingly being used for disease management and control.

    Data-fed, needs-driven : designing analytical workflows fit for disease surveillance

    Syndromic surveillance has been an important driver for the incorporation of "big data analytics" into animal disease surveillance systems over the past decade. As the range of data sources to which automated data digitalization can be applied continues to grow, we discuss how to move beyond questions about the means of handling volume, variety and velocity, so as to ensure that the information generated is fit for disease surveillance purposes. We make the case that the value of data-driven surveillance depends on a "needs-driven" design approach to data digitalization and information delivery, and highlight some of the current challenges and research frontiers in syndromic surveillance.