249 research outputs found

    Confidence in assessing the effectiveness of bath treatments for the control of sea lice on Norwegian salmon farms

    The salmon louse Lepeophtheirus salmonis is the most important ectoparasite of farmed salmonids in the Northern hemisphere, having a major economic and ecological impact on the sustainability of this sector of the aquaculture industry. To a large extent, control of L. salmonis relies on the use of topical delousing chemical treatments in the form of baths. Improvements in methods for the administration and assessment of bath treatments have not kept pace with the rapid modernization and intensification of the salmon industry. Bath treatments present technical and biological challenges, including best practice methods for the estimation of the effect of lice treatment interventions. In this communication, we compare and contrast methods to calculate and interpret treatment effectiveness at pen and site level. The methods are illustrated for the calculation of the percentage reduction in mean abundance of mobile lice with a measure of confidence. Six different methods for the calculation of confidence intervals across different probability levels were compared. We found the quasi-Poisson method with a 90% confidence interval to be informative and robust for the measurement of bath treatment performance.
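As a rough illustration of the kind of calculation involved, the sketch below estimates the percentage reduction in mean mobile-louse abundance with a 90% confidence interval, using a delta-method interval on the log ratio of means under quasi-Poisson variance. The counts are invented and the function is a simplified stand-in for the paper's method, not a reproduction of it:

```python
import math

def mean_var(counts):
    """Sample mean and (n-1)-denominator variance of a list of counts."""
    n = len(counts)
    m = sum(counts) / n
    v = sum((c - m) ** 2 for c in counts) / (n - 1)
    return n, m, v

def quasi_poisson_reduction_ci(pre, post, z=1.645):
    """Percent reduction in mean abundance with an approximate 90% CI.

    The Poisson variance is inflated by an estimated dispersion phi
    (var = phi * mean), and the interval is built on the log ratio of
    post- to pre-treatment means; z = 1.645 gives a 90% interval.
    """
    n1, m1, v1 = mean_var(pre)
    n2, m2, v2 = mean_var(post)
    phi1, phi2 = v1 / m1, v2 / m2                 # dispersion estimates
    # delta-method SE of log(mean) under quasi-Poisson: sqrt(phi / (n * mean))
    se_log_ratio = math.sqrt(phi1 / (n1 * m1) + phi2 / (n2 * m2))
    log_ratio = math.log(m2 / m1)                 # log of post/pre ratio
    lo = 1 - math.exp(log_ratio + z * se_log_ratio)
    hi = 1 - math.exp(log_ratio - z * se_log_ratio)
    return 1 - m2 / m1, lo, hi

# Hypothetical mobile-louse counts on 20 sampled fish, before and after a bath
pre = [4, 6, 3, 5, 8, 2, 7, 4, 5, 6, 3, 4, 9, 5, 4, 6, 2, 5, 7, 4]
post = [1, 0, 2, 1, 0, 1, 2, 0, 1, 1, 0, 2, 1, 0, 1, 1, 0, 2, 1, 1]
red, lo, hi = quasi_poisson_reduction_ci(pre, post)
print(f"reduction {red:.1%}, 90% CI ({lo:.1%}, {hi:.1%})")
```

A wider interval (e.g. a larger `z`) trades informativeness for coverage, which is the balance the paper weighs across its six interval methods.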

    Multi-criteria decision analysis tools for prioritising emerging or re-emerging infectious diseases associated with climate change in Canada

    Global climate change is known to result in the emergence or re-emergence of some infectious diseases. Reliable methods to identify the infectious diseases of humans and animals that are most likely to be influenced by climate are therefore required. Since different priorities will affect the decision to address a particular pathogen threat, decision makers need a standardised method of prioritisation. Ranking methods and Multi-Criteria Decision approaches provide such a standardised method and were employed here to design two different pathogen prioritisation tools. The opinion of 64 experts was elicited to assess the importance of 40 criteria that could be used to prioritise emerging infectious diseases of humans and animals in Canada. A weight was calculated for each criterion according to the expert opinion. Attributes were defined for each criterion as a transparent and repeatable method of measurement. Two different Multi-Criteria Decision Analysis tools were tested, both of which used an additive aggregation approach. These were an Excel spreadsheet tool and a tool developed in the software 'M-MACBETH'. The tools were trialled on nine 'test' pathogens. Two different methods of criteria weighting were compared, one using fixed weighting values, the other using probability distributions to account for uncertainty and variation in expert opinion. The ranking of the nine pathogens varied according to the weighting method that was used. In both tools, using both weighting methods, the diseases that tended to rank highest were West Nile virus, giardiasis and Chagas disease, while coccidioidomycosis tended to rank lowest. Both tools offer a simple and user-friendly approach to prioritising pathogens in the context of climate change by including explicit scoring of 40 criteria and incorporating weighting methods based on expert opinion. They provide a dynamic, interactive method that can help to identify pathogens for which a full risk assessment should be pursued.
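The additive aggregation the two tools share can be sketched in a few lines. The criteria names, weights, and scores below are invented for illustration (the real tools use 40 expert-weighted criteria), though the pathogen names echo those ranked in the study:

```python
# Hypothetical criteria weights (as if from expert elicitation) and
# pathogen scores on a 0-1 attribute scale; all values illustrative.
weights = {"economic_impact": 0.30, "case_fatality": 0.25,
           "climate_sensitivity": 0.25, "vector_range": 0.20}

scores = {
    "West Nile virus":    {"economic_impact": 0.8, "case_fatality": 0.4,
                           "climate_sensitivity": 0.9, "vector_range": 0.9},
    "Giardiasis":         {"economic_impact": 0.6, "case_fatality": 0.1,
                           "climate_sensitivity": 0.7, "vector_range": 0.5},
    "Coccidioidomycosis": {"economic_impact": 0.3, "case_fatality": 0.2,
                           "climate_sensitivity": 0.4, "vector_range": 0.2},
}

def additive_score(pathogen_scores, weights):
    """Weighted-sum aggregation, the additive approach both tools use."""
    return sum(weights[c] * pathogen_scores[c] for c in weights)

ranking = sorted(scores, key=lambda p: additive_score(scores[p], weights),
                 reverse=True)
print(ranking)
```

The probabilistic weighting variant described in the abstract would replace each fixed weight with a draw from a distribution over expert opinions and repeat the ranking many times.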

    The use of expert opinion to assess the risk of emergence or re-emergence of infectious diseases in Canada associated with climate change

    Global climate change is predicted to lead to an increase in infectious disease outbreaks. Reliable surveillance for diseases that are most likely to emerge is required, and given limited resources, policy decision makers need rational methods with which to prioritise pathogen threats. Here expert opinion was collected to determine what criteria could be used to prioritise diseases according to the likelihood of emergence in response to climate change and according to their impact. We identified a total of 40 criteria that might be used for this purpose in the Canadian context. The opinion of 64 experts from academic, government and independent backgrounds was collected to determine the importance of the criteria. A weight was calculated for each criterion based on the expert opinion. The five criteria considered most influential on disease emergence or impact were: potential economic impact, severity of disease in the general human population, human case fatality rate, the type of climate that the pathogen can tolerate and the current climatic conditions in Canada. There was effective consensus about the influence of some criteria among participants, while for others there was considerable variation. The specific climate criteria most likely to influence disease emergence were: an annual increase in temperature, an increase in summer temperature, an increase in summer precipitation and, to a lesser extent, an increase in winter temperature. These climate variables were considered to be most influential on vector-borne diseases and on food- and water-borne diseases. Opinion about the influence of climate on air-borne diseases and on diseases spread by direct/indirect contact was more variable. The impact of emerging diseases on the human population was deemed more important than the impact on animal populations.

    Diagnosis of cattle diseases endemic to sub-Saharan Africa: evaluating a low-cost decision support tool in use by veterinary personnel

    Background: Diagnosis is key to control and prevention of livestock diseases. In areas of sub-Saharan Africa where private practitioners rarely replace Government veterinary services reduced in effectiveness by structural adjustment programmes, those who remain lack resources for diagnosis and might benefit from decision support. Methodology/Principal Findings: We evaluated whether a low-cost diagnostic decision support tool would lead to changes in clinical diagnostic practice by fifteen veterinary and animal health officers undertaking primary animal healthcare in Uganda. The eight diseases covered by the tool accounted for 98% of all bovine diagnoses made before or after its introduction. It may therefore inform proportional morbidity in the area; breed, age and geographic location effects were consistent with current epidemiological understanding. Trypanosomosis, theileriosis, anaplasmosis, and parasitic gastroenteritis were the most common conditions among 713 bovine clinical cases diagnosed prior to introduction of the tool. Thereafter, in 747 bovine clinical cases, estimated proportional morbidity of fasciolosis doubled, while theileriosis and parasitic gastroenteritis were diagnosed less commonly and the average number of clinical signs increased from 3.5 to 4.9 per case, with 28% of cases reporting six or more signs compared to 3% beforehand. Anaemia/pallor, weakness and staring coat contributed most to this increase, approximately doubling in number, and were recorded in over half of all cases. Finally, although lack of a gold standard hindered objective assessment of whether the tool improved the reliability of diagnosis, informative concordance and misclassification matrices yielded useful insights into its role in the diagnostic process.
Conclusions/Significance: The diagnostic decision support tool covered the majority of diagnoses made before or after its introduction, leading to a significant increase in the number of clinical signs recorded, suggesting this as a key beneficial consequence of its use. It may also inform approximate proportional morbidity and represent a useful epidemiological tool in poorly resourced areas
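A concordance/misclassification matrix of the kind mentioned can be tabulated directly from paired diagnoses. The cases and pairings below are hypothetical, intended only to show the mechanics:

```python
from collections import Counter

# Hypothetical paired diagnoses: (clinician's diagnosis, tool suggestion)
pairs = [("trypanosomosis", "trypanosomosis"),
         ("theileriosis", "theileriosis"),
         ("theileriosis", "anaplasmosis"),
         ("fasciolosis", "fasciolosis"),
         ("anaplasmosis", "trypanosomosis"),
         ("fasciolosis", "fasciolosis")]

# Cell counts of the concordance/misclassification matrix
matrix = Counter(pairs)
# Raw agreement: proportion of cases on the matrix diagonal
agreement = sum(n for (a, b), n in matrix.items() if a == b) / len(pairs)
print(f"raw agreement: {agreement:.2f}")
```

Without a gold standard, such a matrix describes agreement between the two diagnostic routes rather than accuracy of either one, which is the caveat the abstract raises.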

    The role of laboratory data in 'knowledgeable surveillance'

    Over the past decade the availability of digital data relating to animal health has grown exponentially, and with it an interest in making effective and timely use of these data. In particular the use of syndrome-based indicators to augment traditional laboratory results for the purpose of disease surveillance has been the focus of a number of studies. The volume and semi-structured nature of such data, together with the fact that they must often be processed in real time, have led to methodological challenges in the appropriate interpretation of these novel data sources. In this talk I will discuss a range of techniques, including text mining, time series analyses and clustering algorithms, that can be used to identify syndromic signals in laboratory test request data, together with statistical techniques that can be used to detect the various types of temporal aberrations that can occur. These approaches have been implemented in systems linked to animal health laboratory systems in Canada and Sweden, and their use will be illustrated by way of case-based examples. However, the isolated use of laboratory data is rarely adequate in the context of syndromic surveillance, and a variety of animal health data sources are being explored for early disease detection. In terms of 'next steps' towards successfully using such data, I believe that the integration of evidence from multiple sources is of critical importance. A key challenge in moving forward is the need to ensure that aggregation and comparison across data sources is being made among similar objects. In this context we are exploring the use of knowledge-based ontologies, which provide machine-readable methods for the representation of and inference from data.
We will discuss one such pilot ontology – AHSO (Animal Health Surveillance Ontology) – and illustrate the ways in which the availability of frameworks such as this can be complemented by recent advances in computer science, including deep learning and the Semantic Web. Research results from these areas will allow for the integration of information derived from diagnostic data with that extracted from other sources of animal health information, including clinical records, mortality and even regular production data, to create a framework for truly "knowledgeable" surveillance
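One standard technique for detecting the temporal aberrations mentioned above is an EWMA (exponentially weighted moving average) control chart. The sketch below uses invented daily counts of laboratory test requests and illustrative baseline parameters; it is a minimal example of the technique, not code from the Canadian or Swedish systems:

```python
import math

def ewma_alarms(counts, lam=0.3, k=3.0, baseline=10.0, sd=3.0):
    """Flag aberrations in a daily count series with an EWMA control chart.

    `baseline` and `sd` stand in for a mean and standard deviation that
    would normally be estimated from historical data; `lam` is the
    smoothing weight and `k` the control-limit multiplier. All values
    here are illustrative.
    """
    # asymptotic upper control limit of the EWMA statistic
    limit = baseline + k * sd * math.sqrt(lam / (2.0 - lam))
    z = baseline
    alarms = []
    for y in counts:
        z = lam * y + (1.0 - lam) * z   # exponentially weighted average
        alarms.append(z > limit)
    return alarms

# A spike in submissions from day 6 onwards should trip the alarm
daily_submissions = [9, 11, 10, 8, 12, 10, 25, 30, 28, 11, 9]
print(ewma_alarms(daily_submissions))
```

Because the EWMA carries memory, the alarm persists for a few days after the spike subsides; systems tuned for early detection trade this persistence against sensitivity via `lam`.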

    Imperfect estimation of Lepeophtheirus salmonis abundance and its impact on salmon lice treatment on Atlantic salmon farms

    Accurate monitoring of sea lice levels on salmon farms is critical to the efficient management of louse infestation, as decisions around whether and when to apply treatment depend on an estimation of abundance. However, as with all sampling, the estimated abundance of salmon lice through sampling salmon cannot perfectly represent the abundance on a given farm. While suggestions to improve the accuracy of lice abundance estimates have previously been made, the significance of the accuracy of such estimation has been poorly understood. Understanding the extent of error or bias in sample estimates can facilitate an assessment as to how influential this “imperfect” information will likely be on management decisions, and support methods to mitigate negative outcomes associated with such imperfect estimates. Here, we built a model of a hypothetical Atlantic salmon farm using ordinary differential equations and simulated salmon lice (Lepeophtheirus salmonis) abundance over an entire production cycle, during which salmon were periodically sampled using Monte Carlo approaches that adopted a variety of sample sizes, treatment thresholds, and sampling intervals. The model could thus track two instances of salmon lice abundance: true abundance (based on the underlying model) and monitored abundance (based on the values that could be estimated under different simulated sampling protocols). Treatments, which depend on monitored abundance, could be characterized as early, timely, or late, as a result of over-estimation, appropriate estimation, and under-estimation, respectively. To achieve timely treatment, it is important to delay treatments until true abundance equals some treatment threshold and to execute treatment as soon as this threshold is reached. Adopting larger sample sizes increased the frequency of timely treatments, largely by reducing the incidence of early treatments due to less variance in the monitored abundance. 
Changes in sampling interval and treatment threshold also influenced the accuracy of abundance estimates and thus the frequency of timely treatments. This study has implications for the manner in which fish should be sampled on salmon farms to ensure accurate salmon lice abundance estimates and, consequently, the effective application of treatment.
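The early/timely/late logic can be illustrated with a toy version of such a simulation. Everything below is a simplifying assumption for illustration (geometric growth in place of the paper's ODE model, Poisson counts in place of the overdispersed distributions real farms show, and an arbitrary 10% tolerance band around the threshold):

```python
import math
import random
from collections import Counter

random.seed(42)

def poisson(lam):
    """Knuth's inversion sampler, adequate for modest Poisson means."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

def simulate_treatment_timing(sample_size, threshold=0.5, days=120,
                              growth=1.05, start=0.05, interval=7):
    """Classify one simulated treatment decision as early, timely, or late.

    True abundance grows geometrically each day; every `interval` days,
    `sample_size` fish are sampled and treatment is triggered when the
    monitored mean reaches the threshold. The trigger is then compared
    with the true abundance at that moment.
    """
    true = start
    for day in range(days):
        true *= growth
        if day % interval == 0:
            monitored = sum(poisson(true)
                            for _ in range(sample_size)) / sample_size
            if monitored >= threshold:
                if true < 0.9 * threshold:
                    return "early"    # over-estimation triggered treatment
                if true <= 1.1 * threshold:
                    return "timely"
                return "late"         # under-estimation delayed it
    return "no treatment"

outcomes = Counter(simulate_treatment_timing(20) for _ in range(300))
print(outcomes)
```

Re-running with a larger `sample_size` shrinks the variance of the monitored abundance, which in the paper's results chiefly reduced early treatments.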

    Clinical observations associated with proven and unproven cases in the ESCRS study of prophylaxis of postoperative endophthalmitis after cataract surgery

    The aims were to describe cases of postoperative endophthalmitis in the European Society of Cataract and Refractive Surgeons (ESCRS) study of the prophylaxis of endophthalmitis, to compare the characteristics of unproven cases with those of cases proven by culture or polymerase chain reaction, and to compare these characteristics with those in other reported series. Twenty-four ophthalmology units in Austria, Belgium, Germany, Italy, Poland, Portugal, Spain, Turkey, and the United Kingdom participated. Univariable and multivariable logistic regression models were used to analyze the data for statistical association of signs and symptoms in cases with proven or unproven endophthalmitis. Specific data describing the characteristics of the cases were compared between the 2 types of cases. Data from 29 endophthalmitis cases were analyzed. Swollen lids and pain were statistically associated with proven cases of endophthalmitis on univariable regression analysis. Multivariable analysis indicated that swollen lids and an opaque vitreous were associated with proven cases. Five cases of endophthalmitis occurred in the cefuroxime-treated groups. No case of streptococcal infection occurred in the cefuroxime-treated groups. However, cases of infection due to streptococci showed striking differences in visual acuity and were associated with earlier onset. Characteristics in the 29 cases parallel results in previous studies, such as the Endophthalmitis Vitrectomy Study, although the addition of a control group in the ESCRS study elicited additional findings. Swollen lids, pain, and an opaque vitreous were statistically associated with proven endophthalmitis cases in the ESCRS study.

    Web‐based application to guide sampling on salmon farms for more precise estimation of sea louse abundance

    Objective: Efficient management of sea lice on salmon farms relies on active surveillance, and accurate estimation of lice abundance in turn requires an effective sampling scheme. To address this, we developed an application that considers infestation levels, farm structure, and management protocols, enhancing the precision of sampling strategies for sea louse abundance estimation. Methods: Simulation‐based methods are valuable for estimating suitable sample sizes in complex studies where standard formulae are inadequate. We introduce FishSampling, an open Web‐based application tailored to determine precise sample sizes for specific scenarios and objectives. Results: The model incorporates factors such as sea lice abundance, the number of pens on a farm, potential clustering effects among these pens, and the desired confidence level. Simulation outcomes within the application provide practical advice on how many fish and pens to sample under varying levels of assumed clustering. Conclusion: This approach can be used across the salmon aquaculture sector to improve sampling strategies for sea lice abundance estimation and to balance surveillance costs against health objectives.
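A minimal sketch of the simulation-based reasoning behind such a tool, assuming lognormal pen-level effects and Poisson counts within pens (illustrative choices, not FishSampling's actual model), compares how allocating a fixed sampling effort across different numbers of pens affects the precision of the abundance estimate:

```python
import math
import random

random.seed(7)

def poisson(lam):
    """Knuth's inversion sampler, adequate for small Poisson means."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

def mean_ci_halfwidth(n_pens, fish_per_pen, mean=0.5, cluster_sd=0.3,
                      reps=200):
    """Average 95% CI half-width for estimated abundance under clustering.

    Each pen's mean is the farm mean times a lognormal deviation
    (normal on the log scale with sd `cluster_sd`); defaults are
    illustrative, not values from the application.
    """
    halfwidths = []
    for _ in range(reps):
        counts = []
        for _ in range(n_pens):
            pen_mean = mean * math.exp(random.gauss(0, cluster_sd))
            counts.extend(poisson(pen_mean) for _ in range(fish_per_pen))
        n = len(counts)
        m = sum(counts) / n
        v = sum((c - m) ** 2 for c in counts) / (n - 1)
        halfwidths.append(1.96 * math.sqrt(v / n))
    return sum(halfwidths) / reps

# Same total of 60 fish allocated over different numbers of pens
for pens, fish in [(2, 30), (4, 15), (6, 10)]:
    print(f"{pens} pens x {fish} fish: CI half-width "
          f"{mean_ci_halfwidth(pens, fish):.3f}")
```

With stronger assumed clustering (larger `cluster_sd`), spreading the same number of fish across more pens matters more, which is the trade-off the application lets users explore.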

    Smart phone-based herd health management tool


    Rearing and handling injuries in broiler chickens and risk factors for wing injuries during loading

    Some injuries to broilers occur during rearing, but most injuries occur during handling before slaughter. Records provided by a processing plant for loads transported over a 19 mo period during 2009 and 2010 were examined. The median percentage of wing injuries per load was 5.7%, whereas injuries to the legs, breast, or shoulders were all less than 1% per load. Risk factors for wing injuries were examined by considering the data from each load by handling event (i.e., loads originating from the same producer on the same date). A multilevel model with three levels, producer (n = 86), handling event (n = 1694), and load (n = 4219), was fitted. The final model included weight, sex, season, catching team, time of day at which loading began, speed of loading, and an interaction between speed of loading and time of day. Factors that reduced the risk of wing injuries were loading lighter birds, loads containing only cockerels, and loading in the fall. The predicted percentage of wing injuries was relatively constant for slower loading speeds, but it was increased significantly when faster loading speeds were adopted during daytime (0700–1700). Identification of these risk factors can be used to adjust loading practices