92 research outputs found
Suckling systems in calf rearing in organic dairy farming in the Netherlands
In an on-farm experiment, three calf rearing methods were compared: bucket feeding of milk replacer, bucket feeding of tank milk, and suckling of the mother or a nurse cow up to three months of age. The aim was to determine whether the technical results of suckling systems in calf rearing were satisfactory. Calves reared in a suckling system reached significantly higher liveweights at weaning (90 days). Although the average growth rate between weaning and the age of 1 year did not differ significantly, liveweight at 1 year still differed significantly. Compared to both bucket-fed rearing groups, suckling did not have a significant effect on the somatic cell count (SCC) of the mothers. Suckling systems in calf rearing in organic dairy production show satisfactory technical results: calves have the potential to grow fast, and no negative effects of suckling on SCC or general animal health were observed.
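As an illustration of the kind of group comparison reported here, the sketch below compares weaning weights across the three rearing methods with a one-way ANOVA. The weights and the use of scipy are assumptions for illustration only, not the study's data or analysis code.

```python
# Minimal sketch: weaning weight (day 90) compared across the three rearing
# methods with a one-way ANOVA. All weights are made-up illustrative numbers.
from scipy.stats import f_oneway

milk_replacer = [88, 92, 85, 90, 87]        # bucket-fed milk replacer (kg)
tank_milk     = [95, 93, 97, 91, 94]        # bucket-fed tank milk (kg)
suckling      = [110, 105, 112, 108, 107]   # suckled on mother/nurse cow (kg)

stat, p_value = f_oneway(milk_replacer, tank_milk, suckling)
print(f"F = {stat:.2f}, p = {p_value:.4f}")
```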
Inaccuracy of routine susceptibility tests for detection of erythromycin resistance of Campylobacter jejuni and Campylobacter coli
In the Netherlands, both an increase in and regional differences in erythromycin resistance of Campylobacter jejuni and Campylobacter coli have been reported. To determine the accuracy of routine tests for erythromycin resistance, 48 erythromycin-resistant isolates from various laboratories that participate in the Dutch surveillance of Campylobacter infections were reinvestigated. Initial susceptibility testing for erythromycin had been performed by disk diffusion in six laboratories and by MIC-based methods in two. Reinvestigation was carried out using broth microdilution as the reference standard, as well as E-test and genetic resistance testing. Of 36 C. jejuni isolates reported by the initial laboratories as erythromycin-resistant, four (11%) and five (14%) were confirmed as erythromycin-resistant using broth microdilution according to CLSI and EUCAST resistance criteria, respectively. Erythromycin resistance was confirmed in eight of 12 (67%) C. coli isolates according to both criteria. E-test results were in accordance with these results for all isolates. Resistance-associated mutations in the 23S rRNA gene (A2059G and A2058T) were found in all isolates showing high-level resistance, whereas none were found in susceptible isolates. Routine determination of the erythromycin resistance of C. jejuni and C. coli shows unacceptable interlaboratory variation. In the absence of standardized protocols and interpretive criteria for disk diffusion, and while we await the development of easily applicable and reliable methods for molecular resistance testing, broth microdilution remains the best method.
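As an illustration of how MIC-based results are interpreted against resistance criteria, the sketch below classifies broth microdilution MICs under two sets of breakpoints. The numeric breakpoints are placeholder assumptions, not the actual CLSI or EUCAST values, which should be taken from the current guideline documents.

```python
# Minimal sketch of classifying broth microdilution MICs against resistance
# breakpoints, as in the reinvestigation described above. The breakpoint
# values below are placeholders for illustration only, not authoritative
# CLSI/EUCAST criteria.
PLACEHOLDER_BREAKPOINTS = {          # MIC (mg/L) at or above which an isolate
    ("CLSI", "C. jejuni"): 32.0,     # is called erythromycin-resistant
    ("CLSI", "C. coli"): 32.0,       # (assumed values)
    ("EUCAST", "C. jejuni"): 8.0,
    ("EUCAST", "C. coli"): 16.0,
}

def classify(mic_mg_per_l: float, species: str, criterion: str) -> str:
    breakpoint = PLACEHOLDER_BREAKPOINTS[(criterion, species)]
    return "resistant" if mic_mg_per_l >= breakpoint else "susceptible"

# Example: one isolate judged under both sets of criteria.
for criterion in ("CLSI", "EUCAST"):
    print(criterion, classify(16.0, "C. jejuni", criterion))
```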
Quantifying antimicrobial resistance at veal calf farms
This study was performed to determine a sampling strategy to quantify the prevalence of antimicrobial resistance on veal calf farms, based on the variation in antimicrobial resistance within and between calves on five farms. Faecal samples from 50 healthy calves (10 calves/farm) were collected. From each individual sample and from one pooled faecal sample per farm, 90 selected Escherichia coli isolates were tested for resistance against 25 mg/L amoxicillin, 25 mg/L tetracycline, 0.5 mg/L cefotaxime, 0.125 mg/L ciprofloxacin and 8/152 mg/L trimethoprim/sulfamethoxazole (tmp/s) by replica plating. From each faecal sample, another 10 selected E. coli isolates were tested for resistance by broth microdilution as a reference. Logistic regression analysis was performed to compare the odds of an isolate testing resistant between the two test methods (replica plating vs. broth microdilution) and to evaluate the effect of pooling faecal samples. Bootstrap analysis was used to investigate the precision of the estimated prevalence of resistance to each antimicrobial obtained by several simulated sampling strategies. Replica plating showed odds of an isolate testing resistant similar to broth microdilution, except for ciprofloxacin (OR 0.29, p=0.05). Pooled samples in general showed lower odds of an isolate being resistant than individual samples, although these differences were not significant. Bootstrap analysis showed that, within each antimicrobial, the various compositions of a pooled sample provided consistent estimates of the mean proportion of resistant isolates. Sampling strategies should be based on the variation in resistance among isolates within faecal samples and between faecal samples, which may vary by antimicrobial. In our study, the optimal sampling strategy, in terms of both the precision of the estimated levels of resistance and practicality, consists of a pooled faecal sample from 20 individual animals, of which 90 isolates are tested for susceptibility by replica plating.
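A minimal sketch of the bootstrap idea described above: resample the tested isolates from a pooled faecal sample and summarise how precisely the proportion of resistant isolates is estimated. The counts and the 95% percentile interval are illustrative assumptions, not the study's data or exact procedure.

```python
# Resample 0/1 resistance flags with replacement and report a percentile
# interval for the estimated proportion of resistant isolates.
import random

def bootstrap_prevalence(isolates, n_boot=1000, seed=42):
    """isolates: list of 0/1 flags (1 = resistant) for the tested isolates."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_boot):
        resample = [rng.choice(isolates) for _ in range(len(isolates))]
        estimates.append(sum(resample) / len(resample))
    estimates.sort()
    lower = estimates[int(0.025 * n_boot)]
    upper = estimates[int(0.975 * n_boot)]
    return sum(isolates) / len(isolates), (lower, upper)

# Example: 90 isolates from a pooled sample, 30 of them tetracycline-resistant.
isolates = [1] * 30 + [0] * 60
point, (low, high) = bootstrap_prevalence(isolates)
print(f"prevalence {point:.2f}, 95% bootstrap interval {low:.2f}-{high:.2f}")
```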
Concerted Efforts to Control or Eliminate Neglected Tropical Diseases
Background: The London Declaration (2012) was formulated to support and focus the control and elimination of ten neglected tropical diseases (NTDs), with targets for 2020 as formulated in the WHO Roadmap. Five NTDs (lymphatic filariasis, onchocerciasis, schistosomiasis, soil-transmitted helminths and trachoma) are to be controlled by preventive chemotherapy (PCT), and four (Chagas’ disease, human African trypanosomiasis, leprosy and visceral leishmaniasis) by innovative and intensified disease management (IDM). Guinea worm, virtually eradicated, is not considered here. We aim to estimate the global health impact of meeting these targets in terms of averted morbidity, mortality, and disability-adjusted life years (DALYs). Methods: The Global Burden of Disease (GBD) 2010 study provides prevalence and burden estimates for all nine NTDs in 1990 and 2010, by country, age and sex, which were taken as the basis for our calculations. Estimates for other years were obtained by interpolating between 1990 (or the start-year of large-scale control efforts) and 2010, and further extrapolating until 2030, such that the 2020 targets were met. The NTD disease manifestations considered in the GBD study were analyzed as either reversible or irreversible. Health impacts were assessed by comparing the results of achieving the targets with the counterfactual, construed as the health burden had the 1990 (or 2010, if higher) situation continued unabated. Principal Findings/Conclusions: Our calculations show that meeting the targets will lead to about 600 million averted DALYs in the period 2011–2030, nearly equally distributed between PCT- and IDM-NTDs, with the health gain amongst PCT-NTDs mostly (96%) due to averted disability and amongst IDM-NTDs largely (95%) due to averted mortality. These health gains include about 150 million averted irreversible disease manifestations (e.g. blindness) and 5 million averted deaths. Control of soil-transmitted helminths accounts for one third of all averted DALYs. We conclude that the projected health impact of the London Declaration justifies the required efforts.
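A minimal sketch of the projection logic described here, assuming linear interpolation between the GBD anchor years, a linear decline to the 2020 target, the burden held at the target thereafter, and a counterfactual in which the higher of the 1990/2010 burdens continues unabated. All numbers are illustrative placeholders, not GBD estimates.

```python
# Interpolate/extrapolate yearly burden and sum the gap to the counterfactual
# over 2011-2030. Illustrative placeholders only, not GBD data.

def linear(y0, v0, y1, v1, year):
    """Linear interpolation/extrapolation of a burden value for a given year."""
    return v0 + (v1 - v0) * (year - y0) / (y1 - y0)

def averted_dalys(dalys_1990, dalys_2010, target_2020, start=2011, end=2030):
    counterfactual = max(dalys_1990, dalys_2010)  # burden continues unabated
    total = 0.0
    for year in range(start, end + 1):
        if year <= 2020:
            projected = linear(2010, dalys_2010, 2020, target_2020, year)
        else:
            projected = target_2020  # simplifying assumption: hold at the target
        total += counterfactual - projected
    return total

# Example: a hypothetical NTD with 10 million DALYs in 1990, 8 million in 2010,
# and a 2020 target of 1 million DALYs per year.
print(f"{averted_dalys(10e6, 8e6, 1e6):.3g} DALYs averted over 2011-2030")
```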
The significance of flood duration for flood damage assessment
Introduction: Flood risks can be reduced by reducing either the probability or the consequences of a flood. These consequences can be quantified with flood damage models, which determine flood damage based on water depth and land use. This thesis investigates the need to also use flood duration as an input parameter.

Problem definition: Besides water depth, other factors also determine the resulting flood damage, and these are often not taken into account in flood damage models. One of them is flood duration: the longer a flood lasts, the larger the material damage and, especially, the damage due to interruption of economic activity. Taking flood duration into account can therefore, in theory, make flood damage models more accurate. Flood duration predictions are, however, rarely made at present. This thesis aims to develop both a qualitative and a quantitative understanding of flood duration and of its importance for damage assessments.

Research: This thesis explores the possibilities of assessing flood duration for flood risk management, in the following steps. 1. Development of a better understanding of flood duration: by looking at different areas and flood threats, a flood type categorization was developed and durations were estimated for each flood type. 2. Exploration of the influence of flood duration on damage: a modeling method to roughly estimate duration-dependent damage was developed (a sketch of such a calculation is given below); the framework of this method may also be useful for future duration-dependent flood damage models. 3. Two case studies were carried out to study flood duration and its influence on damage in more detail: first the Betuwe and Tieler & Culemburgerwaard area, and secondly the area threatened by a breach at the Parksluizen in Rotterdam. Different scenarios were used, with varying breach locations, measures, and use of outlet and drainage structures.

Results: 1. The most important factors determining flood duration are the time needed to repair the breaches, the possibilities for drainage by gravity, the elevation and elevation variation in the area, and the magnitude of the flood event. Flood durations in the Netherlands vary between hours and about one year. 2. Adding flood duration as an input to flood damage models adds only a little extra accuracy, because flood duration is correlated with water depth. At the current accuracy of flood damage models, incorporating flood duration is only useful for specific cost-benefit analyses of measures that aim to change the flood duration.

Conclusions and recommendations: Flood duration can be significant for large floods in low-lying, dyked areas, and in these cases it can also have a significant impact on the damage. However, a complex economic model is necessary to quantify this. Flood duration can therefore only reach its full value as an input in combination with better economic modeling.

Subjects: Water Resources; Water Management; Civil Engineering and Geoscience
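A minimal sketch of a duration-dependent damage calculation of the kind explored in the thesis: a conventional depth-damage fraction for material damage plus an interruption term that grows with flood duration. The curve shape and all parameter values are illustrative assumptions, not the thesis's calibrated model.

```python
# Material damage from an assumed depth-damage curve, plus a business
# interruption term that scales with flood duration. Illustrative only.

def depth_damage_fraction(depth_m: float) -> float:
    """Fraction of the maximum material damage reached at a given water depth."""
    return min(1.0, depth_m / 3.0)  # assume full material damage from ~3 m depth

def flood_damage(depth_m: float, duration_days: float,
                 max_material_damage: float, daily_interruption_loss: float) -> float:
    material = depth_damage_fraction(depth_m) * max_material_damage
    interruption = duration_days * daily_interruption_loss
    return material + interruption

# Example: 1.5 m of water for 30 days over an area with EUR 10M maximum
# material damage and EUR 50k/day of interrupted economic activity.
print(f"EUR {flood_damage(1.5, 30, 10e6, 50e3):,.0f}")
```

Separating the material and interruption terms makes it straightforward to rerun the same scenario with and without measures that shorten the flood, which is the kind of cost-benefit comparison the thesis identifies as the main use of duration-dependent damage estimates.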
Practical implications of increasing 'natural living' through suckling systems in organic dairy calf rearing; Theme: Values in Organic Agriculture
The introduction of suckling systems in organic dairy calf rearing has the potential to enhance animal welfare in terms of 'natural living' and to live up to consumers' expectations about organic agriculture. This study describes the implications of suckling systems in a practical organic dairy context. Results show that farmers can successfully develop and implement a suckling system in calf rearing. The consumption of mothers' milk resulted in high weaning weights at 3 months of age. No immediate animal health problems linked to suckling systems occurred. Compared with traditional bucket feeding of milk, suckling systems resulted in increased natural behaviour such as calf-cow bonding, natural sucking behaviour and care-taking behaviour. Some farmers had difficulties accepting negative implications of suckling systems such as stress after weaning and loss of marketable milk. Although suckling of the own mother was seen as the most natural suckling system, farmers adapted their suckling system to calves suckling nurse cows. In order to implement a suckling system successfully, farmers have to step back from control and give calf and cow a chance. To increase 'natural living' through the implementation of a suckling system, farmers should be encouraged to take enough time to accomplish this attitude change.