
    Internet of animals: characterisation of LoRa sub-GHz off-body wireless channel in dairy barns

    Advances in wireless sensor technologies and MEMS have made it possible to automatically monitor the health status of dairy cows using the Internet of Things (IoT) and wireless body area networks. Since on-cow measuring devices are energy-constrained, a proper characterisation of the off-body wireless channel between the on-cow sensor nodes and the back-end base station is required for an optimised deployment of these networks in barns. In this Letter, the long range (LoRa) off-body wireless channel has been characterised at 868 MHz, a typical IoT frequency. Both path loss and temporal fading were investigated using LoRa motes. Based on this characterisation, network planning and energy consumption optimisation of the on-body nodes could be performed, which enables the deployment of reliable dairy cow monitoring systems.
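Channel characterisations of this kind are typically summarised with a log-distance path loss model, which can then drive network planning and link budgeting. The sketch below illustrates the idea; the reference path loss, path loss exponent, transmit power and receiver sensitivity are illustrative assumptions, not the values fitted in this study.

```python
import math

def path_loss_db(d, d0=1.0, pl0=40.0, n=2.0):
    """Log-distance path loss model commonly used in sub-GHz channel
    characterisation. pl0 (dB at reference distance d0, in metres) and
    the path loss exponent n are fitted from measurements; the defaults
    here are placeholders, not this study's fitted parameters."""
    return pl0 + 10.0 * n * math.log10(d / d0)

def link_budget_ok(tx_power_dbm, sensitivity_dbm, d, **model):
    """Check whether a link closes at distance d under the model."""
    return tx_power_dbm - path_loss_db(d, **model) >= sensitivity_dbm

# Example: 14 dBm transmit power against an assumed -137 dBm LoRa
# receiver sensitivity, 100 m separation
print(link_budget_ok(14.0, -137.0, 100.0))  # → True
```

In practice the fitted exponent and shadowing margin from the barn measurements would replace the placeholder defaults before planning node placement.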

    Inter-rater reliability of categorical versus continuous scoring of fish vitality: does it affect the utility of the reflex action mortality predictor (RAMP) approach?

    Scoring reflex responsiveness and injury of aquatic organisms has gained popularity as a predictor of discard survival. Given that this method relies upon individual interpretation of scoring criteria, an evaluation of its robustness was carried out here, testing whether multiple protocol-instructed raters with diverse backgrounds (research scientist, technician, and student) produce similar or identical reflex and injury scores for the same individual flatfish (European plaice, Pleuronectes platessa) after exposure to commercial fishing stressors. Inter-rater reliability for three raters was assessed using a 3-point categorical scale (‘absent’, ‘weak’, ‘strong’) and a tagged visual analogue continuous scale (tVAS: a 10 cm bar split into three labelled sections, ‘weak’, ‘moderate’ and ‘strong’, with 0 representing ‘absent’) for six reflex responses, and a 4-point scale for four injury types. Plaice (n = 304) were sampled from 17 research beam-trawl deployments during four trips. Fleiss’ kappa (categorical scores) and intra-class correlation coefficients (ICC, continuous scores) indicated variable inter-rater agreement by reflex type (ranging between 0.55 and 0.88, and 67% and 91%, for Fleiss’ kappa and ICC respectively), with least agreement among raters on the extent of injury (Fleiss’ kappa between 0.08 and 0.27). Differences among raters did not significantly influence the relationship between impairment and predicted survival, and combining categorical reflex and injury scores always produced a close relationship between such vitality indices and observed delayed mortality. Use of the continuous scale did not improve the fit of these models compared with the reflex impairment index based on categorical scores. Given these findings, we recommend a 3-point categorical scale over a continuous one. We also determined that training, rather than experience, of raters minimised inter-rater differences. Our results suggest that cost-efficient reflex impairment and injury scoring may be considered a robust technique to evaluate lethal stress and damage of this flatfish species on board commercial beam-trawl vessels.
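Fleiss' kappa, the agreement statistic used for the categorical scores above, can be computed directly from subject-by-category rating counts. A minimal sketch (the rating counts are made up for illustration, not the study's data):

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for agreement among a fixed number of raters.
    counts[i][j] = number of raters assigning category j to subject i;
    every row must sum to the same number of raters."""
    n_raters = sum(counts[0])
    n_subjects = len(counts)
    total = n_raters * n_subjects
    # Mean per-subject agreement: proportion of agreeing rater pairs
    p_bar = sum(
        sum(c * (c - 1) for c in row) / (n_raters * (n_raters - 1))
        for row in counts
    ) / n_subjects
    # Chance agreement from marginal category proportions
    p_e = sum(
        (sum(row[j] for row in counts) / total) ** 2
        for j in range(len(counts[0]))
    )
    return (p_bar - p_e) / (1 - p_e)

# Three raters scoring four fish on a 3-point scale
# ('absent', 'weak', 'strong'); counts are illustrative
ratings = [[3, 0, 0], [0, 3, 0], [1, 2, 0], [0, 0, 3]]
print(round(fleiss_kappa(ratings), 3))  # → 0.745
```

Values near the study's 0.55–0.88 range indicate moderate to strong agreement; values near 0.08–0.27, as seen for injury extent, indicate slight agreement only.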

    Culling-Induced Changes in Badger (Meles meles) Behaviour, Social Organisation and the Epidemiology of Bovine Tuberculosis

    In the UK, attempts since the 1970s to control the incidence of bovine tuberculosis (bTB) in cattle by culling a wildlife host, the European badger (Meles meles), have produced equivocal results. Culling-induced social perturbation of badger populations may lead to unexpected outcomes. We test predictions from the ‘perturbation hypothesis’, determining the impact of culling operations on badger populations, the movement of surviving individuals and the influence on the epidemiology of bTB in badgers, using data derived from two study areas within the UK Government's Randomised Badger Culling Trial (RBCT). Culling operations did not remove all individuals from setts, with 34–43% of badgers removed from targeted social groups. After culling, bTB prevalence increased in badger social groups neighbouring removals, particularly amongst cubs. Seventy individual adult badgers were fitted with radio-collars, yielding 8,311 locational fixes from both sites between November 2001 and December 2003. Home range areas of animals surviving within Removed groups increased by 43.5% in response to culling. Overlap between summer ranges of individuals from Neighbouring social groups in the treatment population increased by 73.3% in response to culling. The movement rate of individuals between social groups was low, but increased after culling in both Removed and Neighbouring social groups. Increased bTB prevalence in Neighbouring groups was associated with badger movements both into and out of these groups, although none of the moving individuals themselves tested positive for bTB. Significant increases in both the frequency of individual badger movements between groups and the emergence of bTB were observed in response to culling; however, no direct evidence was found to link the two phenomena. We hypothesise that the social disruption caused by culling may not only increase direct contact, and thus disease transmission, between surviving badgers, but may also increase social stress within the surviving population, causing immunosuppression and enhancing the expression of disease.

    Sensitivity of the integrated Welfare Quality® scores to changing values of individual dairy cattle welfare measures

    The Welfare Quality® (WQ) protocol for on-farm dairy cattle welfare assessment describes 33 measures and a step-wise method to integrate the outcomes into 12 criteria scores, grouped into four principle scores and into an overall welfare categorisation with four possible levels. The relative contribution of various welfare measures to the integrated scores has been contested. Using a European dataset (491 herds), we investigated: i) variation in sensitivity of integrated outcomes to extremely low and high values of measures, criteria and principles, by replacing each actual value with the minimum and maximum observed and theoretically possible values; and ii) the reasons for this variation in sensitivity. As intended by the WQ consortium, the sensitivity of integrated scores depends on: i) the observed value of the specific measures/criteria; ii) whether the change was positive/negative; and iii) the relative weight attributed to the measures. Additionally, two unintended factors of considerable influence appear to be side-effects of the complexity of the integration method, namely: i) the number of measures integrated into criteria and principle scores; and ii) the aggregation method of the measures. As a result, resource-based measures related to drinkers (which have been criticised with respect to their validity for assessing absence of prolonged thirst) have a much larger influence on integrated scores than health-related measures such as 'mortality rate' and 'lameness score'. Hence, the integration method of the WQ protocol for dairy cattle should be revised to ensure that the relative contribution of the various welfare measures to the integrated scores more accurately reflects their relevance for dairy cattle welfare.
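The min/max replacement analysis described above is a one-at-a-time sensitivity sweep: each measure is set in turn to its extreme observed value and the change in the integrated score is recorded. A minimal sketch follows; the toy `integrate` function (a weighted mean), the measure names and the herd values are illustrative stand-ins, since the actual WQ integration method is considerably more complex.

```python
def sensitivity_sweep(herds, integrate, measure_keys):
    """One-at-a-time sensitivity analysis: replace each measure in turn
    with its observed minimum and maximum across herds, and record the
    resulting range of change in the integrated score."""
    results = {}
    for key in measure_keys:
        observed = [h[key] for h in herds]
        lo, hi = min(observed), max(observed)
        deltas = []
        for h in herds:
            base = integrate(h)
            for extreme in (lo, hi):
                perturbed = dict(h, **{key: extreme})
                deltas.append(integrate(perturbed) - base)
        results[key] = (min(deltas), max(deltas))
    return results

# Toy integration: weighted mean of two measures (weights illustrative)
def integrate(h):
    return 0.7 * h["drinkers"] + 0.3 * h["lameness"]

herds = [{"drinkers": 20, "lameness": 80},
         {"drinkers": 60, "lameness": 40}]
print(sensitivity_sweep(herds, integrate, ["drinkers", "lameness"]))
```

With these toy weights the "drinkers" measure can shift the integrated score more than "lameness" can, mirroring the kind of imbalance the study reports.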

    Inference of the infection status of individuals using longitudinal testing data from cryptic populations: Towards a probabilistic approach to diagnosis

    Effective control of many diseases requires the accurate detection of infected individuals. Confidently ascertaining whether an individual is infected can be challenging when diagnostic tests are imperfect and when some individuals go for long periods of time without being observed or sampled. Here, we use a multi-event capture-recapture approach to model imperfect observations of true epidemiological states. We describe a method for interpreting potentially disparate results from individuals sampled multiple times over an extended period, using empirical data from a wild badger population naturally infected with Mycobacterium bovis as an example. We examine the effect of sex, capture history and current and historical diagnostic test results on the probability of being truly infected, given any combination of diagnostic test results. In doing so, we move diagnosis away from the traditional binary classification of apparently infected versus uninfected to a probability-based interpretation which is updated each time an individual is re-sampled. Our findings identified temporal variation in infection status and suggest that capture probability is influenced by year, season and infection status. This novel approach to combining ecological and epidemiological data may aid disease management decision-making by providing a framework for the integration of multiple diagnostic test data with other information.
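The move from a binary classification to a probability updated at each re-sampling can be illustrated with a simple sequential Bayesian update under assumed test sensitivity and specificity. This is a deliberately simplified sketch: the study's multi-event capture-recapture model also handles capture probability and state transitions, which are omitted here, and the test properties and test history below are hypothetical.

```python
def update_infection_prob(prior, test_positive, sensitivity, specificity):
    """Bayesian update of the probability that an individual is truly
    infected after one imperfect diagnostic test result."""
    if test_positive:
        p_data_inf = sensitivity          # true positive
        p_data_uninf = 1 - specificity    # false positive
    else:
        p_data_inf = 1 - sensitivity      # false negative
        p_data_uninf = specificity        # true negative
    numer = p_data_inf * prior
    return numer / (numer + p_data_uninf * (1 - prior))

# Each re-capture updates the running probability; prior prevalence,
# test properties and the test sequence are assumed values
p = 0.1
for result in [True, False, True]:
    p = update_infection_prob(p, result, sensitivity=0.5, specificity=0.95)
print(round(p, 3))  # → 0.854
```

Note how a single negative result lowers but does not erase the probability accumulated from earlier positives, which is exactly the behaviour a binary classification cannot express.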

    Identifying physiological measures of lifetime welfare status in pigs: exploring the usefulness of haptoglobin, C-reactive protein and hair cortisol sampled at the time of slaughter

    Background: Physiological measures indicative of the welfare status of animals during rearing could form part of an abattoir-based animal health and welfare assessment tool. A total of 66 pigs were used in this study, the aim of which was to assess how serum concentrations of haptoglobin (Hp) and C-reactive protein (CRP) (assessed in 51 pigs), and hair concentrations of cortisol (assessed in 65 pigs), measured at or close to slaughter, reflected welfare-related indicators recorded from the animal during its lifetime. These indicators were recorded at intervals between 7 and 21 weeks of age and included assigning scores for levels of tail and skin lesions, recording the presence or absence of certain health issues, and conducting qualitative behavioural assessments (QBA). Results: Pigs recorded as having tail lesions during their lifetime had higher hair cortisol levels than those with no tail lesions (tail lesions: 47.87 ± 3.34 pg/mg, no tail lesions: 42.20 ± 3.29 pg/mg, P = 0.023), and pigs recorded as having moderate or severe tail lesions had higher Hp levels than those with no or mild tail lesions (moderate/severe: 1.711 ± 0.74 mg/ml, none/mild: 0.731 ± 0.10 mg/ml, P = 0.010). Pigs recorded as being lame during their lifetime tended to have higher hair cortisol levels than non-lame pigs (lame: 52.72 ± 3.83 pg/mg, not lame: 43.07 ± 2.69 pg/mg, P = 0.062). QBA scores were not associated with any of the physiological measures (P > 0.05). Receiver Operating Characteristic (ROC) analysis was also carried out to better understand the usefulness of the physiological measures in discriminating animals that had had welfare-related issues recorded during their lifetime from those that had not. Hair cortisol was determined as having ‘moderate’ accuracy in discriminating pigs that were tail bitten on-farm from unbitten pigs (AUC: 0.748), while Hp and CRP were determined to have no meaningful discriminatory ability (AUC < 0.600). Conclusion: This research should be repeated on a larger scale, but the results suggest that hair cortisol measured at slaughter could provide insight into the welfare status of pigs during their lifetime. Hp may be a useful indicator of tail lesions in pigs. However, further research utilising a greater proportion of severely bitten pigs is required before conclusions can be drawn.
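The AUC reported by a ROC analysis has a direct probabilistic reading: it is the probability that a randomly chosen affected animal scores higher than a randomly chosen unaffected one (the Mann-Whitney U relationship). A minimal sketch, with made-up cortisol values rather than the study's data:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC via pairwise comparison: the proportion of (affected,
    unaffected) pairs in which the affected animal scores higher,
    with ties counting one half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical hair cortisol values (pg/mg), tail-bitten vs unbitten
bitten = [48.0, 55.0, 51.0, 45.0]
unbitten = [42.0, 47.0, 40.0, 44.0]
print(round(roc_auc(bitten, unbitten), 3))  # → 0.938
```

An AUC of 0.5 means no discriminatory ability (as found for Hp and CRP, AUC < 0.600), while values around 0.75, like the study's hair cortisol result, indicate moderate discrimination.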

    Development of pig welfare assessment protocol integrating animal-, environment-, and management-based measures

    Background: Due to increased interest in animal welfare, there is now a need for a comprehensive assessment protocol to be used in intensive pig farming systems. There are two current welfare assessment protocols for pigs: the Welfare Quality® Assessment Protocols (applicable in the European Union), which mostly focus on animal-based measures, and the Swine Welfare Assurance Program (applicable in the United States), which mostly focuses on management- and environment-based measures. In certain cases, however, animal-based measures might not be adequate for properly assessing pig welfare status. Similarly, welfare assessment that relies only on environment- and management-based measures might not represent the actual welfare status of pigs. Therefore, the objective of this paper was to develop a new welfare protocol by integrating animal-, environment-, and management-based measures. The background for the selection of certain welfare criteria and the modification of the scoring systems from existing welfare assessment protocols are described. Methods: The developed pig welfare assessment protocol consists of 17 criteria related to four main principles of welfare (good feeding, good housing, good health, and appropriate behavior). Good feeding, good housing, and good health were assessed using a 3-point scale: 0 (good welfare), 1 (moderate welfare), and 2 (poor welfare). In certain cases, only a 2-point scale was used: 0 (a certain condition is present) or 2 (a certain condition is absent). Appropriate behavior was assessed by scan sampling of positive and negative social behaviors, based on qualitative behavior assessment and human-animal relationship tests. Results: Modification of the body condition score into a 3-point scale revealed pigs with a moderate body condition (score 1). Moreover, additional criteria such as feed quality confirmed that some farms had moderate (score 1) or poor feed quality (score 2), especially farms located in a high relative humidity region. Conclusions: The developed protocol can be utilized to assess welfare status in an intensive pig farming system. Although further improvements are still needed, this study is a first step in developing a pig welfare assessment protocol that combines animal-, environment-, and management-based measures.
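The principle/criterion structure with 0-1-2 scoring lends itself to a compact representation. The sketch below is an illustrative assumption only: the criterion names, their grouping under principles, and the use of a simple mean to roll criteria up into principle scores are placeholders, not the protocol's actual 17 criteria or its integration rules.

```python
# Hypothetical subset of criteria grouped under three of the four
# principles (not the protocol's full list)
PRINCIPLES = {
    "good feeding": ["body condition", "feed quality"],
    "good housing": ["space allowance"],
    "good health": ["lameness"],
}

def principle_scores(criterion_scores):
    """Roll 0/1/2 criterion scores (0 = good, 1 = moderate, 2 = poor
    welfare) up into one score per principle, here via a simple mean
    (an assumed aggregation, for illustration only)."""
    out = {}
    for principle, criteria in PRINCIPLES.items():
        vals = [criterion_scores[c] for c in criteria]
        out[principle] = sum(vals) / len(vals)
    return out

# One farm's hypothetical criterion scores
farm = {"body condition": 1, "feed quality": 2,
        "space allowance": 0, "lameness": 1}
print(principle_scores(farm))
```

A real implementation would encode all 17 criteria, the 2-point special cases, and the behaviour observations, but the data shape stays the same: criterion scores in, principle scores out.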

    Irish pig farmers' perceptions and experiences of tail and ear biting

    Abnormal behaviours such as ear and tail biting of pigs are of significant welfare and economic concern. Currently, pig welfare legislation is under renewed focus by the EU Commission and is likely to be enforced more thoroughly. The legislation prohibits routine tail docking and requires adequate enrichment to be provided. In Ireland, tail docking is still the most utilised control mechanism to combat tail biting, but biting is still widespread even in tail-docked pigs. In addition, as pig farms are almost all fully slatted, bedding-type material cannot be provided. Thus, the opinions and practices of farmers in countries like Ireland, which may need to make significant adaptations to typical pig management systems soon, need to be considered and addressed. We carried out a survey of pig farmers during 2015 in order to gain a greater understanding of the extent of biting on Irish farms, perceptions of the most important preventive measures, current enrichment use and actions following outbreaks. Fifty-eight farmers from 21 counties responded, with an average herd size of 710 ± 597 sows (range 90–3,000 sows). Only two farms had experienced no biting in the last year. Of the farms that had experienced tail biting (88%), 86% had also experienced ear biting. The most common concerns relating to biting were condemnation and reduced productivity of bitten pigs, with both receiving an average score of 4 (most serious). Ear biting occurred most commonly in the second weaner stage (approximately 47–81 days from weaning), and tail biting in the finishing stage. The most important preventive measures were felt to be taking care of animal health, restricting density, maintaining an even quality of feed/content and maintaining good air movement. Sixty-five percent of respondents added additional enrichment following an outbreak. Chains were the most common form of enrichment currently used (83%); those not using chains favoured wood, toys and rope (17%). Identification of the most effective and accessible control and prevention measures, both for the animals and for the farming community, is thus essential. Improved understanding of the concerns and practices of producers, to which this survey contributes, is a first step towards this aim.

    Effect of synthetic hormones on reproduction in Mastomys natalensis

    Rodent pest management traditionally relies on some form of lethal control. Developing effective fertility control for pest rodent species could be a major breakthrough, particularly in the context of managing rodent population outbreaks. This laboratory-based study is the first to report on the effects of fertility compounds on an outbreaking rodent pest species found throughout sub-Saharan Africa. Mastomys natalensis were fed bait containing the synthetic steroid hormones quinestrol and levonorgestrel, both singly and in combination, at three concentrations (10, 50, 100 ppm) for seven days. Consumption of the bait and animal body mass were mostly similar across treatments when analysed by sex, day and treatment. However, a repeated-measures ANOVA indicated that the quinestrol and quinestrol+levonorgestrel treatments reduced consumption by up to 45%, particularly at the higher concentrations of 50 and 100 ppm. Although there was no clear concentration effect on animal body mass, quinestrol and quinestrol+levonorgestrel lowered body mass by up to 20% compared to the untreated and levonorgestrel treatments. Quinestrol and quinestrol+levonorgestrel reduced the weight of male testes, epididymides and seminal vesicles by 60–80%, and sperm concentration and motility were reduced by more than 95%. No weight changes were observed in uterine and ovarian tissue; however, pronounced uterine oedema was observed among all females consuming treated bait at 8 days and 40 days from trial start. Mate-pairing trials showed significant differences in pregnancy rate between all treatments and the untreated control group.