
    Protective-restoring to maintain self integrity: a grounded theory of the human experience of dog relinquishment

    Dog relinquishment, or 'getting rid of the dog', is common practice in Australia and other countries where dogs are kept as pets. Each year thousands of dogs are relinquished for a variety of reasons. While losing a pet through death can be as devastating for some people as the death of a loved human, little is known about the human impact of losing a dog through relinquishment. This qualitative study sought to explore the experience of dog relinquishment from the perspectives of a Western Australian sample of 21 relinquishers, 10 adults who had experienced dog relinquishment in childhood and 15 animal welfare workers. Data, collected via semistructured interviews, were transcribed verbatim and analysed in accord with Straussian grounded theory methodology, an inductive, interpretative methodology utilising the constant comparative method. The substantive grounded theory of 'protective-restoring to maintain self integrity in the face of a self disturbing experience', generated from an interpretative analysis of the data, describes the human experience of dog relinquishment as one of psychological, social and moral conflicts that challenged participants' self and social image. The theory proposes that those who experience dog relinquishment personally or professionally experience a disturbed self integrity (i.e., a sense of cognitive and emotional unease). Five conditions, identified as threats to self integrity, were found to contribute to participants' sense of unease, namely the culture of relinquishment, a crisis of conscience, a fear of losing face, losing faith and losing the dog. Variation in participants' experience was accounted for by individual and social conditions that influenced the type, intensity, frequency and duration of their unease. Participants' experience of dog relinquishment was characterised by one or more of three types of unease, namely cognitive dissonance, psychological stress and grief, which were dealt with through a process identified as protective-restoring. The continuous four-phase process of protective-restoring involved recognition, identification, assessment and counteraction of threats to self integrity. Its aim was to protect participants from further threats and to restore their self integrity. Six types of strategies were identified that participants employed during the counteracting phase of the protective-restoring process, namely self enhancing, blaming, impact reducing, emotional management, avoiding and blocking. The strategies employed were not always successful and in some circumstances increased rather than reduced the unease of participants. Further, the strategies sometimes contributed to the unease of others. These findings indicate that the human experience of dog relinquishment is multidimensional and complex. Further, given its potential to detrimentally impact the mental health and wellbeing of large numbers of adults and children, dog relinquishment is an experience that should not be trivialised or ignored. As well as contributing to the human-animal interaction body of knowledge, the substantive theory that has emerged from this research could be used to inform the development of a screening tool to identify those who are likely to be negatively impacted by dog relinquishment. Further, the theory could also be used to inform the development of interventions to assist adults and children in dealing with the negative impact of dog relinquishment.

    Clitellate worms (Annelida) in late-glacial and Holocene sedimentary DNA records from the Polar Urals and northern Norway

    While there are extensive macro- and microfossil records of a range of plants and animals from the Quaternary, earthworms and their close relatives amongst annelids are not preserved as fossils and therefore knowledge of their past distributions is limited. This lack of fossils means that clitellate worms (Annelida) are currently underused in palaeoecological research, even though they can provide valuable information about terrestrial and aquatic environmental conditions. Their DNA might be preserved in sediments, which offers an alternative method for detection. Here we analyse lacustrine sediments from lakes in the Polar Urals, Arctic Russia, covering the period 24 000–1300 cal. a BP, and NE Norway, covering 10 700–3300 cal. a BP, using a universal mammal 16S rDNA marker. While mammals were recorded using the marker (reindeer was detected twice in the Polar Urals core, at 23 000 and 14 000 cal. a BP, and four times in the Norwegian core, at 11 000 cal. a BP and between 3600 and 3300 cal. a BP), worm extracellular DNA 'bycatch' was rather high. In this paper we present the first reported worm detection from ancient DNA. Our results demonstrate that both aquatic and terrestrial clitellates can be identified in late-Quaternary lacustrine sediments, and the ecological information retrievable from this group warrants further research with a more targeted approach.

    Holocene floristic diversity and richness in northeast Norway revealed by sedimentary ancient DNA (sedaDNA) and pollen

    Source at https://doi.org/10.1111/bor.12357. We present a Holocene record of floristic diversity and environmental change for the central Varanger Peninsula, Finnmark, based on ancient DNA extracted from the sediments of a small lake (sedaDNA). The record covers the period c. 10 700 to 3300 cal. a BP and is complemented by pollen data. Measures of species richness, sample evenness and beta diversity were calculated based on sedaDNA sampling intervals and 1000-year time windows. We identified 101 vascular plant and 17 bryophyte taxa, a high proportion (86%) of which are still growing within the region today. The high species richness (>60 taxa) observed in the Early Holocene, including representatives from all important plant functional groups, shows that modern shrub-tundra communities, and much of their species complement, were in place as early as c. 10 700 cal. a BP. We infer that postglacial colonization of the area occurred prior to the full Holocene, during the Pleistocene-Holocene transition, Younger Dryas stadial or earlier. Abundant DNA of the extra-limital aquatic plant Callitriche hermaphroditica suggests it expanded its range northward between c. 10 200 and 9600 cal. a BP, when summers were warmer than present. High values of Pinus DNA occur throughout the record, but we cannot say with certainty if they represent prior local presence; however, pollen influx values >500 grains cm−2 a−1 between c. 8000 and 7300 cal. a BP strongly suggest the presence of pine woodland during this period. As the site lies beyond the modern tree limit of pine, it is likely that this expansion also reflects a response to warmer Early Holocene summers.
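    The abstract above does not reproduce the diversity calculations, but the measures it names (species richness, sample evenness, beta diversity) have standard definitions. The following Python sketch is only an illustration of those definitions under assumed inputs: the taxon names, read counts and the choice of Pielou's evenness and Sørensen dissimilarity are assumptions for the example, not details taken from the study.

```python
import math

# Hypothetical sedaDNA read counts per taxon in two 1000-year time windows
# (taxon names and counts invented for illustration, not data from the study).
window_a = {"Betula": 120, "Salix": 80, "Dryas": 40, "Callitriche": 10}
window_b = {"Betula": 150, "Pinus": 60, "Salix": 30}

def richness(sample):
    """Taxon richness: number of taxa detected in the sample."""
    return len(sample)

def pielou_evenness(sample):
    """Pielou's J: Shannon entropy of read proportions divided by ln(richness)."""
    total = sum(sample.values())
    shannon = -sum((n / total) * math.log(n / total) for n in sample.values() if n > 0)
    return shannon / math.log(len(sample)) if len(sample) > 1 else 0.0

def sorensen_dissimilarity(a, b):
    """Presence/absence beta diversity: 1 minus the Sørensen similarity index."""
    shared = len(set(a) & set(b))
    return 1 - (2 * shared) / (len(a) + len(b))

print("richness A:", richness(window_a), "evenness A:", round(pielou_evenness(window_a), 3))
print("beta (A vs B):", round(sorensen_dissimilarity(window_a, window_b), 3))
```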

    The polaroid image as photo-object

    This article is part of a larger project on the cultural history of Polaroid photography and draws on research done at the Polaroid Corporate archive at Harvard and at the Polaroid company itself. It identifies two cultural practices engendered by Polaroid photography, which, at the point of its extinction, has briefly flared into visibility again. It argues that these practices are mistaken as novel but are in fact rediscoveries of practices that stretch back as many as five decades. The first section identifies Polaroid image-making as a photographic equivalent of what Tom Gunning calls the 'cinema of attractions'. That is, the emphasis in its use is on the display of photographic technologies rather than the resultant image. Equally, the common practice, in both fine art and vernacular circles, of making composite pictures with Polaroid prints draws attention away from image content and redirects it to the photo as object.

    Effect of MRI on preterm infants and their families: a randomised trial with nested diagnostic and economic evaluation.

    BACKGROUND: We tested the hypothesis that routine MRI would improve the care and well-being of preterm infants and their families. DESIGN: Parallel-group randomised trial (1:1 allocation; intention-to-treat) with nested diagnostic and cost evaluations (EudraCT 2009-011602-42). SETTING: Participants from 14 London hospitals, imaged at a single centre. PATIENTS: 511 infants born before 33 weeks' gestation underwent both MRI and ultrasound around term. 255 were randomly allocated (siblings together) to receive only MRI results and 255 only ultrasound results, from a paediatrician unaware of the unallocated results; one withdrew before allocation. MAIN OUTCOME MEASURES: Maternal anxiety, measured by the State-Trait Anxiety Inventory (STAI), assessed in 206/214 mothers receiving MRI and 217/220 receiving ultrasound. Secondary outcomes included prediction of neurodevelopment, health-related costs and quality of life. RESULTS: After MRI, STAI fell from 36.81 (95% CI 35.18 to 38.44) to 32.77 (95% CI 31.54 to 34.01), 31.87 (95% CI 30.63 to 33.12) and 31.82 (95% CI 30.65 to 33.00) at 14 days, 12 and 20 months, respectively. STAI fell less after ultrasound: from 37.59 (95% CI 36.00 to 39.18) to 33.97 (95% CI 32.78 to 35.17), 33.43 (95% CI 32.22 to 34.63) and 33.63 (95% CI 32.49 to 34.77); p=0.02. There were no differences in health-related quality of life. MRI predicted moderate or severe functional motor impairment at 20 months slightly better than ultrasound (area under the receiver operating characteristic curve (CI) 0.74; 0.66 to 0.83 vs 0.64; 0.56 to 0.72; p=0.01) but cost £315 (CI £295-£336) more per infant. CONCLUSIONS: MRI increased costs and provided only modest benefits. TRIAL REGISTRATION: ClinicalTrials.gov NCT01049594 (https://clinicaltrials.gov/ct2/show/NCT01049594); EudraCT 2009-011602-42 (https://www.clinicaltrialsregister.eu/).
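    The comparison of MRI and ultrasound as predictors of motor impairment rests on the area under the receiver operating characteristic curve. The sketch below shows one generic way such an AUC comparison could be set up in Python; the outcome labels, imaging scores and the bootstrap confidence interval are invented for illustration and are not the trial's actual analysis.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Invented data: outcome = moderate/severe motor impairment at 20 months (0/1);
# the two scores stand in for MRI- and ultrasound-based risk predictions.
outcome = rng.integers(0, 2, size=200)
mri_score = outcome * 0.8 + rng.normal(0, 0.6, size=200)
us_score = outcome * 0.5 + rng.normal(0, 0.6, size=200)

def bootstrap_auc_diff(y, s1, s2, n_boot=2000):
    """Percentile bootstrap CI for the AUC difference of two predictors on the same subjects."""
    diffs = []
    n = len(y)
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)
        if y[idx].min() == y[idx].max():   # resample must contain both classes
            continue
        diffs.append(roc_auc_score(y[idx], s1[idx]) - roc_auc_score(y[idx], s2[idx]))
    return np.percentile(diffs, [2.5, 97.5])

print("AUC (MRI-like score):", round(roc_auc_score(outcome, mri_score), 3))
print("AUC (ultrasound-like score):", round(roc_auc_score(outcome, us_score), 3))
print("95% CI for AUC difference:", bootstrap_auc_diff(outcome, mri_score, us_score))
```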

    Multihospital Outbreak of Clostridium difficile Ribotype 027 Infection: Epidemiology and Analysis of Control Measures

    Objective. To report a large outbreak of Clostridium difficile infection (CDI; ribotype 027) between June 2007 and August 2008, describe infection control measures, and evaluate the impact of restricting the use of fluoroquinolones in controlling the outbreak. Design. Outbreak investigation in 3 acute care hospitals of the Northern Health and Social Care Trust in Northern Ireland. Interventions. Implementation of a series of CDI control measures that targeted high-risk antibiotic agents (ie, restriction of fluoroquinolones), infection control practices, and environmental hygiene. Results. A total of 318 cases of CDI were identified during the outbreak, which was the result of the interaction between C. difficile ribotype 027 being introduced into the affected hospitals for the first time and other predisposing risk factors (ranging from host factors to suboptimal compliance with antibiotic guidelines and infection control policies). The 30-day all-cause mortality rate was 24.5%; however, CDI was the attributable cause of death for only 2.5% of the infected patients. Time series analysis showed that restricting the use of fluoroquinolones was associated with a significant reduction in the incidence of CDI (coefficient, −0.054; lag time, 4 months; P = .003). Conclusion. These findings provide additional evidence to support the value of antimicrobial stewardship as an essential element of multifaceted interventions to control CDI outbreaks. The present CDI outbreak ended following the implementation of an action plan that improved communication, antibiotic stewardship, infection control practices, environmental hygiene, and surveillance.
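    The abstract reports a time series association between fluoroquinolone restriction and CDI incidence with a 4-month lag, but not the model itself. As a rough illustration only, the sketch below fits an ordinary least squares regression of monthly CDI counts on fluoroquinolone use lagged by four months; the monthly figures are invented and the model is a simplification of whatever transfer-function or segmented analysis the study actually used.

```python
import pandas as pd
import statsmodels.api as sm

# Invented monthly series: fluoroquinolone use (e.g. DDD per 1000 bed-days)
# and CDI case counts; a real analysis would use the Trust's surveillance data.
months = pd.date_range("2007-06-01", periods=15, freq="MS")
fq_use = pd.Series([120, 118, 115, 90, 70, 55, 50, 48, 45, 44, 43, 42, 41, 40, 40],
                   index=months)
cdi_cases = pd.Series([30, 32, 35, 33, 31, 30, 28, 24, 18, 15, 12, 10, 9, 8, 8],
                      index=months)

lag = 4  # months, matching the lag reported in the abstract
df = pd.DataFrame({"cdi": cdi_cases, "fq_lagged": fq_use.shift(lag)}).dropna()

# OLS of CDI counts on lagged fluoroquinolone use; a positive coefficient here
# means that lower use four months earlier is followed by fewer cases.
model = sm.OLS(df["cdi"], sm.add_constant(df["fq_lagged"])).fit()
print(model.params)
print(model.pvalues)
```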

    'To live and die [for] Dixie': Irish civilians and the Confederate States of America

    Around 20,000 Irishmen served in the Confederate army in the Civil War. As a result, they left behind, in various Southern towns and cities, large numbers of friends, family, and community leaders. As with native-born Confederates, Irish civilian support was crucial to Irish participation in the Confederate military effort. Irish civilians also served in various supporting roles: in factories and hospitals, on railroads and diplomatic missions, and as boosters for the cause. They also, however, suffered in bombardments, sieges, and the blockade. Usually poorer than their native neighbours, they could not afford to become 'refugees' and move away from the centres of conflict. This essay, based on research from manuscript collections, contemporary newspapers, British Consular records, and Federal military records, will examine the role of Irish civilians in the Confederacy and assess the impact this activity had on their integration into Southern communities. It will also look at Irish civilians in the defeat of the Confederacy, particularly when they came under Union occupation. Initial research shows that Irish civilians were not as upset as other whites in the South about Union victory. They welcomed a return to normalcy and often 'collaborated' with Union authorities. Irish desertion rates in the Confederate army were also particularly high, and I will attempt to gauge whether Irish civilians played a role in this. All of the research in this paper will thus be put in the context of the Drew Gilpin Faust/Gary Gallagher debate on the influence of the Confederate homefront on military performance. By studying the Irish civilian experience one can assess how strong the Confederate national experiment was. Was it a nation without a nationalism?

    A point-of-care clinical trial comparing insulin administered using a sliding scale versus a weight-based regimen

    Background Clinical trials are widely considered the gold standard in comparative effectiveness research (CER), but the high cost and complexity of traditional trials and concerns about generalizability to broad patient populations and general clinical practice limit their appeal. Unsuccessful implementation of CER results limits the value of even the highest quality trials. Planning for a trial comparing two standard strategies of insulin administration for hospitalized patients led us to develop a new method for a clinical trial designed to be embedded directly into the clinical care setting, thereby lowering the cost, increasing the pragmatic nature of the overall trial, strengthening implementation, and creating an integrated environment of research-based care.

    Assessing the Quality of Decision Support Technologies Using the International Patient Decision Aid Standards instrument (IPDASi)

    Objectives To describe the development, validation and inter-rater reliability of an instrument to measure the quality of patient decision support technologies (decision aids). Design Scale development study, involving construct, item and scale development, validation and reliability testing. Setting There has been increasing use of decision support technologies – adjuncts to the discussions clinicians have with patients about difficult decisions. A global interest in developing these interventions exists among both for-profit and not-for-profit organisations. It is therefore essential to have internationally accepted standards to assess the quality of their development, process, content, potential bias and method of field testing and evaluation. Methods Scale development study, involving construct, item and scale development, validation and reliability testing. Participants Twenty-five researcher-members of the International Patient Decision Aid Standards Collaboration worked together to develop the instrument (IPDASi). In the fourth stage (reliability study), eight raters assessed thirty randomly selected decision support technologies. Results IPDASi measures quality in 10 dimensions, using 47 items, and provides an overall quality score (scaled from 0 to 100) for each intervention. Overall IPDASi scores ranged from 33 to 82 across the decision support technologies sampled (n = 30), enabling discrimination. The inter-rater intraclass correlation for the overall quality score was 0.80. Correlations of dimension scores with the overall score were all positive (0.31 to 0.68). Cronbach's alpha values for the 8 raters ranged from 0.72 to 0.93. Cronbach's alphas based on the dimension means ranged from 0.50 to 0.81, indicating that the dimensions, although well correlated, measure different aspects of decision support technology quality. A short version (19 items) was also developed that had very similar mean scores to IPDASi and a high correlation between the short-form score and the overall score of 0.87 (CI 0.79 to 0.92). Conclusions This work demonstrates that IPDASi has the ability to assess the quality of decision support technologies. The existing IPDASi provides an assessment of the quality of a DST's components and will be used as a tool to provide formative advice to DST developers and summative assessments for those who want to compare their tools against an existing benchmark.
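    The abstract describes IPDASi as 47 items in 10 dimensions with an overall score scaled from 0 to 100, and reports Cronbach's alpha across raters. The sketch below shows, under assumed inputs, how dimension means, a 0-100 rescaling and Cronbach's alpha might be computed; the item ratings, dimension names, 1-4 rating scale and scaling convention are illustrative assumptions, not the published IPDASi scoring rules.

```python
import numpy as np

# Invented item ratings (assumed 1-4 scale) for one decision support technology,
# grouped into hypothetical dimensions; the real IPDASi has 47 items in 10
# dimensions and its own scoring rules, which are not reproduced here.
dimensions = {
    "information": [4, 3, 3, 4],
    "probabilities": [2, 3, 2],
    "values_clarification": [3, 3, 4, 3],
}

def dimension_means(dims):
    """Mean rating per dimension."""
    return {name: float(np.mean(items)) for name, items in dims.items()}

def overall_score(dims, item_min=1, item_max=4):
    """One plausible convention: rescale the mean of dimension means onto 0-100."""
    mean_of_means = np.mean(list(dimension_means(dims).values()))
    return 100 * (mean_of_means - item_min) / (item_max - item_min)

def cronbach_alpha(ratings):
    """Cronbach's alpha for a raters x targets matrix of overall scores."""
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[0]                      # number of raters
    rater_vars = ratings.var(axis=1, ddof=1)  # each rater's variance across targets
    total_var = ratings.sum(axis=0).var(ddof=1)
    return (k / (k - 1)) * (1 - rater_vars.sum() / total_var)

print(dimension_means(dimensions))
print(round(overall_score(dimensions), 1))

# Three hypothetical raters scoring five technologies (overall scores, invented).
scores = [[60, 72, 45, 80, 55],
          [58, 70, 50, 78, 52],
          [65, 75, 48, 82, 60]]
print("Cronbach's alpha across raters:", round(cronbach_alpha(scores), 2))
```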
