
    Piloting data linkage in a prospective cohort study of a GP referral scheme to avoid unnecessary emergency department conveyance

    BACKGROUND: UK ambulance services are under pressure to safely stream appropriate patients away from the Emergency Department (ED). Even so, there has been little evaluation of patient outcomes. We investigated differences between patients who were conveyed directly to ED after calling 999 and those referred by an ambulance crew to a novel GP referral scheme. METHODS: This was a prospective study comparing patients from two cohorts, one conveyed directly to the ED (n = 4219) and the other referred to a GP by the on-scene paramedic (n = 321). To compare differences in patient outcomes, we included follow-up data on a smaller subset of each cohort (up to n = 150 in each), including hospital admission, history of long-term illness, previous ED attendance, length of stay, hospital investigations, internal transfers, 30-day re-admission and 10-month mortality. RESULTS: Older individuals, females, and those with minor incidents were more likely to be referred to a GP than conveyed directly to ED. Of those patients referred to the GP, only 22.4% presented at ED within 30 days. These patients were then more likely to be admitted than were those initially conveyed directly to ED (59% vs 31%). Those conveyed to ED had a higher risk of death than those referred to the GP (HR: 2.59; 95% CI 1.14–5.89); however, when analyses were restricted to those who presented at ED within 30 days, there was no difference in mortality risk (HR: 1.45; 95% CI 0.58–3.65). CONCLUSIONS: Despite limited data and a small sample size, there were differences between patients conveyed directly to ED and those referred into GP care. Initial evidence suggests that referring individuals to a GP may provide an appropriate and safe alternative path of care. This pilot study demonstrated the need for a larger-scale, methodologically rigorous study to demonstrate the benefits of alternative conveyance schemes and to recommend changes to the current system of urgent and emergency care.
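
    The 10-month mortality comparison above is reported as a hazard ratio with a 95% confidence interval. The abstract does not state which model was used; the following is a minimal sketch, assuming a Cox proportional hazards fit on simulated follow-up data (all variable names, values and the 2.5x hazard multiplier are illustrative, not the study's data), of how such an estimate is typically obtained.

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        # Simulated follow-up data: survival time in days, a death indicator, and the
        # exposure group (1 = conveyed directly to ED, 0 = referred to the GP scheme).
        rng = np.random.default_rng(0)
        n = 200
        conveyed = rng.integers(0, 2, size=n)
        latent = rng.exponential(scale=600, size=n) / np.where(conveyed == 1, 2.5, 1.0)
        follow_up = 300                                   # roughly 10 months of follow-up
        df = pd.DataFrame({
            "time_days": np.minimum(latent, follow_up),   # administrative censoring
            "died": (latent <= follow_up).astype(int),
            "conveyed": conveyed,
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="time_days", event_col="died")
        cph.print_summary()  # exp(coef) for "conveyed" is the hazard ratio with its 95% CI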

    The Benefit of Enhanced Contractility in the Infarct Borderzone: A Virtual Experiment

    Objectives: Contractile function in the normally perfused infarct borderzone (BZ) is depressed. However, the impact of reduced BZ contractility on left ventricular (LV) pump function is unknown. As a consequence, there have been no therapies specifically designed to improve BZ contractility. We tested the hypothesis that an improvement in borderzone contractility would improve LV pump function. Methods: From a previously reported study, magnetic resonance imaging (MRI) with non-invasive tags was used to calculate 3D myocardial strain in five sheep 16 weeks after anteroapical myocardial infarction. Animal-specific finite element (FE) models were created using the MRI data and LV pressure obtained at early diastolic filling. Analysis of borderzone function using those FE models has been previously reported. Chamber stiffness, pump function (Starling’s law) and stress in the fiber, cross-fiber, and circumferential directions were calculated. Animal-specific FE simulations were performed for three cases: (a) impaired BZ contractility (INJURED); (b) BZ contractility fully restored (100% BZ IMPROVEMENT); or (c) BZ contractility partially restored (50% BZ IMPROVEMENT). Results: 100% BZ IMPROVEMENT and 50% BZ IMPROVEMENT both caused an upward shift in the Starling relationship, resulting in a large (36 and 26%) increase in stroke volume at LVPED = 20 mmHg (8.0 ml, p < 0.001). Moreover, there was a leftward shift in the end-systolic pressure-volume relationship, resulting in a 7 and 5% increase at LVPES = 110 mmHg (7.7 ml, p < 0.005). Even 50% BZ IMPROVEMENT was therefore sufficient to drive much of the calculated increase in function. Conclusion: Improved borderzone contractility has a beneficial effect on LV pump function. Partial improvement of borderzone contractility was sufficient to drive much of the calculated increase in function. Therapies specifically designed to improve borderzone contractility should be developed.
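
    As a rough illustration of how a Starling-type relationship follows from the pressure-volume analysis described above, the sketch below reads stroke volume off simplified end-diastolic and end-systolic pressure-volume relationships and shows how a higher end-systolic elastance (a crude stand-in for improved borderzone contractility) raises stroke volume at a fixed filling pressure. All parameter values are invented for illustration, not the animal-specific finite-element results.

        import math

        # Exponential EDPVR: P = beta * (exp(alpha * (V - V0)) - 1); invert for EDV (ml).
        def edv_from_edpvr(p_ed, v0=40.0, beta=0.6, alpha=0.04):
            return v0 + math.log(p_ed / beta + 1.0) / alpha

        # Linear ESPVR: P = Ees * (V - V0); invert for ESV (ml).
        def esv_from_espvr(p_es, ees, v0=15.0):
            return v0 + p_es / ees

        def stroke_volume(p_ed, p_es, ees):
            return edv_from_edpvr(p_ed) - esv_from_espvr(p_es, ees)

        # Raising Ees shifts the ESPVR leftward and increases stroke volume at the same
        # end-diastolic pressure (20 mmHg) and end-systolic pressure (110 mmHg).
        for label, ees in [("baseline", 2.0), ("higher contractility", 2.6)]:
            print(label, round(stroke_volume(20.0, 110.0, ees), 1), "ml")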

    Who should be prioritized for renal transplantation?: Analysis of key stakeholder preferences using discrete choice experiments

    Background In the UK and USA, policies for allocating deceased donor kidneys have recently shifted away from allocation based on Human Leucocyte Antigen (HLA) tissue matching. Newer allocation algorithms incorporate waiting time as a primary factor, and in the UK, young adults are also favoured. However, there is little contemporary UK research on the views of stakeholders in the transplant process to inform future allocation policy. This research project aimed to address this issue. Methods Discrete Choice Experiment (DCE) questionnaires were used to establish priorities for kidney transplantation among different stakeholder groups in the UK. Questionnaires were targeted at patients, carers, donors/relatives of deceased donors, and healthcare professionals. Attributes considered included: waiting time; donor-recipient HLA match; whether a recipient had dependents; diseases affecting life expectancy; and diseases affecting quality of life. Results Responses were obtained from 908 patients (including 98 from ethnic minorities); 41 carers; 48 donors/relatives of deceased donors; and 113 healthcare professionals. The patient group demonstrated statistically significant preferences for every attribute (i.e. preference weights significantly different from zero), implying that changes in a given attribute affected choices, except when prioritizing those with no rather than moderate diseases affecting quality of life. The most highly valued attributes related to waiting time, tissue match, prioritizing those with dependents, and prioritizing those with moderate rather than severe diseases affecting life expectancy. Some preferences differed between healthcare professionals and patients, and between ethnic minority and non-ethnic minority patients. Only non-ethnic minority patients and healthcare professionals clearly prioritized those with better tissue matches. Conclusions Our econometric results are broadly supportive of the 2006 shift in UK transplant policy, which emphasized prioritizing the young and long waiters. However, our findings suggest the need for a further review in the light of observed differences in preferences amongst ethnic minorities, and also because those with dependents may be a further priority.
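
    In a DCE with two alternatives per choice task, a conditional logit is equivalent to a logistic regression on the difference in attribute levels between the options. The sketch below illustrates this on simulated choices; the attributes, coding and preference weights are assumptions for illustration, not the questionnaire design or estimates from this study.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n_tasks = 500

        # Attribute-level differences between option A and option B for each choice task
        # (e.g. extra years on the waiting list, HLA mismatch level, has dependents).
        x_diff = rng.normal(size=(n_tasks, 3))
        true_beta = np.array([-0.8, -0.5, 0.6])      # assumed preference weights
        p_choose_a = 1.0 / (1.0 + np.exp(-(x_diff @ true_beta)))
        chose_a = rng.binomial(1, p_choose_a)

        # No intercept: the alternatives are unlabelled, so only attribute differences matter.
        result = sm.Logit(chose_a, x_diff).fit(disp=False)
        print(result.params)      # recovered preference weights
        print(result.conf_int())  # 95% confidence intervals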

    Flower Bats (Glossophaga soricina) and Fruit Bats (Carollia perspicillata) Rely on Spatial Cues over Shapes and Scents When Relocating Food

    Natural selection can shape specific cognitive abilities and the extent to which a given species relies on various cues when learning associations between stimuli and rewards. Because the flower bat Glossophaga soricina feeds primarily on nectar, and the locations of nectar-producing flowers remain constant, G. soricina might be predisposed to learn to associate food with locations. Indeed, G. soricina has been observed to rely far more heavily on spatial cues than on shape cues when relocating food, and to learn poorly when shape alone provides a reliable cue to the presence of food. Here we determined whether G. soricina would learn to use scent cues as indicators of the presence of food when such cues were also available. Nectar-producing plants fed upon by G. soricina often produce distinct, intense odors. We therefore expected G. soricina to relocate food sources using scent cues, particularly the flower-produced compound, dimethyl disulfide, which is attractive even to G. soricina with no previous experience of it. We also compared the learning of associations between cues and food sources by G. soricina with that of a related fruit-eating bat, Carollia perspicillata. We found that (1) G. soricina did not learn to associate scent cues, including dimethyl disulfide, with feeding sites when the previously rewarded spatial cues were also available, and (2) both the fruit-eating C. perspicillata and the flower-feeding G. soricina were significantly more reliant on spatial cues than associated sensory cues for relocating food. These findings, taken together with past results, provide evidence of a powerful, experience-independent predilection of both species to rely on spatial cues when attempting to relocate food.

    Rich Pickings Near Large Communal Roosts Favor ‘Gang’ Foraging by Juvenile Common Ravens, Corvus corax

    Ravens (Corvus corax) feed primarily on rich but ephemeral carcasses of large animals, which are usually defended by territorial pairs of adults. Non-breeding juveniles forage socially and aggregate in communal winter roosts, and these appear to function as ‘information centers’ regarding the location of the rare food bonanzas: individuals search independently of one another and pool their effort by recruiting each other at roosts. However, at a large raven roost in Newborough on Anglesey, North Wales, some juveniles have recently been observed to forage in ‘gangs’ and to roost separately from other birds. Here we adapt a general model of juvenile common raven foraging behavior to show that, in addition to the typical co-operative foraging strategy, such gang foraging behavior could be evolutionarily stable near winter raven roosts. We refocus the model on the conditions under which this newly documented, yet theoretically anticipated, gang-based foraging has been observed. In the process, we show formally how the trade-off between search efficiency and social opportunity can account for the existence of the alternative social foraging tactics observed in this species. This work highlights a number of fruitful avenues for future research, from both theoretical and empirical perspectives.
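
    The abstract does not reproduce the model's equations, but the evolutionary-stability argument it refers to can be illustrated with a toy payoff that trades search efficiency against social opportunity near a roost. Everything below (the payoff structure and its numbers) is hypothetical and intended only to show what an evolutionary-stability check of this kind looks like; it is not the model from the paper.

        # Toy payoff: lone searchers find food more efficiently; gang members gain a share
        # of socially discovered food, and that share grows with roost-area food richness.
        def payoff(strategy, resident, food_richness):
            search_efficiency = 1.0 if strategy == "solo" else 0.6
            social_share = 0.0
            if strategy == "gang":
                social_share = food_richness * (0.8 if resident == "gang" else 0.4)
            return search_efficiency + social_share

        # A resident strategy is stable here if no rare mutant strategy does strictly better.
        def is_stable(resident, food_richness):
            mutant = "gang" if resident == "solo" else "solo"
            return payoff(resident, resident, food_richness) >= payoff(mutant, resident, food_richness)

        for richness in (0.2, 1.0):
            print(f"richness={richness}: gang stable -> {is_stable('gang', richness)}, "
                  f"solo stable -> {is_stable('solo', richness)}")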

    An excess of niche differences maximizes ecosystem functioning

    Ecologists have long argued that higher functioning in diverse communities arises from the niche differences stabilizing species coexistence and from the fitness differences driving competitive dominance. However, rigorous tests are lacking. We couple field-parameterized models of competition between 10 annual plant species with a biodiversity-functioning experiment under two contrasting environmental conditions to study how coexistence determinants link to biodiversity effects (selection and complementarity). We find that complementarity effects correlate positively with niche differences and selection effects correlate with fitness differences. However, niche differences also contribute to selection effects, and fitness differences to complementarity effects. Despite this complexity, communities with an excess of niche differences (where niche differences exceeded those needed for coexistence) produce more biomass and have faster decomposition rates under drought, but do not take up nutrients more rapidly. We provide empirical evidence that the mechanisms determining coexistence correlate with those maximizing ecosystem functioning. It is unclear how biodiversity-ecosystem functioning and species coexistence mechanisms are linked. Here, Godoy and colleagues combine field-parameterised competition models with a BEF experiment to show that mechanisms leading to more stable species coexistence lead to greater productivity, but not necessarily to enhanced functions other than primary production.
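
    The selection and complementarity effects referred to above are commonly computed with the additive partition of Loreau & Hector (2001). A minimal sketch of that partition, using made-up yields for a four-species mixture rather than the experimental data, is:

        import numpy as np

        mono = np.array([120.0, 80.0, 60.0, 40.0])  # monoculture biomass of each species
        mix = np.array([45.0, 25.0, 20.0, 5.0])     # biomass of each species in the mixture
        expected_ry = np.full(4, 1.0 / 4.0)         # expected relative yields (equal sowing)

        delta_ry = mix / mono - expected_ry         # change in relative yield
        n = len(mono)

        complementarity = n * delta_ry.mean() * mono.mean()
        selection = n * np.cov(delta_ry, mono, bias=True)[0, 1]   # population covariance
        net_effect = mix.sum() - (expected_ry * mono).sum()

        # complementarity + selection equals the net biodiversity effect
        print(round(complementarity, 2), round(selection, 2), round(net_effect, 2))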

    Domestic horses (Equus caballus) discriminate between negative and positive human nonverbal vocalisations

    The ability to discriminate between emotions in vocal signals is highly adaptive in social species. It may also be adaptive for domestic species to distinguish such signals in humans. Here we present a playback study investigating whether horses spontaneously respond in a functionally relevant way towards positive and negative emotion in human nonverbal vocalisations. We presented horses with positively and negatively valenced human vocalisations (laughter and growling, respectively) in the absence of all other emotional cues. Horses adopted a freeze posture for significantly longer immediately after hearing negative versus positive human vocalisations, suggesting that negative voices promote vigilance behaviours and may therefore be perceived as more threatening. In support of this interpretation, horses held their ears forwards for longer and performed fewer ear movements in response to negative voices, which further suggests increased vigilance. In addition, horses showed a right-ear/left-hemisphere bias when attending to positive compared with negative voices, suggesting that horses perceive laughter as more positive than growling. These findings raise interesting questions about the potential for universal discrimination of vocal affect and the role of lifetime learning versus other factors in interspecific communication.

    Threat-sensitive anti-predator defence in precocial wader, the northern lapwing Vanellus vanellus

    Birds exhibit various forms of anti-predator behaviour to avoid reproductive failure, with mobbing (observation, approach and usually harassment of a predator) being one of the most commonly observed. Here, we investigate patterns of temporal variation in the mobbing response exhibited by a precocial species, the northern lapwing (Vanellus vanellus). We test whether brood age and self-reliance, or the perceived risk posed by various predators, affect the mobbing response of lapwings. We quantified aggressive interactions between lapwings and their natural avian predators and used generalized additive models to test how timing and predator species identity are related to the mobbing response of lapwings. Lapwings varied their mobbing response over the breeding season and among predator species. Raven Corvus corax, hooded crow Corvus cornix and harriers evoked the strongest responses, while common buzzard Buteo buteo, white stork Ciconia ciconia, black-headed gull Chroicocephalus ridibundus and rook Corvus frugilegus were attacked less frequently. Lapwings increased their mobbing response against raven, common buzzard, white stork and rook throughout the breeding season, while defence against hooded crow, harriers and black-headed gull did not exhibit clear temporal patterns. Mobbing behaviour of lapwings apparently constitutes a flexible anti-predator strategy. The anti-predator response depends on predator species, which may suggest that lapwings distinguish between predator types and match their mobbing response to the perceived hazard at different stages of the breeding cycle. We conclude that a single species may exhibit various patterns of temporal variation in anti-predator defence, which may correspond with various hypotheses derived from parental investment theory.
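
    The analysis described above pairs a smooth seasonal term with a predator-species factor. A minimal sketch of that kind of model, here a spline-based Poisson GLM in statsmodels standing in for the generalized additive models used in the study, fitted to simulated (not real) mobbing counts:

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(2)
        n = 300
        obs = pd.DataFrame({
            "day": rng.uniform(0, 60, size=n),                        # day of the breeding season
            "predator": rng.choice(["raven", "buzzard", "stork"], n), # predator species identity
        })
        # Simulated counts: response rises over the season and differs among predators.
        rate = np.exp(0.02 * obs["day"]
                      + obs["predator"].map({"raven": 1.0, "buzzard": 0.2, "stork": 0.0}))
        obs["mobbing_events"] = rng.poisson(rate)

        model = smf.glm("mobbing_events ~ bs(day, df=4) + C(predator)",
                        data=obs, family=sm.families.Poisson())
        print(model.fit().summary())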