
    Prey life-history and bioenergetic responses across a predation gradient

    To evaluate the importance of non-consumptive effects of predators on prey life histories under natural conditions, an index of predator abundance was developed for naturally occurring populations of a common prey fish, the yellow perch Perca flavescens, and compared to life-history variables and rates of prey energy acquisition and allocation as estimated from mass balance models. The predation index was positively related to maximum size and size at maturity in both male and female P. flavescens, but not to life span or reproductive investment. The predation index was positively related to size-adjusted specific growth rates and growth efficiencies but negatively related to model estimates of size-adjusted specific consumption and activity rates in both vulnerable (small) and invulnerable (large) size classes of P. flavescens. These observations suggest a trade-off between growth and activity rates, mediated by reduced activity in response to increasing predator densities. Lower growth rates and growth efficiencies in populations with fewer predators, despite increased consumption, suggest either 1) a reduction in prey resources at lower predator densities or 2) an intrinsic cost of rapid prey growth that makes it unfavourable unless offset by a perceived threat of predation. This study provides evidence of trade-offs between growth and activity rates induced by predation risk in natural prey fish populations and illustrates how behavioural modification induced through predation can shape the life histories of prey fish species.
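    As an illustration of the kind of analysis summarised above (a hypothetical sketch, not the study's code; all variable names and values are invented), the snippet below size-adjusts specific growth rates by regressing them on body mass and then correlates the residuals with a predator-abundance index.

        import numpy as np

        # Toy per-population data: predator-abundance index, mean body mass (g),
        # and specific growth rate G = (ln W2 - ln W1) / dt (per day).
        predation_index = np.array([0.2, 0.5, 0.9, 1.4, 2.1])
        body_mass_g = np.array([35.0, 40.0, 48.0, 55.0, 61.0])
        specific_growth = np.array([0.004, 0.005, 0.007, 0.009, 0.011])

        # Size-adjust growth by taking residuals from a log-log regression on mass,
        # so populations are compared at a common body size.
        slope, intercept = np.polyfit(np.log(body_mass_g), np.log(specific_growth), 1)
        size_adjusted = np.log(specific_growth) - (intercept + slope * np.log(body_mass_g))

        # A positive correlation mirrors the trend reported above.
        r = np.corrcoef(predation_index, size_adjusted)[0, 1]
        print(f"correlation(predation index, size-adjusted growth) = {r:.2f}")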

    Anthropogenic Disturbance and Population Viability of Woodland Caribou in Ontario

    One of the most challenging tasks in wildlife conservation and management is to clarify how spatial variation in land cover due to anthropogenic disturbance influences wildlife demography and long‐term viability. To evaluate this, we compared rates of survival and population growth by woodland caribou (Rangifer tarandus caribou) from 2 study sites in northern Ontario, Canada, that differed in the degree of anthropogenic disturbance because of commercial logging and road development, resulting in differences in predation risk due to gray wolves (Canis lupus). We used an individual‐based model for population viability analysis (PVA) that incorporated adaptive patterns of caribou movement in relation to predation risk and food availability to predict stochastic variation in rates of caribou survival. Field estimates of annual survival rates for adult female caribou in the unlogged (x̄ = 0.90) and logged (x̄ = 0.76) study sites recorded during 2010–2014 did not differ significantly (P > 0.05) from values predicted by the individual‐based PVA model (unlogged: x̄ = 0.87; logged: x̄ = 0.79). Outcomes from the individual‐based PVA model and a simpler stage‐structured matrix model suggest that substantial differences in adult survival, largely due to wolf predation, are likely to lead to long‐term decline of woodland caribou in the commercially logged landscape, whereas the unlogged landscape should be considerably more capable of sustaining caribou. Estimates of population growth rates (λ) for the 2010–2014 period differed little between the matrix model and the individual‐based PVA model for the unlogged (matrix model x̄ = 1.01; individual‐based model x̄ = 0.98) and logged landscapes (matrix model x̄ = 0.88; individual‐based model x̄ = 0.89). We applied the spatially explicit PVA model to assess the viability of woodland caribou across 14 woodland caribou ranges in Ontario. Outcomes of these simulations suggest that woodland caribou ranges that have experienced significant levels of commercial forestry activities in the past had lower annual growth rates than ranges with little anthropogenic disturbance, where growth rates exceeded 0.96. These differences were strongly related to regional variation in wolf densities. Our results suggest that increased wolf predation risk due to anthropogenic disturbance is of sufficient magnitude to cause appreciable risk of population decline in woodland caribou in Ontario. © 2020 The Authors. The Journal of Wildlife Management published by Wiley Periodicals, Inc. on behalf of The Wildlife Society.
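    For readers unfamiliar with the matrix-model side of the comparison, the sketch below shows how an annual growth rate λ can be obtained as the dominant eigenvalue of a simple two-stage, female-only projection matrix. Only the adult survival rates (0.90 unlogged, 0.76 logged) come from the abstract; the calf survival and recruitment values are hypothetical placeholders, so this is not the authors' parameterisation.

        import numpy as np

        def annual_growth_rate(adult_survival, calf_survival=0.4, calves_per_female=0.4):
            # Two-stage (calf, adult) female-only projection matrix.
            A = np.array([
                [0.0, calves_per_female * adult_survival],  # recruitment into the calf stage
                [calf_survival, adult_survival],            # survival to / within the adult stage
            ])
            return max(abs(np.linalg.eigvals(A)))  # dominant eigenvalue = lambda

        print("unlogged lambda ~", round(annual_growth_rate(0.90), 2))
        print("logged   lambda ~", round(annual_growth_rate(0.76), 2))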

    Compliance assessment of ambulatory Alzheimer patients to aid therapeutic decisions by healthcare professionals

    Background: Compliance represents a major determinant of the effectiveness of pharmacotherapy. Compliance reports summarising electronically compiled compliance data qualify healthcare needs and can be utilised as part of a compliance-enhancing intervention. Nevertheless, evidence-based information on a sufficient level of compliance is scarce, complicating the interpretation of compliance reports. The purpose of our pilot study was to determine the compliance of ambulatory Alzheimer patients to antidementia drugs under routine therapeutic use using electronic monitoring. In addition, the forgiveness of donepezil (i.e. its ability to sustain adequate pharmacological response despite suboptimal compliance) was characterised, and evidence-based guidance for the interpretation of compliance reports was intended to be developed. Methods: We determined compliance to four different antidementia drugs by electronic monitoring in 31 patients over six months. All patients were recruited from the gerontopsychiatric clinic of a university hospital as part of a pilot study. The so-called medication event monitoring system (MEMS) was employed, consisting of a vial with a microprocessor in the lid which records the time (date, hour, minute) of every opening. Daily compliance served as the primary outcome measure, defined as the percentage of days with correctly administered doses of medication. In addition, pharmacokinetics and pharmacodynamics of donepezil were simulated to systematically assess therapeutic undersupply, also incorporating study compliance patterns. Statistical analyses were performed with SPSS and Microsoft Excel. Results: Median daily compliance was 94% (range 48%-99%). Ten patients (32%) were non-compliant for at least one month. One-sixth of patients taking donepezil displayed periods of therapeutic undersupply. For 10 mg and 5 mg donepezil once-daily dosing, the estimated forgiveness of donepezil was 80% and 90% daily compliance, or two and one dosage omissions at steady state, respectively. Based on the simulation findings we developed rules for the evidence-based interpretation of donepezil compliance reports. Conclusions: Compliance in ambulatory Alzheimer patients was for the first time assessed under routine conditions using electronic monitoring: on average, compliance was relatively high but variable between patients. The approach of pharmacokinetic/pharmacodynamic in silico simulations was suitable to characterise the forgiveness of donepezil, suggesting evidence-based recommendations for the interpretation of compliance reports.
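    The primary outcome defined above is straightforward to compute from MEMS opening records. The sketch below (a hypothetical example with invented timestamps and a once-daily regimen, not study data or code) counts the percentage of monitored days on which the prescribed number of cap openings was recorded.

        from datetime import date, datetime, timedelta

        def daily_compliance(openings, start, end, doses_per_day=1):
            """Percent of days in [start, end] with exactly the prescribed dose count."""
            n_days = (end - start).days + 1
            counts = {start + timedelta(days=d): 0 for d in range(n_days)}
            for ts in openings:
                if ts.date() in counts:
                    counts[ts.date()] += 1
            correct = sum(1 for c in counts.values() if c == doses_per_day)
            return 100.0 * correct / n_days

        openings = [datetime(2010, 3, 1, 8, 5), datetime(2010, 3, 2, 8, 40),
                    datetime(2010, 3, 4, 9, 10)]  # the dose on 3 March was missed
        print(daily_compliance(openings, date(2010, 3, 1), date(2010, 3, 4)))  # 75.0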

    Selection for Forage and Avoidance of Risk by Woodland Caribou (Rangifer Tarandus Caribou) at Coarse and Local Scales

    The relationship between selection at coarse and fine spatiotemporal scales is still poorly understood. Some authors claim that, to accommodate different needs at different scales, individuals should have contrasting selection patterns at different scales of selection, while others claim that coarse-scale selection patterns should reflect fine-scale selection decisions. Here we examine site selection by 110 woodland caribou equipped with GPS radio‐collars with respect to forage availability and predation risk across a broad gradient in availability of both variables in boreal forests of northern Ontario. We tested whether caribou selection for forage and avoidance of risk were consistent between coarse (seasonal home range) and fine scales of selection. We found that local selection patterns predicted coarse-scale selection patterns, indicating a close relationship between the drivers of selection at both spatial scales.
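    One standard way to quantify this kind of selection (an assumed, generic approach shown with simulated data, not necessarily the authors' model) is a used-versus-available logistic regression, i.e. a resource selection function with forage availability and predation risk as covariates:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 500
        forage = rng.uniform(0, 1, n)  # forage availability at each location
        risk = rng.uniform(0, 1, n)    # index of wolf predation risk
        # Simulated used (1) vs available (0) locations: selection for forage,
        # avoidance of risk, mirroring the pattern described above.
        p_used = 1 / (1 + np.exp(-(2.0 * forage - 2.5 * risk)))
        used = rng.binomial(1, p_used)

        rsf = LogisticRegression().fit(np.column_stack([forage, risk]), used)
        print("selection coefficients (forage, risk):", rsf.coef_[0])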

    Landscape-Level Wolf Space Use is Correlated With Prey Abundance, Ease of Mobility and the Distribution of Prey Habitat

    Predator space use influences ecosystem dynamics, and a fundamental goal assumed for a foraging predator is to maximize encounter rate with prey. This can be achieved by disproportionately utilizing areas of high prey density or, where prey are mobile and therefore spatially unpredictable, utilizing patches of their prey's preferred resources. A third, potentially complementary strategy is to increase mobility by using linear features like roads and/or frozen waterways. Here, we used novel population-level predator utilization distributions (termed 'localized density distributions') in a single-predator (wolf), two-prey (moose and caribou) system to evaluate these space-use hypotheses. The study was conducted in contrasting sections of a large boreal forest area in northern Ontario, Canada, with a spatial gradient of human disturbances and predator and prey densities. Our results indicated that wolves consistently used forest stands preferred by moose, their main prey species in this part of Ontario. Direct use of prey-rich areas was also significant but restricted to where there was a high local density of moose, whereas use of linear features was pronounced where local moose density was lower. These behaviors suggest that wolf foraging decisions, while consistently influenced by spatially anchored patches of prey forage resources, were also determined by local ecological conditions, specifically prey density. Wolves appeared to utilize prey-rich areas when regional preferred-prey density exceeded a threshold that made this profitable, whereas they disproportionately used linear features that promoted mobility when low prey density made directly tracking prey distribution unprofitable.
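    Purely to make the idea of a utilization distribution concrete, the sketch below fits a generic two-dimensional kernel density estimate to invented GPS fixes; the localized density distributions used in the study are a specific population-level construct, so this is an illustrative stand-in rather than the paper's method.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(1)
        # Hypothetical wolf GPS fixes (x, y in km), clustered around two areas.
        fixes = np.vstack([rng.normal([5.0, 5.0], 1.0, (150, 2)),
                           rng.normal([12.0, 8.0], 1.5, (100, 2))])

        ud = gaussian_kde(fixes.T)  # kernel density over the fixes
        grid_x, grid_y = np.mgrid[0:16:100j, 0:16:100j]
        density = ud(np.vstack([grid_x.ravel(), grid_y.ravel()])).reshape(grid_x.shape)
        print("grid cell of peak relative use:", np.unravel_index(density.argmax(), density.shape))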

    Evolutionary tradeoffs in cellular composition across diverse bacteria

    One of the most important classic and contemporary interests in biology is the connection between cellular composition and physiological function. Decades of research have allowed us to understand the detailed relationship between various cellular components and processes for individual species, and have uncovered common functionality across diverse species. However, there still remains the need for frameworks that can mechanistically predict the tradeoffs between cellular functions and elucidate and interpret average trends across species. Here we provide a comprehensive analysis of how cellular composition changes across the diversity of bacteria as connected with physiological function and metabolism, spanning five orders of magnitude in body size. We present an analysis of the trends with cell volume that covers shifts in genomic, protein, cellular envelope, RNA and ribosomal content. We show that trends in protein content are more complex than a simple proportionality with the overall genome size, and that the number of ribosomes is simply explained by cross-species shifts in biosynthesis requirements. Furthermore, we show that the largest and smallest bacteria are limited by physical space requirements. At the lower end of size, cell volume is dominated by DNA and protein content, the requirement for which predicts a lower limit on cell size that is in good agreement with the smallest observed bacteria. At the upper end of bacterial size, we have identified a point at which the number of ribosomes required for biosynthesis exceeds the available cell volume. Between these limits we are able to discuss systematic and dramatic shifts in cellular composition. Much of our analysis is connected with the basic energetics of cells, where we show that metabolic rate scales surprisingly superlinearly with all cellular components.
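    The scaling arguments above boil down to fitting power laws of the form Y = c V^b, where an exponent b > 1 means superlinear scaling with cell volume V. The sketch below (synthetic data with an assumed exponent, not the study's measurements) shows the usual log-log regression used to estimate b.

        import numpy as np

        rng = np.random.default_rng(2)
        volume = np.logspace(-1, 4, 20)  # cell volume spanning five orders of magnitude
        true_b, true_c = 1.2, 0.5        # assumed superlinear exponent and prefactor
        metabolic_rate = true_c * volume**true_b * np.exp(rng.normal(0, 0.1, 20))

        # Fit log(Y) = log(c) + b * log(V); the slope is the scaling exponent.
        b, log_c = np.polyfit(np.log(volume), np.log(metabolic_rate), 1)
        print(f"fitted scaling exponent b = {b:.2f} (b > 1 indicates superlinear scaling)")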

    Improving Coping Skills for Self-management of Treatment Side Effects Can Reduce Antiretroviral Medication Nonadherence among People Living with HIV

    Background: Human immunodeficiency virus (HIV) treatment side effects have a deleterious impact on treatment adherence, which is necessary to optimize treatment outcomes including morbidity and mortality. Purpose: To examine the effect of the Balance Project intervention, a five-session, individually delivered HIV treatment side-effects coping skills intervention, on antiretroviral medication adherence. Methods: HIV+ men and women (N = 249) on antiretroviral therapy (ART) with self-reported high levels of ART side-effect distress were randomized to intervention or treatment as usual. The primary outcome was self-reported ART adherence as measured by a combined 3-day and 30-day adherence assessment. Results: Intent-to-treat analyses revealed a significant difference in rates of nonadherence between intervention and control participants across the follow-up time points, such that those in the intervention condition were less likely to report nonadherence. Secondary analyses revealed that intervention participants were more likely to seek information about side effects and social support in efforts to cope with side effects. Conclusions: Interventions focusing on skills related to ART side-effects management show promise for improving ART adherence among persons experiencing high levels of perceived ART side effects.

    The Risk of Virologic Failure Decreases with Duration of HIV Suppression, at Greater than 50% Adherence to Antiretroviral Therapy

    Background: We hypothesized that the percent adherence to antiretroviral therapy necessary to maintain HIV suppression would decrease with longer duration of viral suppression. Methodology: Eligible participants were identified from the REACH cohort of marginally housed HIV-infected adults in San Francisco. Adherence to antiretroviral therapy was measured through pill counts obtained at unannounced visits by research staff to each participant's usual place of residence. Marginal structural models and targeted maximum likelihood estimation methodologies were used to determine the effect of adherence to antiretroviral therapy on the probability of virologic failure during early and late viral suppression. Principal Findings: A total of 221 subjects were studied (median age 44.1 years; median CD4+ T cell nadir 206 cells/mm3). Most subjects were taking the following types of antiretroviral regimens: non-nucleoside reverse transcriptase inhibitor based (37%), ritonavir-boosted protease inhibitor based (28%), or unboosted protease inhibitor based (25%). Comparing the probability of failure just after achieving suppression vs. after 12 consecutive months of suppression, there was a statistically significant decrease in the probability of virologic failure for each range of adherence proportions we considered, as long as adherence was greater than 50%. The estimated risk difference, comparing the probability of virologic failure after 1 month vs. after 12 months of continuous viral suppression, was 0.47 (95% CI 0.23–0.63) at 50–74% adherence, 0.29 (CI 0.03–0.50) at 75–89% adherence, and 0.36 (CI 0.23–0.48) at 90–100% adherence. Conclusions: The risk of virologic failure for adherence greater than 50% declines with longer duration of continuous suppression. While high adherence is required to maximize the probability of durable viral suppression, the range of adherence capable of sustaining viral suppression is wider after prolonged periods of viral suppression.
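    The study estimated adjusted risk differences with marginal structural models and targeted maximum likelihood estimation; the much simpler sketch below, using invented counts, only illustrates what an unadjusted risk difference and its Wald-type 95% confidence interval look like.

        import math

        def risk_difference(fail_early, n_early, fail_late, n_late, z=1.96):
            """Unadjusted risk difference (early minus late) with a Wald 95% CI."""
            p1, p2 = fail_early / n_early, fail_late / n_late
            rd = p1 - p2
            se = math.sqrt(p1 * (1 - p1) / n_early + p2 * (1 - p2) / n_late)
            return rd, (rd - z * se, rd + z * se)

        # Hypothetical counts of virologic failure after 1 vs. 12 months of suppression.
        rd, ci = risk_difference(fail_early=30, n_early=50, fail_late=10, n_late=60)
        print(f"risk difference = {rd:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")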

    Persistence survey of Toxic Shock Syndrome toxin-1 producing Staphylococcus aureus and serum antibodies to this superantigen in five groups of menstruating women

    Background: Menstrual Toxic Shock Syndrome (mTSS) is thought to be associated with vaginal colonization by TSST-1-producing strains of Staphylococcus aureus in women who lack sufficient antibody titers to this toxin. There are no published studies that examine seroconversion in women with various colonization patterns of this organism. Thus, the aim of this study was to evaluate the persistence of Staphylococcus aureus colonization at three body sites (vagina, nares, and anus) and serum antibody to toxic shock syndrome toxin-producing Staphylococcus aureus among a small group of healthy, menstruating women evaluated previously in a larger study. Methods: One year after the completion of that study, 311 subjects were recalled into 5 groups. Four samples were obtained from each participant at several visits over an additional 6-11 month period: 1) an anterior nares swab; 2) an anal swab; 3) a vaginal swab; and 4) a blood sample. Gram stain, a catalase test, and a rapid S. aureus-specific latex agglutination test were performed to phenotypically identify S. aureus from the sample swabs. A competitive ELISA was used to quantify TSST-1 production. Human TSST-1 IgG antibodies were determined from the blood samples using a sandwich ELISA method. Results: We found that only 41% of toxigenic and 35.5% of non-toxigenic S. aureus nasal carriage could be classified as persistent. None of the toxigenic S. aureus vaginal or anal carriage could be classified as persistent. Despite the low persistence of S. aureus colonization, subjects colonized with a toxigenic strain were found to display distributions of antibody titers skewed toward higher titers than other subjects. Seven percent (5/75) of subjects became seropositive during recall, but none experienced toxic shock syndrome-like symptoms. Conclusions: Nasal carriage of S. aureus appears to be persistent and the best predictor of subsequent colonization, whereas vaginal and anal carriage appear to be more transient. From these findings, it appears that antibody titers in women found to be colonized with toxigenic S. aureus remained skewed toward higher titers whether the colonization was persistent or transient in nature. This suggests that colonization at some point in time is sufficient to elevate antibody titer levels and that those levels appear to be persistent. Results also indicate that women can become seropositive without experiencing signs or symptoms of toxic shock syndrome.
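    As a small illustration of the carriage classification discussed above, the sketch below flags a body site as persistent when every sampled visit is culture-positive; that criterion and the data are assumptions made for the example, not the study's stated definition or results.

        # Toy data: participant -> body site -> S. aureus culture result per visit.
        visits = {
            "P01": {"nares": [True, True, True], "vagina": [True, False, False]},
            "P02": {"nares": [True, False, True], "anus": [False, False, False]},
        }

        def is_persistent(results):
            # Assumed criterion: culture-positive at every sampled visit.
            return all(results)

        for participant, sites in visits.items():
            for site, results in sites.items():
                label = "persistent" if is_persistent(results) else "transient/absent"
                print(participant, site, label)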