
    Evaluation of the Impacts of Radio-Marking Devices on Feral Horses and Burros in a Captive Setting

    Radio-collars and other radio-marking devices have been invaluable tools for wildlife managers for >40 years. These marking devices have improved our understanding of wildlife spatial ecology and demographic parameters and provided new data facilitating model development for species conservation and management. Although these tools have been used on virtually all North American ungulates, their deployment on feral horses (Equus ferus caballus) or burros (E. asinus) has been limited. To determine if radio-collars and radio-tags could be safely deployed on feral equids, we conducted a 1-year observational study in 2015 to investigate fit and wear of radio-collars on feral horses and burros kept in pastures/pens at the Bureau of Land Management contracted adoption facility in Pauls Valley, Oklahoma, USA. We assessed the impact of radio-collars and transmitter tags on individual behavior and body condition, and examined neck surfaces for effects. We tested 2 radio-collar shapes (teardrop and oval) and a radio-tag (i.e., avian backpack) braided into the mane and tail of horses. Behavior of mares did not differ between radio-collared (n = 12) and control (uncollared; n = 12) individuals. Despite the small sample size, collared burro jennies (n = 4) spent more time standing than controls (n = 4). Stallions wearing radio-collars (n = 9) fed less, moved less, and stood more than controls (n = 8). During the study, we did not detect injuries to the necks of mares or burro jennies, but stallions developed small sores (which healed while the radio-collars were still being worn and re-haired within 3 months). Two radio-collars occasionally flipped forward over the ears onto the foreheads of stallions. Although our study confirmed that radio-collars could be safely deployed on captive mares and jennies, stallions proved challenging for a variety of reasons. While our conclusions were optimistic, longer studies will be required to ensure radio-collar safety on free-ranging feral horses and burros.

    Population growth, movements, and status of the Nushagak Peninsula Caribou Herd following reintroduction, 1988 - 2000

    Barren ground caribou (Rangifer tarandus) were reintroduced to the Nushagak Peninsula, Alaska in February of 1988 after an absence of more than 100 years. Since reintroduction, herd growth and population dynamics have been monitored closely. To date, there has been no significant dispersal from the herd's core range. The Nushagak Peninsula Caribou Herd (NPCH) grew rapidly from 146 reintroduced individuals to over 1000 in 13 years. Dramatic mean annual growth during the first 6 years (1988-1994) of 38% (r = 0.32) can be attributed to the high percentage of females in the initial reintroduction, high calf production and survival, exceptional range conditions, few predators, and no hunting. However, the population's exceptional growth (peak counts of 1400) slowed and stabilized between 1996 and 1998 and then decreased between 1998 and 2000. Size, body condition, and weights of calves captured in 2000 were significantly lower than those of calves captured in 1995 and 1997. Although calf production also decreased from close to 100% (1990-1995) to about 91% (1996-2000), overall calf survival remained high. Legal harvest began in 1995, and harvest reports have accounted for approximately 3% of population mortality annually. Although brown bears (Ursus arctos) and wolves (Canis lupus) are present, the extent of predation is unknown. Mean home range of the NPCH was 674 km2, and group sizes were greatest during post-calving aggregation in July (mean = 127). Caribou population density on the Nushagak Peninsula reached approximately 1.2 caribou/km2 in 1997 before declining to about 1.0 caribou/km2. A range survey in 1994 noted only trace utilization of lichens on the Nushagak Peninsula by caribou. A subsequent survey in 1999 found moderate to severe utilization in 46% of plots, suggesting the reintroduced herd was beginning to alter range condition. Between 1997 and 2000, both calf production and condition of 10-month-old calves declined. Calving has also been delayed in recent years. However, we suspect the reduced herd growth can be attributed to increasing hunting pressure and some dispersal of caribou from the Peninsula, not to reduced range condition.
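The growth figures above can be cross-checked: a 38% mean annual increase corresponds to an instantaneous rate r = ln(1.38) ≈ 0.32, which matches the reported value. A minimal sketch using only values from the abstract (the 6-year projection is illustrative, not a reported count):

```python
import math

def exponential_growth(n0: float, r: float, t: float) -> float:
    """Project population size under continuous exponential growth: N(t) = N0 * e^(r*t)."""
    return n0 * math.exp(r * t)

# Instantaneous rate implied by a 38% mean annual increase (lambda = 1.38)
r = math.log(1.38)
print(f"r = {r:.2f}")  # r = 0.32, as reported

# Projection over the first 6 years from the 146 reintroduced caribou
n_1994 = exponential_growth(146, r, 6)
print(f"Projected 1994 population: {n_1994:.0f}")  # ~1000, consistent with growth to >1000
```

The same relation run in reverse (r = ln(N_t/N_0)/t) is how a mean annual rate is usually recovered from census counts.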

    Considering the role of cognitive control in expert performance

    © 2014, Springer Science+Business Media Dordrecht. Dreyfus and Dreyfus’ (1986) influential phenomenological analysis of skill acquisition proposes that expert performance is guided by non-cognitive responses which are fast, effortless, and apparently intuitive in nature. Although this model has been criticised (e.g., by Breivik, Journal of the Philosophy of Sport, 34, 116–134, 2007, and 40, 85–106, 2013; Eriksen 2010; Montero, Inquiry: An Interdisciplinary Journal of Philosophy, 53, 105–122, 2010; Montero and Evans 2011) for over-emphasising the role that intuition plays in facilitating skilled performance, it does recognise that on occasion (e.g., when performance goes awry for some reason) a form of ‘detached deliberative rationality’ may be used by experts to improve their performance. However, Dreyfus and Dreyfus (1986) see no role for calculative problem solving or deliberation (i.e., drawing on rules or mental representations) when performance is going well. In the current paper, we draw on empirical evidence, insights from athletes, and phenomenological description to argue that ‘continuous improvement’ (i.e., the phenomenon whereby certain skilled performers appear to be capable of increasing their proficiency even though they are already experts; Toner and Moran 2014) among experts is mediated by cognitive (or executive) control in three distinct sporting situations (i.e., in training, during pre-performance routines, and while engaged in on-line skill execution). We conclude by arguing that Sutton et al.’s (Journal of the British Society for Phenomenology, 42, 78–103, 2011) ‘applying intelligence to the reflexes’ (AIR) approach may help to elucidate the process by which expert performers achieve continuous improvement through analytical/mindful behaviour during training and competition.

    Design choices for observational studies of the effect of exposure on disease incidence.

    The purpose of this paper is to help readers choose an appropriate observational study design for measuring an association between an exposure and disease incidence. We discuss cohort studies, sub-samples from cohorts (case-cohort and nested case-control designs), and population-based or hospital-based case-control studies. Appropriate study design is the foundation of a scientifically valid observational study, and mistakes in design are often irremediable. Key steps are understanding the scientific aims of the study and what is required to achieve them; some designs will not yield the information required to realise the aims. The choice of design also depends on the availability of source populations and resources. Choosing an appropriate design requires balancing the pros and cons of various designs in view of study aims and practical constraints. We compare various cohort and case-control designs to estimate the effect of an exposure on disease incidence and mention how certain design features can reduce threats to study validity.
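As a concrete illustration of the case-control designs discussed above, the exposure-disease association in such studies is typically estimated as an odds ratio from a 2×2 table of exposure counts in cases and controls. A minimal sketch, with entirely hypothetical counts (not from any study cited here):

```python
def odds_ratio(exposed_cases: int, unexposed_cases: int,
               exposed_controls: int, unexposed_controls: int) -> float:
    """Odds ratio for a case-control 2x2 table: (a/b) / (c/d) = a*d / (b*c)."""
    return (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)

# Hypothetical data: 40 of 200 cases exposed vs. 20 of 200 controls exposed
or_est = odds_ratio(40, 160, 20, 180)
print(f"OR = {or_est:.2f}")  # OR = 2.25
```

Because case-control sampling fixes the number of cases and controls rather than following a cohort forward, the odds ratio (not the incidence rate ratio) is the measure these designs estimate directly.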

    Epigenotyping in Peripheral Blood Cell DNA and Breast Cancer Risk: A Proof of Principle Study

    Background: Epigenetic changes are emerging as one of the most important events in carcinogenesis. Two alterations in the pattern of DNA methylation in breast cancer (BC) have been previously reported; active estrogen receptor-α (ER-α) is associated with decreased methylation of ER-α target (ERT) genes, and polycomb group target (PCGT) genes are more likely than other genes to have promoter DNA hypermethylation in cancer. However, whether DNA methylation in normal unrelated cells is associated with BC risk, and whether these imprints can be related to factors which can be modified by the environment, is unclear. Methodology/Principal Findings: Using quantitative methylation analysis in a case-control study (n = 1,083) we found that DNA methylation of peripheral blood cell DNA provides good prediction of BC risk. We also report that invasive ductal and invasive lobular BC are characterized by two different sets of genes, the latter in particular by genes involved in the differentiation of the mesenchyme (PITX2, TITF1, GDNF and MYOD1). Finally, we demonstrate that only ERT genes predict ER-positive BC; lack of peripheral blood cell DNA methylation of ZNF217 predicted BC independent of age and family history (odds ratio 1.49; 95% confidence interval 1.12-1.97; P = 0.006) and was associated with ER-α bioactivity in the corresponding serum. Conclusion/Significance: This first large-scale epigenotyping study demonstrates that DNA methylation may serve as a link between the environment and the genome. Factors that can be modulated by the environment (like estrogens) leave an imprint in the DNA of cells that are unrelated to the target organ and indicate the predisposition to develop a cancer. Further research will need to demonstrate whether DNA methylation profiles will be able to serve as a new tool to predict the risk of developing chronic diseases with sufficient accuracy to guide preventive measures.
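The ZNF217 result reported above (OR 1.49, 95% CI 1.12-1.97, P = 0.006) is internally consistent: a 95% confidence interval for an odds ratio is roughly symmetric about ln(OR), and the standard error it implies reproduces the stated p-value under a Wald test. A quick consistency check, assuming a standard normal-approximation CI (this is a sketch, not the study's actual analysis):

```python
import math

or_est, ci_lo, ci_hi = 1.49, 1.12, 1.97  # values from the abstract

# Standard error of log(OR) implied by a 95% CI of exp(log(OR) +/- 1.96*SE)
se = (math.log(ci_hi) - math.log(ci_lo)) / (2 * 1.96)
z = math.log(or_est) / se

# Two-sided p-value from the standard normal distribution
p = math.erfc(abs(z) / math.sqrt(2))
print(f"z = {z:.2f}, p = {p:.3f}")  # p ~ 0.006, matching the reported P = 0.006
```

This kind of back-calculation is a common sanity check when an abstract reports an effect estimate, CI, and p-value together.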

    Long-Term Benefits from Early Antiretroviral Therapy Initiation in HIV Infection

    BACKGROUND: For people with HIV and CD4+ counts >500 cells/mm3, early initiation of antiretroviral therapy (ART) reduces serious AIDS and serious non-AIDS (SNA) risk compared with deferral of treatment until CD4+ counts are 500 cells/mm3; the excess risk of AIDS and SNA associated with delaying treatment initiation was diminished after ART initiation, but a persistent excess risk remained. (Funded by the National Institute of Allergy and Infectious Diseases and others.)

    Growth and CD4 patterns of adolescents living with perinatally acquired HIV worldwide, a CIPHER cohort collaboration analysis.

    INTRODUCTION Adolescents living with HIV are subject to multiple co-morbidities, including growth retardation and immunodeficiency. We describe growth and CD4 evolution during adolescence using data from the Collaborative Initiative for Paediatric HIV Education and Research (CIPHER) global project. METHODS Data were collected between 1994 and 2015 from 11 CIPHER networks worldwide. Adolescents with perinatally acquired HIV infection (APH) who initiated antiretroviral therapy (ART) before age 10 years, with at least one height or CD4 count measurement while aged 10-17 years, were included. Growth was measured using height-for-age Z-scores (HAZ, stunting if <-2 SD, WHO growth charts). Linear mixed-effects models were used to study the evolution of each outcome between ages 10 and 17. For growth, sex-specific models with fractional polynomials were used to model non-linear relationships for age at ART initiation, HAZ at age 10, and time, defined as current age from 10 to 17 years of age. RESULTS A total of 20,939 and 19,557 APH were included for the growth and CD4 analyses, respectively. Half were females, two-thirds lived in East and Southern Africa, and median age at ART initiation ranged from 7 years in sub-Saharan African regions. At age 10, stunting ranged from 6% in North America and Europe to 39% in the Asia-Pacific; 19% overall had CD4 counts <500 cells/mm3. Across adolescence, higher HAZ was observed in females and among those in high-income countries. APH with stunting at age 10 and those with late ART initiation (after age 5) had the largest HAZ gains during adolescence, but these gains were insufficient to catch up with non-stunted, early ART-treated adolescents. From age 10 to 16 years, mean CD4 counts declined from 768 to 607 cells/mm3. This decline was observed across all regions, in males and females.
CONCLUSIONS Growth patterns during adolescence differed substantially by sex and region, while CD4 patterns were similar, with an observed CD4 decline that needs further investigation. Early diagnosis and timely initiation of treatment in early childhood to prevent growth retardation and immunodeficiency are critical to improving APH growth and CD4 outcomes by the time they reach adulthood.
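The growth metric used in this analysis can be made concrete: a height-for-age Z-score standardizes an observed height against an age- and sex-specific reference median and standard deviation, with stunting defined as HAZ below -2 SD. A minimal sketch (the reference median and SD below are placeholder values for illustration, not figures from the WHO charts):

```python
def height_for_age_z(height_cm: float, ref_median_cm: float, ref_sd_cm: float) -> float:
    """Height-for-age Z-score: (observed height - reference median) / reference SD."""
    return (height_cm - ref_median_cm) / ref_sd_cm

def is_stunted(haz: float) -> bool:
    """Stunting as defined in the abstract: HAZ below -2 SD."""
    return haz < -2

# Placeholder reference values (median 138 cm, SD 6 cm) for illustration only
haz = height_for_age_z(125.0, 138.0, 6.0)
print(f"HAZ = {haz:.2f}, stunted = {is_stunted(haz)}")  # HAZ = -2.17, stunted = True
```

In practice the reference median and SD are looked up from the WHO growth-standard tables for the child's exact age and sex rather than fixed as constants.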