
    Study design considerations for the Standardized Treatment of Pulmonary Exacerbations 2 (STOP2): A trial to compare intravenous antibiotic treatment durations in CF

    BACKGROUND: Pulmonary exacerbations (PEx) in cystic fibrosis (CF) are common and contribute to morbidity and mortality. Duration of IV antibiotic therapy to treat PEx varies widely in the US, and there are few data to guide treatment decisions. METHODS: We combined a survey of CF stakeholders with retrospective analyses of a recent observational study of CF PEx to design a multicenter, randomized, prospective study comparing the efficacy and safety of different durations of IV antibiotics for PEx to meet the needs of people with CF and their caregivers. RESULTS: IV antibiotic duration was cited as the most important PEx research question by responding CF physicians and the top concern among surveyed CF patients/caregivers. During PEx, forced expiratory volume in 1 s (FEV1% predicted) and symptom responses at 7-10 days of IV antibiotics identified two distinct groups: early robust responders (ERR), who subsequently experienced greater FEV1 improvements compared to non-ERR (NERR). In addition to greater FEV1 and symptom responses, only 14% of ERR patients were treated with IV antibiotics for >15 days, compared with 45% of NERR patients. CONCLUSIONS: A divergent trial design that evaluates subjects' interim improvement in FEV1 and symptoms to tailor randomization to IV treatment duration (10 vs. 14 days for ERR, 14 vs. 21 days for NERR) may alleviate physician and patient concerns about excess or inadequate treatment. Such a study has the potential to provide the evidence necessary to standardize IV antibiotic duration in CF PEx care, a first step to conducting PEx research on other treatment features.
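The divergent randomization scheme described in the conclusions can be sketched as follows. This is an illustrative outline only, not the trial's actual algorithm: the real ERR criteria combine FEV1 and symptom responses at 7-10 days of IV antibiotics, and the 8% FEV1 cutoff used here is a hypothetical placeholder.

```python
import random

def classify_response(fev1_improvement_pct, symptoms_improved):
    """Classify a subject as an early robust responder (ERR) or not (NERR).

    The trial's real criteria combine FEV1 and symptom responses at
    7-10 days of IV antibiotics; the 8% cutoff here is a made-up example.
    """
    return fev1_improvement_pct >= 8.0 and symptoms_improved

def randomize_duration(is_err, rng=random):
    """Randomize to duration arms tailored by interim response:
    ERR -> 10 vs. 14 days, NERR -> 14 vs. 21 days."""
    return rng.choice((10, 14)) if is_err else rng.choice((14, 21))

# Example: a subject with a 12% FEV1 improvement and improved symptoms
# classifies as ERR and is randomized to either 10 or 14 days.
arm = randomize_duration(classify_response(12.0, True))
assert arm in (10, 14)
```

The point of the divergent design is visible in the branch: no subject can be assigned a duration that is clearly too short for a poor responder or clearly too long for a robust one.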

    Lower Extremity Biomechanics and Self-Reported Foot-Strike Patterns Among Runners in Traditional and Minimalist Shoes

    The injury incidence rate among runners is approximately 50%. Some individuals have advocated using an anterior–foot-strike pattern to reduce ground reaction forces and injury rates that they attribute to a rear–foot-strike pattern. The proportion of minimalist shoe wearers who adopt an anterior–foot-strike pattern remains unclear

    Sex Promotes Spatial and Dietary Segregation in a Migratory Shorebird during the Non-Breeding Season

    Several expressions of sexual segregation have been described in animals, especially in those exhibiting conspicuous dimorphism. Outside the breeding season, segregation has been mostly attributed to size or age-mediated dominance or to trophic niche divergence. Regardless of the recognized implications for population dynamics, the ecological causes and consequences of sexual segregation are still poorly understood. We investigate the foraging habits of a shorebird showing reversed sexual dimorphism, the black-tailed godwit Limosa limosa, during the winter season, and found extensive segregation between sexes in spatial distribution, microhabitat use and dietary composition. Males and females exhibited high site-fidelity but differed in their distributions at estuary-scale. Male godwits (shorter-billed) foraged more frequently in exposed mudflats than in patches with higher water levels, and consumed more bivalves and gastropods and fewer polychaetes than females. Females tended to be more frequently involved and to win more aggressive interactions than males. However, the number of aggressions recorded was low, suggesting that sexual dominance plays a lesser role in segregation, although its importance cannot be ruled out. Dimorphism in the feeding apparatus has been used to explain sex differences in foraging ecology and behaviour of many avian species, but few studies confirmed that morphologic characteristics drive individual differences within each sex. We found a relationship between resource use and bill size when pooling data from males and females. However, this relationship did not hold for either sex separately, suggesting that differences in foraging habits of godwits are primarily a function of sex, rather than bill size. Hence, the exact mechanisms through which this segregation operates are still unknown. 
The recorded differences in spatial distribution and resource use might expose males and females to distinct threats, thus affecting population dynamics through differential mortality. Therefore, population models and effective conservation strategies should increasingly take sex-specific requirements into consideration.

    Population Genetic Analysis Infers Migration Pathways of Phytophthora ramorum in US Nurseries

    Recently introduced, exotic plant pathogens may exhibit low genetic diversity and be limited to clonal reproduction. However, rapidly mutating molecular markers such as microsatellites can reveal genetic variation within these populations and be used to model putative migration patterns. Phytophthora ramorum is the exotic pathogen, discovered in the late 1990s, that is responsible for sudden oak death in California forests and ramorum blight of common ornamentals. The nursery trade has moved this pathogen from source populations on the West Coast to locations across the United States, thus risking introduction to other native forests. We examined the genetic diversity of P. ramorum in United States nurseries by microsatellite genotyping 279 isolates collected from 19 states between 2004 and 2007. Of the three known P. ramorum clonal lineages, the most common and genetically diverse lineage in the sample was NA1. Two eastward migration pathways were revealed in the clustering of NA1 isolates into two groups, one containing isolates from Connecticut, Oregon, and Washington and the other isolates from California and the remaining states. This finding is consistent with trace forward analyses conducted by the US Department of Agriculture's Animal and Plant Health Inspection Service. At the same time, genetic diversities in several states equaled those observed in California, Oregon, and Washington, and two-thirds of multilocus genotypes exhibited limited geographic distributions, indicating that mutation was common during or subsequent to migration. Together, these data suggest that migration, rapid mutation, and genetic drift all play a role in structuring the genetic diversity of P. ramorum in US nurseries.
This work demonstrates that fast-evolving genetic markers can be used to examine the evolutionary processes acting on recently introduced pathogens and to infer their putative migration patterns, thus showing promise for the application of forensics to plant pathogens
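The state-level diversity comparisons above rest on summarizing multilocus microsatellite genotypes (MLGs). A minimal sketch of that bookkeeping, using invented allele data rather than the study's 279 isolates:

```python
from collections import Counter

# Each isolate's MLG is represented here as a tuple of allele sizes across
# loci; the states and allele values below are illustrative only.
isolates = {
    "CA": [(164, 170), (164, 172), (164, 170), (166, 170)],
    "CT": [(164, 170), (164, 170), (164, 170)],
}

def mlg_summary(genotypes):
    """Return (number of distinct MLGs, genotypic richness = distinct/total)."""
    counts = Counter(genotypes)
    return len(counts), len(counts) / len(genotypes)

for state, g in isolates.items():
    n_mlg, richness = mlg_summary(g)
    print(state, n_mlg, round(richness, 2))  # prints "CA 3 0.75" then "CT 1 0.33"
```

Genotypes restricted to a single state (here, the second CA genotype) are the pattern the study reads as evidence of mutation during or after migration.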

    Viral Mimicry of Cdc2/Cyclin-Dependent Kinase 1 Mediates Disruption of Nuclear Lamina during Human Cytomegalovirus Nuclear Egress

    The nuclear lamina is a major obstacle encountered by herpesvirus nucleocapsids in their passage from the nucleus to the cytoplasm (nuclear egress). We found that the human cytomegalovirus (HCMV)-encoded protein kinase UL97, which is required for efficient nuclear egress, phosphorylates the nuclear lamina component lamin A/C in vitro on sites targeted by Cdc2/cyclin-dependent kinase 1, the enzyme that is responsible for breaking down the nuclear lamina during mitosis. Quantitative mass spectrometry analyses, comparing lamin A/C isolated from cells infected with viruses either expressing or lacking UL97 activity, revealed UL97-dependent phosphorylation of lamin A/C on the serine at residue 22 (Ser22). Transient treatment of HCMV-infected cells with maribavir, an inhibitor of UL97 kinase activity, reduced lamin A/C phosphorylation by approximately 50%, consistent with UL97 directly phosphorylating lamin A/C during HCMV replication. Phosphorylation of lamin A/C during viral replication was accompanied by changes in the shape of the nucleus, as well as thinning, invaginations, and discrete breaks in the nuclear lamina, all of which required UL97 activity. As Ser22 is a phosphorylation site of particularly strong relevance for lamin A/C disassembly, our data support a model wherein viral mimicry of a mitotic host cell kinase activity promotes nuclear egress while accommodating viral arrest of the cell cycle

    The Changing Landscape for Stroke Prevention in AF: Findings From the GLORIA-AF Registry Phase 2

    Background GLORIA-AF (Global Registry on Long-Term Oral Antithrombotic Treatment in Patients with Atrial Fibrillation) is a prospective, global registry program describing antithrombotic treatment patterns in patients with newly diagnosed nonvalvular atrial fibrillation at risk of stroke. Phase 2 began when dabigatran, the first non–vitamin K antagonist oral anticoagulant (NOAC), became available. Objectives This study sought to describe phase 2 baseline data and compare these with the pre-NOAC era collected during phase 1. Methods During phase 2, 15,641 consenting patients were enrolled (November 2011 to December 2014); 15,092 were eligible. This pre-specified cross-sectional analysis describes eligible patients' baseline characteristics. Atrial fibrillation disease characteristics, medical outcomes, and concomitant diseases and medications were collected. Data were analyzed using descriptive statistics. Results Of the total patients, 45.5% were female; median age was 71 (interquartile range: 64, 78) years. Patients were from Europe (47.1%), North America (22.5%), Asia (20.3%), Latin America (6.0%), and the Middle East/Africa (4.0%). Most had high stroke risk (CHA2DS2-VASc [Congestive heart failure, Hypertension, Age ≥75 years, Diabetes mellitus, previous Stroke, Vascular disease, Age 65 to 74 years, Sex category] score ≥2; 86.1%); 13.9% had moderate risk (CHA2DS2-VASc = 1). Overall, 79.9% received oral anticoagulants, of whom 47.6% received NOAC and 32.3% vitamin K antagonists (VKA); 12.1% received antiplatelet agents; 7.8% received no antithrombotic treatment. For comparison, the proportion of phase 1 patients (of N = 1,063 all eligible) prescribed VKA was 32.8%, acetylsalicylic acid 41.7%, and no therapy 20.2%. In Europe in phase 2, treatment with NOAC was more common than VKA (52.3% and 37.8%, respectively); 6.0% of patients received antiplatelet treatment; and 3.8% received no antithrombotic treatment. 
In North America, 52.1%, 26.2%, and 14.0% of patients received NOAC, VKA, and antiplatelet drugs, respectively; 7.5% received no antithrombotic treatment. NOAC use was less common in Asia (27.7%), where 27.5% of patients received VKA, 25.0% antiplatelet drugs, and 19.8% no antithrombotic treatment. Conclusions The baseline data from GLORIA-AF phase 2 demonstrate that in newly diagnosed nonvalvular atrial fibrillation patients, NOAC have been highly adopted into practice, becoming more frequently prescribed than VKA in Europe and North America. Worldwide, however, a large proportion of patients remain undertreated, particularly in Asia and North America. (Global Registry on Long-Term Oral Antithrombotic Treatment in Patients With Atrial Fibrillation [GLORIA-AF]; NCT01468701)
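The CHA2DS2-VASc score that stratifies patients in this analysis (score ≥2 high risk, score 1 moderate) follows the standard published weights. The sketch below assumes boolean patient-record fields that are not part of the registry's actual data model:

```python
def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                 prior_stroke_tia, vascular_disease):
    """Compute the CHA2DS2-VASc stroke-risk score from standard weights."""
    score = 0
    score += 1 if chf else 0                                     # Congestive heart failure
    score += 1 if hypertension else 0                            # Hypertension
    score += 2 if age >= 75 else (1 if 65 <= age <= 74 else 0)   # Age >=75 (2) or 65-74 (1)
    score += 1 if diabetes else 0                                # Diabetes mellitus
    score += 2 if prior_stroke_tia else 0                        # Previous stroke/TIA
    score += 1 if vascular_disease else 0                        # Vascular disease
    score += 1 if female else 0                                  # Sex category (female)
    return score

# Example: a 71-year-old woman with hypertension scores 1 (age 65-74)
# + 1 (hypertension) + 1 (sex) = 3, placing her in the high-risk group.
assert cha2ds2_vasc(71, True, False, True, False, False, False) == 3
```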

    Association Between Running Shoe Characteristics and Lower Extremity Injuries in United States Military Academy Cadets.

    BACKGROUND: Running-related overuse injuries are very common among recreational runners, with the reported annual injury rates ranging from 39% to 85%. Relatively few large prospective cohort studies have been conducted to investigate injury risk associated with different running shoe characteristics, and the results of the existing studies are often contradictory. PURPOSE/HYPOTHESIS: The purpose was to investigate the relationship between running shoe characteristics and lower extremity musculoskeletal injury. It was hypothesized that the risk of injury would be increased in individuals wearing shoes with minimal torsional stiffness and heel height compared with those wearing shoes with greater levels of torsional stiffness and heel height. STUDY DESIGN: Cohort study; Level of evidence, 2. METHODS: The study included 1025 incoming cadets. Shoe torsional stiffness and heel height were calculated and recorded. Demographic data were recorded and analyzed as potential covariates. Lower extremity injuries sustained over 9 weeks during cadet basic training were documented by use of the Armed Forces Health Longitudinal Technology Application and the Cadet Illness and Injury Tracking System. Kaplan-Meier survival curves were estimated, with time to incident lower extremity injury as the primary outcome by level of the independent predictor variables. Risk factors or potential covariates were carried forward into multivariable Cox proportional hazards regression models. Absolute and relative risk reduction and numbers needed to treat were calculated. RESULTS: Approximately 18.1% of participants incurred a lower extremity injury. Cadets wearing shoes with moderate lateral torsional stiffness were 49% less likely to incur any type of lower extremity injury and 52% less likely to incur an overuse lower extremity injury than cadets wearing shoes with minimal lateral torsional stiffness, both of which were statistically significant observations. 
    Injury risk was similar among cadets wearing shoes with minimal and extreme lateral torsional stiffness. CONCLUSION: Shoes with mild to moderate lateral torsional stiffness may be appropriate for reducing the risk of lower extremity injury in cadets. Shoes with minimal lateral torsional stiffness should be discouraged in this population.
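The absolute and relative risk reduction and number-needed-to-treat calculations named in the methods reduce to simple arithmetic. A sketch with hypothetical group risks (the abstract does not report the group-level injury proportions):

```python
def risk_metrics(risk_control, risk_treatment):
    """Return (ARR, RRR, NNT) comparing a treatment group to a control group."""
    arr = risk_control - risk_treatment   # absolute risk reduction
    rrr = arr / risk_control              # relative risk reduction
    nnt = 1.0 / arr                       # number needed to treat
    return arr, rrr, nnt

# Hypothetical example: if 20% of a minimal-stiffness group and 10% of a
# moderate-stiffness group were injured, ARR = 0.10, RRR = 50%, NNT = 10.
arr, rrr, nnt = risk_metrics(0.20, 0.10)
assert abs(arr - 0.10) < 1e-9
assert abs(rrr - 0.50) < 1e-9
assert abs(nnt - 10.0) < 1e-6
```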