
    Characterization of health care utilization in patients receiving implantable cardioverter-defibrillator therapies: An analysis of the managed ventricular pacing trial.

    BACKGROUND: Implantable cardioverter-defibrillators (ICDs) are effective in terminating lethal arrhythmias, but little is known about the degree of health care utilization (HCU) after ICD therapies. OBJECTIVE: Using data from the managed ventricular pacing trial, we sought to identify the incidence and types of HCU in ICD patients after receiving ICD therapy (shocks or antitachycardia pacing [ATP]). METHODS: We analyzed HCU events (ventricular tachyarrhythmia [VTA]-related, heart failure-related, ICD implant procedure-related, ICD system-related, or other) and their association with ICD therapies (shocked ventricular tachycardia episode, ATP-terminated ventricular tachycardia episode, and inappropriately shocked episode). RESULTS: A total of 1879 HCUs occurred in 695 of 1030 subjects (80% primary prevention) and were classified as follows: 133 (7%) VTA-related, 373 (20%) heart failure-related, 97 (5%) implant procedure-related, 115 (6%) system-related, and 1160 (62%) other. Of 2113 treated VTA episodes, 1680 (80%) received ATP only and 433 (20%) received shocks. Stratifying VTA-related HCUs on the basis of the type of ICD therapy delivered, there were 25 HCUs per 100 shocked VTA episodes compared with 1 HCU per 100 ATP-terminated episodes. Inappropriate ICD shocks occurred in 8.7% of the subjects and were associated with 115 HCUs. The majority of HCUs (52%) began in the emergency department, and 66% of all HCUs resulted in hospitalization. CONCLUSION: For VTA-related HCUs, shocks are associated with a 25-fold increase in HCUs compared to VTAs treated by ATP only. Application of evidence-based strategies and automated device-based algorithms to reduce ICD shocks (higher rate cutoffs, use of ATP, and arrhythmia detection) may help reduce HCUs
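
The 25-fold contrast in the conclusion follows directly from the per-100-episode rates reported above; as a quick check, here is that arithmetic as a minimal Python sketch, using only the counts and rates quoted in this abstract (nothing is re-derived from the trial data):

```python
# Quick arithmetic using only the counts and rates quoted in the abstract.
treated_vta_episodes = 2113
atp_only_episodes, shocked_episodes = 1680, 433

print(f"ATP-only share: {100 * atp_only_episodes / treated_vta_episodes:.0f}%")  # ~80%
print(f"Shocked share:  {100 * shocked_episodes / treated_vta_episodes:.0f}%")   # ~20%

# Reported VTA-related HCU rates by therapy type:
hcus_per_100_shocked = 25    # 25 HCUs per 100 shocked VTA episodes
hcus_per_100_atp_only = 1    # 1 HCU per 100 ATP-terminated episodes
print(f"Rate ratio: {hcus_per_100_shocked / hcus_per_100_atp_only:.0f}-fold")    # 25-fold
```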

    Chiasma

    Newspaper reporting on events at the Boston University School of Medicine in the 1960s

Regional Patterns in the Otolith Chemistry of Juvenile Spotted Seatrout (Cynoscion nebulosus) Differ Under Contrasting Hydrological Regimes

The value of using otolith chemistry to characterize recruitment in terms of natal source regions depends on how consistently spatio-temporal variation can be resolved. The objective of this study was to compare regional classification patterns in the otolith chemistry of juvenile Spotted Seatrout (Cynoscion nebulosus) between two years experiencing disparate hydrological regimes and separated by a five-year interlude. Spatial patterns in the whole-otolith chemistry of juveniles of this estuarine-dependent species were compared between years using five otolith elements and two stable isotopes. Consistent size-related trends in uptake and deposition were evidenced by parallel ontogenetic relationships for six otolith variables. Nine natal regions were discerned equally well in both years, and region accounted for similar overall amounts of variation in the seven otolith variables in both years. However, the otolith variables did not distinguish the nine regions in the same manner in both years, and natal regions varied in how similar they were in otolith chemistry between years. Consequently, between-year cross-classification accuracy varied widely among regions, and geographic distance per se was unimportant for explaining regional patterns in otolith chemistry. Salinity correlated significantly with regional patterns in otolith chemistry in 2001, but not at all in 2006, when conditions were much drier. Regional patterns in individual otolith variables reflected either a general trend based on hydrology, a regional-local effect whereby geographically closer regions exhibited similar otolith chemistry, or a location-specific effect for which there was either no correlation in otolith concentration among regions between years or a significant but individualistic relationship. In addition to elucidating limitations of using otolith chemistry to identify natal source regions or to track fish movements, knowing more about how and why otolith chemistry varies could be used to address specific questions about early recruitment dynamics, or to aid in the development of more reliable instruments for discerning natal source contributions
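
Discerning natal regions from multi-element otolith chemistry, and testing how well one year's classification transfers to another, is commonly done with a discriminant-type classifier and cross-validation. Below is a minimal sketch of that general approach, assuming hypothetical files and column names (seven otolith variables plus a region label), not this study's actual data or workflow:

```python
# Sketch of region classification from whole-otolith chemistry; files,
# element names, and the "region" label are hypothetical stand-ins.
import pandas as pd
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

otolith_vars = ["Li", "Mg", "Mn", "Sr", "Ba", "d13C", "d18O"]  # 5 elements + 2 isotopes

df_2001 = pd.read_csv("otoliths_2001.csv")  # hypothetical files
df_2006 = pd.read_csv("otoliths_2006.csv")

clf = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())

# Within-year accuracy: how well the nine natal regions are discerned in 2001.
within_2001 = cross_val_score(clf, df_2001[otolith_vars], df_2001["region"], cv=5)
print("2001 within-year accuracy:", within_2001.mean())

# Between-year cross-classification: train on 2001, classify 2006 juveniles.
clf.fit(df_2001[otolith_vars], df_2001["region"])
cross_acc = (clf.predict(df_2006[otolith_vars]) == df_2006["region"]).mean()
print("2001 -> 2006 cross-classification accuracy:", cross_acc)
```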

    Neurocognitive Predictors of Treatment Response to Randomized Treatment in Adults with Tic Disorders

Tourette's disorder (TS) and chronic tic disorder (CTD) are neurodevelopmental disorders characterized by involuntary vocal and motor tics. Consequently, TS/CTD have been conceptualized as disorders of cognitive and motor inhibitory control. However, most neurocognitive studies have found comparable or superior inhibitory capacity among individuals with TS/CTD relative to healthy controls. These findings have led to the hypothesis that individuals with TS/CTD develop increased inhibitory control due to the constant need to inhibit tics. However, the role of cognitive control in TS/CTD is not yet understood, particularly in adults. To examine the role of inhibitory control in TS/CTD, the present study assessed the relationship between inhibitory control and treatment response in a large sample of adults with TS/CTD. As part of a large randomized trial comparing behavior therapy versus supportive psychotherapy for TS/CTD, a battery of tests, including tests of inhibitory control, was administered to 122 adults with TS/CTD at baseline. We assessed the association between neuropsychological test performance and change in symptom severity, and compared the performance of treatment responders and non-responders as defined by the Clinical Global Impression Scale. Results indicated that change in symptoms and treatment response were not associated with neuropsychological performance on tests of inhibitory control, intellectual ability, or motor function, regardless of type of treatment. The finding that significant change in the symptom severity of TS/CTD patients is not associated with impairment or change in inhibitory control, regardless of treatment type, suggests that inhibitory control may not be a clinically relevant facet of these disorders in adults
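
The two association checks described above reduce to a correlation between baseline test performance and symptom change, plus a between-group comparison of responders and non-responders. A minimal sketch of that kind of analysis follows; the file and column names are hypothetical stand-ins, not the trial's variables:

```python
# Sketch of the two checks described above; file and column names are
# hypothetical (inhibition_score = baseline inhibitory-control measure,
# symptom_change = pre-post change in tic severity, cgi_responder = 0/1).
import pandas as pd
from scipy import stats

df = pd.read_csv("ts_ctd_trial.csv")  # hypothetical file

# 1) Is baseline inhibitory control associated with symptom change?
r, p_corr = stats.pearsonr(df["inhibition_score"], df["symptom_change"])
print(f"r = {r:.2f}, p = {p_corr:.3f}")

# 2) Do CGI-defined responders and non-responders differ at baseline?
responders = df.loc[df["cgi_responder"] == 1, "inhibition_score"]
non_responders = df.loc[df["cgi_responder"] == 0, "inhibition_score"]
t, p_t = stats.ttest_ind(responders, non_responders)
print(f"t = {t:.2f}, p = {p_t:.3f}")
```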

    Device Therapies Among Patients Receiving Primary Prevention Implantable Cardioverter-Defibrillators in the Cardiovascular Research Network

BACKGROUND: Primary prevention implantable cardioverter-defibrillators (ICDs) reduce mortality in selected patients with left ventricular systolic dysfunction by delivering therapies (antitachycardia pacing or shocks) to terminate potentially lethal arrhythmias; inappropriate therapies also occur. We assessed device therapies among adults receiving primary prevention ICDs in 7 healthcare systems. METHODS AND RESULTS: We linked medical record data, adjudicated device therapies, and the National Cardiovascular Data Registry ICD Registry. Survival analysis evaluated therapy probability and predictors after ICD implant from 2006 to 2009, with attention to Centers for Medicare and Medicaid Services Coverage With Evidence Development subgroups: left ventricular ejection fraction, 31% to 35%; nonischemic cardiomyopathy <9 months' duration; and New York Heart Association class IV heart failure with cardiac resynchronization therapy defibrillator. Among 2540 patients, 35% were ≥65 years old, 26% were women, and 59% were white. During a median follow-up of 27 months, 738 (29%) received ≥1 therapy. Three-year therapy risk was 36% (appropriate, 24%; inappropriate, 12%). Appropriate therapy was more common in men (adjusted hazard ratio [HR], 1.84; 95% confidence interval [CI], 1.43-2.35). Inappropriate therapy was more common in patients with atrial fibrillation (adjusted HR, 2.20; 95% CI, 1.68-2.87), but less common among patients ≥65 years old versus younger (adjusted HR, 0.72; 95% CI, 0.54-0.95) and in recent implants (eg, in 2009 versus 2006; adjusted HR, 0.66; 95% CI, 0.46-0.95). In Centers for Medicare and Medicaid Services Coverage With Evidence Development analysis, inappropriate therapy was less common with cardiac resynchronization therapy defibrillator versus single chamber (adjusted HR, 0.55; 95% CI, 0.36-0.84); therapy risk did not otherwise differ for Centers for Medicare and Medicaid Services Coverage With Evidence Development subgroups. CONCLUSIONS: In this community cohort of primary prevention ICD patients, therapy delivery varied across demographic and clinical characteristics but did not differ meaningfully for Centers for Medicare and Medicaid Services Coverage With Evidence Development subgroups
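
The three-year therapy risk quoted here is one minus a Kaplan-Meier-type therapy-free probability at 36 months for time to first device therapy. A minimal sketch of that estimate with the lifelines library, using a hypothetical file and column names rather than the registry data:

```python
# Sketch of a 3-year cumulative therapy risk from time-to-first-therapy data;
# the file and column names are hypothetical, not the registry variables.
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("icd_followup.csv")  # months to first therapy or censoring

kmf = KaplanMeierFitter()
kmf.fit(durations=df["months_to_first_therapy"], event_observed=df["any_therapy"])

# Therapy-free probability at 36 months; its complement is the 3-year
# cumulative therapy risk (reported as 36% in the study).
therapy_free_36 = kmf.survival_function_at_times(36).iloc[0]
print(f"3-year therapy risk: {1 - therapy_free_36:.0%}")
```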

    Comparison of Inappropriate Shocks and Other Health Outcomes Between Single- and Dual-Chamber Implantable Cardioverter-Defibrillators for Primary Prevention of Sudden Cardiac Death: Results from the Cardiovascular Research Network Longitudinal Study of Implantable Cardioverter-Defibrillators

    Background In US clinical practice, many patients who undergo placement of an implantable cardioverter‐defibrillator (ICD) for primary prevention of sudden cardiac death receive dual‐chamber devices. The superiority of dual‐chamber over single‐chamber devices in reducing the risk of inappropriate ICD shocks in clinical practice has not been established. The objective of this study was to compare risk of adverse outcomes, including inappropriate shocks, between single‐ and dual‐chamber ICDs for primary prevention. Methods and Results We identified patients receiving a single‐ or dual‐chamber ICD for primary prevention who did not have an indication for pacing from 15 hospitals within 7 integrated health delivery systems in the Longitudinal Study of Implantable Cardioverter‐Defibrillators from 2006 to 2009. The primary outcome was time to first inappropriate shock. ICD shocks were adjudicated for appropriateness. Other outcomes included all‐cause hospitalization, heart failure hospitalization, and death. Patient, clinician, and hospital‐level factors were accounted for using propensity score weighting methods. Among 1042 patients without pacing indications, 54.0% (n=563) received a single‐chamber device and 46.0% (n=479) received a dual‐chamber device. In a propensity‐weighted analysis, device type was not significantly associated with inappropriate shock (hazard ratio, 0.91; 95% confidence interval, 0.59–1.38 [P=0.65]), all‐cause hospitalization (hazard ratio, 1.03; 95% confidence interval, 0.87–1.21 [P=0.76]), heart failure hospitalization (hazard ratio, 0.93; 95% confidence interval, 0.72–1.21 [P=0.59]), or death (hazard ratio, 1.19; 95% confidence interval, 0.93–1.53 [P=0.17]). Conclusions Among patients who received an ICD for primary prevention without indications for pacing, dual‐chamber devices were not associated with lower risk of inappropriate shock or differences in hospitalization or death compared with single‐chamber devices. This study does not justify the use of dual‐chamber devices to minimize inappropriate shocks
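
The propensity score weighting used for this comparison can be sketched as inverse-probability-of-treatment weights, estimated from baseline covariates, feeding a weighted time-to-event model for the first inappropriate shock. A minimal illustration with scikit-learn and lifelines follows; the file, covariate list, and column names are hypothetical, and this is not the study's actual model specification:

```python
# Sketch of an IPTW-weighted comparison of dual- vs single-chamber ICDs for
# time to first inappropriate shock; file, covariates, and columns are
# hypothetical, and this is not the study's model specification.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

df = pd.read_csv("icd_primary_prevention.csv")
covariates = ["age", "female", "lvef", "atrial_fibrillation"]  # assumed set

# Propensity of receiving a dual-chamber device given baseline covariates.
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["dual_chamber"])
ps = ps_model.predict_proba(df[covariates])[:, 1]

# Inverse probability of treatment weights (dual_chamber is a 0/1 indicator).
df["iptw"] = df["dual_chamber"] / ps + (1 - df["dual_chamber"]) / (1 - ps)

# Weighted Cox model for time to first inappropriate shock.
cph = CoxPHFitter()
cph.fit(
    df[["months_to_shock", "inappropriate_shock", "dual_chamber", "iptw"]],
    duration_col="months_to_shock",
    event_col="inappropriate_shock",
    weights_col="iptw",
    robust=True,  # robust variance is advisable with non-integer weights
)
print(cph.hazard_ratios_["dual_chamber"])
```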

    Aerobic exercise improves sleep in U. S. active duty service members following brief treatment for posttraumatic stress disorder symptoms

Introduction: Physical exercise is a lifestyle intervention that can positively impact aspects of physical and psychological health. There is a growing body of evidence suggesting that physical exercise, sleep, and PTSD are interrelated. This study investigated possible relationships. Three research questions were posed: (1) Did randomization to an aerobic exercise intervention reduce insomnia more than randomization to an intervention without exercise? (2) Did change in sleep predict change in PTSD symptoms? (3) Did change in sleep impact the relationship between exercise and PTSD symptom reductions? Methods: Data were collected from 69 treatment-seeking active duty service members with PTSD symptoms randomized into one of four conditions; two conditions included aerobic exercise, and two did not. Participants in the exercise groups exercised five times per week, keeping their heart rate >60% of their heart rate reserve for 20–25 min. Results: At baseline, 58% of participants reported moderate or severe insomnia. PTSD symptom severity decreased following treatment for all groups (p < 0.001). Participants randomized to exercise reported greater reductions in insomnia than those in the no-exercise group (p = 0.47). However, change in insomnia did not predict change in PTSD symptoms, nor did it significantly impact the relationship between exercise and PTSD symptom reductions. Discussion: Adding exercise to evidence-based treatments for PTSD could reduce sleep disturbance, a characteristic of PTSD not directly addressed with behavioral therapies. A better understanding of exercise as a lifestyle intervention that can reduce PTSD symptoms and insomnia is warranted
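
Research questions (2) and (3) amount to regressing change in PTSD symptoms on change in insomnia and then checking whether adjusting for insomnia change attenuates the exercise effect. A minimal sketch with statsmodels; the file and variable names (pre-post change scores, a 0/1 exercise indicator) are hypothetical, not the trial's measures:

```python
# Sketch of research questions (2) and (3); file and column names are
# hypothetical (ptsd_change/insomnia_change = pre-post change scores,
# exercise = 1 if randomized to an aerobic exercise condition).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ptsd_exercise_trial.csv")  # hypothetical file

# (2) Does change in insomnia predict change in PTSD symptoms?
m_sleep = smf.ols("ptsd_change ~ insomnia_change", data=df).fit()
print(m_sleep.summary())

# (3) Does adjusting for insomnia change attenuate the exercise effect?
m_exercise = smf.ols("ptsd_change ~ exercise", data=df).fit()
m_adjusted = smf.ols("ptsd_change ~ exercise + insomnia_change", data=df).fit()
print(m_exercise.params["exercise"], m_adjusted.params["exercise"])
```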

    Preparing for Climatic Change: The Water, Salmon, and Forests of the Pacific Northwest

The impacts of year-to-year and decade-to-decade climatic variations on some of the Pacific Northwest’s key natural resources can be quantified to estimate sensitivity to regional climatic changes expected as part of anthropogenic global climatic change. Warmer, drier years, often associated with El Niño events and/or the warm phase of the Pacific Decadal Oscillation, tend to be associated with below-average snowpack, streamflow, and flood risk, below-average salmon survival, below-average forest growth, and above-average risk of forest fire. During the 20th century, the region experienced a warming of 0.8 °C. Using output from eight climate models, we project a further warming of 0.5–2.5 °C (central estimate 1.5 °C) by the 2020s, 1.5–3.2 °C (central estimate 2.3 °C) by the 2040s, and an increase in precipitation except in summer. The foremost impact of a warming climate will be the reduction of regional snowpack, which presently supplies water for ecosystems and human uses during the dry summers. Our understanding of past climate also illustrates the responses of human management systems to climatic stresses, and suggests that a warming at the rate projected would pose significant challenges to the management of natural resources. Resource managers and planners currently have few plans for adapting to or mitigating the ecological and economic effects of climatic change
