Pain Care in the Department of Veterans Affairs: Understanding How a Cultural Shift in Pain Care Impacts Provider Decisions and Collaboration
OBJECTIVE: Over the past decade, the Department of Veterans Affairs (VA) has experienced a sizeable shift in its approach to pain. The VA's 2009 Pain Management Directive introduced the Stepped Care Model, which emphasizes an interdisciplinary approach to pain management involving pain referrals and management from primary to specialty care providers. Additionally, the Opioid Safety Initiative and 2017 VA/Department of Defense (DoD) clinical guidelines on opioid prescribing set a new standard for reducing opioid use in the VA. These shifts in pain care have led to new pain management strategies that rely on multidisciplinary teams and nonpharmacologic pain treatments. The goal of this study was to examine how the cultural transformation of pain care has impacted providers, the degree to which VA providers are aware of pain care services at their facilities, and their perceptions of multidisciplinary care and collaboration across VA disciplines.
METHODS: We conducted semistructured phone interviews with 39 VA clinicians in primary care, mental health, pharmacy, and physical therapy/rehabilitation at eight Veterans Integrated Service Network medical centers in New England.
RESULTS: We identified four major themes concerning interdisciplinary pain management approaches: 1) the culture of VA pain care has changed dramatically, with a greater focus on nonpharmacologic approaches to pain, though many old-school providers continue to prefer medication options; 2) most facilities in this sample have no clear roadmap about which pain treatment pathway to follow, with many providers unaware of which treatment to recommend, and when; 3) despite multiple options for pain treatment, VA multidisciplinary teams generally work together to ensure that veterans receive coordinated pain care; and 4) veteran preferences for care may not align with existing pain care pathways.
CONCLUSIONS: The VA has shifted its practices regarding pain management, with a greater emphasis on nonpharmacologic pain options. The proliferation of nonpharmacologic pain management strategies requires stakeholders to know how to choose among alternative treatments.
Screening, Brief Intervention, and Referral to Treatment for Pain Management for Veterans Seeking Service-Connection Payments for Musculoskeletal Disorders: SBIRT-PM Study Protocol
BACKGROUND: Veterans with significant chronic pain from musculoskeletal disorders are at risk of substance misuse. Veterans whose condition is the result of military service may be eligible for a disability pension. Department of Veterans Affairs compensation examinations, which determine the degree of disability and whether it was connected to military service, represent an opportunity to engage Veterans in pain management and substance use treatments. A multisite randomized clinical trial is testing the effectiveness and cost-effectiveness of Screening, Brief Intervention, and Referral to Treatment for Pain Management (SBIRT-PM) for Veterans seeking compensation for musculoskeletal disorders. This telephone-based intervention is delivered through a hub-and-spoke configuration.
DESIGN: This study is a two-arm, parallel-group, 36-week, multisite randomized controlled single-blind trial. It will randomize 1,100 Veterans experiencing pain and seeking service-connection for musculoskeletal disorders to either SBIRT-PM or usual care across eight New England VA medical centers. The study balances pragmatic with explanatory methodological features. Primary outcomes are pain severity and number of substances misused. Nonpharmacological pain management and substance use services utilization are tracked in the trial.
SUMMARY: Early trial enrollment targets were met across sites. SBIRT-PM could help Veterans, at the time of their compensation claims, use multimodal pain treatments and reduce existing substance misuse. Strategies to address COVID-19 pandemic impacts on the SBIRT-PM protocol have been developed to maintain its pragmatic and explanatory integrity.
Population genomics meet Lagrangian simulations: Oceanographic patterns and long larval duration ensure connectivity among Paracentrotus lividus populations in the Adriatic and Ionian seas
Connectivity between populations influences both their dynamics and the genetic structuring of species. In this study, we explored connectivity patterns of a marine species with long-distance dispersal, the edible common sea urchin Paracentrotus lividus, focusing mainly on the Adriatic-Ionian basins (Central Mediterranean). We applied a multidisciplinary approach integrating population genomics, based on 1,122 single nucleotide polymorphisms (SNPs) obtained from 2b-RAD in 275 samples, with Lagrangian simulations performed with a biophysical model of larval dispersal. We detected genetic homogeneity among eight population samples collected in the focal Adriatic-Ionian area, whereas weak but significant differentiation was found with respect to two samples from the Western Mediterranean (France and Tunisia). This result was not affected by the few putative outlier loci identified in our dataset. Lagrangian simulations found a significant potential for larval exchange among the eight Adriatic-Ionian locations, supporting the hypothesis of connectivity of P. lividus populations in this area. A peculiar pattern emerged from the comparison of our results with those obtained from published P. lividus cytochrome b (cytb) sequences, the latter revealing genetic differentiation in the same geographic area despite a smaller sample size and a lower power to detect differences. The comparison with studies conducted using nuclear markers on other species with similar pelagic larval durations in the same Adriatic-Ionian locations indicates species-specific differences in genetic connectivity patterns and warns against generalizing single-species results to the entire community of rocky shore habitats.
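The Lagrangian approach described above can be sketched in miniature: release virtual larvae at one site, advect them passively through a current field for their pelagic duration, and score connectivity as the fraction arriving within a settlement radius of another site. The velocity field, release points, and settlement radius below are made up for illustration; the study used a full biophysical ocean model, not this toy field.

```python
import math

def advect(particles, velocity, steps, dt):
    """Advance passive particles through a steady 2D current field (Euler steps).

    A toy stand-in for a biophysical larval-dispersal model: each particle
    is a larva carried by the current for its pelagic larval duration.
    """
    out = []
    for (x, y) in particles:
        for _ in range(steps):
            u, v = velocity(x, y)
            x += u * dt
            y += v * dt
        out.append((x, y))
    return out

# Idealized along-shore current (hypothetical; units are arbitrary km / time).
def current(x, y):
    return (0.1, 0.02 * math.sin(x))

# Release larvae at "site A"; count arrivals within 0.5 km of "site B".
release = [(0.0, 0.1 * i) for i in range(10)]
site_b = (4.3, 0.5)
final = advect(release, current, steps=432, dt=0.1)  # steps*dt ~ larval duration
settled = sum(1 for (x, y) in final
              if math.hypot(x - site_b[0], y - site_b[1]) <= 0.5)
connectivity = settled / len(release)
```

In a real application the velocity field comes from an ocean circulation model, particles carry biological traits (spawning time, mortality, vertical behaviour), and connectivity matrices over many sites are compared against the genetic differentiation estimates.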
Dissecting the Shared Genetic Architecture of Suicide Attempt, Psychiatric Disorders, and Known Risk Factors
BACKGROUND: Suicide is a leading cause of death worldwide, and nonfatal suicide attempts, which occur far more frequently, are a major source of disability and social and economic burden. Both have substantial genetic etiology, which is partially shared and partially distinct from that of related psychiatric disorders.
METHODS: We conducted a genome-wide association study (GWAS) of 29,782 suicide attempt (SA) cases and 519,961 controls in the International Suicide Genetics Consortium (ISGC). The GWAS of SA was conditioned on psychiatric disorders using GWAS summary statistics via multitrait-based conditional and joint analysis, to remove genetic effects on SA mediated by psychiatric disorders. We investigated the shared and divergent genetic architectures of SA, psychiatric disorders, and other known risk factors.
RESULTS: Two loci reached genome-wide significance for SA: the major histocompatibility complex and an intergenic locus on chromosome 7; the latter remained associated with SA after conditioning on psychiatric disorders and replicated in an independent cohort from the Million Veteran Program. This locus has been implicated in risk-taking behavior, smoking, and insomnia. SA showed strong genetic correlation with psychiatric disorders, particularly major depression, and also with smoking, pain, risk-taking behavior, sleep disturbances, lower educational attainment, reproductive traits, lower socioeconomic status, and poorer general health. After conditioning on psychiatric disorders, the genetic correlations between SA and psychiatric disorders decreased, whereas those with nonpsychiatric traits remained largely unchanged.
CONCLUSIONS: Our results identify a risk locus that contributes more strongly to SA than to other phenotypes and suggest a shared underlying biology between SA and known risk factors that is not mediated by psychiatric disorders.
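A locus "reaching genome-wide significance" means its single-SNP association test gives p < 5e-8, the conventional multiple-testing threshold for GWAS. A minimal sketch of such a test, using a Pearson chi-square on a 2x2 allele-count table with entirely made-up counts (the consortium's actual analysis used regression-based methods on individual cohorts plus meta-analysis, not this simple test):

```python
import math

def allele_chi2_p(a, b, c, d):
    """Pearson chi-square (1 df) test of allele counts at one SNP.

    Rows: cases / controls; columns: effect allele / other allele counts.
    For a 1-df chi-square statistic x, the p-value is erfc(sqrt(x / 2)).
    """
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    return chi2, math.erfc(math.sqrt(chi2 / 2))

# Hypothetical allele counts at a candidate SNP (cases enriched for the
# effect allele; numbers chosen only to illustrate the threshold).
chi2, p = allele_chi2_p(21000, 38000, 330000, 710000)
genome_wide = p < 5e-8  # conventional genome-wide significance threshold
```

With large case-control samples even a modest allele-frequency difference (here roughly 0.36 vs 0.32) yields an enormous chi-square statistic, which is why GWAS of this scale can detect small-effect loci.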
Increasing frailty is associated with higher prevalence and reduced recognition of delirium in older hospitalised inpatients: results of a multi-centre study
PURPOSE: Delirium is a neuropsychiatric disorder delineated by an acute change in cognition, attention, and consciousness. It is common, particularly in older adults, but poorly recognised. Frailty is the accumulation of deficits conferring an increased risk of adverse outcomes. We set out to determine how severity of frailty, as measured using the Clinical Frailty Scale (CFS), affected delirium rates and recognition in hospitalised older people in the United Kingdom.
METHODS: Adults over 65 years were included in an observational multi-centre audit across UK hospitals, comprising two prospective rounds and one retrospective note review. CFS score, delirium status, and 30-day outcomes were recorded.
RESULTS: The overall prevalence of delirium was 16.3% (483). Patients with delirium were more frail than patients without delirium (median CFS 6 vs 4). The risk of delirium was greater with increasing frailty [OR 2.9 (1.8–4.6) in CFS 4 vs 1–3; OR 12.4 (6.2–24.5) in CFS 8 vs 1–3]. Higher CFS was associated with reduced recognition of delirium [OR 0.7 (0.3–1.9) in CFS 4 compared with 0.2 (0.1–0.7) in CFS 8]. Both risks were independent of age and dementia.
CONCLUSION: We demonstrated an incremental increase in the risk of delirium with increasing frailty. This has important clinical implications, suggesting that frailty may provide a more nuanced measure of vulnerability to delirium and poor outcomes. However, the most frail patients are the least likely to have their delirium diagnosed, and there is a significant lack of research into the underlying pathophysiology of both of these common geriatric syndromes.
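The odds ratios with 95% confidence intervals reported above (e.g. OR 2.9 (1.8–4.6)) come from 2x2 tables of outcome by exposure. A minimal sketch of the standard calculation, an odds ratio with a Wald interval on the log scale; the counts below are hypothetical and do not reproduce the study's figures:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table.

    a: exposed cases, b: exposed non-cases,
    c: unexposed cases, d: unexposed non-cases.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: delirium vs no delirium among frail vs non-frail patients.
or_, lo, hi = odds_ratio_ci(60, 140, 20, 180)  # -> OR ~3.86, 95% CI ~(2.22, 6.70)
```

The study's estimates were additionally adjusted for age and dementia, which requires logistic regression rather than this single-table calculation, but the reported CIs are interpreted the same way.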