Predictors of pelvic tilt normalization: a multicenter study on the impact of regional and lower-extremity compensation on pelvic alignment after complex adult spinal deformity surgery.
The objective was to determine the degree of regional decompensation related to pelvic tilt (PT) normalization after complex adult spinal deformity (ASD) surgery. Operative ASD patients with 1 year of PT measurements were included. Patients with normalized PT at baseline were excluded. Predicted PT was compared to actual PT, tested for change from baseline, and then compared against age-adjusted, Scoliosis Research Society-Schwab, and global alignment and proportion (GAP) scores. Lower-extremity (LE) parameters included the cranial-hip-sacrum angle, cranial-knee-sacrum angle, and cranial-ankle-sacrum angle. LE compensation was set as the 1-year upper tertile compared with intraoperative baseline. Univariate analyses were used to compare normalized and nonnormalized data against alignment outcomes. Multivariable logistic regression analyses were used to develop a model consisting of significant predictors for normalization related to regional compensation. In total, 156 patients met the inclusion criteria (mean ± SD age 64.6 ± 9.1 years, BMI 27.9 ± 5.6 kg/m2, Charlson Comorbidity Index 1.9 ± 1.6). Patients with normalized PT were more likely to have overcorrected pelvic incidence minus lumbar lordosis and sagittal vertical axis at 6 weeks (p < 0.05). Patients with nonnormalized PT had higher rates of LE compensation across joints (all p < 0.01). Overall, patients with normalized PT at 1 year had the greatest odds of resolving LE compensation (OR 9.6, p < 0.001). Patients with normalized PT at 1 year had lower rates of implant failure (8.9% vs 19.5%, p < 0.05), rod breakage (1.3% vs 13.8%, p < 0.05), and pseudarthrosis (0% vs 4.6%, p < 0.05) compared with patients with nonnormalized PT. The complication rate was significantly lower for patients with normalized PT at 1 year (56.7% vs 66.1%, p = 0.02), despite comparable health-related quality of life scores.
Patients with PT normalization had greater rates of resolution in thoracic and LE compensation, leading to lower rates of complications by 1 year. Thus, consideration of both the lower extremities and thoracic regions in surgical planning is vital to preventing adverse outcomes and maintaining pelvic alignment.
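The multivariable logistic regression described in the abstract can be sketched as follows. This is an illustrative reconstruction only: the predictor names, synthetic data, and coefficients below are hypothetical and do not come from the study.

```python
# Hypothetical sketch of a multivariable logistic regression predicting 1-year
# pelvic tilt (PT) normalization from compensation-related predictors.
# All features and data here are synthetic and illustrative, not the study's.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 156  # cohort size reported in the abstract
X = np.column_stack([
    rng.normal(25, 8, n),    # baseline pelvic tilt (deg) -- assumed predictor
    rng.normal(10, 12, n),   # PI-LL mismatch (deg) -- assumed predictor
    rng.integers(0, 2, n),   # LE compensation in upper tertile (0/1) -- assumed
])
# Synthetic binary outcome loosely tied to the predictors
logits = -0.05 * X[:, 0] - 0.04 * X[:, 1] - 1.2 * X[:, 2] + 1.5
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

model = LogisticRegression().fit(X, y)
# Exponentiated coefficients give per-unit odds ratios, the form in which the
# abstract reports effect sizes (e.g., OR 9.6 for resolving LE compensation)
odds_ratios = np.exp(model.coef_[0])
```

In practice, candidate predictors would first be screened with univariate tests, as the abstract describes, before entering the multivariable model.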
Depth- and curvature-based quantitative susceptibility mapping analyses of cortical iron in Alzheimer's disease.
In addition to amyloid beta plaques and neurofibrillary tangles, Alzheimer's disease (AD) has been associated with elevated iron in deep gray matter nuclei using quantitative susceptibility mapping (QSM). However, only a few studies have examined cortical iron, using more macroscopic approaches that cannot assess layer-specific differences. Here, we conducted column-based QSM analyses to assess whether AD-related increases in cortical iron vary in relation to layer-specific differences in the type and density of neurons. We obtained global and regional measures of positive (iron) and negative (myelin, protein aggregation) susceptibility from 22 adults with AD and 22 demographically matched healthy controls. Depth-wise analyses indicated that global susceptibility increased from the pial surface to the gray/white matter boundary, with a larger slope for positive susceptibility in the left hemisphere for adults with AD than controls. Curvature-based analyses indicated larger global susceptibility for adults with AD versus controls; the right hemisphere versus left; and gyri versus sulci. Region-of-interest analyses identified similar depth- and curvature-specific group differences, especially for temporo-parietal regions. Finding that iron accumulates in a topographically heterogeneous manner across the cortical mantle may help explain the profound cognitive deterioration that differentiates AD from the slowing of general motor processes in healthy aging.
Integrating frame-level boundary detection and deepfake detection for locating manipulated regions in partially spoofed audio forgery attacks
A Critical Evaluation of the Biological Construct Skeletal Muscle Hypertrophy: Size Matters but So Does the Measurement
Characterizing epigenetic aging in an adult sickle cell disease cohort.
Sickle cell disease (SCD) affects ∼100 000 predominantly African American individuals in the United States, causing significant cellular damage, increased disease complications, and premature death. However, the contribution of epigenetic factors to SCD pathophysiology remains relatively unexplored. DNA methylation (DNAm), a primary epigenetic mechanism for regulating gene expression in response to the environment, is an important driver of normal cellular aging. Several DNAm epigenetic clocks have been developed to serve as a proxy for cellular aging. We calculated the epigenetic ages of 89 adults with SCD (mean age, 30.64 years; 60.64% female) using 5 published epigenetic clocks: Horvath, Hannum, PhenoAge, GrimAge, and DunedinPACE. We hypothesized that in chronic disease, such as SCD, individuals would demonstrate epigenetic age acceleration, but the results differed depending on the clock used. Recently developed clocks more consistently demonstrated acceleration (GrimAge, DunedinPACE). Additional demographic and clinical phenotypes were analyzed to explore their association with epigenetic age estimates. Chronological age was significantly correlated with epigenetic age in all clocks (Horvath, r = 0.88; Hannum, r = 0.89; PhenoAge, r = 0.85; GrimAge, r = 0.88; DunedinPACE, r = 0.34). The SCD genotype was associated with 2 clocks (PhenoAge, P = .02; DunedinPACE, P < .001). Genetic ancestry, biological sex, β-globin haplotypes, BCL11A rs11886868, and SCD severity were not associated. These findings, among the first to interrogate epigenetic aging in adults with SCD, demonstrate epigenetic age acceleration with recently developed epigenetic clocks but not older-generation clocks. Further development of epigenetic clocks may improve their predictive ability and utility for chronic diseases such as SCD.
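"Epigenetic age acceleration," as used above, is conventionally computed as the residual from regressing epigenetic age on chronological age. The sketch below illustrates that convention with synthetic data; the numbers are not from the study, and a real analysis would start from clock estimates derived from DNAm arrays.

```python
# Illustrative computation of epigenetic age acceleration as the residual of
# epigenetic age regressed on chronological age. Data are synthetic stand-ins
# for a published clock's estimates (e.g., Horvath, GrimAge), not study data.
import numpy as np

rng = np.random.default_rng(1)
chron_age = rng.uniform(18, 50, 89)                # 89 adults, as in the cohort
epi_age = 1.1 * chron_age + rng.normal(2, 3, 89)   # synthetic clock estimates

# Least-squares fit of epi_age ~ chron_age; residuals = age acceleration.
# Positive residuals indicate faster-than-expected epigenetic aging.
slope, intercept = np.polyfit(chron_age, epi_age, 1)
acceleration = epi_age - (slope * chron_age + intercept)

# Pearson correlation, analogous to the per-clock r values reported above
r = np.corrcoef(chron_age, epi_age)[0, 1]
```

By construction the residuals are mean-zero across the cohort, so acceleration is interpreted relative to the sample, and group comparisons (e.g., by genotype) test whether subgroups sit systematically above or below the regression line.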
Factors Affecting Post-trial Sustainment or De-implementation of Study Interventions: A Narrative Review.
In contrast to traditional randomized controlled trials, embedded pragmatic clinical trials (ePCTs) are conducted within healthcare settings with real-world patient populations. ePCTs are intentionally designed to align with health system priorities, leveraging existing healthcare system infrastructure and resources to ease intervention implementation and increase the likelihood that effective interventions translate into routine practice following the trial. The NIH Pragmatic Trials Collaboratory, funded by the National Institutes of Health (NIH), supports the conduct of large-scale ePCT Demonstration Projects that address major public health issues within healthcare systems. The Collaboratory has a unique opportunity to draw on the Demonstration Project experiences to generate lessons learned related to ePCTs and the dissemination and implementation of interventions tested in ePCTs. In this article, we use case studies from six completed Demonstration Projects to summarize the Collaboratory's experience with post-trial interpretation of results, and implications for sustainment (or de-implementation) of tested interventions. We highlight three key lessons learned. First, ineffective interventions (i.e., ePCT is null for the primary outcome) may be sustained if they have other measured benefits (e.g., secondary outcome or subgroup) or even perceived benefits (e.g., staff like the intervention). Second, effective interventions, even those solicited by the health system and/or designed with significant health system partner buy-in, may not be sustained if they require significant resources. Third, alignment with policy incentives is essential for achieving sustainment and scale-up of effective interventions.
Our experiences point to several recommendations to aid in considering post-trial sustainment or de-implementation of interventions tested in ePCTs: (1) include secondary outcome measures that are salient to health system partners; (2) collect all appropriate data to allow for post hoc analysis of subgroups; (3) collect experience data from clinicians and staff; (4) engage policy-makers before starting the trial.
Skeletal Muscle Protein Composition Adaptations to 10 Weeks of High-Load Resistance Training in Previously-Trained Males
Myelography Using Energy-Integrating Detector CT Versus Photon-Counting Detector CT for Detection of CSF-Venous Fistulas in Patients With Spontaneous Intracranial Hypotension.
Background: CSF-venous fistulas (CVFs) are an increasingly recognized cause of spontaneous intracranial hypotension (SIH) that are often diminutive in size and exceedingly difficult to detect by conventional imaging. Objective: This study's objective was to compare energy-integrating detector CT (EID-CT) myelography and photon-counting detector CT (PCD-CT) myelography in terms of image quality and diagnostic performance for detecting CVFs in patients with SIH. Methods: This retrospective study included 38 patients (15 men, 23 women; mean age, 55±10 years) with SIH who underwent both clinically indicated EID-CT myelography (slice thickness, 0.625 mm) and PCD-CT myelography (slice thickness, 0.2 mm; performed in ultrahigh-resolution mode) to assess for CSF leak. Three blinded radiologists reviewed examinations in random order, assessing image noise, discernibility of spinal nerve root sleeves, and overall image quality using 0-100 scales (100=highest quality), and recording locations of CVFs. Definite CVFs were defined as CVFs described in CT myelography reports using unequivocal language and showing attenuation >70 HU. Results: For all readers, PCD-CT myelography, in comparison with EID-CT myelography, showed higher image noise (reader 1: 69±19 vs 38±15; reader 2: 59±9 vs 49±13; reader 3: 57±13 vs 43±15), higher nerve root sleeve discernibility (reader 1: 84±19 vs 30±14; reader 2: 84±19 vs 70±19; reader 3: 60±13 vs 52±12), and higher overall image quality (reader 1: 84±21 vs 40±15; reader 2: 81±10 vs 72±20; reader 3: 58±11 vs 53±11) (all p < .05). Conclusion: In comparison with EID-CT myelography, PCD-CT myelography yielded significantly improved image quality with significantly higher sensitivity for CVFs without significant loss of specificity. Clinical Impact: The findings support a potential role of PCD-CT myelography in facilitating earlier diagnosis and targeted treatment of SIH, avoiding high morbidity during potentially prolonged diagnostic workups.