Resident Case Series: The Utility of the Athletic Shoulder Test Using Wireless Portable Force Plates for Measuring Peak Force in NCAA Division One Collegiate Quarterbacks
BACKGROUND: Given the high incidence of shoulder injuries among quarterbacks due to the repetitive stress of the overhead throwing motion, there is a need for targeted, sport-specific assessments that reflect the functional demands of throwing. The Athletic Shoulder (ASH) test, previously validated in rugby and baseball populations, offers a potential solution for upper extremity strength assessment in football quarterbacks. The primary purpose of this case series was to explore the feasibility of conducting the ASH test as a method for evaluating upper extremity strength in overhead athletes, specifically collegiate quarterbacks.
STUDY DESIGN: Case series.
METHODS: Three NCAA Division I quarterbacks participated in weekly ASH testing over a nine-week period. Peak isometric force, peak force normalized to body weight, and limb symmetry indices were measured in the shoulder Y and T positions using portable force plates. Testing was completed pre-practice on non-game days to minimize fatigue effects and maintain consistency.
RESULTS: The dominant shoulder consistently produced greater peak force than the non-dominant side in both Y and T positions. The mean dominant-to-non-dominant limb symmetry index was 1.13 in the Y position and 1.14 in the T position. Within limbs, peak force was greater in the Y position than in the T position. In both positions and in both arms, athletes generated at least 14% of their body weight in force on average.
CONCLUSION: These findings can provide insight into strength asymmetries and functional performance benchmarks, enabling strength coaches, athletic trainers, and rehabilitation professionals to fine-tune training and rehabilitation programs. Incorporating the ASH test into a standardized assessment battery may enhance the ability to evaluate an athlete's readiness to train or compete, promoting a proactive approach to performance optimization and injury prevention. Its utility and actionable metrics make the ASH test a practical tool for in-season monitoring, allowing practitioners to make informed, data-driven adjustments throughout the season.
LEVEL OF EVIDENCE: Level 4
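The two reported metrics reduce to simple ratios. As a minimal sketch (the formulas are the standard definitions, assumed rather than quoted from the study's methods):

```python
# Hedged sketch of the two ASH-test metrics described above.
# Formulas are the standard definitions, assumed rather than
# taken verbatim from the study's methods section.

def normalized_force(peak_force: float, body_weight: float) -> float:
    """Peak isometric force expressed as a fraction of body weight
    (both values in the same unit, e.g. newtons)."""
    return peak_force / body_weight

def limb_symmetry_index(dominant: float, non_dominant: float) -> float:
    """Dominant-to-non-dominant ratio; 1.0 indicates perfect symmetry.
    The case series reports means of ~1.13 (Y) and ~1.14 (T)."""
    return dominant / non_dominant
```

For example, a quarterback producing 170 N on the dominant side versus 150 N on the non-dominant side in the Y position would have a symmetry index of about 1.13.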
Advances in Molecular Imaging of VEGFRs: Innovations in Imaging and Therapeutics
Vascular endothelial growth factor receptors (VEGFRs) are key regulators of angiogenesis, lymphangiogenesis, and vascular permeability, playing essential roles in both physiological and pathological processes. The VEGFR family, including VEGFR-1, VEGFR-2, and VEGFR-3, interacts with structurally related VEGF ligands (VEGFA, VEGFB, VEGFC, VEGFD, and placental growth factor [PlGF]), activating downstream signaling pathways that mediate critical cellular processes, including proliferation, migration, and survival. Dysregulation of VEGFR signaling has been implicated in numerous diseases, such as cancer, cardiovascular conditions, and inflammatory disorders. Targeting VEGFRs with radiopharmaceuticals, such as radiolabeled peptides, antibodies, and specific tracers like 64Cu-bevacizumab and 89Zr-ramucirumab, has emerged as a powerful strategy for non-invasive imaging of VEGFR expression and distribution in vivo. Through positron emission tomography (PET) and single-photon emission computed tomography (SPECT), these targeted tracers enable real-time visualization of angiogenic and lymphangiogenic activity, providing insights into disease progression and therapeutic responses. This review explores the current advances in VEGFR-targeted imaging, focusing on the development of novel tracers, radiolabeling techniques, and their in vivo imaging characteristics. We discuss the preclinical and clinical applications of VEGFR imaging, highlight existing challenges, and provide perspectives on future innovations that could further enhance precision diagnostics and therapeutic monitoring in angiogenesis- and lymphangiogenesis-driven diseases.
Effects of Long COVID on Healthcare Utilization
BACKGROUND: While most research on Long COVID (LC) has focused on symptoms and quality of life, there remains a critical need to better understand the effect of LC on resource utilization. This study sought to determine the type and amount of healthcare utilization among participants with versus without LC.
METHODS: This was a secondary analysis of a prospective, longitudinal, multicenter U.S. study of adult participants with symptomatic COVID-19, confirmed with testing, who completed 3-month post-infection surveys and had electronic health record data for at least 180 days pre- and post-index testing. We excluded participants with any COVID-19 infections within the 6 months following enrollment. Consistent with prior work, LC was defined as ≥3 post-infectious symptoms at 3 months, while those with <3 symptoms were categorized as not having LC. Our primary outcome was to compare the change in visit types between pre- and post-index testing (hospitalization, emergency department visit, office visit, procedure, telehealth, and other). As secondary outcomes, we assessed differences in visit complexity using the summative length of each encounter for each category as a measure of total healthcare usage.
RESULTS: A total of 847 participants met inclusion criteria (179 LC, 668 non-LC). When compared with the pre-index period, there was an overall increase in visit numbers of all six visit categories during the post-index period for all groups, most pronounced in office and telehealth visits. When compared with the non-LC group, the LC group was less likely to have ED visits (OR: 0.1; 95% CI 0.0-0.5). However, among those with LC who had at least one hospitalization, they were more likely to have additional hospitalizations (OR: 2.6; 95% CI 1.5-4.6). Visit length for office visits and hospitalization in the LC group was increased when compared with the non-LC group, though this diminished after adjustment for patient baseline characteristics.
CONCLUSIONS: All participants who were infected with SARS-CoV-2 had a marked increase in healthcare utilization during the subsequent 180 days. The LC group had significantly higher rates of additional hospitalization compared with those without LC, which may help to inform healthcare resource planning.
Sustained Impact: Long-Term Application of Diagnostic Uncertainty Communication Training
OBJECTIVES: More than one-third of discharged emergency department (ED) patients leave without a clear diagnosis for their symptoms. In 2019-2020, we implemented a simulation-based mastery learning curriculum across two academic medical centers to train emergency medicine residents to discuss diagnostic uncertainty during ED discharge, guided by the Uncertainty Communication Checklist (UCC). We sought to assess if this cohort continues to apply skills learned and to obtain trainee insights into the most valuable checklist items.
METHODS: A survey was emailed to all 109 participants who completed the training in 2019-2020. Questions assessed how often participants currently encountered uncertainty and used the skills learned in the curriculum. Additionally, participants rated how important it was to keep each of the 21 UCC items in future uncertainty communication training (4-point Likert scale: very important/important/low importance/not important). Means and proportions are reported.
RESULTS: Sixty-five individuals responded (60%). Mean age was 33 years, and 30.8% were female. More than 90% encountered diagnostic uncertainty more than once per shift, and 74% applied skills learned in the training often or all the time. Seven of the 21 UCC items were endorsed by more than 70% of respondents as very important to retain in future trainings. The item receiving the most endorsement as very important (90%) was to "Clearly state that either 'life-threatening' or 'dangerous' conditions have not been found." Items with lower rankings related to generally accepted communication best practices (e.g., "make eye contact").
CONCLUSION: Four years after completing the diagnostic uncertainty training, most respondents still frequently employ the communication skills they learned.
Secondary Malignancies in Patients with Meningioma: A Surveillance, Epidemiology, and End Results Data Analysis
BACKGROUND: The risk of secondary primary malignancies (SPMs) in meningioma patients is not well understood. In this unidirectional analysis, we evaluated the risk of SPMs occurring following a primary diagnosis of meningioma.
METHODS: The Surveillance, Epidemiology, and End Results (SEER-17) database (2000-2020) was used to identify 124,769 meningioma patients from a total of 9,208,295 cancer cases. Standardized incidence ratios (SIRs) were calculated using SEER's statistical analysis package to evaluate SPM risk. Basic demographic and treatment information was collected as well.
RESULTS: Of the 124,769 patients, 11,411 (9.2%) received diagnoses of an SPM, corresponding to a higher risk than in the general population (SIR, 1.17; 99% confidence interval [CI], 1.15-1.19). Patients with meningiomas had an increased risk of the following cancers: cutaneous melanoma (SIR, 1.40; 99% CI, 1.26-1.56), kidney and renal pelvis (SIR, 1.66; 99% CI, 1.47-1.86), brain and other nervous system (SIR, 3.45; 99% CI, 2.99-3.97), thyroid (SIR, 2.48; 99% CI, 2.19-2.80), and non-Hodgkin's lymphoma (SIR, 1.29; 99% CI, 1.15-1.44). Females were more predisposed to cancers of the lung (SIR, 1.19; 99% CI, 1.12-1.26), digestive system (SIR, 1.06; 99% CI, 1.01-1.12), and breast (SIR, 1.09; 99% CI, 1.04-1.14). Older patients demonstrated an increased risk of SPM development, with the 65-85-year-old group having an odds ratio of 9.06 (P = 0.009).
CONCLUSIONS: SEER data confirm an increased risk of SPMs following meningioma diagnosis. Further research may uncover shared genetic factors between meningioma and these SPMs, and increased awareness of SPM risk could inform future screening strategies.
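The SIRs above are observed-to-expected case ratios; SEER's software supplies exact Poisson confidence limits. A rough stand-in, using Byar's approximation to those limits and an expected count back-calculated from the reported SIR (both assumptions, not figures from the study), looks like:

```python
import math

def sir_with_ci(observed: int, expected: float, z: float = 2.5758):
    """Standardized incidence ratio (observed/expected) with Byar's
    approximation to the exact Poisson confidence limits.
    z = 2.5758 gives a 99% CI, matching the abstract's reporting."""
    sir = observed / expected
    # Byar's approximation: cube-root transform of the Poisson counts
    lower = observed * (1 - 1 / (9 * observed)
                        - z / (3 * math.sqrt(observed))) ** 3 / expected
    upper = (observed + 1) * (1 - 1 / (9 * (observed + 1))
                              + z / (3 * math.sqrt(observed + 1))) ** 3 / expected
    return sir, lower, upper
```

With the abstract's 11,411 observed SPMs and an expected count of roughly 9,753 (back-calculated from SIR = 1.17, an assumption), this approximation reproduces an SIR near 1.17 with a 99% CI close to the reported 1.15-1.19.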
Epigenetic Profiling of Cell-Free DNA in Cerebrospinal Fluid: A Novel Biomarker Approach for Metabolic Brain Diseases
Due to their clinical heterogeneity, nonspecific symptoms, and the limitations of existing biomarkers and imaging modalities, metabolic brain diseases (MBDs), such as mitochondrial encephalopathies, lysosomal storage disorders, and glucose metabolism syndromes, pose significant diagnostic challenges. This review examines the growing potential of cell-free DNA (cfDNA) derived from cerebrospinal fluid (CSF) epigenetic profiling as a dynamic, cell-type-specific, minimally invasive biomarker approach for MBD diagnosis and monitoring. We review important technological platforms and their use in identifying CNS-specific DNA methylation patterns indicative of neuronal injury, neuroinflammation, and metabolic reprogramming, including cfMeDIP-seq, enzymatic methyl sequencing (EM-seq), and targeted bisulfite sequencing. By synthesizing current findings across disorders such as MELAS, Niemann-Pick disease, Gaucher disease, GLUT1 deficiency syndrome, and diabetes-associated cognitive decline, we highlight the superior diagnostic and prognostic resolution offered by CSF cfDNA methylation signatures relative to conventional CSF markers or neuroimaging. We also address technical limitations, interpretive challenges, and translational barriers to clinical implementation. Ultimately, this review explores CSF cfDNA epigenetic analysis as a liquid biopsy modality. The central objective is to assess whether epigenetic profiling of CSF-derived cfDNA can serve as a reliable and clinically actionable biomarker for improving the diagnosis and longitudinal monitoring of metabolic brain diseases.
Chronic Cluster Headache Is a Rare Disease: Implications for Diagnosis, Treatment and Public Health
Cluster headache (CH) is a rare and painful primary headache disorder characterized by severe unilateral pain and cranial autonomic symptoms. This perspective examines the epidemiological evidence supporting the classification of chronic cluster headache (CCH) as a rare disease, noting a prevalence of CH of approximately 124 per 100,000 individuals, with only 3.5-13.7% manifesting CCH. This prevalence meets criteria established by both the US Food and Drug Administration and European Medicines Agency for rare disease designation. The rarity of CCH creates substantial clinical and research challenges, including prolonged diagnostic delays, limited research funding and a dearth of approved treatments. The economic burden is particularly notable, with annual costs exceeding €20,000 per patient. Addressing these challenges requires a coordinated approach focusing on increased research funding, enhanced policy advocacy, improved diagnostic training and the development of comprehensive disease registries to advance both patient care and scientific understanding of this devastating neurological condition.
Effects of Oxygen Manipulation on Myofibroblast Phenotypic Transformation in Patients With Radiation-Induced Fibrosis
We tested whether hyperoxic conditions can reduce the proportion of active myofibroblasts, which are assumed to be a major driver of head and neck radiation-induced fibrosis, as measured by expression levels of pro-fibrotic genes. Radiated, non-cancerous soft tissue from the head and neck and skin/soft tissue from the non-radiated flap donor site were collected from each patient. Myofibroblast density was quantified using immunofluorescence staining with α-SMA and DAPI and visualisation under confocal microscopy, and compared between baseline non-radiated and radiated tissue from the same patient. From each tissue specimen, fibroblast cell lines were cultured and exposed to either normoxic, hypoxic, or hyperoxic conditions for 10 days. Total RNA was extracted and reverse-transcribed, and gene expression levels were quantified using RT-PCR. Relative expression levels of the pro-fibrotic genes COL1A1, COL3A1, FN-EDA, α-SMA, HIF-1α, VEGFα, and VEGFR were compared between the normoxic, hypoxic, and hyperoxic treatment groups. Three patients with six total tissue samples were acquired. Radiated tissue contained a higher density of myofibroblasts (calculated as cells/m
Tumor Microenvironment Governs the Prognostic Landscape of Immunotherapy for Head and Neck Squamous Cell Carcinoma: A Computational Model-Guided Analysis
Immune checkpoint inhibition (ICI) has emerged as a critical treatment strategy for squamous cell carcinoma of the head and neck (HNSCC) that halts the immune escape of the tumor cells. Increasing evidence suggests that the onset and progression of HNSCC, and its lack of response to ICI, are emergent properties arising from the interactions within the tumor microenvironment (TME). Deciphering how the diversity of cellular and molecular interactions leads to distinct HNSCC TME subtypes that subsequently govern the ICI response remains largely unexplored. We developed a cellular-molecular model of the HNSCC TME that incorporates multiple cell types, cellular states and transitions, and molecularly mediated paracrine interactions. Simulation across the selected parameter space of the HNSCC TME network shows that distinct mechanistic balances within the TME give rise to the five clinically observed TME subtypes, including immune/non-fibrotic, immune/fibrotic, fibrotic-only, and immune/fibrotic desert. We predict that cancer-associated fibroblasts, beyond a critical proliferation rate, drastically worsen the ICI response by hampering the accessibility of CD8+ killer T cells to the tumor cells. Our analysis reveals that while an interleukin-2 (IL-2) + ICI combination therapy may improve response in the immune-desert scenario, Osteopontin (OPN) and Leukemia Inhibitory Factor (LIF) knockout with ICI yields the best response in a fibro-dominated scenario. Further, we predict that interleukin-8 (IL-8) and lactate can serve as crucial biomarkers for ICI-resistant HNSCC phenotypes. Overall, we provide an integrated quantitative framework that explains a wide range of TME-mediated resistance mechanisms for HNSCC and predicts TME subtype-specific targets that can lead to an improved ICI outcome.
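The abstract does not reproduce the model equations. As a deliberately toy illustration of the general modeling style, a two-variable tumor/effector ODE with an ICI-boosted kill term (my construction, not the authors' network) can be sketched as:

```python
# Toy tumor (T) / CD8 effector (E) dynamics, forward-Euler integrated.
# All parameters and the ICI "kill boost" term are illustrative
# assumptions, not values or structure from the study.

def step(T, E, dt=0.01, r=0.3, K=1.0, k=0.5, a=0.2, d=0.1, ici=0.0):
    dT = r * T * (1 - T / K) - k * (1 + ici) * E * T  # logistic growth minus immune kill
    dE = a * T - d * E                                # tumor-driven recruitment, natural decay
    return T + dt * dT, E + dt * dE

def final_tumor_burden(T0=0.1, E0=0.05, steps=5000, ici=0.0):
    """Integrate the toy system and return the end-of-run tumor burden."""
    T, E = T0, E0
    for _ in range(steps):
        T, E = step(T, E, ici=ici)
    return T
```

Even this caricature captures the qualitative point the abstract makes: switching the ICI term on lowers the steady-state tumor burden, while weakening the effector kill rate k (mimicking CAF-mediated exclusion of CD8+ T cells) raises it.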
Adjuvant Radiotherapy in pT3-4N0M0 Major Salivary Gland Cancer Without Other Adverse Features
OBJECTIVE: To investigate adjuvant radiotherapy (aRT) utilization and associated differences in overall survival (OS) in pT3-4N0M0 major salivary gland cancer (MSGC) without evidence of other adverse pathologic features.
METHODS: The 2006 to 2018 National Cancer Database was queried for patients with MSGC classified as pT3-4N0M0. Patients with evidence of lymphovascular invasion or positive surgical margins were excluded. Multivariable binary logistic and Cox regression models adjusting for patient demographics, pathologic features including histology, and treatment were implemented.
RESULTS: Of 897 patients satisfying inclusion criteria, 368 (41.0%) underwent aRT. Compared with those not undergoing aRT, patients undergoing aRT were younger (median [IQR] 61 [49-72] vs. 67 [53-78] years) and more frequently had adenoid cystic carcinoma (AdCC) (18.2% vs. 10.4%) and high-grade disease (33.7% vs. 21.6%) (P < 0.001). On multivariable binary logistic regression, age at diagnosis (aOR 0.98, 95% CI 0.97-0.99) and Black race (aOR 0.50, 95% CI 0.32-0.77) were associated with decreased odds of undergoing aRT (P < 0.005); AdCC (aOR 2.58, 95% CI 1.54-4.32), high grade (aOR 1.81, 95% CI 1.04-1.98), pT4 classification (aOR 1.43, 95% CI 1.04-1.98), and neck dissection (aOR 1.41, 95% CI 1.05-1.90) were associated with increased odds (P < 0.05). Patients undergoing aRT had higher 5-year OS than those not undergoing aRT on Kaplan-Meier (82.9% vs. 67.0%, P < 0.001) and multivariable Cox (aHR 0.50, 95% CI 0.37-0.66, P < 0.001) analyses.
CONCLUSION: aRT was utilized in approximately 40% of patients with pT3-4N0M0 MSGC without evidence of other adverse pathologic features. aRT was associated with higher OS, highlighting the continued need for efforts promoting guideline-recommended care.
LEVEL OF EVIDENCE: 4
LAY SUMMARY: pT3-4N0M0 major salivary gland cancer without evidence of other adverse pathologic features represents a rare, moderate-risk disease category with high variation in treatment strategies depending on histopathology and physician and patient preferences. Our study of this disease category suggests that adjuvant radiotherapy is associated with higher 5-year overall survival.