
    Static vs. Expandable PEEK Interbody Cages: A Comparison of One-Year Clinical and Radiographic Outcomes for One-Level TLIF

    Introduction: Degenerative spine disease is a disabling condition affecting many worldwide. Transforaminal lumbar interbody fusion (TLIF) procedures help stabilize the spine while improving back and/or leg pain. With the introduction of new implant designs and modifications, focus has shifted to optimizing spinopelvic alignment, fusion rates, and more. This study aims to explore the effect of static versus expandable polyetheretherketone (PEEK) cages on patient-reported outcome measures (PROMs) and radiographic outcomes (subsidence, disk height, and alignment parameters). Materials/Methods: A retrospective cohort study was conducted using a database of patients at a single, high-volume academic center. Patient outcomes were obtained from charts, and radiographic outcomes were measured on standing lateral radiographs. Data were analyzed using t-tests or chi-squared tests, and multiple linear regression where appropriate. Results: Our results showed improved Oswestry Disability Index (ODI) scores perioperatively in the expandable cage group compared to the static cage group at the three-month and one-year time points. In addition, a significantly greater proportion of patients reached the minimal clinically important difference (MCID) in the expandable group compared to the static cage group. There were no significant differences in subsidence or alignment parameters between the two groups at one year. Conclusion: Overall, our results show that TLIF patients treated with expandable PEEK cages had significantly greater improvement in one-year outcomes compared to patients with static cages. Expandable cages confer the advantage of more precise insertion into the intervertebral disk space while providing a way to tailor cage height for better distraction and spinal alignment. Further prospective studies are warranted to better characterize the impact of interbody design on clinical and radiographic outcomes.

    Timing of Preoperative Surgical Antibiotic Prophylaxis Prior to One-to-Three Level Elective Lumbar Fusion

    Introduction: Surgical site infections (SSIs) in patients who have undergone lumbar fusion surgery can be a devastating complication. The aim of this study was to examine the relationship between the timing of preoperative antibiotic prophylaxis and adverse outcomes such as SSIs, with subgroup analysis stratified by antibiotic type. Methods: This retrospective cohort included patients who underwent a 1-3 level lumbar fusion and were allocated into 5 groups based on the time from antibiotic administration to incision (Group A = 0-15 min, Group B = 16-30 min, Group C = 31-45 min, Group D = 46-60 min, and Group E = 61+ min). The primary outcome was SSI; secondary outcomes included 90-day readmission and revision surgery. Statistical analysis, including a logistic regression model, was used to examine the relationship between patient/case characteristics, preoperative antibiotic timing, and postoperative outcomes. Results: Of 1,131 patients, 26 SSIs were identified. Logistic regression analysis revealed that Group E patients were 1.4 times more likely to develop an SSI and 7.2 times more likely to be readmitted within 90 days compared to Group A. Groups B-D did not have significantly higher odds of infection compared to Group A. Patients treated with vancomycin showed the lowest incidence of SSI when it was administered between 16 and 30 minutes before incision, whereas cefazolin showed the lowest incidence when administered between 0 and 15 minutes. Discussion: Preoperative antibiotic administration more than one hour before incision is associated with higher rates of SSI. Furthermore, vancomycin use appears optimal when administered between 16 and 30 minutes prior to incision. While the effects of antibiotic prophylaxis depend on patient characteristics and varying pharmacokinetics, an effective regimen can significantly improve patient outcomes.

    Establishing the Injury Severity of Subaxial Cervical Spine Trauma: Validating the Hierarchical Nature of the AO Spine Subaxial Cervical Spine Injury Classification System.

    STUDY DESIGN: Global cross-sectional survey. OBJECTIVE: To validate the AO Spine Subaxial Cervical Spine Injury Classification by examining perceived injury severity among surgeons across AO geographical regions and levels of practice experience. SUMMARY OF BACKGROUND DATA: Previous subaxial cervical spine injury classifications have been limited by subpar interobserver reliability and clinical applicability. In an attempt to create a universally validated scheme with prognostic value, AO Spine established a subaxial cervical spine injury classification involving four elements: (1) injury morphology, (2) facet injury involvement, (3) neurologic status, and (4) case-specific modifiers. METHODS: A survey was sent to 272 AO Spine members across all geographic regions and with a variety of practice experience. Respondents graded the severity of each variable of the classification system on a scale from zero (low severity) to 100 (high severity). The primary outcome was differences in perceived injury severity for each injury type across geographic regions and levels of practice experience. RESULTS: A total of 189 responses were received. Overall, the classification system exhibited a hierarchical progression in subtype injury severity scores. Only three subtypes showed a significant difference in injury severity score among geographic regions: F3 (floating lateral mass fracture, p = 0.04), N3 (incomplete spinal cord injury, p = 0.03), and M2 (critical disk herniation, p = 0.04). When stratified by surgeon experience, pairwise comparison showed that only two morphological subtypes, B1 (bony posterior tension band injury, p = 0.02) and F2 (unstable facet fracture, p = 0.03), and one neurologic subtype (N3, p = 0.02) exhibited a significant difference in injury severity score. CONCLUSIONS: The AO Spine Subaxial Cervical Spine Injury Classification System has been shown to be reliable and suitable for proper patient management.
The study shows this classification is substantially generalizable across geographic region and surgeon experience, and provides a consistent method of communication among physicians while covering the majority of subaxial cervical spine traumatic injuries. Level of Evidence: 4

    Regional and experiential differences in surgeon preference for the treatment of cervical facet injuries: a case study survey with the AO Spine Cervical Classification Validation Group

    Purpose: The management of cervical facet dislocation injuries remains controversial. The main purpose of this investigation was to identify whether a surgeon's geographic location or years in practice influences their preferred management of traumatic cervical facet dislocation injuries. Methods: A survey was sent to 272 AO Spine members across all geographic regions and with a variety of practice experience. The survey included clinical case scenarios of cervical facet dislocation injuries and asked responders to select preferences among various diagnostic and management options. Results: A total of 189 complete responses were received. Over 50% of responding surgeons in each region elected to initiate management of cervical facet dislocation injuries with an MRI, with 6 case exceptions. Overall, there was considerable agreement between American and European responders regarding management of these injuries, with only 3 cases exhibiting a significant difference. Additionally, results also exhibited considerable management agreement between those with ≤10 and >10 years of practice experience, with only 2 case exceptions noted. Conclusion: More than half of responders, regardless of geographical location or practice experience, identified MRI as a screening imaging modality when managing cervical facet dislocation injuries, regardless of the status of the spinal cord and prior to any additional intervention. Additionally, a majority of surgeons would elect an anterior approach for the surgical management of these injuries. The study found overall agreement in management preferences of cervical facet dislocation injuries around the globe. Keywords: Cervical spine · Trauma · Spinal injuries · Joint dislocations · Neck injuries · Spinal disease

    Perioperative Change in Cervical Lordosis and Health-Related Quality-of-Life Outcomes

    BACKGROUND: Surgeons have scrutinized spinal alignment and its impact on improving clinical outcomes following anterior cervical discectomy and fusion (ACDF). The primary analysis of this study examines the relationship between change in perioperative cervical lordosis (CL) and health-related quality-of-life (HRQOL) outcomes after ACDF. Secondary analysis evaluates the effects of fusion construct length on outcomes in patients grouped by preoperative cervical alignment. METHODS: A retrospective cohort study was performed on an institutional database including patients who underwent 1- to 3-level ACDF. C2-C7 CL was measured preoperatively and at final follow-up. For primary analysis, patients were classified based on their perioperative cervical lordotic correction: (1) kyphotic, (2) maintained, and (3) restored. For secondary analysis, patients were categorized based on their preoperative C2-C7 CL: (1) kyphotic, (2) neutral, and (3) lordotic. Demographics and perioperative change in patient-reported outcome measures were compared between groups. RESULTS: A total of 308 patients were included. A significant difference was noted among maintained, restored, and kyphotic groups in terms of delta physical composite Short Form-12 score (ΔPCS-12) (9.0 vs 10.3 vs 1.5). CONCLUSIONS: Significantly greater PCS-12 and VAS arm improvement was seen in patients whose cervical sagittal alignment was restored to neutral/lordotic compared with those who remained kyphotic. Multivariate analysis demonstrated no association between construct length and perioperative outcomes. CLINICAL RELEVANCE: The results of this study highlight the importance of sagittal alignment and restoration of CL after short-segment ACDF. Irrespective of preoperative sagittal alignment, the length of the ACDF fusion construct does not have a significant impact on clinical outcomes.

    Prospective, multicentre study of screening, investigation and management of hyponatraemia after subarachnoid haemorrhage in the UK and Ireland

    Background: Hyponatraemia often occurs after subarachnoid haemorrhage (SAH). However, its clinical significance and optimal management are uncertain. We audited the screening, investigation and management of hyponatraemia after SAH. Methods: We prospectively identified consecutive patients with spontaneous SAH admitted to neurosurgical units in the United Kingdom or Ireland. We reviewed medical records daily from admission to discharge, 21 days or death, and extracted all measurements of serum sodium to identify hyponatraemia (<135 mmol/L). Main outcomes were death/dependency at discharge or 21 days and admission duration >10 days. Associations of hyponatraemia with outcome were assessed using logistic regression with adjustment for predictors of outcome after SAH and admission duration. We assessed hyponatraemia-free survival using multivariable Cox regression. Results: 175/407 (43%) patients admitted to 24 neurosurgical units developed hyponatraemia, and 5976 serum sodium measurements were made. Serum osmolality, urine osmolality and urine sodium were measured in 30/166 (18%) hyponatraemic patients with complete data. The most frequent target daily fluid intake was >3 L, and this did not differ during hyponatraemic or non-hyponatraemic episodes. 42/164 (26%) patients with hyponatraemia received sodium supplementation. 133 (35%) patients were dead or dependent within the study period and 240 (68%) patients had hospital admissions lasting over 10 days. In the multivariable analyses, hyponatraemia was associated with less dependency (adjusted OR (aOR) = 0.35 (95% CI 0.17 to 0.69)) but longer admissions (aOR = 3.2 (1.8 to 5.7)). World Federation of Neurosurgical Societies grade I-III, modified Fisher 2-4 and posterior circulation aneurysms were associated with greater hazards of hyponatraemia.
Conclusions: In this comprehensive, multicentre, prospective adjusted analysis of patients with SAH, hyponatraemia was investigated inconsistently and, for most patients, was not associated with changes in management or clinical outcome. This work establishes a basis for the development of evidence-based, SAH-specific guidance for targeted screening, investigation and management of high-risk patients, to minimise the impact of hyponatraemia on admission duration and to improve consistency of patient care.

    Epigenetic control of cancer by neuropeptides

    Neuropeptides act as neurohormones, neurotransmitters and/or neuromodulators. They maintain physiological homeostasis and are paramount in the molecular mechanisms of disease progression and regulation, including in cancer. By definition, neuropeptides originate in and are secreted from neuronal cells; they signal to neighboring cells or, when acting as neurohormones, are released into the bloodstream. The majority of neuropeptides exert their functions through G protein-coupled receptors, with certain exceptions. Although previous studies indicate that neuropeptides support the proliferation of malignant cells in many types of solid tumor, antitumorigenic actions of neuropeptides and their receptors, for example in gastric cancers and chondrosarcoma, have also been reported. It is known that epigenetically modified chromatin regulates the molecular mechanisms involved in gene expression and malignant progression. Epigenetic modifications are heritable even though they do not cause changes in the DNA sequence; DNA methylation, histone modifications and miRNA expression are subject to such modifications. While there is substantial data on the epigenetic regulation of neuropeptides, the epigenetic control of cancer by neuropeptides remains largely uncharted territory. The aim of the current review is to describe the involvement of neuropeptides in the epigenetic machinery of cancer, based on data obtained from our laboratory and from other authors.

    Analysis of IL6-protein complexes in chondrosarcoma

    Cytokines produced in the tumour microenvironment serve important roles in cancer pathogenesis or in the suppression of disease progression. Metastatic chondrosarcoma is a cancer of the cartilage, and our group previously reported, using a human ELISA assay, that interleukin 6 (IL6) expression in JJ012 chondrosarcoma cells was 86-fold lower than that in C28 chondrocytes, indicating its role as an anti-inflammatory and anti-tumorigenic factor. Additionally, to the best of our knowledge, that study was the first to demonstrate downregulation of IL6 in a human chondrosarcoma cell line. To fully elucidate the effect of this IL6 downregulation, it is important to identify protein complexes and components that bind IL6 and potentially affect its gene expression directly or indirectly. To investigate the IL6-protein interactions underlying these differences in IL6 expression, the current study performed a gel retardation electrophoretic mobility shift assay (EMSA), followed by 2D gel electrophoresis, in-gel trypsin digestion and proteomic mass spectral analysis. The results indicated the presence of ubiquitination enzymes in C28 chondrocytes, while none were identified in JJ012 chondrosarcoma cells. While it may seem counterintuitive, the absence of ubiquitination of certain factors may lead to the downregulation of IL6 expression in human chondrosarcoma. Therefore, dysregulated ubiquitination may be among the possible mechanisms for the markedly reduced IL6 expression in chondrosarcoma.