
    Inpatient Opioid Use Varies by Construct Length Among Laminoplasty Versus Laminectomy and Fusion Patients

    BACKGROUND: Laminoplasty (LP) and laminectomy and fusion (LF) are utilized to achieve decompression in patients with symptomatic degenerative cervical myelopathy (DCM). Comparative analyses aimed at determining outcomes and clarifying indications between these procedures represent an area of active research. Accordingly, we sought to compare inpatient opioid use between LP and LF patients and to determine if opioid use correlated with length of stay. METHODS: Sociodemographic information, surgical and hospitalization data, and medication administration records were abstracted for patients >18 years of age who underwent LP or LF for DCM in the Mass General Brigham (MGB) health system between 2017 and 2019. Specifically, morphine milligram equivalents (MME) of oral and parenteral pain medication given after arrival in the recovery area until discharge from the hospital were collected. Categorical variables were analyzed using chi-squared analysis or Fisher exact test when appropriate. Continuous variables were compared using independent samples t tests. RESULTS: One hundred eight patients underwent LF, while 138 patients underwent LP. Total inpatient opioid use was significantly higher in the LF group (312 vs. 260 MME, p=.03); this difference was primarily driven by higher postoperative day 0 pain medication requirements. Furthermore, more LF patients required high-dose (>80 MME/day) regimens. While length of stay was significantly different between groups, with LF patients staying approximately 1 additional day, postoperative day 0 MME was not a significant predictor of this difference. When operative levels including C2, T1, and T2 were excluded, the differences in total opioid use and average length of stay lost significance. CONCLUSIONS: Inpatient opioid use and length of stay were significantly greater in LF patients compared to LP patients; however, when constructs including C2, T1, or T2 were excluded from analysis, these differences lost significance. Such findings highlight the impact of operative extent between these procedures. Future studies incorporating patient-reported outcomes and evaluating long-term pain needs will provide a more complete understanding of postoperative outcomes between these two procedures.
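
    To make the comparison concrete, the sketch below (Python) mirrors the tests named in the METHODS section: an independent-samples t test on total MME and a chi-squared or Fisher exact test on high-dose use. The file name and the columns group, total_mme, and high_dose are hypothetical placeholders, not taken from the study.

    import pandas as pd
    from scipy import stats

    # One row per patient: surgical group, total inpatient MME, high-dose flag (>80 MME/day).
    df = pd.read_csv("inpatient_opioids.csv")  # hypothetical file and column names

    lf = df.loc[df["group"] == "LF", "total_mme"]
    lp = df.loc[df["group"] == "LP", "total_mme"]

    # Continuous outcome: independent-samples t test on total inpatient MME.
    t_stat, p_mme = stats.ttest_ind(lf, lp, equal_var=False)

    # Categorical outcome: chi-squared test, falling back to Fisher exact for small cells.
    table = pd.crosstab(df["group"], df["high_dose"])
    chi2, p_high, _, expected = stats.chi2_contingency(table)
    if (expected < 5).any():
        _, p_high = stats.fisher_exact(table)

    print(f"Total MME, LF vs. LP: p={p_mme:.3f}")
    print(f"High-dose (>80 MME/day) regimens: p={p_high:.3f}")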

    Identification and Validation of Novel Cerebrospinal Fluid Biomarkers for Staging Early Alzheimer's Disease

    Ideally, disease-modifying therapies for Alzheimer disease (AD) will be applied during the 'preclinical' stage (pathology present with cognition intact) before severe neuronal damage occurs, or upon recognizing very mild cognitive impairment. Developing and judiciously administering such therapies will require biomarker panels to identify early AD pathology, classify disease stage, monitor pathological progression, and predict cognitive decline. To discover such biomarkers, we measured AD-associated changes in the cerebrospinal fluid (CSF) proteome. CSF samples from individuals with mild AD (Clinical Dementia Rating [CDR] 1) (n = 24) and cognitively normal controls (CDR 0) (n = 24) were subjected to two-dimensional difference-in-gel electrophoresis. Within 119 differentially abundant gel features, mass spectrometry (LC-MS/MS) identified 47 proteins. For validation, eleven proteins were re-evaluated by enzyme-linked immunosorbent assays (ELISA). Six of these assays (NrCAM, YKL-40, chromogranin A, carnosinase I, transthyretin, cystatin C) distinguished CDR 1 and CDR 0 groups and were subsequently applied (with tau, p-tau181 and Aβ42 ELISAs) to a larger independent cohort (n = 292) that included individuals with very mild dementia (CDR 0.5). Receiver-operating characteristic curve analyses using stepwise logistic regression yielded optimal biomarker combinations to distinguish CDR 0 from CDR>0 (tau, YKL-40, NrCAM) and CDR 1 from CDR<1 (tau, chromogranin A, carnosinase I) with areas under the curve of 0.90 (95% confidence interval [CI] 0.85-0.94) and 0.88 (CI 0.81-0.94), respectively. Four novel CSF biomarkers for AD (NrCAM, YKL-40, chromogranin A, carnosinase I) can improve the diagnostic accuracy of Aβ42 and tau. Together, these six markers describe six clinicopathological stages from cognitive normalcy to mild dementia, including stages defined by increased risk of cognitive decline. Such a panel might improve clinical trial efficiency by guiding subject enrollment and monitoring disease progression. Further studies will be required to validate this panel and evaluate its potential for distinguishing AD from other dementing conditions.
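
    A minimal Python sketch of the kind of ROC analysis described above: fitting a logistic regression on one reported biomarker combination (tau, YKL-40, NrCAM) and computing the area under the curve. The stepwise selection step is omitted, and the file and column names are hypothetical.

    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    df = pd.read_csv("csf_biomarkers.csv")  # hypothetical file; columns tau, ykl40, nrcam, cdr

    # Distinguish CDR 0 (cognitively normal) from CDR > 0 with one reported combination.
    X = df[["tau", "ykl40", "nrcam"]]
    y = (df["cdr"] > 0).astype(int)

    model = LogisticRegression().fit(X, y)
    scores = model.predict_proba(X)[:, 1]  # predicted probability of CDR > 0

    print(f"AUC for tau + YKL-40 + NrCAM (CDR 0 vs. CDR > 0): {roc_auc_score(y, scores):.2f}")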

    The seeds of divergence: the economy of French North America, 1688 to 1760

    Generally, Canada has been ignored in the literature on the colonial origins of divergence, with most of the attention going to the United States. Late nineteenth century estimates of income per capita show that Canada was relatively poorer than the United States and that within Canada, the French and Catholic population of Quebec was considerably poorer. Was this gap long-standing? Some evidence has been advanced for earlier periods, but it is quite limited and not well-suited for comparison with other societies. This thesis aims to contribute both to Canadian economic history and to comparative work on inequality across nations during the early modern period. With the use of novel prices and wages from Quebec—which was then the largest settlement in Canada and under French rule—a price index, a series of real wages and a measurement of Gross Domestic Product (GDP) are constructed. They are used to shed light both on the course of economic development until the French were defeated by the British in 1760 and on standards of living in that colony relative to the mother country, France, as well as the American colonies. The work is divided into three components. The first component relates to the construction of a price index. The absence of such an index has been a thorn in the side of Canadian historians, as it has limited their ability to obtain real values of wages, output and living standards. This index shows that prices did not follow any trend and remained at a stable level. However, there were episodes of wide swings—mostly due to wars and the monetary experiment of playing card money. The creation of this index lays the foundation for the next component. The second component constructs a standardized real wage series in the form of welfare ratios (the nominal wage rate multiplied by the length of the work year, divided by the cost of a consumption basket) to compare Canada with France, England and Colonial America. Two measures are derived. The first relies on a “bare bones” definition of consumption with a large share of land-intensive goods. This measure indicates that Canada was poorer than England and Colonial America and not appreciably richer than France. However, this measure overestimates the relative position of Canada to the Old World because of the strong presence of land-intensive goods. A second measure is created using a “respectable” definition of consumption in which the basket includes a larger share of manufactured goods and capital-intensive goods. This second basket better reflects differences in living standards since the abundance of land in Canada (and Colonial America) made it easy to achieve bare subsistence, but the scarcity of capital and skilled labor made the consumption of luxuries and manufactured goods (clothing, lighting, imported goods) highly expensive. With this measure, the advantage of New France over France evaporates and turns slightly negative. In comparison with Britain and Colonial America, the gap widens appreciably. This element is the most important for future research: by showing a reversal caused by a shift to a different type of basket, it demonstrates that Old World and New World comparisons are very sensitive to how the cost of living is measured. Furthermore, there are no sustained improvements in living standards over the period regardless of the measure used. Gaps in living standards observed later in the nineteenth century existed as far back as the seventeenth century.
In a wider American perspective that includes the Spanish colonies, Canada fares better. The third component computes a new series for Gross Domestic Product (GDP). This is to avoid problems associated with using real wages in the form of welfare ratios, which assume a constant labor supply. This assumption is hard to defend in the case of Colonial Canada, as there were many signs of increasing industriousness during the eighteenth and nineteenth centuries. The GDP series suggests no long-run trend in living standards (from 1688 to circa 1765). The long peace era of 1713 to 1740 was marked by modest economic growth, which offset a steady decline that had started in 1688, but by 1760 (as a result of constant warfare) living standards had sunk below their 1688 levels. These developments are accompanied by observations suggesting that other indicators of living standards declined: the flat-lining of incomes is accompanied by substantial increases in the amount of time worked, rising mortality and rising infant mortality. In addition, comparisons of incomes with the American colonies confirm the results obtained with wages—Canada was considerably poorer. At the end, a long conclusion provides an exploratory discussion of why Canada would have diverged early on. In structural terms, it is argued that the French colony was plagued by the problem of a small population, which prohibited the existence of scale effects. In combination with the fact that it was dispersed throughout the territory, the small population of New France limited the scope for specialization and economies of scale. However, this problem was in part created, and in part aggravated, by institutional factors like seigneurial tenure. The colonial origins of French America’s divergence from the rest of North America are thus partly institutional.
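
    The welfare-ratio measure used in the second component can be illustrated with a short Python sketch: annual earnings (daily wage times the length of the work year) divided by the annual cost of a consumption basket. The numbers below are hypothetical, chosen only to show how switching from a "bare bones" to a "respectable" basket changes the ratio; they are not the thesis's estimates.

    def welfare_ratio(daily_wage, days_worked, basket_cost):
        """Annual earnings divided by the annual cost of a consumption basket."""
        return (daily_wage * days_worked) / basket_cost

    # Hypothetical labourer: 1.5 livres/day over a 250-day work year.
    print(welfare_ratio(1.5, 250, 300))  # "bare bones" basket (300 livres): 1.25
    print(welfare_ratio(1.5, 250, 450))  # "respectable" basket (450 livres): ~0.83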

    Factors influencing the number of applications submitted per applicant to orthopedic residency programs

    Background: From 2002 to 2014, the orthopedic surgery residency applicant pool increased by 25% while the number of applications submitted per applicant rose by 69%, resulting in an increase of 109% in the number of applications received per program. Objective: This study aimed to identify applicant factors associated with an increased number of applications to orthopedic surgery residency programs. Design: An anonymous survey was sent to all applicants applying to the orthopedic surgery residency program at Loyola University. Questions were designed to define the number of applications submitted per respondent as well as the strength of their application. Of 733 surveys sent, 140 (19.1%) responses were received. Setting: An academic institution in Maywood, IL. Participants: Fourth-year medical students applying to the orthopedic surgery residency program at Loyola University. Results: An applicant's perception of how competitive he or she was (applicants who rated themselves as ‘average’ submitted more applications than those who rated themselves as either ‘good’ or ‘outstanding’, p=0.001) and the number of away rotations (those who completed >2 away rotations submitted more applications, p=0.03) were significantly associated with an increased number of applications submitted. No other responses were found to be associated with an increased number of applications submitted. Conclusion: Less qualified candidates are not applying to significantly more programs than their more qualified counterparts. The increasing number of applications represents a financial strain on the applicant, given the costs required to apply to more programs, and a time burden on individual programs to screen increasing numbers of applicants. In order to stabilize or reverse this alarming trend, orthopedic surgery residency programs should openly disclose admission criteria to prospective candidates, and medical schools should provide additional guidance for candidates in this process.

    Identification of conserved regulatory RNA structures in prokaryotic metabolic pathway genes

    A combination of algorithms, one searching RNA sequences for the potential to form secondary structure and another searching large numbers of sequences for structural similarity, was used to scan the 5′ UTRs of annotated genes in the Escherichia coli genome for regulatory RNA structures. Using this approach, similar RNA structures that regulate genes in the thiamin metabolic pathway were identified. In addition, several putative regulatory structures were discovered upstream of genes involved in other metabolic pathways including glycerol metabolism and ethanol fermentation. The results demonstrate that this computational approach is a powerful tool for discovery of important RNA structures within prokaryotic organisms.
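
    A rough Python sketch of the first step of such an approach, predicting minimum free energy secondary structures for candidate 5′ UTRs. It assumes the ViennaRNA Python bindings are installed, uses made-up example sequences, and leaves out the cross-sequence structural-similarity search.

    import RNA  # ViennaRNA Python bindings

    # Made-up 5' UTR sequences keyed by gene name (e.g. thiamin-pathway genes).
    utrs = {
        "thiC": "GGAUCACGCUAGCUUAGGCUAAGCGGCUAAGCUGAUCC",
        "thiM": "CCAUGGCGAUAGCUAGGCAUUAGCCGAUAGCGGCAUGG",
    }

    for gene, seq in utrs.items():
        structure, mfe = RNA.fold(seq)  # minimum free energy structure, dot-bracket notation
        print(f"{gene}: {structure}  ({mfe:.1f} kcal/mol)")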

    Outpatient spine clinic utilization is associated with reduced emergency department visits following spine surgery

    Study design: Review of TRICARE claims (2006-2014) data to assess Emergency Department (ED) utilization following spine surgery. Objective: The aim of this study was to determine utilization rates and predictors of ED utilization following spine surgical interventions. Summary of background data: Visits to the ED following surgical intervention represent an additional stress to the healthcare system. While factors associated with readmission following spine surgery have been studied, drivers of postsurgical ED visits, including appropriate and inappropriate use, remain underinvestigated. Methods: TRICARE claims were queried to identify patients who had undergone one of three common spine procedures (lumbar arthrodesis, discectomy, decompression). ED utilization at 30 and 90 days was assessed as the primary outcome. Outpatient spine surgical clinic utilization was considered the primary predictor variable. Multivariable logistic regression was used to adjust for confounders. Results: Between 2006 and 2014, 48,868 patients met inclusion criteria. Fifteen percent (n = 7183) presented to the ED within 30 days postdischarge. By 90 days, 29% of patients (n = 14,388) presented to an ED. The 30- and 90-day complication rates were 6% (n = 2802) and 8% (n = 4034), respectively, and readmission rates were 5% (n = 2344) and 8% (n = 3842), respectively. Use of outpatient spine clinic services significantly reduced the likelihood of ED utilization at 30 days [odds ratio (OR) 0.48; 95% confidence interval (CI) 0.46-0.53] and 90 days (OR 0.55; 95% CI 0.52-0.57). Conclusion: Within 90 days following spine surgery, 29% of patients sought care in the ED. However, only one-third of these patients had a complication recorded, and even fewer were readmitted. This suggests a high rate of unnecessary ED utilization. Outpatient utilization of spine clinics was the only factor independently associated with a reduced likelihood of ED utilization.
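
    The adjusted odds ratios reported above come from a multivariable logistic regression; the Python sketch below shows one way such a model could be fit with statsmodels, using hypothetical column names (ed_visit_30d, clinic_visit, etc.) and example confounders rather than the study's actual covariate set.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    claims = pd.read_csv("tricare_spine.csv")  # hypothetical extract, one row per patient

    # Outcome: ED visit within 30 days of discharge; primary predictor: outpatient
    # spine-clinic utilization, adjusted for example confounders.
    model = smf.logit("ed_visit_30d ~ clinic_visit + age + C(procedure) + C(sex)",
                      data=claims).fit()

    odds_ratios = np.exp(model.params)   # odds ratios per covariate
    conf_int = np.exp(model.conf_int())  # 95% CIs on the odds-ratio scale
    print(odds_ratios["clinic_visit"], conf_int.loc["clinic_visit"].tolist())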