Physical Function and Quality of Life After Resection of Mobile Spine Chondrosarcoma.
Study Design: Retrospective cohort study. Objectives: (1) To assess patient-reported outcomes (physical function, pain, and quality of life) in patients who underwent resection of a mobile spine chondrosarcoma. (2) To assess complications (90 days), readmissions, reoperations, oncological outcomes, and neurologic status. Methods: Thirty-three patients who underwent resection of a conventional chondrosarcoma of the mobile spine between 1984 and 2014 at one hospital were included. The primary outcome measures, assessed at a minimum of 6 months after surgery, were the EuroQol 5 Dimensions (EQ5D), PROMIS-Physical Function, PROMIS-Pain Intensity, and the Oswestry Disability Index (ODI) or Neck Disability Index (NDI), obtained in 14 of the 20 surviving patients (70.0%). Complications, readmissions, reoperations, oncological outcomes, and neurological status were reported for the complete cohort of 33 patients. Results: After spine chondrosarcoma resection, patients (n = 14) reported worse physical function (median 43, range 22-61, P = .026), worse quality of life (median EQ5D 0.70, range 0.04-1, P = .022), and comparable pain intensity (median 47, range 31-56, P = .362) when compared with US general population values. The median NDI/ODI was 25 (range 0-72), indicating mild to moderate disability. Patients undergoing reoperation had worse patient-reported outcomes than those who did not. Eighteen of 33 patients (54.5%) suffered complications (90 days), 14 (42.4%) had an unplanned readmission, and 13 (39.4%) underwent reoperation. Intralesional resection was associated with increased readmission, reoperation, and recurrence rates. Conclusions: Chondrosarcoma affects quality of life and physical function, and its treatment frequently results in complications and reoperations. Our findings can be used to inform future patients about expected outcomes.
Prognostic Factors in Dedifferentiated Chondrosarcoma: A Retrospective Analysis of a Large Series Treated at a Single Institution.
Background: Dedifferentiated chondrosarcomas (DDCSs) are highly malignant tumors with a dismal prognosis and present a significant challenge in clinical management. Methods: In an IRB-approved retrospective protocol, we identified 72 patients with DDCS treated at our institution between 1993 and 2017 and reviewed clinicopathological characteristics, treatment modalities, and outcomes to analyze prognostic factors. Results: The femur (44.4%), pelvis (22.2%), and humerus (12.5%) were the most commonly involved sites. Twenty-three patients (31.9%) presented with distant metastasis, and 3 (4.2%) of them also had regional lymph node involvement. The median overall survival (OS) was 13.9 months. On multivariate analysis, pathological fracture, larger tumor size, lymph node involvement, metastasis at diagnosis, extraosseous extension, and an undifferentiated pleomorphic sarcoma component correlated with worse OS, whereas surgical resection and chemotherapy were associated with improved OS. For progression-free survival (PFS), pathological fracture and metastasis at diagnosis were associated with increased risk, while chemotherapy was associated with decreased risk. Among patients who received chemotherapy, doxorubicin and cisplatin were significantly associated with improved PFS but not OS. Among patients without metastasis at diagnosis, 17 (34.7%) developed local recurrence. Thirty-one (63.3%) developed distant metastases at a median interval of 18.1 months. On multivariate analysis, R1/R2 resection was associated with local recurrence, while a macroscopic dedifferentiated component was associated with distant metastasis. Conclusions: The prognosis of DDCS is poor. Complete resection remains a significant prognostic factor for local control. Chemotherapy with doxorubicin and cisplatin appears to be associated with improved PFS. Prospective, multicenter trials are warranted to further explore the effectiveness of chemotherapy in selected DDCS patients.
Seasonal and interannual effects of hypoxia on fish habitat quality in central Lake Erie
1. Hypoxia occurs seasonally in many stratified coastal marine and freshwater ecosystems when bottom dissolved oxygen (DO) concentrations are depleted below 2–3 mg O2 L⁻¹.
2. We evaluated the effects of hypoxia on fish habitat quality in the central basin of Lake Erie from 1987 to 2005, using bioenergetic growth rate potential (GRP) as a proxy for habitat quality (an illustrative sketch of this type of calculation follows this abstract). We compared the effect of hypoxia on habitat quality of (i) rainbow smelt, Osmerus mordax mordax Mitchill (young-of-year, YOY, and adult), a cold-water planktivore, (ii) emerald shiner, Notropis atherinoides Rafinesque (adult), a warm-water planktivore, (iii) yellow perch, Perca flavescens Mitchill (YOY and adult), a cool-water benthopelagic omnivore, and (iv) round goby, Neogobius melanostomus Pallas (adult), a eurythermal benthivore. Annual thermal and DO profiles were generated from 1D thermal and DO hydrodynamics models developed for Lake Erie’s central basin.
3. Hypoxia occurred annually, typically from mid-July to mid-October, spatially and temporally overlapping with otherwise high benthic habitat quality. Hypoxia reduced habitat quality across fish species and life stages, but the magnitude of the reduction varied both among and within species because of differences in tolerance to low DO levels and warm water temperatures.
4. Across years, trends in habitat quality mirrored trends in phosphorus concentration and water column oxygen demand in central Lake Erie. The per cent reduction in habitat quality owing to hypoxia was greatest for adult rainbow smelt and round goby (mean: -35%), followed by adult emerald shiner (mean: -12%), YOY rainbow smelt (mean: -10%) and YOY and adult yellow perch (mean: -8.5%).
5. Our results highlight the importance of differential, spatiotemporally interactive effects of DO and temperature on relative fish habitat quality and quantity. These effects have the potential to influence the performance of individual fish species as well as population dynamics, trophic interactions, and fish community structure.
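The abstract above uses bioenergetic growth rate potential (GRP) as its habitat-quality metric without spelling out the calculation. The following is a minimal Python sketch of how such an index can be built from temperature and dissolved oxygen, assuming a generic Wisconsin-style bioenergetics formulation; the function shapes, parameter values, and names (f_temperature, f_oxygen, growth_rate_potential) are illustrative placeholders, not the models or species parameters used in the study.

# Illustrative GRP sketch: realized consumption, scaled by temperature and DO,
# minus a temperature-dependent metabolic cost. All parameters are placeholders.

def f_temperature(temp_c, t_opt=18.0, t_max=28.0):
    # Dome-shaped thermal scaling of consumption (0 to 1).
    if temp_c <= 0.0 or temp_c >= t_max:
        return 0.0
    v = (t_max - temp_c) / (t_max - t_opt)
    return max(0.0, v * (temp_c / t_opt) ** (t_opt / (t_max - t_opt)))

def f_oxygen(do_mg_l, do_crit=2.0, do_sat=7.0):
    # Linear reduction of consumption between a critical DO and saturation.
    if do_mg_l >= do_sat:
        return 1.0
    return max(0.0, (do_mg_l - do_crit) / (do_sat - do_crit))

def growth_rate_potential(temp_c, do_mg_l, c_max=0.25, resp_coeff=0.08):
    # GRP (g g^-1 d^-1) = realized consumption minus respiration cost.
    consumption = c_max * f_temperature(temp_c) * f_oxygen(do_mg_l)
    respiration = resp_coeff * 1.05 ** (temp_c - 10.0)
    return consumption - respiration

# Example: warm, hypoxic bottom water yields negative GRP (poor habitat),
# while cool, well-oxygenated water yields positive GRP (good habitat).
print(growth_rate_potential(temp_c=24.0, do_mg_l=1.5))   # negative
print(growth_rate_potential(temp_c=18.0, do_mg_l=8.0))   # positive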
Measuring and Correcting Wind-Induced Pointing Errors of the Green Bank Telescope Using an Optical Quadrant Detector
Wind-induced pointing errors are a serious concern for large-aperture, high-frequency radio telescopes. In this paper, we describe the implementation of an optical quadrant detector instrument that can detect and provide a correction signal for wind-induced pointing errors on the 100 m diameter Green Bank Telescope (GBT). The instrument was calibrated using a combination of astronomical measurements and metrology. We find that the main wind-induced pointing errors on time scales of minutes are caused by the feedarm being blown along the direction of the wind vector. We also find that wind-induced structural excitation is virtually non-existent. We have implemented offline software to apply pointing corrections to the data from imaging instruments such as the MUSTANG 3.3 mm bolometer array, which can recover ~70% of sensitivity lost due to wind-induced pointing errors. We have also performed preliminary tests that show great promise for correcting these pointing errors in real-time using the telescope's subreflector servo system in combination with the quadrant detector signal.
Comment: 17 pages, 11 figures; accepted for publication in PAS
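The abstract does not describe how the quadrant detector signal is converted into a pointing correction; below is a minimal sketch of the standard quadrant-detector arithmetic, assuming four quadrant intensities and illustrative, uncalibrated scale factors rather than the GBT instrument's actual calibration from astronomical measurements and metrology.

# Illustrative conversion of quadrant-detector intensities into pointing offsets.
# Scale factors, sign conventions, and quadrant layout are placeholders.

def quadrant_offsets(q_a, q_b, q_c, q_d, scale_x=1.0, scale_y=1.0):
    # Return (x, y) displacement from the four quadrant signals.
    # Quadrants are assumed to be arranged A (top-left), B (top-right),
    # C (bottom-right), D (bottom-left).
    total = q_a + q_b + q_c + q_d
    if total <= 0:
        raise ValueError("no light on detector")
    x = scale_x * ((q_b + q_c) - (q_a + q_d)) / total  # right minus left
    y = scale_y * ((q_a + q_b) - (q_c + q_d)) / total  # top minus bottom
    return x, y

# Example: feedarm deflected toward +x (e.g., blown along the wind vector).
print(quadrant_offsets(0.9, 1.1, 1.1, 0.9))  # -> (0.1, 0.0) before scaling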
Lower Prevalence of Antibiotic-Resistant Enterococci on U.S. Conventional Poultry Farms that Transitioned to Organic Practices
Background: In U.S. conventional poultry production, antimicrobials are used for therapeutic, prophylactic, and nontherapeutic purposes. Researchers have shown that this can select for antibiotic-resistant commensal and pathogenic bacteria on poultry farms and in poultry-derived products. However, no U.S. studies have investigated on-farm changes in resistance as conventional poultry farms transition to organic practices and cease using antibiotics.
Tissue Microarray Immunohistochemical Detection of Brachyury Is Not a Prognostic Indicator in Chordoma
Brachyury is a marker for notochord-derived tissues and neoplasms, such as chordoma. However, the prognostic relevance of brachyury expression in chordoma is still unknown. The improvement of tissue microarray technology has provided the opportunity to perform analyses of tumor tissues on a large scale in a uniform and consistent manner. This study used tissue microarray to determine the expression of brachyury. Brachyury expression in chordoma tissues from 78 chordoma patients was analyzed by immunohistochemical staining of a tissue microarray. The clinicopathologic parameters, including gender, age, tumor location, and metastatic status, were evaluated. Fifty-nine of 78 (75.64%) tumors showed nuclear staining for brachyury; among them, 29 tumors (49.15%) showed 1+ (<30% positive cells) staining, 15 tumors (25.42%) had 2+ (31% to 60% positive cells) staining, and 15 tumors (25.42%) demonstrated 3+ (61% to 100% positive cells) staining. Brachyury nuclear staining was detected more frequently in sacral chordomas than in chordomas of the mobile spine. However, there was no significant relationship between brachyury expression and other clinical variables. By Kaplan-Meier analysis, brachyury expression showed no significant relationship with the overall survival rate. In conclusion, brachyury expression is not a prognostic indicator in chordoma.
Does the SORG Orthopaedic Research Group Hip Fracture Delirium Algorithm Perform Well on an Independent Intercontinental Cohort of Patients With Hip Fractures Who Are 60 Years or Older?
Background Postoperative delirium in patients aged 60 years or older with hip fractures adversely affects clinical and functional outcomes. The economic cost of delirium is estimated to be as high as USD 25,000 per patient, with a total budgetary impact between USD 6.6 and USD 82.4 billion annually in the United States alone. Forty percent of delirium episodes are preventable, and accurate risk stratification can decrease the incidence and improve clinical outcomes in patients. A previously developed clinical prediction model (the SORG Orthopaedic Research Group hip fracture delirium machine-learning algorithm) is highly accurate on internal validation (in 28,207 patients with hip fractures aged 60 years or older in a US cohort) in identifying at-risk patients, and it can facilitate the best use of preventive interventions; however, it has not been tested in an independent population. For an algorithm to be useful in real life, it must be valid externally, meaning that it must perform well in a patient cohort different from the cohort used to "train" it. Although there are many promising machine-learning prediction models, including delirium models, only a few have been externally validated, and even fewer have been validated in international cohorts. Question/purpose Does the SORG hip fracture delirium algorithm, initially trained on a database from the United States, perform well on external validation in patients aged 60 years or older in Australia and New Zealand? Methods We previously developed a model in 2021 for assessing risk of delirium in hip fracture patients using records of 28,207 patients obtained from the American College of Surgeons National Surgical Quality Improvement Program. Variables included in the original model were age, American Society of Anesthesiologists (ASA) class, functional status (independent or partially or totally dependent for any activities of daily living), preoperative dementia, preoperative delirium, and preoperative need for a mobility aid. To assess whether this model could be applied elsewhere, we used records from an international hip fracture registry. Between June 2017 and December 2018, 6672 patients older than 60 years of age in Australia and New Zealand were treated surgically for a femoral neck, intertrochanteric hip, or subtrochanteric hip fracture and entered into the Australian & New Zealand Hip Fracture Registry. Patients were excluded if they had a pathological hip fracture or septic shock. Of all patients, 6% (402 of 6672) did not meet the inclusion criteria, leaving 94% (6270 of 6672) of patients available for inclusion in this retrospective analysis. Seventy-one percent (4249 of 5986) of patients were aged 80 years or older, after accounting for 5% (284 of 6270) of missing values; 68% (4292 of 6266) were female, after accounting for 0.06% (4 of 6270) of missing values; and 83% (4690 of 5661) of patients were classified as ASA III/IV, after accounting for 10% (609 of 6270) of missing values. Missing data were imputed using the missForest methodology. In total, 39% (2467 of 6270) of patients developed postoperative delirium. The performance of the SORG hip fracture delirium algorithm on the validation cohort was assessed by discrimination, calibration, Brier score, and a decision curve analysis.
Discrimination, known as the area under the receiver operating characteristic curve (c-statistic), measures the model's ability to distinguish patients who achieved the outcome from those who did not, and ranges from 0.5 to 1.0, with 1.0 indicating the highest discrimination score and 0.50 the lowest. Calibration plots the predicted versus the observed probabilities; a perfect plot has an intercept of 0 and a slope of 1. The Brier score is a composite of discrimination and calibration, with 0 indicating perfect prediction and 1 the poorest. Results The SORG hip fracture algorithm, when applied to an external patient cohort, distinguished between patients at low risk and patients at moderate to high risk of developing postoperative delirium. The SORG hip fracture algorithm performed with a c-statistic of 0.74 (95% confidence interval 0.73 to 0.76). The calibration plot showed high accuracy in the lower predicted probabilities (intercept -0.28, slope 0.52), and the Brier score was 0.22 (the null model Brier score was 0.24). The decision curve analysis showed that the model can be beneficial compared with no model or with characterizing all patients as at risk of developing delirium. Conclusion Algorithms developed with machine learning are a potential tool for refining treatment of at-risk patients. If high-risk patients can be reliably identified, resources can be appropriately directed toward their care. Although the current iteration of SORG should not be relied on for patient care, it suggests potential utility in assessing risk. Further assessment in different populations, made easier by international collaborations and standardization of registries, would be useful in the development of universally valid prediction models. The model can be freely accessed at: https://sorg-apps.shinyapps.io/hipfxdelirium/
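A minimal sketch of how the validation metrics described above (c-statistic, calibration intercept and slope, and Brier score) are commonly computed from predicted probabilities and observed outcomes; it assumes scikit-learn, uses made-up example arrays rather than the registry cohort, and approximates the calibration intercept and slope from a single unconstrained logistic regression.

# Sketch of the external-validation metrics described above.
# y_true and y_prob are made-up example data, not the study cohort.
import numpy as np
from sklearn.metrics import roc_auc_score, brier_score_loss
from sklearn.linear_model import LogisticRegression

y_true = np.array([0, 0, 1, 0, 1, 1, 0, 1, 0, 1])                      # observed delirium (0/1)
y_prob = np.array([0.1, 0.3, 0.6, 0.2, 0.7, 0.4, 0.2, 0.8, 0.3, 0.5])  # predicted risk

# Discrimination: area under the ROC curve (0.5 = chance, 1.0 = perfect).
c_statistic = roc_auc_score(y_true, y_prob)

# Calibration: regress outcomes on the logit of predicted risk; a perfectly
# calibrated model has intercept 0 and slope 1. C=1e6 makes the fit
# effectively unpenalized.
logit = np.log(y_prob / (1 - y_prob)).reshape(-1, 1)
cal = LogisticRegression(C=1e6).fit(logit, y_true)
cal_intercept = cal.intercept_[0]
cal_slope = cal.coef_[0][0]

# Overall accuracy: Brier score (0 = perfect, 1 = worst).
brier = brier_score_loss(y_true, y_prob)

print(c_statistic, cal_intercept, cal_slope, brier)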
Plasmid origin of replication of herpesvirus papio: DNA sequence and enhancer function.
Herpesvirus papio (HVP) is a lymphotropic virus of baboons that is related to Epstein-Barr virus (EBV) and produces latent infection. The nucleotide sequence of the 5,775-base-pair (bp) EcoRI K fragment of HVP, which has previously been shown to confer the ability to replicate autonomously, has been determined. Within this DNA fragment is a region that bears structural and sequence similarity to the ori-P region of EBV. The HVP ori-P region has a tandem array of 10 copies of a 26-bp repeat, which is related to the tandem array of 20 copies of a 30-bp repeat in the EBV ori-P region. In HVP there is an intervening region of 764 bp followed by five partial copies of the 26-bp monomer. Both the EBV and HVP 3' regions have the potential to form dyad structures, which, however, differ in arrangement. We also demonstrate that a transcriptional enhancer which requires transactivation by a virus-encoded factor is present in the HVP ori-P.
Innovating carbon-capture biotechnologies through ecosystem-inspired solutions
Rising atmospheric carbon concentrations affect global health, the economy, and overall quality of life. We are fast approaching climate tipping points that must be addressed, not only by reducing emissions but also through new innovation and action toward carbon capture for sequestration and utilization (CCSU). In this perspective, we delineate next-generation biotechnologies for CCSU, supported by engineering design principles derived from ecological processes in three major biomes (plant-soil, deep biosphere, and marine). These biotechnologies are intended to interface with existing industrial infrastructure and, in some cases, tap into the carbon sink potential of nature. To develop ecosystem-inspired biotechnology, it is important to identify accessible control points for CO2 and CH4 within a given system, as well as value-chain opportunities that drive innovation. In essence, we must supplement natural biogeochemical carbon sinks with new bioengineering solutions.