Neurologic improvement after thoracic, thoracolumbar, and lumbar spinal cord (conus medullaris) injuries
Study Design. Retrospective. Objective. With approximately 10,000 new spinal cord injury (SCI) patients in the United States each year, predicting outcomes is an important public health concern. Combining all regions of the spine in SCI trials may be misleading if the lumbar and sacral regions (conus) improve neurologically at different rates than the thoracic or thoracolumbar spinal cord.
Summary of Background Data. Over a 10-year period, between January 1995 and 2005, 1746 consecutive spine-injured patients were seen, evaluated, and treated through a level 1 trauma referral center. A retrospective analysis was performed on 150 patients meeting the criteria of T4 to S5 injury, excluding gunshot wounds. One-year follow-up data were available on 95 of these patients.
Methods. Contingency table analyses (chi-squared statistics) and multivariate logistic regression. Variables of interest included level of injury, initial American Spinal Injury Association (ASIA) grade, age, race, and etiology.
Results. A total of 92.9% of lumbar (conus) patients neurologically improved one ASIA level or more compared with 22.4% of thoracic or thoracolumbar spinal cord-injured patients. Only 7.7% of ASIA A patients showed neurologic improvement, compared with 95.2% of ASIA D patients; ASIA B patients demonstrated a 66.7% improvement rate, whereas ASIA C patients had an 84.6% improvement rate. When the two effects were considered jointly in a multivariate analysis, ASIA A thoracic/thoracolumbar patients had only a 4.1% rate of improvement, compared with 96% for lumbar (conus) and incomplete patients (ASIA B-D) and 66.7% to 72.2% for the rest of the patients. All of these relationships were significant at P < 0.001 (chi-square test). There was no link to age or gender, and race and etiology were secondary to region and severity of injury.
Conclusion. Thoracic (T4-T9) SCIs have the least potential for neurologic improvement. Thoracolumbar (T10-T12) and lumbar (conus) injuries have a greater neurologic improvement rate, which might be related to a greater proportion of lower motor neurons. Thus, defining the exact region of injury and potential for neurologic improvement should be considered in future clinical trial design. Combining all anatomic regions of the spine in SCI trials may be misleading if different regions have neurologic improvement at different rates. Over a ten-year period, 95 complete thoracic/thoracolumbar SCI patients had only a 4.1% rate of neurologic improvement, compared with 96.0% for incomplete lumbar (conus) patients and 66.7% to 72.2% for all others.
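As a concrete illustration of the contingency-table analysis described above, the sketch below runs a chi-squared test on a 2x2 table whose cell counts are hypothetical, chosen only to approximate the reported improvement rates (92.9% for lumbar/conus vs. 22.4% for thoracic/thoracolumbar); the paper's raw counts are not reproduced here.

```python
# Chi-squared test of improvement rate by injury region, as in the study's
# contingency table analyses. Cell counts are HYPOTHETICAL, picked only to
# match the percentages quoted in the abstract.
from scipy.stats import chi2_contingency

#            improved  not improved
table = [[13,  1],    # lumbar (conus): 13/14 = 92.9%
         [15, 52]]    # thoracic/thoracolumbar: 15/67 = 22.4%

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.1e}")  # p lands well below 0.001
```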
Results of a Prospective Trial to Evaluate Novel Lung Function Imaging for Lung Cancer Surgery
Background Surgery is the primary form of definitive treatment for early-stage lung cancer. Poor lung function before surgery places patients at high risk of pulmonary complications after resection. Surgeons evaluate patient fitness for surgery using pulmonary function tests (PFTs) to calculate the predicted postoperative PFT (ppoPFT). Conventional ppoPFT calculations assume homogeneous lung function, which can be inaccurate. 4DCT-ventilation is a novel lung function imaging modality developed in radiation oncology that uses 4DCT data to calculate high-resolution ventilation maps.
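For reference, the conventional ppoPFT approach the abstract contrasts with 4DCT-ventilation is typically a segment-counting estimate that treats each of the 19 bronchopulmonary segments as contributing equally, which is the homogeneous-function assumption being criticized. A minimal sketch with illustrative numbers:

```python
# Conventional segment-counting ppoFEV1 estimate under the homogeneous-lung
# assumption. The 19-segment convention and example values are illustrative;
# the paper's exact formula is not given in this abstract.
TOTAL_SEGMENTS = 19

def ppo_fev1(preop_fev1_pct: float, segments_resected: int) -> float:
    """Predicted postoperative FEV1 (% predicted), assuming every segment
    contributes equally to overall lung function."""
    return preop_fev1_pct * (1 - segments_resected / TOTAL_SEGMENTS)

# Example: a right upper lobectomy removes 3 segments.
print(f"{ppo_fev1(60.0, 3):.1f}% predicted")  # -> 50.5% predicted
```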
Ultrasensitive force and displacement detection using trapped ions
The ability to detect extremely small forces is vital for a variety of disciplines including precision spin-resonance imaging, microscopy, and tests of fundamental physical phenomena. Current force-detection sensitivity limits have surpassed 1 aN/√Hz (atto = 10^-18) through coupling of micro- or nanofabricated mechanical resonators to a variety of physical systems including single-electron transistors, superconducting microwave cavities, and individual spins. These experiments have allowed for probing studies of a variety of phenomena, but sensitivity requirements are ever-increasing as new regimes of physical interactions are considered. Here we show that trapped atomic ions are exquisitely sensitive force detectors, with a measured sensitivity more than three orders of magnitude better than existing reports. We demonstrate detection of forces as small as 174 yN (yocto = 10^-24), with a sensitivity of 390 yN/√Hz, using crystals of ⁹Be⁺ ions in a Penning trap. Our technique is based on the excitation of normal motional modes in an ion trap by externally applied electric fields and detection via phase-coherent Doppler velocimetry, which allows for the discrimination of ion motion with amplitudes on the scale of nanometers. These experimental results and extracted force-detection sensitivities in the single-ion limit validate proposals suggesting that trapped atomic ions are capable of detecting forces with sensitivity approaching 1 yN/√Hz. We anticipate that this demonstration will strongly motivate the development of a new class of deployable trapped-ion-based sensors, and will permit scientists to access new regimes in materials science.
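The quoted figures can be sanity-checked with the standard white-noise scaling F_min ≈ S/√τ, where S is the force sensitivity and τ the averaging time; the sketch below (our arithmetic, not a calculation from the paper) back-solves the integration time implied by detecting 174 yN at 390 yN/√Hz:

```python
# White-noise scaling: the minimum resolvable force after averaging for a
# time tau is roughly S / sqrt(tau). Values are those quoted in the abstract.
S = 390.0  # force sensitivity, yN / sqrt(Hz)
F = 174.0  # smallest detected force, yN

tau = (S / F) ** 2
print(f"implied averaging time: {tau:.1f} s")  # ~5 s of integration
```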
Ionizing Radiation-Induced Oxidative Stress Alters miRNA Expression
A number of miRNA species showed altered expression after irradiation and after hydrogen peroxide (H2O2) treatment, and 45 after etoposide treatment. Substantial overlap in the miRNA expression changes between agents was observed, suggesting a signature miRNA response to cell stress. Changes in the expression of selected miRNA species varied in response to radiation dose and time. Finally, production of reactive oxygen species (ROS) increased with increasing doses of radiation, and pre-treatment with the thiol antioxidant cysteine decreased both ROS production and the miRNA response to radiation, H2O2, and etoposide. Additionally, pre-treatment with cysteine prevented radiation-induced alterations in miRNA expression, which suggests that miRNAs are responsive to oxidative stress. Taken together, these results imply that miRNAs play a role in cellular defense against exogenous stress and are involved in the generalized cellular response to genotoxic oxidative stress.
The Contribution of Cancer Incidence, Stage at Diagnosis and Survival to Racial Differences in Years of Life Expectancy
Background: African Americans have higher cancer mortality rates than whites. Understanding the relative contribution of cancer incidence, stage at diagnosis and survival after diagnosis to the racial gap in life expectancy has important implications for directing future health disparity interventions toward cancer prevention, screening and treatment.
Objective: We estimated the degree to which higher cancer mortality among African Americans is due to higher incidence rates, later stage at diagnosis or worse survival after diagnosis.
Design: Stochastic model of cancer incidence and survival after diagnosis.
Data Sources: Surveillance, Epidemiology, and End Results (SEER) cancer registry and National Health Interview Survey data.
Main Measures: Life expectancy if African Americans had the same cancer incidence, stage and survival after diagnosis as white adults.
Key Results: African-American men and women live 1.47 and 0.91 fewer years, respectively, than whites as the result of all cancers combined. Among men, racial differences in cancer incidence, stage at diagnosis and survival after diagnosis account for 1.12 (95% CI: 0.52 to 1.36), 0.17 (95% CI: −0.03 to 0.33) and 0.21 (95% CI: 0.05 to 0.34) years of the racial gap in life expectancy, respectively. Among women, incidence, stage and survival after diagnosis account for 0.41 (95% CI: −0.29 to 0.60), 0.26 (95% CI: −0.06 to 0.40) and 0.31 (95% CI: 0.05 to 0.40) years, respectively. Differences in stage had a smaller impact on the life expectancy gap than differences in incidence. Differences in cancer survival after diagnosis had a significant impact for only two cancers: breast (0.14 years; 95% CI: 0.05 to 0.16) and prostate (0.05 years; 95% CI: 0.01 to 0.09).
Conclusions: In addition to breast and colorectal cancer screening, national efforts to reduce disparities in life expectancy should also target cancer prevention, perhaps through smoking cessation, and differences in survival after diagnosis among persons with breast and prostate cancer.
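As a quick consistency check, the incidence, stage and survival components reported above should roughly sum to the total cancer-attributable gap; they need not match exactly, since each component is estimated from a separate counterfactual simulation. The sketch below uses only the point estimates quoted in the results:

```python
# Additivity check on the life-expectancy gap decomposition, using only the
# point estimates quoted in the abstract (years of life expectancy).
gap = {"men": 1.47, "women": 0.91}  # total gap, all cancers combined
components = {
    "men":   {"incidence": 1.12, "stage": 0.17, "survival": 0.21},
    "women": {"incidence": 0.41, "stage": 0.26, "survival": 0.31},
}

for sex, parts in components.items():
    total = sum(parts.values())
    print(f"{sex}: components sum to {total:.2f} y; reported gap {gap[sex]:.2f} y")
```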
Tissue Microenvironments Define and Get Reinforced by Macrophage Phenotypes in Homeostasis or during Inflammation, Repair and Fibrosis
Current macrophage phenotype classifications are based on distinct in vitro culture conditions that do not adequately mirror complex tissue environments. In vivo, monocyte progenitors populate all tissues for immune surveillance, which supports the maintenance of homeostasis as well as regaining homeostasis after injury. Here we propose to classify macrophage phenotypes according to prototypical tissue environments, e.g. as they occur during homeostasis as well as during the different phases of (dermal) wound healing. In tissue necrosis and/or infection, damage- and/or pathogen-associated molecular patterns induce proinflammatory macrophages via Toll-like receptors or inflammasomes. Such classically activated macrophages contribute to further tissue inflammation and damage. Apoptotic cells and antiinflammatory cytokines dominate in postinflammatory tissues, which induce macrophages to produce more antiinflammatory mediators. Similarly, tumor-associated macrophages also confer immunosuppression in tumor stroma. Insufficient parenchymal healing despite abundant growth factors pushes macrophages to gain a profibrotic phenotype and promote fibrocyte recruitment, which both reinforce tissue scarring. Ischemic scars are largely devoid of cytokines and growth factors, so fibrolytic macrophages that predominantly secrete proteases digest the excess extracellular matrix. Together, macrophages stabilize their surrounding tissue microenvironments by adapting different phenotypes as feed-forward mechanisms to maintain tissue homeostasis or regain it following injury. Furthermore, macrophage heterogeneity in healthy or injured tissues mirrors spatial and temporal differences in microenvironments during the various stages of tissue injury and repair.
Not the End of the World? Post-Classical Decline and Recovery in Rural Anatolia
Between the foundation of Constantinople as capital of the eastern half of the Roman Empire in 330 CE and its sack by the Fourth Crusade in 1204 CE, the Byzantine Empire underwent a full cycle from political-economic stability, through rural insecurity and agrarian decline, and back to renewed prosperity. These stages plausibly correspond to the phases of over-extension (K), subsequent release (Ω) and recovery (α) of the Adaptive Cycle in Socio-Ecological Systems. Here we track and partly quantify the consequences of those changes in different regions of Anatolia, firstly for rural settlement (via regional archaeological surveys) and secondly for land cover (via pollen analysis). We also examine the impact of climate changes on the agrarian system. While individual histories vary, the archaeological record shows a major demographic decline between ca. 650 and ca. 900 CE in central and southwestern Anatolia, which was then a frontier zone between Byzantine and Arab armies. In these regions, and also in northwest Anatolia, century-scale trends in pollen indicate a substantial decline in the production of cereal and tree crops, and a smaller decline in pastoral activity. During the subsequent recovery (α) phase after 900 CE there was strong regional differentiation, with central Anatolia moving to a new economic system based on agro-pastoralism, while lowland areas of northern and western Anatolia returned to the cultivation of commercial crops such as olive trees. The extent of recovery in the agrarian economy was broadly predictable from the magnitude of its preceding decline, but the trajectories of recovery varied between regions.
An IL28B Genotype-Based Clinical Prediction Model for Treatment of Chronic Hepatitis C
BACKGROUND: Genetic variation in IL28B and other factors are associated with sustained virological response (SVR) after pegylated-interferon/ribavirin treatment for chronic hepatitis C (CHC). Using data from the HALT-C Trial, we developed a model to predict a patient's probability of SVR based on IL28B genotype and clinical variables. METHODS: HALT-C enrolled patients with advanced CHC who had failed previous interferon-based treatment. Subjects were re-treated with pegylated-interferon/ribavirin during the trial lead-in. We used stepwise logistic regression to calculate adjusted odds ratios (aOR) and create the predictive model. Leave-one-out cross-validation was used to predict a priori probabilities of SVR and to determine the area under the receiver operating characteristic curve (AUC). RESULTS: Among 646 HCV genotype 1-infected European American patients, 14.2% achieved SVR. IL28B rs12979860-CC genotype was the strongest predictor of SVR (aOR, 7.56; p < 0.0001); the model also included HCV RNA (log10 IU/ml), AST:ALT ratio, Ishak fibrosis score and prior ribavirin treatment. For this model the AUC was 78.5%, compared to 73.0% for a model restricted to the four clinical predictors and 60.0% for a model restricted to IL28B genotype (p < 0.001). Subjects with a predicted probability of SVR <10% had an observed SVR rate of 3.8%; subjects with a predicted probability >10% (43.3% of subjects) had an SVR rate of 27.9% and accounted for 84.8% of subjects actually achieving SVR. To verify that consideration of both IL28B genotype and clinical variables is required for treatment decisions, we calculated AUC values from published data for the IDEAL Study. CONCLUSION: A clinical prediction model based on IL28B genotype and clinical variables can yield useful individualized predictions of the probability of treatment success that could increase SVR rates and decrease the frequency of futile treatment among patients with CHC.
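A minimal sketch of the modeling workflow described here (logistic regression on IL28B genotype plus clinical covariates, with leave-one-out cross-validated probabilities scored by AUC), run on synthetic data rather than HALT-C records; the feature names mirror the predictors listed in the abstract and all effect sizes are invented:

```python
# Leave-one-out cross-validated logistic prediction model, sketched with
# SYNTHETIC data. Columns mirror the abstract's predictors; the outcome
# model below is invented purely for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.integers(0, 2, n),    # IL28B rs12979860 genotype (1 = CC)
    rng.normal(6.0, 0.8, n),  # HCV RNA, log10 IU/ml
    rng.normal(1.0, 0.3, n),  # AST:ALT ratio
    rng.integers(2, 7, n),    # Ishak fibrosis score
    rng.integers(0, 2, n),    # prior ribavirin treatment
])
logit = -3.0 + 2.0 * X[:, 0] - 0.3 * (X[:, 1] - 6.0)  # synthetic truth
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

proba = cross_val_predict(LogisticRegression(max_iter=1000), X, y,
                          cv=LeaveOneOut(), method="predict_proba")[:, 1]
print(f"leave-one-out AUC: {roc_auc_score(y, proba):.3f}")
```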
Down-regulation of four putative arabinoxylan feruloyl transferase genes from family PF02458 reduces ester-linked ferulate content in rice cell walls
Industrial processes to produce ethanol from lignocellulosic materials are available, but improved efficiency is necessary to make them economically viable. One of the limitations of lignocellulosic conversion to ethanol is the inaccessibility of the cellulose and hemicelluloses within the tight cell wall matrix. Ferulates (FA) can cross-link different arabinoxylan molecules in the cell wall of grasses via diferulate and oligoferulate bridges. This complex cross-linking is thought to be a key factor limiting the biodegradability of grass cell walls; reducing FA content is therefore an attractive target for improving enzyme accessibility to cellulose and hemicelluloses. Unfortunately, our knowledge of the genes responsible for the incorporation of FA into the cell wall is limited. A bioinformatics prediction based on gene similarity and higher transcript abundance in grasses relative to dicot species suggested that genes from the pfam family PF02458 may act as arabinoxylan feruloyl transferases. We show here that the FA content of the cell walls and the transcript levels of the rice genes Os05g08640, Os06g39470, Os01g09010 and Os06g39390 are both higher in stems than in leaves. In addition, an RNA interference (RNAi) construct that simultaneously down-regulates transcript levels of these four genes is associated with a significant reduction in the FA content of cell walls from the leaves of transgenic plants relative to the control (19% reduction, P < 0.0001). Therefore, our experimental results in rice support the bioinformatics prediction that members of family PF02458 are involved in the incorporation of FA into the cell wall in grasses.
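For illustration only, a two-sample comparison of the kind behind the reported 19% reduction might look like the sketch below; the measurements are hypothetical, and the paper's actual replication and statistical test may differ:

```python
# Transgenic (RNAi) vs. control comparison of ester-linked FA content.
# All measurements are HYPOTHETICAL, scaled so the means differ by ~19%.
from scipy.stats import ttest_ind

control_fa = [5.20, 5.00, 5.40, 5.10, 5.30]  # FA content, arbitrary units
rnai_fa    = [4.25, 4.15, 4.30, 4.10, 4.26]  # ~19% lower on average

t, p = ttest_ind(control_fa, rnai_fa)
mean_c = sum(control_fa) / len(control_fa)
mean_r = sum(rnai_fa) / len(rnai_fa)
print(f"reduction: {100 * (1 - mean_r / mean_c):.0f}%, p = {p:.1e}")
```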
The Surviving Sepsis Campaign: results of an international guideline-based performance improvement program targeting severe sepsis
Objective: The Surviving Sepsis Campaign (SSC or “the Campaign”) developed guidelines for management of severe sepsis and septic shock. A performance improvement initiative targeted changing clinical behavior (process improvement) via bundles based on key SSC guideline recommendations; its impact on process improvement and patient outcomes was assessed. Design and setting: A multifaceted intervention to facilitate compliance with selected guideline recommendations in the ICU, ED, and wards of individual hospitals and regional hospital networks was implemented voluntarily in the US, Europe, and South America. Elements of the guidelines were “bundled” into two sets of targets to be completed within 6 h and within 24 h. An analysis was conducted on data submitted from January 2005 through March 2008. Main results: Data from 15,022 subjects at 165 sites were analyzed to determine compliance with bundle targets and the association with hospital mortality. Compliance with the entire resuscitation bundle increased linearly from 10.9% in the first site quarter to 31.3% by the end of 2 years (P < 0.0001). Compliance with the entire management bundle started at 18.4% in the first quarter and increased to 36.1% by the end of 2 years (P = 0.008). Compliance with all bundle elements increased significantly, except for inspiratory plateau pressure, which was high at baseline. Unadjusted hospital mortality decreased from 37% to 30.8% over 2 years (P = 0.001). The adjusted odds ratio for mortality improved the longer a site was in the Campaign, resulting in an adjusted absolute drop of 0.8% per quarter and 5.4% over 2 years (95% CI, 2.5-8.4%). Conclusions: The Campaign was associated with sustained, continuous quality improvement in sepsis care. Although not necessarily cause and effect, a reduction in reported hospital mortality rates was associated with participation. The implications of this study may serve as an impetus for similar improvement efforts.
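As an illustration of the quarterly trend analysis, the sketch below fits a binomial GLM of resuscitation-bundle compliance against site quarter. The per-quarter counts are hypothetical, interpolated between the reported endpoints (10.9% in the first site quarter, 31.3% after 2 years); the Campaign's actual models also adjusted for site-level covariates:

```python
# Binomial GLM for a linear-in-time compliance trend. Quarterly counts are
# HYPOTHETICAL, interpolated between the endpoints quoted in the abstract.
import numpy as np
import statsmodels.api as sm

quarters = np.arange(8)                # 8 site quarters = 2 years
rate = np.linspace(0.109, 0.313, 8)    # interpolated compliance rates
n = np.full(8, 500)                    # hypothetical subjects per quarter
compliant = np.round(rate * n)

X = sm.add_constant(quarters)
fit = sm.GLM(np.column_stack([compliant, n - compliant]), X,
             family=sm.families.Binomial()).fit()
print(fit.summary().tables[1])         # the slope's p-value tests the trend
```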