Strong Ultraviolet Pulse From a Newborn Type Ia Supernova
Type Ia supernovae are destructive explosions of carbon-oxygen white dwarfs.
Although they are used empirically to measure cosmological distances, the
nature of their progenitors remains mysterious. One of the leading progenitor
models, called the single degenerate channel, hypothesizes that a white dwarf
accretes matter from a companion star and the resulting increase in its central
pressure and temperature ignites a thermonuclear explosion. Here we report
observations of strong but declining ultraviolet emission from a Type Ia
supernova within four days of its explosion. This emission is consistent with
theoretical expectations of collision between material ejected by the supernova
and a companion star, and therefore provides evidence that some Type Ia
supernovae arise from the single degenerate channel.
Comment: Accepted for publication in the 21 May 2015 issue of Nature.
High-throughput identification of genotype-specific cancer vulnerabilities in mixtures of barcoded tumor cell lines.
Hundreds of genetically characterized cell lines are available for the discovery of genotype-specific cancer vulnerabilities. However, screening large numbers of compounds against large numbers of cell lines is currently impractical, and such experiments are often difficult to control. Here we report a method called PRISM that allows pooled screening of mixtures of cancer cell lines by labeling each cell line with 24-nucleotide barcodes. PRISM revealed the expected patterns of cell killing seen in conventional (unpooled) assays. In a screen of 102 cell lines across 8,400 compounds, PRISM led to the identification of BRD-7880 as a potent and highly specific inhibitor of aurora kinases B and C. Cell line pools also efficiently formed tumors as xenografts, and PRISM recapitulated the expected pattern of erlotinib sensitivity in vivo.
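To make the deconvolution step implied by such pooled, barcoded screens concrete, here is a minimal sketch; the barcode sequences, read data, and function names are hypothetical illustrations, not the published PRISM pipeline.

```python
# Minimal sketch of barcode deconvolution for a pooled cell-line screen.
# Barcodes, reads, and names are invented; this is not the PRISM pipeline.
from collections import Counter

BARCODES = {  # hypothetical 24-nucleotide barcode -> cell line
    "ACGTACGTACGTACGTACGTACGT": "LINE_A",
    "TTGCAATTGCAATTGCAATTGCAA": "LINE_B",
}

def count_barcodes(reads):
    """Count reads whose first 24 nt exactly match a known barcode."""
    counts = Counter()
    for read in reads:
        cell_line = BARCODES.get(read[:24])
        if cell_line is not None:
            counts[cell_line] += 1
    return counts

def relative_abundance(treated_counts, control_counts):
    """Per cell line: barcode abundance under compound vs. vehicle control."""
    return {line: treated_counts.get(line, 0) / max(control_counts.get(line, 0), 1)
            for line in BARCODES.values()}

# Illustrative reads: the compound depletes LINE_A but barely affects LINE_B.
control = count_barcodes(["ACGTACGTACGTACGTACGTACGT" + "GGG"] * 100 +
                         ["TTGCAATTGCAATTGCAATTGCAA" + "GGG"] * 100)
treated = count_barcodes(["ACGTACGTACGTACGTACGTACGT" + "GGG"] * 10 +
                         ["TTGCAATTGCAATTGCAATTGCAA" + "GGG"] * 95)
print(relative_abundance(treated, control))  # {'LINE_A': 0.1, 'LINE_B': 0.95}
```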
Impacts of large-scale climatic disturbances on the terrestrial carbon cycle
BACKGROUND: The amount of carbon dioxide in the atmosphere steadily increases as a consequence of anthropogenic emissions but with large interannual variability caused by the terrestrial biosphere. These variations in the CO2 growth rate are caused by large-scale climate anomalies but the relative contributions of vegetation growth and soil decomposition are uncertain. We use a biogeochemical model of the terrestrial biosphere to differentiate the effects of temperature and precipitation on net primary production (NPP) and heterotrophic respiration (Rh) during the two largest anomalies in atmospheric CO2 increase during the last 25 years. One of these, the smallest atmospheric year-to-year increase (largest land carbon uptake) in that period, was caused by global cooling in 1992/93 after the Pinatubo volcanic eruption. The other, the largest atmospheric increase on record (largest land carbon release), was caused by the strong El Niño event of 1997/98. RESULTS: We find that the LPJ model correctly simulates the magnitude of terrestrial modulation of atmospheric carbon anomalies for these two extreme disturbances. The response of soil respiration to changes in temperature and precipitation explains most of the modelled anomalous CO2 flux. CONCLUSION: Observed and modelled net ecosystem exchange (NEE) anomalies are in good agreement; therefore, we suggest that the temporal variability of heterotrophic respiration produced by our model is reasonably realistic. We therefore conclude that during the last 25 years the two largest disturbances of the global carbon cycle were strongly controlled by soil processes rather than the response of vegetation to these large-scale climatic events.
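As a toy illustration of the flux bookkeeping behind this comparison, the sketch below computes net ecosystem exchange anomalies from NPP and heterotrophic respiration; the numbers are invented placeholders, not LPJ output, and the sign convention is stated in the comments.

```python
# Minimal sketch of the carbon bookkeeping discussed above.
# Numbers are illustrative placeholders, not LPJ model output.
# Sign convention: positive NEE = net flux from land to atmosphere.

def nee(npp, rh):
    """Net ecosystem exchange (PgC/yr) from net primary production and
    heterotrophic respiration, ignoring disturbance fluxes such as fire."""
    return rh - npp

baseline = nee(npp=60.0, rh=60.0)   # balanced biosphere: 0.0
pinatubo = nee(npp=61.0, rh=58.5)   # post-eruption cooling: Rh drops more than NPP
el_nino  = nee(npp=58.0, rh=60.5)   # warm/dry anomaly: Rh up, NPP down

print("Pinatubo-like anomaly:", pinatubo - baseline)  # -2.5 PgC/yr (extra uptake)
print("El Nino-like anomaly:", el_nino - baseline)    # +2.5 PgC/yr (extra release)
```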
Observational and Physical Classification of Supernovae
This chapter describes the current classification scheme of supernovae (SNe).
This scheme has evolved over many decades and now includes numerous SN Types
and sub-types. Many of these are universally recognized, while there are
controversies regarding the definitions, membership and even the names of some
sub-classes; we will try to review here the commonly-used nomenclature, noting
the main variants when possible. SN Types are defined according to
observational properties, mostly visible-light spectra near maximum light, as
well as according to their photometric properties. However, a long-term goal of
SN classification is to associate observationally-defined classes with specific
physical explosive phenomena. We show here that this aspiration is now finally
coming to fruition, and we establish the SN classification scheme upon direct
observational evidence connecting SN groups with specific progenitor stars.
Observationally, the broad class of Type II SNe contains objects showing strong
spectroscopic signatures of hydrogen, while objects lacking such signatures are
of Type I, which is further divided into numerous subclasses. Recently, a class of
super-luminous SNe (SLSNe, typically 10 times more luminous than standard
events) has been identified and is also discussed. We end this chapter by
briefly describing a proposed alternative classification scheme that is
inspired by the stellar classification system. This system presents our
emerging physical understanding of SN explosions, while clearly separating
robust observational properties from physical inferences that can be debated.
This new system is quantitative, and naturally deals with events distributed
along a continuum, rather than being strictly divided into discrete classes.
Thus, it may be better suited to the coming era in which SN numbers will quickly
expand from a few thousand to millions of events.
Comment: Extended final draft of a chapter in the "SN Handbook". Comments most welcome.
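The observational logic summarized above (hydrogen defines Type II, hydrogen-free events fall into Type I subclasses, and extreme luminosity marks SLSNe) can be caricatured as a small decision tree. The sketch below is deliberately simplified: the boolean feature flags and the approximate luminosity cut are stand-ins for real spectral template matching, not an actual classifier.

```python
# Deliberately simplified sketch of the spectroscopic decision tree described
# in the chapter. Real classification relies on full spectra and template
# fitting; the flags and the -21 mag cut here are illustrative stand-ins.

def classify_sn(has_hydrogen, has_silicon=False, has_helium=False,
                peak_abs_magnitude=None):
    if peak_abs_magnitude is not None and peak_abs_magnitude < -21:
        return "SLSN"        # super-luminous, roughly 10x standard events
    if has_hydrogen:
        return "Type II"     # strong H signatures near maximum light
    if has_silicon:
        return "Type Ia"     # Si II absorption, no H
    if has_helium:
        return "Type Ib"     # He lines, no H
    return "Type Ic"         # neither H nor He

print(classify_sn(has_hydrogen=True))                     # Type II
print(classify_sn(has_hydrogen=False, has_silicon=True))  # Type Ia
```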
Does minimally invasive lumbar disc surgery result in less muscle injury than conventional surgery? A randomized controlled trial
The concept of minimally invasive lumbar disc surgery implies reduced muscle injury. The aim of this study was to evaluate creatine phosphokinase (CPK) in serum and the cross-sectional area (CSA) of the multifidus muscle on magnetic resonance imaging as indicators of muscle injury. We present the results of a double-blind randomized trial on patients with lumbar disc herniation, in which tubular discectomy and conventional microdiscectomy were compared. In 216 patients, CPK was measured before surgery and at day 1 after surgery. In 140 patients, the CSA of the multifidus muscle was measured at the affected disc level before surgery and at 1 year after surgery. The ratios (i.e., post-surgery/pre-surgery) of CPK and CSA were used as outcome measures. The multifidus atrophy was classified into grades ranging from 0 (normal) to 3 (severe atrophy), and the difference between post- and pre-surgery grades was used as an outcome. Patients’ low-back pain scores on the visual analogue scale (VAS) were documented before surgery and at various moments during follow-up. Tubular discectomy compared with conventional microdiscectomy resulted in a nonsignificant difference in CPK ratio, although the CSA ratio was significantly lower in tubular discectomy. At 1 year, there was no difference between the two groups in atrophy grade, nor in the percentage of patients showing an increased atrophy grade (14% tubular vs. 18% conventional). The postoperative low-back pain scores on the VAS improved in both groups, although the 1-year between-group mean difference of improvement was 3.5 mm (95% CI: 1.4–5.7 mm) in favour of conventional microdiscectomy. In conclusion, tubular discectomy compared with conventional microdiscectomy did not result in reduced muscle injury. Postoperative evaluation of CPK and the multifidus muscle showed similar results in both groups, although patients who underwent tubular discectomy reported more low-back pain during the first year after surgery.
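For readers who want to see how the ratio outcomes and the between-group difference described above are computed, here is a minimal sketch with invented data; it uses a normal-approximation confidence interval and does not reproduce the trial's actual analysis.

```python
# Minimal sketch of the outcome definitions used above (post/pre ratios and a
# between-group difference in VAS improvement). All data are invented.
import math
import statistics as st

def post_pre_ratio(pre, post):
    """Ratio outcome, e.g. serum CPK or multifidus CSA after vs. before surgery."""
    return post / pre

def mean_difference_ci(group_a, group_b, z=1.96):
    """Normal-approximation 95% CI for the difference in mean VAS improvement."""
    diff = st.mean(group_a) - st.mean(group_b)
    se = math.sqrt(st.variance(group_a) / len(group_a) +
                   st.variance(group_b) / len(group_b))
    return diff, (diff - z * se, diff + z * se)

conventional_vas_improvement = [33, 31, 38, 29, 36]  # mm on the VAS (invented)
tubular_vas_improvement = [30, 28, 35, 25, 32]       # mm on the VAS (invented)

print(post_pre_ratio(pre=120.0, post=180.0))         # 1.5 (CPK rose after surgery)
print(mean_difference_ci(conventional_vas_improvement, tubular_vas_improvement))
```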
Early readmission and length of hospitalization practices in the Dialysis Outcomes and Practice Patterns Study (DOPPS)
Background: Rising hospital care costs have created pressure to shorten hospital stays and emphasize outpatient care. This study tests the hypothesis that shorter median length of stay (LOS) as a dialysis facility practice is associated with higher rates of early readmission. Methods: Readmission within 30 days of each hospitalization was evaluated for participants in the Dialysis Outcomes and Practice Patterns Study, an observational study of randomly selected hemodialysis patients in the United States (142 facilities, 5095 patients with hospitalizations), five European countries (101 facilities, 2281 patients with hospitalizations), and Japan (58 facilities, 883 patients with hospitalizations). Associations between median facility LOS (estimated from all hospitalizations at the facility and interpreted as a dialysis facility practice pattern) and odds of readmission were assessed using logistic regression, adjusted for patient characteristics and the LOS of each index hospitalization. Results: Risk of readmission was directly and significantly associated with LOS of the index hospitalization (adjusted odds ratio [AOR] 1.005 per day in median facility LOS, p = 0.007) and inversely associated with median facility LOS (AOR = 0.974 per day, p = 0.016). This latter association was strongest for US hemodialysis centers (AOR = 0.954 per day, p = 0.015). Conclusions: Dialysis facilities with shorter median hospital LOS for their patients have higher odds of readmission, particularly in the United States, where there is greater pressure to shorten LOS. The determinants and consequences of practices related to hospital LOS for hemodialysis patients should be further studied.
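To show how the per-day odds ratios reported above translate into facility-level comparisons, here is a small sketch; the AOR values come from the abstract, while the five-day comparison is an invented example.

```python
# Minimal sketch of how a per-day adjusted odds ratio, as reported above,
# scales with differences in median facility length of stay. AOR values are
# taken from the abstract; the 5-day facility comparison is illustrative.
def odds_ratio_for_difference(aor_per_day, days_difference):
    """Odds ratio implied by a per-day OR over a given LOS difference."""
    return aor_per_day ** days_difference

# US facilities: AOR = 0.954 per day of median facility LOS, so a facility with
# a 5-day longer median LOS has roughly 21% lower odds of early readmission.
print(round(odds_ratio_for_difference(0.954, 5), 3))  # ~0.790
```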
Neurocognitive Consequences of HIV Infection in Older Adults: An Evaluation of the “Cortical” Hypothesis
The incidence and prevalence of older adults living with HIV infection are increasing. Recent reports of increased neuropathologic and metabolic alterations in older HIV+ samples, including increased cortical beta-amyloid, have led some researchers to suggest that aging with HIV may produce a neuropsychological profile akin to that observed in “cortical” dementias (e.g., impairment in memory consolidation). To evaluate this possibility, we examined four groups classified by HIV serostatus and age (i.e., younger ≤40 years and older ≥50 years): (1) Younger HIV− (n = 24); (2) Younger HIV+ (n = 24); (3) Older HIV− (n = 20); and (4) Older HIV+ (n = 48). Main effects of aging were observed on episodic learning and memory, executive functions, and visuoconstruction, and main effects of HIV were observed on measures of verbal learning and memory. The interaction of age and HIV was observed on a measure of verbal recognition memory, which post hoc analyses attributed exclusively to the superior performance of the younger HIV-seronegative group. Thus, in this sample of older HIV-infected individuals, the combined effects of HIV and aging do not appear to result in a “cortical” pattern of cognitive deficits.
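As an illustration of the 2 x 2 (age group by HIV serostatus) comparison described above, the sketch below computes a simple interaction contrast from invented cell means; the study's actual scores and statistics are not reproduced here.

```python
# Minimal sketch of a 2 x 2 interaction contrast for a recognition-memory
# score. Cell means are invented for illustration only.
cell_means = {
    ("younger", "HIV-"): 52.0,   # invented T-scores
    ("younger", "HIV+"): 47.0,
    ("older",   "HIV-"): 48.0,
    ("older",   "HIV+"): 46.0,
}

hiv_effect_younger = cell_means[("younger", "HIV-")] - cell_means[("younger", "HIV+")]
hiv_effect_older   = cell_means[("older",   "HIV-")] - cell_means[("older",   "HIV+")]

# A nonzero contrast suggests the HIV effect differs by age group (an interaction);
# in practice this would be tested formally, e.g. with a two-way ANOVA.
print("Interaction contrast:", hiv_effect_younger - hiv_effect_older)  # 3.0
```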
Attainment of clinical performance targets and improvement in clinical outcomes and resource use in hemodialysis care: a prospective cohort study
BACKGROUND: Clinical performance targets are intended to improve patient outcomes in chronic disease through quality improvement, but evidence of an association between multiple target attainment and patient outcomes in routine clinical practice is often lacking. METHODS: In a national prospective cohort study (ESRD Quality, or EQUAL), we examined whether attainment of multiple targets in 668 incident hemodialysis patients from 74 U.S. not-for-profit dialysis clinics was associated with better outcomes. We measured whether the following accepted clinical performance targets were met at 6 months after study enrollment: albumin (≥4.0 g/dl), hemoglobin (≥11 g/dl), calcium-phosphate product (<55 mg²/dl²), dialysis dose (Kt/V ≥ 1.2), and vascular access type (fistula). Outcomes included mortality, hospital admissions, hospital days, and hospital costs. RESULTS: Attainment of each of the five targets was associated individually with better outcomes; e.g., patients who attained the albumin target had decreased mortality [relative hazard (RH) = 0.55, 95% confidence interval (CI), 0.41–0.75], hospital admissions [incidence rate ratio (IRR) = 0.67, 95% CI, 0.62–0.73], hospital days (IRR = 0.61, 95% CI, 0.58–0.63), and hospital costs (average annual cost reduction = $3,282, P = 0.002), relative to those who did not. Increasing numbers of targets attained were also associated, in a graded fashion, with decreased mortality (P = 0.030), fewer hospital admissions and days (P < 0.001 for both), and lower costs (P = 0.029); these trends remained statistically significant for all outcomes after adjustment (P < 0.001), except cost, which was marginally significant (P = 0.052). CONCLUSION: Attainment of more clinical performance targets, regardless of which targets, was strongly associated with decreased mortality, hospital admissions, and resource use in hemodialysis patients.
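A small sketch of the multiple-target bookkeeping described above follows; the thresholds are the ones listed in the abstract, while the patient record and field names are invented for illustration.

```python
# Minimal sketch of counting attained clinical performance targets.
# Thresholds are those listed in the abstract; the patient record is invented.
TARGETS = {
    "albumin":         lambda p: p["albumin_g_dl"] >= 4.0,
    "hemoglobin":      lambda p: p["hemoglobin_g_dl"] >= 11.0,
    "ca_p_product":    lambda p: p["ca_p_mg2_dl2"] < 55.0,
    "dialysis_dose":   lambda p: p["kt_v"] >= 1.2,
    "vascular_access": lambda p: p["access_type"] == "fistula",
}

def targets_attained(patient):
    """Return which of the five clinical performance targets a patient meets."""
    return [name for name, met in TARGETS.items() if met(patient)]

patient = {"albumin_g_dl": 4.1, "hemoglobin_g_dl": 10.5, "ca_p_mg2_dl2": 48.0,
           "kt_v": 1.3, "access_type": "catheter"}
met = targets_attained(patient)
print(len(met), "of 5 targets met:", met)  # 3 of 5 targets met: [...]
```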
Incremental dialysis for preserving residual kidney function: Does one size fit all when initiating dialysis?
While many patients have substantial residual kidney function (RKF) when initiating hemodialysis (HD), most patients with end-stage renal disease in the United States are initiated on a conventional three-times-per-week HD regimen, with little regard for RKF or patient preference. RKF is associated with many benefits, including survival, volume control, solute clearance, and reduced inflammation. Several strategies have been recommended to preserve RKF after HD initiation, including an incremental approach to HD initiation. Incremental HD prescriptions are personalized to achieve adequate volume control and solute clearance with consideration of a patient's endogenous renal function. This allows the initial use of less frequent and/or shorter HD treatment sessions. Regular measurement of RKF is important because HD frequency needs to be increased as RKF inevitably declines. We narratively review the results of 12 observational cohort studies of twice-weekly compared to thrice-weekly HD. Incremental HD is associated with several benefits, including preservation of RKF as well as extending the event-free life of arteriovenous fistulas and grafts. Patient survival and quality of life, however, have been variably associated with incremental HD. Serious risks must also be considered, including increased hospitalization and mortality, perhaps related to fluid and electrolyte shifts after a long interdialytic interval. On the basis of the above literature review and our clinical experience, we suggest patient characteristics that may predict favorable outcomes with an incremental approach to HD. These include substantial RKF, adequate volume control, lack of significant anemia/electrolyte imbalance, satisfactory health-related quality of life, low comorbid disease burden, and good nutritional status without evidence of hypercatabolism. Clinicians should engage patients in ongoing conversations to prepare for incremental HD initiation and to ensure a smooth transition to thrice-weekly HD when needed.
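Purely as an illustration of the patient characteristics listed above, the checklist sketch below counts how many a hypothetical patient meets; it is not a validated clinical decision tool, and the field names are invented for this sketch.

```python
# Purely illustrative checklist paraphrasing the characteristics listed above.
# Not a validated clinical decision tool; field names are invented.
FAVORABLE_CRITERIA = [
    "substantial_rkf",
    "adequate_volume_control",
    "no_significant_anemia_or_electrolyte_imbalance",
    "satisfactory_quality_of_life",
    "low_comorbidity_burden",
    "good_nutrition_without_hypercatabolism",
]

def favorable_for_incremental_hd(patient, required=len(FAVORABLE_CRITERIA)):
    """Count how many listed characteristics a patient meets; flag if all are met."""
    met = sum(bool(patient.get(c, False)) for c in FAVORABLE_CRITERIA)
    return met, met >= required

patient = {c: True for c in FAVORABLE_CRITERIA}
patient["low_comorbidity_burden"] = False
print(favorable_for_incremental_hd(patient))  # (5, False)
```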