2,315 research outputs found

    A Substantial Population of Low Mass Stars in Luminous Elliptical Galaxies

    The stellar initial mass function (IMF) describes the mass distribution of stars at the time of their formation and is of fundamental importance for many areas of astrophysics. The IMF is reasonably well constrained in the disk of the Milky Way, but we have very little direct information on the form of the IMF in other galaxies and at earlier cosmic epochs. Here we investigate the stellar mass function in elliptical galaxies by measuring the strength of the Na I doublet and the Wing-Ford molecular FeH band in their spectra. These lines are strong in stars with masses <0.3 Msun and weak or absent in all other types of stars. We unambiguously detect both signatures, consistent with previous studies that were based on data of lower signal-to-noise ratio. The direct detection of the light of low mass stars implies that they are very abundant in elliptical galaxies, making up >80% of the total number of stars and contributing >60% of the total stellar mass. We infer that the IMF in massive star-forming galaxies in the early Universe produced many more low mass stars than the IMF in the Milky Way disk, and was probably slightly steeper than the Salpeter form in the mass range 0.1-1 Msun.
    Comment: To appear in Nature
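
    The quoted fractions follow from integrating a power-law IMF, and a short worked example makes the claim concrete. The sketch below is illustrative only, not the authors' analysis: it assumes a single power law dN/dm ∝ m^(-alpha) over 0.1-1 Msun (roughly the mass range of stars that survive in an old elliptical galaxy) and computes the number and mass fractions contributed by stars below 0.3 Msun.

```python
# Illustrative sketch (not the paper's analysis): number and mass fractions
# of low-mass stars under a single power-law IMF, dN/dm ∝ m^(-alpha).
# Assumes only stars with m <= 1 Msun survive in an old stellar population.

def powerlaw_integral(exponent, m_lo, m_hi):
    """Integral of m**exponent from m_lo to m_hi (exponent != -1)."""
    p = exponent + 1.0
    return (m_hi**p - m_lo**p) / p

def imf_fractions(alpha, m_min=0.1, m_max=1.0, m_cut=0.3):
    """Fractions of stellar number and mass below m_cut (in Msun)."""
    # Number counts integrate dN/dm; mass integrates m * dN/dm.
    n_frac = (powerlaw_integral(-alpha, m_min, m_cut)
              / powerlaw_integral(-alpha, m_min, m_max))
    m_frac = (powerlaw_integral(1.0 - alpha, m_min, m_cut)
              / powerlaw_integral(1.0 - alpha, m_min, m_max))
    return n_frac, m_frac

for alpha in (2.35, 3.0):  # Salpeter slope, then a slightly steeper IMF
    n_frac, m_frac = imf_fractions(alpha)
    print(f"alpha = {alpha}: {n_frac:.0%} of stars, "
          f"{m_frac:.0%} of surviving stellar mass below 0.3 Msun")
```

    With the Salpeter slope this already gives roughly 80% of stars and 60% of the surviving stellar mass below 0.3 Msun, consistent with the fractions quoted above; a steeper slope pushes both numbers higher.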

    Exoplanet Catalogues

    One of the most exciting developments in the field of exoplanets has been the progression from 'stamp-collecting' to demography, from discovery to characterisation, from exoplanets to comparative exoplanetology. There is an exhilaration when a prediction is confirmed, a trend is observed, or a new population appears. This transition has been driven by the sheer number of known exoplanets, which has been rising exponentially for two decades (Mamajek 2016). However, the careful collection, scrutiny and organisation of these exoplanets is necessary for drawing robust, scientific conclusions that are sensitive to the biases and caveats that have gone into their discovery. The purpose of this chapter is to discuss and demonstrate important considerations to keep in mind when examining or constructing a catalogue of exoplanets. First, we introduce the value of exoplanetary catalogues. There are a handful of large, online databases that aggregate the available exoplanet literature and render it digestible and navigable - an ever more complex task with the growing number and diversity of exoplanet discoveries. We compare and contrast three of the most up-to-date general catalogues, including the data and tools that are available. We then describe exoplanet catalogues that were constructed to address specific science questions or exoplanet discovery space. Although we do not attempt to list or summarise all the published lists of exoplanets in the literature in this chapter, we explore the case study of the NASA Kepler mission planet catalogues in some detail. Finally, we lay out some of the best practices to adopt when constructing or utilising an exoplanet catalogue.
    Comment: 14 pages, 6 figures. Invited review chapter, to appear in "Handbook of Exoplanets", edited by H.J. Deeg and J.A. Belmonte, section editor N. Batalha
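
    One practical consequence of the chapter's argument is that the same planet can appear in different catalogues under different names or with different parameter values, so any comparative study begins with a cross-match and a consistency check. The sketch below is a hypothetical illustration using pandas; the table contents and column names are placeholders, not values from any of the databases discussed.

```python
# Hypothetical sketch: cross-matching two exoplanet catalogues by planet
# name and flagging parameter disagreements. Table contents and column
# names are placeholders, not values from any real catalogue.
import pandas as pd

cat_a = pd.DataFrame({
    "pl_name": ["Kepler-22 b", "HD 209458 b", "GJ 1214 b"],
    "period_days": [289.862, 3.5247, 1.5804],
})
cat_b = pd.DataFrame({
    "pl_name": ["Kepler-22 b", "HD 209458 b", "WASP-12 b"],
    "period_days": [289.862, 3.5248, 1.0914],
})

merged = cat_a.merge(cat_b, on="pl_name", how="outer",
                     suffixes=("_a", "_b"), indicator=True)

# Completeness check: planets present in only one catalogue, often due to
# differing naming conventions or inclusion criteria.
only_one = merged[merged["_merge"] != "both"]

# Consistency check: fractional disagreement in a shared parameter.
both = merged[merged["_merge"] == "both"].copy()
both["period_mismatch"] = ((both["period_days_a"] - both["period_days_b"])
                           .abs() / both["period_days_a"])

print(only_one[["pl_name", "_merge"]])
print(both[["pl_name", "period_mismatch"]])
```

    In practice the hard part is the naming: the same planet may be listed under survey, catalogue, and common designations, so a robust cross-match needs an alias table rather than a single-key merge.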

    Search algorithms as a framework for the optimization of drug combinations

    Combination therapies are often needed for effective clinical outcomes in the management of complex diseases, but presently they are generally based on empirical clinical experience. Here we suggest a novel application of search algorithms, originally developed for digital communication, modified to optimize combinations of therapeutic interventions. In biological experiments measuring the restoration of the decline with age in heart function and exercise capacity in Drosophila melanogaster, we found that search algorithms correctly identified optimal combinations of four drugs with only one third of the tests performed in a fully factorial search. In experiments identifying combinations of three doses of up to six drugs for selective killing of human cancer cells, search algorithms resulted in a highly significant enrichment of selective combinations compared with random searches. In simulations using a network model of cell death, we found that the search algorithms identified the optimal combinations of 6-9 interventions in 80-90% of tests, compared with 15-30% for an equivalent random search. These findings suggest that modified search algorithms from information theory have the potential to enhance the discovery of novel therapeutic drug combinations. This report also helps to frame a biomedical problem that will benefit from an interdisciplinary effort and suggests a general strategy for its solution.
    Comment: 36 pages, 10 figures, revised version
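
    To make the contrast with a fully factorial sweep concrete, the sketch below implements one simple member of the adaptive-search family: greedy coordinate ascent over a discrete dose grid. The response function is a synthetic stand-in for a biological readout, and the algorithm is a generic illustration rather than the specific search schemes used in the paper.

```python
# Generic illustration: adaptive search over a drug-dose grid versus a
# fully factorial sweep. The response function is a synthetic stand-in
# for a biological assay; the paper's actual algorithms differ in detail.
import itertools
import random

N_DRUGS, N_DOSES = 6, 3          # 3**6 = 729 combinations in a full sweep

def response(combo):
    """Synthetic readout, peaked at a hidden optimal dose per drug."""
    optimum = (2, 0, 1, 2, 1, 0)
    return -sum((c - o) ** 2 for c, o in zip(combo, optimum))

def coordinate_ascent(seed=0):
    """Greedily vary one drug's dose at a time; count assays performed."""
    rng = random.Random(seed)
    combo = [rng.randrange(N_DOSES) for _ in range(N_DRUGS)]
    tested = {tuple(combo): response(tuple(combo))}
    improved = True
    while improved:
        improved = False
        for drug in range(N_DRUGS):
            for dose in range(N_DOSES):
                cand = tuple(combo[:drug] + [dose] + combo[drug + 1:])
                if cand not in tested:
                    tested[cand] = response(cand)   # one new 'assay'
                if tested[cand] > tested[tuple(combo)]:
                    combo[drug] = dose
                    improved = True
    return tuple(combo), len(tested)

best, n_tests = coordinate_ascent()
factorial_best = max(itertools.product(range(N_DOSES), repeat=N_DRUGS),
                     key=response)
print(f"adaptive search: {best} after {n_tests} assays")
print(f"factorial sweep: {factorial_best} after {N_DOSES**N_DRUGS} assays")
```

    On this toy landscape the greedy search recovers the factorial optimum in a few dozen assays instead of 729; the paper's point is that, in the presence of experimental noise and drug interactions, more robust search schemes retain a similar economy.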

    Cognitive function in a randomized trial of evolocumab

    BACKGROUND: Findings from clinical trials of proprotein convertase subtilisin-kexin type 9 (PCSK9) inhibitors have led to concern that these drugs, or the low levels of low-density lipoprotein (LDL) cholesterol that result from their use, are associated with cognitive deficits. METHODS: In a subgroup of patients from a randomized, placebo-controlled trial of evolocumab added to statin therapy, we prospectively assessed cognitive function using the Cambridge Neuropsychological Test Automated Battery. The primary end point was the score on the spatial working memory strategy index of executive function (scores range from 4 to 28, with lower scores indicating a more efficient use of strategy and planning). Secondary end points were the scores for working memory (scores range from 0 to 279, with lower scores indicating fewer errors), episodic memory (scores range from 0 to 70, with lower scores indicating fewer errors), and psychomotor speed (scores range from 100 to 5100 msec, with faster times representing better performance). Assessments of cognitive function were performed at baseline, at week 24, yearly thereafter, and at the end of the trial. The primary analysis was a noninferiority comparison of the mean change from baseline in the score on the spatial working memory strategy index of executive function between the patients who received evolocumab and those who received placebo; the noninferiority margin was set at 20% of the standard deviation of the score in the placebo group. RESULTS: A total of 1204 patients were followed for a median of 19 months; the mean (±SD) change from baseline over time in the raw score for the spatial working memory strategy index of executive function (primary end point) was −0.21±2.62 in the evolocumab group and −0.29±2.81 in the placebo group (P<0.001 for noninferiority; P=0.85 for superiority). There were no significant between-group differences in the secondary end points of scores for working memory (change in raw score, −0.52 in the evolocumab group and −0.93 in the placebo group), episodic memory (change in raw score, −1.53 and −1.53, respectively), or psychomotor speed (change in raw score, 5.2 msec and 0.9 msec, respectively). In an exploratory analysis, there were no associations between LDL cholesterol levels and cognitive changes. CONCLUSIONS: In a randomized trial involving patients who received either evolocumab or placebo in addition to statin therapy, no significant between-group difference in cognitive function was observed over a median of 19 months. (Funded by Amgen; EBBINGHAUS ClinicalTrials.gov number, NCT02207634.)
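
    The noninferiority criterion can be checked back-of-the-envelope from the summary statistics quoted above. The sketch below assumes roughly equal group sizes (the abstract gives only the 1204 total) and a simple two-sample normal approximation, which is not the trial's actual analysis model.

```python
# Back-of-the-envelope noninferiority check from the abstract's summary
# statistics. Assumes ~equal group sizes (per-group n is not quoted) and
# a two-sample normal approximation -- not the trial's analysis model.
from math import sqrt

n_evo = n_pbo = 1204 // 2             # assumed split of the 1204 patients
mean_evo, sd_evo = -0.21, 2.62        # change in strategy index, evolocumab
mean_pbo, sd_pbo = -0.29, 2.81        # change in strategy index, placebo

margin = 0.20 * sd_pbo                # margin: 20% of the placebo-group SD
diff = mean_evo - mean_pbo            # positive = worse performance on drug
se = sqrt(sd_evo**2 / n_evo + sd_pbo**2 / n_pbo)
upper_95 = diff + 1.96 * se           # upper 95% confidence bound

print(f"difference {diff:+.2f}, upper bound {upper_95:.2f}, "
      f"margin {margin:.2f}")
print("noninferiority met" if upper_95 < margin else "not demonstrated")
```

    With these numbers the upper confidence bound (~0.39) sits well inside the margin (~0.56), consistent with the reported P<0.001 for noninferiority.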

    Significant primordial star formation at redshifts z ~ 3-4

    Four recent observational results have challenged our understanding of high-redshift galaxies, as they require the presence of far more ultraviolet photons than should be emitted by normal stellar populations. First, there is significant ultraviolet emission from Lyman Break Galaxies (LBGs) at wavelengths shorter than 912 Å. Second, there is strong Lyman alpha emission from extended "blobs" with little or no associated apparent ionizing continuum. Third, there is a population of galaxies with unusually strong Lyman-alpha emission lines. And fourth, there is a strong HeII (1640 Å) emission line in a composite of LBGs. The proposed explanations for the first three observations are internally inconsistent, and the fourth puzzle has remained hitherto unexplained. Here we show that all four problems are resolved simultaneously if 10-30 percent of the stars in many galaxies at z ~ 3-4 are mainly primordial - unenriched by elements heavier than helium ('metals'). Most models of hierarchical galaxy formation assume efficient intra-galactic metal mixing, and therefore do not predict metal-free star formation at redshifts significantly below z ~ 5. Our results imply that micro-mixing of metals within galaxies is inefficient on a ~Gyr time-scale, a conclusion that can be verified with higher resolution simulations and future observations of the HeII emission line.
    Comment: Nature in press, March 23rd issue. Under Nature embargo. Reference and acknowledgement added

    Arduous implementation: Does the Normalisation Process Model explain why it's so difficult to embed decision support technologies for patients in routine clinical practice?

    Background: Decision support technologies (DSTs, also known as decision aids) help patients and professionals take part in collaborative decision-making processes. Trials have shown favorable impacts on patient knowledge, satisfaction, decisional conflict and confidence. However, they have not become routinely embedded in health care settings. Few studies have approached this issue using a theoretical framework. We explained problems of implementing DSTs using the Normalization Process Model, a conceptual model that focuses attention on how complex interventions become routinely embedded in practice. Methods: The Normalization Process Model was used as the basis of conceptual analysis of the outcomes of previous primary research and reviews. Using a virtual working environment, we applied the model and its main concepts to examine: the 'workability' of DSTs in professional-patient interactions; how DSTs affect knowledge relations between their users; how DSTs impact on users' skills and performance; and the impact of DSTs on the allocation of organizational resources. Results: Conceptual analysis using the Normalization Process Model provided insight on implementation problems for DSTs in routine settings. Current research focuses mainly on the interactional workability of these technologies, but factors related to divisions of labor in health care, and the organizational contexts in which DSTs are used, are poorly described and understood. Conclusion: The model successfully provided a framework for helping to identify factors that promote and inhibit the implementation of DSTs in healthcare, and gave us insights into factors influencing the introduction of new technologies into contexts where negotiations are characterized by asymmetries of power and knowledge. Future research and development on the deployment of DSTs needs to take a more holistic approach and give emphasis to the structural conditions and social norms in which these technologies are enacted.