Interacting dark matter contribution to the Galactic 511 keV gamma ray emission: constraining the morphology with INTEGRAL/SPI observations
We compare the full-sky morphology of the 511 keV gamma ray excess measured
by the INTEGRAL/SPI experiment to predictions of models based on dark matter
(DM) scatterings that produce low-energy positrons: either MeV-scale DM that
annihilates directly into e+e- pairs, or heavy DM that inelastically scatters
into an excited state (XDM) followed by decay into e+e- and the ground state. By
direct comparison to the data, we find that such explanations are consistent
with dark matter halo profiles predicted by numerical many-body simulations for
a Milky Way-like galaxy. Our results favor an Einasto profile over the cuspier
NFW distribution and exclude decaying dark matter scenarios whose predicted
spatial distribution is too broad. We obtain a good fit to the shape of the
signal using six fewer degrees of freedom than previous empirical fits to the
511 keV data. We find that the ratio of flux at Earth from the galactic bulge
to that of the disk is between 1.9 and 2.4, taking into account that 73% of the
disk contribution may be attributed to the beta decay of radioactive 26Al.
Comment: 7 pages, 4 figures. Includes minor corrections, and a discussion of threshold energies in XDM models. Published in JCA
On morphological hierarchical representations for image processing and spatial data clustering
Hierarchical data representations in the context of classification and data
clustering were put forward during the fifties. Recently, hierarchical image
representations have gained renewed interest for segmentation purposes. In this
paper, we briefly survey fundamental results on hierarchical clustering and
then detail recent paradigms developed for the hierarchical representation of
images in the framework of mathematical morphology: constrained connectivity
and ultrametric watersheds. Constrained connectivity can be viewed as a way to
constrain an initial hierarchy in such a way that a set of desired constraints
is satisfied. The framework of ultrametric watersheds provides a generic scheme
for computing any hierarchical connected clustering, in particular when such a
hierarchy is constrained. The suitability of this framework for solving
practical problems is illustrated with applications in remote sensing
Demonstrations to Support Change to the >260 ppm Mercury Treatment Regulations
The U.S. Department of Energy (DOE) and the U.S. Environmental Protection Agency (EPA) are working together to justify a change in the Land Disposal Restriction for High Mercury (>260 ppm mercury) waste. The present regulation, which requires roasting or retorting, is based on recovering and recycling the mercury in the waste. However, most of DOE's High Mercury waste is radioactively contaminated, eliminating the possibility of its recycling. The radioactive mercury recovered must be amalgamated and disposed of. In addition, concern over fugitive emissions from retorting and roasting operations has raised the question of whether such processing is environmentally sound. A change to the regulation to allow stabilization and disposal would reduce the overall environmental threat, if the stabilization process can reduce the leachability of the mercury to regulatory levels. Demonstrations are underway to gather data showing that the High Mercury waste can be safely stabilized. At the same time, comparison tests are being conducted using an improved form of the baseline retorting technology to better quantify the fugitive emission problem and determine the full capability of thermal desorption systems. A first round of demonstrations stabilizing mercury in soil from Brookhaven National Laboratory (BNL) has been completed. Four groups demonstrated their processes on the waste: 1) BNL demonstrated its Sulfur Polymer Stabilization/Solidification process; 2) Nuclear Fuel Services used their DeHg (de-merk) process; 3) Allied Technology Group used chemical stabilization; and 4) Sepradyne demonstrated their vacuum thermal desorption system. All groups were successful in their tests, reaching regulatory levels for mercury leachability. Data for each group will be presented. DOE, EPA, and the University of Cincinnati are presently working on another series of tests involving treatment of surrogate sludge and soil by commercial vendors.
Protocols that better determine the waste form's ability to withstand leaching are being used to analyze the stabilized surrogates. Results of these and the previous demonstrations will be used to determine whether the High Mercury treatment regulation can be safely changed
Analyzing 2D gel images using a two-component empirical bayes model
Background: Two-dimensional polyacrylamide gel electrophoresis (2D gel, 2D PAGE, 2-DE) is a powerful tool for analyzing the proteome of an organism. Differential analysis of 2D gel images aims at finding proteins that change under different conditions, which leads to large-scale hypothesis testing as in microarray data analysis. Two-component empirical Bayes (EB) models have been widely discussed for large-scale hypothesis testing and applied in the context of genomic data, but they have not been implemented for the differential analysis of 2D gel data. In the literature, the mixture and null densities of the test statistics are estimated separately. The estimation of the mixture density does not take into account assumptions about the null density, so there is no guarantee that the estimated null component will be no greater than the mixture density, as it should be.
Results: We present an implementation of a two-component EB model for the analysis of 2D gel images. In contrast to the published estimation method, we propose to estimate the mixture and null densities simultaneously using a constrained estimation approach, which relies on an iteratively re-weighted least-squares algorithm. The assumption about the null density is naturally taken into account in the estimation of the mixture density. This strategy is illustrated using a set of 2D gel images from a factorial experiment, and the proposed approach is validated using a set of simulated gels.
Conclusions: The two-component EB model is very useful for large-scale hypothesis testing. In proteomic analysis, the theoretical null density is often not appropriate. We demonstrate how to implement a two-component EB model for analyzing a set of 2D gel images and show that it is necessary to estimate the mixture density and empirical null component simultaneously. The proposed constrained estimation method always yields valid estimates and more stable results, and can be applied to other contexts where large-scale hypothesis testing occurs.
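The key constraint above is that the estimated null component must never exceed the mixture density. The following is a minimal numerical sketch of that idea, not the authors' iteratively re-weighted least-squares implementation: it estimates the mixture density with a crude histogram smoother, uses a theoretical N(0,1) null, and clips the resulting local false discovery rate so the null component cannot exceed the mixture. The function name, the null proportion `p0`, and the toy data are all illustrative assumptions.

```python
import numpy as np

def local_fdr(z, p0=0.9):
    """Local false discovery rate under a two-component mixture
    f(z) = p0*f0(z) + (1-p0)*f1(z), with f0 the N(0,1) null.

    The mixture density f is estimated by a normalized histogram with
    linear interpolation; p0*f0/f is clipped to [0, 1], which enforces
    the constraint that the null component not exceed the mixture.
    """
    z = np.asarray(z, dtype=float)
    counts, edges = np.histogram(z, bins=50, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    f = np.maximum(np.interp(z, centers, counts), 1e-12)
    f0 = np.exp(-z**2 / 2) / np.sqrt(2 * np.pi)  # theoretical null
    return np.clip(p0 * f0 / f, 0.0, 1.0)

# Toy data: 1000 null scores around 0 plus 200 shifted scores near 5.
rng = np.random.default_rng(0)
z = np.concatenate([rng.normal(0.0, 1.0, 1000), rng.normal(5.0, 0.5, 200)])
fdr = local_fdr(z)
```

Scores in the bulk near zero receive a local fdr close to 1 (likely null), while the shifted scores near 5 receive a local fdr close to 0.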
'It's small steps, but that leads to bigger changes': evaluation of a nurture group intervention
This article presents the results of a small-scale research project that aimed to evaluate the effectiveness of a part-time nurture group recently established in one primary school. Qualitative interviews were used to gather staff, pupil and parental perceptions about the nurture group. These focused on what difference the nurture group was making to the pupils concerned but also on views about what factors contributed to noted changes. All stakeholder groups identified areas of development for nurture group pupils. These included improved social skills, growth in personal confidence, greater engagement with academic tasks, and fewer incidences of undesirable behaviour. The evidence gathered so far suggests that the nurture group offered an effective way of supporting the social, emotional and behavioural skills of a group of 'at-risk' pupils. A range of factors thought to be important in achieving these outcomes are highlighted. These align broadly with the theoretical underpinnings of nurture groups
Recommended from our members
Biology and evolutionary games
This chapter surveys some evolutionary games used in biological sciences. These include the Hawk-Dove game, the Prisoner's Dilemma, Rock-Paper-Scissors, the war of attrition, the Habitat Selection game, predator-prey games, and signalling games
Temporal and Geographic variation in the validity and internal consistency of the Nursing Home Resident Assessment Minimum Data Set 2.0
Background: The Minimum Data Set (MDS) for nursing home resident assessment has been required in all U.S. nursing homes since 1990 and has been universally computerized since 1998. Initially intended to structure clinical care planning, uses of the MDS expanded to include policy applications such as case-mix reimbursement, quality monitoring and research. The purpose of this paper is to summarize a series of analyses examining the internal consistency and predictive validity of the MDS data as used in the "real world" in all U.S. nursing homes between 1999 and 2007.
Methods: We used person-level linked MDS and Medicare denominator and all institutional claim files, including inpatient (hospital and skilled nursing facility) claims, for all Medicare fee-for-service beneficiaries entering U.S. nursing homes during the period 1999 to 2007. We calculated the sensitivity and positive predictive value (PPV) of diagnoses taken from Medicare hospital claims and from the MDS among all new admissions from hospitals to nursing homes, and the internal consistency (alpha reliability) of pairs of items within the MDS that logically should be related. We also tested the internal consistency of commonly used MDS-based multi-item scales and examined the predictive validity of an MDS-based severity measure with respect to one-year survival. Finally, we examined the correspondence of the MDS discharge record to hospitalizations and deaths seen in Medicare claims, and the completeness of MDS assessments upon skilled nursing facility (SNF) admission.
Results: Each year there were some 800,000 new admissions directly from hospital to U.S. nursing homes and some 900,000 uninterrupted SNF stays. Comparing Medicare enrollment records and claims with MDS records revealed reasonably good correspondence that improved over time (by 2006 only 3% of deaths had no MDS discharge record and only 5% of SNF stays had no MDS, but over 20% of MDS discharges indicating hospitalization had no associated Medicare claim). The PPV and sensitivity levels of Medicare hospital diagnoses and MDS-based diagnoses were between .6 and .7 for major diagnoses like CHF, hypertension and diabetes. Internal consistency, as measured by PPV, of the MDS ADL items with other MDS items measuring impairments and symptoms exceeded .9. The Activities of Daily Living (ADL) long-form summary scale achieved an alpha internal-consistency level exceeding .85, and multi-item scale alpha levels of .65 were achieved for well-being and mood, and .55 for behavior, levels that were sustained even after stratification by ADL and cognition. The Changes in Health, End-stage disease and Symptoms and Signs (CHESS) index, a summary measure of frailty, was highly predictive of one-year survival.
Conclusion: The MDS demonstrates a reasonable level of consistency both in terms of how well MDS diagnoses correspond to hospital discharge diagnoses and in terms of the internal consistency of functioning and behavioral items. The level of alpha reliability and validity demonstrated by the scales suggests that the data can be useful for research and policy analysis. However, while improving, the MDS discharge tracking record should still not be used to indicate Medicare hospitalizations or mortality. It will be important to monitor the performance of the MDS 3.0 with respect to consistency, reliability and validity now that it has replaced version 2.0, using these results as a baseline that should be exceeded.
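The PPV and sensitivity figures quoted in this abstract are standard confusion-matrix ratios. A minimal sketch of how such values are computed, using hypothetical 0/1 diagnosis flags rather than the study's actual claims data, might look like:

```python
def ppv_sensitivity(reference, test):
    """PPV = TP/(TP+FP); sensitivity = TP/(TP+FN), with the reference
    flags (e.g. hospital-claim diagnoses) treated as ground truth."""
    tp = sum(1 for r, t in zip(reference, test) if r and t)
    fp = sum(1 for r, t in zip(reference, test) if not r and t)
    fn = sum(1 for r, t in zip(reference, test) if r and not t)
    ppv = tp / (tp + fp) if tp + fp else float("nan")
    sensitivity = tp / (tp + fn) if tp + fn else float("nan")
    return ppv, sensitivity

# Toy flags for five residents: three true cases, of which the test
# catches two, plus one false alarm.
claim_dx = [1, 1, 1, 0, 0]   # hypothetical reference (claim) flags
mds_dx   = [1, 1, 0, 1, 0]   # hypothetical test (MDS) flags
ppv, sens = ppv_sensitivity(claim_dx, mds_dx)
```

Here both PPV and sensitivity come out at 2/3, in the same range as the .6 to .7 levels reported for major diagnoses.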
Patient-reported outcome measures for rehabilitation hospitals: a scoping review
Purpose: Patient-Reported Outcome Measures (PROMs) aim to facilitate patient-centred care by objectively measuring consumer views of their health and well-being in addition to monitoring patient outcomes. This review sought to identify PROMs suitable for adults receiving inpatient rehabilitation to guide clinical practice and consumer engagement in healthcare. Materials and methods: The scoping review methodology was guided by PRISMA-ScR and JBI guidelines. Seven electronic databases (Medline, Embase, CINAHL, PsycInfo, Cochrane CENTRAL, Cochrane Reviews, Scopus) and grey literature were searched from January 2000 to October 2022. Two reviewers independently screened the articles. Data were extracted and summarised thematically to derive clinical implications. Results: Of 9096 records retrieved, 51 articles were included for analysis. Fifty-nine key PROMs were identified in the rehabilitation literature. The Euro-QOL 5D was reported in more than one-third of the studies. There were numerous condition-specific PROMs pertaining to health conditions such as arthritis, stroke and cardiac failure, or to symptoms such as pain, depression, fatigue and weakness. Most rehabilitation trials reported using PROMs before therapy and after discharge to monitor within-admission changes. Conclusions: PROMs are frequently used in rehabilitation research and have the potential to yield helpful data for the evaluation of clinical services
Cross-site comparison of ribosomal depletion kits for Illumina RNAseq library construction
Background
Ribosomal RNA (rRNA) comprises at least 90% of total RNA extracted from mammalian tissue or cell line samples. Informative transcriptional profiling using massively parallel sequencing technologies requires either enrichment of mature poly-adenylated transcripts or targeted depletion of the rRNA fraction. The latter method is of particular interest because it is compatible with degraded samples such as those extracted from FFPE and also captures transcripts that are not poly-adenylated such as some non-coding RNAs. Here we provide a cross-site study that evaluates the performance of ribosomal RNA removal kits from Illumina, Takara/Clontech, Kapa Biosystems, Lexogen, New England Biolabs and Qiagen on intact and degraded RNA samples.
Results
We find that all of the kits are capable of performing significant ribosomal depletion, though there are differences in their ease of use. All kits were able to remove ribosomal RNA to below 20% with intact RNA and identify ~14,000 protein-coding genes from the Universal Human Reference RNA sample at >1 FPKM. Analysis of differentially detected genes between kits suggests that transcript length may be a key factor in library production efficiency.
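The two headline metrics here, residual rRNA fraction and genes detected above 1 FPKM, follow directly from read counts via the standard FPKM formula, reads / (gene length in kb × mapped reads in millions). A rough sketch with placeholder gene names and numbers (not the study's data):

```python
def depletion_summary(counts, rrna_genes, lengths_kb, library_size_millions):
    """Summarize one depletion run: the residual rRNA fraction of all
    reads, and the number of non-rRNA genes detected above 1 FPKM."""
    total = sum(counts.values())
    rrna_fraction = sum(c for g, c in counts.items() if g in rrna_genes) / total
    fpkm = {
        g: c / (lengths_kb[g] * library_size_millions)
        for g, c in counts.items()
        if g not in rrna_genes
    }
    detected = sum(1 for v in fpkm.values() if v > 1.0)
    return rrna_fraction, detected

# Toy run: 90% of reads are rRNA; both coding genes clear the 1 FPKM bar.
counts = {"rRNA_18S": 900, "GENE_A": 99, "GENE_B": 1}
frac, n = depletion_summary(counts, {"rRNA_18S"},
                            {"GENE_A": 2.0, "GENE_B": 1.0},
                            library_size_millions=0.001)
```

A successful depletion kit would drive the first number from roughly 0.9 down below 0.2 while leaving the coding-gene FPKM values, and hence the detected-gene count, largely intact.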
Conclusions
These results provide a roadmap for labs on the strengths of each of these methods and how best to utilize them.
Keywords: RNAseq; RNA depletion; Illumina; NGS; ABRF; Transcriptomics
National Cancer Institute (U.S.) (Grant P30-CA14051)
National Institute of Environmental Health Sciences (Grant P30-ES002109)