4,009 research outputs found
Quantitative Intensity Harmonization of Dopamine Transporter SPECT Images Using Gamma Mixture Models
PURPOSE:
Differences in site, device, and/or acquisition settings may cause large variations in the intensity profile of dopamine transporter (DAT) single-photon emission computed tomography (SPECT) images. However, the current standard for evaluating these images, the striatal binding ratio (SBR), does not efficiently account for this heterogeneity, so assessments may not be equivalent across distinct acquisition pipelines. In this work, we present an automated, voxel-based approach to intensity-normalize this type of data that improves cross-session interpretation.
PROCEDURES:
The normalization method consists of a reparametrization of the voxel values based on the cumulative distribution function (CDF) of a Gamma distribution modeling the specific-region intensity. The harmonization ability was tested in 1342 SPECT images from the PPMI repository, acquired with 7 distinct gamma camera models at 24 different sites. We compared the striatal quantification across distinct cameras for raw intensities, for SBR values, and after applying the Gamma CDF (GCDF) harmonization. As a proof of concept, we evaluated the impact of GCDF normalization on a classification task between controls and Parkinson disease patients.
RESULTS:
Raw striatal intensities and SBR values presented significant differences across distinct camera models. We demonstrate that GCDF normalization efficiently alleviated these differences in striatal quantification, with values constrained to a fixed interval [0, 1]. Our method also allowed a fully automated image assessment that provided maximal classification ability, with an area under the curve (AUC) of 0.94 when using mean regional variables and 0.98 when using voxel-based variables.
CONCLUSION:
The GCDF normalization method is useful to standardize the intensity of DAT SPECT images in an automated fashion and enables the development of unbiased algorithms using multicenter datasets. This method may constitute a key pre-processing step in the analysis of this type of image.
Funding: Instituto de Salud Carlos III FI14/00497, MV15/00034; Fondo Europeo de Desarrollo Regional FI14/00497, MV15/00034; ISCIII-FEDER PI16/01575; Wellcome Trust UK Strategic Award 098369/Z/12/Z; Netherlands Organization for Scientific Research NWO-Vidi 864-12-00
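The core idea of the method above — mapping each voxel intensity through the CDF of a Gamma distribution fitted to the region's intensities — can be sketched briefly. This is a minimal illustration, not the authors' PPMI pipeline: the function name `gcdf_normalize` and the choice of maximum-likelihood fitting with the location fixed at zero are assumptions for the example.

```python
import numpy as np
from scipy import stats

def gcdf_normalize(voxels):
    """Map voxel intensities to [0, 1] via the CDF of a fitted Gamma distribution.

    Sketch of Gamma-CDF (GCDF) harmonization: fit a Gamma distribution to the
    region's intensities, then reparametrize each voxel as its CDF value, so
    that images from different cameras share a common [0, 1] intensity scale.
    """
    voxels = np.asarray(voxels, dtype=float)
    # Fit shape and scale by maximum likelihood, with location fixed at 0
    # (assumed here, since intensities are non-negative).
    shape, loc, scale = stats.gamma.fit(voxels, floc=0)
    # The fitted CDF maps intensities onto a fixed [0, 1] interval.
    return stats.gamma.cdf(voxels, shape, loc=loc, scale=scale)
```

Because the CDF is monotone, the relative ordering of voxel intensities within a region is preserved while camera-dependent scale and shape differences are absorbed by the fit.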
Studying DNA Double-Strand Break Repair: An Ever-Growing Toolbox
To ward off the catastrophic consequences of persistent DNA double-strand breaks (DSBs), eukaryotic cells have developed a set of complex signaling networks that detect these DNA lesions, orchestrate cell cycle checkpoints and ultimately lead to their repair. Collectively, these signaling networks comprise the DNA damage response (DDR). The current knowledge of the molecular determinants and mechanistic details of the DDR owes greatly to the continuous development of ground-breaking experimental tools that couple the controlled induction of DSBs at distinct genomic positions with assays and reporters to investigate DNA repair pathways, their impact on other DNA-templated processes and the specific contribution of the chromatin environment. In this review, we present these tools, discuss their pros and cons and illustrate their contribution to our current understanding of the DDR.
Funding: European Research Council (ERC-2014-CoG 647344)
Galaxy size trends as a consequence of cosmology
We show that recently documented trends in galaxy sizes with mass and
redshift can be understood in terms of the influence of underlying cosmic
evolution; a holistic view which is complementary to interpretations involving
the accumulation of discrete evolutionary processes acting on individual
objects. Using standard cosmology theory, supported with results from the
Millennium simulations, we derive expected size trends for collapsed cosmic
structures, emphasising the important distinction between these trends and the
assembly paths of individual regions. We then argue that the observed variation
in the stellar mass content of these structures can be understood to first
order in terms of natural limitations of cooling and feedback. But whilst these
relative masses vary by orders of magnitude, galaxy and host radii have been
found to correlate linearly. We explain how these two aspects will lead to
galaxy sizes that closely follow observed trends and their evolution, comparing
directly with the COSMOS and SDSS surveys. Thus we conclude that the observed
minimum radius for galaxies, the evolving trend in size as a function of mass
for intermediate systems, and the observed increase in the sizes of massive
galaxies, may all be considered an emergent consequence of the cosmic
expansion.
Comment: 14 pages, 13 figures. Accepted by MNRAS
Comparing PyMorph and SDSS photometry. II. The differences are more than semantics and are not dominated by intracluster light
The Sloan Digital Sky Survey pipeline photometry underestimates the
brightnesses of the most luminous galaxies. This is mainly because (i) the SDSS
overestimates the sky background and (ii) single or two-component Sersic-based
models better fit the surface brightness profile of galaxies, especially at
high luminosities, than does the de Vaucouleurs model used by the SDSS
pipeline. We use the PyMorph photometric reductions to isolate effect (ii) and
show that it is the same in the full sample as in small group environments, and
for satellites in the most massive clusters as well. None of these are expected
to be significantly affected by intracluster light (ICL). We only see an
additional effect for centrals in the most massive halos, but we argue that
even this is not dominated by ICL. Hence, for the vast majority of galaxies,
the differences between PyMorph and SDSS pipeline photometry cannot be ascribed
to the semantics of whether or not one includes the ICL when describing the
stellar mass of massive galaxies. Rather, they likely reflect differences in
star formation or assembly histories. Failure to account for the SDSS
underestimate has significantly biased most previous estimates of the SDSS
luminosity and stellar mass functions, and therefore Halo Model estimates of
the z ~ 0.1 relation between the mass of a halo and that of the galaxy at its
center. We also show that when one studies correlations, at fixed group mass,
with a quantity which was not used to define the groups, then selection effects
appear. We show why such effects arise, and should not be mistaken for physical
effects.
Comment: 15 pages, 17 figures, accepted for publication in MNRAS. The PyMorph
luminosities and stellar masses are available at
https://www.physics.upenn.edu/~ameert/SDSS_PhotDec
Stroboscopic vision and sustained attention during coincidence-anticipation
We compared coincidence-anticipation performance in normal vision and stroboscopic vision as a function of time-on-task. Participants (n=20) estimated the arrival time of a real object that moved with constant acceleration (-0.7, 0, +0.7 m/s²) in a pseudo-randomised order across 4 blocks of 30 trials in both vision conditions, received in a counterbalanced order. Participants became less accurate and more variable in the normal vision condition as a function of time-on-task, whereas performance was maintained in the stroboscopic vision condition. We interpret these data as showing that participants failed to maintain coincidence-anticipation performance in the normal vision condition due to monotony and attentional underload. In contrast, the stroboscopic vision condition placed a greater demand on visual-spatial memory for motion extrapolation, and thus participants did not experience the typical vigilance decrement in performance. While short-term adaptation effects from practicing in stroboscopic vision are promising, future work needs to consider how long participants can maintain effortful processing, and whether there are negative carry-over effects from cognitive fatigue when transferring to normal vision.
Large-scale and long-term coupled thermo-hydro-mechanic experiments with bentonite: the FEBEX mock-up test.
This article presents, in a summarized and non-exhaustive manner, the large-scale, long-duration experiment carried out within the framework of the FEBEX and FEBEX II projects. A thermo-hydro-mechanical mock-up test was designed to improve understanding of the coupled thermo-hydro-mechanical (THM) processes that occur in the engineered barrier system foreseen in the reference concept for a Deep Geological Repository in granite. The test is performed under controlled conditions. More than 500 sensors measure the most important thermo-hydro-mechanical variables in the bentonite barrier, while the data acquisition and control systems manage and supervise the test and record the data generated. The sensors were chosen to withstand the harsh environmental conditions imposed (mechanical stresses, and high temperature, salinity and humidity). After more than eight and a half years, the test has become an important source of data, both for the number of processes involved and for the duration of the experiment, including some qualitative aspects that appear to indicate important implications of thermal effects on transport processes.