Therapeutic potential of fetal liver cell transplantation in hemophilia A mice
Hemophilia A (HA) cell therapy approaches in pediatric individuals require suitable factor (F)VIII-producing cells for stable engraftment. Liver sinusoidal endothelial cells (LSEC) and hematopoietic stem cells (HSC) have been demonstrated to be suitable for the treatment of adult HA mice. However, after transplantation into busulfan (BU)-conditioned newborn mice, adult LSEC/HSC cannot efficiently engraft, whereas murine fetal liver (FL) hemato/vascular cells from embryonic day 11-13 of gestation (E11-E13) strongly engraft the hematopoietic and endothelial compartments while also secreting FVIII. Our aim was to investigate the engraftment of FL cells in newborn HA mice to obtain a suitable "proof of concept" for the development of a new HA treatment in neonates. Hence, we transplanted FL E11 or E13 cells and adult bone marrow (BM) cells into newborn HA mice with or without BU preconditioning. Engraftment levels and FVIII activity were assessed starting from 6 weeks after transplantation. FL E11-E13+BU-transplanted newborns reached up to 95% engraftment, with stable FVIII activity levels observed for 16 months. FL E13 cells showed engraftment ability even in the absence of BU preconditioning, while FL E11 cells did not. BM+BU-transplanted newborn HA mice showed high levels of engraftment; nevertheless, in contrast to FL cells, BM cells cannot engraft HA newborns without conditioning. Finally, none of the transplanted mice developed anti-FVIII antibodies. Overall, this study sheds light on the therapeutic potential of healthy FL cells for the treatment of neonatal/pediatric HA patients.
Skunk River Review 2008-2009, vol 21
Dataset for the reporting of carcinoma of renal tubular origin:recommendations from the International Collaboration on Cancer Reporting (ICCR)
AIMS The International Collaboration on Cancer Reporting (ICCR) has provided detailed datasets based upon the published reporting protocols of the Royal College of Pathologists, The Royal College of Pathologists of Australasia and the College of American Pathologists.
METHODS AND RESULTS The dataset for carcinomas of renal tubular origin treated by nephrectomy was developed to provide a minimum structured reporting template suitable for international use, and incorporated recommendations from the 2012 Vancouver Consensus Conference of the International Society of Urological Pathology (ISUP) and the fourth edition of the World Health Organization Bluebook on tumours of the urinary and male genital systems, published in 2016. Reporting elements were divided into Required and Recommended components of the report. Required elements are: specimen laterality, operative procedure, attached structures, tumour focality, tumour dimension, tumour type, WHO/ISUP grade, sarcomatoid/rhabdoid morphology, tumour necrosis, extent of invasion, lymph node status, surgical margin status, AJCC TNM staging and co-existing pathology. Recommended elements are: pre-operative treatment, details of tissue removed for experimental purposes prior to submission, site of tumour(s), block identification key, extent of sarcomatoid and/or rhabdoid component, extent of necrosis, presence of tumour in renal vein wall, lymphovascular invasion and lymph node status (size of largest focus and extranodal extension).
CONCLUSIONS It is anticipated that the implementation of this dataset in routine clinical practice will inform patient treatment as well as provide standardized information relating to outcome prediction. The harmonisation of data reporting should also facilitate international research collaborations.
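As a data structure, the Required/Recommended split above maps naturally onto a minimal validation template. A sketch follows; the field names paraphrase the abstract, and this is an illustration, not the official ICCR schema:

```python
# Required elements of the renal tumour report, as listed in the
# abstract (names paraphrased; not the official ICCR field names).
REQUIRED = {
    "specimen_laterality", "operative_procedure", "attached_structures",
    "tumour_focality", "tumour_dimension", "tumour_type", "who_isup_grade",
    "sarcomatoid_rhabdoid_morphology", "tumour_necrosis", "extent_of_invasion",
    "lymph_node_status", "surgical_margin_status", "ajcc_tnm_staging",
    "coexisting_pathology",
}

def missing_required(report: dict) -> set:
    """Return the Required elements absent from a structured report."""
    return REQUIRED - set(report)

# A draft report missing most mandatory fields:
draft = {"specimen_laterality": "left", "tumour_type": "clear cell RCC"}
gaps = missing_required(draft)
```

A reporting system built on such a template can refuse to finalise a report until `gaps` is empty, which is how a "minimum structured reporting template" enforces completeness.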
LSST: from Science Drivers to Reference Design and Anticipated Data Products
(Abridged) We describe here the most ambitious survey currently planned in
the optical, the Large Synoptic Survey Telescope (LSST). A vast array of
science will be enabled by a single wide-deep-fast sky survey, and LSST will
have unique survey capability in the faint time domain. The LSST design is
driven by four main science themes: probing dark energy and dark matter, taking
an inventory of the Solar System, exploring the transient optical sky, and
mapping the Milky Way. LSST will be a wide-field ground-based system sited at
Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m
effective) primary mirror, a 9.6 deg² field of view, and a 3.2 Gigapixel
camera. The standard observing sequence will consist of pairs of 15-second
exposures in a given field, with two such visits in each pointing in a given
night. With these repeats, the LSST system is capable of imaging about 10,000
square degrees of sky in a single filter in three nights. The typical 5σ
point-source depth in a single visit in r will be ∼24.5 (AB). The
project is in the construction phase and will begin regular survey operations
by 2022. The survey area will be contained within 30,000 deg² with δ < +34.5°,
and will be imaged multiple times in six bands, ugrizy,
covering the wavelength range 320–1050 nm. About 90% of the observing time
will be devoted to a deep-wide-fast survey mode which will uniformly observe
an 18,000 deg² region about 800 times (summed over all six bands) during the
anticipated 10 years of operations, and yield a coadded map to r ∼ 27.5. The
remaining 10% of the observing time will be allocated to projects such as a
Very Deep and Fast time domain survey. The goal is to make LSST data products,
including a relational database of about 32 trillion observations of 40 billion
objects, available to the public and scientists around the world.
Comment: 57 pages, 32 color figures, version with high-resolution figures
available from https://www.lsst.org/overvie
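The single-visit and coadded depths are related by the standard point-source stacking rule: N equal-depth visits improve the signal-to-noise by √N, deepening the limiting magnitude by 1.25 log₁₀(N). A minimal sketch, assuming the well-known LSST single-visit depth of r ≈ 24.5 mag and a rough per-band visit count (the abstract only gives ~800 visits summed over all six bands, so the r-band share is an assumption here):

```python
import math

def coadded_depth(single_visit_depth, n_visits):
    """5-sigma point-source depth of a stack of n_visits equal exposures.

    Stacking N visits increases S/N by sqrt(N), i.e. the limiting
    magnitude deepens by 2.5*log10(sqrt(N)) = 1.25*log10(N).
    """
    return single_visit_depth + 1.25 * math.log10(n_visits)

# Assumed values: r ~ 24.5 single-visit depth, and of order 180
# r-band visits out of ~800 total over six bands.
depth = coadded_depth(24.5, 184)
```

With these assumptions the stack reaches r ≈ 27.3, consistent with the deep coadded map the survey is designed to deliver.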
Calibrating an updated SPH scheme within GCD+
We adapt a modern scheme of smoothed particle hydrodynamics (SPH) to our tree
N-body/SPH galactic chemodynamics code GCD+. The applied scheme includes
implementations of the artificial viscosity switch and artificial thermal
conductivity proposed by Morris & Monaghan (1997), Rosswog & Price (2007) and
Price (2008), to model discontinuities and Kelvin-Helmholtz instabilities more
accurately. We first present hydrodynamics test simulations and contrast the
results to runs undertaken without artificial viscosity switch or thermal
conduction. In addition, we also explore the different levels of smoothing by
adopting larger or smaller smoothing lengths, i.e. a larger or smaller number
of neighbour particles, Nnb. We demonstrate that the new version of GCD+ is
capable of modelling Kelvin-Helmholtz instabilities to a similar level as the
mesh code, Athena. From the Gresho vortex, point-like explosion and
self-similar collapse tests, we conclude that setting the smoothing length to
keep the number of neighbour particles as high as Nnb~58 is preferable to
adopting smaller smoothing lengths. We present our optimised parameter sets
from the hydrodynamics tests.
Comment: 14 pages, 2 tables, 15 figures, MNRAS in press
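The neighbour-number criterion above (keeping Nnb ≈ 58) fixes the smoothing length from the local particle density: h is chosen so that the kernel support sphere encloses roughly Nnb particles. A minimal sketch, assuming a kernel with compact support of radius 2h (the exact convention varies between SPH implementations):

```python
import math

def smoothing_length(number_density, n_neighbours=58, support=2.0):
    """Smoothing length h such that a sphere of radius (support * h)
    encloses roughly n_neighbours particles, given the local particle
    number density (particles per unit volume).

    `support` is the kernel's compact-support radius in units of h;
    2.0 is typical for the cubic spline kernel (an assumption here).
    """
    r = (3.0 * n_neighbours / (4.0 * math.pi * number_density)) ** (1.0 / 3.0)
    return r / support

# In a region of unit particle number density:
h = smoothing_length(1.0)
```

Denser regions yield smaller h, so resolution automatically follows the mass distribution; raising `n_neighbours` (as the paper recommends) smooths over a larger sphere at fixed density.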
Chemical evolution of galaxies. I. A composition-dependent SPH model for chemical evolution and cooling
We describe an SPH model for chemical enrichment and radiative cooling in
cosmological simulations of structure formation. This model includes: i) the
delayed gas restitution from stars by means of a probabilistic approach
designed to reduce the statistical noise and, hence, to allow for the study of
the inner chemical structure of objects with moderately high numbers of
particles; ii) the full dependence of metal production on the detailed chemical
composition of stellar particles by using, for the first time in SPH codes, the
Qij matrix formalism that relates each nucleosynthetic product to its sources;
and iii) the full dependence of radiative cooling on the detailed chemical
composition of gas particles, achieved through a fast algorithm using a new
metallicity parameter zeta(T) that gives the weight of each element on the
total cooling function. The resolution effects and the results obtained from
this SPH chemical model have been tested by comparing its predictions in
different problems with known theoretical solutions. We also present some
preliminary results on the chemical properties of elliptical galaxies found in
self-consistent cosmological simulations. Such simulations show that the above
zeta-cooling method is important to prevent an overestimation of the
metallicity-dependent cooling rate, whereas the Qij formalism is important to
prevent a significant underestimation of the [alpha/Fe] ratio in simulated
galaxy-like objects.
Comment: 19 pages, 22 figures, 2 tables; accepted for publication in MNRAS
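The Qij formalism in point (ii) can be sketched as a production matrix acting on a stellar particle's initial composition: each row gives how much of a nucleosynthetic product comes from each source element in the progenitor. The numbers below are toy values, not real yields:

```python
import numpy as np

# Toy 2x2 production matrix Q (rows: ejected products, columns:
# source elements in the progenitor's initial composition).
Q = np.array([[0.7, 0.0],    # product 0 made only from source element 0
              [0.1, 0.9]])   # product 1 made from both sources (toy numbers)

X = np.array([0.75, 0.25])   # initial mass fractions of the source elements
M_ejected = 0.2              # total mass returned by the stellar particle (toy)

# Mass of each product in the ejecta: the matrix ties every product
# to its sources, which is what preserves ratios such as [alpha/Fe].
m_products = M_ejected * Q @ X
```

Because the ejecta depend on the full composition vector X rather than on a single global metallicity, abundance ratios like [alpha/Fe] are not forced toward solar-scaled values, which is the underestimation the abstract warns about.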
Euclid: The importance of galaxy clustering and weak lensing cross-correlations within the photometric Euclid survey
Context. The data from the Euclid mission will enable the measurement of the angular positions and weak lensing shapes of over a billion galaxies,
with their photometric redshifts obtained together with ground-based observations. This large dataset, with well-controlled systematic effects, will
allow for cosmological analyses using the angular clustering of galaxies (GCph) and cosmic shear (WL). For Euclid, these two cosmological probes
will not be independent because they will probe the same volume of the Universe. The cross-correlation (XC) between these probes can tighten
constraints and is therefore important to quantify their impact for Euclid.
Aims. In this study, we therefore extend the recently published Euclid forecasts by carefully quantifying the impact of XC not only on the
final parameter constraints for different cosmological models, but also on the nuisance parameters. In particular, we aim to decipher the amount
of additional information that XC can provide for parameters encoding systematic effects, such as galaxy bias, intrinsic alignments (IAs), and
knowledge of the redshift distributions.
Methods. We follow the Fisher matrix formalism and make use of previously validated codes. We also investigate a different galaxy bias model,
which was obtained from the Flagship simulation, and additional photometric-redshift uncertainties; we also elucidate the impact of including the
XC terms on constraining the latter.
Results. Starting with a baseline model, we show that the XC terms reduce the uncertainties on galaxy bias by ~17% and the uncertainties on IA
by a factor of about four. The XC terms also help in constraining the γ parameter for minimal modified gravity models. Concerning galaxy bias,
we observe that the role of the XC terms on the final parameter constraints is qualitatively the same irrespective of the specific galaxy-bias model
used. For IA, we show that the XC terms can help in distinguishing between different models, and that if IA terms are neglected then this can lead
to significant biases on the cosmological parameters. Finally, we show that the XC terms can lead to a better determination of the mean of the
photometric galaxy distributions.
Conclusions. We find that the XC between GCph and WL within the Euclid survey is necessary to extract the full information content from the data
in future analyses. These terms help in better constraining the cosmological model, and also lead to a better understanding of the systematic effects
that contaminate these probes. Furthermore, we find that XC significantly helps in constraining the mean of the photometric-redshift distributions,
but, at the same time, it requires more precise knowledge of this mean with respect to single probes in order not to degrade the final "figure of
merit".
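Forecasts of this kind rest on the Gaussian Fisher matrix, F_ab = d_aᵀ C⁻¹ d_b, where d_a is the derivative of the data vector with respect to parameter a; marginalised 1σ errors are the square roots of the diagonal of F⁻¹, and adding XC terms enlarges the data vector and hence tightens F. A self-contained toy sketch (derivatives and covariance are illustrative numbers, not Euclid values):

```python
import numpy as np

def fisher_matrix(derivs, cov):
    """Gaussian Fisher matrix F_ab = d_a^T C^-1 d_b, where derivs[a]
    is the derivative of the data vector w.r.t. parameter a."""
    d = np.asarray(derivs)
    return d @ np.linalg.inv(cov) @ d.T

def marginalised_errors(F):
    """1-sigma marginalised errors: sqrt of the diagonal of F^-1."""
    return np.sqrt(np.diag(np.linalg.inv(F)))

# Two parameters, three data points (all numbers illustrative):
derivs = [[1.0, 0.5, 0.2],
          [0.0, 1.0, 0.8]]
cov = np.diag([0.1, 0.1, 0.1])

F = fisher_matrix(derivs, cov)
errors = marginalised_errors(F)
```

Nuisance parameters such as galaxy bias and IA amplitudes enter as extra rows of `derivs`; the marginalised errors then quantify how much the cross-correlations sharpen both cosmological and nuisance constraints.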
Euclid: Constraining ensemble photometric redshift distributions with stacked spectroscopy
Context. The ESA Euclid mission will produce photometric galaxy samples over 15 000 square degrees of the sky that will be rich for clustering and weak lensing statistics. The accuracy of the cosmological constraints derived from these measurements will depend on the knowledge of the underlying redshift distributions based on photometric redshift calibrations.
Aims. A new approach is proposed to use the stacked spectra from Euclid slitless spectroscopy to augment broad-band photometric information to constrain the redshift distribution with spectral energy distribution fitting. The high spectral resolution available in the stacked spectra complements the photometry and helps to break the colour-redshift degeneracy and constrain the redshift distribution of galaxy samples.
Methods. We modelled the stacked spectra as a linear mixture of spectral templates. The mixture may be inverted to infer the underlying redshift distribution using constrained regression algorithms. We demonstrate the method on simulated Vera C. Rubin Observatory and Euclid mock survey data sets based on the Euclid Flagship mock galaxy catalogue. We assess the accuracy of the reconstruction by considering the inference of the baryon acoustic scale from angular two-point correlation function measurements.
Results. We selected mock photometric galaxy samples at redshift z > 1 using the self-organising map algorithm. Considering the idealised case without dust attenuation, we find that the redshift distributions of these samples can be recovered with 0.5% accuracy on the baryon acoustic scale. The estimates are not significantly degraded by the spectroscopic measurement noise due to the large sample size. However, the error degrades to 2% when the dust attenuation model is left free. We find that the colour degeneracies introduced by attenuation limit the accuracy, considering the wavelength coverage of Euclid near-infrared spectroscopy.
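The inversion step described in the Methods, recovering mixture weights from a stacked spectrum under a non-negativity constraint, can be sketched with an off-the-shelf constrained least-squares solver. The templates and weights below are random toy values, not Euclid spectra:

```python
import numpy as np
from scipy.optimize import nnls

# A stacked spectrum modelled as a non-negative linear mixture of
# templates; the recovered weights are proportional to the redshift
# distribution of the stacked sample.
rng = np.random.default_rng(0)
templates = rng.random((50, 4))          # 50 wavelength bins x 4 redshift bins
true_weights = np.array([0.1, 0.4, 0.3, 0.2])
stack = templates @ true_weights         # noiseless stacked spectrum

weights, residual = nnls(templates, stack)   # enforce weights >= 0
weights /= weights.sum()                     # normalise to a distribution
```

In the noiseless, well-conditioned toy case the weights are recovered exactly; with measurement noise and dust-attenuation freedom the system becomes degenerate, which is the accuracy limit the abstract reports.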
Euclid: Forecast constraints on consistency tests of the ΛCDM model
Context. The standard cosmological model is based on the fundamental assumptions of a spatially homogeneous and isotropic universe on large scales. An observational detection of a violation of these assumptions at any redshift would immediately indicate the presence of new physics.
Aims. We quantify the ability of the Euclid mission, together with contemporary surveys, to improve the current sensitivity of null tests of the canonical cosmological constant Λ and cold dark matter (ΛCDM) model in the redshift range 0 < z < 1.8.
Methods. We considered both currently available data and simulated Euclid and external data products based on a ΛCDM fiducial model, an evolving dark energy model assuming the Chevallier-Polarski-Linder parameterization, or an inhomogeneous Lemaître-Tolman-Bondi model with a cosmological constant Λ, and carried out two separate but complementary analyses: a machine learning reconstruction of the null tests based on genetic algorithms, and a theory-agnostic parametric approach based on Taylor expansion and binning of the data, in order to avoid assumptions about any particular model.
Results. We find that in combination with external probes, Euclid can improve current constraints on null tests of ΛCDM by approximately a factor of three when using the machine learning approach, and by a further factor of two in the case of the parametric approach. However, we also find that in certain cases the parametric approach may be biased against, or miss, some features of models far from ΛCDM.
Conclusions. Our analysis highlights the importance of synergies between Euclid and other surveys. These synergies are crucial for providing tighter constraints over an extended redshift range for a plethora of different consistency tests of some of the main assumptions of the current cosmological paradigm.
The PAU Survey & Euclid: Improving broad-band photometric redshifts with multi-task learning
Current and future imaging surveys require photometric redshifts (photo-z) to
be estimated for millions of galaxies. Improving the photo-z quality is a major
challenge to advance our understanding of cosmology. In this paper, we explore
how the synergies between narrow-band photometric data and large imaging
surveys can be exploited to improve broad-band photometric redshifts. We use a
multi-task learning (MTL) network to improve broad-band photo-z estimates by
simultaneously predicting the broad-band photo-z and the narrow-band photometry
from the broad-band photometry. The narrow-band photometry is only required in
the training field, which enables better photo-z predictions also for the
galaxies without narrow-band photometry in the wide field. This technique is
tested with data from the Physics of the Accelerating Universe Survey (PAUS) in
the COSMOS field. We find that the method predicts photo-z that are 14% more
precise down to magnitude i_AB<23, while reducing the outlier rate by 40% with
respect to the baseline network mapping broad-band colours to only photo-zs.
Furthermore, MTL significantly reduces the photo-z bias for high-redshift
galaxies, improving the redshift distributions for tomographic bins with z>1.
Applying this technique to deeper samples is crucial for future surveys like
Euclid or LSST. For simulated data, training on a sample with i_AB<23, the
method reduces the photo-z scatter by 15% for all galaxies with 24<i_AB<25. We
also study the effects of extending the training sample with photometric
galaxies using PAUS high-precision photo-zs, which further reduces the photo-z
scatter.
Comment: 20 pages, 16 figures
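The multi-task setup described above, a shared trunk feeding one head that predicts the photo-z and another that predicts the narrow-band photometry, can be sketched as a toy forward pass. Layer sizes, activations, and the loss weighting are assumptions for illustration, not the paper's architecture:

```python
import numpy as np

rng = np.random.default_rng(1)
n_broad, n_hidden, n_narrow = 6, 16, 40   # assumed sizes

W_trunk = rng.normal(size=(n_broad, n_hidden)) * 0.1   # shared trunk
W_z = rng.normal(size=(n_hidden, 1)) * 0.1             # photo-z head
W_nb = rng.normal(size=(n_hidden, n_narrow)) * 0.1     # narrow-band head

def forward(x):
    """Shared representation feeding both task heads."""
    h = np.tanh(x @ W_trunk)
    return h @ W_z, h @ W_nb    # (photo-z, narrow-band fluxes)

x = rng.normal(size=(100, n_broad))   # a batch of broad-band inputs
z_pred, nb_pred = forward(x)

# Training would minimise a weighted sum of the two task losses,
# e.g. L = L_z + lam * L_nb; the narrow-band task regularises the
# shared trunk even though narrow-band data exist only in the
# training field, which is what improves wide-field photo-zs.
```

At inference time only the broad-band input and the photo-z head are needed, so the method applies to galaxies that were never observed in narrow bands.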