Dark matter halo properties from galaxy-galaxy lensing
We present results for a galaxy-galaxy lensing study based on imaging data
from the Canada-France-Hawaii Telescope Legacy Survey Wide. From a 12 million
object multi-colour catalogue for 124 deg^2 of photometric data in the
u*g'r'i'z' filters we compute photometric redshifts (with a scatter of
\sigma_{\Delta z/(1+z)} = 0.033 and an outlier rate of \eta=2.0 per cent for
i'<=22.5) and extract galaxy shapes down to i'=24.0. We select a sample of
lenses and sources with 0.05 < z_d <= 1 and 0.05 < z_s <= 2. We fit three
different galaxy halo profiles to the lensing signal: a singular isothermal
sphere (SIS), a truncated isothermal sphere (BBS) and a universal density
profile (NFW). We derive velocity dispersions by fitting an SIS out to 100
h^{-1} kpc to the excess surface mass density \Delta\Sigma and perform maximum
likelihood analyses out to a maximum scale of 2 h^{-1} Mpc to obtain halo
parameters and scaling relations. We find luminosity scaling relations of
\sigma_{red} ~ L^{0.24+-0.03} for the red lens sample, \sigma_{blue} ~
L^{0.23+-0.03} for blue lenses and \sigma ~ L^{0.29+-0.02} for the combined
lens sample with zeropoints of \sigma*_{red}=162+-2 km/s, \sigma*_{blue}=115+-3
km/s and \sigma*=135+-2 km/s at a chosen reference luminosity L*_{r'} = 1.6
\times 10^{10} h^{-2} L_{r',sun}. The steeper slope for the combined sample is
due to the different zeropoints of the blue and red lenses and the fact that
blue lenses dominate at low luminosities and red lenses at high luminosities.
The mean effective redshifts for the lens samples are \langle z_d \rangle = 0.28 for red
lenses, \langle z_d \rangle = 0.35 for blue lenses and \langle z_d \rangle = 0.34 for the combined lens
sample.
Comment: 62 pages, 55 figures, accepted for publication in MNRAS, abridged
abstract, includes corrections from final proof. Our created catalogues
(photometry, photometric redshifts and shears) are publicly available at
http://www.usm.uni-muenchen.de/people/stella/GGL
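The quoted power-law scaling relations are straightforward to evaluate. A minimal sketch (not the authors' code; the function name and defaults are illustrative) of the red-sample relation \sigma ~ L^{0.24} with its zeropoint \sigma*_{red} = 162 km/s at L*_{r'}:

```python
# Sketch (not the paper's code): the luminosity scaling relation
# sigma(L) = sigma_* * (L / L_*)^alpha quoted in the abstract.
# The function name and default L_star are illustrative assumptions.

def velocity_dispersion(L, sigma_star, alpha, L_star=1.6e10):
    """Velocity dispersion in km/s; L and L_star in h^-2 L_sun (r' band)."""
    return sigma_star * (L / L_star) ** alpha

# Red lens sample: sigma_* = 162 km/s, slope 0.24.
# At the reference luminosity L = L_* the relation returns the zeropoint.
sigma_red = velocity_dispersion(1.6e10, 162.0, 0.24)  # -> 162.0
```

At twice the reference luminosity the same relation gives 162 * 2^{0.24}, roughly 191 km/s, illustrating how weakly the dispersion grows with luminosity.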
Astro-WISE: Chaining to the Universe
The recent explosion of recorded digital data and its processed derivatives
threatens to overwhelm researchers when analysing their experimental data or
when looking up data items in archives and file systems. While current hardware
developments make it possible to acquire, process and store hundreds of terabytes of data at
the cost of a modern sports car, the software systems to handle these data are
lagging behind. This general problem is recognized and addressed by various
scientific communities: e.g., DATAGRID/EGEE federates compute and storage power
across the high-energy physics community, while the astronomical community is
building an Internet-geared Virtual Observatory, connecting archival data.
These large projects either focus on a specific distribution aspect or aim to
connect many sub-communities and have a relatively long trajectory for setting
standards and a common layer. Here, we report "first light" of a very different
solution to the problem initiated by a smaller astronomical IT community. It
provides the abstract "scientific information layer" which integrates
distributed scientific analysis with distributed processing and federated
archiving and publishing. By designing new abstractions and mixing in old ones,
a Science Information System with fully scalable cornerstones has been
achieved, transforming data systems into knowledge systems. This breakthrough
is facilitated by the full end-to-end linking of all dependent data items,
which allows full backward chaining from the observer/researcher to the
experiment. Key is the notion that information is intrinsic in nature, and thus
so is the data acquired by a scientific experiment. The new abstraction is that
software systems guide the user to that intrinsic information by forcing full
backward and forward chaining in the data modelling.
Comment: To be published in ADASS XVI ASP Conference Series, 2006, R. Shaw, F.
Hill and D. Bell, eds.
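The end-to-end linking and backward chaining the abstract describes can be pictured as a dependency graph over data items. This is a hypothetical toy sketch, not the Astro-WISE API; the class and method names are invented:

```python
# Toy sketch (NOT the Astro-WISE API): each data item records its
# dependencies, so backward chaining from a result to the raw
# experiment is a simple graph walk; forward links are kept as well.

class DataItem:
    def __init__(self, name, depends_on=()):
        self.name = name
        self.derived = []                    # forward links, filled in by children
        self.depends_on = list(depends_on)   # backward links to inputs
        for dep in self.depends_on:
            dep.derived.append(self)

    def backward_chain(self):
        """All ancestors of this item, ordered from the raw data forward."""
        chain = []
        for dep in self.depends_on:
            for item in dep.backward_chain() + [dep]:
                if item not in chain:
                    chain.append(item)
        return chain

# A minimal processing chain: raw exposure -> calibrated frame -> catalogue.
raw = DataItem("raw_exposure")
cal = DataItem("calibrated_frame", depends_on=[raw])
cat = DataItem("source_catalogue", depends_on=[cal])

lineage = [d.name for d in cat.backward_chain()]
# -> ['raw_exposure', 'calibrated_frame']
```

Because every item stores both its inputs and its derived products, the same structure supports forward chaining (which products must be regenerated when an input changes), which is the property the abstract credits with turning a data system into a knowledge system.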