Composition with Target Constraints
It is known that the composition of schema mappings, each specified by
source-to-target tgds (st-tgds), can be specified by a second-order tgd (SO
tgd). We consider the question of what happens when target constraints are
allowed. Specifically, we consider the question of specifying the composition
of standard schema mappings (those specified by st-tgds, target egds, and a
weakly acyclic set of target tgds). We show that SO tgds, even with the
assistance of arbitrary source constraints and target constraints, cannot
specify in general the composition of two standard schema mappings. Therefore,
we introduce source-to-target second-order dependencies (st-SO dependencies),
which are similar to SO tgds, but allow equations in the conclusion. We show
that st-SO dependencies (along with target egds and target tgds) are sufficient
to express the composition of every finite sequence of standard schema
mappings, and further, every st-SO dependency specifies such a composition. In
addition to this expressive power, we show that st-SO dependencies enjoy other
desirable properties. In particular, they have a polynomial-time chase that
generates a universal solution. This universal solution can be used to find the
certain answers to unions of conjunctive queries in polynomial time. It is easy
to show that the composition of an arbitrary number of standard schema mappings
is equivalent to the composition of only two standard schema mappings. We show
that, surprisingly, the analogous result also holds for schema mappings
specified by st-tgds alone (no target constraints). This is proven by showing
that every SO tgd is equivalent to an unnested SO tgd (one where there is no
nesting of function symbols). Similarly, we prove unnesting results for st-SO
dependencies, with the same types of consequences.
Comment: This paper is an extended version of: M. Arenas, R. Fagin, and A. Nash. Composition with Target Constraints. In 13th International Conference on Database Theory (ICDT), pages 129-142, 201
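The polynomial-time chase mentioned above can be illustrated on a single dependency. Below is a minimal, hypothetical sketch (the relation names Emp and Mgr and the tgd itself are invented for illustration, not taken from the paper) of one chase round that materializes a universal solution by introducing a fresh labeled null for each existential variable:

```python
# Chase sketch for the invented st-tgd  Emp(e) -> EXISTS m . Mgr(e, m).
# Each source fact fires the tgd once, and the existential variable m is
# witnessed by a fresh labeled null, yielding a universal solution.
from itertools import count

def chase_emp_to_mgr(source_facts):
    """Chase Emp(e) -> EXISTS m. Mgr(e, m); return the target instance."""
    fresh = count()
    target = set()
    for (e,) in sorted(source_facts):      # deterministic firing order
        null = f"N{next(fresh)}"           # fresh labeled null for m
        target.add(("Mgr", e, null))
    return target

solution = chase_emp_to_mgr({("alice",), ("bob",)})
print(sorted(solution))                    # two Mgr facts, distinct nulls
```

Certain answers to a union of conjunctive queries can then be read off such a universal solution by evaluating the query naively and discarding answer tuples that still contain nulls, which is what makes the polynomial-time chase useful.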
Cosmological parameter determination from Planck and SDSS data in LambdaCHDM cosmologies
We study the complementarity between the cosmological information obtainable
with the Planck surveyor and the large scale structure (LSS) redshift surveys
in LambdaCHDM cosmologies. We compute the initial full phase-space neutrino
distribution function for LambdaCHDM models by using numerical simulations. As
initial condition we adopt the HDM density fluctuation power spectrum
normalized on the basis of the analysis of the local cluster X-ray temperature
function and derive the initial neutrino phase-space distribution at each
spatial wave number k by using the Zel'dovich approximation. These initial
neutrino phase-space distributions are implemented in the CMBFAST code for the
integration of the coupled linearized Einstein, Boltzmann and fluid equations
in k-space. We find that the relative bias between the CMB temperature
fluctuations and the underlying matter density fluctuation power spectrum in
COBE/DMR normalization is given by the CDM component normalized according to
the abundance of rich clusters at the present time. We use the Fisher
information matrix approximation to constrain a multi-dimensional
parametrization of the LambdaCHDM model, by jointly considering CMB and large
scale structure data according to the Planck and the SDSS experimental
specifications and by taking into account redshift distortions and nonlinear
effects on the matter power spectrum. We find that, although the CMB
anisotropy and polarization measurements tend to dominate the constraints on
most of the cosmological parameters, the additional small scale LSS data help
to break the parameter degeneracies. This work has been done in the framework
of the Planck LFI activities.
Comment: 36 pages and 8 figures in AAS LATEX macros v5.0 (submitted to ApJ)
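The Fisher-matrix forecasting step can be sketched in a few lines: the marginalized 1-sigma error on each parameter is the square root of the corresponding diagonal entry of the inverse Fisher matrix, and independent datasets combine by adding their Fisher matrices, which is how extra LSS data can break a CMB degeneracy. The numbers below are invented toy values, not actual Planck or SDSS figures:

```python
# Toy Fisher-matrix forecast: errors = sqrt(diag(F^-1)); independent
# experiments combine by adding their Fisher matrices.
import numpy as np

def marginalized_errors(fisher):
    """Marginalized 1-sigma forecasts: sqrt of diag of the inverse."""
    return np.sqrt(np.diag(np.linalg.inv(fisher)))

# Invented 2-parameter example with a degeneracy (off-diagonal term).
F_cmb = np.array([[400.0, 120.0],
                  [120.0,  50.0]])
F_lss = np.diag([10.0, 40.0])      # toy LSS constraint, mostly on p2

errs_cmb  = marginalized_errors(F_cmb)
errs_both = marginalized_errors(F_cmb + F_lss)
print(errs_cmb, errs_both)         # adding LSS shrinks both errors
```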
Modelling of the effect of scale on the compressibility parameters of fine-grained soils
The effect of sample scale poses a challenge when laboratory-derived engineering parameters are compared with those obtained in the field. This study aimed to contribute to existing knowledge through numerical modelling with the finite element software PLAXIS 2D. The investigations were analysed in terms of height scale (HS) and diameter scale (DS) through a series of laboratory tests. The experiments showed that sample scale strongly influences compressibility parameters such as the coefficient of consolidation (cv), most markedly at DS. The soil behaviour was found to depend on both DS and HS, with correlation factors of 0.650 and 0.062, respectively. The experimental data were validated in PLAXIS, and a new model was developed in PLAXIS 2D under the DS. The proposed model showed no significant difference from the laboratory data.
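For concreteness, the coefficient of consolidation discussed above obeys the standard relation cv = Tv * H_dr^2 / t, which is why drainage height (and hence sample scale) enters directly. The sketch below uses invented sample dimensions and times, not the study's data:

```python
# cv = Tv * H_dr**2 / t  (standard consolidation relation).
# If cv is a true material constant, doubling the drainage path should
# quadruple the time to reach the same degree of consolidation.

def coefficient_of_consolidation(t_v, h_dr_m, t_s):
    """Coefficient of consolidation in m^2/s."""
    return t_v * h_dr_m ** 2 / t_s

T90 = 0.848                                   # time factor at 90% consolidation
cv_small = coefficient_of_consolidation(T90, 0.01, 3600.0)
cv_large = coefficient_of_consolidation(T90, 0.02, 4 * 3600.0)
print(cv_small, cv_large)                     # equal if no scale effect
```

A measured departure from this equality between sample sizes is exactly the kind of scale effect the study quantifies.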
Nanoscale precipitation coating: the deposition of inorganic films through step-by-step spray-assembly.
Thin films and surface coatings play an important role in basic and applied research. Here we report on a new, versatile, and simple method ("precipitation coating") for the preparation of inorganic films, based on the alternate spraying of complementary inorganic salt solutions against a receiving surface on which the inorganic deposit forms. The method applies whenever the solubility of the deposited material is smaller than that of the salts in the solutions of the reactants. The film thickness is controlled from nanometers to hundreds of micrometers simply by varying the number of spraying steps; 200 spray cycles, corresponding to less than 15 min deposition time, yield films with thicknesses exceeding one micrometer and reaching tens of micrometers in some cases. The new solution-based process is also compatible with conventional layer-by-layer assembly and permits the fabrication of multimaterial sandwich-like coatings.
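The quoted figures imply a simple per-cycle growth rate, which can be checked with back-of-the-envelope arithmetic using the one-micrometer, 200-cycle case from the abstract:

```python
# Per-cycle film growth implied by 200 spray cycles -> >= 1 micrometer.
def per_cycle_growth_nm(thickness_um, cycles):
    """Average film growth per spray cycle, in nanometers."""
    return thickness_um * 1000.0 / cycles

growth_nm = per_cycle_growth_nm(1.0, 200)   # -> 5.0 nm per cycle
rate_um_per_min = 1.0 / 15.0                # >= ~0.067 um/min overall
print(growth_nm, rate_um_per_min)
```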
A systematic approach to the Planck LFI end-to-end test and its application to the DPC Level 1 pipeline
The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the
handling of the scientific and housekeeping telemetry. It is a critical
component of the Planck ground segment, which must adhere strictly to the
project schedule to be ready for launch and flight operations. In order to
guarantee the quality necessary to achieve the objectives of the Planck
mission, the design and development of the Level 1 software has followed the
ESA Software Engineering Standards. A fundamental step in the software life
cycle is the Verification and Validation of the software. The purpose of this
work is to show an example of procedures, test development and analysis
successfully applied to a key software project of an ESA mission. We present
the end-to-end validation tests performed on the Level 1 of the LFI-DPC, by
detailing the methods used and the results obtained. Different approaches have
been used to test the scientific and housekeeping data processing. Scientific
data processing has been tested by injecting signals with known properties
directly into the acquisition electronics, in order to generate a test dataset
of real telemetry data and reproduce as much as possible nominal conditions.
For the HK telemetry processing, validation software has been developed to
inject known parameter values into a set of real housekeeping packets and
perform a comparison with the corresponding timelines generated by the Level 1.
With the proposed validation and verification procedure, where the on-board and
ground processing are viewed as a single pipeline, we demonstrated that the
scientific and housekeeping processing of the Planck-LFI raw data is correct
and meets the project requirements.
Comment: 20 pages, 7 figures; this paper is part of the Prelaunch status LFI
papers published on JINST:
http://www.iop.org/EJ/journal/-page=extra.proc5/jins
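The housekeeping validation idea above (inject known parameter values, run them through the processing, and compare the resulting timeline with what was injected) can be sketched as follows. The packet layout and the linear calibration are invented for illustration and are not the actual LFI formats:

```python
# Sketch of the HK validation loop: encode known raw values into
# packets, decode them into engineering units, and compare the decoded
# timeline with the independently computed expectation.
import struct

def encode_packet(raw_value):
    return struct.pack(">H", raw_value)          # 16-bit big-endian sample

def decode_packet(packet, gain=0.5, offset=-10.0):
    (raw,) = struct.unpack(">H", packet)
    return gain * raw + offset                   # invented linear calibration

injected = [20, 40, 60]
expected = [0.5 * r - 10.0 for r in injected]
decoded  = [decode_packet(encode_packet(r)) for r in injected]
print(decoded == expected)                       # True: timelines agree
```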
WMAP 5-year constraints on lepton asymmetry and radiation energy density: Implications for Planck
In this paper we set bounds on the radiation content of the Universe and
neutrino properties by using the WMAP-5 year CMB measurements complemented with
most of the existing CMB and LSS data (WMAP5+All), also imposing self-consistent
BBN constraints on the primordial helium abundance. We consider lepton
asymmetric cosmological models parametrized by the neutrino degeneracy
parameter and the variation of the relativistic degrees of freedom, due to
other possible physical processes that occurred between the BBN and structure formation
epochs. We find that the WMAP5+All data provide strong bounds on the helium mass
fraction and the neutrino degeneracy parameter that rival the similar bounds
obtained from the conservative analysis of the present data on helium
abundance. We also find a strong correlation between the matter energy density
and the redshift of matter-radiation equality, z_re, showing that we observe
non-zero equivalent number of relativistic neutrinos mainly via the change of
z_re, rather than via the neutrino anisotropic stress claimed by the WMAP
team. We forecast that the CMB temperature and polarization measurements
observed with high angular resolutions and sensitivities by the future Planck
satellite will reduce the errors on these parameters down to values fully
consistent with the BBN bounds.
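The link between the degeneracy parameter and the radiation content can be made explicit with the standard relation for the extra effective number of relativistic species contributed per degenerate neutrino flavour, dN_eff = (30/7)(xi/pi)^2 + (15/7)(xi/pi)^4; the sample value below is illustrative:

```python
# Extra effective relativistic species per neutrino flavour with
# degeneracy parameter xi:  dN = (30/7)(xi/pi)^2 + (15/7)(xi/pi)^4.
import math

def delta_n_eff(xi):
    x = xi / math.pi
    return (30.0 / 7.0) * x**2 + (15.0 / 7.0) * x**4

print(delta_n_eff(0.5))    # xi = 0.5 contributes about 0.11 extra species
```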
Spray-on organic/inorganic films: a general method for the formation of functional nano- to microscale coatings.