Age, Depth, and Residual Depth Anomalies in the North Pacific: Implications for Thermal Models of the Lithosphere and Upper Mantle
We present an empirical basement depth versus age relation for the North Pacific Ocean, based on the statistical treatment of an ocean-wide gridded data set. The SYNBAPS bathymetry was averaged into half-degree intervals and corrected for the effects of sediment loading. The resulting basement depths were plotted against ages determined from a revised isochron chart based on a recent compilation of magnetic lineations and various published plate reconstructions. On crust older than 80 Ma, the depths are skewed to the shallow side of the depth versus age distribution by large numbers of seamounts; the mean and standard deviation are therefore not useful representations of the data. A more appropriate representation is the mode (the greatest concentration of points) and contours around it. These contours show that most ocean floor increases in depth with the square root of age out to crust of 80 Ma. Beyond this, the majority of the data oscillate about a line that remains essentially constant as age increases. Approximately 56% of all the data points lie within a ±300 m band about the mode. If the sediment thickness data in the older basins of the western North Pacific are correct, then the flattening of the depths favors a model in which extra heat is supplied to the base of the lithosphere beneath older ocean floor. Residual depth anomalies were calculated by removing the depths predicted by such a model. These anomalies correlate with bathymetric features and occur predominantly on crust of 120 and 160 Ma; they account for the rises in the mode at these two ages. The overall subsidence of the ocean floor can be accounted for by the cooling of a thermo-mechanical boundary layer. Correlations between geoid height and depth are evidence that many of the residual depth anomalies result from convective plumes which reset the thermal structure of the lithosphere.
It is possible that this process, observed at different times after the initial resetting of the isotherms, may account for many of the depth anomalies in the western North Pacific.
Institute for Geophysics
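The subsidence behaviour described above can be sketched numerically. This is an illustrative contrast between pure square-root-of-age subsidence and a model that flattens on old seafloor, not the authors' fitted North Pacific relation; the constants are round numbers of the kind used in classic plate-cooling fits and are assumptions here.

```python
import math

# Assumed illustrative constants (not the paper's fitted values):
RIDGE_DEPTH_M = 2500.0      # depth at zero age
SUBSIDENCE_COEFF = 350.0    # m per sqrt(Myr) for young crust
ASYMPTOTIC_DEPTH_M = 6400.0 # equilibrium depth of old seafloor
DECAY_MYR = 62.8            # exponential flattening scale
TRANSITION_MYR = 70.0       # age where flattening takes over

def halfspace_depth(age_myr):
    """Half-space cooling: depth deepens with sqrt(age) indefinitely."""
    return RIDGE_DEPTH_M + SUBSIDENCE_COEFF * math.sqrt(age_myr)

def plate_depth(age_myr):
    """Plate-style model: follows sqrt(age) on young crust, then
    flattens toward an asymptote, as if extra heat were supplied to
    the base of the lithosphere beneath older ocean floor."""
    if age_myr < TRANSITION_MYR:
        return halfspace_depth(age_myr)
    return ASYMPTOTIC_DEPTH_M - (ASYMPTOTIC_DEPTH_M - RIDGE_DEPTH_M) \
        * math.exp(-age_myr / DECAY_MYR) * (ASYMPTOTIC_DEPTH_M - RIDGE_DEPTH_M) ** 0 \
        * (3200.0 / (ASYMPTOTIC_DEPTH_M - RIDGE_DEPTH_M))

# A residual depth anomaly is simply observed minus predicted depth:
def residual_anomaly(observed_depth_m, age_myr):
    return observed_depth_m - plate_depth(age_myr)

for age in (20, 80, 120, 160):
    print(age, round(halfspace_depth(age)), round(plate_depth(age)))
```

The key qualitative point is that beyond the transition age the two curves diverge: the half-space depth keeps growing, while the plate-style depth changes only slightly between 120 and 160 Ma, matching the near-constant mode reported in the abstract.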
Using Provenance to support Good Laboratory Practice in Grid Environments
Conducting experiments and documenting results is the daily business of scientists. Good, traceable documentation enables other scientists to confirm procedures and results, increasing credibility. Documentation and scientific conduct are regulated and termed "good laboratory practice". Laboratory notebooks are used to record each step in conducting an experiment and processing data. Originally, these notebooks were paper based. With computerised research systems, acquired data have become more elaborate, increasing the need for electronic notebooks with data storage, computational features and reliable electronic documentation. As a new approach to this, a scientific data management system (DataFinder) is enhanced with features for traceable documentation. Provenance recording is used to meet the requirements of traceability, and this information can later be queried for further analysis. DataFinder has further important features for scientific documentation: it employs a heterogeneous and distributed data storage concept, which enables access to different types of data storage systems (e.g. Grid data infrastructure, file servers). In this chapter we describe a number of building blocks that are available or close to finished development. These components are intended for assembling an electronic laboratory notebook for use in Grid environments, while retaining maximal flexibility in usage scenarios as well as maximal compatibility with each other. Through the use of such a system, provenance can successfully be used to trace the scientific workflow of preparation, execution, evaluation, interpretation and archiving of research data. The reliability of research results increases and the research process remains transparent to remote research partners.
Comment: Book chapter for "Data Provenance and Data Management for eScience," Studies in Computational Intelligence series, Springer. 25 pages, 8 figures.
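The record-then-query pattern the abstract describes can be sketched minimally. This is a generic provenance log, not the DataFinder API; the stage names follow the workflow phases listed in the abstract, while the record fields and class name are assumptions for illustration.

```python
import datetime

class ProvenanceLog:
    """Illustrative provenance recorder (not DataFinder's actual API):
    each workflow step is stored as a timestamped entry that can later
    be queried for traceability analysis."""

    # Workflow phases named in the abstract:
    STAGES = ("preparation", "execution", "evaluation",
              "interpretation", "archiving")

    def __init__(self):
        self._entries = []

    def record(self, stage, actor, artifact, note=""):
        if stage not in self.STAGES:
            raise ValueError(f"unknown stage: {stage}")
        self._entries.append({
            "stage": stage,
            "actor": actor,        # person or compute resource
            "artifact": artifact,  # data item the step touched
            "note": note,
            "timestamp": datetime.datetime.now(
                datetime.timezone.utc).isoformat(),
        })

    def query(self, stage=None, artifact=None):
        """Later analysis: filter the trace by stage and/or artifact."""
        return [e for e in self._entries
                if (stage is None or e["stage"] == stage)
                and (artifact is None or e["artifact"] == artifact)]

# Hypothetical run of an experiment on a Grid node:
log = ProvenanceLog()
log.record("preparation", "alice", "run-042/config.yaml", "set parameters")
log.record("execution", "grid-node-7", "run-042/raw.dat", "acquired data")
log.record("evaluation", "alice", "run-042/raw.dat", "computed statistics")
print(len(log.query(artifact="run-042/raw.dat")))
```

Querying by artifact reconstructs the full history of one data item across storage systems, which is exactly what makes the workflow transparent to remote research partners.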
Book Review: Common Ground: A Turbulent Decade in the Lives of Three American Families. By J. Anthony Lukas.
Book review: Common Ground: A Turbulent Decade in the Lives of Three American Families. By J. Anthony Lukas. New York: Alfred A. Knopf. 1985. Pp. 659. Reviewed by: Miriam K Feldman
Detection of gravitational-wave bursts with chirplet-like template families
Gravitational Wave (GW) burst detection algorithms typically rely on the hypothesis that the burst signal is "locally stationary", that is, its frequency evolves slowly with time. Under this assumption, the signal can be decomposed into a small number of wavelets with constant frequency. This justifies the use of a family of sine-Gaussian templates in the Omega pipeline, one of the algorithms used in LIGO-Virgo burst searches. However, there are plausible scenarios where the burst frequency evolves rapidly, such as in the merger phase of a binary black hole and/or neutron star coalescence. In those cases, the local stationarity of sine-Gaussians induces performance losses due to the mismatch between the template and the actual signal. We propose an extension of the Omega pipeline based on chirplet-like templates. Chirplets incorporate an additional parameter, the chirp rate, to control the frequency variation. In this paper, we show that the Omega pipeline can easily be extended to include a chirplet template bank. We illustrate the method on a simulated data set, with a family of phenomenological binary black-hole coalescence waveforms embedded in Gaussian LIGO/Virgo-like noise. Chirplet-like templates result in an enhancement of the measured signal-to-noise ratio.
Comment: 8 pages, 6 figures. Submitted to Class. Quantum Grav. Special issue: Proceedings of GWDAW-14, Rome (Italy), 2010; fixed several minor issues.
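The chirplet idea can be written down compactly: a sine-Gaussian whose phase gains a quadratic term, so the instantaneous frequency drifts linearly as f(t) = f0 + d·t. The sketch below is a generic chirplet waveform under that definition; the sample rate and parameter values are illustrative assumptions, not the Omega pipeline's template bank.

```python
import math

def chirplet(t, f0, q, d, phase=0.0):
    """Chirplet sample at time t (seconds): a Gaussian envelope of
    quality factor q around central frequency f0 (Hz), with a linear
    frequency drift d (Hz/s). Setting d = 0 recovers the ordinary
    sine-Gaussian template."""
    tau = q / (2.0 * math.pi * f0)           # envelope width from Q
    envelope = math.exp(-t * t / (2.0 * tau * tau))
    # Quadratic phase term => instantaneous frequency f0 + d*t:
    phi = 2.0 * math.pi * (f0 * t + 0.5 * d * t * t) + phase
    return envelope * math.cos(phi)

# Hypothetical parameters: a 150 Hz, Q = 9 template sweeping upward at
# 2000 Hz/s, sampled at 4096 Hz around its peak. A positive chirp rate
# tracks a merger-like upward frequency sweep that a fixed-frequency
# sine-Gaussian would mismatch.
samples = [chirplet(t=i / 4096.0, f0=150.0, q=9.0, d=2000.0)
           for i in range(-512, 512)]
print(max(samples))
```

Because the chirp rate enters only through the phase, extending a sine-Gaussian bank to chirplets just adds one more parameter axis to tile, which is why the abstract calls the extension easy.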
The effects of work values and job characteristics on job satisfaction
"Several approaches relate job satisfaction to work values and job characteristics. Quinn & Mangione (1973) used work values to weight domain-specific satisfaction ratings to find out that importance weighting rather reduces the explanatory power of domain-specific satisfaction ratings with regard to some outcome variables, such as overall job satisfaction. Kalleberg (1977) analyzed the effect of both types of variables in their own right. Using US data he concluded that while job characteristics had strong positive relationships with overall job satisfaction, the effect of work values was negative. Borg (1991) found that work values and the evaluation of job characteristics were not independent from each other. Coping strategies can account for linear or v-shaped relationships for different aspects. The present paper replicates selected analyses of previous studies using the International Social Survey Program 1997 study on 'Work Orientations', which includes similar indicators. The study is based on representative samples of fulltime employed respondents in a broad variety of national contexts." (author's abstract