
    ON THE DEVELOPMENT OF A DATASET PUBLICATION GUIDELINE: DATA REPOSITORIES AND KEYWORD ANALYSIS IN ISPRS DOMAIN

    The FAIR principles (findable, accessible, interoperable, reusable) form a sustainable basis for scientific exchange between researchers, and their implementation is an important step for future research projects. To support this process in the ISPRS community, publishing datasets in data repositories can bring the community closer to meeting the FAIR principles. Therefore, we (1) analysed available data repositories, (2) identified common keywords in ISPRS publications and (3) developed a tool for searching for appropriate repositories. In this way, existing infrastructures from the field of geosciences become more accessible.
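
    A minimal sketch of the keyword-matching idea behind such a repository search tool: count keyword frequencies in publication metadata and rank candidate repositories by overlap with their subject tags. All names and values below are hypothetical illustrations, not the authors' implementation.

```python
from collections import Counter

# Hypothetical inputs: keywords harvested from publication metadata and
# subject tags of candidate repositories (illustrative values only).
publication_keywords = ["point cloud", "lidar", "classification",
                        "point cloud", "photogrammetry", "lidar"]
repositories = {
    "PANGAEA": {"geosciences", "earth observation", "lidar"},
    "Zenodo": {"multidisciplinary", "classification"},
}

freq = Counter(publication_keywords)

def score(tags):
    """Sum the frequencies of all publication keywords that match a tag set."""
    return sum(count for keyword, count in freq.items() if keyword in tags)

# Rank repositories by keyword overlap, best match first.
for name, tags in sorted(repositories.items(), key=lambda kv: score(kv[1]),
                         reverse=True):
    print(name, score(tags))
```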

    CURRENT STATUS OF THE BENCHMARK DATABASE BEMEDA

    Open science is an important attribute of developing new approaches, and the data component plays an especially significant role. The FAIR principles provide a good orientation towards open data. One part of FAIR is findability; thus, domain-specific dataset search platforms have been developed, among them the Earth Observation Database and our Benchmark Metadata Database (BeMeDa). Beyond the search itself, the datasets found by these platforms can be compared with each other with regard to their interoperability. We compare the two platforms and present an update of BeMeDa that adds location information about the datasets and a new frontend design with improved usability. We rely on user feedback for further improvements and enhancements.

    On SAT representations of XOR constraints

    We study the representation of systems S of linear equations over the two-element field (also known as XOR or parity constraints) via conjunctive normal forms F (boolean clause-sets). First we consider the problem of finding an "arc-consistent" representation ("AC"), meaning that unit-clause propagation will fix all forced assignments for all possible instantiations of the XOR variables. Our main negative result is that there is no polysize AC-representation in general. On the positive side we show that finding such an AC-representation is fixed-parameter tractable (fpt) in the number of equations. Then we turn to a stronger criterion of representation, namely propagation completeness ("PC"): while AC only covers the variables of S, PC considers all the variables in F (the variables in S plus auxiliary variables). We show that the standard translation actually yields a PC representation for one equation, but fails to do so for two equations (in fact arbitrarily badly). We show that with a more intelligent translation we can also easily compute a translation to PC for two equations. We conjecture that computing a representation in PC is fpt in the number of equations. Comment: 39 pages; 2nd v. improved handling of acyclic systems, free-standing proof of the transformation from AC-representations to monotone circuits, improved wording and literature review; 3rd v. updated literature, strengthened treatment of monotonisation, improved discussions; 4th v. update of literature, discussions and formulations, more details and examples; conference v. to appear in LATA 2014
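
    For context, the direct (auxiliary-variable-free) CNF encoding of a single XOR constraint splits it into clauses that each forbid one assignment of the wrong parity. A minimal sketch of that textbook encoding (not the paper's code):

```python
from itertools import product

def xor_to_cnf(variables, parity):
    """Direct CNF encoding of x1 ^ ... ^ xn = parity (no auxiliary variables).
    Variables are positive ints (DIMACS style); returns a list of clauses.
    Each clause rules out one wrong-parity assignment, so there are
    2**(n-1) clauses -- the exponential growth that motivates smarter
    translations with auxiliary variables."""
    clauses = []
    for signs in product([1, -1], repeat=len(variables)):
        # A clause with k negated literals excludes exactly the assignment
        # that sets those k variables to true; keep clauses that exclude
        # assignments whose parity disagrees with the constraint.
        if signs.count(-1) % 2 == (1 - parity) % 2:
            clauses.append([s * v for s, v in zip(signs, variables)])
    return clauses

# x1 XOR x2 XOR x3 = 1 yields 4 clauses over 3 variables.
print(xor_to_cnf([1, 2, 3], 1))
```

    For longer constraints one normally chains short XORs through fresh auxiliary variables instead; the abstract's results concern exactly when such translations preserve propagation strength.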

    Extensive study of nuclear uncertainties and their impact on the r-process nucleosynthesis in neutron star mergers

    Theoretically predicted yields of elements created by the rapid neutron-capture (r-)process carry potentially large uncertainties, associated with incomplete knowledge of nuclear properties as well as with approximate hydrodynamical modelling of the matter ejection processes. We present an in-depth study of the nuclear uncertainties by systematically varying theoretical nuclear input models that describe the experimentally unknown neutron-rich nuclei. This includes two frameworks for calculating the radiative neutron capture rates and six, four and four models for the nuclear masses, β-decay rates and fission properties, respectively. Our r-process nuclear network calculations are based on detailed hydrodynamical simulations of dynamically ejected material from NS-NS or NS-BH binary mergers plus the secular ejecta from BH-torus systems. The impact of nuclear uncertainties on the r-process abundance distribution and the early radioactive heating rate is found to be modest (within a factor of ∼20 for individual A > 90 nuclei and a factor of 2 for the heating rate); however, the impact on the late-time heating rate is more significant and depends strongly on the contribution from fission. We witness significantly larger sensitivity to the nuclear physics input if only a single trajectory is used instead of ensembles of ∼200-300 trajectories, and the quantitative effects of the nuclear uncertainties depend strongly on the adopted conditions for the individual trajectory. We use the predicted Th/U ratio to estimate the cosmochronometric age of six metal-poor stars, setting a lower limit on the age of the Galaxy, and find the impact of the nuclear uncertainties to be up to 2 Gyr. Comment: 26 pages, 22 figures, submitted to MNRAS
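
    The Th/U cosmochronometry mentioned at the end rests on comparing an observed abundance ratio with a predicted production ratio and letting radioactive decay set the clock. A hedged sketch of that arithmetic (the half-lives are standard values; the input ratios are made-up placeholders, not results from the paper):

```python
import math

# Standard half-lives (Gyr) of the two long-lived r-process chronometers.
T_HALF_TH232 = 14.05
T_HALF_U238 = 4.468

def thu_age(log_thu_observed, log_thu_initial):
    """Age from the decay of an initial Th/U ratio to the observed one:
    (Th/U)_obs = (Th/U)_0 * exp((lam_U - lam_Th) * t), solved for t."""
    lam_th = math.log(2) / T_HALF_TH232
    lam_u = math.log(2) / T_HALF_U238
    # Convert the log10 ratio difference back to natural log.
    return (log_thu_observed - log_thu_initial) * math.log(10) / (lam_u - lam_th)

# Placeholder numbers for illustration only.
print(f"{thu_age(log_thu_observed=0.20, log_thu_initial=-0.30):.1f} Gyr")
```

    With this coefficient (ln 10 divided by the decay-constant difference, about 21.8 Gyr per dex), a 0.1 dex uncertainty in the predicted production ratio already maps to roughly 2 Gyr of age uncertainty, the same scale as the nuclear-uncertainty impact quoted above.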

    Correlation based networks of equity returns sampled at different time horizons

    We investigate the planar maximally filtered graphs of a portfolio of the 300 most capitalized stocks traded on the New York Stock Exchange during the period 2001-2003. Topological properties such as the average shortest-path length, the betweenness and the degree are computed on different planar maximally filtered graphs generated by sampling the returns at time horizons ranging from 5 min up to one trading day. This analysis confirms that the selected stocks form a hierarchical system that becomes progressively more structured as the sampling time horizon increases. Finally, cluster formation, associated with economic sectors, is quantitatively investigated. Comment: 9 pages, 8 figures
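
    A planar maximally filtered graph (PMFG) can be built greedily: sort stock pairs by correlation and keep an edge only if the graph stays planar. A minimal sketch with networkx (illustrative, not the authors' code; the random data stands in for the sampled return series):

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
returns = rng.standard_normal((500, 20))   # placeholder: 500 returns, 20 stocks
corr = np.corrcoef(returns.T)

# Candidate edges sorted by decreasing correlation.
n = corr.shape[0]
edges = sorted(((corr[i, j], i, j) for i in range(n) for j in range(i + 1, n)),
               reverse=True)

# Greedy PMFG: insert each edge, drop it again if planarity is violated.
G = nx.Graph()
G.add_nodes_from(range(n))
for weight, i, j in edges:
    G.add_edge(i, j, weight=weight)
    planar, _ = nx.check_planarity(G)
    if not planar:
        G.remove_edge(i, j)
    if G.number_of_edges() == 3 * (n - 2):   # planar graphs max out here
        break

print(G.number_of_edges())   # 3(n-2) = 54 edges for n = 20
```

    A real analysis would replace the random matrix by the empirical correlation matrix of returns sampled at each time horizon, then compare the resulting graphs' topological properties.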

    LISA Metrology System - Final Report

    Gravitational waves will open an entirely new window on the Universe, different from all other astronomy in that gravitational waves reveal large-scale mass motions even in regions, and at distances, totally obscured to electromagnetic radiation. The most interesting sources lie at low frequencies (mHz to Hz) that are inaccessible on the ground due to seismic and other unavoidable disturbances. For these sources, observation from space is the only option, and it has been studied in detail for more than 20 years as the LISA concept. Consequently, The Gravitational Universe has been chosen as the science theme for the L3 mission in ESA's Cosmic Vision program. The primary measurement in LISA and derived concepts is the observation of tiny (picometer) pathlength fluctuations between remote spacecraft using heterodyne laser interferometry. The interference of two laser beams with a MHz frequency difference produces a MHz beat note that is converted into a photocurrent by a photodiode on the optical bench. The gravitational wave signal is encoded in the phase of this beat note. The next, and crucial, step is therefore to measure that phase with µcycle resolution in the presence of noise and other signals. This measurement is the purpose of the LISA metrology system and the subject of this report.
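
    The phase-readout step can be illustrated with a single-bin IQ demodulation: multiply the digitized beat note by quadrature references at the known heterodyne frequency and take the arctangent. A toy sketch (all signal parameters are invented for illustration; a real phasemeter uses a tracking phase-locked loop, not this open-loop form):

```python
import numpy as np

FS = 80e6            # sampling rate (Hz), illustrative
F_HET = 10e6         # heterodyne beat frequency (Hz), illustrative
PHASE_TRUE = 0.123   # beat-note phase to recover (rad)

rng = np.random.default_rng(0)
t = np.arange(100_000) / FS
beat = (np.sin(2 * np.pi * F_HET * t + PHASE_TRUE)
        + 0.01 * rng.standard_normal(t.size))      # beat note plus noise

# Single-bin IQ demodulation: project onto quadrature references at the
# known beat frequency, average (a crude low-pass), read off the phase.
i_arm = np.mean(beat * np.sin(2 * np.pi * F_HET * t))
q_arm = np.mean(beat * np.cos(2 * np.pi * F_HET * t))
print(np.arctan2(q_arm, i_arm), PHASE_TRUE)        # agree to ~1e-4 rad here
```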

    Financial correlations at ultra-high frequency: theoretical models and empirical estimation

    A detailed analysis of the correlation between stock returns at high frequency is compared with simple models of random walks. We focus in particular on the dependence of correlations on the sampling time scale, the so-called Epps effect. This provides a characterization of stochastic models of stock price returns which is appropriate at very high frequency. Comment: 22 pages, 8 figures, 1 table, version to appear in EPJ
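
    The Epps effect can be reproduced numerically: correlate two synthetic price series whose trades arrive asynchronously and watch the measured correlation shrink as the sampling interval shortens. A hedged sketch under simple assumptions (synthetic data, illustrative parameters, previous-tick sampling):

```python
import numpy as np

rng = np.random.default_rng(1)
n_sec = 100_000

# Two latent efficient prices driven by correlated increments (true corr 0.5).
common = rng.standard_normal(n_sec)
x = np.cumsum(common + rng.standard_normal(n_sec))
y = np.cumsum(common + rng.standard_normal(n_sec))

# Asynchronous trading: each asset updates only at its own trade times, so
# the observed price is the last trade (previous-tick interpolation).
def observed(price, trade_prob):
    trades = rng.random(n_sec) < trade_prob
    idx = np.maximum.accumulate(np.where(trades, np.arange(n_sec), 0))
    return price[idx]

px, py = observed(x, 0.05), observed(y, 0.05)   # ~1 trade per 20 s

for dt in (10, 60, 600, 3600):                  # sampling horizon in seconds
    rx, ry = np.diff(px[::dt]), np.diff(py[::dt])
    print(dt, round(np.corrcoef(rx, ry)[0, 1], 3))
# The measured correlation rises toward 0.5 as dt grows: the Epps effect.
```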

    Generation and quality control of lipidomics data for the Alzheimer's Disease Neuroimaging Initiative cohort.

    Alzheimer's disease (AD) is a major public health priority with a large socioeconomic burden and complex etiology. The Alzheimer's Disease Metabolomics Consortium (ADMC) and the Alzheimer's Disease Neuroimaging Initiative (ADNI) aim to gain new biological insights into the disease etiology. We report here an untargeted lipidomics analysis of serum specimens from 806 subjects within the ADNI1 cohort (188 AD, 392 mild cognitive impairment and 226 cognitively normal subjects) along with 83 quality control samples. Lipids were detected and measured using an ultra-high-performance liquid chromatography quadrupole time-of-flight mass spectrometry (UHPLC-QTOF MS) instrument operated in both negative and positive electrospray ionization modes. The dataset includes a total of 513 unique lipid species, of which 341 are known lipids. For over 95% of the detected lipids, a relative standard deviation of better than 20% was achieved in the quality control samples, indicating high technical reproducibility. Association modeling of this dataset with the available clinical, metabolomics and drug-use data will provide novel insights into the AD etiology. These datasets are available at the ADNI repository at http://adni.loni.usc.edu/.
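
    The quality-control criterion quoted above (relative standard deviation below 20% in the pooled QC samples) is straightforward to compute lipid by lipid. A minimal sketch of that check (column names and data are hypothetical, not the ADNI tables):

```python
import numpy as np
import pandas as pd

# Hypothetical QC intensity table: rows = QC injections, columns = lipids.
rng = np.random.default_rng(7)
qc = pd.DataFrame(rng.normal(loc=1000, scale=120, size=(83, 5)),
                  columns=[f"lipid_{i}" for i in range(5)])

# Relative standard deviation per lipid across the QC injections.
rsd = 100 * qc.std() / qc.mean()

passed = rsd[rsd < 20].index
print(f"{len(passed)}/{qc.shape[1]} lipids pass the 20% RSD criterion")
```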

    Dynamic clamp with StdpC software

    Dynamic clamp is a powerful method that allows the introduction of artificial electrical components into target cells to simulate ionic conductances and synaptic inputs. The method is based on a fast cycle of measuring the membrane potential of a cell, calculating the current of a desired simulated component using an appropriate model and injecting this current into the cell. Here we present a dynamic clamp protocol using free, fully integrated, open-source software (StdpC, for spike timing-dependent plasticity clamp). Use of this protocol does not require specialist hardware, costly commercial software, experience in real-time operating systems or a strong programming background. The software enables the configuration and operation of a wide range of complex and fully automated dynamic clamp experiments through an intuitive and powerful interface, with a minimal initial lead time of a few hours. After initial configuration, experimental results can be generated within minutes of establishing a cell recording.
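
    The measure-compute-inject cycle is easy to illustrate in code. The sketch below couples the clamp loop to a toy passive membrane model so it runs standalone; all parameter values are hypothetical and this is not StdpC's implementation:

```python
# Toy dynamic clamp cycle coupled to a passive membrane model so the
# sketch runs standalone. All parameter values are illustrative.
C_M = 100e-12      # membrane capacitance (F)
G_LEAK = 5e-9      # leak conductance (S)
E_LEAK = -60e-3    # leak reversal potential (V)

G_MAX = 10e-9      # simulated synaptic conductance (S)
E_REV = 0.0        # excitatory reversal potential (V)
DT = 50e-6         # clamp cycle period (s); real systems run at ~20 kHz

v_m = E_LEAK
for step in range(2000):                              # 100 ms of "recording"
    # 1) Measure Vm (here: read the model state instead of an amplifier).
    # 2) Compute the current of the simulated component from its model.
    gate = 1.0 if 0.02 < step * DT < 0.06 else 0.0    # 40 ms conductance pulse
    i_clamp = G_MAX * gate * (E_REV - v_m)
    # 3) Inject the current: advance the membrane by one cycle (Euler step).
    v_m += (G_LEAK * (E_LEAK - v_m) + i_clamp) / C_M * DT

print(f"final Vm = {v_m * 1e3:.1f} mV")   # depolarized by the artificial input
```

    In a real experiment, steps 1 and 3 are analog reads and writes through a data acquisition card, and the cycle must complete reliably within DT, which is why the software's real-time behaviour matters.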

    Universal and non-universal properties of cross-correlations in financial time series

    We use methods of random matrix theory to analyze the cross-correlation matrix C of price changes of the 1000 largest US stocks for the 2-year period 1994-95. We find that the statistics of most of the eigenvalues in the spectrum of C agree with the predictions of random matrix theory, but there are deviations for a few of the largest eigenvalues. We find that C has the universal properties of the Gaussian orthogonal ensemble of random matrices. Furthermore, we analyze the eigenvectors of C through their inverse participation ratio and find eigenvectors with large inverse participation ratios at both edges of the eigenvalue spectrum, a situation reminiscent of results in localization theory. Comment: 14 pages, 3 figures, RevTeX
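
    The random-matrix comparison and the inverse participation ratio are both a few lines of linear algebra. A minimal sketch on synthetic returns (the Marchenko-Pastur bounds give the eigenvalue band expected for pure noise):

```python
import numpy as np

rng = np.random.default_rng(2)
T, N = 500, 100                      # 500 observations of 100 synthetic stocks
returns = rng.standard_normal((T, N))

C = np.corrcoef(returns.T)
eigvals, eigvecs = np.linalg.eigh(C)

# Marchenko-Pastur bounds for a pure-noise correlation matrix with Q = T/N.
q = T / N
lam_min = (1 - np.sqrt(1 / q)) ** 2
lam_max = (1 + np.sqrt(1 / q)) ** 2
print(f"noise band [{lam_min:.2f}, {lam_max:.2f}], "
      f"largest eigenvalue {eigvals[-1]:.2f}")

# Inverse participation ratio: ~1/N for extended eigenvectors, larger when
# an eigenvector is localized on a few stocks.
ipr = np.sum(eigvecs ** 4, axis=0)
print(f"IPR range {ipr.min():.4f} .. {ipr.max():.4f}  (1/N = {1 / N:.4f})")
```

    On real market data, eigenvalues above the noise band carry genuine correlation structure (the largest corresponding to the market mode), which is the deviation from pure random matrix theory that the abstract describes.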