
    FAIR Practices in Europe

    Institutions driving fundamental research at the cutting edge, such as the Max Planck Society (MPS), have taken steps to optimize data management and stewardship in order to address new scientific questions. In this paper we select three MPS institutes from the areas of humanities, environmental sciences, and natural sciences as examples of the effort required to integrate large amounts of data from collaborators worldwide into a data space that is ready to be exploited for new insights based on data-intensive science methods. This integration had to overcome the typical challenges of fragmentation, poor quality, and social differences. In all three cases, the core pillars were well-managed repositories driven by scientific needs, together with harmonization principles agreed upon within the community. It is not surprising that these principles align closely with what have now become the FAIR principles. The FAIR principles confirm the correctness of the earlier decisions, and their clear formulation identifies the gaps the projects still need to address.

    Lagrangian bias in the local bias model

    It is often assumed that the halo-patch fluctuation field can be written as a Taylor series in the initial Lagrangian dark matter density fluctuation field. We show that if this Lagrangian bias is local, and the initial conditions are Gaussian, then the two-point cross-correlation between halos and mass should be linearly proportional to the mass-mass auto-correlation function. This statement is exact and valid on all scales; there are no higher-order contributions, e.g., from terms proportional to products or convolutions of two-point functions, which one might have thought would appear upon truncating the Taylor series of the halo bias function. In addition, the auto-correlation function of locally biased tracers can be written as a Taylor series in the auto-correlation function of the mass; there are no terms involving, e.g., derivatives or convolutions. Moreover, although the leading-order coefficient, the linear bias factor of the auto-correlation function, is just the square of that for the cross-correlation, it is the same as that obtained from expanding the mean number of halos as a function of the local density only in the large-scale limit. In principle, these relations allow simple tests of whether or not halo bias is indeed local in Lagrangian space. We discuss why things are more complicated in practice. We also discuss our results in light of recent work on the renormalizability of halo bias, demonstrating that it is better to renormalize than not. We use the Lognormal model to illustrate many of our findings. Comment: 14 pages, published in JCAP.
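The linear proportionality claimed in this abstract is a consequence of Stein's lemma for Gaussian variables: for any smooth local transform h = f(δ) of a Gaussian field, ⟨f(δ₁)δ₂⟩ = ⟨f′(δ)⟩⟨δ₁δ₂⟩, with no higher-order terms. A minimal Monte Carlo sketch (the cubic bias function below is an arbitrary illustrative choice, not the paper's model):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2_000_000
rho = 0.5  # correlation between the Gaussian field at two points

# Draw correlated standard-normal pairs (delta1, delta2) with corr = rho.
z1 = rng.standard_normal(n)
z2 = rng.standard_normal(n)
d1 = z1
d2 = rho * z1 + np.sqrt(1.0 - rho**2) * z2

# A nonlinear *local* bias function and its derivative.
f = lambda x: x + 0.5 * x**2 + 0.1 * x**3
fp = lambda x: 1.0 + x + 0.3 * x**2

cross = np.mean(f(d1) * d2)            # <f(delta1) delta2>, the "halo-mass" cross-corr.
prediction = np.mean(fp(d1)) * rho     # Stein's lemma: <f'(delta)> * <delta1 delta2>

print(cross, prediction)  # the two agree to Monte Carlo precision
```

Despite the quadratic and cubic terms in f, the cross-correlation remains strictly linear in ρ, which is the abstract's "exact and valid on all scales" statement in miniature.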

    Context-aware and automatic configuration of mobile devices in cloud-enabled ubiquitous computing

    This is the author's accepted manuscript. The final publication is available at Springer via http://dx.doi.org/10.1007/s00779-013-0698-3. Copyright @ Springer-Verlag London 2013. Context-sensitive (or context-aware) applications have, in recent years, moved from the realm of possibilities to that of ubiquity. One exciting research area that is still very much in the realm of possibilities is cloud computing, and in this paper we present our work, which explores the overlap of these two research areas. Accordingly, this paper explores the notion of cross-source integration of cloud-based, context-aware information in ubiquitous computing through a developed prototypical solution. Moreover, the described solution incorporates remote and automatic configuration of Android smartphones and advances the research area of context-aware information by harvesting information from several sources to build a rich foundation on which algorithms for context-aware computation can be based. Evaluation results show the viability of integrating and tailoring contextual information to provide users with timely, relevant and adapted application behaviour and content.
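The cross-source pattern described here — merge context from several providers, then derive device settings from the merged view — can be sketched as follows. The function names, context keys, and rules are hypothetical illustrations, not the paper's actual prototype API:

```python
def merge_context(*sources):
    """Merge context dictionaries from several providers;
    later sources override earlier ones on key collisions."""
    ctx = {}
    for src in sources:
        ctx.update(src)
    return ctx

def configure(ctx):
    """Derive device settings from the merged context (illustrative rules)."""
    settings = {}
    # Rule: silence the ringer while the calendar reports a meeting.
    if ctx.get("calendar_status") == "in_meeting":
        settings["ringer"] = "silent"
    else:
        settings["ringer"] = "normal"
    # Rule: restrict background sync when the battery is low.
    if ctx.get("battery_pct", 100) < 20:
        settings["sync"] = "wifi_only"
    return settings

calendar = {"calendar_status": "in_meeting"}   # e.g. from a cloud calendar source
device = {"battery_pct": 15}                   # e.g. from the handset itself
print(configure(merge_context(calendar, device)))
# {'ringer': 'silent', 'sync': 'wifi_only'}
```

The point of the sketch is the separation of concerns: harvesting and merging heterogeneous context is independent of the rules that act on it, so new sources can be added without touching the configuration logic.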

    Chronology of dune development in the White River Badlands, northern Great Plains, USA

    Aeolian dune field chronologies provide important information on drought history on the Great Plains. The White River Badlands (WRB) dunes are located approximately 60 km north of the Nebraska Sand Hills (NSH), in the western section of the northern Great Plains. Clifftop dunes, sand sheets, and stabilized northwest-southeast trending parabolic dunes are found on upland mesas and buttes, locally called tables. The result of this study is a dune stabilization history determined from samples collected from stratigraphic exposures and dune crests. Thirty-seven OSL ages, from this and previous investigations, show three periods of dune activity: (1) ~21,000 to 12,000 years ago, (2) ~9 to 6 ka, and (3) the last ~700 years. Stratigraphic exposures and low-relief dune forms preserve evidence of late Pleistocene and middle Holocene dune development, while high-relief dune crests preserve evidence of late Holocene dune development. Results of 12 OSL ages from the most recent dune activation event indicate that Medieval Climate Anomaly (MCA) droughts and Little Ice Age (LIA) droughts caused dune reactivation on the tables. Dune reactivation was accompanied by other drought-driven geomorphological responses in the WRB, including fluvial incision of the prairie and formation of sod tables. Regional significance of the MCA and LIA droughts is supported by similarities in the aeolian chronologies of the NSH at 700–600 a and some western Great Plains dune fields at 420–210 a. Aerial photographs of the WRB show little activity during the Dust Bowl droughts of the 1930s.

    Coordinating government and community support for community language teaching in Australia: Overview with special attention to New South Wales

    An overview is provided of the formal government language-in-education planning for community languages (CLs) that has been undertaken in Australia and New South Wales, moving from the more informal programmes of the 1980s to school-oriented programmes and training at the turn of the century. These programmes depend on community support; many of the teachers drawn from the communities need methodological training to complement their language and cultural skills. At the same time, Commonwealth (Federal) and State support for CL programmes has improved their quality and provides students with opportunities to study CLs at the senior secondary matriculation level. The paper concludes with specific recommendations for greater recognition of CL schools and for greater attention to CL teacher preparation.

    An algorithm for the direct reconstruction of the dark matter correlation function from weak lensing and galaxy clustering

    The clustering of matter on cosmological scales is an essential probe for studying the physical origin and composition of our Universe. To date, most of the direct studies have focused on shear-shear weak lensing correlations, but it is also possible to extract the dark matter clustering by combining galaxy-clustering and galaxy-galaxy-lensing measurements. In this study we develop a method that can constrain the dark matter correlation function from galaxy clustering and galaxy-galaxy-lensing measurements, by focusing on the correlation coefficient between the galaxy and matter overdensity fields. To generate a mock galaxy catalogue for testing purposes, we use the Halo Occupation Distribution approach applied to a large ensemble of N-body simulations to model pre-existing SDSS Luminous Red Galaxy sample observations. Using this mock catalogue, we show that a direct comparison between the excess surface mass density measured by lensing and its corresponding galaxy clustering quantity is not optimal. We develop a new statistic that suppresses the small-scale contributions to these observations and show that this new statistic leads to a cross-correlation coefficient that is within a few percent of unity down to 5 Mpc/h. Furthermore, the residual incoherence between the galaxy and matter fields can be explained using a theoretical model for scale-dependent bias, giving us a final estimator that is unbiased to within 1%. We also perform a comprehensive study of other physical effects that can affect the analysis, such as redshift space distortions and differences in radial windows between galaxy clustering and weak lensing observations. We apply the method to a range of cosmological models and show the viability of our new statistic to distinguish between cosmological models. Comment: 23 pages, 14 figures, accepted by PRD; minor changes to v1, 1 new figure, more detailed discussion of the covariance of the new ADSD statistic.
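The cross-correlation coefficient at the heart of this method is r = ξ_gm / √(ξ_gg ξ_mm); when r ≈ 1, the matter correlation follows directly from the two observables as ξ_mm = ξ_gm²/ξ_gg. A toy sketch with a linear-plus-stochastic bias model (bias and noise values are arbitrary illustrative choices, not the paper's measurements):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

delta_m = rng.standard_normal(n)                 # toy matter overdensity field
b, eps = 2.0, 0.5                                # linear bias, stochastic scatter
delta_g = b * delta_m + eps * rng.standard_normal(n)  # toy galaxy overdensity

xi_gm = np.mean(delta_g * delta_m)               # galaxy-matter cross-correlation
xi_gg = np.mean(delta_g**2)                      # galaxy auto-correlation
xi_mm = np.mean(delta_m**2)                      # matter auto-correlation (the target)

r = xi_gm / np.sqrt(xi_gg * xi_mm)
print(r)  # analytically b / sqrt(b^2 + eps^2) ≈ 0.97 for these values

# If one assumes r = 1, the reconstruction xi_gm^2 / xi_gg underestimates
# xi_mm by the factor r^2 — exactly the bias the paper's statistic suppresses.
xi_mm_naive = xi_gm**2 / xi_gg
print(xi_mm_naive / xi_mm)  # ≈ r^2
```

Stochasticity (the ε term, standing in for small-scale shot noise and nonlinear bias) is what pulls r below unity; the paper's ADSD statistic filters out those small-scale contributions so that r stays within a few percent of one.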

    Density reconstruction from biased tracers and its application to primordial non-Gaussianity

    Large-scale Fourier modes of the cosmic density field are of great value for learning about cosmology because of their well-understood relationship to fluctuations in the early universe. However, cosmic variance generally limits the statistical precision that can be achieved when constraining model parameters using these modes as measured in galaxy surveys, and moreover, these modes are sometimes inaccessible due to observational systematics or foregrounds. For some applications, both limitations can be circumvented by reconstructing large-scale modes using the correlations they induce between smaller-scale modes of an observed tracer (such as galaxy positions). In this paper, we further develop a formalism for this reconstruction, using a quadratic estimator similar to the one used for lensing of the cosmic microwave background. We incorporate nonlinearities from gravity, nonlinear biasing, and local-type primordial non-Gaussianity, and verify that the estimator gives the expected results when applied to N-body simulations. We then carry out forecasts for several upcoming surveys, demonstrating that, when reconstructed modes are included alongside directly-observed tracer density modes, constraints on local primordial non-Gaussianity are generically tightened by tens of percent compared to standard single-tracer analyses. In certain cases, these improvements arise from cosmic variance cancellation, with reconstructed modes taking the place of modes of a separate tracer, thus enabling an effective "multitracer" approach with single-tracer observations. Comment: 30 pages plus 14 pages of appendices, 19 figures.
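The core idea of the quadratic estimator — a long-wavelength mode modulates the local power of short-wavelength modes, so smoothed quadratic combinations of the small-scale field recover the long mode — can be demonstrated in one dimension. This is a deliberately simplified toy (white-noise small-scale field, a single long mode, hand-picked window size), not the paper's full estimator:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 262_144                       # box size (2**18 samples)
t = np.arange(n)

# A single long-wavelength mode we pretend cannot be observed directly.
A = 0.2
long_mode = A * np.cos(2 * np.pi * 8 * t / n)   # 8 oscillations across the box

# Small-scale "tracer" field whose local amplitude is modulated by the long mode.
tracer = (1.0 + long_mode) * rng.standard_normal(n)

# Quadratic estimator: smoothed tracer^2 estimates (1 + long_mode)^2
# ≈ 1 + 2*long_mode at leading order in the mode amplitude.
win = 512
local_power = np.convolve(tracer**2, np.ones(win) / win, mode="same")
recovered = 0.5 * (local_power - 1.0)

# Amplitude of the recovered Fourier mode at wavenumber bin 8.
A_hat = 2.0 * np.abs(np.fft.rfft(recovered)[8]) / n
print(A_hat)  # close to the input amplitude A = 0.2
```

The reconstruction noise here scales with the inverse of the number of small-scale modes per smoothing window, which is the toy analogue of why these estimators perform best when many short-wavelength tracer modes are available per long-wavelength mode.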

    Primordial non-Gaussianity in the Bispectrum of the Halo Density Field

    The bispectrum vanishes for linear Gaussian fields and is thus a sensitive probe of non-linearities and non-Gaussianities in the cosmic density field. Hence, a detection of the bispectrum in the halo density field would enable tight constraints on non-Gaussian processes in the early Universe and allow inference of the dynamics driving inflation. We present a tree-level derivation of the halo bispectrum arising from non-linear clustering, non-linear biasing and primordial non-Gaussianity. A diagrammatic description is developed to provide an intuitive understanding of the contributing terms and their dependence on scale, shape and the non-Gaussianity parameter fNL. We compute the terms based on a multivariate bias expansion and the peak-background split method, and show that non-Gaussian modifications to the bias parameters lead to amplifications of the tree-level bispectrum that were ignored in previous studies. Our results are in good agreement with published simulation measurements of the halo bispectrum. Finally, we estimate the expected signal-to-noise on fNL and show that the constraint obtainable from the bispectrum analysis significantly exceeds the one obtainable from the power spectrum analysis. Comment: 34 pages, 15 figures; (v3) matches the JCAP published version.
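The opening claim — the bispectrum vanishes for Gaussian fields but not for fields with local-type primordial non-Gaussianity — can be checked at zero lag, where the bispectrum reduces to the third moment. For φ_NG = φ + fNL(φ² − ⟨φ²⟩) with unit-variance Gaussian φ, Gaussian moments give ⟨φ_NG³⟩ = 6 fNL + 8 fNL³, i.e. ≈ 6 fNL at leading order. A minimal Monte Carlo check (fNL = 0.1 is an arbitrary illustrative value, far larger than cosmological constraints):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000
phi = rng.standard_normal(n)          # Gaussian "primordial" field, variance 1

# Local-type non-Gaussianity: phi_NG = phi + fNL * (phi^2 - <phi^2>)
fnl = 0.1
phi_ng = phi + fnl * (phi**2 - 1.0)

# Zero-lag (collapsed) bispectrum = third moment of the field.
skew_g = np.mean(phi**3)       # vanishes for the Gaussian field
skew_ng = np.mean(phi_ng**3)   # ≈ 6*fNL + 8*fNL^3 = 0.608 here

print(skew_g, skew_ng)
```

The Gaussian skewness is zero to Monte Carlo precision while the non-Gaussian field's third moment tracks the analytic prediction, which is why the (halo) bispectrum is a direct handle on fNL.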