
    Error Estimation of Bathymetric Grid Models Derived from Historic and Contemporary Data Sets

    The past century has seen remarkable advances in technologies associated with positioning and the measurement of depth. Lead lines have given way to single beam echo sounders, which in turn are being replaced by multibeam sonars and other means of remotely and rapidly collecting dense bathymetric datasets. Sextants were replaced by radio navigation, then transit satellite, GPS and now differential GPS. With each new advance comes tremendous improvement in the accuracy and resolution of the data we collect. Given these changes and given the vastness of the ocean areas we must map, the charts we produce are mainly compilations of multiple data sets collected over many years and representing a range of technologies. Yet despite our knowledge that the accuracy of the various technologies differs, our compilations have traditionally treated each sounding with equal weight. We address these issues in the context of generating regularly spaced grids containing bathymetric values. Gridded products are required for a number of earth science studies, and for generating the grid we are often forced to use a complex interpolation scheme due to the sparseness and irregularity of the input data points. Consequently, we are faced with the difficult task of assessing the confidence that we can assign to the final grid product, a task that is not usually addressed in most bathymetric compilations. Traditionally the hydrographic community has considered each sounding equally accurate and there has been no error evaluation of the bathymetric end product. This has important implications for use of the gridded bathymetry, especially when it is used for generating further scientific interpretations. In this paper we approach the problem of assessing the confidence of the final bathymetry gridded product via a direct-simulation Monte Carlo method. We start with a small subset of data from the International Bathymetric Chart of the Arctic Ocean (IBCAO) grid model [Jakobsson et al., 2000]. This grid is compiled from a mixture of data sources ranging from single beam soundings with available metadata, to spot soundings with no available metadata, to digitized contours; the test dataset shows examples of all of these types. From this database, we assign a priori error variances based on available metadata, and when this is not available, based on a worst-case scenario in an essentially heuristic manner. We then generate a number of synthetic datasets by randomly perturbing the base data using normally distributed random variates, scaled according to the predicted error model. These datasets are next re-gridded using the same methodology as the original product, generating a set of plausible grid models of the regional bathymetry that we can use for standard deviation estimates. Finally, we repeat the entire random estimation process and analyze each run’s standard deviation grids in order to examine sampling bias and standard error in the predictions. The final products of the estimation are a collection of standard deviation grids, which we combine with the source data density in order to create a grid that contains information about the bathymetric model’s reliability.
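    A minimal sketch of the direct-simulation Monte Carlo step described above, under simplifying assumptions: each sounding is perturbed by a normal variate scaled to its assumed a priori error, each realization is re-gridded, and the spread across realizations gives a per-cell standard deviation. The nearest-neighbour binning, the sounding values and the error figures are illustrative stand-ins, not the IBCAO data or its interpolation scheme.

```python
import numpy as np

rng = np.random.default_rng(42)

# Columns: x, y (grid units), depth (m), assumed a priori 1-sigma depth error (m).
soundings = np.array([
    [2.1, 3.4, 120.0,  2.0],   # single beam sounding with metadata
    [5.7, 1.2, 340.0, 15.0],   # digitized contour: worst-case error assumed
    [7.3, 6.8,  95.0,  1.0],   # modern multibeam sounding
])

def grid_realization(data, shape=(8, 8)):
    """Bin soundings onto a regular grid (mean depth per occupied cell)."""
    sums = np.zeros(shape)
    counts = np.zeros(shape)
    for x, y, depth, _sigma in data:
        i, j = int(y), int(x)
        sums[i, j] += depth
        counts[i, j] += 1
    grid = np.full(shape, np.nan)
    occupied = counts > 0
    grid[occupied] = sums[occupied] / counts[occupied]
    return grid

# Generate synthetic datasets by perturbing depths with N(0, sigma) noise,
# re-grid each one, and estimate the per-cell standard deviation.
realizations = []
for _ in range(200):
    perturbed = soundings.copy()
    perturbed[:, 2] += rng.normal(0.0, soundings[:, 3])
    realizations.append(grid_realization(perturbed))

stack = np.stack(realizations)            # shape (n_runs, ny, nx)
occupied = ~np.isnan(stack[0])            # same cells occupied in every run
std_grid = np.full(stack.shape[1:], np.nan)
std_grid[occupied] = stack[:, occupied].std(axis=0)
print(np.round(std_grid[occupied], 2))    # per-cell depth uncertainty (m)
```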

    Reinventing the national topographic database

    The National Land Survey (NLS) has had a digital topographic database (TDB) since 1992. Many of its features are based on the Basic Map created by M. Kajamaa in 1947, a mapping effort first completed in 1977. The renewal of the TDB began with two investigations: a study of the value of the TDB by Aalto University in 2014, and a study on the new TDB system 2030 published by the Ministry of Agriculture in 2015. As a result of these studies, the NLS set up a programme for creating a new National Topographic Database (NTDB) at the beginning of 2015. The first new version should be available in 2019. The new NTDB has the following key features: 1) it is based on the processes in which the data are naturally maintained, 2) it is quality managed, 3) it has persistent IDs, 4) it supports 3D and 4D, and 5) it is based on standards. The technical architecture is based on interoperable modules. More information is available on the NTDB development website: http://kmtk.maanmittauslaitos.fi/
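    Purely as an illustration of the kind of record these key features imply (persistent identifiers, 3D geometry, temporal validity, quality attributes), the sketch below shows a hypothetical, GeoJSON-style NTDB feature; the field names and values are invented and are not the NLS schema.

```python
# Hypothetical, illustrative NTDB-style feature record (not the NLS schema).
building = {
    "type": "Feature",
    "id": "urn:ntdb:building:7f3c2e1a",              # persistent ID (hypothetical)
    "geometry": {
        "type": "Point",
        "coordinates": [385432.1, 6672345.7, 12.4],  # x, y, z -> 3D support
    },
    "properties": {
        "feature_class": "building",
        "valid_from": "2019-06-01",                  # temporal (4D) validity
        "quality": {"positional_accuracy_m": 0.5, "source": "field survey"},
    },
}
print(building["id"], building["geometry"]["coordinates"])
```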

    The Redshift Distribution of the TOUGH Survey

    We present the redshift results from a Very Large Telescope program aimed at optimizing the legacy value of the Swift mission: to characterize a homogeneous, X-ray selected sample of 69 GRB host galaxies. Nineteen new redshifts have been secured, resulting in an 83% (57/69) redshift completeness and making the survey the most comprehensive, in terms of redshift completeness, of any sample to the full Swift depth available to date. We present the cumulative redshift distribution and derive a conservative, yet small, associated uncertainty. We constrain the fraction of Swift GRBs at high redshift to a maximum of 10% (5%) for z > 6 (z > 7). The mean redshift of the host sample is assessed to be > 2.2. Using this more complete sample, we confirm previous findings that the GRB rate at high redshift (z > 3) appears to be in excess of predictions based on assumptions that it should follow conventional determinations of the star formation history of the universe, combined with an estimate of its likely metallicity dependence. This suggests that either star formation at high redshifts has been significantly underestimated, for example due to a dominant contribution from faint, undetected galaxies, or that GRB production is enhanced in the conditions of early star formation, beyond those usually ascribed to lower metallicity. Comment: 7th Huntsville Gamma-Ray Burst Symposium, GRB 2013: paper 34 in eConf Proceedings C130414
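    The headline completeness figure is simple arithmetic, and the cumulative redshift distribution referred to above can be tabulated directly from a redshift list; the sketch below does both using made-up placeholder redshifts rather than the TOUGH measurements.

```python
import numpy as np

n_hosts, n_with_z = 69, 57
print(f"redshift completeness: {n_with_z}/{n_hosts} = {n_with_z / n_hosts:.0%}")

# Placeholder redshifts (NOT the TOUGH values), only to show the tabulation.
z = np.array([0.5, 1.1, 1.6, 2.2, 2.4, 3.0, 3.8, 5.9])
print(f"mean redshift of measured hosts: {z.mean():.2f}")
for z_cut in np.arange(0.0, 7.1, 1.0):
    frac = (z <= z_cut).mean()          # cumulative fraction at or below z_cut
    print(f"P(z <= {z_cut:.0f}) = {frac:.2f}")
```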

    Cast-as-Intended Mechanism with Return Codes Based on PETs

    We propose a method providing cast-as-intended verifiability for remote electronic voting. The method is based on plaintext equivalence tests (PETs), used to match the cast ballots against the pre-generated encrypted code tables. Our solution provides an attractive balance of security and functional properties. It is based on well-known cryptographic building blocks and relies on standard cryptographic assumptions, which allows for relatively simple security analysis. Our scheme is designed with a built-in fine-grained distributed trust mechanism based on threshold decryption. Finally, it imposes only a small additional computational burden on the voting platform, which is especially important when voters use devices with limited computational power, such as mobile phones. At the same time, the computational cost on the server side is very reasonable and scales well with increasing ballot size.
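    To make the matching step concrete, here is a minimal sketch of a plaintext equivalence test over ElGamal: the two ciphertexts are divided componentwise, blinded by a random exponent, and decrypted, yielding 1 exactly when the plaintexts match. A single secret key stands in for the scheme's threshold decryption, and the toy group parameters are far too small to be secure; this is not the paper's protocol, only the underlying PET idea.

```python
# Toy plaintext equivalence test (PET) over ElGamal. Single-key decryption
# stands in for threshold decryption; parameters are toy values, not secure.
import random

p = 23              # safe prime (p = 2q + 1, q = 11)
q = (p - 1) // 2
g = 4               # generator of the order-q subgroup of quadratic residues

x = random.randrange(1, q)          # secret key
h = pow(g, x, p)                    # public key

def encrypt(m):
    """ElGamal encryption of a group element m (a quadratic residue mod p)."""
    r = random.randrange(1, q)
    return (pow(g, r, p), (m * pow(h, r, p)) % p)

def decrypt(c):
    a, b = c
    return (b * pow(a, p - 1 - x, p)) % p   # b / a^x

def pet(c1, c2):
    """Return True iff c1 and c2 encrypt the same plaintext."""
    a = (c1[0] * pow(c2[0], p - 2, p)) % p   # componentwise quotient...
    b = (c1[1] * pow(c2[1], p - 2, p)) % p   # ...encrypts m1 / m2
    z = random.randrange(1, q)               # blinding exponent
    blinded = (pow(a, z, p), pow(b, z, p))   # encrypts (m1 / m2)^z
    return decrypt(blinded) == 1

m = 9  # a return code encoded as a group element (9 and 3 are both QRs mod 23)
assert pet(encrypt(m), encrypt(m))
assert not pet(encrypt(m), encrypt(3))
```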

    On the Use of Historical Bathymetric Data to Determine Changes in Bathymetry: An Analysis of Errors and Application to Great Bay Estuary, NH

    The depth measurements that are incorporated into bathymetric charts have associated errors whose magnitudes depend on the survey circumstances and the techniques applied. For this reason, combining and comparing depth measurements collected over many years with different techniques and standards is a difficult task that must be done with great caution. In this study we have developed an approach for comparing historical bathymetric surveys. Our methodology uses Monte Carlo modelling to account for the random error components inherent in the data due to positioning and depth measurement uncertainties.
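    As an illustration of the idea (not the paper's implementation), the sketch below draws Monte Carlo realizations of two co-located depth measurements with different assumed error budgets and asks whether the apparent depth change is distinguishable from the combined uncertainty; all survey values and error figures are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000

# Survey A (lead line, historic): depth 10.2 m, assumed 1-sigma error 0.5 m.
# Survey B (multibeam, modern):   depth  9.1 m, assumed 1-sigma error 0.1 m.
depth_a = 10.2 + rng.normal(0.0, 0.5, N)
depth_b = 9.1 + rng.normal(0.0, 0.1, N)

change = depth_b - depth_a                   # negative = apparent shoaling
lo, hi = np.percentile(change, [2.5, 97.5])  # 95% interval on the change
significant = not (lo <= 0.0 <= hi)
print(f"depth change: {change.mean():+.2f} m (95% interval [{lo:+.2f}, {hi:+.2f}] m)")
print("change distinguishable from measurement error?", significant)
```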

    Tuning the Curie temperature of FeCo compounds by tetragonal distortion

    Combining density-functional theory calculations with a classical Monte Carlo method, we show that for B2-type FeCo compounds tetragonal distortion gives rise to a strong reduction of the Curie temperature $T_{\mathrm{C}}$. The $T_{\mathrm{C}}$ decreases monotonically from 1575 K (for $c/a=1$) to 940 K (for $c/a=\sqrt{2}$). We find that the nearest neighbor Fe-Co exchange interaction is sufficient to explain the $c/a$ behavior of the $T_{\mathrm{C}}$. The combination of a high magnetocrystalline anisotropy energy with a moderate $T_{\mathrm{C}}$ value suggests that tetragonal FeCo grown on a Rh substrate with $c/a=1.24$ is a promising material for heat-assisted magnetic recording applications. Comment: 4 pages, 2 figures
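    As a rough illustration of how a nearest-neighbour exchange interaction controls the ordering temperature, the sketch below uses the classical-Heisenberg mean-field relation $k_{\mathrm{B}} T_{\mathrm{C}} = zJ/3$ with the B2 coordination number $z=8$; the exchange constants are hypothetical placeholders chosen only so the estimate lands near the reported 1575 K and 940 K endpoints, not values from the paper's DFT calculations.

```python
K_B = 8.617333e-5  # Boltzmann constant, eV/K

def mean_field_tc(j_ev, z=8):
    """Mean-field Curie temperature for coordination number z (B2 lattice: z = 8)."""
    return z * j_ev / (3.0 * K_B)

# Hypothetical nearest-neighbour Fe-Co exchange constants (eV), chosen only
# so the mean-field estimate roughly reproduces the reported trend.
for c_over_a, j in [(1.00, 0.0509), (1.41, 0.0304)]:
    print(f"c/a = {c_over_a:.2f}:  J = {j * 1e3:.1f} meV  ->  T_C ~ {mean_field_tc(j):.0f} K")
```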

    Hiding in Plain Sight: A Longitudinal Study of Combosquatting Abuse

    Domain squatting is a common adversarial practice where attackers register domain names that are purposefully similar to popular domains. In this work, we study a specific type of domain squatting called "combosquatting," in which attackers register domains that combine a popular trademark with one or more phrases (e.g., betterfacebook[.]com, youtube-live[.]com). We perform the first large-scale, empirical study of combosquatting by analyzing more than 468 billion DNS records, collected from passive and active DNS data sources over almost six years. We find that almost 60% of abusive combosquatting domains live for more than 1,000 days and, even worse, we observe increased activity associated with combosquatting year over year. Moreover, we show that combosquatting is used to perform a spectrum of different types of abuse, including phishing, social engineering, affiliate abuse, trademark abuse, and even advanced persistent threats. Our results suggest that combosquatting is a real problem that requires increased scrutiny by the security community. Comment: ACM CCS 1
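    A minimal sketch of the core matching idea, not the paper's measurement pipeline: flag a domain as a combosquatting candidate if its registered label embeds a monitored trademark without being the trademark owner's own domain (or the trademark alone). The trademark list is illustrative, and the second-level-domain extraction is a crude stand-in for proper public-suffix handling.

```python
# Illustrative combosquatting candidate check (not the paper's pipeline).
TRADEMARKS = {"facebook": "facebook.com", "youtube": "youtube.com"}

def is_combosquatting_candidate(domain: str) -> bool:
    domain = domain.lower().rstrip(".")
    # Crude second-level-domain guess; real systems use the public suffix list.
    registered = domain.split(".")[-2] if "." in domain else domain
    for mark, legit in TRADEMARKS.items():
        if mark in registered and domain != legit and registered != mark:
            return True
    return False

for d in ["betterfacebook.com", "youtube-live.com", "facebook.com", "example.com"]:
    print(d, is_combosquatting_candidate(d))
```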

    Scattered Emission from A Relativistic Outflow and Its Application to Gamma-Ray Bursts

    We investigate a scenario in which photons are scattered by electrons within a relativistic outflow. The outflow is composed of discrete shells with different speeds. One shell emits radiation for a short duration. Some of this radiation is scattered by the shell(s) behind. Using a simple two-shell model, we calculate the observed scattered flux density as a function of the observed primary flux density, the normalized arrival time delay between the two emission components, the Lorentz factor ratio of the two shells, and the scattering shell's optical depth. Thomson scattering in a cold shell and inverse Compton scattering in a hot shell are both considered. The results of our calculations are applied to gamma-ray bursts and their afterglows. We find that the scattered flux from a cold slower shell is small and likely to be detected only for those bursts with very weak afterglows. A hot scattering shell could give rise to scattered emission as bright as the X-ray shallow decay component detected in many bursts, provided that the isotropically equivalent total energy carried by the hot electrons is large, $\sim 10^{52-56}$ erg. The scattered emission from a faster shell could appear as a late short $\gamma$-ray/MeV flash or become part of the prompt emission, depending on the delay of the ejection of the shell. Comment: 13 pages, 3 figures, MNRAS in press; a short intuitive estimation is added before detailed calculations; references updated
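    For orientation only (this is not the paper's two-shell calculation), two standard order-of-magnitude relations are useful here: an optically thin scatterer redirects a fraction roughly $1 - e^{-\tau} \approx \tau$ of the primary photons, and photons scattered at radius $R$ by material moving with Lorentz factor $\Gamma$ arrive delayed by roughly $R/(2c\Gamma^2)$ relative to the unscattered emission. The numbers below are illustrative.

```python
from math import exp

C = 2.998e10  # speed of light, cm/s

def scattered_fraction(tau):
    """Fraction of primary photons scattered by a shell of optical depth tau."""
    return 1.0 - exp(-tau)

def arrival_delay(radius_cm, gamma):
    """Approximate observer-frame delay ~ R / (2 c Gamma^2), in seconds (redshift ignored)."""
    return radius_cm / (2.0 * C * gamma**2)

# Illustrative values only: a thin shell at R ~ 1e16 cm moving with Gamma ~ 30.
tau, radius, gamma = 0.01, 1.0e16, 30.0
print(f"scattered fraction ~ {scattered_fraction(tau):.3f}")    # ~ tau for tau << 1
print(f"arrival delay ~ {arrival_delay(radius, gamma):.0f} s")
```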