4,583 research outputs found

    Analysis of Data Relevant to Establishing Outer Limits of a Continental Shelf under Law of the Sea Article 76

    Coastal states may extend the limits of their juridically defined continental shelf beyond 200 nautical miles from their baselines under the provisions set forth in Article 76 of the United Nations Convention on the Law of the Sea (UNCLOS). In a preparatory desktop study, the University of New Hampshire’s Center for Coastal and Ocean Mapping/Joint Hydrographic Center analysed existing U.S. bathymetric and geophysical data holdings, assessed their adequacy, and identified the surveys required to prepare a U.S. claim beyond the Exclusive Economic Zone (EEZ). In this paper we describe the methodology of our desktop study, with particular emphasis on how we assembled and evaluated the existing data around the shelf areas of the United States and estimated where additional surveys may be required.

    New national topographic database on the horizon


    An Econometric Analysis of the European Commission's Merger Decisions

    Using a sample of 96 mergers notified to the EU Commission and logit regression techniques, we analyse the Commission's decision process. We find that the probability of a phase 2 investigation and of a prohibition of the merger increases with the parties' market shares. The probability also increases when the Commission finds high entry barriers or that post-merger collusion is easy. We do not find significant effects of political variables, such as the nationality of the merging firms or the identity of the commissioner.
    Keywords: competition law; antitrust; merger; merger regulation

    Error Estimation of Bathymetric Grid Models Derived from Historic and Contemporary Data Sets

    The past century has seen remarkable advances in technologies associated with positioning and the measurement of depth. Lead lines have given way to single beam echo sounders, which in turn are being replaced by multibeam sonars and other means of remotely and rapidly collecting dense bathymetric datasets. Sextants were replaced by radio navigation, then transit satellite, GPS and now differential GPS. With each new advance comes tremendous improvement in the accuracy and resolution of the data we collect. Given these changes and given the vastness of the ocean areas we must map, the charts we produce are mainly compilations of multiple data sets collected over many years and representing a range of technologies. Yet despite our knowledge that the accuracy of the various technologies differs, our compilations have traditionally treated each sounding with equal weight. We address these issues in the context of generating regularly spaced grids containing bathymetric values. Gridded products are required for a number of earth sciences studies and for generating the grid we are often forced to use a complex interpolation scheme due to the sparseness and irregularity of the input data points. Consequently, we are faced with the difficult task of assessing the confidence that we can assign to the final grid product, a task that is not usually addressed in most bathymetric compilations. Traditionally the hydrographic community has considered each sounding equally accurate and there has been no error evaluation of the bathymetric end product. This has important implications for use of the gridded bathymetry, especially when it is used for generating further scientific interpretations. In this paper we approach the problem of assessing the confidence of the final bathymetry gridded product via a direct-simulation Monte Carlo method. 
We start with a small subset of data from the International Bathymetric Chart of the Arctic Ocean (IBCAO) grid model [Jakobsson et al., 2000]. This grid is compiled from a mixture of data sources ranging from single-beam soundings with available metadata, to spot soundings with no available metadata, to digitized contours; the test dataset shows examples of all of these types. From this database, we assign a priori error variances based on the available metadata and, when metadata are not available, on a worst-case scenario in an essentially heuristic manner. We then generate a number of synthetic datasets by randomly perturbing the base data using normally distributed random variates, scaled according to the predicted error model. These datasets are next re-gridded using the same methodology as the original product, generating a set of plausible grid models of the regional bathymetry that we can use for standard-deviation estimates. Finally, we repeat the entire random estimation process and analyze each run's standard-deviation grids in order to examine sampling bias and standard error in the predictions. The final products of the estimation are a collection of standard-deviation grids, which we combine with the source data density to create a grid that contains information about the bathymetric model's reliability.
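The perturb-and-regrid loop described above can be sketched in a few lines of Python. The soundings, a priori sigmas, grid geometry, and interpolator below are illustrative placeholders, not the IBCAO data or its gridding algorithm:

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(42)

# Hypothetical soundings: positions, depths, and an a priori 1-sigma
# depth error assigned from metadata (or a worst-case value where
# metadata are missing).
n = 200
x = rng.uniform(0, 10, n)
y = rng.uniform(0, 10, n)
depth = 50 + 5 * np.sin(x) + 3 * np.cos(y)
sigma = np.where(rng.random(n) < 0.7, 0.5, 2.0)  # metadata vs. worst case

# Target grid nodes.
gx, gy = np.meshgrid(np.linspace(1, 9, 40), np.linspace(1, 9, 40))

# Direct-simulation Monte Carlo: perturb the soundings with normally
# distributed variates scaled by the error model, then re-grid each
# realization with the same interpolation scheme.
n_runs = 100
grids = np.empty((n_runs, *gx.shape))
for k in range(n_runs):
    perturbed = depth + rng.normal(0.0, sigma)
    grids[k] = griddata((x, y), perturbed, (gx, gy), method="linear")

# The per-node standard deviation across realizations approximates the
# uncertainty of the gridded bathymetric model.
std_grid = np.nanstd(grids, axis=0)
print(std_grid.shape)
```

In the real workflow the interpolator would be the production gridding method, and the standard-deviation grid would then be combined with source-data density as described above.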

    Microlensing variability in time-delay quasars

    We have searched for microlensing variability in the light curves of five gravitationally lensed quasars with well-determined time delays: SBS 1520+530, FBQ 0951+2635, RX J0911+0551, B1600+434 and HE 2149-2745. By comparing the light curve of the leading image with a suitably time-offset light curve of a trailing image we find that two (SBS 1520+530 and FBQ 0951+2635) out of the five quasars have significant long-term (years) and short-term (100 days) brightness variations that may be attributed to microlensing. The short-term variations may be due to nanolenses, relativistic hot or cold spots in the quasar accretion disks, or coherent microlensing at large optical depth. Comment: 12 pages, 5 figures, uses natbib.sty and aa.cl
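The comparison of a leading image with a time-offset trailing image can be illustrated with synthetic light curves. The delay, magnitude offset, and variability amplitudes below are invented, not the measured values for these quasars:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical magnitudes for two images of a lensed quasar. Both show
# the same intrinsic variability, but image B trails image A by the
# time delay and is offset by a constant magnitude; a slow microlensing
# signal affects image B only.
t = np.linspace(0.0, 1000.0, 400)                  # epochs, days
delay, offset = 130.0, 0.6                         # assumed known

def intrinsic(tt):
    return 0.2 * np.sin(2 * np.pi * tt / 300.0)    # quasar variability

mag_a = 17.0 + intrinsic(t) + rng.normal(0, 0.01, t.size)
micro = 0.1 * np.sin(2 * np.pi * t / 900.0)        # microlensing in B
mag_b = (17.0 + offset + intrinsic(t - delay) + micro
         + rng.normal(0, 0.01, t.size))

# Evaluate image A at t - delay, remove the magnitude offset, and
# difference: the intrinsic variability cancels, leaving the
# microlensing signal plus photometric noise.
mask = t >= delay
diff = mag_b[mask] - np.interp(t[mask] - delay, t, mag_a) - offset
print(diff.size)
```

With real data the delay and offset come from the time-delay measurement, and any structure left in the difference curve is the microlensing candidate signal.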

    First-principles calculations of exchange interactions, spin waves, and temperature dependence of magnetization in inverse-Heusler-based spin gapless semiconductors

    Employing first-principles electronic structure calculations in conjunction with the frozen-magnon method we calculate exchange interactions, spin-wave dispersion, and spin-wave stiffness constants in the inverse-Heusler-based spin gapless semiconductor (SGS) compounds Mn2CoAl, Ti2MnAl, Cr2ZnSi, Ti2CoSi and Ti2VAs. We find that their magnetic behavior is similar to that of the half-metallic ferromagnetic full-Heusler alloys, i.e., the intersublattice exchange interactions play an essential role in the formation of the magnetic ground state and in determining the Curie temperature, Tc. All compounds except Ti2CoSi possess a ferrimagnetic ground state. Due to the finite energy gap in one spin channel, the exchange interactions decay sharply with distance, and hence the magnetism of these SGSs can be described considering only nearest- and next-nearest-neighbor exchange interactions. The calculated spin-wave dispersion curves are typical for ferrimagnets and ferromagnets. The spin-wave stiffness constants turn out to be larger than those of the elementary 3d ferromagnets. The calculated exchange parameters are used as input to determine the temperature dependence of the magnetization and the Tc of the SGSs. We find that the Tc of all compounds is well above room temperature. The calculated magnetization curve for Mn2CoAl as well as the Curie temperature are in very good agreement with available experimental data. The present study is expected to pave the way for a deeper understanding of the magnetic properties of the inverse-Heusler-based SGSs and to enhance interest in these materials for application in spintronic and magnetoelectronic devices. Comment: Accepted for publication in Physical Review
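As an illustration of how calculated exchange parameters feed a Curie-temperature estimate, here is a simple single-sublattice mean-field evaluation, kB*Tc = (2/3) * sum_j z_j J_j, restricted to nearest and next-nearest neighbours as the abstract suggests. The coordination numbers and J values are placeholders, not the first-principles parameters of the paper:

```python
# Mean-field Curie temperature from Heisenberg exchange constants.
kB = 8.617e-5  # Boltzmann constant, eV/K

# (coordination number, exchange constant J in eV) for the nearest- and
# next-nearest-neighbour shells; illustrative values only.
shells = [(8, 4.0e-3), (6, 1.5e-3)]

# Effective on-site exchange field J0 = sum over shells of z * J.
J0 = sum(z * J for z, J in shells)

# Single-sublattice mean-field approximation: kB * Tc = (2/3) * J0.
Tc = 2.0 * J0 / (3.0 * kB)
print(f"Tc = {Tc:.1f} K")
```

For ferrimagnets with several sublattices, as here, the mean-field estimate is instead obtained from the largest eigenvalue of the intersublattice exchange matrix; the scalar formula above is the simplest special case.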

    On the Use of Historical Bathymetric Data to Determine Changes in Bathymetry: An Analysis of Errors and Application to Great Bay Estuary, NH

    The depth measurements that are incorporated into bathymetric charts have associated errors whose magnitudes depend on the survey circumstances and the techniques applied. For this reason, combining and comparing depth measurements collected over many years with different techniques and standards is a difficult task that must be done with great caution. In this study we have developed an approach for comparing historical bathymetric surveys. Our methodology uses Monte Carlo modelling to account for the random error components inherent in the data due to positioning and depth-measurement uncertainties.
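The Monte Carlo treatment of random errors when differencing two survey epochs can be sketched as follows; the depths, uncertainties, and the change signal are invented for illustration, not values from the Great Bay Estuary data:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical depths at matching grid nodes from two survey epochs,
# each with a 1-sigma uncertainty combining positioning and
# depth-measurement errors (values are illustrative).
n = 500
old = 10.0 + rng.normal(0, 0.1, n)   # historic lead-line survey, metres
new = 9.6 + rng.normal(0, 0.02, n)   # modern multibeam survey, metres
sigma_old, sigma_new = 0.5, 0.1      # a priori 1-sigma depth errors

# Monte Carlo: perturb both surveys within their error models and
# collect the apparent mean depth change for each realization.
n_runs = 2000
changes = np.empty(n_runs)
for k in range(n_runs):
    d_old = old + rng.normal(0, sigma_old, n)
    d_new = new + rng.normal(0, sigma_new, n)
    changes[k] = np.mean(d_new - d_old)

# Only changes exceeding the spread of the simulated differences should
# be interpreted as real bathymetric change rather than survey error.
print(f"mean change {np.mean(changes):.2f} m, "
      f"spread {np.std(changes):.3f} m")
```

The spread of the simulated differences serves as the significance threshold: an observed change smaller than it cannot be distinguished from the combined random errors of the two surveys.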

    Gamma-ray burst host galaxies and the link to star-formation

    We briefly review the current status of the study of long-duration gamma-ray burst (GRB) host galaxies. GRB host galaxies are mainly interesting to study for two reasons: 1) they may help us understand where and when massive stars were formed throughout cosmic history, and 2) the properties of the host galaxies, and the locations within the hosts where GRBs form, may give essential clues to the precise nature of the progenitors. The main current problem is to understand to what degree GRBs are biased tracers of star formation. If GRBs are only formed by low-metallicity stars, then their host galaxies will not give a representative view of where stars are formed in the Universe (at least not at low redshifts). On the other hand, if there is no dependence on metallicity, then the nature of the host galaxies leads to the perhaps surprising conclusion that most stars are formed in dwarf galaxies. In order to resolve this issue and to fully exploit the potential of GRBs as probes of star-forming galaxies throughout the observable universe, it is mandatory that a complete sample of bursts with redshifts and host-galaxy detections is built. Comment: 9 pages, 3 figures. To appear in the proceedings of the Eleventh Marcel Grossmann Meeting on General Relativity, eds. H. Kleinert, R. T. Jantzen & R. Ruffini, World Scientific, Singapore, 200