10,127 research outputs found
Development of a fusion adaptive algorithm for marine debris detection within the post-Sandy restoration framework
Recognition of marine debris represents a difficult task due to the extreme variability of the marine environment, the possible targets, and the variable skill levels of human operators. The range of potential targets is much wider than in similar fields of research such as mine hunting, localization of unexploded ordnance, or pipeline detection. To address this additional complexity, an adaptive algorithm is being developed that responds appropriately to changes in the environment and context.
The preliminary step is to properly correct the collected data geometrically and radiometrically. Then, the core engine manages the fusion of a set of statistically- and physically-based algorithms, working at different levels (swath, beam, snippet, and pixel) and using both predictive modeling (that is, a high-frequency acoustic backscatter model) and phenomenological (e.g., digital image processing techniques) approaches. The expected outcome is the reduction of inter-algorithmic cross-correlation and, thus, of the probability of false alarm. At this early stage, we provide a proof of concept showing outcomes from algorithms that dynamically adapt themselves to the depth and average backscatter level met in the surveyed environment, targeting marine debris (modeled as objects of about 1-m size).
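The benefit of fusing weakly correlated detectors can be sketched as follows. This is an illustrative toy, not the Matador implementation; the detector count and false-alarm rates are hypothetical:

```python
# Illustrative sketch (not the Matador implementation): if the false alarms
# of several detectors are uncorrelated, requiring all of them to agree
# drives the combined probability of false alarm (PFA) toward the product
# of the individual PFAs -- far below any single detector's rate.

def fused_pfa(individual_pfas):
    """Combined PFA of an AND-fusion of uncorrelated detectors."""
    pfa = 1.0
    for p in individual_pfas:
        pfa *= p
    return pfa

# Three hypothetical detectors, each with a 5% false-alarm rate on its own:
print(fused_pfa([0.05, 0.05, 0.05]))  # 1.25e-4 under the independence assumption
```

This is why reducing inter-algorithmic cross-correlation matters: the product rule only holds to the extent that the detectors' errors are independent.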
The project relies on a modular software library, called Matador (Marine Target Detection and Object Recognition).
HUDDL for description and archive of hydrographic binary data
Many of the attempts to introduce a universal hydrographic binary data format have failed or have been only partially successful. In essence, this is because such formats either have to simplify the data to such an extent that they only support the lowest common subset of all the formats covered, or they attempt to be a superset of all formats and quickly become cumbersome. Neither choice works well in practice. This paper presents a different approach: a standardized description of (past, present, and future) data formats using the Hydrographic Universal Data Description Language (HUDDL), a descriptive language implemented using the Extensible Markup Language (XML). That is, XML is used to provide a structural and physical description of a data format, rather than the content of a particular file. Done correctly, this opens the possibility of automatically generating both multi-language data parsers and documentation for format specifications based on their HUDDL descriptions, as well as providing easy version control of them. This solution also provides a powerful approach for archiving a structural description of data along with the data, so that binary data will be easy to access in the future. Intending to provide a relatively low-effort solution to index the wide range of existing formats, we suggest the creation of a catalogue of format descriptions, each of them capturing the logical and physical specifications for a given data format (with its subsequent upgrades). A C/C++ parser code generator is used as an example prototype of one of the possible advantages of the adoption of such a hydrographic data format catalogue.
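The core idea of generating a parser from a declarative format description can be sketched in Python. The datagram fields below are hypothetical, and the `struct`-based spec is a minimal stand-in for a real HUDDL/XML description, not the HUDDL syntax itself:

```python
# A minimal sketch of the idea behind HUDDL: a declarative description of a
# binary record (here a hypothetical three-field datagram, not a real HUDDL
# catalogue entry) from which a parser is generated automatically.
import struct

# Field names and struct format codes play the role of a format description.
DATAGRAM_SPEC = [
    ("timestamp", "d"),   # 8-byte float, seconds
    ("beam_count", "H"),  # 16-bit unsigned integer
    ("depth_m", "f"),     # 32-bit float, metres
]

def make_parser(spec):
    """Generate a parser function from the declarative description."""
    fmt = "<" + "".join(code for _, code in spec)
    names = [name for name, _ in spec]
    size = struct.calcsize(fmt)
    def parse(buf):
        return dict(zip(names, struct.unpack(fmt, buf[:size])))
    return parse

parse_datagram = make_parser(DATAGRAM_SPEC)
packed = struct.pack("<dHf", 1234.5, 256, 87.5)
print(parse_datagram(packed))  # {'timestamp': 1234.5, 'beam_count': 256, 'depth_m': 87.5}
```

Because the description is data rather than code, the same spec could drive parser generation in several languages, as well as documentation and version control of the format.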
Huddl: the Hydrographic Universal Data Description Language
Since many of the attempts to introduce a universal hydrographic data format have failed or have been only partially successful, a different approach is proposed. Our solution is the Hydrographic Universal Data Description Language (HUDDL), a descriptive XML-based language that permits the creation of a standardized description of (past, present, and future) data formats, and allows for applications like HUDDLER, a compiler that automatically creates drivers for data access and manipulation. HUDDL also represents a powerful solution for archiving data along with their structural description, as well as for cataloguing existing format specifications and their version control. HUDDL is intended to be an open, community-led initiative to simplify the issues involved in hydrographic data access
Potentially Polluting Marine Sites GeoDB: An S-100 Geospatial Database as an Effective Contribution to the Protection of the Marine Environment
Potentially Polluting Marine Sites (PPMS) are objects on, or areas of, the seabed that may release pollution in the future. A rationale for, and design of, a geospatial database to inventory and manipulate PPMS is presented. Built as an S-100 Product Specification, it is specified through human-readable UML diagrams and implemented through machine-readable GML files, and includes auxiliary information such as pollution-control resources and potentially vulnerable sites in order to support analyses of the core data. The design and some aspects of implementation are presented, along with metadata requirements and structure, and a perspective on potential uses of the database.
Developing a GIS-Database and Risk Index for Potentially Polluting Marine Sites
The increasing availability of geospatial marine data provides an opportunity for hydrographic offices to contribute to the identification of "Potentially Polluting Marine Sites" (PPMS). These include shipwrecks, oil rigs, pipelines, and dumping areas. To adequately assess the environmental risk of these sites, relevant information must be collected and converted into a multi-scale geodatabase suitable for site inventory and geospatial analysis. In addition, a Risk Index, representing an assessment of the magnitude of risk associated with any site, can be derived to determine the potential impacts of these PPMS. However, the successful collection and integration of PPMS information requires some effort to "normalize" and standardize the data based on recognized international standards. In particular, there is benefit in structuring the data in conformance with the Universal Hydrographic Data Model (IHO S-100) recently adopted by the International Hydrographic Organization. In this paper, an S-100 compliant product specification for a PPMS geospatial database and associated Marine Site Risk Index is proposed, which can be used by national hydrographic offices and marine protection agencies.
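As a toy illustration of how such a Risk Index might combine normalized site attributes, consider a simple weighted sum. The attribute names and weights below are hypothetical and are not part of the S-100 product specification:

```python
# A toy illustration (not the proposed S-100 specification) of a risk index
# as a weighted combination of site attributes. Attribute names and weights
# are hypothetical; each attribute is assumed pre-normalized to [0, 1].

WEIGHTS = {"pollutant_volume": 0.5, "hull_condition": 0.3, "site_sensitivity": 0.2}

def risk_index(attributes):
    """Weighted sum of normalized attributes; higher means riskier."""
    return sum(WEIGHTS[k] * attributes[k] for k in WEIGHTS)

# A hypothetical wreck: large pollutant load, degraded hull, sensitive site.
wreck = {"pollutant_volume": 0.8, "hull_condition": 0.6, "site_sensitivity": 0.9}
print(round(risk_index(wreck), 2))  # 0.76
```

A real index would need defensible weights and normalization rules agreed across agencies; the point here is only that a standardized geodatabase makes such a computation mechanical.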
Non-Perturbatively Improved Quenched Hadron Spectroscopy
We make a quenched lattice simulation of hadron spectroscopy at beta=6.2 with
the Wilson action non-perturbatively improved. With respect to the unimproved
case, the estimate of the lattice spacing is less influenced by the choice of
input hadron masses. We also study the effects of using an improved quark
mass in the fits to the dependence of hadron masses upon quark masses.
Comment: 12 pages, including 5 postscript figures
The Near Infrared and Multiwavelength Afterglow of GRB 000301c
We present near-infrared observations of the counterpart of GRB 000301c. The
K' filter (2.1 micron) light curve shows a well-sampled break in the decay
slope at t=3.5 days post-burst. The early time slope is very shallow (~ -0.1),
while the late time slope is steep (-2.2). Comparison with the optical (R band)
light curve shows marginally significant differences, especially in the early
time decay slope (which is steeper in the optical) and the break time (which
occurs later in the optical). This is contrary to the general expectation that
light curve breaks should either be achromatic (e.g., for breaks due to
collimation effects) or should occur later at longer wavelengths (for most
other breaks). The observed color variations might be intrinsic to the
afterglow, or might indicate systematic errors of > 0.08 magnitude in all
fluxes. Even if the break is achromatic, we argue that its sharpness poses
difficulties for explanations that depend on collimated ejecta. The R light
curve shows further signs of fairly rapid variability (a bump, steep drop, and
plateau) that are not apparent in the K' light curve. In addition, by combining
the IR-optical-UV data with millimeter and radio fluxes, we are able to
constrain the locations of the self-absorption break and cooling break and to
infer the location of the spectral peak at t=3 days: f_nu = 3.4 mJy at nu=1e12
Hz. Using the multiwavelength spectral energy distribution, we are able to
constrain the blast wave energy, which was E > 3e53 erg if the explosion was
isotropic. This implies a maximum gamma ray production efficiency of ~ 0.15 for
GRB 000301C.
Comment: Accepted to The Astrophysical Journal. 24 pages, 4 figures, 3 tables;
uses AASTeX 5 macros. This version includes a new figure (R-K' color vs.
time), a better sampled R band light curve, and more extensive discussion of
the optical data and error analysis.
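The broken power-law decay described in the abstract can be sketched numerically. The slopes (-0.1 and -2.2) and break time (3.5 days) are taken from the abstract; the normalization and the perfectly sharp break are simplifying assumptions:

```python
# Sketch of a sharply broken power-law light curve: flux ~ t^alpha1 before
# the break and t^alpha2 after. Slopes and break time come from the K'-band
# values quoted in the abstract; normalization f_break is arbitrary.

def broken_power_law(t, t_break=3.5, alpha1=-0.1, alpha2=-2.2, f_break=1.0):
    """Flux (arbitrary units) at time t in days post-burst."""
    if t <= t_break:
        return f_break * (t / t_break) ** alpha1
    return f_break * (t / t_break) ** alpha2

# Shallow decay before the break, steep decay after:
print(broken_power_law(1.0))   # ~1.13: nearly flat approaching the break
print(broken_power_law(10.0))  # ~0.099: rapid post-break fading
```

The abstract's point about sharpness is visible here: a collimation break is expected to be gradual, whereas fitting the data required a transition close to this idealized two-slope form.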
GRB Afterglows from Anisotropic Jets
Some progenitor models of gamma-ray bursts (GRBs) (e.g., collapsars) may
produce anisotropic jets in which the energy per unit solid angle is a
power-law function of the angle, ε(θ) ∝ θ^{-k}. We calculate light curves
and spectra for GRB afterglows when such jets expand either in the
interstellar medium or in a wind medium. In particular, we take into account
two kinds of wind: one (n ∝ r^{-3/2}) possibly from a typical red supergiant
star and another (n ∝ r^{-2}) possibly from a Wolf-Rayet star. We find that
in each type of medium, one break appears in the late-time afterglow light
curve for small k but becomes weaker and smoother as k increases. When k is
large, the break seems to disappear but the afterglow decays rapidly. Thus,
one expects that the emission from expanding, highly anisotropic jets
provides a plausible explanation for some rapidly fading afterglows whose
light curves have no break. We also present good fits to the optical
afterglow light curve of GRB 991208. Finally, we argue that this burst might
arise from a highly anisotropic jet expanding in the wind (n ∝ r^{-3/2})
from a red supergiant to interpret the observed radio-to-optical-band
afterglow data (spectrum and light curve).
Comment: 12 pages + 10 figures, accepted by ApJ
Unveiling the nature of INTEGRAL objects through optical spectroscopy. IX. 22 more identifications, and a glance into the far hard X-ray Universe
(Abridged) Since its launch in October 2002, the INTEGRAL satellite has
revolutionized our knowledge of the hard X-ray sky thanks to its unprecedented
imaging capabilities and source detection positional accuracy above 20 keV.
Nevertheless, many of the newly-detected sources in the INTEGRAL sky surveys
are of unknown nature. The combined use of available information at longer
wavelengths (mainly soft X-rays and radio) and of optical spectroscopy on the
putative counterparts of these new hard X-ray objects allows us to pinpoint
their exact nature. Continuing our long-standing program that has been running
since 2004, and using 6 different telescopes of various sizes, we report the
classification through optical spectroscopy of 22 more unidentified or poorly
studied high-energy sources detected with the IBIS instrument onboard INTEGRAL.
We found that 16 of them are active galactic nuclei (AGNs), while the remaining
6 objects are within our Galaxy. Among the identified extragalactic sources, 14
are Type 1 AGNs; of these, 6 lie at redshift larger than 0.5 and one has z =
3.12, which makes it the second farthest object detected in the INTEGRAL
surveys up to now. The remaining AGNs are of type 2, and one of them is a pair
of interacting Seyfert 2 galaxies. The Galactic objects are identified as two
cataclysmic variables, one high-mass X-ray binary, one symbiotic binary and two
chromospherically active stars. We thus still find that AGNs are the most
abundant population among hard X-ray objects identified through optical
spectroscopy. Moreover, we note that the higher sensitivity of the more recent
INTEGRAL surveys is now enabling the detection of high-redshift AGNs, thus
allowing the exploration of the most distant hard X-ray emitting sources and
possibly of the most extreme blazars.
Comment: 18 pages, 9 figures, 8 tables, accepted for publication in Astronomy
& Astrophysics, main journal