
    Beyond the Chart: The use of Satellite Remote Sensing for Assessing the Adequacy and Completeness Information

    Chart adequacy and completeness information consists of the symbols, abbreviations, and warnings used to inform mariners of the level of confidence that should be given to data on a nautical chart. This information is derived from both the nautical chart and sailing directions. However, analysis based solely on these datasets is limited without access to the sources (e.g., smooth sheets). Publicly available, multi-spectral satellite imagery and published algorithms can be used to derive estimates of relative bathymetry in shallow, clear waters. In this study, we evaluate the potential of these methods for supplementing the procedure used to assess the adequacy of hydrographic surveying and nautical charting coverage. Optically derived bathymetry provides information in areas that have not been surveyed and can reveal seafloor changes that may have occurred since the last survey of the area. Preliminary results show that multi-spectral satellite remote sensing is also potentially beneficial as a reconnaissance tool prior to a hydrographic acoustic survey.
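    The abstract does not name which published algorithm is used; a commonly cited choice for deriving relative bathymetry from multi-spectral imagery is the blue/green log-ratio of Stumpf et al. (2003). The Python sketch below illustrates that approach under the assumption of atmospherically corrected reflectance; the constant n, the placeholder imagery, and the masking step are illustrative assumptions, not details from the study.

```python
import numpy as np

def relative_bathymetry(blue, green, n=1000.0):
    """Blue/green log-ratio as a relative depth proxy (after Stumpf et al., 2003).

    blue, green : arrays of atmospherically corrected water-leaving reflectance.
    n           : fixed constant chosen so both logarithms stay positive.
    The result is dimensionless; a linear fit against trusted soundings is
    needed to express it as depth in metres.
    """
    blue = np.clip(blue, 1e-6, None)
    green = np.clip(green, 1e-6, None)
    return np.log(n * blue) / np.log(n * green)

# Hypothetical usage on placeholder reflectance; land and optically deep
# water would be masked first in a real workflow.
rng = np.random.default_rng(0)
blue = rng.uniform(0.01, 0.06, size=(256, 256))
green = rng.uniform(0.01, 0.06, size=(256, 256))
rel_depth = relative_bathymetry(blue, green)
```

    In an adequacy assessment, the calibrated ratio would then be compared against charted soundings to flag areas where the seafloor may have changed since the last survey.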

    A procedure for developing an acceptance test for airborne bathymetric lidar data application to NOAA charts in shallow waters

    National Oceanic and Atmospheric Administration (NOAA) hydrographic data is typically acquired using sonar systems, with a small percentage acquired via airborne lidar bathymetry for near-shore areas. This study investigated an integrated approach for meeting NOAA's hydrographic survey requirements for near-shore areas of NOAA charts, using existing topographic-bathymetric lidar data from USACE's National Coastal Mapping Program (NCMP). Because these existing NCMP bathymetric lidar datasets were not collected to NOAA hydrographic surveying standards, it is unclear if, and under what circumstances, they might aid in meeting certain hydrographic surveying requirements. The NCMP's bathymetric lidar data are evaluated through a comparison to NOAA's Office of Coast Survey hydrographic data derived from acoustic surveys. As a result, it is possible to assess whether NCMP's bathymetry can be used to fill the data gap shoreward of the navigable area limit line (0 to 4 meters) and whether there is potential for applying NCMP's bathymetric lidar data to near-shore areas deeper than 10 meters. Based on the study results, recommendations will be provided to NOAA for the site conditions where these data will provide the most benefit. Additionally, this analysis may allow the development of future operating procedures and workflows using other topographic-bathymetric lidar datasets to help update near-shore areas of NOAA charts.
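    The abstract does not specify how the lidar and acoustic surfaces were compared. One plausible reading is a node-by-node difference of co-registered grids checked against an allowable vertical uncertainty such as the IHO S-44 formula sqrt(a^2 + (b*d)^2). The sketch below illustrates that kind of check; the grids, resolution, and Order 1 coefficients (a = 0.5 m, b = 0.013) are assumptions for illustration, not values reported in the study.

```python
import numpy as np

def iho_tvu(depth, a=0.5, b=0.013):
    """Allowable total vertical uncertainty (95%) per IHO S-44,
    sqrt(a^2 + (b*d)^2); defaults approximate Order 1."""
    return np.sqrt(a ** 2 + (b * depth) ** 2)

def compare_surfaces(lidar_depth, acoustic_depth):
    """Difference co-located depth grids and flag nodes whose disagreement
    exceeds the allowable uncertainty at the acoustic depth."""
    diff = lidar_depth - acoustic_depth
    exceeds = np.abs(diff) > iho_tvu(acoustic_depth)
    return diff, exceeds

# Hypothetical co-registered grids (depths in metres, positive down).
rng = np.random.default_rng(1)
acoustic = rng.uniform(2.0, 10.0, size=(100, 100))
lidar = acoustic + rng.normal(0.0, 0.3, size=(100, 100))
diff, exceeds = compare_surfaces(lidar, acoustic)
print(f"{exceeds.mean():.1%} of nodes exceed the Order 1 allowance")
```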

    Spatial snow water equivalent estimation for mountainous areas using wireless-sensor networks and remote-sensing products

    We developed an approach to estimate snow water equivalent (SWE) through interpolation of spatially representative point measurements using a k-nearest neighbors (k-NN) algorithm and historical spatial SWE data. It accurately reproduced measured SWE, using different data sources for training and evaluation. In the central-Sierra American River basin, we used a k-NN algorithm to interpolate data from continuous snow-depth measurements in 10 sensor clusters by fusing them with 14 years of daily 500-m resolution SWE-reconstruction maps. Accurate SWE estimation over the melt season shows the potential for providing daily, near real-time distributed snowmelt estimates. Further south, in the Merced-Tuolumne basins, we evaluated the potential of the k-NN approach to improve real-time SWE estimates. Lacking dense ground-measurement networks, we simulated k-NN interpolation of sensor data using selected pixels of a bi-weekly Lidar-derived snow water equivalent product. k-NN extrapolations underestimate the Lidar-derived SWE, with a maximum bias of −10 cm at elevations below 3000 m and +15 cm above 3000 m. This bias was reduced by using a Gaussian-process regression model to spatially distribute the residuals. Using as few as 10 scenes of Lidar-derived SWE from 2014 as training data in the k-NN to estimate the 2016 spatial SWE, both RMSEs and MAEs were reduced from around 20–25 cm to 10–15 cm compared to using SWE reconstructions as training data. We found that the spatial accuracy of the historical data is more important for learning the spatial distribution of SWE than the number of historical scenes available. Blending continuous, spatially representative ground-based sensors with a historical library of SWE reconstructions over the same basin can provide real-time spatial SWE maps that accurately represent Lidar-measured snow depth, and the estimates can be improved by using historical Lidar scans instead of SWE reconstructions.
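    A minimal sketch of the k-NN step described above, assuming the historical library and sensor readings are already co-registered on the same grid: the k historical scenes whose values at the sensor pixels best match the current readings are averaged to produce a full spatial estimate. The array shapes, distance metric, and toy data are illustrative assumptions; the Gaussian-process residual correction mentioned in the abstract is omitted.

```python
import numpy as np

def knn_swe_estimate(sensor_swe, library, sensor_idx, k=5):
    """Blend the k historical SWE maps that best match today's sensor readings.

    sensor_swe : (n_sensors,) current SWE at the sensor locations.
    library    : (n_scenes, ny, nx) historical SWE maps (reconstructions or lidar).
    sensor_idx : (n_sensors, 2) row/col of each sensor in the map grid.
    """
    rows, cols = sensor_idx[:, 0], sensor_idx[:, 1]
    lib_at_sensors = library[:, rows, cols]            # (n_scenes, n_sensors)
    dist = np.linalg.norm(lib_at_sensors - sensor_swe, axis=1)
    nearest = np.argsort(dist)[:k]                     # k closest historical scenes
    return library[nearest].mean(axis=0)               # (ny, nx) spatial estimate

# Hypothetical toy setup: 14 historical scenes, 10 sensor sites on a 60x80 grid.
rng = np.random.default_rng(2)
library = rng.gamma(2.0, 10.0, size=(14, 60, 80))      # SWE in cm
rows = rng.integers(0, 60, size=10)
cols = rng.integers(0, 80, size=10)
sensor_idx = np.column_stack([rows, cols])
sensor_swe = library[3, rows, cols] + rng.normal(0.0, 2.0, size=10)
swe_map = knn_swe_estimate(sensor_swe, library, sensor_idx, k=3)
```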

    Aerosol Data Sources and Their Roles within PARAGON

    We briefly but systematically review major sources of aerosol data, emphasizing suites of measurements that seem most likely to contribute to assessments of global aerosol climate forcing. The strengths and limitations of existing satellite, surface, and aircraft remote sensing systems are described, along with those of direct sampling networks and ship-based stations. It is evident that an enormous number of aerosol-related observations have been made, on a wide range of spatial and temporal sampling scales, and that many of the key gaps in this collection of data could be filled by technologies that either exist or are expected to be available in the near future. Emphasis must be given to combining remote sensing and in situ active and passive observations and integrating them with aerosol chemical transport models, in order to create a more complete environmental picture with sufficient detail to address current climate forcing questions. The Progressive Aerosol Retrieval and Assimilation Global Observing Network (PARAGON) initiative would provide an organizational framework to meet this goal.

    Photon counting compressive depth mapping

    We demonstrate a compressed-sensing, photon-counting lidar system based on the single-pixel camera. Our technique recovers both depth and intensity maps from a single under-sampled set of incoherent, linear projections of a scene of interest at ultra-low light levels, around 0.5 picowatts. Only two-dimensional reconstructions are required to image a three-dimensional scene. We demonstrate intensity imaging and depth mapping at 256 x 256 pixel transverse resolution with acquisition times as short as 3 seconds. We also show novelty filtering, reconstructing only the difference between two instances of a scene. Finally, we acquire 32 x 32 pixel real-time video for three-dimensional object tracking at 14 frames per second.
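    The authors' reconstruction solver is not described in this abstract; the sketch below only illustrates the general idea of recovering a 2-D image from under-sampled, incoherent linear projections, using ISTA with soft thresholding in a DCT basis. The pattern count, sparsifying basis, and test scene are assumptions for illustration, not the system reported in the paper.

```python
import numpy as np
from scipy.fft import dctn, idctn

def ista_reconstruct(y, A, shape, lam=0.05, n_iter=200):
    """Recover an image from single-pixel measurements y = A @ x by iterative
    soft thresholding (ISTA), assuming sparsity in the 2-D DCT domain."""
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1 / Lipschitz constant
    for _ in range(n_iter):
        z = x - step * (A.T @ (A @ x - y))          # gradient step on the data term
        c = dctn(z.reshape(shape), norm="ortho")    # move to the sparsifying basis
        c = np.sign(c) * np.maximum(np.abs(c) - lam * step, 0.0)
        x = idctn(c, norm="ortho").ravel()
    return x.reshape(shape)

# Hypothetical 32x32 scene sampled with ~25% random +/-1 patterns.
rng = np.random.default_rng(3)
shape = (32, 32)
scene = np.zeros(shape)
scene[8:20, 10:22] = 1.0                            # simple block target
A = rng.choice([-1.0, 1.0], size=(256, scene.size))
y = A @ scene.ravel()
recovered = ista_reconstruct(y, A, shape)
```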

    Very Shallow Water Bathymetry Retrieval from Hyperspectral Imagery at the Virginia Coast Reserve (VCR'07) Multi-Sensor Campaign

    A number of institutions, including the Naval Research Laboratory (NRL), have developed look-up tables for remote retrieval of bathymetry and in-water optical properties from hyperspectral imagery (HSI) [6]. For bathymetry retrieval, the lower limit is the very shallow water case (here defined as < 2 m), a depth zone which is not well resolved by many existing bathymetric LIDAR sensors, such as SHOALS [4]. The ability to rapidly model these shallow water depths from HSI directly has potential benefits for combined HSI/LIDAR systems such as the Compact Hydrographic Airborne Rapid Total Survey (CHARTS) [10]. In this study, we focused on the validation of a near infra-red feature, corresponding to a local minimum in absorption (and therefore a local peak in reflectance), which can be correlated directly to bathymetry with a high degree of confidence. Compared to other VNIR wavelengths, this particular near-IR feature corresponds to a peak in the correlation with depth in this very shallow water regime, and this is a spectral range where reflectance depends primarily on water depth (water absorption) and bottom type, with suspended constituents playing a secondary role.
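    As a rough illustration of how reflectance at such a near-IR feature can be tied to depth, the sketch below fits a simple exponential-attenuation model, z ≈ a + b·ln(R − R_deep), to co-located depth observations. The model form, the deep-water offset, and the calibration points are assumptions; the NRL look-up-table retrieval itself is more sophisticated than this.

```python
import numpy as np

def fit_depth_model(nir_reflectance, known_depth, r_deep=0.0):
    """Calibrate z = a + b*ln(R - R_deep) at the near-IR feature
    using co-located depth observations."""
    x = np.log(np.clip(nir_reflectance - r_deep, 1e-6, None))
    b, a = np.polyfit(x, known_depth, 1)   # slope, intercept
    return a, b

def predict_depth(nir_reflectance, a, b, r_deep=0.0):
    """Apply the calibrated model to new reflectance values."""
    x = np.log(np.clip(nir_reflectance - r_deep, 1e-6, None))
    return a + b * x

# Hypothetical calibration points (near-IR reflectance, depth in metres).
refl = np.array([0.08, 0.05, 0.03, 0.02, 0.012])
depth = np.array([0.3, 0.7, 1.1, 1.5, 1.9])
a, b = fit_depth_model(refl, depth)
print(predict_depth(np.array([0.04]), a, b))
```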

    Quantum-inspired computational imaging

    Computational imaging combines measurement and computational methods with the aim of forming images even when the measurement conditions are weak, few in number, or highly indirect. The recent surge in quantum-inspired imaging sensors, together with a new wave of algorithms allowing on-chip, scalable, and robust data processing, has led to increased activity with notable results in the domain of low-light flux imaging and sensing. We provide an overview of the major challenges encountered in low-illumination (e.g., ultrafast) imaging and how these problems have recently been addressed for imaging applications in extreme conditions. These methods provide examples of the future imaging solutions to be developed, for which the best results are expected to arise from an efficient codesign of the sensors and data analysis tools.

    Estimating forest structure in a tropical forest using field measurements, a synthetic model and discrete return lidar data

    Tropical forests are huge reservoirs of terrestrial carbon and are experiencing rapid degradation and deforestation. Understanding forest structure is vital for accurately estimating both forest biomass and natural disturbances, and remote sensing is an essential method for quantifying forest properties and structure in the tropics. Our objective is to examine canopy vegetation profiles formulated from discrete return LIght Detection And Ranging (lidar) data and examine their usefulness in estimating forest structural parameters measured during a field campaign. We developed a modeling procedure that utilized hypothetical stand characteristics to examine lidar profiles; in essence, this is a simple method to further enhance shape characteristics from the lidar profile. In this paper we report results comparing field data collected at La Selva, Costa Rica (10° 26′ N, 83° 59′ W) with forest structure and parameters calculated from vegetation height profiles and forest structural modeling. We developed multiple regression models for each measured forest biometric property using forward stepwise variable selection with the Bayesian information criterion (BIC) as the selection criterion, as sketched below. Among measures of forest structure, including tree lateral density, diameter at breast height, and crown geometry, we found strong relationships with lidar canopy vegetation profile parameters. Metrics developed from lidar that were indicators of canopy height were not significant in estimating plot biomass (p-value = 0.31, r2 = 0.17), but parameters from our synthetic forest model were found to be significant for estimating many of the forest structural properties, such as mean trunk diameter (p-value = 0.004, r2 = 0.51) and tree density (p-value = 0.002, r2 = 0.43). We were also able to develop a significant model relating lidar profiles to basal area (p-value = 0.003, r2 = 0.43). Use of the full lidar profile provided additional avenues for predicting field-based forest measurements. Our synthetic canopy model provides a novel method for examining lidar metrics by developing a look-up table of profiles that determine profile shape, depth, and height. We suggest that the use of canopy-height metrics derived from lidar is limited for understanding biomass in a forest with little variation across the landscape, and that there are many parameters that may be gleaned from lidar data that inform on forest biometric properties.
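    A minimal sketch of forward stepwise selection with BIC, as referenced above, assuming plot-level forest measurements and a matrix of lidar-profile metrics have already been assembled; the metric names and toy data are hypothetical.

```python
import numpy as np

def bic(y, X):
    """BIC of an OLS fit with intercept: n*ln(RSS/n) + p*ln(n)."""
    n = len(y)
    Xd = np.column_stack([np.ones(n), X]) if X.size else np.ones((n, 1))
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    rss = np.sum((y - Xd @ beta) ** 2)
    return n * np.log(rss / n) + Xd.shape[1] * np.log(n)

def forward_stepwise(y, X, names):
    """Greedily add the profile metric that most lowers BIC; stop when none helps."""
    selected, best = [], bic(y, np.empty((len(y), 0)))
    while len(selected) < X.shape[1]:
        scores = {j: bic(y, X[:, selected + [j]])
                  for j in range(X.shape[1]) if j not in selected}
        j_best = min(scores, key=scores.get)
        if scores[j_best] >= best:
            break
        selected.append(j_best)
        best = scores[j_best]
    return [names[j] for j in selected], best

# Hypothetical plot-level data: rows = field plots, columns = lidar-profile metrics.
rng = np.random.default_rng(4)
X = rng.normal(size=(40, 6))
y = 2.0 * X[:, 1] - 1.5 * X[:, 4] + rng.normal(0.0, 0.5, size=40)
names = ["p25", "p50", "p75", "profile_depth", "profile_height", "shape_index"]
print(forward_stepwise(y, X, names))
```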