
    NeMO-Net & Fluid Lensing: The Neural Multi-Modal Observation & Training Network for Global Coral Reef Assessment Using Fluid Lensing Augmentation of NASA EOS Data

    We present preliminary results from NASA NeMO-Net, the first neural multi-modal observation and training network for global coral reef assessment. NeMO-Net is an open-source deep convolutional neural network (CNN) and interactive active-learning training software, in development, that will assess the present and past dynamics of coral reef ecosystems. NeMO-Net exploits active learning and data fusion of mm-scale remotely sensed 3D images of coral reefs captured using fluid lensing with the NASA FluidCam instrument, presently the highest-resolution benthic remote sensing technology capable of removing ocean wave distortion. These data are fused with hyperspectral airborne remote sensing data from the ongoing NASA CORAL mission and lower-resolution satellite data to determine coral reef ecosystem makeup globally at unprecedented spatial and temporal scales. Aquatic ecosystems, particularly coral reefs, remain quantitatively misrepresented by low-resolution remote sensing as a result of refractive distortion from ocean waves, optical attenuation, and remoteness. Machine learning classification of coral reefs using FluidCam mm-scale 3D data shows that present satellite and airborne remote sensing techniques poorly characterize coral reef percent living cover, morphology type, and species breakdown at the mm, cm, and meter scales. Indeed, current global assessments of coral reef cover and morphology classification based on km-scale satellite data alone can suffer from segmentation errors greater than 40% and are capable of change detection only on yearly temporal scales and decameter spatial scales. This significantly hinders our understanding of patterns and processes in marine biodiversity at a time when these ecosystems are experiencing unprecedented anthropogenic pressures, ocean acidification, and sea surface temperature rise. NeMO-Net leverages our augmented machine learning algorithm, which fuses regional FluidCam (mm, cm-scale) airborne remote sensing with global low-resolution (m, km-scale) airborne and spaceborne imagery to reduce classification errors by up to 80% over regional scales. Such technologies can substantially enhance our ability to assess coral reef ecosystem dynamics.
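
    The abstract does not name a specific architecture or framework; as a rough illustration of the patch-classification component such a CNN implies, here is a minimal PyTorch sketch. The framework choice, patch size, band count, and class list are all placeholder assumptions, not the NeMO-Net implementation.

```python
# Minimal sketch of a CNN that classifies multispectral image patches into
# benthic classes, the kind of model the NeMO-Net description implies.
# All sizes below are illustrative assumptions.
import torch
import torch.nn as nn

class PatchClassifier(nn.Module):
    def __init__(self, in_bands: int = 4, n_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_bands, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 8 * 8, n_classes)  # for 32x32 input patches

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.features(x)
        return self.head(z.flatten(1))

# One forward pass on a hypothetical batch of 4-band 32x32 patches.
model = PatchClassifier()
logits = model(torch.randn(8, 4, 32, 32))
print(logits.shape)  # torch.Size([8, 4])
```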

    System and Method for Active Multispectral Imaging and Optical Communications

    Provided is a system and method for active multispectral imaging having a transmitter that uses narrowband optical radiation to dynamically illuminate an object with modulated structured light in multiple spectral bands, and a receiver that includes an independent panchromatic imager. The transmitter and receiver can be operated in a bistatic, decoupled configuration to enable passive multispectral synthesis, illumination-invariant sensing, and optical communications. In this configuration the transmitter may emit a sequence of spectral bands in an order unknown to the receiver, which passively decodes the spectral identity of each band from a band identifier embedded in the modulated structured light. The receiver passively decodes embedded high-bandwidth simplex communications while reconstructing calibrated multispectral images at video frame rates.
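
    As a hedged illustration of the band-identification scheme described above, the sketch below prefixes each illumination frame's bit stream with a short binary band identifier, letting a receiver recover the spectral identity without knowing the transmit order. The 4-bit header and frame layout are illustrative assumptions, not the patented encoding.

```python
# Illustrative band-ID framing: each illumination pulse carries a short
# binary header naming its spectral band, so a panchromatic receiver can
# sort frames by band without knowing the transmit order.
from typing import List, Tuple

HEADER_BITS = 4  # supports up to 16 spectral bands (assumption)

def encode_frame(band_id: int, payload: List[int]) -> List[int]:
    """Prefix the payload bits with the band identifier, MSB first."""
    header = [(band_id >> i) & 1 for i in reversed(range(HEADER_BITS))]
    return header + payload

def decode_frame(bits: List[int]) -> Tuple[int, List[int]]:
    """Recover (band_id, payload) from a received bit sequence."""
    band_id = 0
    for b in bits[:HEADER_BITS]:
        band_id = (band_id << 1) | b
    return band_id, bits[HEADER_BITS:]

# Transmit two bands in an order the receiver does not know in advance.
frames = [encode_frame(5, [1, 0, 1]), encode_frame(2, [0, 1, 1])]
for f in frames:
    band, data = decode_frame(f)
    print(f"band {band}: payload {data}")
```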

    NASA Next-Generation Sensing Technologies for Earth & Planetary Science

    Dr. Ved Chirayath's seminar will highlight new sensing technologies and airborne platforms he is developing for Earth and space science as director of the NASA Silicon Valley Laboratory for Advanced Sensing. His talk will feature recent work on Fluid Lensing, the first remote sensing technology capable of imaging through ocean waves in 3D at sub-cm resolutions; MiDAR, a next-generation active hyperspectral remote sensing and optical communications instrument; airborne gas sensing of multipollutant combustion sources; and a plasma-actuated unmanned aerial vehicle (UAV) that used high-voltage dielectric discharge devices to achieve the first plasma-controlled flight in history. Fluid Lensing and the NASA FluidCam instrument have been used extensively to provide distortion-free 3D multispectral imagery of shallow marine systems around the world from UAVs. MiDAR is being deployed on aircraft and underwater remotely operated vehicles (ROVs) as a new method to remotely sense living and nonliving structures in extreme environments, serving as an analog for future Ocean Worlds robotic exploration missions. Finally, Chirayath will present preliminary results from NeMO-Net, a supercomputer-based neural network that uses a citizen-science video game for global multimodal coral reef benthic habitat mapping, fusing remote sensing data from Fluid Lensing, MiDAR, NASA's Earth Observing System, and commercial satellites to better understand the present and past dynamics of shallow marine systems. Together, these maturing technologies present promising new ways to explore terrestrial, marine, and aerial systems on Earth and, ultimately, to aid in the search for extraterrestrial life within our solar system and beyond.

    NASA Fluid Lensing & MiDAR - Next-Generation Remote Sensing Technologies for Aquatic Remote Sensing

    Piti's Tepungan Bay and Tumon Bay, two of five marine preserves in Guam, have not been mapped at a level of detail sufficient to support proposed management strategies. This project addresses that gap by providing high-resolution maps to promote sustainable, responsible use of the area while protecting natural resources. Dr. Chirayath, a research scientist at the NASA Ames Laboratory, developed a theoretical model and algorithm called 'Fluid Lensing', which removes optical distortions caused by moving water, improving the clarity of images taken of corals below the surface. We will also be using MiDAR, a next-generation remote sensing instrument that provides real-time multispectral video using an array of LED emitters coupled with NASA's FluidCam Imaging System, which may assist Guam's coral reef response team in understanding the severity and magnitude of coral bleaching events. This project will produce a 3D orthorectified model of the shallow-water coral reef ecosystems in the Tumon Bay and Piti marine preserves. These 3D models may be printed, creating a tactile diorama that increases understanding of coral reefs among various audiences, including key decision makers. More importantly, the final data products can enable accurate, quantitative health assessments of coral reef ecosystems.

    Fluid Lensing and Applications to Remote Sensing of Aquatic Environments

    The use of fluid lensing technology on UAVs is presented as a novel means for 3D imaging of aquatic ecosystems from above the water's surface at the centimetre scale. Preliminary results are presented from airborne fluid lensing campaigns conducted over the coral reefs of Ofu Island, American Samoa (2013) and the stromatolite reefs of Shark Bay, Western Australia (2014), covering a combined area of 15 km². These reef ecosystems were revealed with centimetre-scale 2D resolution, and an accompanying 3D bathymetry model was derived using fluid lensing, Structure from Motion, and UAV position data. Data products were validated against in-situ survey methods including underwater calibration targets, depth measurements, and millimetre-scale high-dynamic-range gigapixel photogrammetry. Fluid lensing is an experimental technology that uses water-transmitting wavelengths to passively image underwater objects at high resolution by exploiting time-varying optical lensing events caused by surface waves. Fluid lensing data are captured from low-altitude, cost-effective electric UAVs to achieve multispectral imagery and bathymetry models at the centimetre scale over regional areas. As a passive system, fluid lensing is presently limited by signal-to-noise ratio and water-column inherent optical properties to approximately 10 m depth over visible wavelengths in clear waters. The datasets derived from fluid lensing present the first centimetre-scale images of a reef acquired from above the ocean surface without wave distortion. The 3D multispectral data distinguish coral, fish, and invertebrates in American Samoa, and reveal previously undocumented, morphologically distinct stromatolite structures in Shark Bay. These findings suggest fluid lensing and multirotor electric drones represent a promising advance in the remote sensing of aquatic environments at the centimetre scale, or 'reef scale', relevant to the conservation of reef ecosystems. Pending further development and validation of fluid lensing methods, these technologies present a solution for large-scale 3D surveys of shallow aquatic habitats with centimetre-scale spatial resolution and hourly temporal sampling.
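
    Fluid lensing's core idea of exploiting moments when surface waves momentarily focus and magnify the benthic scene is loosely analogous to lucky imaging in astronomy. The sketch below shows a lucky-imaging-style frame-selection step, scoring sharpness by Laplacian energy; it is an illustrative analog under stated assumptions, not the published fluid lensing algorithm.

```python
# Lucky-imaging-style selection: among many video frames of the same benthic
# patch, keep those where a passing surface wave momentarily focuses the
# scene. Sharpness ~ Laplacian energy. Illustrative only.
import numpy as np
from scipy.ndimage import laplace

def sharpness(frame: np.ndarray) -> float:
    """Higher Laplacian energy ~ more high-frequency detail ~ better focus."""
    return float(np.var(laplace(frame.astype(float))))

def select_lensed_frames(frames: np.ndarray, keep_fraction: float = 0.1) -> np.ndarray:
    """Keep the sharpest fraction of frames for downstream registration."""
    scores = np.array([sharpness(f) for f in frames])
    k = max(1, int(keep_fraction * len(frames)))
    best = np.argsort(scores)[-k:]
    return frames[best]

# Hypothetical stack of 100 co-located frames, 64x64 pixels each.
rng = np.random.default_rng(1)
stack = rng.random((100, 64, 64))
print(select_lensed_frames(stack).shape)  # (10, 64, 64)
```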

    NASA Fluid Lensing & MiDAR: Next-Generation Remote Sensing Technologies for Aquatic Remote Sensing

    We present two recent instrument technology developments at NASA, Fluid Lensing and MiDAR, and their application to remote sensing of Earth's aquatic systems. Fluid Lensing is the first remote sensing technology capable of imaging through ocean waves in 3D at sub-cm resolutions. MiDAR is a next-generation active hyperspectral remote sensing and optical communications instrument capable of active fluid lensing. Fluid Lensing has been used to provide 3D multispectral imagery of shallow marine systems from unmanned aerial vehicles (UAVs, or drones), including coral reefs in American Samoa and stromatolite reefs in Hamelin Pool, Western Australia. MiDAR is being deployed on aircraft and underwater remotely operated vehicles (ROVs) to enable a new method for remote sensing of living and nonliving structures in extreme environments. MiDAR images targets with high-intensity narrowband structured optical radiation to measure an object's non-linear spectral reflectance, image through fluid interfaces such as ocean waves with active fluid lensing, and simultaneously transmit high-bandwidth data. As an active instrument, MiDAR is capable of remotely sensing reflectance at the centimeter (cm) spatial scale with a signal-to-noise ratio (SNR) multiple orders of magnitude higher than passive airborne and spaceborne remote sensing systems, with significantly reduced integration time. This allows for rapid video-frame-rate hyperspectral sensing into the far-ultraviolet and VNIR wavelengths. Previously, MiDAR was developed into a TRL 2 laboratory instrument capable of imaging in thirty-two narrowband channels across the VNIR spectrum (400-950 nm). Recently, MiDAR UV was raised to TRL 4 and expanded to include five ultraviolet bands from 280-400 nm, permitting remote sensing in the UV-A, UV-B, and UV-C bands and enabling mineral identification and stimulated fluorescence measurements of organic proteins and compounds, such as green fluorescent proteins, in terrestrial and aquatic organisms.
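
    To illustrate how an active system with a single panchromatic imager can recover multispectral imagery, the sketch below assembles a spectral cube from sequential narrowband-lit frames using a simple ambient-frame subtraction. The frame layout and subtraction step are assumptions for illustration, not MiDAR's actual processing chain.

```python
# Assemble a multispectral cube from a panchromatic imager: capture one
# frame per narrowband illumination channel, subtract an ambient
# (emitters-off) frame, and stack. Illustrative assumptions throughout.
import numpy as np

def build_cube(lit_frames: np.ndarray, ambient: np.ndarray) -> np.ndarray:
    """lit_frames: (K, H, W) panchromatic captures, one per illumination band.
    ambient: (H, W) capture with all emitters off (sunlight, etc.).
    Returns an (H, W, K) reflectance-proportional cube."""
    cube = np.clip(lit_frames - ambient[None, :, :], 0, None)
    return np.moveaxis(cube, 0, -1)

rng = np.random.default_rng(2)
K, H, W = 32, 48, 48             # e.g. 32 VNIR channels, as in the text
lit = rng.random((K, H, W)) + 0.2
dark = rng.random((H, W)) * 0.2
cube = build_cube(lit, dark)
print(cube.shape)                # (48, 48, 32)
```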

    Use of Multi-Spectral High Repetition Rate LED Systems for High Bandwidth Underwater Optical Communications, and Communications to Surface and Aerial Systems

    A variety of both existing and developing sensors would benefit from near-real-time communication of high-bandwidth data. To cite just one example, sensors that could more accurately report the real-time positions of marine mammals would be useful in reducing whale-ship collisions. Similar considerations apply to maritime port and harbor security, including detection of and alerts for divers or autonomous underwater vehicles (AUVs) that could pose a risk to ships. Especially in ports and harbors, field experiments have confirmed that the cluttered, noisy shallow-water environment, compounded by vertical reflecting surfaces formed by piers and pilings, can limit the reliability and utility of underwater acoustic communications. Moreover, many sensors have greater bandwidth requirements than acoustic communications can provide. Here we discuss the development of high-repetition-rate multispectral LED optical systems initially developed for imaging but also capable of simultaneous data transmission at rates of approximately 100 kilobits per second. Results are discussed for multispectral images of coral reefs in Guam and for data transmission experiments from underwater to surface vessels. Subsequent field efforts will extend data transmission from AUVs to unmanned aircraft systems (UAS).
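
    As a minimal illustration of LED optical data transmission of the kind described, the sketch below implements simple on-off keying (OOK): bits map to light-on/light-off intervals and the receiver thresholds averaged intensity samples. The modulation scheme, oversampling factor, and threshold are illustrative assumptions; the abstract does not specify the actual encoding.

```python
# On-off-keying (OOK) toy model of LED optical data transmission.
import numpy as np

BIT_RATE = 100_000        # ~100 kbit/s, per the text (context only; unused here)
SAMPLES_PER_BIT = 4       # receiver oversampling factor (assumption)

def modulate(bits: list[int]) -> np.ndarray:
    """Each bit becomes SAMPLES_PER_BIT intensity samples (1.0 on, 0.0 off)."""
    return np.repeat(np.array(bits, dtype=float), SAMPLES_PER_BIT)

def demodulate(signal: np.ndarray, threshold: float = 0.5) -> list[int]:
    """Average each bit interval, then threshold."""
    per_bit = signal.reshape(-1, SAMPLES_PER_BIT).mean(axis=1)
    return [int(v > threshold) for v in per_bit]

bits = [1, 0, 1, 1, 0, 0, 1]
noise = np.random.default_rng(3).normal(0, 0.1, len(bits) * SAMPLES_PER_BIT)
rx = modulate(bits) + noise   # received intensity with sensor noise
assert demodulate(rx) == bits
print("recovered:", demodulate(rx))
```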

    NeMO-Net: The Neural Multi-Modal Observation and Training Network for Global Coral Reef Assessment

    In the past decade, coral reefs worldwide have experienced unprecedented stresses due to climate change, ocean acidification, and anthropogenic pressures, instigating massive bleaching and die-off of these fragile and diverse ecosystems. Furthermore, remote sensing of these shallow marine habitats is hindered by ocean wave distortion, refraction, and optical attenuation, leading invariably to data products that are often of low resolution and low signal-to-noise ratio (SNR). However, recent advances in UAV and Fluid Lensing technology have allowed us to capture multispectral 3D imagery of these systems at sub-cm scales from above the water surface, giving us an unprecedented view of their growth and decay. Exploiting the fine-scale features of these datasets, machine learning methods such as MAP, PCA, and SVM can not only accurately classify the living cover and morphology of these reef systems (below 8 percent error), but can also map the spectral space between airborne and satellite imagery, augmenting and improving the classification accuracy of previously low-resolution datasets. We are currently implementing NeMO-Net, the first open-source deep convolutional neural network (CNN) and interactive active learning and training software to accurately assess the present and past dynamics of coral reef ecosystems through determination of percent living cover and morphology. NeMO-Net will be built upon the QGIS platform to ingest UAV, airborne, and satellite datasets from various sources and sensor capabilities and, through data fusion, determine the coral reef ecosystem makeup globally at unprecedented spatial and temporal scales. To achieve this, we will exploit virtual data augmentation, semi-supervised learning, and active learning through a tablet platform that allows users to manually train uncertain or difficult-to-classify data. The project will make use of Python's extensive libraries for machine learning, as well as GPU integration and the High-End Computing Capability (HECC) of the Pleiades supercomputing cluster located at NASA Ames. The project is supported by NASA's Earth Science Technology Office (ESTO) Advanced Information Systems Technology (AIST-16) Program.
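
    The classical pipeline the abstract names, PCA feature reduction feeding an SVM classifier, can be sketched in a few lines of scikit-learn. The data below are random placeholders; the sub-8-percent error figure comes from the real FluidCam datasets, not from this toy example.

```python
# PCA -> SVM pipeline on hypothetical multispectral feature vectors.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.svm import SVC

rng = np.random.default_rng(4)
X = rng.random((2000, 16))        # hypothetical 16-feature spectral/texture vectors
y = rng.integers(0, 3, 2000)      # e.g. branching / mounding / sand classes

model = make_pipeline(PCA(n_components=6), SVC(kernel="rbf"))
model.fit(X[:1500], y[:1500])
print(f"held-out accuracy: {model.score(X[1500:], y[1500:]):.2f}")
```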
