
    The future of Earth observation in hydrology

    In just the past 5 years, the field of Earth observation has progressed beyond the offerings of conventional space-agency-based platforms to include a plethora of sensing opportunities afforded by CubeSats, unmanned aerial vehicles (UAVs), and smartphone technologies that are being embraced by both for-profit companies and individual researchers. Over the previous decades, space agency efforts have brought forth well-known and immensely useful satellites such as the Landsat series and the Gravity Recovery and Climate Experiment (GRACE) system, with costs typically of the order of 1 billion dollars per satellite and with concept-to-launch timelines of the order of 2 decades (for new missions). More recently, the proliferation of smartphones has helped to miniaturize sensors and energy requirements, facilitating advances in the use of CubeSats that can be launched by the dozens, while providing ultra-high (3–5 m) resolution sensing of the Earth on a daily basis. Start-up companies that did not exist a decade ago now operate more satellites in orbit than any space agency, and at costs that are a mere fraction of traditional satellite missions. With these advances come new space-borne measurements, such as real-time high-definition video for tracking air pollution, storm-cell development, flood propagation, precipitation monitoring, or even for constructing digital surfaces using structure-from-motion techniques. Closer to the surface, measurements from small unmanned drones and tethered balloons have mapped snow depths and floods and estimated evaporation at sub-metre resolutions, pushing back on spatio-temporal constraints and delivering new process insights.
At ground level, precipitation has been measured using signal attenuation between antennae mounted on cell phone towers, while the proliferation of mobile devices has enabled citizen scientists to catalogue photos of environmental conditions, estimate daily average temperatures from battery state, and sense other hydrologically important variables such as channel depths using commercially available wireless devices. Global internet access is being pursued via high-altitude balloons, solar planes, and hundreds of planned satellite launches, providing a means to exploit the "internet of things" as an entirely new measurement domain. Such global access will enable real-time collection of data from billions of smartphones or from remote research platforms. This future will produce petabytes of data that can only be accessed via cloud storage and will require new analytical approaches to interpret. The extent to which today's hydrologic models can usefully ingest such massive data volumes is unclear. Nor is it clear whether this deluge of data will be usefully exploited, either because the measurements are superfluous, inconsistent, or not accurate enough, or simply because we lack the capacity to process and analyse them. What is apparent is that the tools and techniques afforded by this array of novel and game-changing sensing platforms present our community with a unique opportunity to develop new insights that advance fundamental aspects of the hydrological sciences. To accomplish this will require more than just an application of the technology: in some cases, it will demand a radical rethink of how we utilize and exploit these new observing systems.
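The attenuation-based retrievals mentioned above generally rest on a power-law relation between specific attenuation and rain rate, A = k · R^α · L, where k and α depend on frequency and polarization. A minimal sketch of the inversion (the function name and the default coefficient values are illustrative, not values for any specific band):

```python
def rain_rate_from_attenuation(path_attenuation_db, path_length_km,
                               k=0.0188, alpha=1.238):
    """Invert the power law A = k * R**alpha * L for rain rate R (mm/h).

    path_attenuation_db : excess attenuation along the link (dB)
    path_length_km      : length of the link between the antennae (km)
    k, alpha            : power-law coefficients; the defaults here are
                          placeholders, not recommended values.
    """
    specific_attenuation_db_km = path_attenuation_db / path_length_km  # dB/km
    return (specific_attenuation_db_km / k) ** (1.0 / alpha)
```

In practice, wet-antenna effects and estimation of the dry-period baseline dominate the error budget, so operational retrievals layer corrections on top of this simple inversion.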

    Introducing an effect of climate change into global models of rain fade on telecommunications links

    Rain attenuation limits the performance of microwave telecommunication links operating above approximately 5 GHz. Recent studies have revealed that over the last twenty years the occurrence of rain, at intensities that cause outage on terrestrial links, has shown a strongly increasing trend in the UK. Globally, the height of rain events has also been observed to increase, which may compound increasing trends in rain fade experienced by Earth-space communication systems. These climatic changes are almost certainly having a significant effect on the performance of existing radio systems and need to be taken into consideration when planning future systems. The International Telecommunication Union – Radiocommunication Sector (ITU-R) maintains a set of internationally accepted models for the engineering and regulation of radio systems globally. Although under constant revision, these models assume that atmospheric fading is stationary. This assumption is inherent in the way the models are tested. In this project, a method is developed to estimate global trends in one of the parameters most fundamental to the ITU-R models: the one-minute rain rate exceeded for 0.01% of an average year. This method introduces climate change into the ITU-R model of this parameter: Rec. ITU-R P.837. The new model is tested using a method that does not make a stationary climate assumption. The Salonen–Poiares Baptista distribution, the fundamental method underlying Rec. ITU-R P.837, has been tested using UK Environment Agency data, but no correlation was found between measured annual accumulations and the distribution parameters. Nonetheless, a link was found between mean annual total precipitation (MT) and the rain rate exceeded at larger time percentages such as 0.1% and 1%.
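The key parameter above, the one-minute rain rate exceeded for 0.01% of an average year (often written R0.01), is an empirical exceedance quantile. A minimal sketch of how it can be computed from a year of one-minute rain-rate samples (the function name and interface are illustrative):

```python
def exceeded_rate(rates_mm_h, percent_exceeded=0.01):
    """Rain rate (mm/h) exceeded for `percent_exceeded` % of the samples.

    rates_mm_h : one-minute rain rates over a full year, zeros included
                 (a non-leap year has 525,600 one-minute samples).
    """
    ordered = sorted(rates_mm_h, reverse=True)
    # number of samples permitted to exceed the returned rate
    n_exceed = max(1, int(len(ordered) * percent_exceeded / 100.0))
    return ordered[n_exceed - 1]
```

For 0.01% of a year this corresponds to roughly 53 minutes of exceedance, which is why R0.01 is so sensitive to the few heaviest rain events and hence to climatic trends in extreme rainfall.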

    On requirements for a satellite mission to measure tropical rainfall

    Tropical rainfall data are crucial in determining the role of tropical latent heating in driving the circulation of the global atmosphere. The data are also particularly important for testing the realism of climate models and their ability to simulate and predict climate accurately on the seasonal time scale. Other scientific issues, such as the effects of El Niño on climate, could be addressed with a reliable, extended time series of tropical rainfall observations. A passive microwave sensor is planned to provide information on the integrated column precipitation content, its areal distribution, and its intensity. An active microwave sensor (radar) will define the layer depth of the precipitation and provide information about the intensity of rain reaching the surface, the key to determining the latent heat input to the atmosphere. A visible/infrared sensor will provide very high resolution information on cloud coverage, type, and top temperatures and also serve as the link between these data and the long and virtually continuous coverage by the geosynchronous meteorological satellites. The unique combination of sensor wavelengths, coverages, and resolving capabilities, together with the low-altitude, non-Sun-synchronous orbit, provides a sampling capability that should yield monthly precipitation amounts to a reasonable accuracy over a 500- by 500-km grid.

    NASA scientific and technical publications: A catalog of special publications, reference publications, conference publications, and technical papers, 1989

    This catalog lists 190 citations of all NASA Special Publications, NASA Reference Publications, NASA Conference Publications, and NASA Technical Papers that were entered into the NASA scientific and technical information database during accession year 1989. The entries are grouped by subject category. Indexes of subject terms, personal authors, and NASA report numbers are provided.

    Opportunistic rain rate estimation from measurements of satellite downlink attenuation: A survey

    Recent years have witnessed a growing interest in techniques and systems for rainfall surveillance on regional scale, with increasingly stringent requirements in terms of the following: (i) accuracy of rainfall rate measurements, (ii) adequate density of sensors over the territory, (iii) space‐time continuity and completeness of data and (iv) capability to elaborate rainfall maps in near real time. The devices deployed to monitor the precipitation fields are traditionally networks of rain gauges distributed throughout the territory, along with weather radars and satellite remote sensors operating in the optical or infrared band, none of which, however, are suitable for full compliance with all of the requirements cited above. More recently, a different approach to rain rate estimation techniques has been proposed and investigated, based on the measurement of the attenuation induced by rain on signals of pre‐existing radio networks either in terrestrial links, e.g., the backhaul connections in cellular networks, or in satellite‐to‐earth links and, among the latter, notably those between geostationary broadcast satellites and domestic subscriber terminals in the Ku and Ka bands. Knowledge of the above rain‐induced attenuation permits the retrieval of the corresponding rain intensity provided that a number of meteorological and geometric parameters are known and ultimately permits estimating the rain rate locally at the receiver site. In this survey paper, we specifically focus on such a type of "opportunistic" systems for rain field monitoring, which appear very promising in view of the wide diffusion over the territory of low‐cost domestic terminals for the reception of satellite signals, prospectively allowing for a considerable geographical capillarity in the distribution of sensors, at least in more densely populated areas.
The purpose of the paper is to present a broad yet concise overview of the numerous issues inherent in the above rain monitoring approach, along with a number of solutions and algorithms proposed in the literature in recent years, and ultimately to provide an exhaustive account of the current state of the art. Initially, the main relevant aspects of the satellite link are reviewed, including those related to satellite dynamics, frequency bands, signal formats, propagation channel and radio link geometry, all of which have a role in rainfall rate estimation algorithms. We discuss the impact of all these factors on rain estimation accuracy while also highlighting the substantial differences inherent in this approach in comparison with traditional rain monitoring techniques. We also review the basic formulas relating rain rate intensity to a variation of the received signal level or of the signal‐to‐noise ratio. Furthermore, we present a comprehensive literature survey of the main research issues for the aforementioned scenario and provide a brief outline of the algorithms proposed for their solution, highlighting their points of strength and weakness. The paper includes an extensive list of bibliographic references from which the material presented herein was taken.
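The basic retrieval chain the survey reviews — attenuation inferred from the drop in received signal quality relative to a clear-sky baseline, converted to rain rate through the specific-attenuation power law over the slant path through rain — can be sketched as follows. The simple geometric path model and the coefficient values are illustrative assumptions only:

```python
import math

def rain_rate_from_downlink(clear_sky_snr_db, measured_snr_db,
                            rain_height_km, station_height_km,
                            elevation_deg, k=0.1, alpha=1.1):
    """Sketch of opportunistic rain estimation on a satellite-to-earth link.

    The rain-induced attenuation is taken as the clear-sky SNR baseline
    minus the current SNR; the slant path through rain is approximated
    from the rain height and the link elevation angle. k and alpha are
    placeholder power-law coefficients (they depend on band/polarization).
    """
    excess_attenuation_db = clear_sky_snr_db - measured_snr_db
    slant_path_km = (rain_height_km - station_height_km) / math.sin(
        math.radians(elevation_deg))
    specific_attenuation_db_km = excess_attenuation_db / slant_path_km
    return (specific_attenuation_db_km / k) ** (1.0 / alpha)
```

This inversion assumes rain fills the slant path uniformly up to the rain height; real algorithms must also handle baseline drift, scintillation, wet-antenna loss, and the vertical structure of precipitation.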

    Scintillation/dynamics of the signal


    Guidelines for spaceborne microwave remote sensors

    A handbook was developed to provide information and support to the spaceborne remote sensing and frequency management communities: to guide sensor developers in the choice of frequencies; to advise regulators on sensor technology needs and sharing potential; to present sharing analysis models and, through example, methods for determining sensor sharing feasibility; to introduce developers to the regulatory process; to create awareness of proper assignment procedures; to present sensor allocations; and to provide guidelines on the use and limitations of allocated bands. Controlling physical factors, user requirements, and the regulatory environment are discussed. Sensor frequency allocations, achievable performance, and usefulness are reviewed. Procedures for national and international registration, the use of non-allocated bands and steps for obtaining new frequency allocations, and procedures for reporting interference are also discussed.

    Temporal variability corrections for Advanced Microwave Scanning Radiometer E (AMSR-E) surface soil moisture: case study in Little River Region, Georgia, U.S.

    Statistical correction methods, the Cumulative Distribution Function (CDF) matching technique and the Regional Statistics Method (RSM), are applied to adjust the limited temporal variability of Advanced Microwave Scanning Radiometer E (AMSR-E) data using the Common Land Model (CLM). The temporal variability adjustment between CLM and AMSR-E data was conducted for annual and seasonal periods for 2003 in the Little River region, GA. The results showed that the statistical correction techniques improved AMSR-E's limited temporal variability as compared to ground-based measurements. The regression slope and intercept improved from 0.210 and 0.112 up to 0.971 and -0.005 for the non-growing season. The R² values also modestly improved. The Moderate Resolution Imaging Spectroradiometer (MODIS) Leaf Area Index (LAI) products were able to identify periods having an attenuated microwave brightness signal that are not likely to benefit from these statistical correction techniques.
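CDF matching, as applied in the study above, rescales the satellite record so that its empirical distribution matches that of the reference record (here the model or ground data). A minimal empirical-quantile sketch; operational implementations typically fit smooth (e.g., piecewise polynomial) functions to the ranked differences rather than using the raw empirical quantiles shown here:

```python
def cdf_match(satellite, reference):
    """Map each satellite value to the reference value having the same
    non-exceedance probability (empirical CDF matching, minimal sketch)."""
    sat_sorted = sorted(satellite)
    ref_sorted = sorted(reference)
    n = len(sat_sorted)

    def match(value):
        # empirical CDF position of `value` within the satellite record
        rank = sum(1 for s in sat_sorted if s <= value)
        p = max(rank - 1, 0) / max(n - 1, 1)
        # reference value at the same quantile
        idx = round(p * (len(ref_sorted) - 1))
        return ref_sorted[idx]

    return [match(v) for v in satellite]
```

Because the mapping preserves rank order, it corrects bias and dynamic range (slope and intercept, as reported above) while leaving the temporal ranking of wet and dry periods unchanged.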

    Forest disturbance and recovery: A general review in the context of spaceborne remote sensing of impacts on aboveground biomass and canopy structure

    Abrupt forest disturbances generating gaps >0.001 km² impact roughly 0.4–0.7 million km² a⁻¹. Fire, windstorms, logging, and shifting cultivation are dominant disturbances; minor contributors are land conversion, flooding, landslides, and avalanches. All can have substantial impacts on canopy biomass and structure. Quantifying disturbance location, extent, severity, and the fate of disturbed biomass will improve carbon budget estimates and lead to better initialization, parameterization, and/or testing of forest carbon cycle models. Spaceborne remote sensing maps large-scale forest disturbance occurrence, location, and extent, particularly with moderate- and fine-scale resolution passive optical/near-infrared (NIR) instruments. High-resolution remote sensing (e.g., ∼1 m passive optical/NIR, or small-footprint lidar) can map crown geometry and gaps, but has rarely been systematically applied to study small-scale disturbance and natural mortality gap dynamics over large regions. Reducing uncertainty in disturbance and recovery impacts on global forest carbon balance requires quantification of (1) predisturbance forest biomass; (2) disturbance impact on standing biomass and its fate; and (3) rate of biomass accumulation during recovery. Active remote sensing data (e.g., lidar, radar) are more directly indicative of canopy biomass and many structural properties than passive instrument data; a new generation of instruments designed to generate global coverage/sampling of canopy biomass and structure can improve our ability to quantify the carbon balance of Earth's forests. Generating a high-quality quantitative assessment of disturbance impacts on canopy biomass and structure with spaceborne remote sensing requires comprehensive, well-designed, and well-coordinated field programs collecting high-quality ground-based data and linkages to dynamical models that can use this information.