    Offshore oil spill detection using synthetic aperture radar

    Among the different types of marine pollution, oil spills are considered a major threat to sea ecosystems. The sources of oil pollution can be located on the mainland or directly at sea. Sea-based sources include discharges from ships, offshore platforms and natural seepage from the sea bed, and this pollution can be accidental or deliberate. Sensors to detect and monitor oil spills can be carried on board vessels, aircraft or satellites. Vessels equipped with specialised radars can detect oil at sea, but they cover only a very limited area. One of the established ways to monitor sea-based oil pollution is the use of satellites equipped with Synthetic Aperture Radar (SAR). The aim of the work presented in this thesis is to identify an optimum set of extracted feature parameters and to implement methods for the various stages of oil spill detection from SAR imagery. More than 200 images from the ERS-2, ENVISAT and RADARSAT-2 SAR sensors have been used to assess the proposed feature vector within a detection methodology comprising three stages: segmentation for dark-spot detection, feature extraction, and classification of the feature vector. Unfortunately, an oil spill is not the only phenomenon that can create a dark spot in SAR imagery; several other meteorological, oceanographic and wind-induced phenomena may also produce one. Because these dark objects appear similar to the dark spots caused by oil spills, they are called look-alikes, and they make oil spill detection difficult since their primary characteristics resemble those of oil spill spots. To overcome this difficulty, feature extraction becomes important, a stage which involves the selection of appropriate feature extraction parameters. The main objective of this dissertation is to identify the optimum feature vector for separating oil spill and look-alike spots. A total of 44 extracted feature parameters have been studied. For segmentation, four methods have been implemented, based on edge detection, adaptive thresholding, artificial neural network (ANN) segmentation and contrast split segmentation. Spot features are extracted from both the dark spots themselves and their surroundings. The classification stage was performed using two different techniques: one based on an ANN and the other on a two-stage process that combines classification tree analysis and fuzzy logic. A modified feature vector, including both new and improved features, is suggested for better description of different types of dark spots. An ANN classifier using the full spectrum of feature parameters has also been developed and evaluated. The implemented methodology appears promising in detecting dark spots and discriminating oil spills from look-alikes, and its processing time is well below operational service requirements.
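
    The abstract names adaptive thresholding as one of the four dark-spot segmentation approaches. The sketch below illustrates that general idea on a synthetic SAR-like scene, flagging pixels whose backscatter falls well below the local mean; it is not the thesis implementation, and the window size, threshold offset and minimum region size are illustrative assumptions only.

        import numpy as np
        from scipy.ndimage import uniform_filter, label

        def detect_dark_spots(sar_db, window=51, k_db=3.0, min_pixels=50):
            """Flag dark-spot candidates in a SAR intensity image given in dB.

            A pixel is a candidate when its backscatter lies more than k_db
            decibels below the mean of its window x window neighbourhood;
            connected regions smaller than min_pixels are discarded.
            All parameter values are illustrative, not taken from the thesis.
            """
            local_mean = uniform_filter(sar_db, size=window)
            candidates = sar_db < (local_mean - k_db)
            labels, _ = label(candidates)
            sizes = np.bincount(labels.ravel())
            big = np.nonzero(sizes >= min_pixels)[0]
            big = big[big != 0]                      # drop the background label
            return np.isin(labels, big)

        # Synthetic scene: homogeneous sea clutter with one low-backscatter patch.
        rng = np.random.default_rng(0)
        scene = rng.normal(-8.0, 1.0, size=(400, 400))   # sea background, dB
        scene[150:220, 180:260] -= 6.0                   # simulated slick (damped backscatter)
        mask = detect_dark_spots(scene)
        print("dark-spot candidate pixels:", int(mask.sum()))

    In an operational chain the resulting mask would feed the feature-extraction stage, where geometric and backscatter-contrast features of each spot and its surroundings are computed before classification.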

    Integration of techniques related to ship monitoring: research on the establishment of Chinese Maritime Domain Awareness System


    Book of short Abstracts of the 11th International Symposium on Digital Earth

    This booklet is a collection of the accepted short abstracts of the ISDE11 Symposium.

    Remote Sensing and Geosciences for Archaeology

    This book collects more than 20 papers, written by renowned experts and scientists from across the globe, that showcase the state-of-the-art and forefront research in archaeological remote sensing and the use of geoscientific techniques to investigate archaeological records and cultural heritage. Very high resolution satellite images from optical and radar space-borne sensors, airborne multi-spectral images, ground penetrating radar, terrestrial laser scanning, 3D modelling and Geographic Information Systems (GIS) are among the techniques used in the archaeological studies published in this book. The reader can learn how to use these instruments and sensors, also in combination, to investigate cultural landscapes, discover new sites, reconstruct paleo-landscapes, augment the knowledge of monuments, and assess the condition of heritage at risk. Case studies scattered across Europe, Asia and America are presented: from the UNESCO World Heritage Site of the Lines and Geoglyphs of Nasca and Palpa to heritage under threat in the Middle East and North Africa, from coastal heritage in the intertidal flats of the German North Sea to Early Neolithic settlements in Thessaly. Beginners will learn robust research methodologies and find inspiration; experienced scholars will derive input for new research and applications.

    Technology Resources for Earthquake Monitoring and Response (TREMOR)

    Earthquakes represent a major hazard for populations around the world, causing frequent loss of life, human suffering, and enormous damage to homes, other buildings, and infrastructure. The Technology Resources for Earthquake Monitoring and Response (TREMOR) proposal is designed to address this problem. This proposal recommends two prototype systems integrating space-based and ground-based technology. The suggested pilot implementation is over a 10-year period in three focus countries – China, Japan, and Peru – that are among the areas of the world most afflicted by earthquakes. The first proposed system is an Earthquake Early Warning Prototype System that addresses the potential of earthquake precursors, the science of which is incomplete and considered controversial within the scientific community. We recommend the development and launch of two small satellites to study ionospheric and electromagnetic precursors. In combination with ground-based precursor research, the data gathered will improve existing knowledge of earthquake-related phenomena. The second proposed system is an Earthquake Simulation and Response Prototype. An earthquake simulator will combine any available precursor data with detailed knowledge of the affected areas, using a Geographic Information System (GIS), to identify those areas that are most likely to experience the greatest level of damage. Mobile satellite communication hubs will provide telephone and data links between response teams, while satellite navigation systems will locate and track emergency vehicles. We recommend a virtual response satellite constellation composed of existing and future high-resolution satellites. We also recommend education and training for response teams on the use of these technologies. The two prototypes will be developed and implemented by a proposed non-profit non-governmental organization (NGO) called the TREMOR Foundation, which will obtain funding from government disaster management agencies and NGOs. A for-profit subsidiary will market any spin-off technologies and provide an additional source of funding. Assuming positive results from the prototype systems, Team TREMOR recommends their eventual and permanent implementation in all countries affected by earthquakes.
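
    As a rough illustration of the kind of GIS overlay such an earthquake simulator could perform, the sketch below combines hypothetical, co-registered raster layers for shaking hazard, exposure and vulnerability into a relative damage-likelihood index; the layers, weights and threshold are assumptions made for illustration and are not taken from the TREMOR proposal.

        import numpy as np

        def damage_likelihood(shaking, exposure, vulnerability, weights=(0.5, 0.3, 0.2)):
            """Combine normalised raster layers into a relative damage-likelihood index.

            The inputs are co-registered grids scaled to [0, 1]; the weights are
            placeholders that a real system would calibrate against damage data.
            """
            w_s, w_e, w_v = weights
            return w_s * shaking + w_e * exposure + w_v * vulnerability

        # Hypothetical 100 x 100 grids standing in for real GIS layers.
        rng = np.random.default_rng(1)
        shaking = rng.random((100, 100))        # e.g. modelled ground shaking, rescaled
        exposure = rng.random((100, 100))       # e.g. building/population density, rescaled
        vulnerability = rng.random((100, 100))  # e.g. construction-quality score, rescaled

        index = damage_likelihood(shaking, exposure, vulnerability)
        priority = index > np.quantile(index, 0.95)   # top 5% of cells for response planning
        print("priority cells:", int(priority.sum()))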

    Urban Informatics

    This open access book is the first to systematically introduce the principles of urban informatics and its application to every aspect of the city that involves its functioning, control, management, and future planning. It introduces new models and tools being developed to understand and implement these technologies that enable cities to function more efficiently – to become ‘smart’ and ‘sustainable’. The smart city has quickly emerged as computers have become ever smaller to the point where they can be embedded into the very fabric of the city, as well as being central to new ways in which the population can communicate and act. When cities are wired in this way, they have the potential to become sentient and responsive, generating massive streams of ‘big’ data in real time as well as providing immense opportunities for extracting new forms of urban data through crowdsourcing. This book offers a comprehensive review of the methods that form the core of urban informatics, from various kinds of urban remote sensing to new approaches to machine learning and statistical modelling. It provides a detailed technical introduction to the wide array of tools information scientists need to develop the key urban analytics that are fundamental to learning about the smart city, and it outlines ways in which these tools can be used to inform design and policy so that cities can become more efficient, with a greater concern for the environment and equity.

    Analysis of vegetation-activity trends in a global land degradation framework

    Land degradation is a global issue on a par with climate change and loss of biodiversity, but its extent and severity are only roughly known, and there is little detail on the immediate processes – let alone the drivers. Earth-observation methods enable monitoring of land resources in a consistent, physical way and on a global scale by using vegetation activity and/or cover as proxies. A well-known spectral proxy is the normalized difference vegetation index (NDVI), which has been available as high-temporal-resolution time series since the early 1980s. In this work, harmonic analyses and non-parametric trend tests were applied to the GIMMS NDVI dataset (1981–2008) in order to quantify positive changes (greening) and negative changes (browning). Phenological shifts and variations in the length of the growing season were accounted for by analysing the data by vegetation development stage rather than by calendar day. This approach does not rely on temporal aggregation to eliminate seasonal variation, which might introduce artificial trends, as demonstrated in the chapter on the modifiable temporal unit problem. Still, a major assumption underlying the analysis is that trends are invariant, i.e. linear or monotonic, over time, whereas apparent monotonic trends in vegetation activity may in fact consist of an alternating sequence of greening and/or browning periods. This effect, and the contribution of short-term trends to longer-term change, was analysed using a procedure for the detection of trend breaks. Both abrupt and gradual changes were found in large parts of the world, especially in (semi-arid) shrubland and grassland. Many abrupt changes were found around large-scale natural events such as the Mt Pinatubo eruption in 1991 and the strong 1997/98 El Niño event. This underlines the importance of accounting for trend changes in the analysis of long-term NDVI time series. These new change-detection techniques advance our understanding of vegetation variability at a multi-decadal scale, but they do not provide links to driving processes, and it is very complex to disentangle all natural and human drivers and their interactions. As a first step, the spatial relation between changes in climate parameters and changes in vegetation activity was addressed in this work. A substantial proportion (54%) of the spatial variation in NDVI changes could be associated with climatic changes in temperature, precipitation and incident radiation, especially in forest biomes. In other regions, the lack of such associations might be interpreted as human-induced land degradation. With these steps we demonstrate the value of global satellite records for monitoring land resources, although many steps are still to be taken.
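
    As a simplified illustration of the kind of analysis described above, the sketch below removes an annual cycle from a synthetic NDVI series with a first-order harmonic fit and then applies a Mann-Kendall test, a common non-parametric trend test. It works on calendar time rather than vegetation development stage, uses synthetic data rather than the GIMMS record, and is not the procedure implemented in the thesis.

        import numpy as np
        from scipy import stats

        def deseasonalise(ndvi, period=24):
            """Remove a first-order harmonic (annual) cycle by least squares.

            period is the number of observations per year (24 for a bimonthly,
            GIMMS-like series); a single harmonic is an illustrative simplification.
            """
            t = np.arange(ndvi.size)
            X = np.column_stack([np.ones_like(t, dtype=float),
                                 np.sin(2 * np.pi * t / period),
                                 np.cos(2 * np.pi * t / period)])
            coeffs, *_ = np.linalg.lstsq(X, ndvi, rcond=None)
            seasonal = X[:, 1:] @ coeffs[1:]
            return ndvi - seasonal

        def mann_kendall(x):
            """Return the Mann-Kendall S statistic and a normal-approximation p-value."""
            n = x.size
            s = int(sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n)))
            var_s = n * (n - 1) * (2 * n + 5) / 18.0          # no tie correction, for brevity
            z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
            p = 2 * (1 - stats.norm.cdf(abs(z)))
            return s, p

        # Synthetic bimonthly NDVI: seasonal cycle, mild greening trend, noise.
        rng = np.random.default_rng(2)
        t = np.arange(28 * 24)                                 # 28 "years", 24 samples each
        ndvi = 0.4 + 0.2 * np.sin(2 * np.pi * t / 24) + 0.00005 * t + rng.normal(0, 0.02, t.size)

        s, p = mann_kendall(deseasonalise(ndvi))
        print(f"Mann-Kendall S = {s}, p = {p:.4f}")

    A trend-break procedure such as the one used in the thesis would additionally segment the series before testing, so that alternating greening and browning periods are not averaged into a single monotonic trend.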