
    Digital data from shuttle photography: The effects of platform variables

    Two major criticisms of using Shuttle hand-held photography as an Earth science sensor are that it is nondigital and nonquantitative, and that it has inconsistent platform characteristics (e.g., variable look angles), especially compared to remote sensing satellites such as LANDSAT and SPOT. However, these criticisms are assumptions that have not been systematically investigated. This study examines the spectral effects of off-nadir views in hand-held photography from the Shuttle and their role in interpreting lava flow morphology on the island of Hawaii. Digitization of the photography at JSC and the use of LIPS image analysis software to obtain data are discussed. Preliminary interpretative results for one flow are given. Most of the effort was spent developing procedures and overcoming equipment problems. Preliminary data are satisfactory for detailed analysis.

    An analysis of the determinants of flood damages

    In this paper we analyze mortality caused by 2,194 large flood events between 1985 and 2008 in 108 countries. Unlike previous studies of natural-disaster mortality, we find that year-to-year changes in income and in the institutional determinants of vulnerability do not affect flood mortality directly. Income and institutions influence mortality only indirectly, through their impact on the intensity and frequency of floods. Population exposure affects the number of deaths both directly and indirectly: higher population exposure results in more deaths once a flood has occurred, but it is associated with smaller floods, and in developing countries it also reduces the number of floods.

    Keywords: Natural Disasters, Floods, Mortality, Adaptation, Climate Change, Environmental Economics and Policy, International Development, Land Economics/Use, Risk and Uncertainty, Q54

    Quality assessment technique for ubiquitous software and middleware

    Ubiquitous computing systems are the new paradigm of computing and information systems. The technology-oriented issues of ubiquitous computing have led researchers to focus on feasibility studies of the technologies rather than on building quality assurance indices or guidelines. In this context, measuring quality is the key to developing high-quality ubiquitous computing products. For this reason, various quality models have been defined, adopted, and enhanced over the years; for example, the recognised standard quality model ISO/IEC 9126 reflects a consensus on a software quality model with three levels: characteristics, sub-characteristics, and metrics. However, this scheme is unlikely to be directly applicable to ubiquitous computing environments, which differ considerably from conventional software, so much attention is being given to reformulating existing methods and, especially, to elaborating new assessment techniques for ubiquitous computing environments. This paper selects appropriate quality characteristics for the ubiquitous computing environment, which can be used as quality targets for both product evaluation processes and development processes. Each of the quality characteristics has been expanded with evaluation questions and metrics, and in some cases with measures. In addition, the quality model has been applied in an industrial ubiquitous computing setting. The results reveal that while the approach is sound, some parts need further development.
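    The three-level structure the abstract describes (characteristics, sub-characteristics, metrics) can be sketched as a nested mapping. This is a minimal illustrative sketch; the characteristic and metric names below are hypothetical examples, not the paper's or ISO/IEC 9126's actual entries.

```python
# Hypothetical three-level quality model: characteristic -> sub-characteristic -> metrics.
# Names and values are illustrative only.
quality_model = {
    "reliability": {
        "fault tolerance": {"mean_time_between_failures_h": 120.0},
        "recoverability": {"mean_recovery_time_s": 4.2},
    },
    "usability": {
        "operability": {"task_completion_rate": 0.93},
    },
}

def flatten_metrics(model):
    """Yield (characteristic, sub-characteristic, metric, value) tuples."""
    for characteristic, subs in model.items():
        for sub, metrics in subs.items():
            for metric, value in metrics.items():
                yield characteristic, sub, metric, value

for row in flatten_metrics(quality_model):
    print(row)
```

    Flattening the hierarchy this way makes it straightforward to attach evaluation questions or thresholds to each leaf metric.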

    Remote Sensing and Forest Conservation: Challenges of Illegal Logging in Kursumlija Municipality (Serbia)

    Evidence convincingly shows that illegal and corrupt activities are a major underlying cause of deforestation: illegal logging accounts for up to 30% of the global market, in excess of US $20 billion a year. Since so much deforestation results from illegal logging, official production statistics cannot be relied on to capture it. Given the importance and complexity of forest preservation, we evaluated the possible use of the normalized difference vegetation index (NDVI) in local forest management and in the prevention of illegal logging and corruption. We used the example of the southern Serbian municipality of Kursumlija, which in the 2006–2011 period lost 10% of its forest area, an obvious result of abrupt illegal logging. This process was easy to locate and quantify, because the illegal logging produced large canopy gaps extending from the border of Kosovo approximately 3–4 km into Kursumlija's territory. In short, NDVI is very promising for countries like Serbia that rarely perform forest inventories: it is relatively cheap and quick, it can provide forest managers with essential information, it is easy to implement, and its objectivity can significantly help in avoiding corruption and illegal logging.
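    The NDVI the abstract relies on is the standard normalized difference of near-infrared and red reflectance, (NIR − Red) / (NIR + Red). A minimal sketch (band values and the small epsilon guard are illustrative, not taken from the study):

```python
import numpy as np

def ndvi(nir, red, eps=1e-10):
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red).

    Works on scalars or arrays; eps guards against division by zero
    over water or shadow pixels where NIR + Red can be ~0.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Healthy canopy reflects strongly in NIR, so NDVI approaches +1;
# canopy gaps from logging show a sharp drop in NDVI.
print(ndvi(0.5, 0.1))
```

    Change detection for logging then amounts to differencing NDVI rasters from two dates and flagging pixels whose NDVI dropped below a chosen threshold.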

    Graph Laplacian for Image Anomaly Detection

    The Reed-Xiaoli detector (RXD) is recognized as the benchmark algorithm for image anomaly detection; however, it has known limitations, namely its dependence on the image following a multivariate Gaussian model, the need to estimate and invert a high-dimensional covariance matrix, and its inability to effectively include spatial awareness in its evaluation. In this work, a novel graph-based solution to the image anomaly detection problem is proposed; leveraging the graph Fourier transform, we are able to overcome some of RXD's limitations while reducing computational cost at the same time. Tests on both hyperspectral and medical images, using both synthetic and real anomalies, show that the proposed technique obtains significant gains over other state-of-the-art algorithms. Comment: Published in Machine Vision and Applications (Springer).
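    The RXD baseline the abstract compares against scores each pixel spectrum by its Mahalanobis distance from the global image statistics, which is where the covariance estimation and inversion burden comes from. A minimal sketch of the global RX detector (not the paper's graph-based method; the pseudo-inverse is an assumption to handle near-singular covariances):

```python
import numpy as np

def rx_scores(cube):
    """Global Reed-Xiaoli anomaly scores for an (H, W, B) image cube.

    Each pixel's score is its squared Mahalanobis distance from the
    image-wide mean spectrum: (x - mu)^T C^{-1} (x - mu).
    """
    h, w, b = cube.shape
    x = cube.reshape(-1, b).astype(float)
    mu = x.mean(axis=0)
    xc = x - mu
    cov = xc.T @ xc / (x.shape[0] - 1)     # B x B sample covariance
    cov_inv = np.linalg.pinv(cov)          # pseudo-inverse guards against singularity
    # Per-pixel quadratic form without materializing an N x N matrix.
    scores = np.einsum("ij,jk,ik->i", xc, cov_inv, xc)
    return scores.reshape(h, w)

# Illustrative use: a synthetic cube with one spectrally anomalous pixel.
rng = np.random.default_rng(0)
cube = rng.normal(0.0, 1.0, (8, 8, 5))
cube[3, 4, :] += 10.0                      # inject an anomaly at (3, 4)
scores = rx_scores(cube)
```

    Note the covariance inversion is over B x B (bands), but B can be in the hundreds for hyperspectral data, which is exactly the cost the graph-based approach aims to reduce.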