
    Report of subpanel on feature extraction

    The state of knowledge in feature extraction for Earth resource observation systems is reviewed and research tasks are proposed. Issues in the subpixel feature estimation problem are defined as: (1) the identification of image models that adequately describe the data and the sensor producing them; (2) the construction of local feature models based on those image models; and (3) the problem of understanding the effects of preprocessing on the entire process. The development of ground control point (GCP) libraries for automated selection presents two concerns. The first is the organization of these GCP libraries for rectification problems, i.e., automatically selecting by computer the specific GCPs for particular registration tasks. The second is the importance of integrating ground control patterns into a database management system, allowing an automatic selection system to interface with a large number of sensor image types. The development of data validation criteria for comparing different extraction techniques is also discussed.
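
    To make the library-organization idea concrete, here is a minimal Python sketch of a GCP library supporting automatic selection by image footprint and sensor type. The GroundControlPoint fields, the select_for_registration() criteria, and the sample records are illustrative assumptions, not the subpanel's actual design.

    ```python
    # Hypothetical sketch of a GCP library with automated selection.
    from dataclasses import dataclass

    @dataclass
    class GroundControlPoint:
        gcp_id: str
        lat: float          # geodetic latitude, degrees
        lon: float          # geodetic longitude, degrees
        feature_type: str   # e.g. "road_intersection", "coastline"
        sensor_types: set   # sensors for which this GCP is usable

    class GCPLibrary:
        def __init__(self, points):
            self.points = list(points)

        def select_for_registration(self, bbox, sensor_type, max_points=25):
            """Return GCPs inside an image footprint usable with a sensor.

            bbox = (min_lat, min_lon, max_lat, max_lon)
            """
            min_lat, min_lon, max_lat, max_lon = bbox
            hits = [p for p in self.points
                    if min_lat <= p.lat <= max_lat
                    and min_lon <= p.lon <= max_lon
                    and sensor_type in p.sensor_types]
            return hits[:max_points]

    lib = GCPLibrary([
        GroundControlPoint("gcp-001", 38.9, -77.0, "road_intersection", {"MSS", "TM"}),
        GroundControlPoint("gcp-002", 39.2, -76.6, "coastline", {"TM"}),
    ])
    print(lib.select_for_registration((38.5, -77.5, 39.5, -76.0), "TM"))
    ```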

    An Integrated Approach for Characterizing Aerosol Climate Impacts and Environmental Interactions

    Aerosols exert myriad influences on the earth's environment and climate, and on human health. The complexity of aerosol-related processes requires that information gathered to improve our understanding of climate change originate from multiple sources, and that effective strategies for data integration be established. While a vast array of observed and modeled data are becoming available, the aerosol research community currently lacks the necessary tools and infrastructure to reap maximum scientific benefit from these data. Spatial and temporal sampling differences among a diverse set of sensors, nonuniform data qualities, aerosol mesoscale variabilities, and difficulties in separating cloud effects are some of the challenges that need to be addressed. Maximizing the long-term benefit from these data also requires maintaining consistently well-understood accuracies as measurement approaches evolve and improve. A comprehensive understanding of how aerosol physical, chemical, and radiative processes impact the earth system can be achieved only through a multidisciplinary, inter-agency, and international initiative capable of dealing with these issues. A systematic approach, capitalizing on modern measurement and modeling techniques, geospatial statistics methodologies, and high-performance information technologies, can provide the necessary machinery to support this objective. We outline a framework for integrating and interpreting observations and models, and for establishing an accurate, consistent, and cohesive long-term record, following a strategy whereby information and tools of progressively greater sophistication are incorporated as problems of increasing complexity are tackled. This concept is named the Progressive Aerosol Retrieval and Assimilation Global Observing Network (PARAGON). To encompass the breadth of the effort required, we present a set of recommendations dealing with data interoperability; measurement and model integration; multisensor synergy; data summarization and mining; model evaluation; calibration and validation; augmentation of surface and in situ measurements; advances in passive and active remote sensing; and design of satellite missions. Without an initiative of this nature, the scientific and policy communities will continue to struggle to understand the quantitative impact of complex aerosol processes on regional and global climate change and air quality.
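
    As a toy illustration of one sampling challenge noted above, the following Python sketch bins aerosol optical depth (AOD) retrievals from two sensors with different sampling densities onto a common 1-degree grid before merging them. The data are synthetic and the gridding scheme is an assumption; PARAGON itself prescribes no specific code.

    ```python
    # Hypothetical gridding/merging of point retrievals from two sensors.
    import numpy as np

    def grid_aod(lats, lons, aod, res=1.0):
        """Average point retrievals onto a global lat/lon grid."""
        nlat, nlon = int(180 / res), int(360 / res)
        total = np.zeros((nlat, nlon))
        count = np.zeros((nlat, nlon))
        i = ((lats + 90) / res).astype(int).clip(0, nlat - 1)
        j = ((lons + 180) / res).astype(int).clip(0, nlon - 1)
        np.add.at(total, (i, j), aod)
        np.add.at(count, (i, j), 1)
        with np.errstate(invalid="ignore"):
            return np.where(count > 0, total / count, np.nan)

    rng = np.random.default_rng(0)
    # Synthetic retrievals: a densely and a sparsely sampling sensor
    g1 = grid_aod(rng.uniform(-90, 90, 5000), rng.uniform(-180, 180, 5000),
                  rng.gamma(2.0, 0.1, 5000))
    g2 = grid_aod(rng.uniform(-90, 90, 800), rng.uniform(-180, 180, 800),
                  rng.gamma(2.0, 0.1, 800))
    # Merge: average where both grids report, else take whichever exists
    merged = np.where(np.isnan(g1), g2,
                      np.where(np.isnan(g2), g1, 0.5 * (g1 + g2)))
    ```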

    From buildings to cities: techniques for the multi-scale analysis of urban form and function

    The built environment is a significant factor in many urban processes, yet direct measures of built form are seldom used in geographical studies. Representation and analysis of urban form and function could provide new insights and improve the evidence base for research. So far, progress has been slow due to limited data availability, computational demands, and a lack of methods for integrating built environment data with aggregate geographical analysis. Improvements in spatial data and computation are overcoming some of these problems, but there remains a need for techniques to process and aggregate urban form data. Here we develop a Built Environment Model of urban function and dwelling type classifications for Greater London, based on detailed topographic and address-based data (sourced from Ordnance Survey MasterMap). The multi-scale approach allows the Built Environment Model to be viewed at fine scales for local planning contexts and at city-wide scales for aggregate geographical analysis, allowing an improved understanding of urban processes. This flexibility is illustrated with two examples, urban function analysis and residential type analysis, which show both local-scale urban clustering and city-wide trends in density and agglomeration. While we demonstrate the multi-scale Built Environment Model to be a viable approach, a number of accuracy issues are identified, including the limitations of 2D data, inaccuracies in commercial function data, and problems with temporal attribution. These limitations currently restrict the more advanced applications of the Built Environment Model.
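
    The multi-scale aggregation step can be pictured with a small Python sketch: building-level function classifications are rolled up to coarser zones for city-wide analysis. The column names and toy records are assumptions for illustration; the actual model is derived from OS MasterMap data.

    ```python
    # Hypothetical roll-up of building-level function data to zones.
    import pandas as pd

    buildings = pd.DataFrame({
        "building_id": [1, 2, 3, 4, 5],
        "zone":        ["E01", "E01", "E01", "E02", "E02"],
        "function":    ["residential", "retail", "residential",
                        "office", "residential"],
        "floor_area":  [120.0, 340.0, 95.0, 800.0, 110.0],
    })

    # Fine scale: the per-building frame itself. Coarse scale: the
    # share of floor area by function within each zone.
    zone_mix = buildings.pivot_table(index="zone", columns="function",
                                     values="floor_area", aggfunc="sum",
                                     fill_value=0.0)
    zone_mix = zone_mix.div(zone_mix.sum(axis=1), axis=0)
    print(zone_mix.round(2))
    ```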

    UK utility data integration: overcoming schematic heterogeneity

    In this paper we discuss the syntactic, semantic and schematic issues that inhibit the integration of utility data in the UK. We then focus on the techniques employed within the VISTA project to overcome schematic heterogeneity. A Global Schema based architecture is employed. Although automated approaches to Global Schema definition were attempted, the heterogeneities of the sector proved too great, so a manual approach to Global Schema definition was employed. The techniques used to define and subsequently map source utility data models to this schema are discussed in detail. To ensure a coherent integrated model, sub- and cross-domain validation issues are then highlighted. Finally, the proposed framework and data flow for schematic integration are introduced.
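
    The per-source mapping and validation workflow might look like the following Python sketch, in which field mappings translate heterogeneous utility records into a single global schema and a simple validation pass checks the result. The field names and rules are assumptions, not the actual VISTA global schema.

    ```python
    # Hypothetical global-schema mapping for heterogeneous utility data.
    GLOBAL_FIELDS = {"asset_id", "utility", "material", "depth_m"}

    SOURCE_MAPPINGS = {
        "water_co": {"pipe_ref": "asset_id", "mat": "material", "cover": "depth_m"},
        "gas_co":   {"id": "asset_id", "pipe_material": "material", "depth": "depth_m"},
    }

    def to_global(source, record):
        """Translate a source record into the global schema."""
        mapping = SOURCE_MAPPINGS[source]
        out = {"utility": source}
        for src_field, glob_field in mapping.items():
            out[glob_field] = record.get(src_field)
        return out

    def validate(record):
        """Sub-domain check: mandatory fields present, depth plausible."""
        missing = GLOBAL_FIELDS - record.keys()
        if missing:
            return f"missing fields: {missing}"
        if not (0 < record["depth_m"] < 10):
            return "implausible burial depth"
        return "ok"

    rec = to_global("water_co", {"pipe_ref": "W-17", "mat": "PVC", "cover": 1.2})
    print(rec, validate(rec))
    ```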

    Transitioning Applications to Semantic Web Services: An Automated Formal Approach

    Semantic Web Services have been recognized as a promising technology that exhibits huge commercial potential and attracts significant attention from both industry and the research community. Despite high expectations, the industrial take-up of Semantic Web Service technologies has been slower than expected. One of the main reasons is that many systems have been developed without considering the potential of the web for integrating services and sharing resources. Without a systematic methodology and proper tool support, the migration from legacy systems to Semantic Web Service-based systems can be a very tedious and expensive process, which carries a definite risk of failure. There is an urgent need for strategies that allow the migration of legacy systems to Semantic Web Services platforms, and for tools to support such strategies. In this paper we propose a methodology for transitioning these applications to Semantic Web Services by taking advantage of rigorous mathematical methods. Our methodology allows users to migrate their applications to a Semantic Web Services platform automatically or semi-automatically.
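
    One way to picture the semi-automatic migration step is the following Python sketch, which introspects a legacy operation's signature and emits a skeletal semantic description as RDF-style triples. The vocabulary URIs and the example operation are placeholders, not the paper's formal method or an actual Semantic Web Services ontology.

    ```python
    # Hypothetical signature-to-service-description generator.
    import inspect

    def get_account_balance(customer_id: str, currency: str) -> float:
        """Legacy operation we want to expose as a semantic service."""
        return 0.0

    def describe(fn, base="http://example.org/services#"):
        """Emit skeletal RDF-style triples for a function's signature."""
        svc = f"<{base}{fn.__name__}>"
        triples = [f"{svc} <rdf:type> <ex:Service> ."]
        sig = inspect.signature(fn)
        for name, param in sig.parameters.items():
            ann = getattr(param.annotation, "__name__", "unknown")
            triples.append(f'{svc} <ex:hasInput> "{name}:{ann}" .')
        ret = getattr(sig.return_annotation, "__name__", "unknown")
        triples.append(f'{svc} <ex:hasOutput> "{ret}" .')
        return "\n".join(triples)

    print(describe(get_account_balance))
    ```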

    Design for diagnostics and prognostics: a physical-functional approach


    Integrating IVHM and Asset Design

    Integrated Vehicle Health Management (IVHM) describes a set of capabilities that enable effective and efficient maintenance and operation of the target vehicle. It encompasses collecting data, conducting analysis, and supporting the decision-making process for sustainment and operation. The design of IVHM systems endeavours to account for all causes of failure in a disciplined, systems-engineering manner. With industry striving to reduce through-life cost, IVHM is a powerful tool for giving forewarning of impending failure and hence control over the outcome. Benefits have been realised from this approach across a number of different sectors, but our ability to realise further benefit from this maturing technology is hindered by the fact that IVHM is still treated as an add-on to the design of the asset, rather than as a sub-system in its own right, fully integrated with the asset design. Elevating and integrating IVHM in this way will enable, among other benefits, architectures to be chosen that accommodate health-ready sub-systems from the supply chain, and design trade-offs to be made. This paper examines the barriers to integrating IVHM with the asset design, presents progress in overcoming them, and suggests potential solutions for those that remain. It addresses IVHM system design from a systems engineering perspective and describes its integration with the asset design within an industrial design process.
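
    The forewarning capability described above can be illustrated with a minimal Python sketch: fit a trend to a degrading health parameter and estimate the remaining time before a failure threshold is crossed. The sensor data, threshold, and linear-degradation assumption are synthetic, not drawn from an actual vehicle programme.

    ```python
    # Hypothetical remaining-useful-life estimate from a wear trend.
    import numpy as np

    hours = np.arange(0, 100, 10.0)          # operating hours
    wear = 0.02 * hours + np.random.default_rng(1).normal(0, 0.05, hours.size)
    FAILURE_THRESHOLD = 2.5                  # wear level deemed unsafe

    # Fit a linear degradation trend and extrapolate to the threshold
    slope, intercept = np.polyfit(hours, wear, 1)
    if slope > 0:
        hours_at_failure = (FAILURE_THRESHOLD - intercept) / slope
        remaining = hours_at_failure - hours[-1]
        print(f"Estimated remaining useful life: {remaining:.0f} h")
    else:
        print("No degradation trend detected")
    ```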