31,477 research outputs found

    Information Extraction, Data Integration, and Uncertain Data Management: The State of The Art

    Information extraction, data integration, and uncertain data management are distinct research areas that have received considerable attention over the last two decades. Much of this research has addressed the areas individually; however, information extraction systems should be integrated with data integration methods so that the extracted information can actually be used. Handling the uncertainty that arises during extraction and integration is an important issue for improving data quality in such integrated systems. This article presents the state of the art in these areas, highlights their common ground, and shows how information extraction and data integration can be combined under the umbrella of uncertainty management.
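    To make the idea of "integration under uncertainty management" concrete, here is a minimal, hypothetical sketch (not the article's own method): each extracted fact carries the extractor's confidence, and conflicting values for the same entity attribute are merged into a normalized distribution rather than collapsed into a single value. The `ExtractedFact` structure and the normalization scheme below are illustrative assumptions only.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class ExtractedFact:
    """One value extracted for an (entity, attribute) pair, with extractor confidence."""
    entity: str
    attribute: str
    value: str
    confidence: float  # in [0, 1], as reported by the extraction system

def integrate_with_uncertainty(facts):
    """Merge conflicting extracted values into per-attribute probability distributions.

    Instead of forcing a single value per (entity, attribute), keep every
    candidate and normalize the confidences so they sum to 1 -- a simple
    'possible values' style representation of an uncertain database.
    """
    grouped = defaultdict(lambda: defaultdict(float))
    for f in facts:
        grouped[(f.entity, f.attribute)][f.value] += f.confidence

    integrated = {}
    for key, candidates in grouped.items():
        total = sum(candidates.values())
        integrated[key] = {value: conf / total for value, conf in candidates.items()}
    return integrated

if __name__ == "__main__":
    facts = [
        ExtractedFact("ACME Corp", "headquarters", "Berlin", 0.8),
        ExtractedFact("ACME Corp", "headquarters", "Munich", 0.4),
        ExtractedFact("ACME Corp", "founded", "1998", 0.9),
    ]
    # -> headquarters: {Berlin: 0.67, Munich: 0.33}; founded: {1998: 1.0}
    print(integrate_with_uncertainty(facts))
```

    Real uncertain data management systems use richer models (tuple-level probabilities, lineage, possible worlds), but the core point is the same: conflicting evidence from extraction is retained and quantified during integration rather than discarded.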

    Detecting Functional Requirements Inconsistencies within Multi-teams Projects Framed into a Model-based Web Methodology

    One of the most essential processes within the software project life cycle is the Requirements Engineering Process (REP), because it specifies the software product requirements. This specification should be as consistent as possible, since it allows the effort required to obtain the final product to be estimated appropriately. REP is complex in itself, but this complexity increases greatly in large, distributed, and heterogeneous projects with multiple analyst teams and high integration between functional modules. This paper presents an approach for the systematic conciliation of functional requirements in large projects following a model-based web approach, and shows how it can be implemented in the context of NDT (Navigational Development Techniques), a web methodology. The paper also describes an empirical evaluation in the CALIPSOneo project, analyzing the improvements obtained with our approach. Funding: Ministerio de Economía y Competitividad TIN2013-46928-C3-3-R; Ministerio de Economía y Competitividad TIN2015-71938-RED.

    CIRSS vertical data integration, San Bernardino study

    The creation and use of a vertically integrated database, including LANDSAT data, for local planning purposes in a portion of San Bernardino County, California, are described. The project illustrates that a vertically integrated approach can benefit local users, that it can be used to identify and rectify discrepancies among data sources, and that the LANDSAT component can be used effectively to identify change, perform initial capability/suitability modeling, and update and refine existing data in a geographic information system. Local analyses were developed that produced data of value to planners in the San Bernardino County Planning Department and the San Bernardino National Forest staff.

    Characteristic extraction tool for gravitational waveforms

    We develop and calibrate a characteristic waveform extraction tool whose major improvements and corrections over prior versions allow it to satisfy the accuracy standards required for advanced LIGO data analysis. The extraction tool uses a characteristic evolution code to propagate numerical data on an inner worldtube, supplied by a 3+1 Cauchy evolution, out to null infinity, where the gravitational waveform is obtained. With the new extraction tool, high accuracy and convergence of the numerical error can be demonstrated for the inspiral and merger of mass-M binary black holes, even for an extraction worldtube radius as small as R=20M. The tool provides a means for unambiguous comparison between waveforms generated by evolution codes based upon different formulations of the Einstein equations and different numerical approximations.

    Management of data quality when integrating data with known provenance

    Abstract unavailable; please refer to the PDF.

    SurfelMeshing: Online Surfel-Based Mesh Reconstruction

    We address the problem of mesh reconstruction from live RGB-D video, assuming a calibrated camera and externally provided poses (e.g., from a SLAM system). In contrast to most existing approaches, we do not fuse depth measurements into a volume but into a dense surfel cloud. We asynchronously (re)triangulate the smoothed surfels to reconstruct a surface mesh. This novel approach makes it possible to maintain a dense surface representation of the scene during SLAM that can quickly adapt to loop closures, by deforming the surfel cloud and asynchronously remeshing the surface where necessary. The surfel-based representation also naturally supports strongly varying scan resolution; in particular, it reconstructs colors at the input camera's resolution. Moreover, in contrast to many volumetric approaches, ours can reconstruct thin objects, since objects do not need to enclose a volume. We demonstrate our approach in a number of experiments, showing that it produces reconstructions competitive with the state of the art, and we discuss its advantages and limitations. The algorithm (excluding loop closure functionality) is available as open source at https://github.com/puzzlepaint/surfelmeshing. Comment: Version accepted to IEEE Transactions on Pattern Analysis and Machine Intelligence.
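    As a rough sketch of the surfel-fusion idea described above (assumed for illustration; this is not the SurfelMeshing implementation, and names such as `Surfel` and `integrate_measurement`, as well as the association thresholds, are hypothetical): each surfel keeps a confidence-weighted running average of position, normal, and color, and every back-projected depth measurement either updates a nearby compatible surfel or spawns a new one.

```python
import numpy as np

class Surfel:
    """A surface element: position, normal, color, radius, and accumulated confidence."""
    def __init__(self, position, normal, color, radius):
        self.position = np.asarray(position, dtype=float)
        self.normal = np.asarray(normal, dtype=float)
        self.color = np.asarray(color, dtype=float)
        self.radius = float(radius)
        self.confidence = 1.0

    def fuse(self, position, normal, color, weight=1.0):
        """Fold a new measurement into the surfel as a confidence-weighted average."""
        total = self.confidence + weight
        self.position = (self.confidence * self.position + weight * np.asarray(position)) / total
        self.color = (self.confidence * self.color + weight * np.asarray(color)) / total
        blended_normal = self.confidence * self.normal + weight * np.asarray(normal)
        self.normal = blended_normal / np.linalg.norm(blended_normal)
        self.confidence = total

def integrate_measurement(surfels, position, normal, color, radius, assoc_dist=0.01):
    """Update the surfel cloud with one back-projected depth measurement.

    If an existing surfel lies within `assoc_dist` of the measurement and has a
    compatible normal, fuse the measurement into it; otherwise create a new surfel.
    """
    position = np.asarray(position, dtype=float)
    normal = np.asarray(normal, dtype=float)
    for s in surfels:
        close = np.linalg.norm(s.position - position) < assoc_dist
        compatible = np.dot(s.normal, normal) > 0.5  # normals within roughly 60 degrees
        if close and compatible:
            s.fuse(position, normal, color)
            return s
    new_surfel = Surfel(position, normal, color, radius)
    surfels.append(new_surfel)
    return new_surfel
```

    Only after the surfels have been smoothed would the asynchronous remeshing step triangulate neighboring surfels into a surface mesh, which is the part the paper focuses on; a practical system would also use the measurement's pixel footprint and a spatial index rather than the brute-force search above.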
