METAMORPHISM OF ALS POINT DATA FOR MULTITUDE APPLICATION
Technologically assisted decision-making in urban planning and governance is central to achieving Sustainable Development Goal (SDG) 11: developing sustainable cities and communities. In the current millennium, planners and decision-makers require knowledge-rich virtual models for managing the man-made and natural resources of a city. To effectively exploit technological advancement for sustainable urban development, there is a pressing need for accurate urban resource mapping, a flexible monitoring and information-extraction framework, and engaging visualization. Airborne LiDAR Scanning (ALS) can produce very precise 3D geometric data over expansive urban areas in a timely and economical manner. However, it is challenging to derive viable outcomes from the unstructured and voluminous point cloud. This work proposes an intermediary metamorphosed point cloud storage framework to enhance the utility of point clouds for multiple pragmatic applications. The proposed approach transforms the unstructured, massive point cloud into an ontologically stored collection of urban objects, so that large-scale urban point clouds can be used in decision-making. Further, the paper demonstrates the direct applicability of the metamorphosed point cloud storage framework for two specific applications related to sustainable urban development. Experiments carried out with the proposed framework on the DALES benchmark dataset show promising results.
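The abstract describes grouping an unstructured mass of ALS points into a stored collection of discrete urban objects. As a minimal sketch of that idea only (not the paper's actual method), the toy NumPy snippet below bins a hypothetical point cloud into a coarse 2D grid, so each occupied cell becomes a candidate object; the coordinates and cell size are invented for illustration.

```python
import numpy as np

# Hypothetical miniature point cloud: an N x 3 array of (x, y, z) coordinates.
# Real ALS tiles hold millions of points; these four stand in for two objects.
points = np.array([
    [0.1, 0.2, 5.0],
    [0.3, 0.1, 5.2],
    [10.1, 10.1, 2.0],
    [10.2, 10.3, 2.1],
])

# Bin points into a coarse 2D grid over (x, y); each occupied cell is a
# candidate "object" that could then be classified and stored with ontology labels.
cell_size = 1.0
cells = np.floor(points[:, :2] / cell_size).astype(int)

objects = {}
for cell, pt in zip(map(tuple, cells), points):
    objects.setdefault(cell, []).append(pt)

print(len(objects))  # two spatially separated candidate objects
```

A real pipeline would replace the grid binning with a proper segmentation and classification stage, but the structural shift is the same: from a flat point array to a keyed collection of objects.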
Distributed RDF query processing and reasoning for big data / linked data
Title from PDF of title page, viewed on August 27, 2014. Thesis advisor: Yugyung Lee. Vita. Includes bibliographical references (pages 61-65). Thesis (M.S.)--School of Computing and Engineering, University of Missouri--Kansas City, 2014.
The Linked Data movement aims to convert unstructured and semi-structured data in documents into semantically connected documents called the "web of data." It is based on the Resource Description Framework (RDF), which represents semantic data as statements; a collection of such statements forms an RDF graph. SPARQL is a query language designed specifically to query RDF data. Linked Data faces the same challenges that Big Data does. We are now entering a new paradigm, Big Data and Linked Data, which identifies massive amounts of data in connected form. Indeed, demand for utilizing Linked Data and Big Data continues to grow. We therefore need a scalable and accessible query system to keep existing web data reusable and available. However, existing SPARQL query systems are not sufficiently scalable for Big Data and Linked Data.
In this thesis, we address how to improve the scalability and performance of query processing over Big Data / Linked Data. Our aim is to evaluate presently available SPARQL query engines and to develop an effective model for querying RDF data that is scalable and has reasoning capabilities. We designed an efficient, distributed SPARQL engine using MapReduce (parallel and distributed processing for large data sets on a cluster) and the Apache Cassandra database (a scalable, highly available peer-to-peer distributed database system). We evaluated the existing in-memory ARQ engine provided by the Jena framework and found that it cannot handle large datasets, as it relies entirely on the system's in-memory capacity. The proposed model was shown to have powerful reasoning capabilities and to deal efficiently with big datasets.
Abstract -- Illustrations -- Tables -- Introduction -- Background and related work -- Graph-store based SPARQL model -- Graph-store based SPARQL model implementation -- Results and evaluation -- Conclusion and future work -- References
EG-ICE 2021 Workshop on Intelligent Computing in Engineering
The 28th EG-ICE International Workshop 2021 brings together international experts working at the interface between advanced computing and modern engineering challenges. Many engineering tasks require open-world resolutions to support multi-actor collaboration, coping with approximate models, providing effective engineer-computer interaction, search in multi-dimensional solution spaces, accommodating uncertainty, including specialist domain knowledge, performing sensor-data interpretation and dealing with incomplete knowledge. While results from computer science provide much initial support for resolution, adaptation is unavoidable and most importantly, feedback from addressing engineering challenges drives fundamental computer-science research. Competence and knowledge transfer goes both ways
Experimental workflow for the creation of a non-conventional open source HBIM platform integrating metric data and stratigraphic analysis: the case study of the refectory of Santa Maria di Staffarda Abbey
The abstract is in the attachment.