59 research outputs found

    Semantic Trajectories: Computing and Understanding Mobility Data

    Get PDF
    Thanks to the rapid development of mobile sensing technologies (GPS, GSM, RFID, accelerometers, gyroscopes, microphones and other sensors in smartphones), the large-scale capture of the evolving positioning data (called mobility data or trajectories) generated by moving objects with embedded sensors has become feasible, both technically and economically. We have already entered a world full of trajectories. The state of the art on trajectories, whether from the moving-object-database area or from the statistical-analysis viewpoint, has produced a range of sophisticated techniques for ad-hoc trajectory storage, indexing, querying, mining, etc. However, most existing methods focus mainly on a spatio-temporal view of mobility data: they analyze only the geometric movement of trajectories (e.g., the raw ⟨x, y, t⟩ sequences) without sufficient consideration of the high-level semantics needed to understand the underlying movement behaviors. To address this challenge and better understand movement behaviors from raw mobility data, this doctoral work provides a high-level modeling and computing methodology for semantically abstracting the rapidly increasing volume of mobility data. It brings top-down semantic modeling and bottom-up data computing together and establishes a new concept, "semantic trajectories", for representing and understanding mobility data. As its main contribution, the thesis provides a rich, holistic, heterogeneous and application-independent methodology for computing semantic trajectories at different levels. In detail, the methodology is composed of five main parts with dedicated contributions.
    Semantic Trajectory Modeling. By investigating the trajectory-modeling requirements for understanding mobility data, the thesis first designs a hybrid spatio-semantic trajectory model that represents mobility with rich data abstraction at different levels, i.e., from the low-level spatio-temporal trajectory to the intermediate-level structured trajectory, and finally to the high-level semantic trajectory. In addition, a semantics-based ontological framework has been designed and applied for querying and reasoning on trajectories.
    Offline Trajectory Computing. To exploit the hybrid model, the thesis designs a complementary, holistic trajectory-computing platform with dedicated algorithms for reconstructing trajectories at different levels. The platform preprocesses collected mobility data (i.e., raw movement tracks such as GPS feeds) through data cleaning and compression, identifies individual trajectories, and segments them into structurally meaningful trajectory episodes. It can therefore construct spatio-temporal trajectories and structured trajectories from the raw mobility data. This platform is initially designed as an offline solution that analyzes past trajectories in a batch procedure.
    Trajectory Semantic Annotation. To reach the final semantic level, the thesis additionally designs a semantic-annotation platform that enriches trajectories with third-party sources composed of geographic background information and application-domain knowledge, in order to infer more meaningful semantic trajectories. The annotation platform is application-independent: it can annotate various trajectories (e.g., mobility data of people, vehicles and animals) with heterogeneous sources of semantic knowledge (e.g., third-party data in any geometric shape, such as points, lines and regions) that support trajectory enrichment.
    Online Trajectory Computing. In addition to offline computing over past trajectories, the thesis also contributes to handling ongoing trajectories, i.e., real-time trajectory computing over movement-data streams. The online trajectory-computing platform provides real-time trajectory data cleaning, compression and segmentation over streaming movement data, and additionally explores online tagging to achieve fully semantic-aware trajectories and to evaluate trajectory computing in a real-time setting.
    Mining Trajectories from Multi-Sensors. The preceding parts compute semantic trajectories from single-sensor data (i.e., GPS feeds), where most datasets come from moving objects with wearable GPS-embedded sensors (e.g., animal, vehicle and people tracking). In addition, the thesis explores mining people's trajectories from multi-sensor smartphone feeds (GPS, gyroscope, accelerometer, etc.). The results show that combining two sensors (GPS + accelerometer) can infer a complete life-cycle of semantic trajectories of people's daily behaviors, covering both outdoor movement via GPS and indoor activities via the accelerometer.
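    To make the notion of trajectory episodes concrete, the following is a minimal sketch of velocity-based stop/move segmentation over raw ⟨x, y, t⟩ points. The speed threshold, the haversine helper and the Episode structure are illustrative assumptions for this sketch, not the algorithms used in the thesis.

```python
# Minimal sketch of stop/move episode segmentation over raw (lat, lon, t) points.
# Thresholds and the velocity-based rule are illustrative assumptions only.
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt
from typing import List, Tuple

Point = Tuple[float, float, float]  # (lat, lon, unix_time)

@dataclass
class Episode:
    kind: str           # "stop" or "move"
    points: List[Point]

def haversine_m(p: Point, q: Point) -> float:
    """Great-circle distance in metres between two (lat, lon, t) points."""
    lat1, lon1, lat2, lon2 = map(radians, (p[0], p[1], q[0], q[1]))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def segment(points: List[Point], speed_thresh_mps: float = 1.0) -> List[Episode]:
    """Split a spatio-temporal trajectory into stop/move episodes by point-to-point speed."""
    episodes: List[Episode] = []
    for prev, curr in zip(points, points[1:]):
        dt = max(curr[2] - prev[2], 1e-6)          # guard against zero time gaps
        kind = "stop" if haversine_m(prev, curr) / dt < speed_thresh_mps else "move"
        if episodes and episodes[-1].kind == kind:
            episodes[-1].points.append(curr)       # extend the current episode
        else:
            episodes.append(Episode(kind, [prev, curr]))
    return episodes
```

    In a fuller pipeline, "stop" episodes would then be matched against background geographic data (points, lines, regions) to produce the semantic annotations described above.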

    Urban Informatics

    Get PDF
    This open access book is the first to systematically introduce the principles of urban informatics and its application to every aspect of the city that involves its functioning, control, management, and future planning. It introduces new models and tools being developed to understand and implement these technologies that enable cities to function more efficiently – to become ‘smart’ and ‘sustainable’. The smart city has quickly emerged as computers have become ever smaller to the point where they can be embedded into the very fabric of the city, as well as being central to new ways in which the population can communicate and act. When cities are wired in this way, they have the potential to become sentient and responsive, generating massive streams of ‘big’ data in real time as well as providing immense opportunities for extracting new forms of urban data through crowdsourcing. This book offers a comprehensive review of the methods that form the core of urban informatics, from various kinds of urban remote sensing to new approaches to machine learning and statistical modelling. It provides a detailed technical introduction to the wide array of tools information scientists need to develop the key urban analytics that are fundamental to learning about the smart city, and it outlines ways in which these tools can be used to inform design and policy so that cities can become more efficient, with a greater concern for the environment and equity.

    A review of technical factors to consider when designing neural networks for semantic segmentation of Earth Observation imagery

    Full text link
    Semantic segmentation (classification) of Earth Observation imagery is a crucial task in remote sensing. This paper presents a comprehensive review of technical factors to consider when designing neural networks for this purpose. The review focuses on Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), Generative Adversarial Networks (GANs), and transformer models, discussing prominent design patterns for these ANN families and their implications for semantic segmentation. Common pre-processing techniques for ensuring optimal data preparation are also covered. These include methods for image normalization and chipping, strategies for addressing data imbalance in training samples, and techniques for overcoming limited data, including data augmentation, transfer learning, and domain adaptation. By encompassing both the technical aspects of neural network design and the data-related considerations, this review provides researchers and practitioners with a comprehensive and up-to-date understanding of the factors involved in designing effective neural networks for semantic segmentation of Earth Observation imagery. Comment: 145 pages with 32 figures.
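    As an illustration of the pre-processing steps the review mentions, the sketch below min-max normalizes each band of a multispectral scene and chips it into fixed-size tiles. The chip size, stride, and normalization scheme are assumptions chosen for the example, not recommendations from the paper.

```python
# Illustrative sketch: per-band normalization and chipping of an EO scene.
import numpy as np

def normalize_bands(scene: np.ndarray) -> np.ndarray:
    """Min-max scale each band of a (bands, H, W) scene to [0, 1]."""
    mins = scene.min(axis=(1, 2), keepdims=True)
    maxs = scene.max(axis=(1, 2), keepdims=True)
    return (scene - mins) / np.maximum(maxs - mins, 1e-8)

def chip(scene: np.ndarray, size: int = 256, stride: int = 256):
    """Yield (bands, size, size) chips in row-major order, dropping ragged edges."""
    _, h, w = scene.shape
    for top in range(0, h - size + 1, stride):
        for left in range(0, w - size + 1, stride):
            yield scene[:, top:top + size, left:left + size]

# Example: a synthetic 4-band scene split into 256x256 training chips.
scene = np.random.rand(4, 1024, 1024).astype(np.float32)
chips = list(chip(normalize_bands(scene)))
```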

    An Overview on the Generation and Detection of Synthetic and Manipulated Satellite Images

    Get PDF
    Due to the reduction of technological costs and the increase in satellite launches, satellite images are becoming more popular and easier to obtain. Besides serving benevolent purposes, satellite data can also be used for malicious ends such as misinformation. In fact, satellite images can easily be manipulated with general image-editing tools. Moreover, with the surge of Deep Neural Networks (DNNs) that can generate realistic synthetic imagery in various domains, additional threats related to the diffusion of synthetically generated satellite images are emerging. In this paper, we review the State of the Art (SOTA) on the generation and manipulation of satellite images. In particular, we focus both on the generation of synthetic satellite imagery from scratch and on the semantic manipulation of satellite images by means of image-transfer technologies, including the transformation of images obtained from one type of sensor to another. We also describe the forensic detection techniques researched so far to classify and detect synthetic image forgeries. While we focus mostly on forensic techniques explicitly tailored to the detection of AI-generated synthetic content, we also review some methods designed for general splicing detection, which can in principle also be used to spot AI-manipulated images. Comment: 25 pages, 17 figures, 5 tables, APSIPA 202
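    For context on what a learned forensic detector looks like in practice, the sketch below is a minimal, generic real-vs-synthetic image classifier in PyTorch. It is not one of the detectors surveyed in the paper; the architecture, layer sizes, and input resolution are assumptions made only for illustration.

```python
# Minimal, generic binary classifier sketch (0 = pristine, 1 = synthetic/manipulated).
# NOT a detector from the surveyed literature; sizes are illustrative assumptions.
import torch
import torch.nn as nn

class TinyForensicCNN(nn.Module):
    """Small CNN that maps an image tile to two logits: pristine vs. synthetic."""
    def __init__(self, in_channels: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),   # global pooling to a 32-dim descriptor
        )
        self.classifier = nn.Linear(32, 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

# Example forward pass on a batch of 256x256 RGB tiles.
logits = TinyForensicCNN()(torch.randn(8, 3, 256, 256))
```

    Such a classifier would typically be trained on paired pristine and generated/manipulated tiles; the surveyed methods differ mainly in the features and architectures used for this decision.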

    High-Performance Modelling and Simulation for Big Data Applications

    Get PDF
    This open access book was prepared as a Final Publication of the COST Action IC1406 “High-Performance Modelling and Simulation for Big Data Applications (cHiPSet)” project. Long considered important pillars of the scientific method, Modelling and Simulation have evolved from traditional discrete numerical methods to complex data-intensive continuous analytical optimisations. Resolution, scale, and accuracy have become essential to predict and analyse natural and complex systems in science and engineering. As their level of abstraction rises to give a better discernment of the domain at hand, their representations become increasingly demanding in computational and data resources. On the other hand, High Performance Computing typically entails the effective use of parallel and distributed processing units coupled with efficient storage, communication and visualisation systems to underpin complex data-intensive applications in distinct scientific and technical domains. A seamless interaction of High Performance Computing with Modelling and Simulation is therefore arguably required in order to store, compute, analyse, and visualise large data sets in science and engineering. Funded by the European Commission, cHiPSet has provided a dynamic trans-European forum for its members and distinguished guests to openly discuss novel perspectives and topics of interest for these two communities. This cHiPSet compendium presents a set of selected case studies related to healthcare, biological data, computational advertising, multimedia, finance, bioinformatics, and telecommunications.

    Resilience in Soils and Land Use

    Get PDF
    Studies on land use in territorial planning are currently of interest; their traditional purpose has been to analyze the suitability of each type of land for a specific use, based on its capacity to absorb impacts and on its potential. The analysis of erosion risk is one parameter to take into account in such planning. Given the enormous social interest in monitoring and controlling the environment, the scientific community is developing methodologies that make such control more efficient. One of the environmental factors to consider is the soil, which constitutes the support for life and is one of the basic natural elements, as is evident in the European Soil Charter of the Council of Europe, which states in its first point: “The soil is one of the most precious goods of Humanity. It allows the life of plants, animals and man on the surface of the Earth”. This European charter also highlights the scarcity and fragility of the soil resource, indicating that it must be protected through a greater effort in scientific research and interdisciplinary collaboration to ensure the rational use and conservation of soil.