4,541 research outputs found

    New directions in the analysis of movement patterns in space and time


    A survey on Human Mobility and its applications

    Human mobility has attracted attention from fields as diverse as epidemic modeling, traffic engineering, traffic prediction, and urban planning. In this survey we review the major strands of human mobility research, from trajectory-based studies to studies using graph and network theory. Trajectory-based studies analyze statistical measures such as the jump-length distribution and the radius of gyration to investigate how people move in their daily lives, and whether these individual movements can be modeled and used for prediction. Graph-based mobility studies help investigate the dynamic behavior of the system, such as diffusion and flow in the network, and make it easier to estimate how much one part of the network influences another using metrics such as centrality measures. We aim to study population flow in transportation networks using mobility data, to derive models and patterns, and to develop new applications for predicting phenomena such as congestion. Human mobility studies based on the new generation of mobility data provided by cellular phone networks raise new challenges, such as data storage, data representation, data analysis, and computational complexity. A comparative review of the data types used in current tools and applications of human mobility studies leads us to new approaches for dealing with these challenges.
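    The two trajectory measures named in this abstract are straightforward to compute. Below is a minimal sketch, assuming a trajectory is simply a list of (x, y) positions in planar coordinates; the function names are illustrative, not from the survey.

    ```python
    import math

    def radius_of_gyration(points):
        """Radius of gyration of a trajectory: the RMS distance of the
        visited points from their center of mass."""
        n = len(points)
        cx = sum(x for x, _ in points) / n
        cy = sum(y for _, y in points) / n
        return math.sqrt(sum((x - cx) ** 2 + (y - cy) ** 2
                             for x, y in points) / n)

    def jump_lengths(points):
        """Distances between consecutive points -- the empirical
        'jump length' distribution analyzed in trajectory studies."""
        return [math.hypot(x2 - x1, y2 - y1)
                for (x1, y1), (x2, y2) in zip(points, points[1:])]
    ```

    Real mobility data uses latitude/longitude, so a production version would replace the Euclidean distance with a geodesic (e.g. haversine) distance.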

    Probing stellar winds and accretion physics in high-mass X-ray binaries and ultra-luminous X-ray sources with LOFT

    This is a White Paper in support of the mission concept of the Large Observatory for X-ray Timing (LOFT), proposed as a medium-sized ESA mission. We discuss the potential of LOFT for the study of high-mass X-ray binaries and ultra-luminous X-ray sources. For a summary, we refer to the paper. Comment: White Paper in Support of the Mission Concept of the Large Observatory for X-ray Timing. (v2: few typos corrected)

    Surveying human habit modeling and mining techniques in smart spaces

    A smart space is an environment, mainly equipped with Internet-of-Things (IoT) technologies, able to provide services to humans, helping them perform daily tasks by monitoring the space and autonomously executing actions, giving suggestions, and sending alarms. Approaches suggested in the literature differ in terms of required facilities, possible applications, the amount of human intervention required, and the ability to support multiple users at the same time while adapting to changing needs. In this paper, we propose a Systematic Literature Review (SLR) that classifies the most influential approaches in the area of smart spaces according to a set of dimensions identified by answering a set of research questions. These dimensions make it possible to choose a specific method or approach according to the available sensors, the amount of labeled data, the need for visual analysis, and requirements in terms of enactment and decision-making on the environment. Additionally, the paper identifies a set of challenges to be addressed by future research in the field.

    Life in the "Matrix": Human Mobility Patterns in the Cyber Space

    With the wide adoption of the multi-community setting in many popular social media platforms, the increasing user engagement across multiple online communities warrants research attention. In this paper, we introduce a novel analogy between movements in the cyber space and the physical space. This analogy implies a new way of studying human online activities, by modelling the activities across online communities in a similar fashion to movements among locations. First, we quantitatively validate the analogy by comparing several important properties of human online activities and physical movements. Our experiments reveal striking similarities between the cyber space and the physical space. Next, inspired by the established methodology on human mobility in the physical space, we propose a framework to study human "mobility" across online platforms. We discover three interesting patterns of user engagement in online communities. Furthermore, our experiments indicate that people with different mobility patterns also exhibit divergent preferences for online communities. This work not only attempts to achieve a better understanding of human online activities, but also intends to open a promising research direction with rich implications and applications. Comment: To appear at The International AAAI Conference on Web and Social Media (ICWSM) 201

    Knowledge discovery from trajectories

    Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies. As a newly proliferating study area, knowledge discovery from trajectories has attracted more and more researchers from different backgrounds. However, there is, until now, no theoretical framework giving researchers a systematic view of the research going on. The complexity of spatial and temporal information, along with their combination, produces numerous spatio-temporal patterns. In addition, it is very probable that a pattern has different definitions and mining methodologies for researchers from different backgrounds, such as Geographic Information Science, Data Mining, Databases, and Computational Geometry. How can these patterns be defined systematically, so that the whole community can make better use of previous research? This paper tackles that challenge in three steps. First, the input trajectory data is classified; second, a taxonomy of spatio-temporal patterns is developed from a data-mining point of view; lastly, the spatio-temporal patterns appearing in previous publications are discussed and placed into the theoretical framework. In this way, researchers can easily find the methodology needed to mine a specific pattern within this framework, and the algorithms that still need to be developed can be identified for further research. Under the guidance of this framework, an application to a real data set from the Starkey Project is performed. Two questions are answered by applying data mining algorithms: first, where the elk prefer to stay within their whole range, and second, whether there are corridors among these regions of interest.
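    The "where do the elk stay" question is typically answered with stay-point (stay-region) detection. Below is a minimal sketch of one common formulation, not necessarily the dissertation's method: a stay point is a run of samples that remain within a distance threshold of an anchor point for at least a minimum duration. The function name and thresholds are illustrative.

    ```python
    import math

    def stay_points(traj, dist_thresh, time_thresh):
        """Detect stay points in a trajectory given as (t, x, y) tuples
        sorted by time. A stay point is reported as the centroid of a
        run of samples that stay within dist_thresh of the run's first
        sample for at least time_thresh time units."""
        stays = []
        i = 0
        while i < len(traj):
            j = i
            # extend the run while successive points stay near the anchor
            while (j + 1 < len(traj)
                   and math.hypot(traj[j + 1][1] - traj[i][1],
                                  traj[j + 1][2] - traj[i][2]) <= dist_thresh):
                j += 1
            if traj[j][0] - traj[i][0] >= time_thresh:
                xs = [p[1] for p in traj[i:j + 1]]
                ys = [p[2] for p in traj[i:j + 1]]
                stays.append((sum(xs) / len(xs), sum(ys) / len(ys)))
                i = j + 1  # skip past the detected stay
            else:
                i += 1
        return stays
    ```

    Corridors between regions of interest can then be mined from the transitions between consecutive stay points.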

    D3.2 Cost Concept Model and Gateway Specification

    This document introduces a Framework supporting the implementation of a cost concept model against which current and future cost models for curating digital assets can be benchmarked. The value built into this cost concept model leverages the comprehensive engagement by the 4C project with various user communities and builds upon our understanding of the requirements, drivers, obstacles and objectives that various stakeholder groups have relating to digital curation. Ultimately, this concept model should provide a critical input to the development and refinement of cost models as well as helping to ensure that the curation and preservation solutions and services that will inevitably arise from the commercial sector as ‘supply’ respond to a much better understood ‘demand’ for cost-effective and relevant tools. To meet acknowledged gaps in current provision, a nested model of curation which addresses both costs and benefits is provided. The goal of this task was not to create a single, functionally implementable cost modelling application, but rather to design a model based on common concepts and to develop a generic gateway specification that can be used by future model developers, service and solution providers, and by researchers in follow-up research and development projects.

    The Framework includes:

    • A Cost Concept Model, which defines the core concepts that should be included in curation cost models;
    • An Implementation Guide for the cost concept model, which provides guidance and proposes questions that should be considered when developing new cost models and refining existing cost models;
    • A Gateway Specification Template, which provides standard metadata for each of the core cost concepts and is intended for use by future model developers, model users, and service and solution providers to promote interoperability;
    • A Nested Model for Digital Curation, which visualises the core concepts, demonstrates how they interact, and places them into context visually by linking them to A Cost and Benefit Model for Curation.

    This Framework provides guidance for data collection and associated calculations in an operational context, but will also provide a critical foundation for more strategic thinking around curation, such as the Economic Sustainability Reference Model (ESRM). Where appropriate, definitions of terms are provided, recommendations are made, and examples from existing models are used to illustrate the principles of the framework.

    Tensor Based Monitoring of Large-Scale Network Traffic

    Network monitoring systems are important for network operators to easily analyze behavioral trends in flow data. As networks become larger and more complex, the data grows in both size and number of variables. This increase in dimensionality lends itself to tensor-based analysis of network data, as tensors are arbitrarily sized multi-dimensional objects. Tensor-based network monitoring methods have been explored in recent years through work at Carnegie Mellon University via their algorithm DenseAlert. DenseAlert identifies anomalous events in tensors through quick detection of dense sub-tensors in positive-valued tensors. However, in our experiments, DenseAlert fails on larger datasets. Drawing from DenseAlert, we developed an algorithm called RED Alert that uses recursive filtering and expansion to handle anomaly detection in large tensors of positive- and negative-valued data. It does this using network parameters structured in a hierarchical fashion: network traffic is first modeled at low-granularity data (e.g. host country), and events detected as anomalous at lower granularity are tracked down to higher-granularity data (e.g. host IP). The tensors are built on the fly from streaming data, filtering the data to consider only the parameters deemed anomalous at previous granularity levels. RED Alert is showcased on two network monitoring examples, packet loss detection and botnet detection, comparing results to DenseAlert. In both cases, RED Alert was able to detect suspicious events and identify the root cause of the behavior down to a single IP. RED Alert was developed as part of a larger project, InSight2, which provides several network monitoring dashboards to aid network operators. This required the additional development of a tensor library that works in the context of InSight2, as well as a dashboard that can run the algorithm and display the results in meaningful ways.
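    The coarse-to-fine filtering idea can be illustrated with a toy sketch. This is not the RED Alert implementation; the tensor layout, function names, and the simple mass-versus-mean density test are all assumptions for illustration. Flows are aggregated into a sparse 3-mode tensor, and index values along one mode whose total mass is far above average are flagged for re-examination at the next, finer granularity (e.g. country, then IP).

    ```python
    from collections import defaultdict

    def build_tensor(flows):
        """Aggregate flow records (src, dst, hour, value) into a sparse
        3-mode tensor stored as a dict: (src, dst, hour) -> total value."""
        tensor = defaultdict(float)
        for src, dst, hour, value in flows:
            tensor[(src, dst, hour)] += value
        return tensor

    def anomalous_slices(tensor, mode, factor=3.0):
        """Flag index values along one mode whose total mass exceeds
        `factor` times the mean slice mass -- a coarse density filter.
        Flagged values would then be re-examined at a finer granularity
        level, mimicking the hierarchical drill-down described above."""
        mass = defaultdict(float)
        for key, value in tensor.items():
            mass[key[mode]] += abs(value)  # abs() handles negative-valued data
        mean = sum(mass.values()) / len(mass)
        return [idx for idx, m in mass.items() if m > factor * mean]
    ```

    In a streaming setting, the filter output for one granularity level determines which flows are even admitted into the tensor built at the next level, keeping each tensor small.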