
    Geographic Gossip: Efficient Averaging for Sensor Networks

    Gossip algorithms for distributed computation are attractive due to their simplicity, distributed nature, and robustness in noisy and uncertain environments. However, using standard gossip algorithms can lead to a significant waste of energy by repeatedly recirculating redundant information. For realistic sensor network topologies such as grids and random geometric graphs, the inefficiency of gossip schemes is related to the slow mixing times of random walks on the communication graph. We propose and analyze an alternative gossiping scheme that exploits geographic information. By utilizing geographic routing combined with a simple resampling method, we demonstrate substantial gains over previously proposed gossip protocols. For regular graphs such as the ring or grid, our algorithm improves standard gossip by factors of $n$ and $\sqrt{n}$ respectively. For the more challenging case of random geometric graphs, our algorithm computes the true average to accuracy $\epsilon$ using $O(\frac{n^{1.5}}{\sqrt{\log n}} \log \epsilon^{-1})$ radio transmissions, which yields a $\sqrt{\frac{n}{\log n}}$ factor improvement over standard gossip algorithms. We illustrate these theoretical results with experimental comparisons between our algorithm and standard methods as applied to various classes of random fields. Comment: To appear, IEEE Transactions on Signal Processing.
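
    To make the flavor of such a scheme concrete, here is a minimal sketch of gossip averaging with greedy geographic routing toward a uniformly resampled target location. It is an illustration only, not the authors' algorithm or analysis; the node placement, connection radius, routing rule, and number of rounds are all assumptions.

```python
"""Illustrative sketch of geographic gossip averaging (not the paper's code).

Assumptions for the example: nodes are placed uniformly in the unit square,
connected when closer than a radius r, and a greedy geographic routing rule
is used to reach the node nearest a uniformly resampled target point.
"""
import numpy as np

rng = np.random.default_rng(0)
n, r = 200, 0.15                      # number of nodes, connection radius (assumed)
pos = rng.random((n, 2))              # node coordinates in the unit square
vals = rng.random(n)                  # initial measurements to be averaged
true_mean = vals.mean()

# adjacency of the random geometric graph
dist = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=2)
adj = (dist < r) & ~np.eye(n, dtype=bool)

def greedy_route(src, target):
    """Greedily forward toward `target`, returning the node where routing stops."""
    cur = src
    while True:
        nbrs = np.flatnonzero(adj[cur])
        if nbrs.size == 0:
            return cur
        best = nbrs[np.argmin(np.linalg.norm(pos[nbrs] - target, axis=1))]
        # stop if no neighbor is closer to the target than the current node
        if np.linalg.norm(pos[best] - target) >= np.linalg.norm(pos[cur] - target):
            return cur
        cur = best

for _ in range(5000):                 # gossip rounds
    src = rng.integers(n)
    target = rng.random(2)            # resampled geographic target
    dst = greedy_route(src, target)
    vals[src] = vals[dst] = 0.5 * (vals[src] + vals[dst])  # pairwise average preserves the sum

print("max deviation from true mean:", np.abs(vals - true_mean).max())
```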

    Hoodsquare: Modeling and Recommending Neighborhoods in Location-based Social Networks

    Information garnered from activity on location-based social networks can be harnessed to characterize urban spaces and organize them into neighborhoods. In this work, we adopt a data-driven approach to the identification and modeling of urban neighborhoods using location-based social networks. We represent geographic points in the city using spatio-temporal information about Foursquare user check-ins and semantic information about places, with the goal of developing features to input into a novel neighborhood detection algorithm. The algorithm first employs a similarity metric that assesses the homogeneity of a geographic area, and then with a simple mechanism of geographic navigation, it detects the boundaries of a city's neighborhoods. The models and algorithms devised are subsequently integrated into a publicly available, map-based tool named Hoodsquare that allows users to explore activities and neighborhoods in cities around the world. Finally, we evaluate Hoodsquare in the context of a recommendation application where user profiles are matched to urban neighborhoods. By comparing with a number of baselines, we demonstrate how Hoodsquare can be used to accurately predict the home neighborhood of Twitter users. We also show that we are able to suggest neighborhoods geographically constrained in size, a desirable property in mobile recommendation scenarios for which geographical precision is key. Comment: ASE/IEEE SocialCom 2013.
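
    The detection algorithm is only summarized above, so the following is a deliberately loose illustration of the general idea of growing homogeneous grid cells into contiguous neighborhoods. The grid representation, the dominant-category labeling, and the flood-fill expansion are assumptions for the example, not the Hoodsquare method.

```python
"""Loose illustration of grid-based neighborhood detection (not the Hoodsquare algorithm).

Assumed setup: check-ins are binned into grid cells, each cell is labeled with its
dominant venue category, and contiguous cells sharing a label are merged by flood fill.
"""
from collections import Counter, deque

def dominant_label(checkins_in_cell):
    """Most frequent venue category in a cell, or None for an empty cell."""
    return Counter(checkins_in_cell).most_common(1)[0][0] if checkins_in_cell else None

def detect_neighborhoods(cells):
    """`cells` maps (row, col) -> list of venue categories; returns (label, cell set) pairs."""
    labels = {c: dominant_label(v) for c, v in cells.items()}
    seen, hoods = set(), []
    for start, lab in labels.items():
        if lab is None or start in seen:
            continue
        region, queue = set(), deque([start])
        seen.add(start)
        while queue:                       # flood fill over 4-connected cells
            r, c = queue.popleft()
            region.add((r, c))
            for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if nb not in seen and labels.get(nb) == lab:
                    seen.add(nb)
                    queue.append(nb)
        hoods.append((lab, region))
    return hoods

# toy example: a 2x3 grid with two homogeneous patches
cells = {(0, 0): ["food", "food"], (0, 1): ["food"], (0, 2): ["nightlife"],
         (1, 0): ["food"], (1, 1): ["nightlife"], (1, 2): ["nightlife"]}
print(detect_neighborhoods(cells))
```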

    Hot Routes: Developing a New Technique for the Spatial Analysis of Crime

    The use of hotspot mapping techniques such as Kernel Density Estimation (KDE) to represent the geographical spread of linear events can be problematic. Network-constrained data (for example transport-related crime) require a different approach to visualize concentration. We propose a methodology called Hot Routes, which measures the risk distribution of crime along a linear network by calculating the rate of crimes per section of road. This method has been designed for everyday crime analysts and requires only a Geographical Information System (GIS) and suitable data. A demonstration is provided using crime data collected from London bus routes.
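
    A minimal sketch of the core Hot Routes computation, the rate of crimes per section of road, is given below. The snapping of each crime to a road segment, the segment lengths, and the crimes-per-kilometre normalization are assumptions for the example rather than the authors' GIS workflow.

```python
"""Minimal sketch of a Hot Routes-style rate calculation (illustrative only).

Assumptions: each crime has already been snapped to a road segment ID, and the
rate is expressed as crimes per kilometre of segment length.
"""
from collections import Counter

def hot_routes(segment_lengths_m, crime_segment_ids):
    """Return crimes-per-km for each road segment.

    segment_lengths_m: dict segment_id -> length in metres
    crime_segment_ids: iterable of segment_ids, one per recorded crime
    """
    counts = Counter(crime_segment_ids)
    return {seg: counts.get(seg, 0) / (length / 1000.0)
            for seg, length in segment_lengths_m.items() if length > 0}

# toy example: three bus-route segments with differing crime counts
segments = {"A": 250.0, "B": 600.0, "C": 400.0}           # metres
crimes = ["A", "A", "B", "C", "C", "C"]                   # one entry per crime
print(hot_routes(segments, crimes))   # segment A -> 8.0 crimes per km
```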

    Evaluating the Differences of Gridding Techniques for Digital Elevation Models Generation and Their Influence on the Modeling of Stony Debris Flows Routing: A Case Study From Rovina di Cancia Basin (North-Eastern Italian Alps)

    Debris flows are among the most hazardous phenomena in mountain areas. To cope with debris flow hazard, it is common to delineate the risk-prone areas through routing models. The most important input to debris flow routing models is topographic data, usually in the form of Digital Elevation Models (DEMs). The quality of DEMs depends on the accuracy, density, and spatial distribution of the sampled points; on the characteristics of the surface; and on the applied gridding methodology. Therefore, the choice of the interpolation method affects the realistic representation of the channel and fan morphology, and thus potentially the debris flow routing modeling outcomes. In this paper, we initially investigate the performance of common interpolation methods (i.e., linear triangulation, natural neighbor, nearest neighbor, Inverse Distance to a Power, ANUDEM, Radial Basis Functions, and ordinary kriging) in building DEMs with the complex topography of a debris flow channel located in the Venetian Dolomites (North-eastern Italian Alps), by using small footprint full-waveform Light Detection And Ranging (LiDAR) data. The investigation is carried out through a combination of statistical analysis of vertical accuracy, algorithm robustness, and spatial clustering of vertical errors, and multi-criteria shape reliability assessment. After that, we examine the influence of the tested interpolation algorithms on the performance of a Geographic Information System (GIS)-based cell model for simulating stony debris flow routing. In detail, we investigate both the correlation between the DEM height uncertainty resulting from the gridding procedure and the uncertainty in the corresponding simulated erosion/deposition depths, and the effect of the interpolation algorithms on simulated areas, erosion and deposition volumes, solid-liquid discharges, and channel morphology after the event. The comparison among the tested interpolation methods highlights that the ANUDEM and ordinary kriging algorithms are not suitable for building DEMs with complex topography. Conversely, linear triangulation, the natural neighbor algorithm, and the thin-plate spline plus tension and completely regularized spline functions ensure the best trade-off between accuracy and shape reliability. Nevertheless, the evaluation of the effects of gridding techniques on debris flow routing modeling reveals that the choice of the interpolation algorithm does not significantly affect the model outcomes.
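
    As a pointer to what one of the simpler gridding methods under comparison does, the sketch below applies Inverse Distance to a Power (IDW) interpolation to scattered elevation points on a regular grid. The power parameter, grid resolution, and toy data are assumptions; the study's other interpolators (triangulation, kriging, ANUDEM, splines) are not reproduced.

```python
"""Hedged sketch of Inverse Distance to a Power (IDW) gridding, one of the methods
compared in the study (illustrative only; power, resolution, and data are assumed)."""
import numpy as np

def idw_grid(xyz, xi, yi, power=2.0, eps=1e-12):
    """Interpolate scattered (x, y, z) points onto the grid defined by xi, yi."""
    x, y, z = xyz[:, 0], xyz[:, 1], xyz[:, 2]
    gx, gy = np.meshgrid(xi, yi)
    # distances from every grid node to every sample point
    d = np.sqrt((gx[..., None] - x) ** 2 + (gy[..., None] - y) ** 2)
    w = 1.0 / (d ** power + eps)          # inverse-distance weights
    return (w * z).sum(axis=-1) / w.sum(axis=-1)

# toy elevation samples (x, y, elevation in metres) and a coarse target grid
rng = np.random.default_rng(1)
pts = np.column_stack([rng.random(50) * 100,
                       rng.random(50) * 100,
                       rng.random(50) * 20 + 1500])
dem = idw_grid(pts, xi=np.linspace(0, 100, 21), yi=np.linspace(0, 100, 21))
print(dem.shape, dem.min(), dem.max())   # 21x21 DEM bounded by the sample elevations
```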

    A Survey of Location Prediction on Twitter

    Locations, e.g., countries, states, cities, and points of interest, are central to news, emergency events, and people's daily lives. Automatic identification of locations associated with or mentioned in documents has been explored for decades. As one of the most popular online social network platforms, Twitter has attracted a large number of users who send millions of tweets on a daily basis. Due to the world-wide coverage of its users and the real-time freshness of tweets, location prediction on Twitter has gained significant attention in recent years. Research efforts have been devoted to dealing with the new challenges and opportunities brought by the noisy, short, and context-rich nature of tweets. In this survey, we aim to offer an overall picture of location prediction on Twitter. Specifically, we concentrate on the prediction of user home locations, tweet locations, and mentioned locations. We first define the three tasks and review the evaluation metrics. By summarizing the Twitter network, tweet content, and tweet context as potential inputs, we then structurally highlight how the problems depend on these inputs. Each dependency is illustrated by a comprehensive review of the corresponding strategies adopted in state-of-the-art approaches. In addition, we also briefly review two related problems, i.e., semantic location prediction and point-of-interest recommendation. Finally, we list future research directions. Comment: Accepted to TKDE. 30 pages, 1 figure.

    Evaluating a Self-Organizing Map for Clustering and Visualizing Optimum Currency Area Criteria

    Optimum currency area (OCA) theory attempts to define the geographical region in which it would maximize economic efficiency to have a single currency. In this paper, the focus is on prospective and current members of the Economic and Monetary Union. For this task, a self-organizing neural network, the self-organizing map (SOM), is combined with hierarchical clustering in a two-level approach to clustering and visualizing OCA criteria. The output of the SOM is a topology-preserving two-dimensional grid. The final models are evaluated based on both clustering tendencies and accuracy measures. Thereafter, the two-dimensional grid of the chosen model is used for visual assessment of the OCA criteria, while its clustering results are projected onto a geographic map. Keywords: self-organizing maps, optimum currency area, projection, clustering, geospatial visualization.
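
    To make the two-level approach concrete, the sketch below trains a small hand-rolled SOM and then applies hierarchical clustering to its prototype vectors. The toy data, grid size, training schedule, and Ward linkage are assumptions for the illustration, not the paper's configuration.

```python
"""Minimal sketch of the two-level SOM + hierarchical clustering approach
(illustrative; toy data, grid size, and training schedule are assumed)."""
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
X = rng.random((300, 4))                      # toy "OCA criteria" matrix: countries x indicators

# --- level 1: train a small self-organizing map -------------------------------
gx, gy, dim, iters = 6, 6, X.shape[1], 3000
W = rng.random((gx, gy, dim))                 # prototype vectors on a 6x6 grid
grid = np.stack(np.meshgrid(np.arange(gx), np.arange(gy), indexing="ij"), axis=-1)

for t in range(iters):
    lr = 0.5 * (1 - t / iters)                # decaying learning rate
    sigma = 3.0 * (1 - t / iters) + 0.5       # decaying neighborhood radius
    x = X[rng.integers(len(X))]
    bmu = np.unravel_index(np.argmin(((W - x) ** 2).sum(-1)), (gx, gy))
    h = np.exp(-((grid - np.array(bmu)) ** 2).sum(-1) / (2 * sigma ** 2))
    W += lr * h[..., None] * (x - W)          # pull the BMU neighborhood toward the sample

# --- level 2: hierarchically cluster the SOM prototypes -----------------------
protos = W.reshape(-1, dim)
clusters = fcluster(linkage(protos, method="ward"), t=4, criterion="maxclust")
print(clusters.reshape(gx, gy))               # cluster label for each map unit
```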

    Using tracked mobile sensors to make maps of environmental effects

    We present the results of a study of environmental carbon monoxide (CO) pollution that uses a set of tracked, mobile pollution sensors. The motivating concept is that we will be able to map pollution and other properties of the real world at a fine scale if we can deploy a large set of sensors with members of the general public, who would carry them as they go about their normal everyday activities. To prove the viability of this concept we have to demonstrate that data gathered in an ad-hoc manner is reliable enough to allow us to build interesting geo-temporal maps. We present a trial using a small number of global positioning system (GPS)-tracked CO sensors. From analysis of raw GPS logs we find some well-known spatial and temporal properties of CO. Further, by processing the GPS logs we can find fine-grained variations in pollution readings, such as when crossing roads. We then discuss the space of possibilities that may be enabled by tracking sensors around the urban environment – both in capturing personal experience of properties of the environment and in making summative maps to predict future conditions. Although we present a study of CO, the techniques will be applicable to other environmental properties such as radio signal strength, noise, weather and so on.
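
    As a small illustration of how tracked-sensor logs can be turned into a summative map, the sketch below bins GPS-tagged CO readings into a regular spatial grid and averages them per cell. The cell size, coordinate handling, and record format are assumptions, not the project's actual processing pipeline.

```python
"""Illustrative sketch of turning GPS-tagged CO readings into a gridded map
(cell size, coordinate handling, and record format are assumptions)."""
import numpy as np

def grid_average(lats, lons, ppm, cell_deg=0.001):
    """Average CO readings (ppm) into lat/lon cells of size `cell_deg` degrees."""
    lats, lons, ppm = map(np.asarray, (lats, lons, ppm))
    rows = np.floor((lats - lats.min()) / cell_deg).astype(int)
    cols = np.floor((lons - lons.min()) / cell_deg).astype(int)
    grid_sum = np.zeros((rows.max() + 1, cols.max() + 1))
    grid_cnt = np.zeros_like(grid_sum)
    np.add.at(grid_sum, (rows, cols), ppm)    # accumulate readings per cell
    np.add.at(grid_cnt, (rows, cols), 1)
    with np.errstate(invalid="ignore", divide="ignore"):
        return grid_sum / grid_cnt            # NaN where a cell has no readings

# toy log: latitude, longitude, CO concentration in ppm
lat = [51.5000, 51.5004, 51.5011, 51.5012]
lon = [-0.1200, -0.1199, -0.1187, -0.1186]
co  = [0.8, 1.1, 2.4, 2.0]
print(grid_average(lat, lon, co))
```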

    Wave modelling - the state of the art

    This paper is the product of the wave modelling community and it tries to paint a picture of the present situation in this branch of science, exploring previous and recent results and looking ahead towards the solution of the problems we presently face. Both theory and applications are considered. The many facets of the subject call for separate discussions, and this is reflected in the individual sections, seven of them, each dealing with a specific topic, the whole providing a broad and solid overview of the present state of the art. After an introduction framing the problem and the approach we followed, we deal in sequence with the following subjects: (Section) 2, generation by wind; 3, nonlinear interactions in deep water; 4, white-capping dissipation; 5, nonlinear interactions in shallow water; 6, dissipation at the sea bottom; 7, wave propagation; 8, numerics. The two final sections, 9 and 10, summarize the present situation from a general point of view and try to look at future developments.