A stigmergy-based analysis of city hotspots to discover trends and anomalies in urban transportation usage
A key aspect of a sustainable urban transportation system is the
effectiveness of transportation policies. To be effective, a policy has to
consider a broad range of elements, such as pollution emission, traffic flow,
and human mobility. Due to the complexity and variability of these elements in
the urban area, producing effective policies remains a very challenging task.
With the introduction of the smart city paradigm, a vast amount of data can be
generated in urban spaces. Such data can be a fundamental
source of knowledge to improve policies because they can reflect the
sustainability issues underlying the city. In this context, we propose an
approach to exploit urban positioning data based on stigmergy, a bio-inspired
mechanism providing scalar and temporal aggregation of samples. By employing
stigmergy, samples in proximity to each other are aggregated into a
functional structure called trail. The trail summarizes relevant dynamics in
data and allows matching them, providing a measure of their similarity.
Moreover, this mechanism can be specialized to unfold specific dynamics.
Specifically, we identify high-density urban areas (i.e., hotspots), analyze
their activity over time, and unfold anomalies. Furthermore, by matching activity
patterns, we provide a continuous measure of dissimilarity from the typical
activity pattern. This measure can be used by policy makers to
evaluate the effect of policies and change them dynamically. As a case study,
we analyze taxi trip data gathered in Manhattan from 2013 to 2015.
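The stigmergic aggregation described above can be sketched in a few lines: each positioning sample deposits a mark on a spatial grid, marks evaporate over time, and cells whose accumulated trail intensity exceeds a threshold are reported as hotspots. The grid resolution, deposit intensity, evaporation rate, and threshold below are illustrative assumptions, not the paper's calibrated values.

```python
from collections import defaultdict

DEPOSIT = 1.0      # intensity each sample adds to its cell (assumed value)
EVAPORATION = 0.9  # fraction of trail retained per time step (assumed value)

def update_trail(trail, samples):
    """Evaporate the trail, then deposit marks for the new samples.

    `trail` maps grid cells to intensity; `samples` are (lat, lon)
    positions observed in the current time step. Snapping coordinates
    to two decimals stands in for spatial aggregation of nearby samples.
    """
    for cell in trail:
        trail[cell] *= EVAPORATION
    for lat, lon in samples:
        trail[(round(lat, 2), round(lon, 2))] += DEPOSIT
    return trail

def hotspots(trail, threshold):
    """Cells whose aggregated trail intensity exceeds the threshold."""
    return {cell for cell, v in trail.items() if v >= threshold}

trail = defaultdict(float)
# Three time steps: dense activity at one location, sparse at another.
for _ in range(3):
    update_trail(trail, [(40.75, -73.99)] * 5 + [(40.70, -74.01)])
```

With these numbers the dense cell accumulates roughly 13.55 units after three steps while the sparse cell stays below 3, so a threshold cleanly separates them; comparing successive trails cell by cell would give the kind of similarity measure between activity patterns that the abstract describes.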
Complex Network Tools to Understand the Behavior of Criminality in Urban Areas
Complex networks are nowadays employed in several applications. Modeling
urban street networks is one of them, and in particular to analyze criminal
aspects of a city. Several research groups have focused on such application,
but until now there has been a lack of a well-defined methodology for employing
complex networks in the whole crime analysis process, i.e., from data preparation
to a deep analysis of criminal communities. Furthermore, the "toolset"
available for such work is incomplete, lacking techniques to
maintain up-to-date, complete crime datasets and proper assessment measures. In
this sense, we propose a threefold methodology for employing complex networks
in the detection of highly criminal areas within a city. Our methodology
comprises three tasks: (i) Mapping of Urban Crimes; (ii) Criminal Community
Identification; and (iii) Crime Analysis. Moreover, it provides a proper set of
assessment measures for analyzing intrinsic criminality of communities,
especially when considering different crime types. We demonstrate our methodology by
applying it to a real crime dataset from the city of San Francisco - CA, USA.
The results confirm its effectiveness in identifying and analyzing high-criminality
areas within a city. Hence, our contributions provide a basis for further
developments on complex networks applied to crime analysis.
Comment: 7 pages, 2 figures; 14th International Conference on Information Technology: New Generation
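As a minimal illustration of the three tasks, the sketch below maps toy crime records onto a proximity network (task i), extracts communities (task ii), and ranks them by crime count as a crude criminality measure (task iii). The coordinates, the link radius, and the use of connected components in place of a full community-detection algorithm (e.g., modularity optimisation) are simplifying assumptions, not the paper's method.

```python
from itertools import combinations

# Hypothetical crime records: (id, x, y). Coordinates and the 1.0
# link radius are illustrative, not values from the paper.
crimes = [(0, 0.0, 0.0), (1, 0.5, 0.1), (2, 0.4, 0.6),
          (3, 5.0, 5.0), (4, 5.3, 5.2)]
RADIUS = 1.0

# (i) Mapping of Urban Crimes: link crimes within RADIUS of each other.
edges = {(a[0], b[0]) for a, b in combinations(crimes, 2)
         if (a[1] - b[1]) ** 2 + (a[2] - b[2]) ** 2 <= RADIUS ** 2}

# (ii) Criminal Community Identification: connected components stand in
# here for the community detection a full complex-network toolkit uses.
def communities(nodes, edges):
    adj = {n: set() for n in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, comps = set(), []
    for n in nodes:
        if n in seen:
            continue
        stack, comp = [n], set()
        while stack:
            u = stack.pop()
            if u in comp:
                continue
            comp.add(u)
            stack.extend(adj[u] - comp)
        seen |= comp
        comps.append(comp)
    return comps

comps = communities([c[0] for c in crimes], edges)
# (iii) Crime Analysis: rank communities by crime count.
ranked = sorted(comps, key=len, reverse=True)
```

On this toy data the pipeline finds two communities, {0, 1, 2} and {3, 4}, and ranks the denser one first; a real assessment would weight communities by crime type as the methodology proposes.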
You can't see what you can't see: Experimental evidence for how much relevant information may be missed due to Google's Web search personalisation
The influence of Web search personalisation on professional knowledge work is
an understudied area. Here we investigate how public sector officials
self-assess their dependency on the Google Web search engine, whether they are
aware of the potential impact of algorithmic biases on their ability to
retrieve all relevant information, and how much relevant information may
actually be missed due to Web search personalisation. We find that the majority
of participants in our experimental study are neither aware that there is a
potential problem nor do they have a strategy to mitigate the risk of missing
relevant information when performing online searches. Most significantly, we
provide empirical evidence that up to 20% of relevant information may be missed
due to Web search personalisation. This work has significant implications for
Web research by public sector professionals, who should be provided with
training about the potential algorithmic biases that may affect their judgments
and decision making, as well as clear guidelines on how to minimise the risk of
missing relevant information.
Comment: paper submitted to the 11th Intl. Conf. on Social Informatics;
revision corrects an error in the interpretation of the parameter Psi/p in RBO resulting
from a discrepancy between the documentation of the implementation in R
(https://rdrr.io/bioc/gespeR/man/rbo.html) and the original definition
(https://dl.acm.org/citation.cfm?id=1852106), as per 20/05/201
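The comparison of personalised result lists referenced in the comment rests on rank-biased overlap (RBO), whose persistence parameter p weights agreement at shallow ranks more heavily, matching how users scan search results. The function below is a minimal truncated-prefix variant that sums the weighted overlap only over the evaluated depths (a lower bound on the full, extrapolated RBO); it is a sketch, not the gespeR implementation cited above.

```python
def rbo_prefix(ranking_a, ranking_b, p=0.9):
    """Truncated rank-biased overlap of two ranked lists.

    At each depth d it accumulates (1 - p) * p**(d - 1) * overlap / d,
    where overlap is the size of the intersection of the two length-d
    prefixes. Evaluating only the given depths makes this a lower
    bound on the full (infinite-list) RBO.
    """
    seen_a, seen_b, score = set(), set(), 0.0
    for depth, (a, b) in enumerate(zip(ranking_a, ranking_b), start=1):
        seen_a.add(a)
        seen_b.add(b)
        score += p ** (depth - 1) * len(seen_a & seen_b) / depth
    return (1 - p) * score

# Identical top-3 lists reach 1 - p**3; a swap at ranks 2-3 lowers the score.
same = rbo_prefix(["a", "b", "c"], ["a", "b", "c"])
swapped = rbo_prefix(["a", "b", "c"], ["a", "c", "b"])
```

With p = 0.9 the swapped list scores 0.226 against 0.271 for identical lists; a gap of this kind between a user's personalised ranking and a non-personalised baseline is the sort of continuous dissimilarity signal the study uses to quantify how much relevant information is missed.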