    Program your city: Designing an urban integrated open data API

    Cities accumulate and distribute vast sets of digital information. Many decision-making and planning processes in councils, local governments and organisations are based on both real-time and historical data. Until recently, only a small, carefully selected subset of this information had been released to the public, usually for specific purposes (e.g. train timetables or the publication of planning applications through websites, to name just a few). This situation is, however, changing rapidly. Regulatory frameworks, such as freedom of information legislation in the US, the UK, the European Union and many other countries, guarantee public access to data held by the state. One result of this legislation and of changing attitudes towards open data has been the widespread release of public information as part of recent Government 2.0 initiatives. This includes the creation of public data catalogues such as data.gov (U.S.), data.gov.uk (U.K.) and data.gov.au (Australia) at the federal level, and datasf.org (San Francisco) and data.london.gov.uk (London) at the municipal level. The release of this data has opened up the possibility of a wide range of future applications and services, which are now the subject of intensified research efforts. Previous research endeavours have explored the creation of specialised tools to aid decision-making by urban citizens, councils and other stakeholders (Calabrese, Kloeckl & Ratti, 2008; Paulos, Honicky & Hooker, 2009). While these initiatives represent an important step towards open data, they too often result in mere collections of data repositories. Proprietary database formats and the lack of an open application programming interface (API) limit the potential that could be realised by allowing these data sets to be cross-queried. Our research, presented in this paper, looks beyond the pure release of data. It is concerned with three essential questions: First, how can data from different sources be integrated into a consistent framework and made accessible? Second, how can ordinary citizens be supported in easily composing data from different sources in order to address their specific problems? Third, what interfaces make it easy for citizens to interact with data in an urban environment? How can data be accessed and collected…
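    The cross-querying the abstract argues for is easiest to picture with a small sketch. The snippet below is a hypothetical illustration rather than the paper's actual API: the base URL, dataset names, query parameters and record fields are all assumptions, standing in for any city data catalogue that exposes its datasets through one consistent, open REST interface.

```python
import requests

# Hypothetical base URL; the paper does not specify an actual endpoint layout.
BASE_URL = "https://api.example-city.gov/v1"

def fetch_dataset(name, **filters):
    """Fetch records from one dataset of a (hypothetical) integrated city data API."""
    response = requests.get(f"{BASE_URL}/datasets/{name}/records",
                            params=filters, timeout=10)
    response.raise_for_status()
    return response.json()["records"]

# Cross-query example: combine two open datasets through the same interface.
# Dataset names and field names below are illustrative assumptions.
departures = fetch_dataset("train-timetables", station="Central", after="2024-05-01T08:00")
applications = fetch_dataset("planning-applications", suburb="Central", status="open")

# A simple join on a shared location key shows why one consistent framework matters:
# both datasets can be filtered and combined without proprietary tooling.
near_open_applications = [
    d for d in departures
    if any(a["suburb"] == d["station_suburb"] for a in applications)
]
print(f"{len(near_open_applications)} departures near open planning applications")
```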

    On the Feasibility of Social Network-based Pollution Sensing in ITSs

    Intense vehicular traffic is recognized as a global societal problem, with a multifaceted influence on people's quality of life. Intelligent Transportation Systems (ITSs) can play an important role in combating this problem by decreasing pollution levels and, consequently, their negative effects. One of the goals of ITSs, in fact, is to control traffic flows, measure traffic states, and provide vehicles with routes that globally pursue low-pollution conditions. How such systems measure and enforce given traffic states has been at the center of multiple research efforts in the past few years. Although many different solutions have been proposed, very limited effort has been devoted to exploring the potential of social network analysis in this context. Social networks provide direct feedback from people and, as such, potentially very valuable information. A post that tells, for example, how a person feels about pollution at a given time in a given location could be put to good use by an environment-aware ITS aiming to minimize contaminant emissions in residential areas. This work verifies the feasibility of feeding pollution-related social network posts into ITS operations. In particular, it concentrates on understanding how reliable such information is, producing an analysis that compares over 1,500,000 posts with pollution data obtained from on-the-field sensors over a one-year span.
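    To make the feasibility question concrete, the sketch below shows one simple way to compare post volume with sensor measurements: bucket pollution-related posts and sensor readings by hour, then compute the Pearson correlation between the two hourly series. The keyword list, record formats and hourly granularity are illustrative assumptions, not the paper's methodology.

```python
from datetime import datetime
from statistics import correlation  # Python 3.10+

# Hypothetical record formats (not the paper's schema):
#   posts: (timestamp, text) tuples from a social feed
#   readings: (timestamp, concentration) tuples from a fixed pollution sensor
POLLUTION_KEYWORDS = ("smog", "pollution", "air quality", "exhaust")

def hourly_counts(posts):
    """Count pollution-related posts per hour."""
    counts = {}
    for ts, text in posts:
        if any(k in text.lower() for k in POLLUTION_KEYWORDS):
            hour = ts.replace(minute=0, second=0, microsecond=0)
            counts[hour] = counts.get(hour, 0) + 1
    return counts

def hourly_means(readings):
    """Average sensor readings per hour."""
    sums, n = {}, {}
    for ts, value in readings:
        hour = ts.replace(minute=0, second=0, microsecond=0)
        sums[hour] = sums.get(hour, 0.0) + value
        n[hour] = n.get(hour, 0) + 1
    return {h: sums[h] / n[h] for h in sums}

def post_sensor_correlation(posts, readings):
    """Pearson correlation between hourly post volume and measured pollution."""
    counts, means = hourly_counts(posts), hourly_means(readings)
    common = sorted(set(counts) & set(means))
    return correlation([counts[h] for h in common], [means[h] for h in common])

# Toy example with hand-made, hypothetical values:
posts = [
    (datetime(2024, 1, 1, 8, 5), "terrible smog downtown"),
    (datetime(2024, 1, 1, 8, 40), "so much exhaust on the ring road"),
    (datetime(2024, 1, 1, 9, 20), "air quality is a bit better now"),
]
readings = [(datetime(2024, 1, 1, 8, 30), 95.0), (datetime(2024, 1, 1, 9, 45), 80.0)]
print(post_sensor_correlation(posts, readings))  # 1.0 for this toy series
```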

    Internet of things for disaster management: state-of-the-art and prospects

    Disastrous events are closely bound up with the forces of nature, and such events have long been beyond the reach of human countermeasures. Fortunately, several technologies are now in service that provide reliable knowledge and analysis of a disaster's occurrence. Recently, the Internet of Things (IoT) paradigm has opened a promising door toward addressing a multitude of problems in agriculture, industry, security, and medicine, thanks to its attractive features, such as heterogeneity, interoperability, light weight, and flexibility. This paper surveys existing approaches to the key issues in disaster management, such as early warning, notification, data analytics, knowledge aggregation, remote monitoring, real-time analytics, and victim localization. The role of IoT in these interventions is given particular attention throughout. A comprehensive discussion of state-of-the-art approaches to handling disastrous events is presented. Furthermore, IoT-supported protocols and market-ready deployable products that address these issues are summarized. Finally, this survey highlights open challenges and research trends in IoT-enabled disaster management systems. © 2013 IEEE
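    As a concrete illustration of the early-warning and notification functions the survey enumerates, the sketch below shows a minimal threshold-based check on incoming sensor readings. The metric names, thresholds, and notification stub are assumptions for illustration only; real deployments would rely on the IoT protocols and alerting channels that the survey reviews.

```python
import json
import time

# Illustrative warning thresholds for two hypothetical monitored metrics.
THRESHOLDS = {"river_level_m": 4.5, "ground_accel_g": 0.3}

def notify(alert):
    """Stand-in for a real notification channel (SMS gateway, MQTT topic, push service)."""
    print(json.dumps(alert))

def check_reading(sensor_id, metric, value):
    """Raise an alert as soon as a monitored metric crosses its warning threshold."""
    limit = THRESHOLDS.get(metric)
    if limit is not None and value >= limit:
        notify({
            "sensor": sensor_id,
            "metric": metric,
            "value": value,
            "threshold": limit,
            "issued_at": time.time(),
        })

# Example: a remote river gauge reports a level above the flood-warning threshold.
check_reading("gauge-07", "river_level_m", 4.8)
```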