23 research outputs found

    Processing Collections of Geo-Referenced Images for Natural Disasters

    Get PDF
    After disaster strikes, emergency response teams need to work fast. In this context, crowdsourcing has emerged as a powerful mechanism in which volunteers help to process tasks such as labeling and classifying complex images. In this work we address the problem of how to efficiently process large volumes of georeferenced images using crowdsourcing in high-risk contexts such as natural disasters. Research on citizen science and crowdsourcing, supported by the results of usability studies, indicates that volunteers should be able to make a useful contribution to a project within a limited amount of time. We present the design of a platform for real-time processing of georeferenced images. In particular, we focus on the interaction between the crowdsourcing platform and the volunteers connected to a P2P network.
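    The task-distribution idea behind such a platform can be illustrated with a minimal sketch. The snippet below is not the paper's platform: the class and function names (ImageTask, CrowdLabelingQueue) are illustrative assumptions, and the P2P transport layer is omitted; it only shows how labeling tasks for georeferenced image tiles might be handed out to volunteers and closed by majority vote.

```python
# Minimal sketch (not the paper's actual platform): distributing georeferenced-image
# labeling tasks to connected volunteers and aggregating their labels by majority vote.
# All class and function names here are illustrative assumptions.
from collections import Counter, deque
from dataclasses import dataclass, field

@dataclass
class ImageTask:
    image_id: str
    lat: float
    lon: float
    labels: list = field(default_factory=list)   # labels contributed so far

class CrowdLabelingQueue:
    def __init__(self, tasks, votes_needed=3):
        self.pending = deque(tasks)        # tasks still needing votes
        self.votes_needed = votes_needed
        self.done = []

    def next_task(self):
        """Hand the next open task to a connected volunteer."""
        return self.pending[0] if self.pending else None

    def submit_label(self, task, label):
        """Record one volunteer's label; close the task after enough votes."""
        task.labels.append(label)
        if len(task.labels) >= self.votes_needed:
            self.pending.popleft()
            consensus, _ = Counter(task.labels).most_common(1)[0]
            self.done.append((task.image_id, consensus))

# Usage: three volunteers label one aerial image tile.
queue = CrowdLabelingQueue([ImageTask("tile_001", -34.92, -57.95)])
for vote in ["flooded", "flooded", "not_flooded"]:
    queue.submit_label(queue.next_task(), vote)
print(queue.done)   # [('tile_001', 'flooded')]
```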

    Large Spatial Datasets: Present Challenges, Future Opportunities

    Get PDF
    The key advantage of a well-designed multidimensional database is its ability to allow as many users as possible across an organisation to simultaneously access and view the same data. Large spatial datasets arise from recent scientific activities that tend to generate databases approaching terabytes in size, and in most cases these are multidimensional. In this paper, we look at the issues pertaining to large spatial datasets: their features (for example, views), architecture, access methods and, most importantly, design technologies. We also look at some ways of improving the performance of existing algorithms for managing large spatial datasets. The study reveals that the major challenges militating against effective management of large spatial datasets are storage utilization and computational complexity, both of which are driven by the size of spatial big data, which now tends to exceed the capacity of commonly used spatial computing systems owing to its volume, variety and velocity. These problems can fortunately be combated by employing functional programming methods or parallelization techniques.
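    As a concrete illustration of the parallelization techniques mentioned above, the sketch below (our own assumption, not an algorithm from the paper) partitions a large set of points into chunks and counts, in parallel worker processes, how many fall inside a query bounding box.

```python
# Hedged sketch: data-parallel processing of a large spatial dataset.
# Points are split into chunks and each chunk is filtered in a separate
# process; this illustrates one generic parallelization technique, not the
# paper's specific method.
from multiprocessing import Pool
import random

def count_in_bbox(args):
    """Count the points of one chunk that fall inside the bounding box."""
    points, (xmin, ymin, xmax, ymax) = args
    return sum(1 for x, y in points if xmin <= x <= xmax and ymin <= y <= ymax)

def parallel_bbox_count(points, bbox, workers=4):
    """Split the dataset into roughly equal chunks and reduce the partial counts."""
    chunk = (len(points) + workers - 1) // workers
    tasks = [(points[i:i + chunk], bbox) for i in range(0, len(points), chunk)]
    with Pool(workers) as pool:
        return sum(pool.map(count_in_bbox, tasks))

if __name__ == "__main__":
    pts = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(1_000_000)]
    print(parallel_bbox_count(pts, (25, 25, 75, 75)))   # roughly 250,000 expected
```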

    Maritime Data Integration and Analysis: Recent Progress and Research Challenges

    Get PDF
    The correlated exploitation of heterogeneous data sources offering very large historical as well as streaming data is important for increasing the accuracy of computations when analysing and predicting future states of moving entities. This is particularly critical in the maritime domain, where online tracking, early recognition of events, and real-time forecasting of anticipated vessel trajectories are crucial to safety and operations at sea. The objective of this paper is to review current research challenges and trends tied to the integration, management, analysis, and visualization of objects moving at sea, and to offer a few suggestions for the successful development of maritime forecasting and decision-support systems.
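    A simple baseline for the real-time trajectory forecasting mentioned above can be sketched as follows; this dead-reckoning projection from a vessel's last reported position, speed and course is our own illustrative assumption, not a technique proposed in the paper.

```python
# Hedged sketch: dead-reckoning forecast of a vessel's position from its last
# AIS-style report (latitude, longitude, speed over ground, course over ground).
# This is a baseline illustration only, not the paper's forecasting method.
import math

EARTH_RADIUS_NM = 3440.065   # mean Earth radius in nautical miles

def dead_reckon(lat, lon, sog_knots, cog_degrees, minutes_ahead):
    """Project the position assuming constant speed and course (great-circle step)."""
    distance_nm = sog_knots * minutes_ahead / 60.0
    ang = distance_nm / EARTH_RADIUS_NM           # angular distance in radians
    lat1, lon1, brg = map(math.radians, (lat, lon, cog_degrees))
    lat2 = math.asin(math.sin(lat1) * math.cos(ang) +
                     math.cos(lat1) * math.sin(ang) * math.cos(brg))
    lon2 = lon1 + math.atan2(math.sin(brg) * math.sin(ang) * math.cos(lat1),
                             math.cos(ang) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)

# Usage: where will a vessel doing 15 knots on course 045 degrees be in 30 minutes?
print(dead_reckon(37.95, 23.60, sog_knots=15, cog_degrees=45, minutes_ahead=30))
```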

    High-Performance Spatial Query Processing on Big Taxi Trip Data Using GPGPUs

    Full text link
    City-wide GPS-recorded taxi trip data contains rich information for traffic and travel analysis to facilitate transportation planning and urban studies. However, traditional data management techniques are largely incapable of processing big taxi trip data at the scale of hundreds of millions of records. In this study, we aim at utilizing General Purpose computing on Graphics Processing Units (GPGPU) technologies to speed up complex spatial queries on big taxi data on inexpensive commodity GPUs. By using the land use types of tax lot polygons as a proxy for trip purposes at the pickup and drop-off locations, we formulate a taxi trip data analysis problem as a large-scale nearest neighbor spatial query problem based on point-to-polygon distance. Experiments on nearly 170 million taxi trips in New York City (NYC) in 2009 and 735,488 tax lot polygons with 4,698,986 vertices have demonstrated the efficiency of the proposed techniques: the GPU implementation is about 10-20X faster than the host system and completes the spatial query in about a minute. We further discuss several interesting patterns discovered from the query results which warrant further study. The proposed approach can be an interesting alternative to traditional MapReduce/Hadoop based approaches to processing big data with respect to performance and cost.
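    The underlying query formulation can be illustrated on the CPU with NumPy; the sketch below is our own simplification (polygons reduced to their vertices, brute-force distances) of a point-to-polygon nearest-neighbor query, not the paper's GPU implementation.

```python
# Hedged sketch: nearest-neighbor spatial query of trip pickup points against
# tax-lot polygons, here simplified to the minimum point-to-vertex distance and
# computed brute-force with NumPy. The paper's GPU version is far more involved.
import numpy as np

def nearest_polygon(points, polygons):
    """For each point, return the index of the polygon with the closest vertex.

    points:   (N, 2) array of pickup/drop-off coordinates
    polygons: list of (M_i, 2) arrays of polygon vertices
    """
    best_idx = np.zeros(len(points), dtype=np.int64)
    best_dist = np.full(len(points), np.inf)
    for i, verts in enumerate(polygons):
        # distance from every point to every vertex of polygon i, then the closest vertex
        d = np.linalg.norm(points[:, None, :] - verts[None, :, :], axis=2).min(axis=1)
        closer = d < best_dist
        best_dist[closer] = d[closer]
        best_idx[closer] = i
    return best_idx

# Usage with toy data: two trips, two "polygons" (land-use lots).
trips = np.array([[0.1, 0.1], [5.0, 5.1]])
lots = [np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]]),
        np.array([[5.0, 5.0], [6.0, 5.0], [6.0, 6.0]])]
print(nearest_polygon(trips, lots))   # expected: [0 1]
```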

    Methodology for Defining Territorial Planning Units for Interurban Passenger Transport in Brazil

    Get PDF
    In Brazil, plans and studies focused on urban mobility are common, but interurban mobility at the national scale is not studied as an integrated system, largely because of the institutional segregation of transport system management and planning, which hinders a view of the territory, impairs understanding of the geographic context, and limits the technical development of planning from a systemic perspective. Therefore, in order to materialize a zoning scheme that establishes a geographic frame of reference, this article proposes a methodology for defining Territorial Planning Units (Unidades Territoriais de Planejamento, UTP), defined here as the regions where population is concentrated and intraurban flows cluster. To this end, data on road and rail, air, and waterway transport were used, together with the National Tourism Plan (MTur, 2013), the National Logistics Plan (EPL, 2018), the National Aviation Plan (MTPA, 2018) and studies by IBGE (2008, 2016, 2017a, 2017b). We conclude that, although this is an initial delineation, with the UTPs defined for planning interurban passenger transport in Brazil it is possible to begin a systemic analysis and, consequently, to advance the planning of the infrastructure and services involved. Keywords: interurban mobility; transport planning; Territorial Planning Units; national integration
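    The kind of aggregation that underlies such planning units can be sketched simply; the grouping below (municipalities mapped to hypothetical UTPs, with an origin-destination flow matrix collapsed accordingly) is our own illustration with invented figures, not the article's actual procedure or data.

```python
# Hedged sketch: collapsing a municipality-level origin-destination flow matrix
# into flows between Territorial Planning Units (UTPs). The unit assignments and
# flow figures are invented for illustration; the article's methodology differs.
from collections import defaultdict

# Hypothetical assignment of municipalities to UTPs.
utp_of = {"Campinas": "UTP-SP", "Santos": "UTP-SP",
          "Niteroi": "UTP-RJ", "Petropolis": "UTP-RJ"}

# Hypothetical daily interurban passenger flows: (origin, destination) -> trips.
flows = {("Campinas", "Niteroi"): 1200, ("Santos", "Petropolis"): 300,
         ("Niteroi", "Campinas"): 1100, ("Campinas", "Santos"): 4000}

def aggregate_by_utp(flows, utp_of):
    """Sum municipality-to-municipality flows into UTP-to-UTP flows."""
    agg = defaultdict(int)
    for (origin, dest), trips in flows.items():
        agg[(utp_of[origin], utp_of[dest])] += trips
    return dict(agg)

print(aggregate_by_utp(flows, utp_of))
# {('UTP-SP', 'UTP-RJ'): 1500, ('UTP-RJ', 'UTP-SP'): 1100, ('UTP-SP', 'UTP-SP'): 4000}
```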