254 research outputs found

    From Accessibility and Exposure to Engagement: A Multi-scalar Approach to Measuring Environmental Determinants of Children’s Health Using Geographic Information Systems

    A growing body of research suggests that increasing accessibility to health-related environmental features, and increasing exposure to and engagement in outdoor environments, leads to positive benefits for the overall health and well-being of children. Additionally, research over the last twenty-five years has documented a decline in the time children spend outdoors. Outdoor activity in children is associated with increased levels of physical fitness and cognitive well-being. Despite acknowledging this connection, researchers face problems when attempting to identify a child’s location and to measure whether a child has made use of an accessible health-related facility, or where, when and for how long a child spends time outdoors. The purpose of this thesis is to measure children’s accessibility to, exposure to, and engagement with health-promoting features of their environment. The research on the environment-health link aims to meet two objectives: 1) to quantify the magnitude of positional discrepancies and accessibility misclassification that result from using several commonly-used address proxies; and 2) to examine how individual-level, household-level, and neighbourhood-level factors are associated with the quantity of time children spend outdoors. This is achieved by employing GPS tracking to objectively quantify time spent outdoors using a novel machine learning algorithm, and by applying a hexagonal grid to extract built environment measures. This study identifies the impact of positional discrepancies when measuring accessibility by examining misclassification of address proxies relative to several health-related facilities throughout the City of London and Middlesex County, Ontario, Canada. Positional errors are quantified by multiple neighbourhood types.
Findings indicate that the shorter the threshold distance used to measure accessibility between the subject population and a health-related facility, the higher the proportion of misclassified addresses. Using address proxies based on large aggregated units, such as centroids of census tracts or dissemination areas, can result in vast positional discrepancies, and should therefore be avoided in spatial epidemiologic research. To reduce misclassification and positional errors, individual portable passive GPS receivers were employed to objectively track the spatial patterns, and quantify the time spent outdoors, of children (aged 7 to 13 years) in London, Ontario across multiple neighbourhood types. On the whole, children spent most of their outdoor time during school hours (recess), and most non-school outdoor time in areas immediately surrounding their home. From these findings, policymakers, educators, and parents can support children’s health by making greater efforts to promote outdoor activities for improved health and quality of life. This thesis advances our understanding of the environment-health link and suggests practical steps for better-informed decision making by combining novel classification and mapping techniques.
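The threshold-dependent misclassification described above can be illustrated with a small, self-contained sketch. The coordinates, the single facility, and the straight-line distance are hypothetical simplifications for illustration; the thesis itself works with real address, proxy, and facility data.

```python
import math

# Hypothetical (x, y) coordinates in metres; not real study data.
# Each child has a true address and a proxy location (e.g. the
# centroid of their census tract or dissemination area).
children = [
    {"true": (120, 340), "proxy": (500, 500)},
    {"true": (910, 220), "proxy": (500, 500)},
    {"true": (480, 530), "proxy": (500, 500)},
]
facility = (450, 480)  # e.g. a park entrance

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def misclassification_rate(threshold):
    """Share of children whose proxy-based accessibility label
    (within `threshold` metres of the facility) disagrees with
    the label derived from their true address."""
    wrong = 0
    for c in children:
        true_access = dist(c["true"], facility) <= threshold
        proxy_access = dist(c["proxy"], facility) <= threshold
        wrong += (true_access != proxy_access)
    return wrong / len(children)

# Shorter thresholds tend to produce more misclassified addresses.
for t in (100, 500, 1000):
    print(t, misclassification_rate(t))
```

With all proxies collapsed onto one centroid near the facility, the misclassification rate falls from 2/3 at a 100 m threshold to 0 at 1000 m, mirroring the finding that tighter accessibility thresholds amplify proxy-induced error.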

    Multiscale Modeling of Inter-Vehicle Communication

    This thesis investigates modeling approaches at different scales in the domains of urban radio propagation, decentralized channel coordination, and information dissemination in inter-vehicle communication networks. The contributions reveal the suitability of existing models for network-oriented research, propose a novel information-centric modeling approach, and identify characteristics of inter-vehicle communication systems that determine key dependability aspects.

    Personalized Air Quality Sensing: A Case Study Analysis in Singapore

    Cognitive Foundations for Visual Analytics

    Journal of Environmental Geography: Vol. XIII, No. 3-4.

    Road Traffic Congestion Analysis Via Connected Vehicles

    Road traffic congestion is a particular state of mobility in which travel times increase and more and more time is spent in vehicles. Apart from being a stressful experience for drivers, congestion also has a negative impact on the environment and the economy. In this context, there is pressure on the authorities to take decisive action to improve network traffic flow; by improving flow, congestion is reduced and the total travel time of vehicles decreases. Congestion can be classified as recurrent or non-recurrent (NRC): recurrent congestion happens on a regular basis, while non-recurrent congestion in an urban network is mainly caused by incidents, work zones, special events and adverse weather. Infrastructure operators monitor traffic on the network while using the fewest possible resources, so the traffic state cannot be measured directly everywhere on the road network. The locations where traffic flow needs to be improved vary widely, and deploying highly sophisticated equipment to ensure accurate estimation of traffic flows and timely detection of events everywhere on the network is not feasible.
Moreover, many studies have been devoted to highways rather than to highly congested urban regions, which are intricate, complex networks and far more likely to be monitored by traffic authorities. Current traffic data collection systems also do not incorporate the ability to register detailed information on the events happening on the road, such as vehicle crashes or adverse weather, so operators require external data sources to retrieve this information in real time. Existing methods only detect congestion, but that is not enough: we should be able to better characterize the event causing it. Agencies need to understand what cause is affecting flow variability on their facilities, and to what degree, so that they can take appropriate action to mitigate congestion.

    Proceedings of the GIS Research UK 18th Annual Conference GISRUK 2010

    This volume holds the papers from the 18th annual GIS Research UK conference (GISRUK). This year the conference was hosted at University College London (UCL) from Wednesday 14 to Friday 16 April 2010. The conference covered core geographic information science research as well as application domains such as crime and health, and technological developments in LBS and the geoweb. UCL’s research mission as a global university is based around a series of Grand Challenges that affect us all, and these were accommodated in GISRUK 2010. The overarching theme this year was “Global Challenges”, with specific focus on the following themes:
    * Crime and Place
    * Environmental Change
    * Intelligent Transport
    * Public Health and Epidemiology
    * Simulation and Modelling
    * London as a global city
    * The geoweb and neo-geography
    * Open GIS and Volunteered Geographic Information
    * Human-Computer Interaction and GIS
    Traditionally, GISRUK has provided a platform for early career researchers as well as those with a significant track record of achievement in the area. As such, the conference provides a welcome blend of innovative thinking and mature reflection. GISRUK is the premier academic GIS conference in the UK and we are keen to maintain its outstanding record of achievement in developing GIS in the UK and beyond.

    Designing the next generation intelligent transportation sensor system using big data driven machine learning techniques

    Accurate traffic data collection is essential for supporting advanced traffic management system operations. This study investigated a large-scale, data-driven, sequential traffic sensor health monitoring (TSHM) module that can be used to monitor sensor health conditions over large traffic networks. Our proposed module consists of three sequential steps for detecting different types of abnormal sensor issues. The first step detects sensors with abnormally high missing data rates, the second step uses clustering-based anomaly detection to detect sensors reporting abnormal records, and the final step introduces a novel Bayesian changepoint modeling technique to detect sensors reporting abnormal traffic data fluctuations, assuming a constant vehicle length distribution based on average effective vehicle length (AEVL). Our proposed method is then compared with two benchmark algorithms to show its efficacy. Results obtained by applying our method to the statewide traffic sensor data of Iowa show it can successfully detect different classes of sensor issues. This demonstrates that sequential TSHM modules can help transportation agencies determine traffic sensors’ exact problems, thereby enabling them to take the required corrective steps. The second research objective focuses on traffic data imputation after anomalous or missing data collected from failing traffic sensors are discarded. Sufficient high-quality traffic data are a crucial component of various Intelligent Transportation System (ITS) applications and research related to congestion prediction, speed prediction, incident detection, and other traffic operation tasks. Nonetheless, missing data are a common and inevitable issue in sensor data, due to reasons such as malfunctioning, poor maintenance or calibration, and intermittent communications. Such missing data issues often make data analysis and decision-making complicated and challenging.
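The first TSHM step, flagging sensors with abnormally high missing-data rates, can be sketched in a few lines. The mean-plus-two-standard-deviations threshold and the toy sensor records are assumptions for illustration only; the study's actual detection criterion may differ.

```python
from statistics import mean, stdev

def missing_rate(readings):
    """Fraction of None entries in a sensor's reading history."""
    return sum(r is None for r in readings) / len(readings)

def flag_high_missing(sensors, k=2.0):
    """Return ids of sensors whose missing-data rate exceeds
    mean + k * stdev across the whole network (k=2 is an
    illustrative choice, not the study's criterion)."""
    rates = {sid: missing_rate(r) for sid, r in sensors.items()}
    mu, sigma = mean(rates.values()), stdev(rates.values())
    return sorted(sid for sid, rate in rates.items()
                  if rate > mu + k * sigma)

# Toy data: nine healthy sensors (~25% missing) and one
# failing sensor, "S10", that drops ~75% of its readings.
sensors = {f"S{i}": [1.0, None, 1.2, 1.1] * 25 for i in range(1, 10)}
sensors["S10"] = [None, None, None, 1.0] * 25

print(flag_high_missing(sensors))  # → ['S10']
```

Subsequent steps would then pass the surviving sensors to the clustering-based record check and the Bayesian changepoint check before any imputation is attempted.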
In this study, we developed a generative adversarial network (GAN) based traffic sensor data imputation framework (TSDIGAN) to efficiently reconstruct missing data by generating realistic synthetic data. In recent years, GANs have shown impressive success in image data generation. However, generating traffic data with GAN-based modeling is challenging, since traffic data have strong time dependency. To address this problem, we propose a novel time-dependent encoding method called the Gramian Angular Summation Field (GASF) that converts the problem of traffic time-series data generation into one of image generation. We evaluated and tested our proposed model using the benchmark dataset provided by Caltrans Performance Management Systems (PeMS). This study shows that the proposed model can significantly improve traffic data imputation accuracy in terms of Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE) compared to state-of-the-art models on the benchmark dataset. Further, the model achieves reasonably high accuracy in imputation tasks even under a very high missing data rate (>50%), which shows the robustness and efficiency of the proposed model. Besides loop and radar sensors, traffic cameras have shown great ability to provide insightful traffic information using image and video processing techniques. Therefore, the third and final part of this work introduces an end-to-end, real-time, cloud-enabled intelligent video analysis (IVA) framework to support the development of the future smart city. As artificial intelligence (AI) grows rapidly, computer vision (CV) techniques are expected to significantly improve the development of intelligent transportation systems (ITS), which are anticipated to be a key component of future Smart City (SC) frameworks.
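The GASF encoding mentioned above is compact enough to sketch directly: the series is rescaled to [-1, 1], each value becomes an angle via arccos, and entry (i, j) of the resulting "image" is cos(φᔹ + φⱌ). This is a generic illustration of the published GASF transform, not the thesis's exact implementation, and the speed values are made up.

```python
import math

def gasf(series):
    """Gramian Angular Summation Field: encode a 1-D series as a
    symmetric 2-D matrix so image-based models (e.g. GANs) can
    consume it. Assumes a non-constant series."""
    lo, hi = min(series), max(series)
    # Rescale to [-1, 1] so arccos is defined.
    x = [2 * (v - lo) / (hi - lo) - 1 for v in series]
    phi = [math.acos(v) for v in x]
    # Entry (i, j) is cos(phi_i + phi_j).
    return [[math.cos(a + b) for b in phi] for a in phi]

# Toy traffic-speed series (km/h); real input would be e.g.
# a window of 5-minute loop-detector readings.
speeds = [55, 60, 42, 30, 48]
img = gasf(speeds)
```

The resulting n×n matrix preserves temporal dependency along its diagonals, which is what lets an image-oriented GAN generate plausible traffic series.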
Powered by computer vision techniques, converting existing traffic cameras into connected “smart sensors”, called intelligent video analysis (IVA) systems, has shown great capability of producing insightful data to support ITS applications. However, developing such IVA systems for large-scale, real-time applications deserves further study, as current research efforts focus more on model effectiveness than on model efficiency. Therefore, we introduce a real-time, large-scale, cloud-enabled traffic video analysis framework using NVIDIA DeepStream, a streaming analysis toolkit for AI-based video and image analysis. In this study, we evaluated the technical and economic feasibility of our proposed framework to help traffic agencies build IVA systems more efficiently. Our study shows that the daily operating cost of our proposed framework on Google Cloud Platform (GCP) is less than $0.14 per camera, and that, compared with manual inspections, our framework achieves an average vehicle-counting accuracy of 83.7% on sunny days.