Optimization of traffic simulation using GPS navigation records
Graduation Project (Master's in Computing with an emphasis in Computer Science), Instituto Tecnológico de Costa Rica, Escuela de Ingeniería en Computación, 2021.
A traffic simulation is a tool for constructing a virtual environment based
on a real one, with the objective of analysing the actual conditions
and, more importantly, applying changes to the virtual scene or to the driving rules
to generate new scenarios and test solutions. However, the problem we found
with simulations is that, with incorrect parameters, they may not represent the traffic
conditions we are looking for.
In this work, we propose a method to calibrate traffic simulations using data
available for transportation in Costa Rica. These data come from Global Positioning
System (GPS) navigation records. The calibration algorithm seeks to reproduce
the actual traffic conditions in a virtual environment so that, afterwards, solutions
to ease the congested traffic situations can be proposed and designed.
This thesis reflects months of work to design and implement an algorithm to
calibrate simulations of five sectors of the country where we found difficult traffic
conditions. The algorithm calculates a Measure of Performance to compare data
from the simulation with the GPS records, and it searches iteratively for the best
parameters. Finally, it validates the best solution found with a statistical test.
As a result, we calibrated the simulations for the five studied sectors,
reaching a configuration of input parameters that reflects the traffic conditions
extracted from the GPS records, as a portrait of the real-life conditions of those
locations.
The impact and applications of this work are many. On the computing side, we
can dig deeper into further calibration techniques and also exploit
the available data for more general work. Moreover, it can become a significant
resource for analysis and decision making in urban mobility studies.
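The iterative search described in the abstract can be illustrated with a minimal sketch. The thesis does not specify the search strategy or the exact Measure of Performance, so this example assumes a simple random search over two hypothetical parameters ("flow_scale", "speed_offset") and uses RMSE between simulated and GPS-observed speeds as the MoP; `run_simulation` is a stand-in for the real traffic simulator.

```python
import random

def measure_of_performance(sim_speeds, gps_speeds):
    """Root-mean-square error between simulated and observed (GPS) speeds."""
    n = len(gps_speeds)
    return (sum((s - g) ** 2 for s, g in zip(sim_speeds, gps_speeds)) / n) ** 0.5

def run_simulation(params, gps_speeds):
    """Hypothetical stand-in for a traffic simulator: distorts the observed
    speed profile according to the candidate parameters."""
    return [params["flow_scale"] * g + params["speed_offset"] for g in gps_speeds]

def calibrate(gps_speeds, iterations=200, seed=0):
    """Random search over the parameter space, keeping the best MoP found."""
    rng = random.Random(seed)
    best_params, best_mop = None, float("inf")
    for _ in range(iterations):
        candidate = {
            "flow_scale": rng.uniform(0.5, 1.5),
            "speed_offset": rng.uniform(-10.0, 10.0),
        }
        sim = run_simulation(candidate, gps_speeds)
        mop = measure_of_performance(sim, gps_speeds)
        if mop < best_mop:
            best_params, best_mop = candidate, mop
    return best_params, best_mop

gps = [42.0, 38.5, 55.1, 47.3, 60.2]  # observed GPS speeds, km/h
params, mop = calibrate(gps)
```

A real calibration would replace `run_simulation` with a call into the simulator and would finish, as the thesis does, by validating the best configuration with a statistical test against the GPS data.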
Big data analytics: Computational intelligence techniques and application areas
Big Data has a significant impact on developing functional smart cities and supporting modern societies. In this paper, we investigate the importance of Big Data in modern life and economy, and discuss challenges arising from Big Data utilization. Different computational intelligence techniques have been considered as tools for Big Data analytics. We also explore the powerful combination of Big Data and Computational Intelligence (CI) and identify a number of areas where novel applications in real-world smart city problems can be developed by utilizing these powerful tools and techniques. We present a case study for intelligent transportation in the context of a smart city, and a novel data modelling methodology based on a biologically inspired universal generative modelling approach called the Hierarchical Spatial-Temporal State Machine (HSTSM). We further discuss various implications of policy, protection, valuation and commercialization related to Big Data, its applications and deployment.
A Simple Flood Forecasting Scheme Using Wireless Sensor Networks
This paper presents a forecasting model designed using WSNs (Wireless Sensor
Networks) to predict flood in rivers using simple and fast calculations to
provide real-time results and save the lives of people who may be affected by
the flood. Our prediction model uses multiple-variable robust linear regression,
which is easy to understand, simple and cost-effective to implement, and
speed-efficient; it has low resource utilization and yet provides real-time
predictions with reliable accuracy, features that are desirable in
any real-world algorithm. Our prediction model is independent of the number of
parameters, i.e. any number of parameters may be added or removed based on the
on-site requirements. When the water level rises, we represent it using a
polynomial whose nature is used to determine if the water level may exceed the
flood line in the near future. We compare our work with a contemporary
algorithm to demonstrate our improvements over it. Then we present our
simulation results for the predicted water level compared to the actual water
level.
Comment: 16 pages, 4 figures; published in International Journal of Ad-Hoc, Sensor and Ubiquitous Computing, February 2012. V. Seal et al., "A Simple Flood Forecasting Scheme Using Wireless Sensor Networks", IJASUC, Feb. 2012.
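The trend-extrapolation idea in the abstract can be sketched briefly. The paper uses multiple-variable robust linear regression; the sketch below simplifies this to a single-variable ordinary least-squares fit over equally spaced readings, extrapolated a few steps ahead and compared against a flood line. The function names and sample readings are illustrative, not from the paper.

```python
def fit_line(levels):
    """Least-squares line through equally spaced readings (t = 0, 1, ...)."""
    n = len(levels)
    ts = range(n)
    mean_t = sum(ts) / n
    mean_y = sum(levels) / n
    slope = (sum((t - mean_t) * (y - mean_y) for t, y in zip(ts, levels))
             / sum((t - mean_t) ** 2 for t in ts))
    intercept = mean_y - slope * mean_t
    return slope, intercept

def will_exceed_flood_line(levels, flood_line, horizon):
    """Extrapolate the fitted trend `horizon` steps ahead and check
    whether the predicted level reaches the flood line."""
    slope, intercept = fit_line(levels)
    predicted = slope * (len(levels) - 1 + horizon) + intercept
    return predicted >= flood_line

readings = [2.0, 2.3, 2.7, 3.1, 3.6]  # water level in metres, rising
alarm = will_exceed_flood_line(readings, flood_line=5.0, horizon=4)
```

With these sample readings the fitted slope is 0.4 m per step, so four steps ahead the predicted level (about 5.14 m) crosses the 5.0 m flood line and the alarm is raised, while a one-step horizon does not trigger it.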
An automatic calibration procedure of driving behaviour parameters in the presence of high bus volume
Most of the microscopic traffic simulation programs used today incorporate car-following and lane-change models to simulate driving behaviour across a given area. The main goal of this study has been to develop an automatic calibration process for the parameters of driving behaviour models using metaheuristic algorithms. Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and combinations of GA and PSO (i.e. hybrid GAPSO and hybrid PSOGA) were used during the optimization stage. In order to verify our proposed methodology, a suitable study area with a high-bus-volume on-ramp from the O-1 Highway in Istanbul has been modelled in VISSIM. Traffic data have been gathered through detectors. The calibration procedure has been coded in MATLAB and implemented via the VISSIM-MATLAB COM interface. Using the proposed methodology, the results of the calibrated model showed that the hybrid GAPSO and hybrid PSOGA techniques outperformed the GA-only and PSO-only techniques during the calibration process. Thus, both are recommended for use in the calibration of microsimulation traffic models, rather than GA-only and PSO-only techniques.
This study is a part of the Ph.D. thesis of the corresponding author at Istanbul Technical University, Turkey. We would like to thank Prof. Dr. Tomaž Maher and Assist. Dr. Rok Marsetič from the Traffic Technical Institute of the Faculty of Civil and Geodetic Engineering, University of Ljubljana, Slovenia, for their kind support, valuable comments, and helpful suggestions. The authors are also thankful to the PTV AG Karlsruhe company for their support in providing a thesis-based unlimited version of the VISSIM software.
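The optimization stage can be illustrated with a minimal PSO-only sketch (the paper's hybrid GAPSO/PSOGA variants and the VISSIM-COM coupling are beyond a short example). The objective below is a toy stand-in for the simulator-vs-detector error; the parameter bounds, swarm size, and weights are illustrative assumptions, not the study's settings.

```python
import random

def pso_minimise(objective, bounds, n_particles=20, iterations=100, seed=1):
    """Minimal particle swarm optimisation: each particle tracks its personal
    best, the swarm tracks a global best, and velocities blend inertia with
    pulls toward both bests."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia, cognitive and social weights
    for _ in range(iterations):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                # Clamp each coordinate to its bounds after the velocity update.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in for the simulation error surface (true optimum at (1.2, 0.5)).
error = lambda p: (p[0] - 1.2) ** 2 + (p[1] - 0.5) ** 2
best, best_err = pso_minimise(error, bounds=[(0.0, 3.0), (0.0, 3.0)])
```

In the study the objective function would instead launch a VISSIM run over COM with the candidate driving-behaviour parameters and return the discrepancy against the detector data, which is what makes each evaluation expensive and motivates the hybrid GA/PSO variants.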
Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks
Future wireless networks have substantial potential to support
a broad range of complex, compelling applications in both military and civilian
fields, where users are able to enjoy high-rate, low-latency, low-cost and
reliable information services. Achieving this ambitious goal requires new radio
techniques for adaptive learning and intelligent decision making because of the
complex, heterogeneous nature of the network structures and wireless services.
Machine learning (ML) algorithms have had great success in supporting big data
analytics, efficient parameter estimation and interactive decision making.
Hence, in this article, we review the thirty-year history of ML by elaborating
on supervised learning, unsupervised learning, reinforcement learning and deep
learning. Furthermore, we investigate their employment in the compelling
applications of wireless networks, including heterogeneous networks (HetNets),
cognitive radios (CR), the Internet of Things (IoT), machine-to-machine networks
(M2M), and so on. This article aims to assist readers in clarifying the
motivation and methodology of the various ML algorithms, so as to invoke them
for hitherto unexplored services and scenarios of future wireless networks.
Comment: 46 pages, 22 figures