88 research outputs found
Cooperative download in urban vehicular networks
We target urban scenarios where vehicular users can download large files from road-side Access Points (APs), and define a framework that exploits opportunistic encounters between mobile nodes to increase their transfer rate. We first devise a technique for AP deployment, based on the analysis of vehicular traffic flows, which fosters cooperative download. Then, we propose and evaluate different algorithms for carrier selection and chunk scheduling in carry&forward data transfers. Results obtained under realistic road topology and vehicular mobility conditions show that coupling our AP deployment scheme with probabilistic carrier selection and redundant chunk scheduling yields a worst-case 2x gain in the average download rate with respect to direct download, as well as a 10x reduction in the rate of undelivered chunks with respect to a blind carry&forward. Peer Reviewed. Postprint (published version).
Analytical models for the multiplexing of worst case traffic sources and their application to ATM traffic control.
Postprint (published version).
Exact decoding probability under random linear network coding
In this letter, we compute the exact probability that a receiver obtains N linearly independent packets among K ≥ N received packets, when the sender(s) use random linear network coding over a Galois field of size q. This condition corresponds to the receiver's ability to decode the original information, and its mathematical characterization helps design the coding so as to guarantee the correctness of the transmission. Our formulation improves on the current upper bound for the decoding probability, and provides theoretical grounding for simulation results in the literature. Peer Reviewed. Postprint (published version).
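The quantity described here is the classical probability that a random K × N matrix over GF(q) has full column rank, with closed form ∏_{i=0}^{N−1} (1 − q^{−(K−i)}). As a minimal sketch of that standard formula (not the letter's own code), it can be evaluated exactly with rational arithmetic:

```python
from fractions import Fraction

def decode_probability(N: int, K: int, q: int) -> Fraction:
    """Probability that K random packets coded over GF(q) contain
    N linearly independent ones, i.e. that a random K x N matrix
    over GF(q) has rank N: prod_{i=0}^{N-1} (1 - q^{-(K-i)})."""
    assert K >= N >= 0 and q >= 2
    p = Fraction(1)
    for i in range(N):
        p *= 1 - Fraction(1, q ** (K - i))
    return p

# With q = 2 and no redundancy (K = N = 4), decoding succeeds
# only about 31% of the time; extra received packets help fast.
p_exact = decode_probability(4, 4, 2)   # 315/1024
p_redundant = decode_probability(4, 8, 2)
```

Receiving even a few extra packets drives the failure probability down roughly geometrically in q, which is why small fields like GF(2) need noticeably more overhead than GF(256).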
Information mobility: a new paradigm for wireless content dissemination
Content distribution networks are nowadays a mature technology. Nevertheless, content delivery research in ad hoc networks has dealt with overlay applications, information querying, and data broadcasting techniques. The aim of this paper is to explore and discuss content delivery in wireless ad hoc networks and to state challenges and reference models for this kind of network. The concept of Information Mobility is introduced to point out that contents are stored in mobile nodes and that they move and replicate before being accessed. The network considered is a wireless ad hoc network with sparse connectivity and limited infrastructure support. Peer Reviewed. Postprint (published version).
Review of linear algebra and applications to data science
Lecture notes of the "Review of linear algebra and applications to data science" part of the course SANS (Statistical Analysis of Networks and Systems) of the Master in Innovation and Research in Informatics (MIRI) at FIB, UPC. 2023/202
Review of probability theory
Lecture notes of the review of probability theory of the course SANS (Statistical Analysis of Networks and Systems) of the Master in Innovation and Research in Informatics (MIRI) at FIB, UPC. 2023/202
A comparative study of calibration methods for low-cost ozone sensors in IoT platforms
© 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. This paper shows the result of the calibration process of an Internet of Things platform for the measurement of tropospheric ozone (O3). The platform, formed by 60 nodes deployed in Italy, Spain, and Austria, consisted of 140 metal-oxide O3 sensors, 25 electrochemical O3 sensors, 25 electrochemical NO2 sensors, and 60 temperature and relative humidity sensors. As ozone is a seasonal pollutant, which appears in summer in Europe, the biggest challenge is to calibrate the sensors in a short period of time. In this paper, we compare four calibration methods in the presence of a large dataset for model training, and we also study the impact of a limited training dataset on long-range predictions. We show that the difficulty in calibrating these sensor technologies in a real deployment is mainly due to the bias produced by the environmental conditions found during prediction differing from those found during the training phase. Peer Reviewed. Postprint (author's final draft).
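The paper compares four calibration methods; as a minimal illustration of the general setup (not the paper's exact models), low-cost sensor calibration is often posed as a regression from the raw sensor reading plus temperature and relative humidity correctors to the reference-station value. A sketch using ordinary least squares, with all variable names hypothetical:

```python
import numpy as np

def mlr_calibrate(raw, temp, rh, ref):
    """Fit ref ≈ b0 + b1*raw + b2*temp + b3*rh by ordinary least
    squares during the co-location period; returns coefficients b."""
    X = np.column_stack([np.ones_like(raw), raw, temp, rh])
    b, *_ = np.linalg.lstsq(X, ref, rcond=None)
    return b

def mlr_predict(b, raw, temp, rh):
    """Apply the fitted calibration to new raw sensor data."""
    X = np.column_stack([np.ones_like(raw), raw, temp, rh])
    return X @ b
```

The bias problem described in the abstract appears exactly here: if the deployment-period (temp, rh) ranges differ from those seen during training, the fitted coefficients extrapolate poorly.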
Graph signal reconstruction techniques for IoT air pollution monitoring platforms
Air pollution monitoring platforms play a very important role in preventing and mitigating the effects of pollution. Recent advances in the field of graph signal processing have made it possible to describe and analyze air pollution monitoring networks using graphs. One of the main applications is the reconstruction of the measured signal on a graph using a subset of sensors. Reconstructing the signal from information at neighboring sensors is a key technique for maintaining network data quality, with examples including filling in missing data from correlated neighboring nodes, creating virtual sensors, or correcting a drifting sensor with more accurate neighboring sensors. This paper proposes a signal reconstruction framework for air pollution monitoring data in which a graph signal reconstruction model is superimposed on a graph learned from the data. Different graph signal reconstruction methods are compared on real air pollution data sets measuring O3, NO2, and PM10. The ability of the methods to reconstruct the signal of a pollutant is shown, as well as the computational cost of this reconstruction. The results indicate the superiority of kernel-based graph signal reconstruction methods, as well as the difficulty these methods have in scaling to an air pollution monitoring network with a large number of low-cost sensors. However, we show that the scalability of the framework can be improved with simple methods, such as partitioning the network with a clustering algorithm. This work is supported by the Spanish national project PID2019-107910RB-I00, by regional project 2017SGR-990, and with the support of the Secretaria d'Universitats i Recerca de la Generalitat de Catalunya i del Fons Social Europeu. Peer Reviewed. Postprint (author's final draft).
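The paper's results favor kernel-based reconstruction; as a simpler stand-in that illustrates the same idea of reconstructing a signal on a learned graph from a subset of sensors, one common baseline is Laplacian-regularized (Tikhonov) reconstruction, which has a closed-form solution. A hedged sketch, assuming the graph Laplacian is already available:

```python
import numpy as np

def reconstruct_graph_signal(L, sampled_idx, y, lam=0.1):
    """Tikhonov graph signal reconstruction:
      x* = argmin_x ||x[S] - y||^2 + lam * x^T L x,
    i.e. fit the observed nodes while keeping the signal smooth
    on the graph. L: (n, n) graph Laplacian; sampled_idx: observed
    node indices; y: their measurements. Closed-form linear solve."""
    n = L.shape[0]
    M = np.zeros((len(sampled_idx), n))          # sampling operator
    M[np.arange(len(sampled_idx)), sampled_idx] = 1.0
    A = M.T @ M + lam * L                        # PD for connected graphs
    return np.linalg.solve(A, M.T @ y)
```

Since the cost is a dense linear solve in the number of nodes, this already hints at the scaling issue the abstract mentions, and why partitioning the network into clusters helps.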
Volterra graph-based outlier detection for air pollution sensor networks
Today's air pollution sensor networks pose new challenges given their mix of low-cost sensors and high-cost instrumentation. Recently, with the advent of graph signal processing, sensor network measurements have been successfully represented by graphs depicting the relationships between sensors. However, one of the main problems of these sensor networks is their reliability, especially due to the inclusion of low-cost sensors, so the detection and identification of outliers is extremely important for maintaining the quality of the network data. To better identify outliers among the sensors composing a network, we propose the Volterra graph-based outlier detection (VGOD) mechanism, which uses a graph learned from data and a Volterra-like graph signal reconstruction model to detect and localize abnormal measurements in air pollution sensor networks. The proposed unsupervised decision process is compared with other outlier detection methods, both state-of-the-art graph-based and non-graph-based, showing improvements in both the detection and the localization of anomalous measurements, so that anomalous measurements can be corrected and malfunctioning sensors replaced. This work is supported by the Spanish national project PID2019-107910RB-I00, by regional project 2017SGR-990, and with the support of the Secretaria d'Universitats i Recerca de la Generalitat de Catalunya i del Fons Social Europeu. Peer Reviewed. Postprint (author's final draft).
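VGOD couples a learned graph with a Volterra-like reconstruction model; the overall decision logic (predict each sensor from its neighbors, then flag large residuals) can be illustrated with a much simpler neighbor-average predictor. This is a hedged stand-in for exposition, not the paper's method:

```python
import numpy as np

def flag_outliers(W, X, z_thresh=3.0):
    """Residual-based outlier flagging (simplified stand-in for the
    Volterra-like model in VGOD). W: (n, n) graph adjacency weights;
    X: (T, n) sensor time series. Each node is predicted as the
    weighted average of its neighbors; samples whose residual
    z-score exceeds z_thresh are flagged."""
    Wn = W / W.sum(axis=1, keepdims=True)     # row-normalized weights
    pred = X @ Wn.T                           # neighbor-based prediction
    resid = X - pred                          # reconstruction residual
    z = (resid - resid.mean(axis=0)) / (resid.std(axis=0) + 1e-12)
    return np.abs(z) > z_thresh
```

One design point this makes visible: a spike at one node also perturbs its neighbors' predictions, so localization (deciding *which* sensor misbehaved, not just *when*) is the harder part that the paper's mechanism targets.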
Raw data collected from NO2, O3 and NO air pollution electrochemical low-cost sensors
Recently, the monitoring of air pollution by means of low-cost sensors has become a growing research field, driven by the study of machine learning techniques to improve the sensors' data quality. For this purpose, sensors undergo a calibration process in which they are placed in situ near a regulatory reference station. The data set described in this paper contains data from two self-built low-cost air pollution nodes deployed for four months, from January 16, 2021 to May 15, 2021, at an official air quality reference station in Barcelona, Spain. The goal of the deployment was to have five electrochemical sensors sampling at a high rate of 0.5 Hz: two NO2 sensors, two O3 sensors, and one NO sensor. It should be noted that the reference stations publish air pollution data every hour, thus at a rate of 2.7 × 10⁻⁴ Hz. In addition, the nodes also captured temperature and relative humidity data, which are typically used as correctors in the calibration of low-cost sensors. The availability of the sensors' time series at this high resolution is important for analysis from a signal processing perspective, allowing the study of sensor sampling strategies, sensor signal filtering, and the calibration of low-cost sensors, among others. This work was supported by Spanish national project PID2019-107910RB-I00 and regional project 2017SGR-990, and with the support of the Secretaria d'Universitats i Recerca de la Generalitat de Catalunya i del Fons Social Europeu. Peer Reviewed. Postprint (published version).
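To use such a data set against hourly reference records, the 0.5 Hz stream must be aggregated to the reference rate. A minimal sketch with pandas, where the frame layout and column name are hypothetical (the actual data set's schema may differ):

```python
import numpy as np
import pandas as pd

# Hypothetical raw frame: one row every 2 s (0.5 Hz) for one hour,
# a single illustrative sensor channel.
idx = pd.date_range("2021-01-16", periods=1800, freq="2s")
raw = pd.DataFrame(
    {"no2_we_mv": np.random.default_rng(1).normal(300.0, 5.0, 1800)},
    index=idx,
)

# Hourly means align the 0.5 Hz sensor stream with the reference
# station's hourly (≈2.7 × 10⁻⁴ Hz) publication rate.
hourly = raw.resample("1h").mean()
```

Keeping the raw 0.5 Hz series alongside the hourly aggregate is what enables the filtering and sampling-strategy studies the abstract mentions.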