2 research outputs found

    On the importance and feasibility of forecasting data in sensors

    The first generation of wireless sensor nodes has constrained energy resources and computational power, which discourages applications from performing any task other than measuring and transmitting data to a central server. Nowadays, however, sensor networks tend to be incorporated into the Internet of Things, and the evolution of the hardware may change the old strategy of avoiding data computation in the sensor nodes. In this paper, we show the importance of reducing the number of transmissions in sensor networks and present the use of forecasting methods as a way to achieve it. Experiments using real sensor data show that state-of-the-art forecasting methods can be successfully implemented in the sensor nodes to preserve the quality of their measurements and reduce their transmissions by up to 30%, lowering the channel utilization. We conclude that the old paradigm of always transmitting a measurement when it differs by more than a threshold from the last one transmitted is no longer the most beneficial. Adopting more complex forecasting methods in the sensor nodes is an alternative that significantly reduces the number of transmissions without compromising the quality of the measurements, and therefore supports the exponential growth of the Internet of Things.
    Comment: 30 pages and 12 figures. This paper has been submitted to the Transactions on Mobile Computing journal.
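    As a rough illustration of the two transmission rules discussed above, the Python sketch below counts how many packets each strategy would send on the same trace of readings: the classic rule transmits whenever a reading differs from the last transmitted one by more than a threshold, while the forecast-based rule transmits only when the node's local forecast misses by more than that threshold. The linear-extrapolation forecaster, the argument names, and the threshold value are toy assumptions for the example, not the state-of-the-art forecasting methods evaluated in the paper.

    def transmissions_last_value(readings, threshold):
        """Classic rule: transmit whenever the reading differs from the last
        transmitted one by more than the threshold."""
        sent, last_sent = 0, None
        for x in readings:
            if last_sent is None or abs(x - last_sent) > threshold:
                last_sent = x
                sent += 1
        return sent

    def transmissions_forecast(readings, threshold):
        """Forecast-based rule: transmit only when the reading deviates from the
        node's local forecast by more than the threshold.  When a reading is
        suppressed, the model is updated with the forecast itself, i.e. with the
        value a central server running the same model would assume."""
        sent = 0
        assumed = []                      # values the central server is assumed to hold
        for x in readings:
            forecast = 2 * assumed[-1] - assumed[-2] if len(assumed) >= 2 else None
            if forecast is None or abs(x - forecast) > threshold:
                sent += 1
                assumed.append(x)         # transmitted: server learns the true reading
            else:
                assumed.append(forecast)  # suppressed: server continues its own forecast
            assumed = assumed[-2:]        # the toy model only needs the last two values
        return sent

    Counting transmissions under each rule on the same trace gives a rough feel for the possible savings; the exact reduction depends on the signal and on the forecasting method, which the paper quantifies at up to 30% with state-of-the-art forecasters.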

    The Impact of Dual Prediction Schemes on the Reduction of the Number of Transmissions in Sensor Networks

    Future Internet of Things (IoT) applications will require that billions of wireless devices transmit data to the cloud frequently. However, wireless medium access has been identified as a problem for the next generations of wireless networks; hence, the number of data transmissions in Wireless Sensor Networks (WSNs) can quickly become a bottleneck, disrupting the exponential growth in the number of interconnected devices, sensors, and the amount of produced data. Therefore, keeping the number of data transmissions low is critical to incorporate new sensor nodes and measure a great variety of parameters in future generations of WSNs. Thanks to the high accuracy and low complexity of state-of-the-art forecasting algorithms, Dual Prediction Schemes (DPSs) are potential candidates to optimize the data transmissions in WSNs at the finest level, because they allow sensor nodes to avoid unnecessary transmissions without affecting the quality of their measurements. In this work, we present a sensor network model that uses statistical theorems to describe the expected impact of DPSs and data aggregation in WSNs. We aim to provide a foundation for future work by characterizing the theoretical gains of processing data in sensors and conditioning their transmission on the predictions' accuracy. Our simulation results show that the number of transmissions can be reduced by almost 98% in the sensor nodes with the highest workload. We also detail the impact of predicting and aggregating transmissions according to parameters that can be observed in common scenarios, such as the sensor nodes' transmission ranges, the correlation between measurements of different sensors, and the period between two consecutive measurements in a sensor.
    Comment: 30 pages, 8 figures.
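    A Dual Prediction Scheme avoids transmissions without degrading measurement quality because the sensor and the sink run identical forecasting models and update them with the same values, so the sink can reconstruct every suppressed reading within an agreed error bound. The Python sketch below is a minimal illustration of that mechanism under assumptions made for the example (a linear-extrapolation predictor, the EPSILON threshold, and a synthetic signal); it is not the statistical model developed in the paper.

    import math

    EPSILON = 0.5  # maximum tolerated deviation between a reading and its forecast

    class Predictor:
        """Toy forecaster: linear extrapolation from the last two agreed values.
        The sensor and the sink each hold one instance and feed it identically."""
        def __init__(self):
            self.history = []

        def forecast(self):
            if len(self.history) < 2:
                return None
            return 2 * self.history[-1] - self.history[-2]

        def update(self, value):
            self.history = (self.history + [value])[-2:]

    def run_dps(readings):
        sensor, sink = Predictor(), Predictor()
        sent, reconstructed = 0, []
        for x in readings:
            pred = sensor.forecast()
            if pred is None or abs(x - pred) > EPSILON:
                sent += 1                  # forecast too far off: transmit the reading
                sensor.update(x)
                sink.update(x)
                reconstructed.append(x)
            else:
                sensor.update(pred)        # both sides keep the forecast, so the
                sink.update(pred)          # two models never drift apart
                reconstructed.append(pred)
        return sent, reconstructed

    if __name__ == "__main__":
        data = [20 + 2 * math.sin(i / 25) for i in range(1000)]   # synthetic signal
        sent, recon = run_dps(data)
        worst = max(abs(a - b) for a, b in zip(data, recon))
        print(f"transmitted {sent}/{len(data)} readings, worst error {worst:.3f}")

    By construction, the worst reconstruction error at the sink can never exceed EPSILON: a suppressed reading is, by the transmission rule, within EPSILON of the forecast that the sink uses in its place.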