
    A Bayesian Packet Sharing Approach for Noisy IoT Scenarios

    Cloud computing and the Internet of Things (IoT) are two technologies that are being massively adopted in our daily life and will play a fundamental role in the future Internet. One important challenge that needs to be addressed is the enormous amount of data generated by sensing devices, which makes controlling the transmission of useless data very important. To face this challenge, there is increasing interest in predictive approaches that avoid sending highly spatio-temporally correlated data. The Belief Propagation (BP) algorithm is a method for performing approximate inference on arbitrary graphical models that is becoming increasingly popular in the context of IoT. By exploiting BP, we can derive effective methods to drastically reduce the number of transmitted messages while keeping the data throughput of the global information system high. In this paper, we propose a BP approach in a hierarchical architecture with simple nodes, gateways, and data centers. We evaluate the error bounds and propose a corrective mechanism to maintain a certain quality of the global information in the considered architecture.
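
    The following is a minimal sum-product sketch of the kind of BP message passing the abstract describes, on a toy three-sensor chain; the potentials, states, and values are illustrative assumptions, not the paper's actual model:

    # A minimal sum-product belief-propagation sketch on a 3-node chain
    # x1 -- x2 -- x3 of discrete sensor states ("low"/"high"). All
    # potentials and values here are illustrative assumptions.
    import numpy as np

    # Pairwise potential: neighbouring sensors tend to agree.
    psi = np.array([[0.8, 0.2],
                    [0.2, 0.8]])

    # Unary evidence: x1 observed "low", x3 observed "high", x2 missing.
    phi1 = np.array([0.9, 0.1])
    phi3 = np.array([0.2, 0.8])

    # Messages into x2 from each chain end (sum-product rule).
    m1_to_2 = psi.T @ phi1      # sum over x1 of psi(x1, x2) * phi1(x1)
    m3_to_2 = psi @ phi3        # sum over x3 of psi(x2, x3) * phi3(x3)

    # Belief at x2 is the normalised product of incoming messages.
    belief2 = m1_to_2 * m3_to_2
    belief2 /= belief2.sum()
    print("P(x2) =", belief2)   # marginal used in place of a transmitted reading

    A node whose neighbours can already infer its state this accurately is exactly the kind of node whose transmissions the approach suppresses.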

    Massive MIMO for Internet of Things (IoT) Connectivity

    Massive MIMO is considered one of the key technologies of the emerging 5G systems, but it is also a concept applicable to other wireless systems. Exploiting the large number of degrees of freedom (DoFs) of massive MIMO is essential for achieving high spectral efficiency, high data rates, and extreme spatial multiplexing of densely distributed users. On the one hand, the benefits of applying massive MIMO to broadband communication are well known, and there has been a large body of research on designing communication schemes to support high rates. On the other hand, using massive MIMO for Internet-of-Things (IoT) connectivity is still a developing topic, as IoT connectivity has requirements and constraints that differ significantly from those of broadband connections. In this paper, we investigate the applicability of massive MIMO to IoT connectivity. Specifically, we treat the two generic types of IoT connections envisioned in 5G: massive machine-type communication (mMTC) and ultra-reliable low-latency communication (URLLC). The paper fills this important gap by identifying the opportunities and challenges in exploiting massive MIMO for IoT connectivity. We provide insights into the trade-offs that emerge when massive MIMO is applied to mMTC or URLLC and present a number of suitable communication schemes. The discussion continues to the questions of network slicing of the wireless resources and the use of massive MIMO to simultaneously support IoT connections with very heterogeneous requirements. The main conclusion is that massive MIMO can bring benefits to scenarios with IoT connectivity, but it requires tight integration of the physical-layer techniques with the protocol design. Comment: Submitted for publication.
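
    As a rough illustration of the DoF argument, the toy sketch below estimates the post-combining SINR of a maximum-ratio-combining (MRC) uplink as the antenna count M grows; the i.i.d. Rayleigh channel model and all parameters are illustrative assumptions, not taken from the paper:

    # Toy sketch: with MRC at an M-antenna base station serving K users,
    # the post-combining SINR of each user grows roughly linearly in M.
    import numpy as np

    rng = np.random.default_rng(0)
    K, snr = 8, 1.0                       # users, per-user transmit SNR

    def mrc_sinr(M, trials=200):
        """Average post-MRC SINR of user 0 over random Rayleigh channels."""
        sinrs = []
        for _ in range(trials):
            H = (rng.normal(size=(M, K)) + 1j * rng.normal(size=(M, K))) / np.sqrt(2)
            w = H[:, 0]                   # MRC weights = user-0 channel
            sig = snr * np.abs(w.conj() @ H[:, 0]) ** 2
            interf = snr * sum(np.abs(w.conj() @ H[:, k]) ** 2 for k in range(1, K))
            noise = np.linalg.norm(w) ** 2
            sinrs.append(sig / (interf + noise))
        return np.mean(sinrs)

    for M in (16, 64, 256):
        print(M, "antennas ->", round(mrc_sinr(M), 1))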

    Resource-aware IoT Control: Saving Communication through Predictive Triggering

    The Internet of Things (IoT) interconnects multiple physical devices in large-scale networks. When the 'things' coordinate decisions and act collectively on shared information, feedback is introduced between them, and multiple feedback loops are thus closed over a shared, general-purpose network. Traditional feedback control is unsuitable for the design of IoT control because it relies on high-rate periodic communication and is ignorant of the shared network resource. Therefore, recent event-based estimation methods are applied herein for resource-aware IoT control, allowing agents to decide online whether communication with other agents is needed or not. While this can reduce network traffic significantly, a severe limitation of typical event-based approaches is the need for instantaneous triggering decisions, which leave no time to reallocate freed resources (e.g., communication slots), which hence remain unused. To address this problem, novel predictive and self-triggering protocols are proposed herein. From a unified Bayesian decision framework, two schemes are developed: self triggers, which predict, at the current triggering instant, the next one; and predictive triggers, which check at every time step whether communication will be needed at a given prediction horizon. The suitability of these triggers for feedback control is demonstrated in hardware experiments on a cart-pole system, and scalability is discussed with a multi-vehicle simulation. Comment: 16 pages, 15 figures, accepted article to appear in IEEE Internet of Things Journal. arXiv admin note: text overlap with arXiv:1609.0753
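
    A minimal sketch of the predictive-trigger idea, assuming a scalar linear system whose estimation-error variance is propagated open loop; the system, horizon, and bound below are illustrative, not the paper's experimental setup:

    # Predictive trigger sketch: propagate the error variance of a scalar
    # system x' = a*x + w and announce, H steps in advance, when it will
    # cross a tolerated bound. All numbers are illustrative assumptions.
    a, q = 1.02, 0.05            # state transition, process-noise variance
    H, bound = 5, 1.0            # prediction horizon, variance tolerance

    def predicted_variance(P, steps):
        """Error variance after `steps` updates without any communication."""
        for _ in range(steps):
            P = a * a * P + q
        return P

    P, next_slot, slots = 0.1, None, []
    for k in range(30):
        if k == next_slot:
            P, next_slot = 0.0, None   # scheduled update arrives; error resets
        if next_slot is None and predicted_variance(P, H) > bound:
            next_slot = k + H          # predictive trigger: book a slot H steps ahead
            slots.append(next_slot)
        P = a * a * P + q              # propagate error variance one step
    print("communication slots booked at:", slots)

    Booking the slot in advance is what distinguishes this from a purely event-based trigger: the network gets H steps of notice to reallocate or use the slot.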

    Review on Radio Resource Allocation Optimization in LTE/LTE-Advanced using Game Theory

    Recently, there has been a growing trend toward applying game theory (GT) to various engineering fields in order to solve optimization problems with different competing entities/contributors/players. Research in the fourth-generation (4G) wireless network field has also exploited this advanced theory to overcome long term evolution (LTE) challenges such as resource allocation, which is one of the most important research topics. In fact, an efficient design of resource allocation schemes is the key to higher performance. However, the standard does not specify the optimization approach for executing radio resource management, and it was therefore left open for study. This paper presents a survey of the existing game-theory-based solutions for the 4G-LTE radio resource allocation problem and its optimization.
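
    As a toy illustration of game-theoretic resource allocation, the sketch below runs best-response dynamics in a simple congestion game until no user can improve, i.e., a Nash equilibrium; the players, utilities, and sizes are illustrative assumptions, not a scheme from the surveyed literature:

    # Best-response dynamics: each user greedily switches to the resource
    # block giving it the largest throughput share until no one can improve.
    import random

    random.seed(1)
    N_USERS, N_BLOCKS = 10, 4
    choice = [random.randrange(N_BLOCKS) for _ in range(N_USERS)]

    def load(block, choices):
        return sum(1 for c in choices if c == block)

    def utility(user, block, choices):
        """Throughput share if `user` moved to `block` (others fixed)."""
        others = load(block, choices) - (1 if choices[user] == block else 0)
        return 1.0 / (others + 1)

    changed = True
    while changed:                      # iterate best responses to a fixed point
        changed = False
        for u in range(N_USERS):
            best = max(range(N_BLOCKS), key=lambda b: utility(u, b, choice))
            if utility(u, best, choice) > utility(u, choice[u], choice):
                choice[u] = best
                changed = True

    print("equilibrium loads:", [load(b, choice) for b in range(N_BLOCKS)])

    The loop is guaranteed to terminate here because this load-balancing game is a congestion game and therefore admits a potential function that every strict best response increases.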

    Efficient Bayesian Communication Approach For Smart Agriculture Applications

    To meet the food demand of the future, farmers are turning to the Internet of Things (IoT) for advanced analytics. In this case, data generated by sensor nodes and collected by farmers in the field provide a wealth of information about soil, seeds, crops, plant diseases, etc. Therefore, high-tech farming techniques and IoT technology offer insights into how to optimize and increase yield. However, one major challenge that must be addressed is the huge amount of data generated by the sensing devices, which makes controlling the transmission of useless data very important. To face this challenge, we present a Bayesian Inference Approach (BIA), which avoids the transmission of highly spatio-temporally correlated data. In this paper, BIA is based on the PEACH project, which aims to predict frost events in peach orchards by means of dense monitoring using low-power wireless mesh networking technology. The Belief Propagation algorithm has been chosen for performing approximate inference on our model in order to reconstruct the missing sensing data. According to different scenarios, BIA is evaluated on the data collected from real sensors deployed in the peach orchard. The results show that our proposed approach drastically reduces the amount of transmitted data and the energy consumption, while maintaining an acceptable level of data prediction accuracy.
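
    The following is a minimal sketch of the transmission-suppression idea, assuming sensor and gateway share the same trivial persistence predictor; the predictor, tolerance, and temperature trace are illustrative assumptions, not the PEACH deployment's actual model or data:

    # Sensor transmits only when its reading deviates from the shared
    # prediction by more than a tolerance; otherwise the gateway
    # reconstructs the sample from the prediction.
    readings = [2.0, 1.8, 1.5, 1.4, 1.3, -0.5, -0.8, -0.9, -1.0, -1.1]
    EPS = 0.5                   # tolerated reconstruction error (degrees C)

    sent, reconstructed, last = 0, [], readings[0]
    for t, x in enumerate(readings):
        prediction = last       # shared persistence predictor
        if t == 0 or abs(x - prediction) > EPS:
            last = x            # transmit: both sides update the model
            sent += 1
            reconstructed.append(x)
        else:
            reconstructed.append(prediction)  # gateway infers the missing value

    print(f"transmitted {sent}/{len(readings)} samples")
    print("gateway trace:", reconstructed)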

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential to support a broad range of complex, compelling applications in both military and civilian fields, where users are able to enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making because of the complex heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have had great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radio (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services as well as scenarios of future wireless networks. Comment: 46 pages, 22 figures
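
    As a toy illustration of the reviewed reinforcement-learning family, the sketch below uses a stateless (bandit-style) Q-learning update to let a cognitive-radio node learn the least-occupied channel; the channel statistics and hyper-parameters are illustrative assumptions, not drawn from the survey:

    # Stateless Q-learning for channel selection: the learner never sees
    # BUSY_PROB directly and estimates channel values from trial rewards.
    import random

    random.seed(0)
    BUSY_PROB = [0.8, 0.3, 0.6]           # unknown to the learner
    ALPHA, EPS = 0.1, 0.1                 # learning rate, exploration rate
    Q = [0.0] * len(BUSY_PROB)

    for _ in range(5000):
        if random.random() < EPS:                         # explore
            ch = random.randrange(len(Q))
        else:                                             # exploit
            ch = max(range(len(Q)), key=Q.__getitem__)
        reward = 0.0 if random.random() < BUSY_PROB[ch] else 1.0
        Q[ch] += ALPHA * (reward - Q[ch])                 # value update

    print("learned channel values:", [round(q, 2) for q in Q])
    print("preferred channel:", max(range(len(Q)), key=Q.__getitem__))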