Aerial base stations with opportunistic links for next generation emergency communications
Rapidly deployable and reliable mission-critical communication networks are fundamental requirements for guaranteeing the successful operations of public safety officers during disaster recovery and crisis management preparedness. The ABSOLUTE project focused on designing, prototyping, and demonstrating a high-capacity IP mobile data network with low latency and large coverage, suitable for many forms of multimedia delivery including public safety scenarios. The ABSOLUTE project combines aerial, terrestrial, and satellite communication networks to provide a robust standalone system able to deliver resilient communications. This article describes the main outcomes of the ABSOLUTE project in terms of network and system architecture, regulations, and the implementation of aerial base stations, portable land mobile units, satellite backhauling, S-MIM satellite messaging, and multimode user equipment.
Decentralized Spectrum Learning for IoT Wireless Networks Collision Mitigation
This paper describes the principles and implementation results of reinforcement learning algorithms on IoT devices for radio collision mitigation in the unlicensed ISM bands. Learning is used here to improve both the IoT network's capability to support a larger number of objects and the autonomy of IoT devices. We first illustrate the efficiency of the proposed approach in a proof of concept based on USRP software radio platforms operating on real radio signals, showing how collisions with other RF signals present in the ISM band are reduced for a given IoT device. We then describe the first implementation of learning algorithms on LoRa devices operating in a real LoRaWAN network, which we named IoTligent. The proposed solution adds no processing overhead, so it can run on the IoT devices, and no network overhead, so no change is required to LoRaWAN. Real-life experiments carried out in a realistic LoRa network show that the battery life of an IoTligent device can be extended by a factor of 2 in the scenarios encountered during our experiments.
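The learning loop sketched above (an IoT device picking its uplink channel from its own acknowledgement feedback) can be illustrated with a minimal multi-armed-bandit example. This is a sketch, not the IoTligent implementation: the channel count, the ACK-based reward, and the transmit_on helper with its success probabilities are illustrative assumptions.

```python
import math
import random

class UCB1ChannelSelector:
    """Minimal UCB1 bandit: each arm is one uplink channel.

    Reward is 1 when the gateway acknowledges the frame, 0 otherwise.
    """

    def __init__(self, n_channels):
        self.counts = [0] * n_channels      # transmissions per channel
        self.rewards = [0.0] * n_channels   # cumulative reward per channel
        self.t = 0                          # total number of transmissions

    def select(self):
        self.t += 1
        # Try each channel once before using the UCB index.
        for ch, n in enumerate(self.counts):
            if n == 0:
                return ch
        # UCB1 index: empirical success rate + exploration bonus.
        return max(
            range(len(self.counts)),
            key=lambda ch: self.rewards[ch] / self.counts[ch]
                           + math.sqrt(2 * math.log(self.t) / self.counts[ch]),
        )

    def update(self, channel, acked):
        self.counts[channel] += 1
        self.rewards[channel] += 1.0 if acked else 0.0


def transmit_on(channel):
    """Hypothetical stand-in for a real uplink; channel 2 is assumed least congested."""
    success_prob = [0.4, 0.5, 0.8, 0.3][channel]
    return random.random() < success_prob


selector = UCB1ChannelSelector(n_channels=4)
for _ in range(1000):
    ch = selector.select()
    selector.update(ch, acked=transmit_on(ch))

print("transmissions per channel:", selector.counts)  # most should land on channel 2
```

Because the state kept per channel is just two numbers, this kind of policy fits the "no processing overhead" constraint the abstract mentions for constrained LoRa devices.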
Upper-Confidence Bound for Channel Selection in LPWA Networks with Retransmissions
In this paper, we propose and evaluate different learning strategies based on Multi-Armed Bandit (MAB) algorithms. They allow Internet of Things (IoT) devices to improve their access to the network and their autonomy, while taking into account the impact of encountered radio collisions. To that end, several heuristics employing Upper-Confidence Bound (UCB) algorithms are examined to explore the contextual information provided by the number of retransmissions. Our results show that approaches based on UCB obtain a significant improvement in terms of successful transmission probabilities. Furthermore, they also reveal that pure UCB channel access is as efficient as more sophisticated learning strategies.

Comment: The source code (MATLAB or Octave) used for the simulations and the figures is open-sourced under the MIT License, at Bitbucket.org/scee_ietr/ucb_smart_retran
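One simple way to fold the retransmission count into a UCB policy, shown below as an assumption rather than as the paper's own heuristics, is to keep a separate UCB instance per retransmission index, so a device can learn a different channel preference for first attempts and for retries.

```python
import math

class UCB:
    """Plain UCB1 over a fixed set of channels."""

    def __init__(self, n_channels):
        self.counts = [0] * n_channels
        self.rewards = [0.0] * n_channels
        self.t = 0

    def select(self):
        self.t += 1
        for ch, n in enumerate(self.counts):
            if n == 0:
                return ch
        return max(range(len(self.counts)),
                   key=lambda ch: self.rewards[ch] / self.counts[ch]
                                  + math.sqrt(2 * math.log(self.t) / self.counts[ch]))

    def update(self, ch, reward):
        self.counts[ch] += 1
        self.rewards[ch] += reward


class RetransmissionAwareUCB:
    """One UCB bandit per retransmission index (0 = first attempt).

    The device can thus learn, for example, to retry on a different
    channel than the one it prefers for first attempts.
    """

    def __init__(self, n_channels, max_retries):
        self.banks = [UCB(n_channels) for _ in range(max_retries + 1)]

    def _bank(self, retry_index):
        return self.banks[min(retry_index, len(self.banks) - 1)]

    def select(self, retry_index):
        return self._bank(retry_index).select()

    def update(self, retry_index, channel, acked):
        self._bank(retry_index).update(channel, 1.0 if acked else 0.0)


# Example usage (hypothetical values).
policy = RetransmissionAwareUCB(n_channels=4, max_retries=3)
ch = policy.select(retry_index=0)            # channel for the first attempt
policy.update(retry_index=0, channel=ch, acked=False)
ch_retry = policy.select(retry_index=1)      # possibly a different channel for the retry
```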
Performance evaluation of synergic operation of algorithms enabling opportunistic networks - D4.3
Deliverable D4.3 of the OneFIT project. Preprint
Designing and Implementing Future Aerial Communication Networks
Providing "connectivity from the sky" is the new innovative trend in wireless communications. High- and low-altitude platforms, drones, aircraft, and airships are being considered as candidates for deploying wireless communications complementing the terrestrial communication infrastructure. In this article, we give a detailed account of the design and implementation challenges of an aerial network consisting of LTE Advanced (LTE-A) base stations. In particular, we review the achievements and innovations harnessed by an aerial network composed of Helikite platforms. Helikites can be raised into the sky to bring Internet access during special events and in the aftermath of an emergency. The trial phase of the system, mounting LTE-A technology on board Helikites to serve users on the ground, proved not only very encouraging but also showed that such a system could offer an even longer-lasting solution, provided that the inefficiency in powering the radio frequency equipment on the Helikite can be overcome.

Comment: IEEE Communications Magazine 201
A review of the use of artificial intelligence methods in infrastructure systems
The artificial intelligence (AI) revolution offers significant opportunities to capitalise on the growth of digitalisation and has the potential to enable the ‘system of systems’ approach required in increasingly complex infrastructure systems. This paper reviews the extent to which research in economic infrastructure sectors has engaged with fields of AI, to investigate the specific AI methods chosen and the purposes to which they have been applied both within and across sectors. Machine learning is found to dominate the research in this field, with methods such as artificial neural networks, support vector machines, and random forests among the most popular. The automated reasoning technique of fuzzy logic has also seen widespread use, due to its ability to incorporate uncertainties in input variables. Across the infrastructure sectors of energy, water and wastewater, transport, and telecommunications, the main purposes to which AI has been applied are network provision, forecasting, routing, maintenance and security, and network quality management. The data-driven nature of AI offers significant flexibility, and work has been conducted across a range of network sizes and at different temporal and geographic scales. However, there remains a lack of integration of planning and policy concerns, such as stakeholder engagement and quantitative feasibility assessment, and the majority of research focuses on a specific type of infrastructure, with an absence of work beyond individual economic sectors. To enable solutions to be implemented into real-world infrastructure systems, research will need to move away from a siloed perspective and adopt a more interdisciplinary perspective that considers the increasing interconnectedness of these systems.
Neural network design for intelligent mobile network optimisation
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London.

Mobile network users' demands for data services are increasing exponentially, driven by two main factors: the evolution of smartphones and their applications, and emerging technologies such as the Internet of Things and smart cities, which keep pumping more data into the network, even though most of the data routed in current mobile networks is non-live data. This growth in demand obliges mobile network operators to keep improving their networks, by adding hardware, increasing resources, or a combination of both. Radio resources are strictly limited by spectrum licensing and availability, so efficient spectrum utilization is a major goal for both network operators and developers. Simultaneous and multiple channel access, and adding more cells to the network, are ways to increase the data exchanged between network nodes. The current 4G mobile system is based on Orthogonal Frequency Division Multiple Access (OFDMA) for accessing the medium, and intercell interference degrades link quality at the cell edge; with the introduction of the heterogeneity concept into LTE in 3GPP Release 10, the handover process became even more complex. To mitigate intercell interference at the cell edge, coordinated multipoint and carrier aggregation techniques are utilized for dual connectivity.

This work focuses on designing and proposing enhancements to improve network performance and sustainability. These features comprise distributing small cells for data-only transmission, evaluating the performance of handover schemes at the cell edge with dual connectivity, and applying Artificial Intelligence technology for load balancing and prediction. In the proposed design, the data and control of the Small eNodeB (SeNodeB) are processed at the network edge using a Mobile Edge Computing (MEC) server, and the SeNodeBs are used to boost the services provided to users; the concept of caching data is also investigated, with caching units implemented at different network levels. The proposed system and resource management are simulated using the OPNET Modeler and evaluated through multiple scenarios with and without full load. The UE is reconfigured to accommodate dual connectivity with two separate connections for uplink and downlink: it maintains its connection to the macro cell via the uplink, while the downlink is dedicated to small cells when content is requested from the cache. The results clearly show that the proposed system can decrease latency, while the total throughput delivered by the network improves markedly when SeNodeBs are deployed; rising throughput increases overall capacity, which allows better services to be provided to existing users or more users to join and benefit from the network. Handover improvement is also considered in this work: with the help of two Artificial Intelligence (AI) entities, better handover performance is achieved.
A balanced load over the SeNodeBs results in less frequent handovers. The proposed load balancer is based on an artificial neural network clustering model with a self-organizing map as a hidden layer; it is trained to forecast the network condition and learns to reduce the number of handovers, especially for UEs at the cell edge, by performing only the necessary ones and avoiding handovers to the macro cell in the downlink direction. The examined handovers concern the downlink when routing non-live video stored in the small cells' caches, and a reduction in frequent handovers was achieved when running the balancer. Staying with handover, another way to preserve and utilize network resources is to predict handovers before they occur and allocate the required data in the target SeNodeB. The predictor entity in the proposed system architecture combines the features of a Radial Basis Function neural network and a neural network time series tool to create and update a prediction list from the system's collected data, learning to predict the next SeNodeB to associate with. The prediction entity is simulated using MATLAB, and the results show that the system was able to deliver up to 92% correct handover predictions, which led to an overall throughput improvement of 75%.
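To make the prediction idea concrete, the sketch below shows a minimal Radial Basis Function classifier that maps a short serving-cell history to the next SeNodeB. It is an illustration under stated assumptions, not the thesis's MATLAB implementation: the one-hot history features, the synthetic mobility trace, and the ridge least-squares training of the output layer are all choices made here for the example.

```python
import numpy as np

class RBFCellPredictor:
    """Minimal RBF-network classifier for next-cell prediction (illustrative).

    Centres are a random subset of training samples; Gaussian activations
    feed an output layer fitted by ridge-regularised least squares.
    """

    def __init__(self, n_centres=20, gamma=1.0, ridge=1e-3, seed=0):
        self.n_centres = n_centres
        self.gamma = gamma
        self.ridge = ridge
        self.rng = np.random.default_rng(seed)

    def _phi(self, X):
        # Gaussian RBF activations with respect to the stored centres.
        d2 = ((X[:, None, :] - self.centres[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-self.gamma * d2)

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        idx = self.rng.choice(len(X), size=min(self.n_centres, len(X)), replace=False)
        self.centres = X[idx]
        Phi = self._phi(X)
        # One-hot targets, ridge least squares for the output weights.
        T = (y[:, None] == self.classes_[None, :]).astype(float)
        A = Phi.T @ Phi + self.ridge * np.eye(Phi.shape[1])
        self.W = np.linalg.solve(A, Phi.T @ T)
        return self

    def predict(self, X):
        return self.classes_[np.argmax(self._phi(X) @ self.W, axis=1)]


# Toy usage: a synthetic trace where the UE mostly moves to the "next" cell.
n_cells, history = 6, 3
rng = np.random.default_rng(1)
trace = [0]
for _ in range(499):
    nxt = (trace[-1] + 1) % n_cells if rng.random() < 0.9 else rng.integers(0, n_cells)
    trace.append(int(nxt))
trace = np.array(trace)
X = np.stack([np.eye(n_cells)[trace[i:i + history]].ravel()
              for i in range(len(trace) - history)])
y = trace[history:]
pred = RBFCellPredictor(n_centres=40).fit(X, y).predict(X)
print("training accuracy:", round(float((pred == y).mean()), 2))
```

In the thesis's setting, the labels would come from observed handover targets rather than a synthetic trace, and the predicted SeNodeB would be used to pre-stage cached content before the handover occurs.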