1,243 research outputs found

    A survey of self organisation in future cellular networks

    This article surveys the literature of the last decade on the emerging field of self-organisation as applied to wireless cellular communication networks. Self-organisation has been extensively studied and applied in ad hoc networks, wireless sensor networks and autonomic computer networks; however, in the context of wireless cellular networks, this is the first attempt to put the various efforts in perspective in the form of a tutorial/survey. We provide a comprehensive survey of the existing literature, projects and standards in self-organising cellular networks. We also aim to present a clear understanding of this active research area, identifying a clear taxonomy and guidelines for the design of self-organising mechanisms. We compare the strengths and weaknesses of existing solutions and highlight the key research areas for further development. This paper serves as a guide and a starting point for anyone willing to delve into research on self-organisation in wireless cellular communication networks.

    Wireless transmission protocols using relays for broadcast and information exchange channels

    Relays have been used to overcome network performance bottlenecks in meeting the growing demand for large bandwidth and high quality of service (QoS) in wireless networks. This thesis proposes several wireless transmission protocols using relays in practical multi-user broadcast and information exchange channels. The main theme is to demonstrate that efficient use of relays provides an additional dimension for improving reliability, throughput, power efficiency and secrecy. First, a spectrally efficient cooperative transmission protocol is proposed for the multiple-input and single-output (MISO) broadcast channel to improve the reliability of wireless transmission. The proposed protocol mitigates co-channel interference and provides another dimension for improving the diversity gain. Analytical and simulation results show that the proposed cooperative protocol outperforms the non-cooperative scheme in outage probability and in the diversity and multiplexing tradeoff. Second, a two-way relaying protocol is proposed for the multi-pair two-way relaying channel to improve throughput and reliability. The proposed protocol enables both the users and the relay to participate in interference cancellation, and several beamforming schemes are proposed for the multi-antenna relay. Analytical and simulation results reveal that the proposed protocol delivers significant improvements in ergodic capacity, outage probability and the diversity and multiplexing tradeoff compared to existing schemes. Third, a joint beamforming and power management scheme is proposed for the multiple-input and multiple-output (MIMO) two-way relaying channel to improve the sum-rate. Network power allocation and power control optimisation problems are formulated and solved using convex optimisation techniques. Simulation results verify that the proposed scheme delivers a better sum-rate or consumes less power when compared to existing schemes.
Fourth, two-way secrecy schemes that combine one-time pad and wiretap coding are proposed for the scalar broadcast channel to improve the secrecy rate. The proposed schemes utilise channel reciprocity and employ relays to forward secret messages. Analytical and simulation results reveal that the proposed schemes achieve positive secrecy rates even when the number of users is large. All of these wireless transmission protocols realise better throughput, reliability, power efficiency and secrecy for wireless broadcast and information exchange channels through the efficient use of relays.
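The one-time pad component of the secrecy schemes can be illustrated with a minimal sketch. This is not the thesis's construction: here the shared pad is simply random bytes standing in for a key that both link ends could derive from reciprocal channel measurements, and the relay merely forwards the ciphertext, which reveals nothing without the pad.

```python
# Hedged toy example of a one-time pad over a relayed link.
# The key source and message are invented; in the thesis's setting the
# pad would come from reciprocal channel measurements, not token_bytes.
import secrets

def xor_pad(message: bytes, key: bytes) -> bytes:
    """XOR a message with a same-length (or longer) one-time pad key."""
    assert len(key) >= len(message)
    return bytes(m ^ k for m, k in zip(message, key))

# Shared secret both link ends hold (stand-in for a reciprocity-derived key).
key = secrets.token_bytes(16)
plaintext = b"secret message!!"            # 16 bytes
ciphertext = xor_pad(plaintext, key)       # what the relay forwards
recovered = xor_pad(ciphertext, key)       # receiver reapplies the pad
assert recovered == plaintext
```

An eavesdropper observing only the relayed ciphertext learns nothing about the plaintext, which is why the schemes can sustain positive secrecy rates as the user count grows.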

    Analysis and Design of Non-Orthogonal Multiple Access (NOMA) Techniques for Next Generation Wireless Communication Systems

    The current surge in wireless connectivity, anticipated to amplify significantly in future wireless technologies, brings a new wave of users. Given the impracticality of endlessly expanding bandwidth, there is a pressing need for communication techniques that efficiently serve this burgeoning user base with limited resources. Multiple Access (MA) techniques, notably Orthogonal Multiple Access (OMA), have long addressed bandwidth constraints. However, with escalating user numbers, OMA's orthogonality becomes limiting for emerging wireless technologies. Non-Orthogonal Multiple Access (NOMA), employing superposition coding, serves more users within the same bandwidth as OMA by allocating different power levels to users, whose superimposed signals can then be separated at the receiver by exploiting the power gap between them, thus offering superior spectral efficiency and massive connectivity. This thesis examines the integration of NOMA techniques with cooperative relaying, EXtrinsic Information Transfer (EXIT) chart analysis, and deep learning for enhancing 6G and beyond communication systems. The adopted methodology aims to optimize system performance, spanning from bit-error rate (BER) versus signal-to-noise ratio (SNR) to overall system efficiency and data rates. In the cooperative relaying context, NOMA notably improved diversity gains, demonstrating the superiority of combining NOMA with cooperative relaying over NOMA alone. With EXIT chart analysis, NOMA achieved low BER at mid-range SNR as well as optimal user fairness in the power allocation stage. Additionally, a trained neural network enhanced signal detection for NOMA in the deep learning scenario, producing a simpler signal detection for NOMA that addresses NOMA's complex receiver problem.
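The power-domain superposition and successive interference cancellation (SIC) the abstract describes can be sketched in a few lines. This is a hedged toy example, not the thesis's system model: two BPSK users, an invented 0.8/0.2 power split, and a noise-free channel so that both detections succeed exactly.

```python
import numpy as np

# Toy 2-user power-domain NOMA with SIC. Power split, symbol count and
# the noise-free channel are illustrative assumptions for clarity.
rng = np.random.default_rng(0)
p_far, p_near = 0.8, 0.2          # more power to the far (weak) user
s_far = rng.choice([-1.0, 1.0], size=8)
s_near = rng.choice([-1.0, 1.0], size=8)

# Superposition coding: both users share the same resource block.
x = np.sqrt(p_far) * s_far + np.sqrt(p_near) * s_near

# Far user: decodes its own symbol directly, treating the weaker
# near-user component as noise (0.894 > 0.447, so the sign is s_far).
far_hat = np.sign(x)

# Near user (SIC): decode the far symbol, subtract it, then decode its own.
far_est = np.sign(x)
residual = x - np.sqrt(p_far) * far_est
near_hat = np.sign(residual)

assert np.array_equal(far_hat, s_far)
assert np.array_equal(near_hat, s_near)
```

With noise, the far user's direct detection and the near user's cancellation step would each have an error probability, which is where the BER-versus-SNR analysis in the thesis comes in.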

    Adaptive Resource Allocation Algorithms For Data And Energy Integrated Networks Supporting Internet of Things

    According to forecasts, around 2.1 billion IoT devices were expected to be connected to the network by 2022. This rapid growth puts enormous pressure on energy management, as most of these devices are battery-powered. Moreover, in some scenarios the IoT nodes are fitted in extreme environments. For example, a large-scale IoT pressure-sensor system is deployed underneath a floor to detect people moving across it; a density-viscosity sensor is deployed inside a fermenting vat to track variations in density and viscosity for monitoring wine fermentation; and a strain-distribution wireless sensor for detecting crack formation in a bridge is deployed underneath the bridge, attached near the welded parts of the steel. Such environments are difficult for people to access, so energy management, namely replacing the batteries of the rapidly growing number of IoT sensors in extreme environments, becomes even more challenging. To reduce the frequency of battery replacement, the thesis proposes a self-managing Data and Energy Integrated Network (DEIN) system, which provides a stable and controllable ambient RF source to charge battery-less IoT wireless devices. It embraces an adaptive energy management mechanism that automatically maintains the energy level of the battery-less IoT wireless devices, keeping them within a workable voltage range of 2.9 to 4.0 volts. In the DEIN system, RF energy transmission is achieved by transmitting specially designed packets with enhanced transmission power. However, this partly occupies bandwidth previously used only for wireless information transmission. Hence, a scheduling cycle mechanism is proposed in the thesis for organizing RF energy and wireless information transmission in separate time slots.
In addition, a bandwidth allocation algorithm is proposed to minimize the bandwidth used for RF energy transmission and thereby maximize the throughput of wireless information. To harvest RF energy, RF-to-DC energy conversion is essential at the receiver side. With existing technologies, the RF-to-DC energy converter is normally realized as a voltage rectifier built from multiple Schottky diodes and capacitors. Research shows that a maximum RF-to-DC conversion efficiency of 84% is obtained when comparing a variety of wireless bands for transmitting RF energy. Furthermore, energy is lost in the air while transmitting RF energy to the receiver, and circuit losses occur when the harvested energy is utilized by electronic components. Hence, the thesis considers how to improve the efficiency of RF energy utilization. In the scenario proposed in the thesis, the harvested energy is mainly consumed for uplink transmission. A resource allocation algorithm is therefore proposed to minimize the system's energy consumption per bit of uplink data; it works out the optimal transmission power for RF energy as well as the bandwidth allocated to RF energy and wireless information transmission. Among existing RF energy transmission and harvesting products on the market, Powercast uses a supercapacitor to store the harvested RF energy. Due to the lack of a self-controlled energy management mechanism for the embedded sensor, the harvested energy is consumed quickly and the system has to keep transmitting RF energy. Existing works have proposed energy-saving methods for IoT wireless devices, such as putting them in sleep mode and reducing transmission power. However, they are not adaptive, which is an issue for practical applications.
In the thesis, an energy-saving algorithm is designed to adaptively manage the device's transmission power for uplink data transmission. The algorithm balances the trade-off between transmission power and packet loss rate: it finds the optimal transmission power that minimizes the average energy cost of uplink data transmission, which conserves the harvested energy, reduces the frequency of RF energy transmission, and frees more bandwidth for wireless information.
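The power/packet-loss trade-off described above can be sketched numerically. In this hedged Python example the loss model, circuit power and all constants are invented stand-ins, not values from the thesis: expected energy per delivered packet rises at low power (many retransmissions) and at high power (wasted transmit energy), so an interior optimum exists.

```python
import math

# Hypothetical model: loss(p) = exp(-p / P0), plus a fixed circuit
# power drawn on every attempt. None of these numbers come from the thesis.
P0 = 5.0                      # mW, invented channel constant
P_CIRCUIT = 10.0              # mW, invented circuit power per attempt
PACKET_TIME = 0.01            # s per transmission attempt

def loss_rate(p_mw: float) -> float:
    return math.exp(-p_mw / P0)

def energy_per_packet(p_mw: float) -> float:
    # Retransmit until success: expected attempts = 1 / (1 - loss).
    attempts = 1.0 / (1.0 - loss_rate(p_mw))
    return (p_mw + P_CIRCUIT) * PACKET_TIME * attempts   # mJ per delivered packet

# Grid search for the power that minimises the expected energy cost.
candidates = [0.5 * k for k in range(1, 101)]            # 0.5 .. 50 mW
best = min(candidates, key=energy_per_packet)            # ~7.5 mW here
```

Too little power makes `attempts` explode; too much wastes transmit energy, so the minimiser sits in between, which is the balance the thesis's algorithm searches for adaptively rather than by offline grid search.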

    A survey of machine learning techniques applied to self organizing cellular networks

    In this paper, a survey of the literature of the past fifteen years on Machine Learning (ML) algorithms applied to self-organizing cellular networks is performed. For future networks to overcome current limitations and address the issues of current cellular systems, it is clear that more intelligence needs to be deployed so that a fully autonomous and flexible network can be enabled. This paper focuses on the learning perspective of Self Organizing Networks (SON) solutions and provides not only an overview of the most common ML techniques encountered in cellular networks but also a classification of each paper in terms of its learning solution, together with examples. The authors also classify each paper in terms of its self-organizing use-case and discuss how each proposed solution performed. In addition, the most commonly found ML algorithms are compared in terms of certain SON metrics, and general guidelines are proposed on when to choose each ML algorithm for each SON function. Lastly, this work provides future research directions and new paradigms that the use of more robust and intelligent algorithms, together with data gathered by operators, can bring to the cellular networks domain, fully enabling the concept of SON in the near future.

    Multiuser MIMO-OFDM for Next-Generation Wireless Systems

    This overview portrays the 40-year evolution of orthogonal frequency-division multiplexing (OFDM) research. Combining powerful multicarrier OFDM arrangements with multiple-input multiple-output (MIMO) systems has numerous benefits, which are detailed in this treatise. We continue by highlighting the limitations of conventional detection and channel estimation techniques designed for multiuser MIMO-OFDM systems in so-called rank-deficient scenarios, where the number of users supported, or the number of transmit antennas employed, exceeds the number of receiver antennas. This is often encountered in practice unless we limit the number of users granted access in the base station's or radio port's coverage area. Following a historical perspective on the associated design problems and their state-of-the-art solutions, the second half of this treatise details a range of classic multiuser detectors (MUDs) designed for MIMO-OFDM systems and characterizes their achievable performance. A further section identifies novel cutting-edge genetic algorithm (GA)-aided detector solutions, which have found numerous applications in wireless communications in recent years. In an effort to stimulate the cross-pollination of ideas across the machine learning, optimization, signal processing, and wireless communications research communities, we review the broadly applicable principles of various GA-assisted optimization techniques, which were recently also proposed for employment in multiuser MIMO-OFDM. To stimulate new research, we demonstrate that the family of GA-aided MUDs is capable of achieving near-optimum performance at a significantly lower computational complexity than that imposed by their optimum maximum-likelihood (ML) MUD counterparts. The paper concludes by outlining a range of future research options that may find their way into next-generation wireless systems.
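A GA-aided MUD of the kind surveyed above can be sketched as a search over BPSK symbol vectors x in {-1, +1}^K minimising the ML metric ||y - Hx||^2. Everything below (dimensions, population size, selection, crossover and mutation rates) is an illustrative assumption, not a configuration from the paper, and the channel is noise-free for simplicity.

```python
import numpy as np

# Toy GA-aided multiuser detection: evolve BPSK vectors toward the
# minimiser of the ML residual ||y - Hx||^2. All GA parameters invented.
rng = np.random.default_rng(1)
K, N = 4, 4                                  # users, receive antennas
H = rng.standard_normal((N, K))
x_true = rng.choice([-1.0, 1.0], size=K)
y = H @ x_true                               # noise-free received vector

def residual(x):
    return float(np.sum((y - H @ x) ** 2))

pop = rng.choice([-1.0, 1.0], size=(30, K))  # random initial population
best = min(pop, key=residual)
for _ in range(40):                          # generations
    # Truncation selection: keep the fitter half of the population.
    pop = np.array(sorted(pop, key=residual)[:15])
    # Uniform crossover between random parent pairs, then bit-flip mutation.
    parents = pop[rng.integers(0, 15, size=(30, 2))]
    mask = rng.random((30, K)) < 0.5
    children = np.where(mask, parents[:, 0, :], parents[:, 1, :])
    flip = rng.random((30, K)) < 0.1
    children[flip] *= -1.0
    pop = children
    gen_best = min(pop, key=residual)
    if residual(gen_best) < residual(best):  # elitism: never forget the best
        best = gen_best
```

Against the exhaustive ML search over all 2^K candidates, the GA evaluates far fewer vectors, which is the complexity saving the treatise quantifies; on this tiny noise-free instance it typically lands on the exact ML solution.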

    Cooperative control of relay based cellular networks

    The increasing popularity of wireless communications and the higher data-rate requirements of new types of service place greater demands on wireless networks. Relay-based cellular networks are seen as an effective way to meet users' increased data-rate requirements while retaining the benefits of a cellular structure. However, maximizing the probability of providing service and the spectrum efficiency remain major challenges for network operators and engineers because of heterogeneous traffic demands, hard-to-predict user movements and complex traffic models. In a mobile network, load balancing is recognised as an efficient way to increase the utilization of limited frequency spectrum at reasonable cost. Cooperative control based on geographic load balancing is employed to provide flexibility for relay-based cellular networks and to respond to changes in the environment. Exploiting the capabilities of existing antenna systems, adaptive radio-frequency domain control in the physical layer is explored to provide coverage at the right place at the right time. This thesis proposes several effective and efficient approaches to improve spectrum efficiency, using network-wide optimization to coordinate the coverage offered by different network components according to the antenna models and relay station capability. The approaches include tilting antenna sectors, changing the power of omni-directional antennas, and changing the assignment of relay stations to different base stations. Experiments show that the proposed approaches offer significant improvements and robustness in heterogeneous traffic scenarios and when the propagation environment changes. The issue of predicting the consequences of cooperative decisions on antenna configurations when applied in a realistic environment is described, and a coverage prediction model is proposed.
The consequences of applying changes to the antenna configuration on handovers are analysed in detail. The performance evaluations are based on a system-level simulator in the context of Mobile WiMAX technology, but the concepts apply more generally.
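The geographic load-balancing idea, reassigning coverage so that traffic spreads across network components, can be sketched as a toy assignment problem. The topology and the greedy least-loaded rule below are invented for illustration; the thesis's approaches act on antenna tilts, powers and relay-to-base-station assignments rather than on a direct user-to-station map.

```python
# Invented toy network: each user lists the stations whose coverage
# reaches it, and a greedy pass attaches it to the least-loaded one.
reachable = {                 # user -> reachable stations (hypothetical)
    "u1": ["bs1"], "u2": ["bs1", "bs2"], "u3": ["bs1", "bs2"],
    "u4": ["bs2", "bs3"], "u5": ["bs3"], "u6": ["bs1", "bs3"],
}
load = {"bs1": 0, "bs2": 0, "bs3": 0}
attach = {}
for user, candidates in reachable.items():
    # Greedy geographic load balancing: pick the least-loaded candidate.
    station = min(candidates, key=lambda bs: load[bs])
    attach[user] = station
    load[station] += 1
```

A strongest-signal-only rule (here, always the first candidate) would pile four users onto bs1, whereas the greedy pass caps the maximum load at three; in the thesis, coverage reshaping via tilt and power changes achieves a similar redistribution indirectly.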