
    Green OFDMA Resource Allocation in Cache-Enabled CRAN

    Cloud radio access network (CRAN), in which remote radio heads (RRHs) are deployed to serve users in a target area and connected to a central processor (CP) via limited-capacity links termed the fronthaul, is a promising candidate for next-generation wireless communication systems. Due to the content-centric nature of future wireless communications, it is desirable to cache popular contents beforehand at the RRHs, to reduce the burden on the fronthaul and achieve energy savings through cooperative transmission. This motivates our study in this paper of energy-efficient transmission in an orthogonal frequency division multiple access (OFDMA)-based CRAN with multiple RRHs and users, where the RRHs can prefetch popular contents. We consider a joint optimization of the user-subcarrier (SC) assignment, RRH selection and transmit power allocation over all the SCs to minimize the total transmit power of the RRHs, subject to the RRHs' individual fronthaul capacity constraints and the users' minimum rate constraints, while taking into account the caching status at the RRHs. Although the problem is non-convex, we propose a Lagrange duality based solution, which can be efficiently computed with good accuracy. By simulations, we compare the minimum transmit power required by the proposed algorithm under different caching strategies against the case without caching, and the results show significant energy savings with caching. Comment: Presented at the IEEE Online Conference on Green Communications (Online GreenComm), Nov. 2016 (Invited Paper)
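
    As a rough, illustrative sketch (the notation below is ours, not the paper's), the joint design can be viewed as a power-minimization problem of the form

    $$\begin{aligned}
    \min_{\{p_{m,n}\},\ \text{SC assignment}}\ & \sum_{m=1}^{M}\sum_{n=1}^{N} p_{m,n} \\
    \text{s.t.}\ & \sum_{n \in \mathcal{N}_k} r_n \;\ge\; \bar{r}_k \quad \forall\,\text{user } k, \\
    & \sum_{n \in \mathcal{N}_m^{\mathrm{uncached}}} r_n \;\le\; C_m \quad \forall\,\text{RRH } m,
    \end{aligned}$$

    where $p_{m,n}$ is the transmit power of RRH $m$ on SC $n$, $r_n$ the achievable rate on SC $n$, $\bar{r}_k$ user $k$'s minimum rate, $C_m$ RRH $m$'s fronthaul capacity, and only SCs carrying content not cached at RRH $m$ count toward its fronthaul load. The Lagrange duality approach mentioned above would then relax these constraints and optimize the resulting dual variables.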

    The Quest for a Killer App for Opportunistic and Delay Tolerant Networks (Invited Paper)

    Delay Tolerant Networking (DTN) has attracted a lot of attention from the research community in recent years. Much work has been done regarding network architectures and algorithms for routing and forwarding in such networks. While many show enthusiasm for this exciting new research area, there are also many sceptics who question the usefulness of research in this area. In the past, we have seen other research areas become over-hyped and later die out because there was no killer app that made them useful in real scenarios. Real deployments of DTN systems have so far mostly been limited to a few niche scenarios, where they have been done as proof-of-concept field tests in research projects. In this paper, we embark upon a quest to find out what characterizes a potential killer application for DTNs. Are there applications and situations where DTNs provide services that could not be achieved otherwise, or where they have the potential to do so better than other techniques? Further, we highlight some of the main challenges that need to be solved to realize these applications and make DTNs a part of the mainstream network landscape.

    Joint Channel Estimation and Pilot Allocation in Underlay Cognitive MISO Networks

    Cognitive radios have been proposed as agile technologies to boost spectrum utilization. This paper tackles the problem of channel estimation and its impact on downlink transmissions in an underlay cognitive radio scenario. We consider primary and cognitive base stations, each equipped with multiple antennas and serving multiple users. Primary networks often suffer from cognitive interference, which can be mitigated by deploying beamforming at the cognitive systems to spatially direct the transmissions away from the primary receivers. The accuracy of the estimated channel state information (CSI) plays an important role in designing accurate beamformers that can regulate the amount of interference. However, the channel estimates themselves are affected by interference. Therefore, we propose different channel estimation and pilot allocation techniques to deal with channel estimation at the cognitive systems and to reduce the impact of contamination at the primary and cognitive systems. To tackle the contamination problem in primary and cognitive systems, we exploit the information embedded in the covariance matrices to separate the desired channel estimate from other users' channels in correlated cognitive single-input multiple-output (SIMO) channels. A minimum mean square error (MMSE) framework is proposed that utilizes second-order statistics to separate the overlapping spatial paths that create the interference. We validate our algorithms by simulation and compare them to state-of-the-art techniques. Comment: 6 pages, 2 figures, invited paper to IWCMC 201
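
    As a minimal sketch of a covariance-aided MMSE estimator of the kind referred to above (our simplified notation, assuming zero-mean channels with known covariances), a pilot observation $\mathbf{y} = \mathbf{h} + \mathbf{h}_{\mathrm{int}} + \mathbf{n}$ containing the desired channel $\mathbf{h}$, an interfering channel $\mathbf{h}_{\mathrm{int}}$ sharing the same pilot, and noise $\mathbf{n}$ with variance $\sigma^2$ leads to

    $$\hat{\mathbf{h}} = \mathbf{R}_h\left(\mathbf{R}_h + \mathbf{R}_{\mathrm{int}} + \sigma^2\mathbf{I}\right)^{-1}\mathbf{y},$$

    where $\mathbf{R}_h$ and $\mathbf{R}_{\mathrm{int}}$ are the covariance matrices of the desired and interfering channels. When the dominant angular supports of the two covariances barely overlap, this estimator largely rejects the contaminating paths, which is the effect the second-order statistics are exploited for.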

    Nearly Optimal Resource Allocation for Downlink OFDMA in 2-D Cellular Networks

    In this paper, we propose a resource allocation algorithm for the downlink of sectorized two-dimensional (2-D) OFDMA cellular networks assuming statistical Channel State Information (CSI) and fractional frequency reuse. The proposed algorithm can be implemented in a distributed fashion without the need for any central controlling unit. Its performance is analyzed assuming fast-fading Rayleigh channels and Gaussian-distributed multicell interference. We show that the transmit power of this simple algorithm tends, as the number of users grows to infinity, to the same limit as the minimal power required to satisfy all users' rate requirements, i.e., the proposed resource allocation algorithm is asymptotically optimal. As a byproduct of this asymptotic analysis, we characterize a relevant value of the reuse factor that depends only on an average state of the network. Comment: submitted to IEEE Transactions on Wireless Communications
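
    In our notation (not the paper's), if $P_K$ denotes the total transmit power selected by the proposed algorithm with $K$ users and $P_K^{\star}$ the minimal power satisfying all $K$ users' rate requirements, the asymptotic optimality claim reads

    $$\lim_{K\to\infty} P_K \;=\; \lim_{K\to\infty} P_K^{\star},$$

    i.e., the gap to the optimum vanishes in the many-user regime under the stated channel and interference assumptions.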

    Random Linear Network Coding for 5G Mobile Video Delivery

    An exponential increase in mobile video delivery will continue with the demand for higher-resolution, multi-view and large-scale multicast video services. The novel fifth generation (5G) 3GPP New Radio (NR) standard will bring a number of new opportunities for optimizing video delivery across both the 5G core and radio access networks. One of the promising approaches for video quality adaptation, throughput enhancement and erasure protection is the use of packet-level random linear network coding (RLNC). In this review paper, we discuss the integration of RLNC into the 5G NR standard, building upon the ideas and opportunities identified in 4G LTE. We explicitly identify and discuss in detail novel 5G NR features that provide support for RLNC-based video delivery in 5G, thus pointing out promising avenues for future research. Comment: Invited paper for the Special Issue "Network and Rateless Coding for Video Streaming" - MDPI Information
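
    For intuition, the toy sketch below shows packet-level RLNC encoding over GF(2) (random XOR combinations). It is purely illustrative and not the coding scheme proposed or standardized for 5G NR, where larger fields such as GF(2^8) and a matching decoder would normally be used.

import random

def rlnc_encode(packets, num_coded):
    """Produce coded packets as random XOR combinations of the source packets."""
    k = len(packets)
    coded = []
    for _ in range(num_coded):
        # Random binary coefficient vector; force at least one nonzero entry.
        coeffs = [random.randint(0, 1) for _ in range(k)]
        if not any(coeffs):
            coeffs[random.randrange(k)] = 1
        payload = bytes(len(packets[0]))          # all-zero payload of packet length
        for c, pkt in zip(coeffs, packets):
            if c:
                payload = bytes(a ^ b for a, b in zip(payload, pkt))
        coded.append((coeffs, payload))           # coefficients travel with the packet
    return coded

# One "generation" of 4 equal-length source packets (dummy contents).
source = [bytes([i]) * 8 for i in range(4)]
coded_packets = rlnc_encode(source, num_coded=6)
# A receiver that collects 4 linearly independent combinations can recover the
# generation by Gaussian elimination over GF(2).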

    AI-Driven Resource Allocation in Optical Wireless Communication Systems

    Visible light communication (VLC) is a promising solution to satisfy the extreme demands of emerging applications. VLC offers bandwidth that is orders of magnitude higher than what is offered by the radio spectrum, hence making the best use of these resources is not a trivial matter. There is growing interest in making next-generation communication networks intelligent by using AI-based tools to automate resource management and adapt automatically to variations in the network, as opposed to conventional handcrafted schemes based on mathematical models that assume prior knowledge of the network. In this article, a reinforcement learning (RL) scheme is developed to intelligently allocate the resources of an optical wireless communication (OWC) system in a heterogeneous network (HetNet) environment. The main goal is to maximise the total reward of the system, defined as the sum rate of all users. The results of the RL scheme are compared with those of an optimization scheme based on a mixed integer linear programming (MILP) model. Comment: 6 pages, 2 Figures, Conference
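
    The sketch below illustrates the flavour of such an RL allocator with a tabular Q-learning loop that assigns users to access points to improve a sum-rate reward. The environment, state/action definitions and parameters are our assumptions for illustration, not the article's actual OWC/RF simulator or its MILP benchmark.

import random
from collections import defaultdict

NUM_USERS, NUM_APS = 4, 3            # users and access points (assumed sizes)
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

Q = defaultdict(float)               # Q[(state, action)] -> estimated value

def sum_rate(assignment):
    """Hypothetical reward: each AP splits a unit rate among its attached users."""
    load = [assignment.count(ap) for ap in range(NUM_APS)]
    return sum(1.0 / load[ap] for ap in assignment)

def choose_ap(user):
    """Epsilon-greedy choice of which AP to (re)assign this user to."""
    if random.random() < EPSILON:
        return random.randrange(NUM_APS)
    return max(range(NUM_APS), key=lambda a: Q[(user, a)])

assignment = [random.randrange(NUM_APS) for _ in range(NUM_USERS)]
for episode in range(5000):
    user = episode % NUM_USERS                   # state: the user being reassigned
    action = choose_ap(user)
    assignment[user] = action
    reward = sum_rate(assignment)                # stands in for the system sum rate
    next_user = (user + 1) % NUM_USERS
    best_next = max(Q[(next_user, a)] for a in range(NUM_APS))
    Q[(user, action)] += ALPHA * (reward + GAMMA * best_next - Q[(user, action)])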

    Network Slicing Based 5G and Future Mobile Networks: Mobility, Resource Management, and Challenges

    5G networks are expected to satisfy users' different QoS requirements. Network slicing is a promising technology for 5G networks to provide services tailored to users' specific QoS demands. Driven by the massive increase in wireless data traffic from different application scenarios, efficient resource allocation schemes should be exploited to improve the flexibility of network resource allocation and the capacity of 5G networks based on network slicing. Due to the diversity of 5G application scenarios, new mobility management schemes are greatly needed to guarantee seamless handover in network-slicing-based 5G systems. In this article, we introduce a logical architecture for network-slicing-based 5G systems, and present a scheme for managing mobility between different access networks, as well as a joint power and subchannel allocation scheme for spectrum-sharing two-tier systems based on network slicing, where both co-tier and cross-tier interference are taken into account. Simulation results demonstrate that the proposed resource allocation scheme can flexibly allocate network resources between different slices in 5G systems. Finally, several open issues and challenges in network-slicing-based 5G networks are discussed, including network reconstruction, network slicing management, and cooperation with other 5G technologies.
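
    As a loose illustration of slice-aware resource allocation (not the article's scheme), the toy sketch below greedily hands subchannels to the slice currently furthest below a hypothetical rate demand under an equal per-subchannel power split; co-tier and cross-tier interference, which the article's scheme models, are omitted here.

import math
import random

NUM_SUBCHANNELS = 8
P_TOTAL = 1.0                                  # total power budget (arbitrary units)
NOISE = 0.01                                   # noise power (arbitrary units)

demands = {"eMBB": 3.0, "URLLC": 1.0}          # hypothetical per-slice rate targets (bit/s/Hz)
gains = {s: [random.random() for _ in range(NUM_SUBCHANNELS)] for s in demands}

p_sc = P_TOTAL / NUM_SUBCHANNELS               # equal power per subchannel
alloc = {s: [] for s in demands}
rate = {s: 0.0 for s in demands}

# Greedily hand each subchannel to the slice currently furthest below its demand.
for sc in range(NUM_SUBCHANNELS):
    needy = max(demands, key=lambda s: demands[s] - rate[s])
    alloc[needy].append(sc)
    rate[needy] += math.log2(1 + p_sc * gains[needy][sc] / NOISE)

print(alloc, rate)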