Proactive content caching in future generation communication networks: Energy and security considerations
The proliferation of hand-held devices and Internet of Things (IoT) applications has heightened demand for popular content download. A high volume of content streaming/downloading services during peak hours can cause network congestion. Proactive content caching has emerged as a prospective solution to tackle this congestion problem. In proactive content caching, data storage units are used to store popular content in helper nodes at the network edge. This contributes to a reduction of peak traffic load and network congestion.
However, data storage units require additional energy, which poses a challenge to researchers who aim to reduce energy consumption by up to 90% in next generation networks. This thesis presents proactive content caching techniques to reduce grid energy consumption by utilizing renewable energy sources to power the data storage units in helper nodes. The integration of renewable energy sources with proactive caching is a significant challenge due to the intermittent nature of renewable energy and the associated investment costs. In this thesis, this challenge is tackled by introducing strategies to determine the optimal time of day for content caching and the optimal scheduling of caching nodes. The proposed strategies consider not only the availability of renewable energy but also temporal changes in network traffic to reduce the associated energy costs.
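The idea of picking a caching time that tracks renewable supply and traffic can be illustrated with a toy calculation. This is a minimal sketch under assumed numbers and an assumed linear cost model, not the optimization formulated in the thesis:

```python
# Hypothetical illustration: choose the cache-update hour that minimizes grid
# energy, given hourly forecasts of renewable supply and network traffic.
# The cost model and all numbers below are assumptions for illustration only.

def grid_energy(hour, renewable_kwh, traffic_load, cache_kwh=1.0):
    """Grid energy drawn if caching happens at `hour`: a fixed caching cost
    plus a congestion penalty proportional to concurrent traffic, offset by
    whatever renewable energy is available that hour."""
    demand = cache_kwh + 0.5 * traffic_load[hour]
    return max(0.0, demand - renewable_kwh[hour])

def best_caching_hour(renewable_kwh, traffic_load):
    hours = range(len(renewable_kwh))
    return min(hours, key=lambda h: grid_energy(h, renewable_kwh, traffic_load))

# Toy 6-hour day: solar supply peaks mid-day, traffic peaks in the evening.
renewable = [0.0, 0.2, 1.5, 2.0, 0.8, 0.1]
traffic = [0.2, 0.3, 0.5, 0.6, 1.0, 1.5]

print(best_caching_hour(renewable, traffic))  # -> 2 (mid-day, solar surplus)
```

The sketch captures the abstract's point that both renewable availability and the temporal traffic profile matter: the cheapest hour is not the quietest one but the first hour where renewable supply covers the caching demand.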
While proactive caching can facilitate the reduction of peak traffic load and the integration of renewable energy, cached content objects at helper nodes are often more vulnerable to malicious attacks due to less stringent security at edge nodes. Potential content leakage can lead to catastrophic consequences, particularly for cache-equipped Industrial Internet of Things (IIoT) applications. In this thesis, the concept of trusted caching nodes (TCNs) is introduced. TCNs cache popular content objects and provide security services to connected links. The proposed study optimally allocates TCNs and selects the most suitable content forwarding paths. Furthermore, a caching strategy is designed for mobile edge computing systems to support IoT task offloading. The strategy optimally assigns security resources to offloaded tasks while satisfying their individual requirements. However, security measures often contribute to overheads in terms of both energy consumption and delay. Consequently, in this thesis, caching techniques have been designed to investigate the trade-off between energy consumption and probable security breaches.
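The task-to-node assignment described above can be sketched as a simple greedy placement. The node model, cost figures, and the greedy rule are illustrative assumptions, not the thesis's actual optimization:

```python
# Hypothetical sketch: assign offloaded IoT tasks to edge nodes that meet
# each task's minimum security level, preferring the node with the lowest
# energy overhead. Node names and the cost model are illustrative only.

def assign_tasks(tasks, nodes):
    """tasks: list of (task_id, required_security_level)
       nodes: dict node_id -> (security_level, energy_cost_per_task, capacity)
       Returns {task_id: node_id}; raises if a task cannot be placed."""
    remaining = {nid: cap for nid, (_, _, cap) in nodes.items()}
    assignment = {}
    # Serve the most demanding tasks first, so strict security requirements
    # are not crowded out by easily satisfied ones.
    for tid, req in sorted(tasks, key=lambda t: -t[1]):
        feasible = [nid for nid, (sec, _, _) in nodes.items()
                    if sec >= req and remaining[nid] > 0]
        if not feasible:
            raise RuntimeError(f"no sufficiently secure node left for {tid}")
        best = min(feasible, key=lambda nid: nodes[nid][1])  # cheapest energy
        assignment[tid] = best
        remaining[best] -= 1
    return assignment

tasks = [("t1", 3), ("t2", 1), ("t3", 2)]
nodes = {"tcn_a": (3, 2.0, 1), "tcn_b": (2, 1.0, 2)}
print(assign_tasks(tasks, nodes))
```

The energy/security trade-off surfaces directly here: the high-security node costs more energy per task, so it is reserved for the one task that actually requires it.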
Overall, this thesis contributes to the current literature by simultaneously investigating the energy and security aspects of caching systems whilst introducing solutions to the relevant research problems.
Hybrid, Proactive In-Network Caching for Mobile On-Demand Video Streaming
Mobile video streaming has become an essential application in mobile wireless networks, making up most of the mobile data of today's Internet traffic. Studies have shown that mobile video data is projected to make up about 78 percent of the global mobile data traffic, and that global mobile data traffic is expected to increase sevenfold by 2021. Massive small cell base station (SBS) deployments have emerged as a potential solution promising to fulfill these unprecedented mobile data demands by offering great coverage enhancements and maintaining high quality of video streaming. However, due to relatively small cell sizes and high user mobility, mobile video streaming in dense SBS networks faces fundamental challenges such as intermittent connectivity and frequent handoffs, causing degradation in video streaming quality. In this thesis, we tackle this issue by introducing a hybrid proactive in-network caching framework that stores some popular videos at the edge of the network, namely at the SBSs, while also pre-caching video contents in advance to better serve mobile users. The proposed framework essentially reduces the need for bringing every requested video from the core (original) network, which alleviates network congestion by reducing back-haul traffic and improves the mobile video streaming experience by avoiding service discontinuity during handoffs. We develop a simulation framework using MATLAB to study the performance of the proposed hybrid proactive caching technique, and show using simulations that the proposed technique can effectively improve video quality of experience and reduce back-haul traffic.
Keywords: hybrid proactive caching, video quality of experience, small-cell base station (SBS), mobile video streaming
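The hybrid idea — a popularity-based cache at each SBS plus handoff-driven pre-caching — can be sketched in a few lines. The class, chunk model, and mobility prediction below are toy assumptions, not the thesis's MATLAB framework:

```python
# Minimal sketch of hybrid proactive in-network caching: each small-cell
# base station (SBS) holds globally popular videos (the popularity-based
# part), and when a user's handoff to a neighbouring SBS is predicted, the
# upcoming chunks of the video being watched are pushed there ahead of the
# user (the proactive part). All names and sizes are illustrative.

class SBS:
    def __init__(self, popular_videos, capacity=10):
        self.cache = set(popular_videos)  # popularity-based portion
        self.capacity = capacity

    def precache(self, chunk):
        if len(self.cache) < self.capacity:
            self.cache.add(chunk)

    def serve(self, item):
        return item in self.cache  # True = edge hit, no back-haul fetch

popular = {"v1", "v2"}
current_sbs, next_sbs = SBS(popular), SBS(popular)

# The user watches chunk 1 of v3 at the current SBS; a handoff to the
# neighbouring SBS is predicted, so chunk 2 is pre-cached there.
hit_before = next_sbs.serve(("v3", 2))  # False: would stall at handoff
next_sbs.precache(("v3", 2))
hit_after = next_sbs.serve(("v3", 2))   # True: playback continues seamlessly
print(hit_before, hit_after)
```

This mirrors the abstract's claim: without pre-caching, the handoff forces a core-network fetch (service discontinuity); with it, the next SBS already holds the chunk.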
Cache-Aided Non-Orthogonal Multiple Access
In this paper, we propose a novel joint caching and non-orthogonal multiple access (NOMA) scheme to facilitate advanced downlink transmission for next generation cellular networks. In addition to reaping the conventional advantages of caching and NOMA transmission, the proposed cache-aided NOMA scheme also exploits cached data for interference cancellation, which is not possible with separate caching and NOMA transmission designs. Furthermore, as caching can help to reduce the residual interference power, several decoding orders are feasible at the receivers, and these decoding orders can be flexibly selected for performance optimization. We characterize the achievable rate region of cache-aided NOMA and investigate its benefits for minimizing the time required to complete video file delivery. Our simulation results reveal that, compared to several baseline schemes, the proposed cache-aided NOMA scheme significantly expands the achievable rate region for downlink transmission, which translates into substantially reduced file delivery times.
Comment: Accepted for presentation at IEEE ICC 201
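The core mechanism — cached data enabling interference cancellation — can be made concrete with a small rate calculation. The power split, channel gain, and noise level are toy assumptions; this is an illustration of the principle, not the paper's system model:

```python
import math

# Illustrative two-user downlink NOMA rate computation, showing how a cached
# copy of the interfering user's file lets a receiver reconstruct and cancel
# that interference before decoding. All numeric values are assumptions.

def rate(p_signal, p_interf, gain, noise=1.0):
    """Shannon rate (bit/s/Hz) with residual interference power p_interf."""
    return math.log2(1 + p_signal * gain / (p_interf * gain + noise))

p1, p2 = 0.3, 0.7  # superposition power split for users 1 and 2
g1 = 4.0           # channel power gain of user 1

# Without caching, user 1 must treat user 2's superposed signal as noise...
r_no_cache = rate(p1, p2, g1)
# ...but if user 2's file is already in user 1's cache, the known
# interference is subtracted, driving the residual interference power to 0.
r_cached = rate(p1, 0.0, g1)

print(r_cached > r_no_cache)  # caching strictly improves the achievable rate
```

This is exactly the effect the abstract describes: lowering the residual interference power raises the achievable rate, which in turn is what expands the rate region and opens up additional feasible decoding orders.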
A review on green caching strategies for next generation communication networks
© 2020 IEEE. In recent years, the ever-increasing demand for networking resources and energy, fueled by the unprecedented upsurge in Internet traffic, has been a cause for concern for many service providers. Content caching, which serves user requests locally, is deemed to be an enabling technology in addressing the challenges posed by the phenomenal growth in Internet traffic. Conventionally, content caching is considered a viable solution to alleviate backhaul pressure. Recently, however, many studies have reported energy cost reductions contributed by content caching in cache-equipped networks. The hypothesis is that caching shortens the content delivery distance and eventually achieves a significant reduction in transmission energy consumption. This has motivated us to conduct this study, and in this article, a comprehensive survey of state-of-the-art green caching techniques is provided. This review paper extensively discusses the contributions of existing studies on green caching. In addition, the study explores different cache-equipped network types, solution methods, and application scenarios. We show that the optimal selection of caching nodes, smart resource management, popular content selection, and renewable energy integration can substantially improve the energy efficiency of cache-equipped systems. Finally, based on this comprehensive analysis, we highlight some potential research ideas relevant to green content caching.
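The survey's central hypothesis — caching saves energy when the avoided transport energy outweighs the static energy of storage — lends itself to a back-of-the-envelope break-even check. The figures below are illustrative assumptions, not measurements from the survey:

```python
# Break-even sketch for green caching: caching an object pays off in energy
# when the transport energy saved per request, times the request rate,
# exceeds the static power draw of keeping it cached. Numbers are toy values.

def caching_saves_energy(requests_per_hour, joules_saved_per_request,
                         cache_power_watts):
    # Average transport power avoided, in watts (J/s).
    transport_saving_watts = requests_per_hour * joules_saved_per_request / 3600.0
    return transport_saving_watts > cache_power_watts

# A popular video: 500 requests/h, each avoiding an assumed 40 J of
# core-network transport, stored on hardware drawing an assumed 2 W.
print(caching_saves_energy(500, 40, 2.0))  # popular content: worth caching
# An unpopular video at 10 requests/h does not justify the same 2 W.
print(caching_saves_energy(10, 40, 2.0))   # unpopular content: net energy loss
```

Small as it is, the calculation reflects why popular content selection and optimal caching-node choice recur throughout the surveyed literature: both raise the per-object request rate and the per-request transport saving, moving content across the break-even point.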
Cache-Aided Non-Orthogonal Multiple Access: The Two-User Case
In this paper, we propose a cache-aided non-orthogonal multiple access (NOMA) scheme for spectrally efficient downlink transmission. The proposed scheme not only reaps the benefits associated with NOMA and caching, but also exploits the data cached at the users for interference cancellation. As a consequence, caching can help to reduce the residual interference power, making multiple decoding orders at the users feasible. The resulting flexibility in decoding can be exploited for improved NOMA detection. We characterize the achievable rate region of cache-aided NOMA and derive the Pareto optimal rate tuples forming the boundary of the rate region. Moreover, we optimize cache-aided NOMA for minimization of the time required for completing file delivery. The optimal decoding order and the optimal transmit power and rate allocation are derived as functions of the cache status, the file sizes, and the channel conditions. Simulation results confirm that, compared to several baseline schemes, the proposed cache-aided NOMA scheme significantly expands the achievable rate region and increases the sum rate for downlink transmission, which translates into substantially reduced file delivery times.
Comment: Accepted for publication in IEEE J. Sel. Topics Signal Process. arXiv admin note: text overlap with arXiv:1712.0955