2 research outputs found

    Performance optimizations for LTE User-Plane L2 software

    Abstract. Modern mobile communication networks are expected to compete with wired connections in both latency and speed. This places considerable pressure on mobile communication protocols, which are very complex and largely implemented in software. The performance of that software directly affects the capacity of the network, which in turn determines the throughput and latency experienced by users and the number of users the network can support. This thesis concentrates on identifying software components of the LTE User-Plane radio interface protocols that can be improved, and on exploring solutions for better performance. The study relies on system component tests and the performance profiler tool perf, which makes it possible to track the effects of software optimizations from function level up to whole-system level. In addition to perf, hardware performance counters provided by the processor are observed manually; they verify why specific optimizations affect performance. Slow memory accesses, i.e. cache misses, are identified as the most constraining factor in the software's performance. Several good practices are also found during the optimization work, such as arranging code common path first. Surprisingly, separating rarely executed code from hotspots also has a positive impact on performance, in addition to shrinking the active binary. The optimization work decreases the whole software's load from 60% to 50%, and load reductions of over 70% are achieved in some individual functions.
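    The thesis text is not reproduced here; as a hedged illustration of the "common path first" and cold-code separation practices the abstract mentions, the minimal C++ sketch below (function and variable names are hypothetical, not taken from the thesis) handles the frequent case in straight-line code and moves the rarely executed error path into a separate, non-inlined cold function. Keeping cold code out of the hot function shrinks the hot instruction stream and improves instruction-cache utilization, which matches the abstract's observation about cache misses.

```cpp
#include <cstdint>
#include <cstdio>

// Rarely executed error handling kept out of the hot function so the hot
// instruction stream stays small (GCC/Clang attributes, compiled as C++20).
[[gnu::cold, gnu::noinline]]
static void handle_bad_packet(std::uint32_t length) {
    std::fprintf(stderr, "dropping packet, invalid length %u\n", length);
}

static std::uint64_t bytes_delivered = 0;

// Hot path: the common case (a valid packet) is checked and handled first;
// the unlikely branch jumps out to the cold helper.
void deliver_packet(std::uint32_t length) {
    if (length > 0 && length <= 9000) [[likely]] {  // common path first
        bytes_delivered += length;                  // cheap, frequent work
        return;
    }
    handle_bad_packet(length);                      // rare, expensive work
}

int main() {
    deliver_packet(1500);  // typical case
    deliver_packet(0);     // rare invalid case
    std::printf("delivered %llu bytes\n",
                static_cast<unsigned long long>(bytes_delivered));
    return 0;
}
```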

    From Traditional Adaptive Data Caching to Adaptive Context Caching: A Survey

    Context data is in greater demand than ever with the rapid growth in the development of context-aware Internet of Things applications. Research in context and context-awareness is being conducted to broaden its applicability in light of many practical and technical challenges. One of these challenges is maintaining performance when responding to a large number of context queries. Context Management Platforms that infer and deliver context to applications measure this problem using Quality of Service (QoS) parameters. Although caching is a proven way to improve QoS, the transiency of context and features such as the variability and heterogeneity of context queries pose an additional real-time cost management problem. This paper presents a critical survey of the state of the art in adaptive data caching, with the objective of developing a body of knowledge on cost- and performance-efficient adaptive caching strategies. We comprehensively survey a large number of research publications and evaluate, compare, and contrast different techniques, policies, approaches, and schemes in adaptive caching. Our critical analysis is motivated by the focus on adaptively caching context as a core research problem. A formal definition of adaptive context caching is then proposed, followed by the identified features and requirements of a well-designed, objectively optimal adaptive context caching strategy.
    Comment: This paper is under review with ACM Computing Surveys at the time of publishing on arXiv.
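    The survey discusses strategies rather than prescribing one algorithm; as a hedged sketch of what an adaptive context cache can look like (the class, names, and adaptation policy below are illustrative assumptions, not from the paper), the example caches context items with a short base time-to-live, since context is transient, and extends the lifetime of items that keep being hit so that popular context stays cached while rarely used context ages out.

```cpp
#include <algorithm>
#include <chrono>
#include <optional>
#include <string>
#include <unordered_map>

using Clock = std::chrono::steady_clock;

// One cached context item: its value, expiry time, and hit count.
struct Entry {
    std::string value;
    Clock::time_point expires;
    int hits = 0;
};

// Minimal adaptive context cache (illustrative only): entries start with a
// short TTL because context is transient; each hit earns extra lifetime,
// capped at 4x the base TTL, so frequently queried context stays resident.
class AdaptiveContextCache {
public:
    explicit AdaptiveContextCache(std::chrono::seconds base_ttl)
        : base_ttl_(base_ttl) {}

    void put(const std::string& key, std::string value) {
        cache_[key] = Entry{std::move(value), Clock::now() + base_ttl_, 0};
    }

    std::optional<std::string> get(const std::string& key) {
        auto it = cache_.find(key);
        if (it == cache_.end()) return std::nullopt;  // miss
        Entry& e = it->second;
        if (Clock::now() >= e.expires) {              // expired: evict
            cache_.erase(it);
            return std::nullopt;
        }
        ++e.hits;
        // Adaptation: extend the entry's lifetime in proportion to its hits.
        auto bonus = base_ttl_ * std::min(e.hits, 4);
        e.expires = Clock::now() + base_ttl_ + bonus;
        return e.value;
    }

private:
    std::chrono::seconds base_ttl_;
    std::unordered_map<std::string, Entry> cache_;
};

int main() {
    AdaptiveContextCache cache(std::chrono::seconds(5));
    cache.put("room42/temperature", "21.5C");
    auto hit = cache.get("room42/temperature");  // hit: TTL is extended
    auto miss = cache.get("room42/humidity");    // miss: nothing cached
    return (hit && !miss) ? 0 : 1;
}
```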