
    Direct communication radio interface for New Radio multicasting and cooperative positioning

    Cotutelle thesis; defending university: UNIVERSITA’ MEDITERRANEA DI REGGIO CALABRIA. Recently, the popularity of Millimeter Wave (mmWave) wireless networks has increased due to their capability to cope with the escalation of mobile data demands caused by the unprecedented proliferation of smart devices in the fifth generation (5G). The extremely high frequency, or mmWave, band is a fundamental pillar in the provision of the expected gigabit data rates. Hence, according to both the academic and industrial communities, mmWave technology, e.g., 5G New Radio (NR) and WiGig (60 GHz), is considered one of the main components of 5G and beyond networks. In particular, the 3rd Generation Partnership Project (3GPP) provides for the use of licensed mmWave sub-bands for 5G mmWave cellular networks, whereas the IEEE actively explores the unlicensed 60 GHz band for next-generation wireless local area networks. In this regard, mmWave has been envisaged as a new technology layout for real-time, heavy-traffic, and wearable applications. This work is devoted to solving the problems of mmWave band communication systems while enhancing their advantages through utilizing the direct communication radio interface for NR multicasting, cooperative positioning, and mission-critical applications. The main contributions presented in this work include: (i) a set of mathematical frameworks and simulation tools to characterize multicast traffic delivery in mmWave directional systems; (ii) exploitation of the sidelink relaying concept to deal with channel condition deterioration in dynamic multicast systems and to ensure mission-critical and ultra-reliable low-latency communications; (iii) analysis of cooperative positioning techniques for enhancing cellular positioning accuracy for emerging 5G+ applications that require not only improved communication characteristics but also precise localization. Our study indicates the need for additional mechanisms/research that can be utilized: (i) to further improve multicasting performance in 5G/6G systems; (ii) to investigate sidelink aspects, including, but not limited to, the standardization perspective and new relay selection strategies; and (iii) to design cooperative positioning systems based on Device-to-Device (D2D) technology.
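
    To make the cooperative positioning contribution more concrete, the following minimal sketch illustrates the general idea of D2D-assisted refinement: a coarse cellular fix is improved with range measurements to nearby devices whose positions are known. This is a generic Gauss-Newton trilateration example; the function name, solver, and geometry are illustrative assumptions, not the specific frameworks developed in the thesis.

    # Illustrative sketch of D2D-assisted cooperative positioning (generic
    # technique, not the thesis's framework): a coarse cellular fix is refined
    # with range measurements to neighbors whose positions are known.
    import numpy as np

    def refine_position(coarse_xy, anchors_xy, ranges, iters=20):
        """Gauss-Newton trilateration refinement.

        coarse_xy  : initial (x, y) estimate from the cellular network
        anchors_xy : (N, 2) positions of cooperating D2D neighbors
        ranges     : (N,) measured D2D distances to those neighbors
        """
        xy = np.asarray(coarse_xy, dtype=float)
        anchors_xy = np.asarray(anchors_xy, dtype=float)
        ranges = np.asarray(ranges, dtype=float)
        for _ in range(iters):
            diff = xy - anchors_xy                  # (N, 2)
            dist = np.linalg.norm(diff, axis=1)     # predicted ranges
            residual = dist - ranges                # range errors
            J = diff / dist[:, None]                # Jacobian of ||xy - a_i||
            # Solve J * delta = -residual in the least-squares sense
            delta, *_ = np.linalg.lstsq(J, -residual, rcond=None)
            xy = xy + delta
            if np.linalg.norm(delta) < 1e-6:
                break
        return xy

    # Example: true position (10, 5), coarse cellular fix a few meters off
    anchors = [(0.0, 0.0), (20.0, 0.0), (10.0, 15.0)]
    true_xy = np.array([10.0, 5.0])
    meas = [np.linalg.norm(true_xy - np.array(a)) for a in anchors]
    print(refine_position((12.0, 8.0), anchors, meas))   # ~[10.  5.]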

    Facilitating Internet of Things on the Edge

    The evolution of electronics and wireless technologies has entered a new era, the Internet of Things (IoT). Presently, IoT technologies influence the global market, bringing benefits in many areas, including healthcare, manufacturing, transportation, and entertainment. Modern IoT devices serve as thin clients, with data processing performed in a remote computing node, such as a cloud server or a mobile edge computing unit. These computing units possess significant resources that allow prompt data processing. The user experience of such an approach depends heavily on the availability and quality of the internet connection; if the connection is unavailable, the operation of IoT applications can be completely disrupted. It is worth noting that emerging IoT applications are even more throughput-demanding and latency-sensitive, which makes communication networks a practical bottleneck for service provisioning. This thesis aims to eliminate the limitations of wireless access via the improvement of connectivity and throughput between devices on the edge, as well as their network identification, which is fundamentally important for IoT service management. The introduction begins with a discussion of emerging IoT applications and their demands. Subsequent chapters introduce the scenarios of interest, describe the proposed solutions, and provide selected performance evaluation results. Specifically, we start with research on the use of degraded memory chips for network identification of IoT devices as an alternative to conventional identifiers such as the IMEI; unlike those, the proposed identifiers are not vulnerable to tampering and cloning. Further, we introduce our contributions to improving connectivity and throughput among IoT devices on the edge in cases where the mobile network infrastructure is limited or totally unavailable. Finally, we conclude the introduction with a summary of the results achieved.
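
    As an illustration of the general principle behind identification via degraded memory (the exact method is the thesis's own and not reproduced here), the sketch below derives an identifier by hashing the positions of memory cells that fail simple write/read-back tests; the function name and test patterns are assumptions.

    # Illustrative sketch: the addresses of degraded/stuck memory cells form a
    # stable, hard-to-clone pattern that can be hashed into a device identifier.
    import hashlib

    def memory_fingerprint(dump_after_zeros: bytes, dump_after_ones: bytes) -> str:
        """Derive an identifier from cells that fail to hold written data.

        dump_after_zeros / dump_after_ones: read-backs of the same memory region
        after writing all-zeros and all-ones test patterns, respectively.
        """
        stuck_high = [i for i, b in enumerate(dump_after_zeros) if b != 0x00]
        stuck_low = [i for i, b in enumerate(dump_after_ones) if b != 0xFF]
        # The sorted lists of faulty-cell offsets are the raw signature.
        signature = ("H:" + str(stuck_high) + "L:" + str(stuck_low)).encode("ascii")
        return hashlib.sha256(signature).hexdigest()

    # Example with a synthetic 8-byte region containing two degraded cells
    after_zero_write = bytes([0, 0, 0x10, 0, 0, 0, 0, 0])       # byte 2 stuck high
    after_one_write = bytes([0xFF, 0xFF, 0xFF, 0xFF, 0x7F, 0xFF, 0xFF, 0xFF])  # byte 4 stuck low
    print(memory_fingerprint(after_zero_write, after_one_write))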

    The 3rd International Conference on the Challenges, Opportunities, Innovations and Applications in Electronic Textiles

    This reprint is a collection of papers from the E-Textiles 2021 Conference and represents the state of the art from both academia and industry in the development of smart fabrics that incorporate electronic and sensing functionality. The reprint presents a wide range of applications of the technology, including wearable textile devices for healthcare applications such as respiratory monitoring and functional electrical stimulation. Manufacturing approaches include printed smart materials, knitted e-textiles, and flexible electronic circuit assembly within fabrics and garments. E-textile sustainability, a key future requirement for the technology, is also considered. Supplying power is a constant challenge for all wireless wearable technologies, and the collection includes papers on triboelectric energy harvesting and textile-based water-activated batteries. Finally, the application of textile antennas in both sensing and 5G wireless communications is demonstrated, where different antenna designs and their responses to stimuli are presented.

    Addressing training data sparsity and interpretability challenges in AI based cellular networks

    To meet the diverse and stringent communication requirements of emerging network use cases, zero-touch artificial intelligence (AI) based deep automation in cellular networks is envisioned. However, the full potential of AI in cellular networks remains hindered by two key challenges: (i) training data is not as freely available in cellular networks as in other fields where AI has made a profound impact, and (ii) current AI models tend to exhibit black-box behavior, making operators reluctant to entrust the operation of multi-billion-dollar mission-critical networks to a black-box AI engine that allows little insight into, or discovery of, the relationships between configuration and optimization parameters and key performance indicators. This dissertation systematically addresses and proposes solutions to these two key problems faced by emerging networks.

    A framework for addressing the training data sparsity challenge in cellular networks is developed that can assist network operators and researchers in choosing the optimal data enrichment technique for different network scenarios, based on the available information. The framework ranges from classical interpolation techniques, such as inverse distance weighting and kriging, to more advanced ML-based methods, such as transfer learning and generative adversarial networks, as well as several newer techniques, such as matrix completion theory and leveraging different types of network geometry, and simulators and testbeds, among others. The proposed framework leads to more accurate ML models that rely on a sufficient amount of representative training data.

    Moreover, solutions are proposed to address the data sparsity challenge specifically in Minimization of Drive Tests (MDT) based automation approaches. MDT allows coverage to be estimated at the base station by exploiting measurement reports gathered by the user equipment, without the need for drive tests. Thus, MDT is a key enabling feature for data- and AI-driven autonomous operation and optimization in current and emerging cellular networks. However, to date, the utility of the MDT feature remains thwarted by issues such as sparsity of user reports and user positioning inaccuracy. For the first time, this dissertation reveals the existence of an optimal bin width for coverage estimation in the presence of inaccurate user positioning, scarcity of user reports, and quantization error. The presented framework enables network operators to configure, for a given positioning accuracy and user density, the bin size that results in the most accurate MDT-based coverage estimation.

    The lack of interpretability in AI-enabled networks is addressed by proposing a first-of-its-kind neural network architecture that leverages analytical modeling, domain knowledge, big data, and machine learning to turn black-box machine learning models into more interpretable ones. The proposed approach combines analytical modeling and domain knowledge to custom-design machine learning models, with the aim of moving towards interpretable models that not only require less training time but can also deal with issues such as sparsity of training data and determination of model hyperparameters. The approach is tested using both simulated and real data, and the results show that it outperforms existing mathematical models while remaining interpretable compared with black-box ML models. Thus, the proposed approach can be used to derive better mathematical models of complex systems.

    The findings from this dissertation can help solve the challenges in emerging AI-based cellular networks and thus aid in their design, operation, and optimization.
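
    As a concrete example of the classical end of that data enrichment spectrum, the sketch below fills the bins of a coverage map from sparse MDT reports using plain inverse distance weighting. This is a generic textbook formulation, not the dissertation's framework; the function name, grid layout, and example values are illustrative assumptions.

    # Illustrative sketch: inverse distance weighting (one of the classical
    # enrichment techniques named above) used to fill coverage-map bins from
    # sparse, user-reported MDT measurements.
    import numpy as np

    def idw_coverage_map(report_xy, report_rsrp, grid_x, grid_y, power=2.0):
        """Estimate RSRP at every bin centre from scattered MDT reports."""
        report_xy = np.asarray(report_xy, dtype=float)       # (N, 2)
        report_rsrp = np.asarray(report_rsrp, dtype=float)   # (N,)
        coverage = np.empty((len(grid_y), len(grid_x)))
        for iy, y in enumerate(grid_y):
            for ix, x in enumerate(grid_x):
                d = np.linalg.norm(report_xy - np.array([x, y]), axis=1)
                if np.any(d < 1e-9):              # bin centre coincides with a report
                    coverage[iy, ix] = report_rsrp[np.argmin(d)]
                    continue
                w = 1.0 / d ** power              # inverse-distance weights
                coverage[iy, ix] = np.sum(w * report_rsrp) / np.sum(w)
        return coverage

    # Example: three reports, 50 m bins over a 200 m x 200 m area
    reports = [(20, 30), (120, 160), (180, 40)]
    rsrp = [-80.0, -95.0, -105.0]                 # dBm
    bins = np.arange(25, 200, 50)                 # bin centres
    print(idw_coverage_map(reports, rsrp, bins, bins).round(1))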

    Sensors, Signal, and Artificial Intelligence Processing
