
    Channel Characterization for Chip-scale Wireless Communications within Computing Packages

    Wireless Network-on-Chip (WNoC) appears as a promising alternative to conventional interconnect fabrics for chip-scale communications. WNoC takes advantage of an overlaid network composed of a set of millimeter-wave antennas to reduce latency and increase throughput in the communication between cores. Similarly, wireless inter-chip communication has also been proposed to improve information transfer between processors, memory, and accelerators in multi-chip settings. However, the wireless channel remains largely unknown in both scenarios, especially in the presence of realistic chip packages. This work addresses the issue by accurately modeling flip-chip packages and investigating the propagation both in their interior and in their surroundings. Through parametric studies, package configurations that minimize path loss are obtained, and the trade-offs observed when applying such optimizations are discussed. Single-chip and multi-chip architectures are compared in terms of the path loss exponent, confirming that the amount of bulk silicon in the pathway between transmitter and receiver is the main determinant of losses.
    Comment: To be presented 12th IEEE/ACM International Symposium on Networks-on-Chip (NOCS 2018); Torino, Italy; October 201
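The path loss exponent comparison mentioned in the abstract follows the standard log-distance model, PL(d) = PL(d0) + 10·n·log10(d/d0). A minimal sketch of fitting the exponent n from path-loss samples; the millimeter-scale distances and loss values below are illustrative, not taken from the paper:

```python
import math

def fit_path_loss_exponent(samples, d0=1e-3):
    """Least-squares fit of n in PL(d) = PL(d0) + 10*n*log10(d/d0).

    samples: list of (distance_m, path_loss_dB) pairs.
    Returns (n, loss at the reference distance d0).
    """
    xs = [10.0 * math.log10(d / d0) for d, _ in samples]
    ys = [pl for _, pl in samples]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    # Slope of path loss vs. 10*log10(d/d0) is the exponent n.
    n = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    pl_d0 = my - n * mx  # intercept: loss at the reference distance
    return n, pl_d0

# Illustrative millimeter-scale samples (not from the paper)
samples = [(0.002, 32.0), (0.005, 44.0), (0.010, 53.0), (0.020, 62.0)]
n, pl0 = fit_path_loss_exponent(samples)
```

An exponent near 2 indicates free-space-like propagation, while the higher values the paper associates with bulk silicon in the path reflect additional material losses.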

    Separation Framework: An Enabler for Cooperative and D2D Communication for Future 5G Networks

    Soaring capacity and coverage demands dictate that future cellular networks will soon need to migrate towards ultra-dense networks. However, network densification comes with a host of challenges that include compromised energy efficiency, complex interference management, cumbersome mobility management, burdensome signaling overheads, and higher backhaul costs. Interestingly, most of the problems that beleaguer network densification stem from one common feature of legacy networks, i.e., the tight coupling between the control and data planes, regardless of their degree of heterogeneity and cell density. Consequently, in the wake of 5G, the control and data planes separation architecture (SARC) has recently been conceived as a promising paradigm with the potential to address most of the aforementioned challenges. In this article, we review various proposals that have been presented in the literature so far to enable SARC. More specifically, we analyze how, and to what degree, various SARC proposals address the four main challenges of network densification, namely energy efficiency, system-level capacity maximization, interference management, and mobility management. We then focus on two salient features of future cellular networks that have not yet been adopted in legacy networks at wide scale and thus remain a hallmark of 5G, i.e., coordinated multipoint (CoMP) and device-to-device (D2D) communications. After providing the necessary background on CoMP and D2D, we analyze how SARC can act as a major enabler for CoMP and D2D in the context of 5G. This article thus serves as both a tutorial and an up-to-date survey on SARC, CoMP, and D2D. Most importantly, the article provides an extensive outlook on the challenges and opportunities that lie at the crossroads of these three mutually entangled emerging technologies.
    Comment: 28 pages, 11 figures, IEEE Communications Surveys & Tutorials 201

    Taming and Leveraging Directionality and Blockage in Millimeter Wave Communications

    To cope with the challenge of high-rate data transmission, millimeter wave (mmWave) is one potential solution. The short wavelength has unlocked the era of directional mobile communication, and this semi-optical form of communication requires revolutionary thinking. To assist the research and evaluate various algorithms, we build a motion-sensitive mmWave testbed with two degrees of freedom for environmental sensing and general wireless communication.
    The first part of this thesis contains two approaches to maintaining connections in mobile mmWave communication. The first seeks to solve the beam tracking problem using motion sensors within the mobile device. A tracking algorithm is given and integrated into the tracking protocol. Detailed experiments and numerical simulations compare several compensation schemes against an optical benchmark and demonstrate the efficiency of the overhead reduction. The second strategy, multi-connectivity, attempts to mitigate intermittent connections during roaming. Taking advantage of the properties of rateless erasure codes, a fountain-code-type multi-connectivity mechanism is proposed to increase link reliability with a simplified backhaul mechanism. The simulation demonstrates the efficiency and robustness of our system design using a multi-link channel record.
    The second topic in this thesis explores various techniques for blockage mitigation. A fast, heartbeat-like channel with heavy blockage loss is identified in a mmWave Unmanned Aerial Vehicle (UAV) communication experiment, caused by propeller blockage. These blockage patterns are detected through Holm's procedure, framed as a problem of multi-time-series edge detection. To reduce the blockage effect, an adaptive modulation and coding scheme is designed. The simulation results show that it can greatly improve throughput given appropriately predicted patterns.
    Last but not least, the blockage of directional communication also appears as a blessing, because the geometrical information and blockage events of ancillary signal paths can be utilized to predict the blockage timing for the current transmission path. A geometrical model and prediction algorithm are derived to resolve the blockage time and initiate active handovers. An experiment provides solid proof of the multi-path properties, and the numerical simulation demonstrates the efficiency of the proposed algorithm.
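Holm's procedure, used above for multi-time-series edge detection, is a step-down multiple-testing correction that controls the family-wise error rate. A minimal sketch, with hypothetical p-values standing in for the per-series edge tests (the thesis's actual test statistics are not reproduced here):

```python
def holm_reject(p_values, alpha=0.05):
    """Holm's step-down procedure: returns a rejection flag per hypothesis.

    Sort p-values ascending; the i-th smallest (0-based) is compared
    against alpha / (m - i). Stop at the first non-significant one;
    everything after it in the ordering is retained as well.
    """
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    for rank, idx in enumerate(order):
        if p_values[idx] <= alpha / (m - rank):
            reject[idx] = True
        else:
            break  # step-down: all larger p-values are also retained
    return reject

# Hypothetical per-bin p-values from blockage edge-detection tests
p = [0.001, 0.30, 0.012, 0.65, 0.004]
flags = holm_reject(p, alpha=0.05)
```

Compared with a plain Bonferroni correction (every p-value against alpha/m), the step-down thresholds make Holm's procedure uniformly more powerful while keeping the same error guarantee.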


    Engineer the Channel and Adapt to it: Enabling Wireless Intra-Chip Communication

    Ubiquitous multicore processors nowadays rely on an integrated packet-switched network for cores to exchange and share data. The performance of these intra-chip networks is a key determinant of the processor speed and, at high core counts, becomes an important bottleneck due to scalability issues. To address this, several works propose the use of mm-wave wireless interconnects for intra-chip communication and demonstrate that, thanks to their low-latency broadcast and system-level flexibility, this new paradigm could break the scalability barriers of current multicore architectures. However, these same works assume 10+ Gb/s speeds and efficiencies close to 1 pJ/bit without a proper understanding of the wireless intra-chip channel. This paper first demonstrates that such assumptions do not hold in the context of commercial chips by evaluating their losses and dispersion. Then, we leverage the system's monolithic nature to engineer the channel, that is, to optimize its frequency response by carefully choosing the chip package dimensions. Finally, we exploit the static nature of the channel to adapt to it, pushing efficiency-speed limits with simple tweaks at the physical layer. Our methods reduce the path loss and delay spread of a simulated commercial chip by 47 dB and 7.3x, respectively, enabling intra-chip wireless communications over 10 Gb/s and only 3.1 dB away from the dispersion-free case.
    Comment: 12 pages, 10 figures. IEEE Transactions on Communications Journal, 202
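The 7.3x dispersion reduction cited above refers to delay spread, conventionally summarized as the RMS delay spread of the channel's power delay profile. A minimal sketch of that metric; the four-tap profile below is illustrative, not the paper's simulated channel:

```python
import math

def rms_delay_spread(delays_ns, powers_lin):
    """RMS delay spread of a power delay profile.

    delays_ns: tap delays in nanoseconds.
    powers_lin: tap powers in linear units (not dB).
    Returns the power-weighted standard deviation of the delays.
    """
    total = sum(powers_lin)
    mean_delay = sum(t * p for t, p in zip(delays_ns, powers_lin)) / total
    second_moment = sum(t * t * p for t, p in zip(delays_ns, powers_lin)) / total
    return math.sqrt(second_moment - mean_delay ** 2)

# Illustrative 4-tap profile: delay (ns), linear power
taps_t = [0.0, 0.5, 1.0, 2.0]
taps_p = [1.0, 0.5, 0.25, 0.1]
spread = rms_delay_spread(taps_t, taps_p)
```

A smaller RMS delay spread widens the channel's coherence bandwidth, which is why reducing it lets the authors push toward 10+ Gb/s without heavy equalization.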


    Exploration of intercell wireless millimeter-wave communication in the landscape of intelligent metasurfaces

    Software-defined metasurfaces are electromagnetically ultra-thin, artificial components that can provide engineered and externally controllable functionalities. The control over these functionalities is enabled by the metasurface tunability, which is implemented by embedded electronic circuits that locally modify the surface resistance and reactance. Integrating controllers within the metasurface enables them to intercommunicate and adaptively reconfigure, thus imparting a desired electromagnetic operation, and opens the path towards the creation of an artificially intelligent (AI) fabric where each unit cell can have its own sensing, programmable computing, and actuation facilities. In this work we take a crucial step towards bringing the AI metasurface technology to emerging applications, in particular exploring the wireless mm-wave intercell communication capabilities in a software-defined HyperSurface designed for operation in the microwave regime. We examine three different wireless communication channels within the landscape of the reflective metasurface: firstly, in the layer where the control electronics of the HyperSurface lie; secondly, inside a dedicated layer enclosed between two metallic plates; and thirdly, inside the metasurface itself. For each case we examine the physical implementation of the mm-wave transceiver nodes, quantify communication channel metrics, and identify complexity vs. performance trade-offs.
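Channel metrics like those quantified above are usually referenced against free-space path loss, FSPL(dB) = 20·log10(4πdf/c). A minimal sketch; the 60 GHz carrier and millimeter-scale hop distance are illustrative assumptions, not values from the paper:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / C)

# Illustrative intercell hop: 5 mm at 60 GHz
loss = fspl_db(0.005, 60e9)
```

Measured in-layer losses typically exceed this free-space baseline, and the gap is one way to quantify how much a confined channel (e.g. between two metallic plates) attenuates or guides the signal.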

    A Map-Free Indoor Localization Method Using Ultrawideband Large-Scale Array Systems


    Proactive Received Power Prediction Using Machine Learning and Depth Images for mmWave Networks

    This study demonstrates the feasibility of proactive received power prediction by leveraging spatiotemporal visual sensing information toward reliable millimeter-wave (mmWave) networks. Since the received power on a mmWave link can attenuate aperiodically due to human blockage, a long-term series of the future received power cannot be predicted by analyzing the received signals before the blockage occurs. We propose a novel mechanism that predicts a time series of the received power from the next moment up to several hundred milliseconds ahead. The key idea is to leverage camera imagery and machine learning (ML). Time-sequential images can capture the spatial geometry and mobility of the obstacles that shape the mmWave signal propagation. ML is used to build the prediction model from a dataset of sequential images, each labeled with the received power several hundred milliseconds after the image was obtained. Simulation and experimental evaluations using IEEE 802.11ad devices and a depth camera show that the proposed mechanism, employing a convolutional LSTM, predicted a time series of the received power up to 500 ms ahead at an inference time of less than 3 ms with a root-mean-square error of 3.5 dB.
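The supervision scheme described, labeling each image with the received power several hundred milliseconds ahead, amounts to shifting the power series against the image series. A minimal sketch of building such (image, future-power) training pairs; the sample rate, lookahead, and power trace are hypothetical, not taken from the paper:

```python
def make_lookahead_pairs(frames, power_db, horizon_steps):
    """Pair each frame with the received power `horizon_steps` samples later.

    Assumes frames and power_db are sampled on the same clock, e.g.
    horizon_steps = 5 at 100 ms/sample gives a 500 ms lookahead.
    Frames whose label would fall past the end of the trace are dropped.
    """
    n = len(frames) - horizon_steps
    return [(frames[i], power_db[i + horizon_steps]) for i in range(n)]

# Hypothetical 8-sample trace: frame ids and received power (dB)
frames = list(range(8))
power = [-60, -61, -60, -75, -76, -62, -60, -61]  # mid-trace dip = blockage
pairs = make_lookahead_pairs(frames, power, horizon_steps=3)
```

With this alignment, frames captured while a pedestrian is still approaching the link are labeled with the attenuated power of the imminent blockage, which is exactly what lets the model predict the drop before it happens.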