
    TS-MUWSN: Time synchronization for mobile underwater sensor networks

    Time synchronization is an important, yet challenging, problem in underwater sensor networks (UWSNs). The challenge can be attributed to: 1) message timestamping; 2) node mobility; and 3) the Doppler scale effect. To mitigate these problems, we present an acoustic-based time-synchronization algorithm for UWSNs, in which we compare several message time-stamping algorithms as well as different Doppler scale estimators. The synchronization system is based on a bidirectional message exchange between a reference node and a slave node that has to be synchronized. As a reference, we take the DA-Sync-like protocol (Liu et al., 2014), which accounts for node movement by using first-order kinematic equations that refine the accuracy of the Doppler scale factor estimation and result in better synchronization performance. In our study, we propose to modify both the time-stamping and the Doppler scale estimation procedures. Besides simulations, we also perform real tests over a controlled underwater channel in a water test tank and a shallow-water test in the Mediterranean Sea. Peer reviewed. Postprint (author's final draft).
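
    As a rough illustration of the bidirectional (two-way) exchange underlying DA-Sync-like protocols, the Python sketch below estimates the slave clock's skew and offset from a set of request/reply timestamps, with an optional Doppler scale compensation step. The timestamp model, the least-squares fit, and the Doppler correction shown here are simplifying assumptions for illustration, not the exact procedure proposed in the paper.

    # Minimal sketch (assumption-laden): each two-way exchange yields the tuple
    # (t1, t2, t3, t4) = (master send, slave receive, slave send, master receive).
    import numpy as np

    def estimate_skew_offset(t1, t2, t3, t4, doppler_scale=None):
        """Estimate slave clock skew and offset from N bidirectional exchanges."""
        t1, t2, t3, t4 = map(np.asarray, (t1, t2, t3, t4))
        if doppler_scale is not None:
            # Illustrative Doppler compensation: undo the per-exchange scale
            # factor on the slave-side timestamps before fitting.
            t2 = t2 / np.asarray(doppler_scale)
            t3 = t3 / np.asarray(doppler_scale)
        master_mid = (t1 + t4) / 2.0          # exchange midpoint, master clock
        slave_mid = (t2 + t3) / 2.0           # exchange midpoint, slave clock
        # Least-squares fit: slave_mid ~= skew * master_mid + offset.
        skew, offset = np.polyfit(master_mid, slave_mid, 1)
        return skew, offset

    # Synthetic example: slave runs 50 ppm fast with a 0.2 s offset,
    # one-way acoustic propagation delay of 50 ms, 10 ms turnaround time.
    t1 = np.arange(0.0, 10.0, 1.0)
    t2 = (t1 + 0.05) * 1.00005 + 0.2
    t3 = t2 + 0.01
    t4 = (t3 - 0.2) / 1.00005 + 0.05
    print(estimate_skew_offset(t1, t2, t3, t4))   # approx. (1.00005, 0.2)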

    Channel Dynamics and SNR Tracking in Millimeter Wave Cellular Systems

    The millimeter wave (mmWave) frequencies are likely to play a significant role in fifth-generation (5G) cellular systems. A key challenge in developing systems in these bands is the potential for rapid channel dynamics: since mmWave signals are blocked by many materials, small changes in the position or orientation of the handset relative to objects in the environment can cause large swings in the channel quality. This paper addresses the issue of tracking the signal-to-noise ratio (SNR), which is an essential procedure for rate prediction, handover, and radio link failure detection. A simple method for estimating the SNR from periodic synchronization signals is considered. The method is then evaluated using real experiments in common blockage scenarios combined with outdoor statistical models.
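
    To make the idea concrete, here is a rough Python sketch of estimating the SNR from a known periodic synchronization (pilot) burst and smoothing the per-burst estimates over time. The least-squares channel-gain estimator and the exponential smoothing constant are assumptions chosen for illustration, not the specific method evaluated in the paper.

    import numpy as np

    def snr_from_sync_burst(rx, pilot):
        """One-shot SNR estimate from a received synchronization burst.

        rx: received complex samples, time-aligned with the known pilot.
        pilot: known transmitted synchronization sequence (complex).
        """
        # Least-squares estimate of a flat channel gain over the burst.
        h = np.vdot(pilot, rx) / np.vdot(pilot, pilot)
        noise = rx - h * pilot
        signal_power = np.abs(h) ** 2 * np.mean(np.abs(pilot) ** 2)
        noise_power = np.mean(np.abs(noise) ** 2)
        return signal_power / noise_power

    def track_snr_db(per_burst_snr, alpha=0.1):
        """Exponentially smooth per-burst SNR estimates (reported in dB)."""
        tracked, state = [], None
        for snr in per_burst_snr:
            snr_db = 10.0 * np.log10(snr)
            state = snr_db if state is None else (1 - alpha) * state + alpha * snr_db
            tracked.append(state)
        return tracked

    # Example: a unit-power pilot received through a gain-10 channel in unit-power noise.
    rng = np.random.default_rng(0)
    pilot = np.exp(2j * np.pi * rng.random(64))
    rx = 10.0 * pilot + (rng.standard_normal(64) + 1j * rng.standard_normal(64)) / np.sqrt(2)
    print(10 * np.log10(snr_from_sync_burst(rx, pilot)))   # approx. 20 dB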

    Efficient DSP and Circuit Architectures for Massive MIMO: State-of-the-Art and Future Directions

    Massive MIMO is a compelling wireless access concept that relies on the use of an excess number of base-station antennas relative to the number of active terminals. This technology is a main component of 5G New Radio (NR) and addresses all important requirements of future wireless standards: a large capacity increase, support for many simultaneous users, and improved energy efficiency. Massive MIMO requires the simultaneous processing of signals from many antenna chains and computational operations on large matrices. In the past, the complexity of this digital processing was viewed as a fundamental obstacle to the feasibility of Massive MIMO. Recent advances in system-algorithm-hardware co-design have led to extremely energy-efficient implementations. These exploit opportunities in deeply scaled silicon technologies and perform partly distributed processing to cope with the bottlenecks encountered in interconnecting many signals. For example, prototype ASIC implementations have demonstrated zero-forcing precoding in real time at a power consumption of 55 mW (20 MHz bandwidth, 128 antennas, multiplexing of 8 terminals). Coarse and even error-prone digital processing in the antenna paths permits a reduction in consumption by a factor of 2 to 5. This article summarizes the fundamental technical contributions to efficient digital signal processing for Massive MIMO. The opportunities and constraints of operating with low-complexity RF and analog hardware chains are clarified. It is illustrated how terminals can benefit from improved energy efficiency. The status of the technology and of real-life prototypes is discussed, and open challenges and directions for future research are suggested. Comment: submitted to IEEE Transactions on Signal Processing.
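
    For context, the zero-forcing precoding mentioned above amounts to applying the right pseudo-inverse of the downlink channel matrix so that inter-user interference is nulled. The short Python sketch below shows the matrix operation for the cited 128-antenna, 8-terminal configuration; the power normalization and the i.i.d. Rayleigh test channel are illustrative assumptions, not the ASIC's actual fixed-point implementation.

    import numpy as np

    def zf_precoder(H):
        """Zero-forcing precoder for a downlink channel H (K terminals x M antennas)."""
        # Right pseudo-inverse: W = H^H (H H^H)^-1, so that H @ W is a (scaled) identity.
        W = H.conj().T @ np.linalg.inv(H @ H.conj().T)
        # Normalize to unit total transmit power (one common convention).
        return W / np.linalg.norm(W, "fro")

    # Example: M = 128 base-station antennas serving K = 8 single-antenna terminals.
    rng = np.random.default_rng(1)
    M, K = 128, 8
    H = (rng.standard_normal((K, M)) + 1j * rng.standard_normal((K, M))) / np.sqrt(2)
    W = zf_precoder(H)
    print(np.round(np.abs(H @ W), 6))   # diagonal matrix: inter-terminal interference is nulled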

    milliProxy: a TCP Proxy Architecture for 5G mmWave Cellular Systems

    TCP is the most widely used transport protocol in the Internet. However, it offers suboptimal performance when operating over high-bandwidth mmWave links. The main issues introduced by communication at such high frequencies are (i) the sensitivity to blockage and (ii) the large bandwidth fluctuations due to Line of Sight (LOS) to Non Line of Sight (NLOS) transitions and vice versa. In particular, TCP has an abstract view of the end-to-end connection that does not properly capture the dynamics of the wireless mmWave link. The consequence is a suboptimal utilization of the available resources. In this paper, we propose a TCP proxy architecture that improves the performance of TCP flows without any modification at the remote sender side. The proxy is installed in the Radio Access Network and exploits information available at the gNB in order to maximize throughput and minimize latency. Comment: 7 pages, 6 figures, 2 tables, presented at the 2017 51st Asilomar Conference on Signals, Systems and Computers, Pacific Grove, CA, 2017.
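
    The core idea, keeping the amount of data in flight matched to what the mmWave link can currently carry, can be sketched as a window clamp computed at the proxy from gNB-side rate information. The Python function below is a hypothetical illustration of such a flow-window computation under a bandwidth-delay-product rule; the function name and parameters are assumptions, not milliProxy's actual interface or algorithm.

    def advertised_window_bytes(link_rate_bps, rtt_s, mss=1460, headroom=1.1):
        """Window (in bytes) to advertise toward the remote sender for one flow.

        link_rate_bps: current physical-layer rate reported by the gNB scheduler.
        rtt_s: round-trip time estimate for the end-to-end path.
        headroom: small margin above the bandwidth-delay product (BDP).
        """
        bdp_bytes = link_rate_bps / 8.0 * rtt_s
        window = int(bdp_bytes * headroom)
        return max(mss, (window // mss) * mss)   # round down to whole MSS-sized segments

    # LOS at 1 Gb/s vs. an NLOS transition down to 100 Mb/s, both with 20 ms RTT:
    print(advertised_window_bytes(1e9, 0.02))    # roughly 2.7 MB keeps the pipe full
    print(advertised_window_bytes(100e6, 0.02))  # roughly 275 kB avoids excess queuing delay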