
    Techniques to Improve the Efficiency of Data Transmission in Cable Networks

    Cable television (CATV) networks, since their introduction in the late 1940s, have become a crucial part of the broadcasting industry. To keep up with growing demands from subscribers, cable networks nowadays not only provide television programs but also deliver two-way interactive services such as telephone, high-speed Internet and social TV features. A new standard for CATV networks is released every five to six years to satisfy the growing demands of the mass market. From this perspective, this thesis is concerned with three main aspects of the continuing development of cable networks: (i) efficient implementations of backward-compatibility functions from the old standard, (ii) addressing and providing solutions for technically challenging issues in the current standard, and (iii) looking for prospective features that can be implemented in the future standard.

    Since 1997, five different versions of the digital CATV standard have been released in North America. A new standard often contains major improvements over the previous one. The latest version of the standard, DOCSIS 3.1 (released in late 2013), is packed with state-of-the-art technologies and supports approximately ten times the traffic of the previous standard, DOCSIS 3.0 (released in 2008).

    Backward compatibility is a must-have function for cable networks. In particular, to facilitate the migration from older standards to a newer one, the backward-compatibility functions of the old standards must remain in newer-standard products. More importantly, to keep the implementation cost low, these inherited functions must be redesigned to take advantage of the latest technology and algorithms. To improve the backward-compatibility functions, the first contribution of the thesis focuses on redesigning the pulse-shaping filter by exploiting infinite impulse response (IIR) filter structures as an alternative to the conventional finite impulse response (FIR) structures. Comprehensive comparisons show that more economical filters with better performance can be obtained by the proposed design algorithm, which considers a hybrid parameterization of the filter's transfer function in combination with a constraint that keeps the pole radii below 1, thereby guaranteeing stability.

    The second contribution of the thesis is a new fractional timing estimation algorithm based on peak detection by log-domain interpolation. Compared with the commonly used timing detection method based on parabolic interpolation, the proposed algorithm yields more accurate estimates at a comparable implementation cost.

    The third contribution of the thesis is a technique to estimate the multipath channel for DOCSIS 3.1 cable networks. DOCSIS 3.1 is markedly different from prior generations of CATV networks in that OFDM/OFDMA is employed to create a spectrally efficient signal. In order to demodulate such a signal effectively, the demodulation circuit must estimate and track the multipath channel, and the estimation and tracking must be highly accurate because extremely dense constellations such as 4096-QAM, and possibly 16384-QAM, can be used in DOCSIS 3.1. The conventional OFDM channel estimators available in the literature either do not perform satisfactorily or are not suitable for the DOCSIS 3.1 channel. The novel channel estimation technique proposed in this thesis iteratively searches for the parameters of the channel paths.
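    As a rough sketch of this kind of iterative path-parameter search (not the specific algorithm developed in the thesis), the Python snippet below greedily fits a small number of discrete paths to a least-squares frequency-domain channel estimate; the function name, the matching-pursuit-style peeling strategy, and all parameters are illustrative assumptions.

```python
import numpy as np

def greedy_path_search(H_ls, n_paths=3, oversample=8):
    """Greedy (matching-pursuit-style) search for multipath delays and gains
    from a least-squares frequency-domain channel estimate H_ls (one complex
    value per subcarrier).  Illustrative sketch only."""
    N = len(H_ls)
    k = np.arange(N)
    delays = np.arange(0, N, 1.0 / oversample)              # candidate delays (fractional samples)
    basis = np.exp(-2j * np.pi * np.outer(delays, k) / N)   # frequency signature of each delay
    residual = np.asarray(H_ls, dtype=complex).copy()
    paths = []
    for _ in range(n_paths):
        corr = basis.conj() @ residual / N                  # projection of residual on each delay
        best = int(np.argmax(np.abs(corr)))
        tau, gain = delays[best], corr[best]
        residual = residual - gain * basis[best]            # peel off the strongest path
        paths.append((tau, gain))
    return paths, residual

# Toy usage: recover a two-path channel observed on N = 64 subcarriers.
N = 64
k = np.arange(N)
true_paths = [(0.0, 1.0 + 0j), (5.25, 0.4 * np.exp(0.7j))]
H = sum(g * np.exp(-2j * np.pi * k * t / N) for t, g in true_paths)
H_noisy = H + 0.01 * (np.random.randn(N) + 1j * np.random.randn(N))
estimated, _ = greedy_path_search(H_noisy, n_paths=2)
print([(round(t, 2), round(abs(g), 2)) for t, g in estimated])
```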
    The proposed technique not only substantially enhances the channel estimation accuracy, but can also, at no additional cost, accurately identify the delay of each echo in the system. This echo delay information is valuable for proactive maintenance of the network.

    The fourth contribution of this thesis is a novel scheme that allows OFDM transmission without the use of a cyclic prefix (CP). The OFDM structure in the current DOCSIS 3.1 does not achieve the maximum throughput if the channel has multipath components. The multipath channel causes inter-symbol interference (ISI), which is commonly mitigated by employing a CP. The CP acts as a guard interval that, while successfully protecting the signal from ISI, reduces the transmission throughput. The problem becomes more severe in the downstream direction, where the throughput of the entire system is determined by the user with the worst channel. To solve this problem, the thesis proposes major alterations to the current DOCSIS 3.1 OFDM/OFDMA structure. The alterations involve using a pair of Nyquist filters at the transceivers and an efficient time-domain equalizer (TEQ) at the receiver to reduce the ISI to a negligible level without the need for a CP. Simulation results demonstrate that, by incorporating the proposed alterations into the DOCSIS 3.1 downlink channel, the system can achieve the maximum throughput over a wide range of multipath channel conditions.
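    To make the role of the TEQ concrete, the sketch below designs a generic least-squares time-domain equalizer that drives the combined channel-plus-equalizer response toward a single impulse, so the residual ISI becomes negligible without a guard interval; the tap count, delay choice, and example channel are illustrative assumptions and do not reproduce the design proposed in the thesis.

```python
import numpy as np

def ls_teq(h, n_taps=32, delay=None):
    """Least-squares time-domain equalizer: choose FIR taps w so that the
    combined response conv(h, w) approximates a single delayed impulse,
    i.e. the residual ISI is driven toward zero.  Generic illustration only."""
    h = np.asarray(h, dtype=complex)
    L = len(h) + n_taps - 1
    # Convolution matrix: (C @ w)[n] = sum_k h[n - k] * w[k].
    C = np.zeros((L, n_taps), dtype=complex)
    for k in range(n_taps):
        C[k:k + len(h), k] = h
    if delay is None:
        delay = (L - 1) // 2
    target = np.zeros(L, dtype=complex)
    target[delay] = 1.0                                   # desired single-impulse response
    w, *_ = np.linalg.lstsq(C, target, rcond=None)
    return w, delay

# Example: a two-echo, cable-like channel.
h = np.array([1.0, 0.0, 0.25, 0.0, 0.0, 0.1], dtype=complex)
w, d = ls_teq(h, n_taps=32)
combined = np.convolve(h, w)
isi = np.sum(np.abs(combined) ** 2) - np.abs(combined[d]) ** 2
print(f"residual ISI power relative to main tap: {isi / np.abs(combined[d]) ** 2:.2e}")
```

    A practical TEQ is often designed merely to shorten the effective channel rather than to invert it fully; the full least-squares inversion above is simply the most compact way to show ISI being suppressed without a CP.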

    Real-Time Quantum Noise Suppression In Very Low-Dose Fluoroscopy

    Fluoroscopy provides real-time X-ray screening of a patient's organs and of various radiopaque objects, which makes it an invaluable tool for many interventional procedures. For this reason, the number of fluoroscopy screenings has grown consistently over the last decades. However, this trend has raised many concerns about the increase in X-ray exposure, as even low-dose procedures turned out not to be as safe as previously assumed, demanding rigorous monitoring of the X-ray dose delivered to patients and to the exposed medical staff. In this context, the use of very low-dose protocols would be extremely beneficial. Nonetheless, such protocols produce very noisy images, which must be suitably denoised in real time to support interventional procedures. Simple smoothing filters tend to produce blurring effects that undermine the visibility of object boundaries, which is essential for the human eye to understand the imaged scene. Therefore, some denoising strategies embed criteria based on noise statistics to improve their denoising performance.

    This dissertation focuses on the Noise Variance Conditioned Average (NVCA) algorithm, which exploits a priori knowledge of quantum noise statistics to perform noise reduction while preserving edges. It has already outperformed many state-of-the-art methods in the denoising of images corrupted by quantum noise, while also being suitable for real-time hardware implementation.

    Several issues that currently limit the adoption of very low-dose protocols in clinical practice are addressed, e.g. the evaluation of the actual performance of denoising algorithms in very low-dose conditions, the optimization of tuning parameters to obtain the best denoising performance, the design of an index to properly measure the quality of X-ray images, and the assessment of an a priori noise characterization approach to account for time-varying noise statistics due to changes in X-ray tube settings. An improved NVCA algorithm is also presented, along with its real-time hardware implementation on a Field Programmable Gate Array (FPGA). The novel algorithm provides more effective noise reduction, including for low-contrast moving objects, thus relaxing the trade-off between noise reduction and edge preservation, while further reducing hardware complexity so that few logic resources are needed even on small FPGA platforms.

    The results presented in this dissertation provide the means for future studies aimed at embedding the NVCA algorithm in commercial fluoroscopic devices to accomplish real-time denoising of very low-dose X-ray images, which would foster their adoption in clinical practice.
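    As a simplified illustration of the underlying idea (selective averaging gated by an estimate of the quantum-noise standard deviation), the sketch below averages only those neighboring pixels that lie within a few noise standard deviations of the center pixel, so flat regions are smoothed while strong edges are left untouched; it is not the published NVCA algorithm or its FPGA implementation, and the Poisson noise model, gain parameter and threshold are assumptions.

```python
import numpy as np

def noise_gated_average(img, gain=1.0, k=2.0, radius=2):
    """Selective neighborhood averaging gated by a quantum-noise threshold.
    For Poisson-dominated (quantum) noise the variance scales with the signal,
    so the local standard deviation is modeled as sqrt(gain * pixel value).
    Only neighbors within k standard deviations of the center pixel enter the
    average.  Simplified illustration only -- not the exact NVCA algorithm."""
    img = img.astype(np.float64)
    padded = np.pad(img, radius, mode="reflect")
    out = np.empty_like(img)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            center = img[i, j]
            window = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            sigma = np.sqrt(max(gain * center, 1e-6))       # quantum-noise std estimate
            mask = np.abs(window - center) <= k * sigma     # keep only similar pixels
            out[i, j] = window[mask].mean()
    return out

# Toy usage: a noisy step edge (Poisson noise, as in very low-dose X-ray images).
rng = np.random.default_rng(0)
clean = np.full((64, 64), 20.0)
clean[:, 32:] = 60.0
noisy = rng.poisson(clean).astype(np.float64)
denoised = noise_gated_average(noisy, gain=1.0, k=2.0, radius=2)
print("noise std before/after:", noisy[:, :30].std().round(2), denoised[:, :30].std().round(2))
```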

    System- and Data-Driven Methods and Algorithms

    The increasing complexity of models used to predict real-world systems creates a need for algorithms that replace complex models with far simpler ones while preserving the accuracy of the predictions. This two-volume handbook covers methods as well as applications. This first volume focuses on real-time control theory, data assimilation, real-time visualization, high-dimensional state spaces, and the interaction of different reduction techniques.