2 research outputs found

    An electrical-level superposed-edge approach to statistical serial link simulation

    Brute-force simulation approaches to estimating serial-link bit-error rates (BERs) become computationally intractable when BERs are low and the interconnect electrical response is slow enough to generate intersymbol interference spanning dozens of bit periods. Electrical-level statistical simulation approaches based on superposing pulse responses were developed to address this problem, but such pulse-based methods have difficulty analyzing jitter and rise/fall asymmetry. In this paper we present a superposed-edge approach to statistical simulation, as edge-based methods handle rise/fall asymmetry and jitter in a straightforward way. We also resolve a key obstacle to using edge-based approaches, namely that edges are always correlated, by deriving an efficient inductive method for propagating the edge correlations. Examples are presented demonstrating the edge-based method's accuracy and effectiveness in analyzing combinations of uniform, Gaussian, and periodic random jitter distributions.
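As context for the pulse-based statistical methods this abstract contrasts with, here is a minimal sketch of the superposition idea (illustrative only, not the paper's edge-based algorithm; the cursor values are hypothetical): with equiprobable bits, each ISI cursor of the pulse response contributes 0 or h_k with probability 1/2, and convolving these two-point distributions yields the voltage PDF at the sampling instant without any bit-by-bit simulation.

```python
import numpy as np

def isi_pdf(cursors, v_axis):
    """Probability density of the ISI voltage at the sampling instant,
    built by convolving the two-point distribution of each pulse-response
    cursor (each interfering bit is 0 or 1 with probability 1/2)."""
    dv = v_axis[1] - v_axis[0]
    pdf = np.zeros_like(v_axis)
    pdf[np.argmin(np.abs(v_axis))] = 1.0          # start: unit mass at 0 V
    for h in cursors:
        kernel = np.zeros_like(v_axis)
        kernel[np.argmin(np.abs(v_axis))] += 0.5  # bit = 0 contributes 0 V
        kernel[np.argmin(np.abs(v_axis - h))] += 0.5  # bit = 1 contributes h
        pdf = np.convolve(pdf, kernel, mode="same")
    return pdf / (pdf.sum() * dv)                 # normalize to a density

# Hypothetical pre/post-cursor ISI samples of a pulse response (volts)
cursors = [0.05, -0.03, 0.02, -0.01]
v = np.linspace(-0.5, 0.5, 2001)                  # v = 0 at the center bin
pdf = isi_pdf(cursors, v)
```

Because the distribution is computed exactly rather than sampled, tail probabilities corresponding to very low BERs are reachable; the paper's point is that this pulse-based picture breaks down when jitter and rise/fall asymmetry must be modeled, which motivates the edge-based formulation.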

    High-speed Channel Analysis and Design using Polynomial Chaos Theory and Machine Learning

    With the exponential increase in the data rates of high-speed serial channels, their efficient and accurate analysis and design have become crucially important. Signal integrity analysis of these channels is often performed via eye-diagram analysis, which reveals the jitter and noise of the channel. Conventional methods for this type of analysis are either exorbitantly time- and memory-consuming or applicable only to linear time-invariant (LTI) systems. On the other hand, recent advancements in numerical methods and machine learning have shown great potential for the analysis and design of high-speed electronics. Therefore, in this dissertation we introduce two novel approaches for efficient eye analysis based on machine learning and numerical techniques. These methods focus on data-dependent jitter and noise and on intersymbol interference. In the first approach, a complete surrogate model of the channel is trained using a short transient simulation. This model is based on Polynomial Chaos theory. It can directly and quickly provide the distribution of the jitter and other statistics of the eye diagram, and in addition it provides an estimate of the full eye diagram. The second analysis method offers faster analysis when the goal is to find the worst-case eye width, eye height, and inner eye opening, i.e., the values that conventional eye analysis would reach if its transient simulation were continued for an arbitrarily long time. The proposed approach quickly finds the data patterns that result in the worst signal integrity, and hence in the closest eye. This method is based on Bayesian optimization. Although the majority of the contributions of this dissertation concern analysis, for the sake of completeness the final portion of this work is dedicated to the design of high-speed channels with machine learning, since the interference and complex interactions in modern channels have made their design challenging and time-consuming as well.
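To make the Polynomial Chaos idea concrete, here is a generic sketch (not the dissertation's channel surrogate; the response function and coefficients are hypothetical): a quantity of interest driven by a standard Gaussian germ is expanded in probabilists' Hermite polynomials, the coefficients are fit by least squares from a handful of samples, and the mean and variance then follow in closed form from the coefficients.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander  # probabilists' Hermite He_k
from math import factorial

rng = np.random.default_rng(0)
xi = rng.standard_normal(2000)                # standard Gaussian germ samples
# Hypothetical smooth response (e.g., a jitter-related eye statistic):
y = 0.3 + 0.1 * xi + 0.05 * (xi**2 - 1)       # exactly He_0, He_1, He_2 terms

order = 3
Phi = hermevander(xi, order)                  # columns: He_0(xi) .. He_3(xi)
c, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # regression-based PC coefficients

# Statistics fall out of the expansion by Hermite orthogonality:
mean = c[0]                                   # E[y] = c_0
var = sum(c[k]**2 * factorial(k) for k in range(1, order + 1))
```

Once the coefficients are known, distributions and moments come from the cheap surrogate instead of long transient simulation, which is the source of the speedups the abstract reports.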
The proposed design approach focuses on the inverse design of a CTLE: given the desired eye height and eye width, the algorithm finds the corresponding peaking and DC gain of the CTLE. This approach is based on invertible neural networks, whose main advantage is the ability to provide multiple solutions when the answer to the inverse problem is not unique. Numerical examples are provided to evaluate the efficiency and accuracy of the proposed approaches. The results show up to 11.5X speedup for direct estimation of the jitter distribution using the PC surrogate-model approach. In addition, up to 23X speedup is achieved using the worst-case eye analysis approach, and the inverse design of the CTLE shows promising results.
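As a rough illustration of what "worst-case eye" means in the LTI setting (a textbook pulse-response bound with hypothetical cursor values, not the dissertation's Bayesian-optimization search): for ±1 NRZ signaling over an LTI channel, the eye-closing data pattern sets each interfering bit to oppose the sign of its ISI cursor, so every ISI term subtracts from the main cursor and the worst-case inner eye opening follows directly.

```python
import numpy as np

def worst_case_eye(main_cursor, isi_cursors):
    """Worst-case sampled '1' level and inner eye height for +/-1 NRZ
    over an LTI channel: the adversarial bit pattern opposes the sign
    of each ISI cursor, so all ISI subtracts in magnitude."""
    isi = np.asarray(isi_cursors, dtype=float)
    worst_bits = -np.sign(isi)                 # the eye-closing data pattern
    worst_one = main_cursor + worst_bits @ isi  # = h0 - sum(|h_k|)
    return worst_one, 2.0 * worst_one          # worst '1' level, eye height

# Hypothetical cursors: 0.8 V main cursor, ISI tail of a lossy channel
level, eye_h = worst_case_eye(0.8, [0.12, -0.07, 0.04, -0.02])
```

For nonlinear or time-varying channels this closed-form pattern no longer applies, which is why the dissertation instead searches the pattern space with Bayesian optimization.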