3 research outputs found

    Clock Domain Crossing Fault Model and Coverage Metric for Validation of SoC Design

    Multiple asynchronous clock domains are increasingly employed in System-on-Chip (SoC) designs for different I/O interfaces. Functional validation is one of the most expensive tasks in the SoC design process, and simulation at the register transfer level (RTL) is still the most widely used method. It is therefore important to quantitatively measure validation confidence and progress for clock domain crossing (CDC) designs. In this paper, we propose an efficient method for defining CDC coverage that can be used in RTL simulation of a multi-clock domain SoC design. First, we develop a CDC fault model to represent the actual effect of metastability. Second, we use a temporal dataflow graph (TDFG) to propagate the CDC faults to observable variables. Finally, CDC coverage is defined based on the CDC faults and their observability. Our experiments on a commercial IP demonstrate that this method is useful for finding CDC errors early in the design cycle.
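
    As an illustration only, the sketch below shows one plausible way such an observability-based coverage figure could be computed: each injected CDC fault records whether its effect reached an observable variable, and coverage is the observed fraction. The CdcFault fields and the example crossings are hypothetical placeholders, not the paper's actual TDFG-based propagation.

```python
from dataclasses import dataclass

@dataclass
class CdcFault:
    crossing: str   # name of the clock-domain-crossing signal (hypothetical example)
    injected: bool  # fault was exercised during RTL simulation
    observed: bool  # fault effect propagated to an observable variable

def cdc_coverage(faults: list[CdcFault]) -> float:
    """Fraction of injected CDC faults whose effect became observable."""
    injected = [f for f in faults if f.injected]
    if not injected:
        return 0.0
    return sum(f.observed for f in injected) / len(injected)

if __name__ == "__main__":
    faults = [
        CdcFault("fifo_wptr_sync", injected=True, observed=True),
        CdcFault("irq_level_sync", injected=True, observed=False),
        CdcFault("rst_release_sync", injected=False, observed=False),
    ]
    print(f"CDC coverage: {cdc_coverage(faults):.0%}")  # -> 50%
```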

    Design and Verification of Clock Domain Crossing Interfaces

    The clock distribution network is an essential component in every synchronous digital system. Designing this network is becoming an increasingly sophisticated and difficult task due to the growing logic capacity of chips and the fact that the network has to reach every memory element on the chip. Multi-clock domain circuits with Clock Domain Crossing (CDC) interfaces are emerging as an alternative to circuits with a global clock. Designing CDC interfaces is challenging because of the difficulty of dealing with two possibly unrelated clock domains and the possibility of propagating metastability into the communicating blocks, which makes these interfaces hard to design and verify. In this work, we present a hybrid FIFO-asynchronous method for constructing robust CDC interfaces. This method avoids the shortcomings of previous interfaces and provides reliable transfer of data and control signals between different clock domains. A complete design is proposed, fully implemented in 90 nm TSMC CMOS technology, and simulated using SPICE. Extensive simulations confirmed the robustness of the interface at different temperatures, different workloads, and varying frequency ratios. The reported implementation provides a maximum throughput of 606 Mitems/s. Moreover, we also address the challenging task of verifying CDC interfaces. Most RTL simulation tools available today are incapable of simulating these interfaces. In this thesis, we present a framework for the formal verification of CDC interfaces. The framework explicitly models metastability by taking advantage of the unique features of probabilistic model checking, and it is applied to common CDC interfaces by verifying them with the PRISM model checker.
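
    The thesis models metastability probabilistically with PRISM; as background context only, the sketch below computes the classic mean-time-between-failures estimate for a flip-flop synchronizer, MTBF = exp(t_resolve / tau) / (t_w * f_clk * f_data). The numeric parameters are illustrative assumptions, not values from the thesis.

```python
import math

def synchronizer_mtbf(t_resolve: float, tau: float, t_w: float,
                      f_clk: float, f_data: float) -> float:
    """Classic synchronizer MTBF estimate in seconds:
    exp(t_resolve / tau) / (t_w * f_clk * f_data)."""
    return math.exp(t_resolve / tau) / (t_w * f_clk * f_data)

if __name__ == "__main__":
    # Illustrative device parameters (assumed, not from the thesis)
    mtbf = synchronizer_mtbf(
        t_resolve=2.0e-9,  # resolution time before the next capturing stage
        tau=50e-12,        # metastability resolution time constant
        t_w=100e-12,       # metastability window
        f_clk=500e6,       # receiving clock frequency
        f_data=100e6,      # data toggle rate at the crossing
    )
    print(f"Estimated MTBF: {mtbf:.3e} s")
```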

    Static verification tool improvement in ASIC design flow: Tool Evaluation using a real design

    Verification is now the most time-consuming step in the design flow for digital circuits. Design organizations are constantly researching improvements that accelerate verification tasks, so that functional and efficient silicon can be released to a demanding market and the company's competitive position improved. Today, EDA (Electronic Design Automation) tools are part of the development of every designed circuit and contribute to the verification work. Automating and simplifying the verification flow helps the team focus on resolving the underlying system issues. The company is interested in improving its verification flow to take advantage of the features available in the EDA market: recognizing more synchronization structures, an improved hierarchical verification flow, and an incremental verification flow could all improve verification throughput at the company. This study examines how changing the tool improves the company's static verification flow. The research reviews the background of EDA tools and the most relevant theory to understand the tool roles and the static rule checks they perform. In addition, the deployment of static verification tools is discussed, and clock-domain crossing and lint checking tools from an EDA toolkit are introduced. A modular inspection flow compatible with the server infrastructure is built for this purpose. The company's proprietary and problematic synchronization structures are implemented as an interface-level script, so that the inspection software recognizes the structures used as suitable for specific use cases. Design constraints are developed to improve the accuracy of the results. The study also measures the performance of the programs: the use of computing resources is measured in the company's design environment and compared to the current verification flow, the quantity and quality of the reported messages are compared to the old flow, and the user experience and correctness of the results are briefly assessed. In the study, the configured software checks are run on the company's own subsystem under development, and the suitability of the software for the organization's purposes is determined by examining the results. Flow performance, duration, and utilization of computational resources are measured with the generated script and software reports. In addition, the functionality and user-friendliness of the software's graphical user interface are briefly reviewed. The research finds that the clock-domain crossing verification flow is accelerated by more than three times and the lint verification flow by a quarter. The inspections detect almost three times more potential issues in the code. The new tool flow requires under a third of the disk space and system memory consumed by the old verification flow. It is also observed that the new software is overall more pleasant to use: it is perceived as somewhat more challenging to learn, but in return it provides more information for solving the underlying issues in the code. Finally, it is concluded that the company should consider adding the tool to complement its verification tool belt because of its performance, verification thoroughness, and more moderate use of compute server resources.
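
    For illustration, the small sketch below shows how such old-versus-new flow comparisons could be tabulated as improvement ratios. The metric names and numbers are placeholders chosen only to mirror the reported trends (more than 3x faster CDC checks, about a quarter faster lint, and under a third of the disk and memory use); they are not the study's measurements.

```python
def improvement_report(old: dict[str, float], new: dict[str, float]) -> None:
    """Print old/new ratios; a ratio above 1.0 means the new flow does better."""
    for metric in old:
        ratio = old[metric] / new[metric]
        print(f"{metric:>16}: {ratio:.1f}x better (old {old[metric]}, new {new[metric]})")

if __name__ == "__main__":
    # Placeholder measurements (assumed for illustration only)
    old = {"cdc_runtime_min": 90, "lint_runtime_min": 40, "disk_gb": 120, "mem_gb": 64}
    new = {"cdc_runtime_min": 28, "lint_runtime_min": 32, "disk_gb": 36,  "mem_gb": 20}
    improvement_report(old, new)
```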