    An Accurate Hybrid Delay Model for Multi-Input Gates

    Accurately modeling the delay of multi-input gates is challenging due to variations caused by switching different inputs in close temporal proximity. This paper introduces a hybrid model for a CMOS NOR gate, which is based on replacing transistors with time-variant resistors. We analytically solve the resulting non-constant-coefficient differential equations and derive expressions for the gate delays, which in turn enable an empirical parametrization procedure. By comparison with SPICE simulation data, we show that our model faithfully represents all relevant multi-input switching effects. Using an implementation in the Involution Tool, we also demonstrate that it surpasses the few alternative models known so far in terms of accuracy.
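    To illustrate the general idea of replacing a transistor with a time-variant resistor, the sketch below numerically integrates a first-order RC equation dV/dt = (V_dd - V) / (R(t) C) in which the resistance switches from a high OFF value to a low ON value at a given instant. This is a simplified, hypothetical illustration (function names, component values, and the abrupt switching waveform are all assumptions), not the paper's actual NOR-gate model or its analytic solution.

    ```python
    def simulate_rc(r_of_t, c, v_dd, t_end, dt=1e-13, v0=0.0):
        """Forward-Euler integration of dV/dt = (V_dd - V) / (R(t) * C).

        r_of_t -- time-variant resistance, a crude stand-in for a switching
                  transistor; c, v_dd, dt are illustrative values only.
        Returns a list of (t, V) samples.
        """
        v, t = v0, 0.0
        samples = [(t, v)]
        while t < t_end:
            v += dt * (v_dd - v) / (r_of_t(t) * c)
            t += dt
            samples.append((t, v))
        return samples

    # Hypothetical switching event: resistance drops from 1 MOhm to 1 kOhm
    # at t = 50 ps, mimicking an input transition that turns a transistor on.
    def r_switch(t, r_off=1e6, r_on=1e3, t_sw=50e-12):
        return r_off if t < t_sw else r_on

    trace = simulate_rc(r_switch, c=1e-15, v_dd=1.0, t_end=200e-12)

    # A simple delay metric: time at which the output crosses V_dd / 2.
    delay = next(t for t, v in trace if v >= 0.5)
    ```

    With these illustrative parameters, the output barely charges through the large OFF resistance and then rises with a ~1 ps time constant once the resistance switches, so the 50% crossing lands shortly after the 50 ps switching instant. Modeling multiple inputs would superpose several such resistance waveforms, which is what makes the resulting coefficients non-constant.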