
    On Regularity and Integrated DFM Metrics

    Transistor geometries are well into the nanometer regime, in keeping with Moore's Law. With this scaling in geometry, problems not significant at larger geometries have come to the fore. These problems, collectively termed variability, stem from second-order effects due to the small geometries themselves and from engineering limitations in creating those small geometries. The engineering obstacles have a few solutions which are yet to be widely adopted due to the cost of deploying them. Addressing and mitigating variability due to second-order effects comes largely under the purview of device engineers and, to a smaller extent, design practices. Passive layout measures that ease these manufacturing limitations by regularizing the different layout pitches have been explored in the past. However, the question of the best design practice to combat systematic variations is still open. In this work we explore considerations for regular layout using the exclusive-OR gate, the half-adder, and full-adder cells implemented with varying degrees of regularity. Tradeoffs such as complete interconnect unidirectionality and the inevitable introduction of vias are qualitatively analyzed, and some factors affecting the analysis are presented. Finally, results from Calibre Critical Feature Analysis (CFA) of the cells are used to evaluate the qualitative analysis.

    Comparison of finite difference and boundary integral solutions to three-dimensional spontaneous rupture

    The spontaneously propagating shear crack on a frictional interface has proven to be a useful idealization of a natural earthquake. The corresponding boundary value problems are nonlinear and usually require computationally intensive numerical methods for their solution. Assessing the convergence and accuracy of the numerical methods is challenging, as we lack appropriate analytical solutions for comparison. As a complement to other methods of assessment, we compare solutions obtained by two independent numerical methods, a finite difference method and a boundary integral (BI) method. The finite difference implementation, called DFM, uses a traction-at-split-node formulation of the fault discontinuity. The BI implementation employs spectral representation of the stress transfer functional. The three-dimensional (3-D) test problem involves spontaneous rupture spreading on a planar interface governed by linear slip-weakening friction that essentially defines a cohesive law. To get a priori understanding of the spatial resolution that would be required in this and similar problems, we review and combine some simple estimates of the cohesive zone sizes which correspond quite well to the sizes observed in simulations. We have assessed agreement between the methods in terms of the RMS differences in rupture time, final slip, and peak slip rate and related these to median and minimum measures of the cohesive zone resolution observed in the numerical solutions. The BI and DFM methods give virtually indistinguishable solutions to the 3-D spontaneous rupture test problem when their grid spacing Δx is small enough so that the solutions adequately resolve the cohesive zone, with at least three points for BI and at least five node points for DFM. 
Furthermore, grid-dependent differences in the results, for each of the two methods taken separately, decay as a power law in Δx, with the same convergence rate for each method, the calculations apparently converging to a common, grid-interval-invariant solution. This result provides strong evidence for the accuracy of both methods. In addition, the specific solution presented here, by virtue of being demonstrably grid-independent and consistent between two very different numerical methods, may prove useful for testing new numerical methods for spontaneous rupture problems.
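The reported power-law decay of grid-dependent differences can be checked by fitting the convergence exponent in log-log space. The sketch below uses hypothetical RMS-difference values (the actual DFM/BI measurements are not reproduced in the abstract); the function name and data are illustrative only:

```python
import math

def convergence_rate(dx, err):
    """Estimate the exponent p in err ~ C * dx**p by a
    least-squares fit of log(err) against log(dx)."""
    xs = [math.log(d) for d in dx]
    ys = [math.log(e) for e in err]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx  # slope of the log-log fit = convergence rate

# Hypothetical RMS rupture-time differences at successively refined grids;
# the error quarters each time dx halves, i.e. exactly second-order decay.
dx  = [100.0, 50.0, 25.0]    # grid spacing
err = [0.080, 0.020, 0.005]  # RMS difference between grid levels
print(convergence_rate(dx, err))  # → 2.0 (to floating-point precision)
```

A shared exponent fitted this way for both DFM and BI, together with agreement of the extrapolated limits, is the kind of evidence the abstract describes for convergence to a common solution.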

    IMPROVING OPERATIONAL REPORTING WITH ARTIFICIAL INTELLIGENCE

    Today, military analysts receive far more information than they can process in the time available for mission planning or decision-making. Operational demands have outpaced the analytical capacity of the Department of Defense. To address this problem, this work applies natural language processing to cluster reports based on the topics they contain, provides automatic text summarization, and demonstrates a prototype system that uses graph theory to visualize the results. The major findings reveal that the cosine similarity algorithm applied to vector-based models of documents produced statistically significant predictions of document similarity; that the Term Frequency-Inverse Document Frequency (TF-IDF) algorithm improved similarity performance and produced topic models usable as document summaries; and that a high degree of analytic efficiency was achieved using visualizations based on centrality measures and graph theory. From these results, one can see that clustering reports by semantic similarity offers substantial advantages over current analytical procedures, which rely on the manual reading of individual reports. On this basis, this thesis provides a prototype system to improve the utility of operational reporting, as well as an analytical framework that can assist in the development of future capabilities for military planning and decision-making. Major, United States Army. Approved for public release; distribution is unlimited.
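The similarity pipeline described above (TF-IDF weighting followed by cosine similarity over document vectors) can be sketched in a few lines. This is an illustrative reconstruction, not the thesis's actual implementation, and the sample reports are invented:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build sparse TF-IDF weight vectors for a list of tokenized documents."""
    n = len(docs)
    df = Counter()                     # document frequency per term
    for doc in docs:
        df.update(set(doc))
    vecs = []
    for doc in docs:
        tf = Counter(doc)
        vecs.append({t: (tf[t] / len(doc)) * math.log(n / df[t]) for t in tf})
    return vecs

def cosine(a, b):
    """Cosine similarity between two sparse term-weight vectors."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Invented sample reports: the first two describe the same event.
reports = [
    "enemy convoy moved north along route gold".split(),
    "convoy sighted moving north on route gold".split(),
    "supply depot resupplied fuel and ammunition".split(),
]
v = tfidf_vectors(reports)
print(cosine(v[0], v[1]) > cosine(v[0], v[2]))  # → True
```

Pairwise similarities computed this way can feed a graph whose edges connect reports above a similarity threshold; centrality measures over that graph then surface the clusters the abstract describes.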

    Layout regularity metric as a fast indicator of process variations

    Integrated circuit design faces increasing challenges as geometries scale down, due to growing sensitivity to process variations. Systematic variations induced by different steps in the lithography process affect both the parametric and functional yields of designs. These variations are themselves known to be affected by layout topologies. Design for Manufacturability (DFM) aims to define techniques that mitigate variations and improve yield. Layout regularity is one of the trending techniques suggested by DFM to mitigate the effect of process variations. There are several solutions for creating regular designs, such as restricted design rules and regular fabrics. These regular solutions raised the need for a regularity metric. Metrics in the literature are insufficient for different reasons: they are either qualitative or computationally intensive. Furthermore, there is no study relating either lithographic or electrical variations to layout regularity. In this work, layout regularity is studied in detail and a new geometry-based layout regularity metric is derived. This metric is verified against lithographic simulations and shows good correlation. Calculation of the metric takes only a few minutes on a 1mm x 1mm design, which is fast compared to the time taken by simulations. This makes it a good candidate for pre-processing layout data and selecting areas of interest for lithographic simulation, for faster throughput. The layout regularity metric is also compared against a model that measures electrical variations due to systematic lithographic variations, and the validity of using the regularity metric to flag circuits with high variability under that model is shown.
The regularity metric results match the electrical variability model results in up to 80% of cases, which means that this metric can be used as a fast indicator of designs more susceptible to lithographic, and hence electrical, variations.