717 research outputs found

    vPROBE: Variation aware post-silicon power/performance binning using embedded 3T1D cells

    In this paper, we present an on-die post-silicon binning methodology that takes into account the effect of static and dynamic variations and categorizes every processor by power/performance. The proposed scheme is built around discretization hardware that exploits the characteristic dependence of delay and leakage on the sources of variability to perform the categorization.
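    A minimal sketch of the binning step, assuming hypothetical delay/leakage cut-offs, units, and bin names (none are taken from the paper): each die's discretized critical-path delay and leakage measurements map to one of a few power/performance bins.

```python
# Hypothetical power/performance binning: a die is assigned a bin from its
# on-die measurements of critical-path delay and leakage. Thresholds, units,
# and bin names below are illustrative assumptions, not the paper's values.
from dataclasses import dataclass

@dataclass
class DieMeasurement:
    delay_ns: float    # discretized critical-path delay
    leakage_mw: float  # discretized leakage power

def assign_bin(m: DieMeasurement,
               delay_cut: float = 1.0,
               leak_cut: float = 50.0) -> str:
    """Map a (delay, leakage) pair to one of four power/performance bins."""
    fast = m.delay_ns <= delay_cut
    low_power = m.leakage_mw <= leak_cut
    if fast and low_power:
        return "premium"    # fast and efficient
    if fast:
        return "high-perf"  # fast but leaky
    if low_power:
        return "low-power"  # slow but efficient
    return "reject"         # slow and leaky

print(assign_bin(DieMeasurement(delay_ns=0.9, leakage_mw=42.0)))  # premium
```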

    Via-configurable transistors array: a regular design technique to improve ICs yield

    Process variations are a major bottleneck for the manufacturability and yield of digital CMOS integrated circuits, which is why regular layout techniques with different degrees of regularity are emerging as possible solutions. Our proposal is a new regular layout design technique called Via-Configurable Transistors Array (VCTA) that pushes circuit layout regularity for devices and interconnects to the limit in order to maximize the benefits of regularity. VCTA is expected to perform worse than Standard Cell designs at a given technology node, but it allows a future technology node to be adopted earlier. Our objective is to optimize VCTA so that it becomes comparable to Standard Cell design in an older technology. We present simulations of delay and energy consumption for a Full Adder circuit implemented in the first, unoptimized version of VCTA at the 90 nm technology node, together with an extrapolation to Carry-Ripple Adders from 4 to 64 bits.
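    As a rough illustration of the adder extrapolation, a first-order model in which the carry chain dominates gives delay that scales linearly with bit width; the per-stage delays below are placeholder values, not the paper's 90 nm measurements.

```python
# First-order ripple-carry adder delay model: the carry ripples through
# (bits - 1) stages, then the last stage computes its sum. The per-stage
# delays are assumed placeholders, not characterized VCTA figures.
T_CARRY_PS = 35.0  # assumed carry-out delay of one full adder (ps)
T_SUM_PS = 50.0    # assumed sum delay of the final stage (ps)

def rca_delay_ps(bits: int) -> float:
    return (bits - 1) * T_CARRY_PS + T_SUM_PS

for n in (4, 8, 16, 32, 64):
    print(f"{n:2d}-bit RCA: {rca_delay_ps(n):7.1f} ps")
```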

    CAD Techniques for Robust FPGA Design Under Variability

    The imperfections in the semiconductor fabrication process and uncertainty in the operating environment of VLSI circuits have emerged as critical challenges for the semiconductor industry. These are generally termed process and environment variations, and they lead to uncertainty in performance and unreliable operation of the circuits. The problems have been further aggravated in scaled nanometer technologies due to increased process variations and reduced operating voltage. Several techniques have been proposed recently for designing digital VLSI circuits under variability, but most target ASICs and custom designs. The flexibility of reconfiguration and the unknown end application make design under variability different for FPGAs, so techniques proposed for ASICs and custom designs cannot be applied directly. An important design consideration is to minimize modifications to the architecture and circuit, in order to reduce the cost of changing the existing FPGA architecture and circuit. The focus of this work falls into three principal categories: improving timing yield under process variations, improving power yield under process variations, and improving the voltage profile in the FPGA power grid. The work on timing yield proposes routing architecture enhancements along with CAD techniques to improve the timing yield of FPGA designs. The work on power yield selects a low-power dual-Vdd FPGA design as the baseline architecture and proposes CAD techniques to improve the power yield of FPGAs. A mathematical programming technique is proposed to determine the parameters of the interconnect buffers, such as transistor sizes and threshold voltages, within their allowed ranges, such that leakage variability is minimized under delay constraints. Finally, two CAD techniques are proposed to improve the supply voltage profile of FPGA power grids: a place-and-route technique and a logic clustering technique, both of which reduce IR drops and the spatial variation of the supply voltage in the power grid.
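    The buffer-parameter problem can be stated abstractly as the mathematical program below; the notation (buffer transistor widths w_i, threshold voltages v_t,i, leakage spread sigma_L, path delays d_p) is illustrative and not the thesis's exact formulation.

```latex
\begin{aligned}
\min_{w,\,v_t}\quad & \sigma_L^2(w, v_t)
  && \text{(leakage variability of the interconnect buffers)}\\
\text{s.t.}\quad & d_p(w, v_t) \le D_{\max} && \text{for every timing path } p,\\
& w_{\min} \le w_i \le w_{\max}, \qquad
  v_t^{\min} \le v_{t,i} \le v_t^{\max}.
\end{aligned}
```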

    Standard cell library design for sub-threshold operation


    Reliability in the face of variability in nanometer embedded memories

    In this thesis, we have investigated the impact of parametric variations on the behaviour of one performance-critical processor structure: embedded memories. As variations manifest as a spread in power and performance, as a first step we propose a novel modeling methodology that helps evaluate the impact of circuit-level optimizations on architecture-level design choices. Choices made at the design stage ensure that conflicting requirements from higher levels are decoupled. We then complement such design-time optimizations with a runtime mechanism that takes advantage of adaptive body-biasing to lower power whilst improving performance in the presence of variability. Our proposal uses novel fully-digital variation tracking hardware built from embedded DRAM (eDRAM) cells to monitor run-time changes in cache latency and leakage. A special fine-grain body-bias generator uses the measurements to generate the optimal body bias needed to meet the required yield targets. A novel variation-tolerant and soft-error-hardened eDRAM cell is also proposed as an alternative candidate for replacing existing SRAM-based designs in latency-critical memory structures. In the ultra-low-power domain, where reliable operation is limited by the minimum voltage of operation (Vddmin), we analyse the impact of failures on cache functional margin and functional yield. Towards this end, we have developed a fully automated tool (INFORMER) capable of estimating memory-wide metrics such as power, performance and yield accurately and rapidly. Using this tool, we then evaluate the effectiveness of a new class of hybrid techniques for improving cache yield through failure prevention and correction. A holistic perspective of memory-wide metrics helps us arrive at design choices optimized simultaneously for the multiple metrics needed to maintain lifetime requirements.
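    A schematic sketch of the run-time loop the thesis describes, with an assumed monitor interface, step size, and sign convention (forward body bias speeds the circuit up at a leakage cost; reverse bias does the opposite):

```python
# Assumed-interface sketch of an adaptive body-bias controller: an on-die
# monitor reports how much slack remains against the delay and leakage
# targets, and the controller nudges the body-bias voltage accordingly.
def adjust_body_bias(vbb_mv: float,
                     delay_margin_ps: float,
                     leakage_margin_uw: float,
                     step_mv: float = 10.0) -> float:
    """Return an updated body-bias voltage in mV.
    Positive margins mean the corresponding target is met with room to spare."""
    if delay_margin_ps < 0:
        return vbb_mv + step_mv  # too slow: apply forward bias (more leakage)
    if leakage_margin_uw < 0:
        return vbb_mv - step_mv  # too leaky: apply reverse bias (more delay)
    return vbb_mv                # both targets met: hold the current bias

vbb = adjust_body_bias(0.0, delay_margin_ps=-15.0, leakage_margin_uw=30.0)
print(f"new body bias: {vbb:+.0f} mV")  # +10 mV (forward bias)
```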

    When can social media lead financial markets?

    Social media analytics is showing promise for the prediction of financial markets. The research presented here employs linear regression and information theory analysis techniques to measure the extent to which social media data is a predictor of the future returns of stock-exchange traded financial assets. Two hypotheses are proposed which investigate whether the measurement of social media data in real time can be used to pre-empt, or lead, changes in the prices of financial markets. Using Twitter as the social media data source, this study first investigates whether geographically-filtered Tweets can lead the returns of UK and US stock indices. Next, the study considers whether string-filtered Tweets can lead the returns of currency pairs and the securities of individual publicly-traded companies. The study evaluates Tweet message sentiments (mathematical quantifications of the moods of text strings) and Tweet message volumes. A sentiment classification system specifically designed and validated in the literature to accurately rank social media's colloquial vernacular is employed. This builds on previous studies which either used sentiment analysis techniques not geared for such text or considered only social media message volumes. Stringent tests for statistical significance are employed. Tweets on twenty-eight financial instruments were collected over three months, a period chosen to minimise the effect of the economic cycle in the time series whilst encapsulating a range of market conditions, and during which no major product changes were made to Twitter. The study shows that Tweet message sentiments contain lead-time information about the future returns of twelve of these securities, in excess of what is achievable via the analysis of Twitter message volumes. The results are robust against modification of the analysis parameters, and additional insight about market returns can be gained from social media sentiment analytics under particular parameter variations.
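    The core lead-lag test can be illustrated with a one-lag linear regression; the snippet below uses synthetic data, whereas the study applies stricter significance tests and information-theoretic measures across twenty-eight instruments.

```python
# Toy lead-lag regression: does yesterday's aggregate Tweet sentiment
# predict today's return? Data is synthetic with a built-in one-day lead;
# the real study uses measured sentiment and stringent significance tests.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sentiment = rng.standard_normal(90)            # daily sentiment scores
noise = rng.standard_normal(90)
returns = 0.2 * np.roll(sentiment, 1) + noise  # returns led by sentiment
returns[0] = noise[0]                          # drop the wrap-around sample

# Regress returns[t] on sentiment[t-1] and test the slope's significance.
slope, intercept, r, p, se = stats.linregress(sentiment[:-1], returns[1:])
print(f"slope={slope:.3f}, p-value={p:.4g}")   # small p => lead-time information
```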
    • …