Methodology for Standby Leakage Power Reduction in Nanometer-Scale CMOS Circuits
In nanometer-scale CMOS technology, leakage power has become a major component of total power dissipation due to the downscaling of threshold voltage and gate oxide thickness. Leakage power consumption has received even more attention with the increasing demand for mobile devices. Since mobile devices spend a majority of their time in standby mode, reducing leakage power in the standby state is critical to extending battery lifetime. For this reason, low power has become a major factor in designing CMOS circuits.
In this dissertation, we propose a novel transistor reordering methodology for leakage reduction. Unlike previous techniques, the proposed method provides exact reordering rules for the minimum-leakage configuration by considering all leakage components. Thus, this method formulates an optimized structure for leakage reduction even in complex CMOS logic gates, and it can be combined with other leakage reduction techniques to achieve further improvement.
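The dissertation's exact reordering rules are not reproduced in the abstract, but the underlying search can be sketched: enumerate the possible input orderings of a gate's transistor stack under a known standby input vector and keep the ordering whose modeled leakage is lowest. The per-position leakage values below are purely hypothetical placeholders, not numbers from the work.

```python
# Toy illustration of transistor reordering for minimum standby leakage.
# All leakage numbers are hypothetical; a real flow would derive them from
# the standby input vector and a device-level leakage model.
from itertools import permutations

# Hypothetical leakage (nA) contributed by each input's transistor at each
# stack position (index 0 = nearest the output) for a fixed standby vector.
LEAK = {
    "A": [4.0, 2.5, 1.2],   # e.g. an OFF device that leaks most near the output
    "B": [1.0, 1.0, 1.0],   # e.g. an ON device, position-insensitive here
    "C": [3.0, 1.8, 0.9],
}

def total_leakage(order):
    """Sum the position-dependent leakage for one input ordering."""
    return sum(LEAK[name][pos] for pos, name in enumerate(order))

# Exhaustive search over orderings; exact rules would avoid this enumeration.
best = min(permutations(LEAK), key=total_leakage)
print(best, total_leakage(best))  # → ('B', 'C', 'A') 4.0
```

For a 3-input gate the search space is only 3! = 6 orderings; the value of exact rules is that they pick the minimum-leakage configuration without enumeration.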
We also propose a new standby leakage reduction methodology, leakage-aware body biasing, to overcome the shortcomings of the conventional Reverse Body Biasing (RBB) technique. RBB has been used to reduce subthreshold leakage current, and it works well in the subthreshold-dominant regime despite its intrinsic structural drawbacks. However, those drawbacks can no longer be overlooked, since gate leakage has become comparable to subthreshold leakage at nanometer scales. In addition, band-to-band tunneling (BTBT) leakage also increases with technology scaling due to the higher doping concentrations applied in each process generation. In these circumstances, the objective of leakage minimization is not a single leakage source but all leakage sources combined. The proposed leakage-aware body biasing technique, unlike conventional RBB, considers all major leakage sources to minimize the negative effects of the existing body biasing approach. This is achieved by intelligently applying body bias to the appropriate CMOS network based on its status (on-/off-state), with the aid of a pin/transistor reordering technique.
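As a hedged illustration of the "all leakage sources" objective (not the dissertation's actual circuit or models), one can model the three major components as functions of body bias and bias only the off-state network, choosing the bias that minimizes the total rather than just the subthreshold term. All constants are toy values.

```python
# Sketch of leakage-aware body-bias selection. The component models and
# every constant are illustrative assumptions, not values from the work.
import math

def leakage_components(vbb):
    """Toy models: reverse body bias (vbb > 0, volts) cuts subthreshold
    leakage but increases BTBT, while gate leakage is largely bias-independent."""
    i_sub = 50e-9 * math.exp(-vbb / 0.2)   # subthreshold, falls with RBB
    i_btbt = 2e-9 * math.exp(vbb / 0.3)    # band-to-band tunneling, rises with RBB
    i_gate = 5e-9                          # gate leakage, roughly constant
    return i_sub + i_btbt + i_gate

def choose_body_bias(network_is_off, candidates=(0.0, 0.2, 0.4, 0.6, 0.8, 1.0)):
    """Bias only the OFF network; the ON network stays at zero body bias."""
    if not network_is_off:
        return 0.0
    return min(candidates, key=leakage_components)

print(choose_body_bias(True))   # → 0.4 (total-minimizing bias for this toy model)
```

Note that minimizing the subthreshold term alone would push the bias as high as possible; including the BTBT term is what caps it at an interior optimum.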
Leakage Power Reduction Techniques in Deep Submicron Technologies for VLSI Applications
The leakage power dissipation has become one of the most challenging issues in low-power VLSI circuit design, especially for on-chip devices, as it doubles every two years [4], [5]. The scaling down of threshold voltage has contributed enormously to the increase in subthreshold leakage current, making the static (leakage) power dissipation very high. According to the International Technology Roadmap for Semiconductors (ITRS), leakage power may contribute significantly to total power dissipation [1]. Battery-operated devices that spend long periods in standby mode can be drained very quickly by leakage power. In CMOS submicron technologies, leakage power dissipation therefore plays a significant role, and various low-power design techniques for its efficient minimization have been proposed in the literature. A comprehensive study and analysis of these leakage power minimization techniques is presented in this paper, focusing mainly on circuit performance parameters. The current literature implies that a VLSI circuit designer can effectively choose the appropriate leakage power minimization technique for a specific application only through a sequential analytical approach.
A new circuit technique for reduced leakage current in Deep Submicron CMOS technologies
Modern CMOS processes in the Deep Submicron regime are restricted to supply voltages below 2 V, both to account for the transistors' field-strength limitations and to reduce the power per logic gate. To maintain high switching performance, the threshold voltage must be scaled along with the supply voltage. However, this increases the subthreshold current of the transistors in standby mode (V_GS = 0). Another source of leakage is gate current, which becomes significant for gate oxides of 3 nm and below. We propose a Self-Biasing Virtual Rails (SBVR) CMOS technique, which acts as an adaptive local supply voltage in standby mode. The most important sources of leakage current are reduced by this technique. Moreover, SBVR-CMOS is capable of conserving stored information in sleep mode, which is vital for memory circuits. Memories are exposed to radiation causing soft errors. This well-known problem becomes even worse in the standby mode of typical SRAMs, whose low driving strength cannot withstand alpha-particle hits. In this paper, a 16-transistor SRAM cell is proposed that combines extremely low leakage currents with very high soft-error stability.
Study of Space Station propulsion system resupply and repair Final report
Resupply and repair capabilities for orbital space station bipropellant propulsion systems.
Standby Leakage Power Reduction Technique for Nanoscale CMOS VLSI Systems
In this paper, a novel low-power design technique is proposed to minimize standby leakage power in nanoscale CMOS very large scale integration (VLSI) systems by generating an adaptive optimal reverse body-bias voltage. The adaptive optimal body-bias voltage is generated by the proposed leakage monitoring circuit, which compares the subthreshold current (I_SUB) and the band-to-band tunneling (BTBT) current (I_BTBT). The proposed circuit was simulated in HSPICE using 32-nm bulk CMOS technology and evaluated on ISCAS85 benchmark circuits at different operating temperatures (ranging from 25°C to 100°C). Analysis of the results shows maximum leakage power reductions of 551× at 25°C and 1491× at 100°C on a circuit with 546 gates. The proposed approach demonstrates that the optimal body bias reduces a considerable amount of standby leakage power dissipation in nanoscale CMOS integrated circuits. In this approach, temperature and supply voltage variations are compensated by the proposed feedback loop.
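The monitoring idea can be sketched numerically (an assumption-laden stand-in for the paper's HSPICE circuit, not its implementation): reverse body bias suppresses I_SUB exponentially while raising I_BTBT, so total leakage is minimized near the bias where the two currents match, which a comparator-style feedback loop can locate by bisection. The model constants below are illustrative, not 32-nm values.

```python
# Numerical sketch of the ISUB-vs-IBTBT comparison behind the monitor.
# All constants are toy values chosen only to show the crossover behavior.
import math

def i_sub(vbb):
    """Subthreshold leakage falls exponentially as reverse body bias raises Vth."""
    return 100e-9 * math.exp(-vbb / 0.15)   # amperes, vbb in volts

def i_btbt(vbb):
    """Band-to-band tunneling grows with the reverse-biased junction voltage."""
    return 1e-9 * math.exp(vbb / 0.25)

def optimal_vbb(lo=0.0, hi=1.5, steps=60):
    """Bisect on i_sub - i_btbt, mimicking the comparator feedback loop."""
    for _ in range(steps):
        mid = 0.5 * (lo + hi)
        if i_sub(mid) > i_btbt(mid):
            lo = mid    # still subthreshold-dominated: increase reverse bias
        else:
            hi = mid    # BTBT-dominated: back off
    return 0.5 * (lo + hi)

vbb = optimal_vbb()
print(round(vbb, 3))   # → 0.432 for these toy constants
```

In the paper's circuit the same comparison is done continuously in hardware, which is also how temperature and supply variations get compensated: the crossover point shifts and the loop tracks it.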
Static Feed Water Electrolysis Subsystem Testing and Component Development
A program was carried out to develop and test advanced electrochemical cells/modules and critical electromechanical components for a static feed (alkaline electrolyte) water electrolysis oxygen generation subsystem. The accomplishments were: refurbishment of a previously developed subsystem and successful demonstration of 2980 hours of normal operation; achievement of sustained one-person-level oxygen generation performance with state-of-the-art cell voltages averaging 1.61 V at 191 ASF at an operating temperature of 128 F (equivalent to 1.51 V when normalized to 180 F); endurance testing and demonstration of reliable performance of the three-fluid pressure controller for 8650 hours; design and development of a fluid control assembly for this subsystem and demonstration of its performance; development and demonstration, at the single-cell and module levels, of a unitized core composite cell that provides expanded differential pressure tolerance capability; fabrication and evaluation of a feed water electrolyte elimination five-cell module; and successful demonstration of an electrolysis module pressurization technique that can be used in place of nitrogen gas during the standby mode of operation to maintain system pressure and differential pressures.
Preprototype nitrogen supply subsystem development
The design and development of a test stand for the Nitrogen Generation Module (NGM), and a series of tests that verified its operation and performance capability, are described. Over 900 hours of parametric testing were achieved. The results from this testing were then used to design an advanced NGM and a self-contained, preprototype Nitrogen Supply Subsystem. The subsystem consists of three major components, the nitrogen generation module, the pressure controller, and the hydrazine storage tank, together with ancillary components. The most important improvement is the elimination of all sealing surfaces, achieved with an all-welded or brazed construction. Additionally, performance was improved by increasing hydrogen-separating capability by 20% with no increase in overall packaging size.
FORCED STACK SLEEP TRANSISTOR (FORTRAN): A NEW LEAKAGE CURRENT REDUCTION APPROACH IN CMOS BASED CIRCUIT DESIGNING
Reduction in leakage current has become a significant concern in nanotechnology-based low-power, low-voltage, and high-performance VLSI applications. This research article discusses a new low-power circuit design approach, FORTRAN (FORced stack sleep TRANsistor), which reduces leakage power in CMOS-based circuit design in the VLSI domain. The FORTRAN approach reduces leakage current in both active and standby modes of operation. Furthermore, it is not time-intensive when the circuit transitions from active mode to standby mode and vice versa. To validate the proposed design approach, experiments are conducted in the Tanner EDA tool from the Mentor Graphics bundle on the proposed circuit designs for a full adder, a chain of 4 inverters, and a 4-bit multiplier, using 180 nm, 130 nm, and 90 nm TSMC technology nodes. The outcomes show a significant 95-98% reduction in leakage power as well as a 15-20% reduction in dynamic power, with a minor increase in delay. The results are compared for accuracy with the notable design approaches available for both active and standby modes of operation.
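The stacking effect that forced-stack approaches exploit can be illustrated with a simplified subthreshold model (toy constants, not the paper's TSMC data): two series OFF transistors leak far less than one, because the internal node rises, giving the top device a negative V_GS and a reduced drain bias via DIBL.

```python
# Illustration of the transistor stacking effect behind forced-stack designs.
# The model and every constant are simplifying assumptions for demonstration.
import math

VT = 0.0259    # thermal voltage at room temperature (V)
N = 1.3        # subthreshold slope factor (assumed)
ETA = 0.15     # DIBL coefficient (assumed)
I0 = 10e-9     # leakage of one OFF device at Vgs = 0, Vds = VDD (assumed)
VDD = 1.0

def i_off(vgs, vds):
    """Subthreshold current of an OFF transistor with a simple DIBL term."""
    return I0 * math.exp((vgs + ETA * (vds - VDD)) / (N * VT)) \
              * (1.0 - math.exp(-vds / VT))

def stack_leakage(vdd=VDD):
    """Bisect for the internal node voltage where top and bottom currents match."""
    lo, hi = 0.0, vdd
    for _ in range(80):
        vm = 0.5 * (lo + hi)
        if i_off(-vm, vdd - vm) > i_off(0.0, vm):
            lo = vm   # top device still leaks more: internal node keeps rising
        else:
            hi = vm
    return i_off(0.0, 0.5 * (lo + hi))

single = i_off(0.0, VDD)
stacked = stack_leakage()
print(f"single {single:.2e} A, stacked {stacked:.2e} A, ratio {single / stacked:.0f}x")
```

Even this crude model shows an order-of-magnitude reduction from stacking alone, which is consistent in spirit with the large standby savings the paper reports; the sleep-transistor part of the approach adds mode control on top of this effect.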