
    A Multi-objective Perspective for Operator Scheduling using Fine-grained DVS Architecture

    The stringent power budgets of fine-grained power-managed digital integrated circuits have driven chip designers to optimize power at the cost of area and delay, the traditional cost criteria for circuit optimization. This emerging scenario motivates us to revisit the classical operator scheduling problem given the availability of DVFS-enabled functional units that can trade cycles for power. We study the design space defined by this trade-off and present a branch-and-bound (B&B) algorithm to explore the state space and report the Pareto-optimal front with respect to area and power. The scheduling also aims at maximum resource sharing and attains substantial area and power gains on complex benchmarks when timing constraints are sufficiently relaxed. Experimental results show that the algorithm, operating without any user constraint (area/power), solves the problem for most available benchmarks, and that the use of power or area budget constraints leads to significant performance gains. Comment: 18 pages, 6 figures, International Journal of VLSI Design & Communication Systems (VLSICS).
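    To make the exploration concrete, here is a minimal Python sketch of branch-and-bound over per-operator (area, power) implementation choices with Pareto-front bookkeeping. The two-operator instance and its cost numbers are invented for illustration, and resource sharing and timing constraints are ignored, so this is a sketch of the general technique rather than the paper's algorithm.

```python
# Minimal branch-and-bound over per-operator (area, power) choices,
# keeping a Pareto front of non-dominated schedules. Illustrative only:
# resource sharing and timing constraints are ignored.

def dominates(a, b):
    """a = (area, power) is no worse than b in both objectives, better in one."""
    return a[0] <= b[0] and a[1] <= b[1] and a != b

def update_front(front, cand):
    """Add cand to the non-dominated set, dropping points it dominates."""
    if any(dominates(p, cand) for p in front):
        return front
    return [p for p in front if not dominates(cand, p)] + [cand]

def branch_and_bound(ops, choices, partial=(0, 0), front=None):
    """Enumerate implementation choices; prune branches whose partial cost
    is already dominated (costs only grow, so the bound is safe)."""
    front = [] if front is None else front
    if any(dominates(p, partial) for p in front):
        return front                                  # bound: prune this branch
    if not ops:
        return update_front(front, partial)           # leaf: a complete schedule
    for area, power in choices[ops[0]]:
        front = branch_and_bound(ops[1:], choices,
                                 (partial[0] + area, partial[1] + power), front)
    return front

# Hypothetical operators, each with a fast/high-power and a slow/low-power
# DVFS variant (numbers invented):
choices = {"mul": [(4, 10), (2, 16)], "add": [(1, 3), (1, 2)]}
print(branch_and_bound(["mul", "add"], choices))      # -> Pareto front
```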

    Economic and environmental impacts of the energy source for the utility production system in the HDA process

    The well-known benchmark process for hydrodealkylation of toluene (HDA) to produce benzene is revisited in a multi-objective approach for identifying environmentally friendly and cost-effective operating solutions. The paper begins with a presentation of the numerical tools used in this work, i.e., a multi-objective genetic algorithm and a Multiple Choice Decision Making (MCDM) procedure. Two studies are then carried out on the energy source, either fuel oil or natural gas, used in the utility production system (UPS) of the HDA process. In each case, a multi-objective optimization problem is solved, minimizing the total annual cost of the process together with five environmental burdens: Global Warming Potential, Acidification Potential, Photochemical Ozone Creation Potential, Human Toxicity Potential and Eutrophication Potential; the best solution is then identified by MCDM procedures. To compare both solutions, an assessment is performed of the respective contributions of the HDA process and the UPS to the environmental impacts, and of the environmental impacts generated by the main equipment items of the HDA process. This "gate-to-gate" environmental study is then enlarged into a "cradle-to-gate" Life Cycle Assessment (LCA) that accounts for the emission inventory and raw-material extraction. Although less economically efficient, the natural gas turbine turns out to be the more attractive alternative for meeting societal expectations concerning environmental preservation and sustainable development.
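    To make the burden aggregation concrete, the sketch below shows how one impact category, Global Warming Potential, is computed from an emission inventory by weighting each emission with its characterization factor. The inventories are invented; the GWP100 factors (CO2 = 1, CH4 ≈ 28 kg CO2-eq/kg) are standard values, truncated to two species for brevity.

```python
# Aggregating an emission inventory into one burden (GWP). Inventories
# below are invented; GWP100 factors are standard (CO2 = 1, CH4 ~ 28).

GWP100 = {"CO2": 1.0, "CH4": 28.0}   # kg CO2-equivalent per kg emitted

def global_warming_potential(inventory_kg):
    """Sum emissions weighted by their GWP100 characterization factors."""
    return sum(GWP100[species] * mass for species, mass in inventory_kg.items())

# Toy inventories per functional unit for the two UPS energy sources:
fuel_oil    = {"CO2": 3200.0, "CH4": 0.4}
natural_gas = {"CO2": 2600.0, "CH4": 1.1}
for name, inv in (("fuel oil", fuel_oil), ("natural gas", natural_gas)):
    print(f"{name}: {global_warming_potential(inv):.0f} kg CO2-eq")
```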

    Implementation of a Combined OFDM-Demodulation and WCDMA-Equalization Module

    For a dual-mode baseband receiver for the OFDM Wireless LAN and WCDMA standards, integration of the demodulation and equalization tasks on a dedicated hardware module has been investigated. For OFDM demodulation, an FFT algorithm based on cascaded twiddle-factor decomposition has been selected. This type of algorithm combines high spatial and temporal regularity in the FFT data-flow graphs with a minimal number of computations. A frequency-domain algorithm based on a circulant channel approximation has been selected for WCDMA equalization. It offers good performance, low hardware complexity and a low number of computations. Its main advantage is the reuse of the FFT kernel, which facilitates the integration of both tasks. The demodulation and equalization module has been described at the register-transfer level with the in-house developed Arx language. The core of the module is a pipelined radix-2³ butterfly combined with a complex multiplier and a complex divider. The module has an area of 0.447 mm² in 0.18 µm technology and a power consumption of 10.6 mW. The proposed module compares favorably with solutions reported in the literature.
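    A small numerical sketch of the equalization idea: when the channel is approximated as circulant, an FFT diagonalizes it, so equalization reduces to per-bin gains computed from the channel frequency response, and the same FFT kernel used for OFDM demodulation performs the transforms. The MMSE form, tap values and block length below are illustrative, not the module's WCDMA parameters.

```python
import numpy as np

def fd_equalize(rx, h, n_fft, noise_var):
    """Frequency-domain MMSE equalization under a circulant channel
    approximation: FFT, per-bin gain, inverse FFT."""
    H = np.fft.fft(h, n_fft)                        # channel frequency response
    G = np.conj(H) / (np.abs(H) ** 2 + noise_var)   # per-bin MMSE gain
    return np.fft.ifft(np.fft.fft(rx, n_fft) * G)

# Toy check: a 3-tap channel applied circularly, near-zero noise, so the
# equalized block should recover the transmitted symbols.
rng = np.random.default_rng(0)
x = rng.choice([-1.0, 1.0], size=64)                # BPSK-like block
h = np.array([1.0, 0.5, 0.2])
rx = np.real(np.fft.ifft(np.fft.fft(x, 64) * np.fft.fft(h, 64)))
print(np.allclose(np.real(fd_equalize(rx, h, 64, 1e-6)), x, atol=1e-2))
```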

    Adaptive Real Time Imaging Synthesis Telescopes

    The digital revolution is transforming astronomy from a data-starved to a data-submerged science. Instruments such as the Atacama Large Millimeter Array (ALMA), the Large Synoptic Survey Telescope (LSST), and the Square Kilometer Array (SKA) will measure their accumulated data in petabytes. The capacity to produce enormous volumes of data must be matched with the computing power to process that data and produce meaningful results. In addition to handling huge data rates, we need adaptive calibration and beamforming to handle atmospheric fluctuations and radio frequency interference, and to provide a user environment that makes the full power of large telescope arrays accessible to both expert and non-expert users. Delayed calibration and analysis limit the science that can be done. To make the best use of both telescope and human resources we must reduce the burden of data reduction. Our instrumentation comprises a flexible correlator, beamformer and imager, with digital signal processing closely coupled to a computing cluster. This instrumentation will be highly accessible to scientists, engineers, and students for research and development of real-time processing algorithms, and will tap into the pool of talented and innovative students and visiting scientists from engineering, computing, and astronomy backgrounds. Adaptive real-time imaging will transform radio astronomy by providing real-time feedback to observers. Calibration of the data is performed in close to real time using a model of the sky brightness distribution. The derived calibration parameters are fed back into the imagers and beamformers, and the imaged regions are used to update and improve the a priori model, which becomes the final calibrated image by the time the observations are complete.
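    As a toy illustration of the calibration step in this feedback loop, the sketch below solves for per-antenna complex gains given model and observed visibilities, using a standard damped alternating least-squares update (in the spirit of StefCal-type solvers); the array size, gain model and iteration count are invented, and this is not necessarily the instrument's actual algorithm.

```python
import numpy as np

def solve_gains(vis, model, n_iter=50):
    """Solve vis[i, j] ~ g[i] * conj(g[j]) * model[i, j] for the gains g
    by damped alternating least squares."""
    n = vis.shape[0]
    g = np.ones(n, dtype=complex)
    for _ in range(n_iter):
        g_new = np.empty_like(g)
        for i in range(n):
            z = np.conj(g) * model[i]        # each antenna's predicted factor
            g_new[i] = np.vdot(z, vis[i]) / np.vdot(z, z)
        g = 0.5 * (g + g_new)                # damping stabilizes the iteration
    return g

# Simulate: random phase-only true gains applied to a Hermitian model,
# then recover them (up to a global phase, so compare magnitudes).
rng = np.random.default_rng(1)
n = 8
a = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
model = (a + a.conj().T) / 2
true_g = np.exp(1j * rng.uniform(0, 0.5, n))
vis = np.outer(true_g, np.conj(true_g)) * model
g = solve_gains(vis, model)
print(np.allclose(np.abs(g / true_g), 1, atol=1e-3))
```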

    Economic and environmental strategies for process design

    This paper first addresses the definition of the various objectives involved in eco-efficient processes, taking ecological and economic considerations into account simultaneously. The environmental aspect at the preliminary design phase of chemical processes is quantified using a set of metrics or indicators following previously proposed sustainability guidelines. The resulting multiobjective problem is solved by a genetic algorithm, an improved variant of the well-known NSGA-II algorithm. A key point for evaluating environmental burdens is the use of the ARIANE™ package, a decision-support tool dedicated to the management of plant utilities (steam, electricity, hot water, etc.) and pollutants (CO2, SO2, NO, etc.), implemented here both to compute the primary energy requirements of the process and to quantify its pollutant emissions. The well-known benchmark process for hydrodealkylation (HDA) of toluene to produce benzene, revisited here in a multiobjective optimization framework, is used to illustrate the approach for finding eco-friendly and cost-effective designs. Preliminary bi-objective studies are carried out to eliminate redundant environmental objectives. The trade-off between economic and environmental objectives is illustrated through Pareto curves. To aid decision making among the various alternatives generated after this step, a synthetic evaluation method based on the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) is first used. Another, simpler procedure named FUCA has also been implemented and has shown its efficiency relative to TOPSIS; both are sketched below. Two scenarios are studied: in the former, the goal is to find the best trade-off between economic and ecological aspects, while the latter aims at the best compromise between the economic objective and stricter environmental impact criteria.
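    The following sketch shows both ranking schemes on a toy three-criterion candidate set (all criteria minimized). The TOPSIS steps follow the standard formulation; the FUCA variant shown (rank each solution on each criterion, sum the ranks, pick the smallest total) follows the usual description of that method; the sample points and equal weights are invented.

```python
import numpy as np

def topsis(points, weights):
    """Standard TOPSIS for minimization criteria: closeness to the ideal
    point; higher is better."""
    m = points / np.linalg.norm(points, axis=0)      # vector normalization
    m = m * weights
    ideal, anti = m.min(axis=0), m.max(axis=0)       # best / worst per criterion
    d_pos = np.linalg.norm(m - ideal, axis=1)
    d_neg = np.linalg.norm(m - anti, axis=1)
    return d_neg / (d_pos + d_neg)

def fuca(points):
    """Rank each solution per criterion (1 = best) and sum the ranks;
    lower total is better."""
    ranks = points.argsort(axis=0).argsort(axis=0) + 1
    return ranks.sum(axis=1)

# Invented candidates: (annual cost, GWP, acidification), all minimized.
pts = np.array([[1.0, 9.0, 5.0], [2.0, 6.0, 7.0],
                [4.0, 4.0, 2.0], [8.0, 1.0, 6.0]])
w = np.full(3, 1 / 3)
print("TOPSIS pick:", np.argmax(topsis(pts, w)))
print("FUCA pick:  ", np.argmin(fuca(pts)))
```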

    Output Filter Aware Optimization of the Noise Shaping Properties of ΔΣ Modulators via Semi-Definite Programming

    The Noise Transfer Function (NTF) of ΔΣ modulators is typically designed according to the features of the input signal. We suggest that in many applications, notably those involving D/D and D/A conversion or actuation, the NTF should instead be shaped according to the properties of the output/reconstruction filter. To this aim, we propose a framework for optimal design based on the Kalman-Yakubovich-Popov (KYP) lemma and semi-definite programming. Examples illustrate how, in practical cases, the proposed strategy can outperform more standard approaches. Comment: 14 pages, 18 figures, journal. Code accompanying the paper is available at http://pydsm.googlecode.co
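    The sketch below evaluates the quantity this design criterion targets: the quantization-noise power that survives the output filter, i.e. the average of |NTF|²·|F|² over frequency. It is only the cost functional, not the KYP-based semi-definite program that optimizes it, and the textbook second-order NTF and stand-in Butterworth reconstruction filter are illustrative choices.

```python
import numpy as np
from scipy.signal import butter, freqz

# Filter-weighted noise power for a given NTF and output filter
# (relative to the flat quantization-noise PSD). Illustrative designs:
# NTF(z) = (1 - z^-1)^2 and a 4th-order Butterworth output filter.

w = np.linspace(0, np.pi, 4096)
_, ntf = freqz([1, -2, 1], [1], worN=w)   # second-order noise shaping
b, a = butter(4, 0.1)                     # toy reconstruction filter
_, F = freqz(b, a, worN=w)

cost = np.mean(np.abs(ntf * F) ** 2)      # approximates (1/pi) * integral
print(f"filter-weighted noise power: {cost:.3e}")
```

Comparing this cost across candidate NTFs is what makes one design preferable for a given reconstruction filter.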

    From FPGA to ASIC: A RISC-V processor experience

    This work documents a design flow for the Lagarto RISC-V processor using FPGA and ASIC toolchains, together with the RTL design considerations that must be taken into account when moving from an FPGA design to an ASIC design.

    High performance blended membranes using a novel preparation technique

    The possibility of applying a novel microwave (MW) technique to the dissolution of polyethersulfone (PES) and lithium halides in aprotic solvents is studied. The lithium halide additives used are lithium fluoride (LiF), lithium bromide (LiBr) and lithium chloride (LiCl), and a comparison is made with the conventional method. For the single solvent, PES was dissolved in dimethylformamide (DMF), whilst for the double solvent (DS), PES was dissolved in a mixture of two different solvents, DMF and acetone. The concentration of lithium halide in both solvent systems was varied from 1 to 5 wt%. In order to elucidate the mechanism through which lithium halides influence the kinetic membrane performance in both techniques, rheological, FTIR, contact angle and water uptake analyses were performed. The performance of the membranes was evaluated in terms of pure water permeation (PWP), permeation rate (PR) and separation rates of various polyethylene glycols. Results revealed that the hollow-fiber MW membrane with the 3 wt% LiBr additive exhibits both a high permeation rate of 222.16 L m⁻² hr⁻¹ and a separation rate of 99%, with a molecular weight cutoff (MWCO) of 2.6 kDa. In general, the MW membranes exhibited higher permeation and separation rates than the conventional electrothermal heating (CEH) membranes. The FTIR, contact angle and water uptake measurements revealed that LiCl and LiBr enhanced the hydrophilic properties of the PES membranes, thus producing membranes with high permeation and separation rates.
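    For reference, the permeation-rate figures quoted above are a simple flux: permeate volume per membrane area per unit time, in L·m⁻²·hr⁻¹. A minimal sketch, with invented volume, area and duration:

```python
def permeation_rate(volume_l, area_m2, hours):
    """Permeate flux in L m^-2 hr^-1: volume / (area * time)."""
    return volume_l / (area_m2 * hours)

# Hypothetical measurement: 11.1 L collected over 0.05 m^2 in 1 hour.
print(f"{permeation_rate(11.1, 0.05, 1.0):.1f} L m^-2 hr^-1")  # 222.0
```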