
    Variability-Aware VLSI Design Automation For Nanoscale Technologies

    As technology scaling enters the nanometer regime, the design of large-scale ICs becomes more challenging due to shrinking feature sizes and increasing design complexity. Aggressive scaling causes significant degradation in reliability, increased susceptibility to fabrication and environmental randomness, and increased dynamic and leakage power dissipation. In this work, we investigate these scaling issues in large-scale integrated systems. This dissertation develops variability-aware design methodologies comprising design analysis, design-time optimization, post-silicon tunability, and runtime-adaptivity-based optimization techniques for handling variability. We discuss our research in the area of variability-aware analysis, focusing specifically on the problem of statistical timing analysis. The first technique introduces the concept of error budgeting, which achieves significant runtime speedups during statistical timing analysis. The second presents a general framework for non-linear, non-Gaussian statistical timing analysis that accounts for correlations. Further, we present our work on design-time optimization schemes applicable during physical synthesis. First, we present a buffer insertion technique that considers wire-length uncertainty, along with algorithms for probabilistic buffer insertion. Second, we present a stochastic optimization framework based on the Monte Carlo method that accounts for fabrication variability. This optimization framework can be applied to any problem that can be modeled as a linear program, without imposing any assumptions on the nature of the variability. Subsequently, we present our work on post-silicon-tunability-based design optimization: a design management framework that balances the effort spent on pre-silicon optimization (through gate sizing) and post-silicon optimization (through tunable clock-tree buffers) while maximizing the yield gains. Lastly, we present our work on variability-aware runtime optimization techniques. We examine runtime supply voltage scaling for dynamic power optimization and propose a framework that considers the impact of variability on the reliability of such designs, together with a probabilistic design synthesis technique in which reliability is a primary optimization metric.
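
    For a concrete feel of the Monte Carlo-based stochastic LP framework mentioned above, the sketch below samples uncertain constraint coefficients and re-solves a toy gate-sizing linear program per sample. The problem data, the Gaussian variability model, and the use of scipy.optimize.linprog are illustrative assumptions, not the dissertation's actual formulation.

```python
# Illustrative sketch only: Monte Carlo sampling wrapped around a linear program.
# The toy "gate sizing" LP, its coefficients, and the variability model
# (Gaussian perturbation of the delay coefficients) are assumptions.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Nominal problem: minimize total area a @ x subject to timing constraints D @ x >= t_req,
# rewritten as -D @ x <= -t_req for linprog's "A_ub @ x <= b_ub" convention.
a = np.array([1.0, 1.5, 2.0])            # area cost per unit size of three gates
D_nominal = np.array([[2.0, 1.0, 0.5],   # how each gate's size helps two timing paths
                      [0.5, 1.5, 2.0]])
t_req = np.array([3.0, 3.5])             # required "speed" on each path

objectives = []
for _ in range(1000):
    # Fabrication variability: perturb the timing coefficients for each sample.
    D = D_nominal * rng.normal(1.0, 0.05, size=D_nominal.shape)
    res = linprog(c=a, A_ub=-D, b_ub=-t_req, bounds=[(0.5, 4.0)] * 3)
    if res.success:
        objectives.append(res.fun)

print(f"mean area {np.mean(objectives):.3f}, "
      f"95th percentile {np.percentile(objectives, 95):.3f}")
```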

    Robust Energy Management for Green and Survivable IP Networks

    Despite the growing need to make the Internet greener, it is worth pointing out that energy-aware strategies to minimize network energy consumption must not undermine normal network operation. In particular, two very important issues may limit the application of green networking techniques: network survivability, i.e. the network's capability to react to device failures, and robustness to traffic variations. We propose novel modelling techniques to minimize the daily energy consumption of IP networks while explicitly guaranteeing, in addition to typical QoS requirements, both network survivability and robustness to traffic variations. The impact of these requirements on the final network consumption is exhaustively investigated. Daily traffic variations are modelled by dividing a single day into multiple time intervals (multi-period problem), and network consumption is reduced by putting idle line cards and chassis to sleep. To preserve network resiliency we consider two different protection schemes, dedicated and shared protection, according to which a backup path is assigned to each demand and a certain amount of spare capacity has to be available on each link. Robustness to traffic variations is provided by a specific modelling framework that makes it possible to tune the conservatism degree of the solutions and to take into account load variations of different magnitudes. Furthermore, we impose inter-period constraints necessary to guarantee network stability and preserve device lifetime. Both exact and heuristic methods are proposed. Experiments carried out on realistic networks operated with flow-based routing protocols (i.e. MPLS) show that significant savings, up to 30%, can be achieved even when both survivability and robustness are fully guaranteed.
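
    As a rough illustration of the multi-period sleep idea (not the paper's exact or heuristic models), the sketch below chooses, for each time period, the fewest line cards whose capacity covers the demand inflated by a spare-capacity margin for protection, and counts inter-period transitions; all topology and traffic figures are invented.

```python
# Illustrative greedy sketch: put line cards to sleep per period while keeping
# a spare-capacity margin for protection. All numbers are made-up examples.
import numpy as np

n_cards = 8                    # line cards on one aggregation link
card_capacity = 10.0           # Gbps per card
protection_margin = 1.3        # keep 30% spare capacity for backup paths
demand = np.array([18.0, 42.0, 58.0, 61.0, 50.0, 27.0])   # Gbps per time period

cards_on = np.empty(len(demand), dtype=int)
for t, d in enumerate(demand):
    protected = protection_margin * d
    # Smallest number of cards whose total capacity covers the protected demand.
    cards_on[t] = min(n_cards, int(np.ceil(protected / card_capacity)))

transitions = np.abs(np.diff(cards_on)).sum()   # inter-period switching (device wear)
baseline = n_cards * len(demand)
print(f"card-periods used: {cards_on.sum()} / {baseline} "
      f"({100 * (1 - cards_on.sum() / baseline):.0f}% saving), transitions: {transitions}")
```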

    Tolerance analysis approach based on the classification of uncertainty (aleatory / epistemic)

    Uncertainty is ubiquitous in tolerance analysis problems. This paper deals with the formulation of tolerance analysis and, more particularly, with the uncertainty that must be taken into account in the foundation of this formulation. It presents: a brief view of the uncertainty classification (aleatory uncertainty comes from inherently random phenomena, while epistemic uncertainty comes from a lack of knowledge); a formulation of the tolerance analysis problem based on this classification; and its development, in which aleatory uncertainty is modeled by probability distributions while epistemic uncertainty is modeled by intervals. Monte Carlo simulation is employed for the probabilistic analysis, while nonlinear optimization is used for the interval analysis. Funding: "AHTOLA" project (ANR-11-MONU-013).
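
    A minimal sketch of the double-loop scheme described above, under assumed dimensions and tolerances: aleatory dimensions are propagated by Monte Carlo, while a bounded nonlinear optimization searches the epistemic interval for the worst case.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)

# Aleatory uncertainty: manufactured dimensions, modelled by probability
# distributions and propagated with Monte Carlo simulation.
A = rng.normal(10.00, 0.03, 20_000)
B = rng.normal(4.00, 0.02, 20_000)

# Epistemic uncertainty: a systematic offset c on a third dimension is only
# known to lie in the interval [-0.05, 0.05] mm (no distribution assumed).
def gap_low_quantile(c):
    gap = A - B - (5.90 + c)          # functional requirement: gap > 0
    return np.quantile(gap, 0.001)    # near-worst realisation of the aleatory part

# Interval analysis: bounded nonlinear optimization searches the epistemic
# interval for the worst case (in general the response is not monotone in c).
res = minimize_scalar(gap_low_quantile, bounds=(-0.05, 0.05), method="bounded")
print(f"worst-case 0.1% gap quantile over the interval: {res.fun:.4f} mm "
      f"(attained at c = {res.x:+.3f} mm)")
```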

    After-sales services optimisation through dynamic opportunistic maintenance: a wind energy case study

    After-sales maintenance services can be a very profitable source of income for original equipment manufacturers (OEMs), given the increasing interest of assets' users in performance-based contracts. However, when it comes to the product value-adding process, OEMs have traditionally focused more on improving their production processes than on complementing their products with after-sales services, which consequently makes it difficult to offer such services efficiently. Furthermore, due both to the high uncertainty of the assets' behaviour and to the inherent challenges of managing the maintenance process (e.g. the maintenance strategy to be followed or the resources to be deployed), it is difficult to build a profitable business around the provision of after-sales services. To help business and maintenance decision makers on this point, this paper proposes a framework for optimising the income from after-sales maintenance services through: 1) implementing advanced multi-objective opportunistic maintenance strategies that systematically consider the assets' operational context in order to perform preventive maintenance under the most favourable conditions; 2) considering the specific needs of OEMs and users; and 3) assessing both the internal and external uncertainties that might condition the success of the after-sales services. The case study developed for the wind energy sector demonstrates the suitability of the presented framework for optimising after-sales services. Funding: EU Framework Programme Horizon 2020, MSCA-RISE-2014: Marie SkƂodowska-Curie Research and Innovation Staff Exchange (RISE) (grant agreement number 645733 - Sustain-Owner-H2020-MSCA-RISE-2014) and the EmaitekPlus 2016-2017 Program of the Basque Government.
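
    The toy decision rule below gives one possible flavour of opportunistic maintenance (it is not the paper's multi-objective strategy): when a stoppage opportunity arises, preventive tasks are advanced if a component's remaining life is short and the expected wind, i.e. lost production, is low; all thresholds and figures are illustrative.

```python
# Toy opportunistic-maintenance rule; every number and threshold is invented.
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    remaining_life_days: float     # estimated time until preventive maintenance is due
    task_duration_h: float

def tasks_to_advance(components, opportunity_window_h, expected_wind_ms,
                     life_threshold_days=60, wind_threshold_ms=5.0):
    """Select preventive tasks to group into an existing stoppage opportunity."""
    if expected_wind_ms > wind_threshold_ms:
        return []                  # production too valuable: do only what is forced
    selected, used_h = [], 0.0
    # Most urgent components first.
    for comp in sorted(components, key=lambda c: c.remaining_life_days):
        if comp.remaining_life_days <= life_threshold_days and \
           used_h + comp.task_duration_h <= opportunity_window_h:
            selected.append(comp.name)
            used_h += comp.task_duration_h
    return selected

fleet = [Component("gearbox oil change", 35, 4.0),
         Component("pitch system check", 80, 3.0),
         Component("generator bearing greasing", 20, 2.0)]
print(tasks_to_advance(fleet, opportunity_window_h=6.0, expected_wind_ms=4.2))
```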

    Reliability-based economic model predictive control for generalized flow-based networks including actuators' health-aware capabilities

    This paper proposes a reliability-based economic model predictive control (MPC) strategy for the management of generalized flow-based networks, integrating ideas on network service reliability, dynamic safety stock planning, and degradation of equipment health. The proposed strategy is based on a single-layer economic optimisation problem with dynamic constraints and includes two enhancements with respect to existing approaches. The first enhancement uses chance-constrained programming to compute an optimal inventory replenishment policy based on a desired risk acceptability level, dynamically allocating safety stocks in flow-based networks to satisfy non-stationary flow demands. The second enhancement computes a smart distribution of the control effort and maximises the actuators' availability by estimating their degradation and reliability. The proposed approach is illustrated with an application to water transport networks, using the Barcelona network as the case study.
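
    The first enhancement can be illustrated with the standard deterministic equivalent of a Gaussian chance constraint, sketched below with assumed demand statistics; the paper's actual MPC formulation is richer, so this shows only the safety-stock idea in isolation.

```python
# Sketch of a chance constraint P(stock_t >= demand_t) >= 1 - risk turned into
# its deterministic equivalent for Gaussian demand. Demand figures are assumed.
import numpy as np
from scipy.stats import norm

risk = 0.05                                   # accepted probability of stock-out
z = norm.ppf(1.0 - risk)                      # quantile factor for the chance constraint

mu = np.array([220.0, 180.0, 260.0, 300.0])   # expected demand per period (m^3)
sigma = np.array([20.0, 15.0, 30.0, 40.0])    # demand standard deviation per period

# The constraint becomes: stock_t >= mu_t + z * sigma_t.
safety_stock = z * sigma
required_stock = mu + safety_stock
print("dynamic safety stocks:", np.round(safety_stock, 1))
print("stock to hold at the start of each period:", np.round(required_stock, 1))
```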

    Quality of Information in Mobile Crowdsensing: Survey and Research Challenges

    Smartphones have become the most pervasive devices in people's lives and are clearly transforming the way we live and perceive technology. Today's smartphones benefit from almost ubiquitous Internet connectivity and come equipped with a plethora of inexpensive yet powerful embedded sensors, such as accelerometers, gyroscopes, microphones, and cameras. This unique combination has enabled revolutionary applications based on the mobile crowdsensing paradigm, such as real-time road traffic monitoring, air and noise pollution monitoring, crime control, and wildlife monitoring, to name a few. Unlike in prior sensing paradigms, humans are now the primary actors in the sensing process, since they are fundamental to retrieving reliable and up-to-date information about the events being monitored. As humans may behave unreliably or maliciously, assessing and guaranteeing Quality of Information (QoI) becomes more important than ever. In this paper, we provide a new framework for defining and enforcing QoI in mobile crowdsensing and analyze in depth the current state of the art on the topic. We also outline novel research challenges, along with possible directions for future work. Comment: To appear in ACM Transactions on Sensor Networks (TOSN).
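
    As a generic illustration of why QoI assessment matters (this is not the framework surveyed in the paper), the sketch below runs a simple truth-discovery style loop that down-weights reports disagreeing with the consensus; all numbers are invented.

```python
# Generic reliability-weighted aggregation of crowdsensed reports (illustrative only).
import numpy as np

rng = np.random.default_rng(3)
true_value = 72.0                                          # e.g. noise level in dB at one spot
reports = np.concatenate([rng.normal(true_value, 1.0, 8),  # honest participants
                          rng.normal(90.0, 1.0, 2)])       # faulty/malicious reports

weights = np.ones_like(reports) / len(reports)
for _ in range(10):
    estimate = np.sum(weights * reports)         # reliability-weighted aggregate
    weights = 1.0 / (np.abs(reports - estimate) + 1e-3)   # agreement -> higher weight
    weights /= weights.sum()

estimate = np.sum(weights * reports)
print(f"weighted estimate: {estimate:.1f} (plain mean: {reports.mean():.1f})")
```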

    Chance-Constrained Equilibrium in Electricity Markets With Asymmetric Forecasts

    We develop a stochastic equilibrium model for an electricity market with asymmetric renewable energy forecasts. In our setting, market participants optimize their profits using public information about the conditional expectation of energy production but private information about the forecast error distribution. This information is given in the form of samples and is incorporated into the participants' profit-maximizing optimizations through chance constraints. We model information asymmetry by varying the sample size of the participants' private information. We show that, as more information becomes available, the equilibrium gradually converges to the ideal solution provided by the perfect-information scenario. Under information scarcity, however, we show that the market still converges to the ideal equilibrium if participants either infer the forecast error distribution from the statistical properties of the data at hand or share their private forecasts.
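
    The sample-based chance constraint can be sketched as follows for a single producer, with assumed production statistics: the offer is set to the empirical quantile of the privately sampled production, and it approaches the perfect-information offer as the private sample grows. This illustrates the mechanism only, not the paper's equilibrium model.

```python
# Illustrative sketch: enforce P(production >= offer) >= 1 - eps from private samples.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
eps = 0.1                                     # accepted shortfall probability
expected_production = 100.0                   # public conditional expectation (MWh)
error_std = 15.0                              # true forecast-error spread (unknown to players)

def offer_from_samples(n_samples):
    errors = rng.normal(0.0, error_std, n_samples)        # private information
    production_samples = expected_production + errors
    return np.quantile(production_samples, eps)           # sample-based chance constraint

perfect = expected_production + error_std * norm.ppf(eps)  # perfect-information offer
for n in (20, 200, 2000):
    print(f"n = {n:5d}: sample-based offer = {offer_from_samples(n):6.1f} MWh "
          f"(perfect information: {perfect:.1f} MWh)")
```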