
    AI/ML Algorithms and Applications in VLSI Design and Technology

    An evident challenge ahead for the integrated circuit (IC) industry in the nanometer regime is the investigation and development of methods that can reduce the design complexity ensuing from growing process variations and curtail the turnaround time of chip manufacturing. Conventional methodologies employed for such tasks are largely manual and thus time-consuming and resource-intensive. In contrast, the unique learning strategies of artificial intelligence (AI) provide numerous exciting automated approaches for handling complex and data-intensive tasks in very-large-scale integration (VLSI) design and testing. Employing AI and machine learning (ML) algorithms in VLSI design and manufacturing reduces the time and effort needed to understand and process data within and across different abstraction levels via automated learning algorithms. This, in turn, improves IC yield and reduces manufacturing turnaround time. This paper thoroughly reviews the AI/ML automated approaches introduced to date for VLSI design and manufacturing. Moreover, we discuss the scope of future AI/ML applications at various abstraction levels to revolutionize the field of VLSI design, aiming for high-speed, highly intelligent, and efficient implementations.
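    As a rough illustration of the kind of ML-for-VLSI task such a survey covers, the following minimal sketch fits a learned regressor to predict a timing quantity (here, a synthetic path delay) from process and design features. The feature names, the model choice, and the data are illustrative assumptions, not taken from the paper.

    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.metrics import mean_absolute_error
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 2000
    # Hypothetical features: channel-length deviation, threshold-voltage shift,
    # wire length, fan-out, and supply voltage (all synthetic, standardized).
    X = rng.normal(size=(n, 5))
    # Synthetic "ground-truth" delay with nonlinear interactions plus noise.
    y = (1.0 + 0.4 * X[:, 0] + 0.3 * X[:, 1] ** 2
         + 0.2 * X[:, 2] * X[:, 3] - 0.25 * X[:, 4]
         + rng.normal(scale=0.05, size=n))

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
    print("synthetic-delay MAE:", mean_absolute_error(y_te, model.predict(X_te)))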

    Report from GI-Dagstuhl Seminar 16394: Software Performance Engineering in the DevOps World

    This report documents the program and the outcomes of GI-Dagstuhl Seminar 16394, "Software Performance Engineering in the DevOps World". The seminar addressed the problem of performance-aware DevOps. Both DevOps and performance engineering have been growing trends over the past one to two years, in no small part due to the rise in importance of identifying performance anomalies in the operations (Ops) of cloud and big data systems and feeding these back to development (Dev). However, so far, the research community has treated software engineering, performance engineering, and cloud computing mostly as individual research areas. We aimed to identify opportunities for cross-community collaboration and to set the path for long-lasting collaborations towards performance-aware DevOps. The main goal of the seminar was to bring together young researchers (PhD students in a later stage of their PhD, as well as postdocs and junior professors) in the areas of (i) software engineering, (ii) performance engineering, and (iii) cloud computing and big data to present their current research projects, to exchange experience and expertise, to discuss research challenges, and to develop ideas for future collaborations.
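    To make the performance-aware DevOps loop the seminar targets more concrete, the sketch below flags response-time anomalies in operations metrics so they could be fed back to development. The rolling z-score detector, window size, and threshold are assumptions chosen for illustration; the report does not prescribe a specific technique.

    import numpy as np

    def detect_anomalies(latencies_ms, window=60, z_threshold=3.0):
        """Return indices where latency deviates strongly from the recent window."""
        latencies = np.asarray(latencies_ms, dtype=float)
        flagged = []
        for i in range(window, len(latencies)):
            ref = latencies[i - window:i]
            mu, sigma = ref.mean(), ref.std()
            if sigma > 0 and abs(latencies[i] - mu) / sigma > z_threshold:
                flagged.append(i)
        return flagged

    # Synthetic example: steady ~100 ms latency with an injected regression,
    # standing in for a performance anomaly surfacing after a deployment.
    rng = np.random.default_rng(1)
    series = rng.normal(100, 5, size=300)
    series[250:] += 60
    print("first anomalous samples:", detect_anomalies(series)[:5])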

    Incorporation of uncertainties in real-time catchment flood forecasting

    Floods have become the most prevalent and costly natural hazards in the U.S. When preparing real-time flood forecasts for a catchment flood warning and preparedness system, consideration must be given to four sources of uncertainty -- natural, data, model parameters, and model structure. A general procedure has been developed for applying reliability analysis to evaluate the effects of the various sources of uncertainty on hydrologic models used for forecasting and prediction of catchment floods. Three reliability analysis methods -- Monte Carlo simulation, and mean-value and advanced first-order second-moment analyses (MVFOSM and AFOSM, respectively) -- were applied to the rainfall-runoff modeling reliability problem. Comparison of these methods indicates that the AFOSM method is probably best suited to the rainfall-runoff modeling reliability problem, with the MVFOSM showing some promise. The feasibility and utility of the reliability analysis procedure are shown for a case study employing the HEC-1 and RORB rainfall-runoff watershed models to forecast flood events on the Vermilion River watershed at Pontiac, Illinois. The utility of the reliability analysis approach is demonstrated for four important hydrologic problems: 1) determination of forecast (or prediction) reliability; 2) determination of the flood level exceedance probability due to a current storm and development of "rules of thumb" for flood warning decision making considering this probabilistic information; 3) determination of the key sources of uncertainty influencing model forecast reliability; and 4) selection of hydrologic models based on comparison of model forecast reliability. Central to this demonstration is the reliability analysis methods' ability to estimate the exceedance probability for any hydrologic target level of interest and, hence, to produce forecast cumulative distribution functions and probability density functions. For typical hydrologic modeling cases, reduction of the underlying modeling uncertainties is the key to obtaining useful, reliable forecasts. Furthermore, determination of the rainfall excess is the primary source of uncertainty, especially in the estimation of the temporal and areal rainfall distributions.
    U.S. Department of the Interior; U.S. Geological Survey
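    The sketch below illustrates the Monte Carlo branch of such a reliability analysis: propagating uncertain rainfall and model parameters through a deliberately simplified rainfall-runoff relation to estimate the probability that peak flow exceeds a flood target. It is not the HEC-1 or RORB setup from the study; the runoff relation, distributions, and threshold are assumptions for illustration only.

    import numpy as np

    rng = np.random.default_rng(42)
    n_sim = 100_000

    # Uncertain inputs (hypothetical distributions standing in for natural,
    # data, and parameter uncertainty):
    rainfall_depth = rng.lognormal(mean=np.log(50), sigma=0.3, size=n_sim)  # mm
    runoff_coeff = rng.beta(4, 6, size=n_sim)        # fraction of rain becoming runoff
    response_time = rng.normal(6.0, 1.0, size=n_sim).clip(min=2.0)  # hours
    area_km2 = 15.0                                  # catchment area, fixed

    # Toy rational-method-style peak flow in m^3/s: Q = C * i * A / 3.6,
    # with intensity i in mm/h and area A in km^2.
    intensity = rainfall_depth / response_time
    peak_flow = runoff_coeff * intensity * area_km2 / 3.6

    flood_target = 80.0  # m^3/s, hypothetical flood-warning threshold
    exceedance_prob = np.mean(peak_flow > flood_target)
    print(f"P(peak flow > {flood_target} m^3/s) = {exceedance_prob:.3f}")

    The MVFOSM and AFOSM alternatives discussed in the study would replace the sampling above with moment-based approximations of the same exceedance probability, trading accuracy for far fewer model evaluations.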