
    Yield and Reliability Analysis for Nanoelectronics

    As technology continues to advance and new breakthroughs emerge, semiconductor devices with nanometer dimensions have entered all spheres of our lives. Accordingly, high reliability and high yield are central concerns in guaranteeing the advancement and adoption of nanoelectronic products. However, nanoelectronics presents several major reliability challenges: identifying the failure mechanisms, improving the low yields of nano products, and managing the scarcity and secrecy of available data [34]. This dissertation therefore investigates four issues related to the yield and reliability of nanoelectronics. Yield and reliability of nanoelectronics are affected by defects generated in the manufacturing processes. An automatic method using model-based clustering has been developed to detect defect clusters and identify their patterns, in which the distribution of the clustered defects is modeled by a new mixture of multivariate normal distributions and principal curves. The new mixture model can capture defect clusters with amorphous, curvilinear, and linear patterns. We evaluate the proposed method on both simulated and experimental data, with promising results. Yield is one of the most important performance indexes for measuring the success of nano fabrication and manufacturing, and accurate yield estimation and prediction is essential for evaluating productivity and estimating production cost. This research studies advanced yield modeling approaches that account for the spatial variation of defects or defect counts. Results from real wafer map data show that the new yield models provide significant improvement in yield estimation compared with the traditional Poisson and negative binomial models. Ultra-thin SiO2 is a major factor limiting the scaling of semiconductor devices, and high-k gate dielectric materials such as HfO2 will replace SiO2 in future generations of MOS devices. This study investigates the two-step breakdown mechanisms and breakdown sequences of double-layered high-k gate stacks by monitoring the relaxation of the dielectric films. The hazard rate is a widely used metric for measuring the reliability of electronic products. This dissertation studies the hazard rate function of gate dielectric breakdown: a physically plausible failure-time distribution is used to model the time-to-breakdown data, and a Bayesian approach is adopted in the statistical analysis.
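    The abstract contrasts new spatial yield models with the traditional Poisson and negative binomial yield models. As background, the sketch below implements those two classical formulas, Y = exp(-A*D0) for randomly scattered defects and Y = (1 + A*D0/alpha)^(-alpha) for clustered defects; it does not reproduce the dissertation's own models, and the die area, defect density and clustering parameter are illustrative values only.

```python
import numpy as np

def poisson_yield(area_cm2, defect_density):
    """Classical Poisson yield model: Y = exp(-A * D0)."""
    return np.exp(-area_cm2 * defect_density)

def negative_binomial_yield(area_cm2, defect_density, alpha):
    """Negative binomial (clustered-defect) yield model:
    Y = (1 + A * D0 / alpha) ** (-alpha); smaller alpha means stronger clustering."""
    return (1.0 + area_cm2 * defect_density / alpha) ** (-alpha)

# Illustrative numbers: a 1 cm^2 die at 0.5 defects per cm^2
A, D0 = 1.0, 0.5
print(poisson_yield(A, D0))                 # ~0.607
print(negative_binomial_yield(A, D0, 2.0))  # ~0.640, higher because defects cluster
```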

    New stochastic processes to model interest rates : LIBOR additive processes

    In this paper, a new kind of additive process is proposed. Our main goal is to define, characterize and prove the existence of the LIBOR additive process as a new stochastic process. This process is defined as a piecewise stationary process with independent increments, continuous in probability but with discontinuous trajectories and "càdlàg" sample paths. The proposed process is specifically designed for interest-rate modelling because it allows us to introduce a jump term structure as an increasing sequence of Lévy measures. We characterize this process as a Markovian process with an infinitely divisible, self-similar, stable and self-decomposable distribution, and we prove that the Lévy-Khintchine characteristic function and the Lévy-Itô decomposition apply to it. Additionally, we develop a basic framework for density transformations. Finally, we show some examples of LIBOR additive processes.
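    As a concrete illustration of a "piecewise stationary process with independent increments", the sketch below simulates a path whose drift, volatility and jump intensity are constant within each tenor period but change from one period to the next, with compound Poisson jumps producing discontinuous, càdlàg-style trajectories. This is a toy surrogate based on the abstract's description, not the paper's construction; the tenor dates and parameter values are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_piecewise_process(tenors, drifts, sigmas, jump_rates, jump_scale, dt=1 / 252):
    """Simulate a piecewise-stationary jump-diffusion: between consecutive tenor
    dates the increments are i.i.d. (Brownian part plus compound Poisson jumps),
    but the triplet (drift, sigma, jump rate) changes from period to period."""
    times, values = [0.0], [0.0]
    for k in range(len(tenors) - 1):
        n_steps = int(round((tenors[k + 1] - tenors[k]) / dt))
        for _ in range(n_steps):
            diffusion = drifts[k] * dt + sigmas[k] * np.sqrt(dt) * rng.standard_normal()
            jumps = rng.normal(0.0, jump_scale, size=rng.poisson(jump_rates[k] * dt)).sum()
            values.append(values[-1] + diffusion + jumps)  # independent increments
            times.append(times[-1] + dt)
    return np.array(times), np.array(values)

# Three tenor periods with different (illustrative) parameter triplets
t, x = simulate_piecewise_process(
    tenors=[0.0, 0.5, 1.0, 2.0],
    drifts=[0.01, 0.02, 0.0],
    sigmas=[0.10, 0.15, 0.20],
    jump_rates=[2.0, 5.0, 1.0],
    jump_scale=0.02,
)
```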

    Modeling and Generating Dependent Risk Processes for IRM and DFA

    Modern Integrated Risk Management (IRM) and Dynamic Financial Analysis (DFA) rely in great part on an appropriate modeling of the stochastic behavior of the various risky assets and processes that influence the performance of the company under consideration. A major challenge here is a more substantial and realistic description and modeling of the various complex dependence structures between such risks, which show up on all scales. In this presentation, we propose some approaches to modeling and generating (simulating) dependent risk processes in the framework of collective risk theory, in particular with respect to dependent claim number processes of Poisson type (homogeneous and non-homogeneous) and compound Poisson processes.
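    A standard way to obtain dependent claim number processes of Poisson type is the common-shock construction, in which two Poisson counts share an independent Poisson component. The sketch below illustrates that generic idea together with a compound Poisson aggregation step; it is a textbook construction offered as background, not necessarily the specific approach taken in the presentation, and the intensities and exponential severities are placeholders.

```python
import numpy as np

rng = np.random.default_rng(42)

def dependent_poisson_counts(lam1, lam2, lam_common, size):
    """Common-shock construction: N1 = M1 + M0 and N2 = M2 + M0 with independent
    Poisson components M0, M1, M2, giving positively correlated Poisson counts."""
    m0 = rng.poisson(lam_common, size)
    return rng.poisson(lam1, size) + m0, rng.poisson(lam2, size) + m0

def compound_poisson_totals(counts, severity_mean=10.0):
    """Aggregate loss per period: sum of exponentially distributed claim severities."""
    return np.array([rng.exponential(severity_mean, n).sum() for n in counts])

n1, n2 = dependent_poisson_counts(lam1=3.0, lam2=4.0, lam_common=2.0, size=10_000)
s1, s2 = compound_poisson_totals(n1), compound_poisson_totals(n2)
print(np.corrcoef(n1, n2)[0, 1])  # close to 2 / sqrt(5 * 6) ~ 0.37
```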

    Computational model of mesenchymal migration in 3D under chemotaxis

    Cell chemotaxis is an important characteristic of cellular migration, which takes part in crucial aspects of life and development. In this work, we propose a novel in silico model of mesenchymal 3D migration with competing protrusions under a chemotactic gradient. Based on recent experimental observations, we identify three main stages that regulate mesenchymal chemotaxis: chemosensing, dendritic protrusion dynamics and cell–matrix interactions. Each of these features is therefore considered as a different module of the main regulatory computational algorithm. The numerical model was particularized for the case of fibroblast chemotaxis under a PDGF-bb gradient. Fibroblast migration was simulated in two different 3D matrices – collagen and fibrin – and under several PDGF-bb concentrations. The model results were validated through qualitative and quantitative comparison with in vitro studies. Our numerical predictions of cell trajectories and speeds were within the measured in vitro ranges in both collagen and fibrin matrices, although in fibrin the migration speed of fibroblasts is very low because fibrin is a stiffer and more entangling matrix. Testing different PDGF-bb concentrations, we observed that increasing this factor increases the migration speed until a peak is reached at 1 ng mL⁻¹, after which the speed diminishes again. Moreover, we observed that fibrin exerts a dampening effect on migration, significantly affecting migration efficiency.
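    The model itself couples chemosensing, protrusion dynamics and cell–matrix interaction modules; as a much simpler point of reference, the sketch below simulates a 3D biased random walk in which each step blends a random direction with the chemotactic gradient direction, and a lower speed stands in for a stiffer, more entangling matrix. All parameter values (speed, bias, time step) are hypothetical, and the sketch does not reproduce the protrusion-based algorithm of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def chemotactic_walk(n_steps=500, dt_min=1.0, speed_um_min=0.3, bias=0.4,
                     gradient=(1.0, 0.0, 0.0)):
    """3D biased random walk: each step mixes a random unit vector with the
    chemotactic gradient direction; bias in [0, 1] sets gradient alignment."""
    g = np.asarray(gradient, dtype=float)
    g /= np.linalg.norm(g)
    positions = np.zeros((n_steps + 1, 3))
    for i in range(n_steps):
        rand_dir = rng.standard_normal(3)
        rand_dir /= np.linalg.norm(rand_dir)
        step_dir = (1.0 - bias) * rand_dir + bias * g
        step_dir /= np.linalg.norm(step_dir)
        positions[i + 1] = positions[i] + speed_um_min * dt_min * step_dir
    return positions

# A lower speed mimics migration in a stiffer, more entangling matrix such as fibrin
traj_collagen = chemotactic_walk(speed_um_min=0.3)
traj_fibrin = chemotactic_walk(speed_um_min=0.1)
```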

    Towards the Efficient Probabilistic Characterization of Tropical Cyclone-Generated Storm Surge Hazards Under Stationary and Nonstationary Conditions

    The scarcity of observations at any single location confounds the probabilistic characterization of tropical cyclone-generated storm surge hazards using annual maxima and peaks-over-threshold methods. The empirical simulation technique (EST; Borgman et al., 1992) and the joint probability method (JPM; Myers, 1970) are indirect approaches that estimate the probability distribution of the response variable of interest (i.e. storm surge) from the probability distributions of predictor variables (e.g. storm size, storm intensity, etc.). In the first part of this work, the relative performance of the EST and the JPM is evaluated via stochastic simulation methods. It is shown that the JPM has greater predictive capability for estimating the frequency of tropical cyclone winds, an efficient proxy for storm surge. The traditional attractions of the EST have been its economy and ease of implementation; more efficient numerical approximation schemes such as Bayesian quadrature now exist, which allow for more cost-effective implementation of the JPM. In addition, typical enhancements of the original EST approach, such as the introduction of synthetic storms to complement the historical sample, are largely ineffective. These observations indicate that the EST should no longer be considered a practical approach for the robust and reliable estimation of the exceedance probabilities of storm surge levels, as required for actuarial purposes, engineering design and flood risk management in tropical cyclone-prone regions. The JPM is, however, not applicable to extratropical storm-prone regions or nonstationary phenomena. Additionally, the JPM requires the evaluation of a multidimensional integral composed of the product of marginal and conditional probability distributions of storm descriptors. This integral is typically approximated as a weighted summation of discrete function evaluations in each dimension, extended to D dimensions by tensor product rules. To adequately capture the dynamics of the underlying physical process (storm surge driven by tropical cyclone wind fields), one must retain a large number of explanatory variables in the integral. The complexity and cost of the joint probability problem, however, increase exponentially with dimension, precluding the inclusion of more than a few (≤4) stochastic variables. In the second part of the work, we extend stochastic simulation approaches to the classical joint probability problem. The successful application of stochastic simulation to the storm surge frequency problem requires a new paradigm: a regression function constructed by carefully selecting an optimal training set from the storm sample space, such that the number of support nodes required for efficient interpolation grows nonexponentially while preserving the performance of an equivalent product grid. Apart from retaining the predictive capability of the JPM, the stochastic simulation approach also allows nonstationary phenomena, such as the effects of climate change on tropical cyclone activity, to be modeled efficiently. A great utility of the stochastic approach is that the random sampling scheme is readily modified to conduct empirical simulation, if required, in place of parametric simulation. The enhanced empirical simulation technique attains predictive capability comparable with the JPM and the parametric simulation approach, while retaining the suitability of empirical methods for situations that confound parametric methods, such as application to extratropical cyclones and complexly distributed data. Together, the parametric and empirical simulation techniques will enable seamless flood hazard estimation for the entire coastline of the United States, with simple elaborations where needed to allow for the joint occurrence of tropical and extratropical storms as compound stochastic processes. The stochastic approaches proposed hold great promise for the efficient probabilistic modeling of other multi-parameter systems such as earthquakes and riverine floods.
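    The JPM integral described above is, in its classical form, a tensor-product weighted summation over a discretized grid of storm descriptors, which is exactly where the exponential growth with dimension comes from. The sketch below shows that baseline discrete form for a toy three-parameter case; the independent marginal weights, the placeholder surge response function and all numerical values are assumptions for illustration, and the regression-based stochastic simulation scheme proposed in the work is not reproduced here.

```python
import numpy as np
from itertools import product

def jpm_exceedance_rate(grids, weights, response, threshold, storm_rate):
    """Classical JPM as a tensor-product quadrature: the annual rate of exceeding
    `threshold` is the storm rate times the probability-weighted fraction of grid
    parameter combinations whose surge response exceeds it. `grids` holds one 1D
    grid per storm descriptor and `weights` the matching probability masses
    (treated as independent here for simplicity)."""
    rate = 0.0
    for combo in product(*(range(len(g)) for g in grids)):
        x = np.array([grids[d][i] for d, i in enumerate(combo)])
        w = np.prod([weights[d][i] for d, i in enumerate(combo)])
        if response(x) > threshold:
            rate += w
    return storm_rate * rate

# Toy descriptors: central pressure deficit (hPa), radius to max winds (km), forward speed (m/s)
grids = [np.linspace(20, 100, 9), np.linspace(15, 60, 6), np.linspace(2, 10, 5)]
weights = [np.ones_like(g) / g.size for g in grids]  # uniform masses, illustration only

surge_m = lambda x: 0.05 * x[0] + 0.02 * x[1] - 0.1 * x[2]  # placeholder surge response
rate = jpm_exceedance_rate(grids, weights, surge_m, threshold=3.0, storm_rate=0.4)
print(f"annual exceedance rate ~ {rate:.3f}, return period ~ {1.0 / rate:.1f} years")
```

    Note that the nested loop visits every grid combination, so its cost grows as the product of the grid sizes; this is why the abstract argues that keeping more than a few stochastic variables quickly becomes impractical with product rules.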

    Joint epidemic and spatio-temporal approach for modelling disease outbreaks

    When forecasting epidemics, the main interests lie in understanding the determinants of transmission and predicting who is likely to become infected next. For vector-borne diseases, however, data availability and alteration can be an obstacle to doing so: climate change and globalized trade contribute to the expansion of vector habitats into new territories and hence to the distribution of many diseases. As a consequence, in the face of rapidly changing environmental, ecological and climatic conditions, previously well-fitted models may soon become obsolete. The demand for precise forecasting and prediction of the spread of a disease requires a model that is flexible with respect to the availability of vector data, unobserved random effects, and only partially observed disease incidence data. We therefore introduce a combination of a mechanistic SIR model with principled data-based methods from geostatistics. We allow flexibility by replacing a parameter of a continuous-time mechanistic model with a random effect assumed to stem from a spatial Gaussian process. By employing Bayesian inference techniques, we identify points in space where transmission (as opposed to simply incidence) is unusually high or low compared with a national average. We explore how well the spatial random effect can be recovered within a mechanistic model when only partially observed outbreak data are available. To this end, we extended the Python probabilistic programming library PyMC3 with our own sampler to effectively impute missing infection and removal time data.
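    The abstract describes replacing a parameter of a mechanistic SIR model with a spatial Gaussian process random effect and fitting it with Bayesian inference in PyMC3. The sketch below shows only the forward-simulation side of that idea: a spatially correlated transmission rate field is drawn from a Gaussian process and a simple SIR trajectory is run per location. The squared-exponential kernel, its hyperparameters and all epidemic parameters are assumptions for illustration; the Bayesian inference and the custom PyMC3 sampler from the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(7)

def gp_log_beta(coords_km, mean_log_beta=-1.0, variance=0.25, length_scale=50.0):
    """Draw a spatially correlated log-transmission-rate field from a Gaussian
    process with a squared-exponential kernel over location coordinates (km)."""
    d = np.linalg.norm(coords_km[:, None, :] - coords_km[None, :, :], axis=-1)
    cov = variance * np.exp(-0.5 * (d / length_scale) ** 2) + 1e-9 * np.eye(len(coords_km))
    return rng.multivariate_normal(mean_log_beta * np.ones(len(coords_km)), cov)

def sir_incidence(beta, gamma=0.1, n=10_000, i0=10, days=200):
    """Deterministic SIR trajectory for one location; returns daily new infections."""
    s, i = n - i0, i0
    incidence = []
    for _ in range(days):
        new_inf = beta * s * i / n
        new_rec = gamma * i
        s, i = s - new_inf, i + new_inf - new_rec
        incidence.append(new_inf)
    return np.array(incidence)

coords = rng.uniform(0, 200, size=(30, 2))   # 30 locations in a 200 km square
betas = np.exp(gp_log_beta(coords))          # spatially varying transmission rates
curves = np.array([sir_incidence(b) for b in betas])  # one epidemic curve per location
```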