
    Semiconductor manufacturing simulation design and analysis with limited data

    This paper discusses simulation design and analysis for Silicon Carbide (SiC) manufacturing operations management at the New York Power Electronics Manufacturing Consortium (PEMC) facility. Prior work addressed the development of manufacturing system simulation as decision support for the strategic equipment portfolio selection problem in SiC fab design [1]. As we move into the phase of collecting data from the equipment purchased for the PEMC facility, we discuss how to redesign our manufacturing simulations and analyze their outputs to overcome the challenges that naturally arise in the presence of limited fab data. We conclude with insights on how an approach that reflects learning from data can enable our discrete-event stochastic simulation to accurately estimate performance measures for SiC manufacturing at the PEMC facility.
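
A rough, self-contained sketch of the kind of discrete-event stochastic simulation the abstract refers to, here a single fab station modeled as an M/M/1 queue, with the service rate drawn from a distribution rather than fixed, to reflect input uncertainty under limited data. All numbers and names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def mm1_cycle_time(arrival_rate, service_rate, n_jobs=20000):
    """Discrete-event simulation of a single station (M/M/1 queue):
    returns the average cycle time (waiting + processing) per job."""
    t_arrive = np.cumsum(rng.exponential(1 / arrival_rate, n_jobs))
    service = rng.exponential(1 / service_rate, n_jobs)
    depart = np.empty(n_jobs)
    depart[0] = t_arrive[0] + service[0]
    for i in range(1, n_jobs):
        # A job starts when it arrives and the server is free.
        depart[i] = max(t_arrive[i], depart[i - 1]) + service[i]
    return float(np.mean(depart - t_arrive))

# Input uncertainty from limited data: draw the service rate from a
# posterior-like gamma distribution instead of fixing a point estimate.
mu_draws = rng.gamma(shape=100.0, scale=2.0 / 100.0, size=20)  # mean 2.0
estimates = [mm1_cycle_time(1.0, mu) for mu in mu_draws]
print(np.mean(estimates))
```

Averaging over service-rate draws propagates parameter uncertainty into the performance estimate, rather than reporting a single cycle time conditioned on a possibly poor point estimate.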

    Modeling survival times using frailty models

    Traditional survival models, including Kaplan-Meier, Nelson-Aalen and Cox regression, assume a homogeneous population; they are therefore inappropriate in the presence of heterogeneity. The introduction of frailty models four decades ago addressed this limitation. Fundamentally, frailty models apply the same principles of survival theory but incorporate a multiplicative term in the hazard to address the impact of frailty and account for any underlying unobserved heterogeneity. These frailty models are used to relate survival durations for censored data to a number of pre-operative, operative and post-operative patient-related variables in order to identify risk factors. The study focuses mainly on fitting shared and unshared frailty models to account for unobserved frailty within the data while simultaneously identifying the risk factors that best predict the hazard of death.
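
As a rough illustration of the multiplicative frailty setup described above, the sketch below simulates survival times from a shared gamma-frailty proportional-hazards model, h(t | x, Z) = Z h0 exp(beta x), with the frailty Z shared within each group. All parameter values and names are hypothetical, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_shared_frailty(n_groups=200, group_size=5,
                            baseline_rate=0.1, beta=0.5,
                            frailty_var=0.4):
    """Simulate survival times from a shared gamma-frailty model:
    h(t | x, Z) = Z * baseline_rate * exp(beta * x), where Z is a
    gamma frailty (mean 1, variance frailty_var) shared in a group."""
    theta = 1.0 / frailty_var
    times, groups, xs = [], [], []
    for g in range(n_groups):
        z = rng.gamma(shape=theta, scale=1.0 / theta)  # E[Z] = 1
        for _ in range(group_size):
            x = rng.binomial(1, 0.5)  # binary covariate, e.g. a risk factor
            rate = z * baseline_rate * np.exp(beta * x)
            times.append(rng.exponential(1.0 / rate))
            groups.append(g)
            xs.append(x)
    return np.array(times), np.array(groups), np.array(xs)

times, groups, xs = simulate_shared_frailty()
# Subjects with x = 1 face a higher hazard, hence shorter survival on average.
print(times[xs == 1].mean() < times[xs == 0].mean())
```

Fitting such a model to real censored data would use a dedicated survival library; the point here is only the structure of the multiplicative frailty term.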

    Hierarchical models for semi-competing risks data with application to quality of end-of-life care for pancreatic cancer

    Readmission following discharge from an initial hospitalization is a key marker of quality of health care in the United States. For the most part, readmission has been used to study quality of care for patients with acute health conditions, such as pneumonia and heart failure, with analyses typically based on a logistic-Normal generalized linear mixed model. Applying this model to study readmission among patients with increasingly prevalent advanced health conditions such as pancreatic cancer is problematic, however, because it ignores death as a competing risk. A more appropriate analysis is to embed such studies within the semi-competing risks framework. To our knowledge, however, no comprehensive statistical methods have been developed for cluster-correlated semi-competing risks data. In this paper we propose a novel hierarchical modeling framework for the analysis of cluster-correlated semi-competing risks data. The framework permits parametric or non-parametric specifications for a range of model components, including baseline hazard functions and distributions for key random effects, giving analysts substantial flexibility as they consider their own analyses. Estimation and inference are performed within the Bayesian paradigm, since it facilitates the straightforward characterization of (posterior) uncertainty for all model parameters, including hospital-specific random effects. The proposed framework is used to study the risk of readmission among 5,298 Medicare beneficiaries diagnosed with pancreatic cancer at 112 hospitals in the six New England states between 2000 and 2009, specifically to investigate the role of patient-level risk factors and to characterize variation in risk across hospitals that is not explained by differences in patient case-mix.
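
A minimal sketch of the semi-competing risks structure the abstract describes: readmission (non-terminal) can be followed by death (terminal), but death precludes later readmission. Constant hazards and all rate values are simplifying assumptions for illustration only; the paper's framework is far more general.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_semi_competing(n=5000, lam_readmit=0.04,
                            lam_death=0.02, lam_death_after=0.05):
    """Simulate an illness-death model: latent exponential times race
    for readmission vs death; if readmission happens first, the
    subject then faces a (possibly different) death hazard."""
    readmit_time = rng.exponential(1 / lam_readmit, n)
    death_time = rng.exponential(1 / lam_death, n)
    readmitted = readmit_time < death_time
    # After readmission the death hazard changes to lam_death_after.
    extra = rng.exponential(1 / lam_death_after, n)
    death_obs = np.where(readmitted, readmit_time + extra, death_time)
    return readmitted, readmit_time, death_obs

readmitted, rt, dt = simulate_semi_competing()
print(readmitted.mean())  # fraction readmitted before death
```

With these rates, roughly lam_readmit / (lam_readmit + lam_death) of subjects are readmitted before dying, which is why a logistic model that ignores death as a competing risk distorts the readmission outcome.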

    Impact of New Madrid Seismic Zone Earthquakes on the Central USA, Vol. 1 and 2

    The information presented in this report has been developed to support the Catastrophic Earthquake Planning Scenario workshops held by the Federal Emergency Management Agency. Four FEMA Regions (Regions IV, V, VI and VII) were involved in the New Madrid Seismic Zone (NMSZ) scenario workshops. The four FEMA Regions include eight states, namely Illinois, Indiana, Kentucky, Tennessee, Alabama, Mississippi, Arkansas and Missouri. The earthquake impact assessment presented hereafter employs an analysis methodology comprising three major components: hazard, inventory and fragility (or vulnerability). The hazard characterizes not only the shaking of the ground but also the consequential transient and permanent deformation of the ground due to strong ground shaking, as well as fire and flooding. The inventory comprises all assets in a specific region, including the built environment and population data. Fragility or vulnerability functions relate the severity of shaking to the likelihood of reaching or exceeding damage states (light, moderate, extensive and near-collapse, for example). Social impact models are also included and employ physical infrastructure damage results to estimate the effects on exposed communities. Whereas the modeling software packages used (HAZUS MR3; FEMA, 2008; and MAEviz, Mid-America Earthquake Center, 2008) provide default values for all of the above, most of these default values were replaced by components of traceable provenance and higher reliability than the default data, as described below. The hazard employed in this investigation includes ground shaking for a single scenario event representing the rupture of all three New Madrid fault segments. The NMSZ consists of three fault segments: the northeast segment, the Reelfoot thrust or central segment, and the southwest segment. Each segment is assumed to generate a deterministic magnitude 7.7 (Mw7.7) earthquake caused by a rupture over the entire length of the segment.
The US Geological Survey (USGS) approved the employed magnitude and hazard approach. The combined rupture of all three segments simultaneously is designed to approximate the sequential rupture of all three segments over time. The magnitude of Mw7.7 is retained for the combined rupture. Full liquefaction susceptibility maps for the entire region have been developed and are used in this study. Inventory is enhanced through the use of the Homeland Security Infrastructure Program (HSIP) 2007 and 2008 Gold Datasets (NGA Office of America, 2007). These datasets contain various types of critical infrastructure that are key inventory components for earthquake impact assessment. Transportation and utility facility inventories are improved, while regional natural gas and oil pipelines are added to the inventory, alongside high-potential-loss facility inventories. The National Bridge Inventory (NBI, 2008) and other state and independent data sources are utilized to improve the inventory. New fragility functions derived by the MAE Center are employed in this study for both buildings and bridges, providing more regionally applicable estimates of damage for these infrastructure components. Default fragility values are used to determine damage likelihoods for all other infrastructure components. The study reports new analyses using MAE Center-developed transportation network flow models that estimate changes in traffic flow and travel time due to earthquake damage. Utility network modeling was also undertaken to provide damage estimates for facilities and pipelines. An approximate flood risk model was assembled to identify areas that are likely to be flooded as a result of dam or levee failure. A social vulnerability analysis identifies portions of the eight-state study region that are especially vulnerable due to various factors such as age, income, disability, and language proficiency.
Social impact models include estimates of displaced and shelter-seeking populations as well as commodity and medical requirements. Lastly, search and rescue requirements quantify the number of teams and personnel required to clear debris and search for trapped victims. The results indicate that Tennessee, Arkansas, and Missouri are most severely impacted. Illinois and Kentucky are also impacted, though not as severely as the previous three states. Nearly 715,000 buildings are damaged in the eight-state study region. About 42,000 search and rescue personnel working in 1,500 teams are required to respond to the earthquakes. Damage to critical infrastructure (essential facilities, transportation and utility lifelines) is substantial in the 140 impacted counties near the rupture zone, including 3,500 damaged bridges and nearly 425,000 breaks and leaks to both local and interstate pipelines. Approximately 2.6 million households are without power after the earthquake. Nearly 86,000 injuries and fatalities result from damage to infrastructure. Nearly 130 hospitals are damaged, most of them located in the impacted counties near the rupture zone. There is extensive damage and substantial travel delay in both Memphis, Tennessee, and St. Louis, Missouri, hampering search and rescue as well as evacuation. Moreover, roughly 15 major bridges are unusable. Three days after the earthquake, 7.2 million people are still displaced and 2 million people seek temporary shelter. Direct economic losses for the eight states total nearly $300 billion, while indirect losses may be at least twice this amount. The contents of this report provide the various assumptions used to arrive at the impact estimates, detailed background on the above quantitative consequences, and a breakdown of the figures per sector at the FEMA region and state levels.
The information is presented in a manner suitable for personnel and agencies responsible for establishing response plans based on likely impacts of plausible earthquakes in the central USA.

    Sequential Design with Mutual Information for Computer Experiments (MICE): Emulation of a Tsunami Model

    Computer simulators can be computationally intensive to run over a large number of input values, as required for optimization and various uncertainty quantification tasks. The standard paradigm for the design and analysis of computer experiments is to employ Gaussian random fields to model computer simulators. Gaussian process models are trained on input-output data obtained from simulation runs at various input values. Following this approach, we propose a sequential design algorithm, MICE (Mutual Information for Computer Experiments), that adaptively selects the input values at which to run the computer simulator in order to maximize the expected information gain (mutual information) over the input space. The superior computational efficiency of the MICE algorithm compared to other algorithms is demonstrated on test functions and on a tsunami simulator, with overall gains of up to 20% in the latter case.
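
To make the selection step concrete, here is a heavily simplified 1-D sketch of a mutual-information-based design criterion: pick the candidate input whose Gaussian process predictive variance given the current design is largest relative to its variance given the remaining candidates. The kernel, length-scale, nugget value, and candidate grid are all illustrative assumptions; this is not the paper's implementation.

```python
import numpy as np

def rbf(a, b, ls=0.2):
    """Squared-exponential kernel between 1-D point sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_var(X, Xstar, noise=1e-6):
    """GP predictive variance at Xstar given (noisy) inputs X,
    for a unit-variance squared-exponential prior."""
    K = rbf(X, X) + noise * np.eye(len(X))
    ks = rbf(X, Xstar)
    return 1.0 - np.sum(ks * np.linalg.solve(K, ks), axis=0)

def mice_select(X_design, X_cand, nugget=1.0):
    """Pick the candidate x maximizing the ratio
    var(x | design) / var(x | design + other candidates),
    a simplified mutual-information-style criterion."""
    scores = []
    for i, x in enumerate(X_cand):
        rest = np.delete(X_cand, i)
        num = gp_var(X_design, np.array([x]))[0]
        den = gp_var(np.concatenate([X_design, rest]),
                     np.array([x]), noise=nugget)[0]
        scores.append(num / den)
    return X_cand[int(np.argmax(scores))]

X_design = np.array([0.1, 0.9])          # already-run simulator inputs
X_cand = np.linspace(0.0, 1.0, 21)       # candidate grid
x_next = mice_select(X_design, X_cand)
print(x_next)
```

With design points at 0.1 and 0.9, the criterion favors the unexplored middle of the input space, which is the qualitative behavior one expects from an information-gain rule.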

    ENERGETIC PARTICLE DIFFUSION IN CRITICALLY BALANCED TURBULENCE

    Observations and modeling suggest that the fluctuations in magnetized plasmas exhibit scale-dependent anisotropy, with more energy in the fluctuations perpendicular to the mean magnetic field than in the parallel fluctuations and the anisotropy increasing at smaller scales. The scale dependence of the anisotropy has not been studied in full-orbit simulations of particle transport in turbulent plasmas so far. In this paper, we construct a model of critically balanced turbulence, as suggested by Goldreich & Sridhar, and calculate energetic particle spatial diffusion coefficients using full-orbit simulations. The model uses an enveloped turbulence approach, where each two-dimensional wave mode with wavenumber k⊥ is packed into envelopes of length L following the critical balance condition, L ∝ k⊥^(-2/3), with the wave mode parameters changing between envelopes. Using full-orbit particle simulations, we find that both the parallel and perpendicular diffusion coefficients increase by a factor of two, compared to previous models with scale-independent anisotropy.
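
The envelope-length scaling quoted above follows from the standard Goldreich-Sridhar critical balance argument; a sketch in conventional notation (not reproduced from the paper):

```latex
% Critical balance: the linear (Alfvén) and nonlinear time scales match,
%   k_\parallel v_A \sim k_\perp v_{k_\perp},
% and a Kolmogorov-like perpendicular cascade gives
%   v_{k_\perp} \propto k_\perp^{-1/3},
% so
%   k_\parallel \propto k_\perp^{2/3},
% and the parallel correlation (envelope) length scales as
%   L \sim k_\parallel^{-1} \propto k_\perp^{-2/3}.
```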

    Cost-Effectiveness of Stronger Woodframe Buildings

    We examine the cost-effectiveness of improvements in woodframe buildings. These include retrofits, redesign measures, and improved quality in 19 hypothetical woodframe dwellings. We estimated cost-effectiveness for each improvement and each zip code in California. The dwellings were designed under the CUREE-Caltech Woodframe Project. Costs and seismic vulnerability were determined on a component-by-component basis using the Assembly Based Vulnerability method, within a nonlinear time-history structural-analysis framework and using full-size test specimen data. Probabilistic site hazard was calculated by zip code, considering site soil classification, and integrated with vulnerability to determine expected annualized repair cost. The approach provides insight into the uncertainty of loss at varying shaking levels. We calculated the present value of benefit to determine cost-effectiveness in terms of the benefit-cost ratio (BCR). We find that one retrofit exhibits BCRs as high as 8 and in excess of 1 in half of California zip codes. Four retrofit or redesign measures are cost-effective in at least some locations. Higher quality is estimated to save thousands of dollars per house. Results are illustrated by maps for the Los Angeles and San Francisco regions and are available for every zip code in California.
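
The BCR calculation described above can be sketched as follows: the reduction in expected annualized repair cost is discounted over a planning horizon to a present value of benefit, then divided by the retrofit cost. The discount rate, horizon, and dollar figures below are hypothetical placeholders, not values from the study.

```python
import numpy as np

def present_value_benefit(delta_eal, rate=0.03, horizon=30.0):
    """Present value of an annualized loss reduction delta_eal
    ($/year), continuously discounted at `rate` over `horizon` years:
    PV = delta_eal * (1 - exp(-rate * horizon)) / rate."""
    return delta_eal * (1.0 - np.exp(-rate * horizon)) / rate

def benefit_cost_ratio(delta_eal, retrofit_cost, rate=0.03, horizon=30.0):
    """BCR > 1 means the discounted benefit exceeds the upfront cost."""
    return present_value_benefit(delta_eal, rate, horizon) / retrofit_cost

# Hypothetical example: a retrofit costing $5,000 that cuts expected
# annualized repair cost by $400/year.
bcr = benefit_cost_ratio(400.0, 5000.0)
print(round(bcr, 2))  # → 1.58
```

Because the benefit accrues annually while the cost is paid once, modest reductions in expected annualized loss can justify substantial upfront retrofit spending, which is the mechanism behind BCRs well above 1 in high-hazard zip codes.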