
    Semiconductor manufacturing simulation design and analysis with limited data

    This paper discusses simulation design and analysis for Silicon Carbide (SiC) manufacturing operations management at the New York Power Electronics Manufacturing Consortium (PEMC) facility. Prior work has addressed the development of manufacturing system simulation as decision support for solving the strategic equipment portfolio selection problem in the SiC fab design [1]. As we move into the phase of collecting data from the equipment purchased for the PEMC facility, we discuss how to redesign our manufacturing simulations and analyze their outputs to overcome the challenges that naturally arise in the presence of limited fab data. We conclude with insights on how an approach aimed at reflecting learning from data can enable our discrete-event stochastic simulation to accurately estimate the performance measures for SiC manufacturing at the PEMC facility.
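
    The abstract gives no implementation details; as a hedged illustration of the general idea of letting limited data drive a discrete-event stochastic simulation, the Python sketch below (a single-tool station with made-up numbers, not the PEMC model) resamples a processing-rate parameter from a conjugate posterior before each replication, so the spread of the output estimate reflects how little data is available.

        import numpy as np

        rng = np.random.default_rng(0)

        def sample_posterior_rate(obs_times, prior_shape=1.0, prior_rate=1.0):
            # Gamma-Exponential conjugate update: posterior over a tool's processing
            # rate given a (possibly very small) sample of observed processing times.
            shape = prior_shape + len(obs_times)
            rate = prior_rate + np.sum(obs_times)
            return rng.gamma(shape, 1.0 / rate)

        def simulate_station(arrival_rate, service_rate, horizon):
            # Minimal single-tool discrete-event simulation (FIFO queue) returning
            # the mean cycle time of the wafers processed over the horizon.
            t, depart, cycle_times = 0.0, 0.0, []
            while t < horizon:
                t += rng.exponential(1.0 / arrival_rate)        # next wafer arrival
                service = rng.exponential(1.0 / service_rate)   # tool processing time
                depart = max(depart, t) + service               # wafer leaves the tool
                cycle_times.append(depart - t)
            return np.mean(cycle_times)

        # Propagate the parameter uncertainty from only three observed processing
        # times into the simulated performance estimate.
        observed = np.array([2.1, 1.8, 2.5])
        estimates = [simulate_station(0.3, sample_posterior_rate(observed), horizon=1_000)
                     for _ in range(200)]
        print(f"mean cycle time: {np.mean(estimates):.2f}, "
              f"95% interval: {np.percentile(estimates, 2.5):.2f}-{np.percentile(estimates, 97.5):.2f}")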

    The role of learning on industrial simulation design and analysis

    The capability of modeling real-world system operations has turned simulation into an indispensable problem-solving methodology for business system design and analysis. Today, simulation supports decisions ranging from sourcing to operations to finance, starting at the strategic level and proceeding towards the tactical and operational levels of decision-making. In such a dynamic setting, the practice of simulation goes beyond being a static problem-solving exercise and requires integration with learning. This article discusses the role of learning in simulation design and analysis, motivated by the needs of industrial problems, and describes how selected tools of statistical learning can be utilized for this purpose.

    Producing power-law distributions and damping word frequencies with two-stage language models

    Standard statistical models of language fail to capture one of the most striking properties of natural languages: the power-law distribution in the frequencies of word tokens. We present a framework for developing statistical models that can generically produce power laws, breaking generative models into two stages. The first stage, the generator, can be any standard probabilistic model, while the second stage, the adaptor, transforms the word frequencies of this model to provide a closer match to natural language. We show that two commonly used Bayesian models, the Dirichlet-multinomial model and the Dirichlet process, can be viewed as special cases of our framework. We discuss two stochastic processes, the Chinese restaurant process and its two-parameter generalization based on the Pitman-Yor process, that can be used as adaptors in our framework to produce power-law distributions over word frequencies. We show that these adaptors justify common estimation procedures based on logarithmic or inverse-power transformations of empirical frequencies. In addition, taking the Pitman-Yor Chinese restaurant process as an adaptor justifies the appearance of type frequencies in formal analyses of natural language and improves the performance of a model for unsupervised learning of morphology.
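
    As a hedged illustration of the adaptor idea, the Python sketch below draws tokens from the two-parameter (Pitman-Yor) Chinese restaurant process over a uniform base generator; the vocabulary size and hyperparameter values are arbitrary choices for this example, not those used in the paper.

        import random
        from collections import Counter

        def pitman_yor_crp(n_tokens, discount=0.8, concentration=1.0, vocab_size=50_000, seed=0):
            # Two-parameter Chinese restaurant process used as an adaptor: every table
            # is labelled with a word type drawn from a uniform base generator, and the
            # token frequencies it produces are heavy-tailed for 0 < discount < 1.
            rng = random.Random(seed)
            tables, labels, tokens = [], [], []
            for _ in range(n_tokens):
                # An existing table k is chosen with weight (count_k - discount); a new
                # table opens with weight (concentration + discount * number_of_tables).
                weights = [c - discount for c in tables]
                weights.append(concentration + discount * len(tables))
                k = rng.choices(range(len(weights)), weights=weights)[0]
                if k == len(tables):                  # new table: draw a fresh word type
                    tables.append(1)
                    labels.append(rng.randrange(vocab_size))
                else:                                 # existing table: reuse its word type
                    tables[k] += 1
                tokens.append(labels[k])
            return tokens

        ranked = sorted(Counter(pitman_yor_crp(20_000)).values(), reverse=True)
        print(ranked[:10])   # a few very frequent types followed by a long tail of rare ones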

    Tools for Assessing Climate Impacts on Fish and Wildlife

    Climate change is already affecting many fish and wildlife populations. Managing these populations requires an understanding of the nature, magnitude, and distribution of current and future climate impacts. Scientists and managers have at their disposal a wide array of models for projecting climate impacts that can be used to build such an understanding. Here, we provide a broad overview of the types of models available for forecasting the effects of climate change on key processes that affect fish and wildlife habitat (hydrology, fire, and vegetation), as well as on individual species distributions and populations. We present a framework for how climate-impacts modeling can be used to address management concerns, providing examples of model-based assessments of climate impacts on salmon populations in the Pacific Northwest, fire regimes in the boreal region of Canada, prairies and savannas in the Willamette Valley-Puget Sound Trough-Georgia Basin ecoregion, and marten (Martes americana) populations in the northeastern United States and southeastern Canada. We also highlight some key limitations of these models and discuss how such limitations should be managed. We conclude with a general discussion of how these models can be integrated into fish and wildlife management.

    Experimental analysis and modeling of orthogonal cutting using material and friction models

    In this study, a process model for orthogonal cutting processes is proposed. The model involves the primary and secondary deformation zones. The primary shear zone is modeled by a Johnson-Cook constitutive relationship and a shear plane having constant thickness. The secondary deformation zone is modeled semi-analytically, where the coefficient of friction is calibrated experimentally. The cutting forces predicted using the calibrated sliding friction coefficients are in good agreement with the measurements. The experimental investigation of sliding friction coefficients also shows promising results for the proposed model, which is still under development.
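
    For reference, the Johnson-Cook flow stress has the standard form sigma = (A + B*eps^n) * (1 + C*ln(eps_dot/eps_dot_0)) * (1 - T*^m), with homologous temperature T* = (T - T_room)/(T_melt - T_room). The Python sketch below evaluates it with illustrative parameter values (roughly those often quoted for AISI 1045 steel), not the calibration used in this study.

        import math

        def johnson_cook_flow_stress(strain, strain_rate, temperature,
                                     A=553.1e6, B=600.8e6, n=0.234, C=0.0134, m=1.0,
                                     ref_strain_rate=1.0, T_room=293.0, T_melt=1733.0):
            # sigma = (A + B*eps^n) * (1 + C*ln(eps_dot/eps_dot_0)) * (1 - T*^m), in Pa.
            # Parameter defaults are illustrative, not values calibrated in the paper.
            hardening = A + B * strain ** n
            rate_term = 1.0 + C * math.log(max(strain_rate, 1e-12) / ref_strain_rate)
            t_star = (temperature - T_room) / (T_melt - T_room)
            softening = 1.0 - max(t_star, 0.0) ** m
            return hardening * rate_term * softening

        # Flow stress in the primary shear zone at 30% strain, 1e4 1/s and 600 K:
        print(f"{johnson_cook_flow_stress(0.3, 1e4, 600.0) / 1e6:.0f} MPa")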

    Syntactic Topic Models

    The syntactic topic model (STM) is a Bayesian nonparametric model of language that discovers latent distributions of words (topics) that are both semantically and syntactically coherent. The STM models dependency parsed corpora where sentences are grouped into documents. It assumes that each word is drawn from a latent topic chosen by combining document-level features and the local syntactic context. Each document has a distribution over latent topics, as in topic models, which provides the semantic consistency. Each element in the dependency parse tree also has a distribution over the topics of its children, as in latent-state syntax models, which provides the syntactic consistency. These distributions are convolved so that the topic of each word is likely under both its document and syntactic context. We derive a fast posterior inference algorithm based on variational methods. We report qualitative and quantitative studies on both synthetic data and hand-parsed documents. We show that the STM is a more predictive model of language than current models based only on syntax or only on topics.
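
    The generative story can be sketched in a few lines; the toy Python version below (with invented topic, transition, and vocabulary values, and no posterior inference) only illustrates how a word's topic is made likely under both its document and its syntactic parent by multiplying and renormalizing the two distributions.

        import numpy as np

        rng = np.random.default_rng(0)

        def draw_word_topic(doc_topic_dist, parent_transition_dist):
            # Combine document-level topic weights with the parent node's transition
            # weights element-wise, renormalize, and sample the word's topic.
            combined = doc_topic_dist * parent_transition_dist
            return rng.choice(len(combined), p=combined / combined.sum())

        def generate_sentence(children, root, doc_topic_dist, transitions, topic_word_dists, vocab):
            # Walk the dependency parse top-down, giving each word a topic that is
            # plausible under both its document and its syntactic parent.
            words, stack = {}, [(root, np.ones(len(doc_topic_dist)) / len(doc_topic_dist))]
            while stack:
                node, parent_dist = stack.pop()
                z = draw_word_topic(doc_topic_dist, parent_dist)
                words[node] = vocab[rng.choice(len(vocab), p=topic_word_dists[z])]
                for child in children.get(node, []):
                    stack.append((child, transitions[z]))
            return words

        # Toy example: 2 topics, 4-word vocabulary, a 3-node parse rooted at node 0.
        vocab = ["cell", "protein", "bank", "loan"]
        doc_theta = np.array([0.9, 0.1])                    # the document leans to topic 0
        transitions = np.array([[0.8, 0.2], [0.3, 0.7]])    # parent topic -> child topic
        topic_words = np.array([[0.45, 0.45, 0.05, 0.05],
                                [0.05, 0.05, 0.45, 0.45]])
        print(generate_sentence({0: [1, 2]}, 0, doc_theta, transitions, topic_words, vocab))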

    Coastal Tropical Convection in a Stochastic Modeling Framework

    Recent research has suggested that the overall dependence of convection near coasts on large-scale atmospheric conditions is weaker than over the open ocean or inland areas. This is because in coastal regions convection is often supported by mesoscale land-sea interactions and the topography of coastal areas. As these effects are not resolved and not included in standard cumulus parametrization schemes, coastal convection is among the most poorly simulated phenomena in global models. To outline a possible parametrization framework for coastal convection, we develop an idealized modeling approach and test its ability to capture the main characteristics of coastal convection. The new approach first develops a decision algorithm, or trigger function, for the existence of coastal convection. The function is then applied in a stochastic cloud model to increase the occurrence probability of deep convection when land-sea interactions are diagnosed to be important. The results suggest that the combination of the trigger function with a stochastic model is able to capture the occurrence of deep convection in atmospheric conditions often found for coastal convection. When coastal effects are deemed to be present, the spatial and temporal organization of clouds that has been documented from observations is well captured by the model. The presented modeling approach therefore has potential to improve the representation of clouds and convection in global numerical weather forecasting and climate models.
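
    A minimal sketch of how such a trigger function could modulate a stochastic cloud model is given below; the thresholds, variable names, and enhancement factor are assumptions made for illustration, not the scheme developed in the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        def coastal_trigger(dist_to_coast_km, land_sea_temp_diff_K, onshore_wind_ms):
            # Toy decision function: flag columns where land-sea contrast and an onshore
            # breeze make mesoscale forcing of convection likely (illustrative thresholds).
            return (dist_to_coast_km < 100.0 and land_sea_temp_diff_K > 2.0
                    and onshore_wind_ms > 1.0)

        def deep_convection_occurs(base_probability, trigger_active, enhancement=3.0):
            # Stochastic cloud model step: the large-scale occurrence probability of deep
            # convection is boosted when the coastal trigger fires.
            p = base_probability * (enhancement if trigger_active else 1.0)
            return rng.random() < min(p, 1.0)

        # Example column: 30 km from the coast, afternoon land-sea contrast, weak forcing.
        active = coastal_trigger(dist_to_coast_km=30.0, land_sea_temp_diff_K=3.5, onshore_wind_ms=2.0)
        events = sum(deep_convection_occurs(0.1, active) for _ in range(1000))
        print(f"deep convection in {events}/1000 stochastic draws (trigger active: {active})")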

    FEA modeling of orthogonal cutting of steel: a review

    Orthogonal cutting is probably the most studied machining operation for metals. Its simulation with the Finite Element Analysis (FEA) method is of paramount academic interest. 2D models, and to a lesser extent 3D models, have been developed to predict cutting forces, chip formation, heat generation and temperature fields, residual stress distribution, and tool wear. This paper first presents a brief review of the scientific literature with a focus on FEA modelling of the orthogonal cutting process for steels. Emphasis is then placed on the building blocks of the simulation model, such as the formulation of the mechanical problem, the material constitutive model, the friction models, and the damage laws.

    Econometric-Process Models for Integrated Assessment of Agricultural Production Systems

    This paper develops the conceptual and empirical basis for a class of empirical economic production models that can be linked to site-specific bio-physical models for use in integrated assessment research. Site-specific data are used to estimate econometric production models, and these data and models are then incorporated into a simulation model that represents the decision making process of the farmer as a sequence of discrete or continuous land use and input use decisions. This discrete/continuous structure of the econometric process model is able to simulate decision making both within and outside the range of observed data in a way that is consistent with economic theory and with site-specific bio-physical constraints and processes. An econometric-process model of the dryland grain production system of the Northern Plains demonstrates the capabilities of this type of model.
    Keywords: bio-physical models, integrated assessment, production models, dryland grain production, econometric-process models, Production Economics. JEL codes: C5, Q1, Q2.
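
    As a loose illustration of the discrete/continuous structure described above, the Python sketch below chooses a land use with a multinomial logit over expected returns and then sets a continuous input level from a log-linear demand equation; all coefficients, prices, and returns are placeholders, not estimates from the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        def land_use_choice(expected_returns, scale=50.0):
            # Discrete stage: multinomial logit over expected net returns per hectare.
            # The logit scale is an illustrative placeholder.
            utilities = np.asarray(expected_returns) / scale
            probs = np.exp(utilities - utilities.max())
            return rng.choice(len(probs), p=probs / probs.sum())

        def input_demand(output_price, input_price, beta0=20.0, beta1=0.8, beta2=-1.2):
            # Continuous stage: a log-linear input-demand equation standing in for the
            # econometrically estimated response; coefficients are placeholders.
            return np.exp(np.log(beta0) + beta1 * np.log(output_price) + beta2 * np.log(input_price))

        # One simulated season for one field: pick wheat / barley / fallow, then fertilizer.
        returns = [180.0, 150.0, 40.0]          # expected net returns per hectare, illustrative
        crop = land_use_choice(returns)
        fertilizer = input_demand(output_price=5.2, input_price=0.6) if crop != 2 else 0.0
        print(f"land use index: {crop}, fertilizer applied: {fertilizer:.1f} kg/ha")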