    INCORPORATING SAFETY-FIRST CONSTRAINTS IN LINEAR PROGRAMMING PRODUCTION MODELS

    A recent survey indicated that many producers view risk in a safety-first context. Traditional methods used to impose safety-first constraints in optimization models have often been difficult to implement. This is particularly true when endogenous decisions affect the distribution of the chance-constrained random variable. This paper presents a method whereby probabilistic constraints can be easily imposed upon finitely discrete random variables. The procedure uses a linear version of the lower partial moment stochastic inequality. The resulting solutions are somewhat conservative but are less so than results obtained using the previously published mean income-absolute deviation stochastic inequality. (Production Economics; Research Methods/Statistical Methods)
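    For illustration, the kind of linear chance constraint this describes can be written as follows; the notation (states s with probabilities p_s, activity levels x_j, state returns c_sj, safety target t, reference level c > t, risk tolerance α) is ours, not the paper's. A Markov-type bound on the first lower partial moment about c gives Pr(y ≤ t) ≤ E[max(c − y, 0)]/(c − t), and with finitely many discrete states this bound is linear in the activities, so it can be added directly to an LP:

```latex
% Hypothetical notation (not the paper's): state-s income y_s = \sum_j c_{sj} x_j with
% probability p_s, safety target t, reference level c > t, allowed failure probability \alpha.
% Markov-type lower-partial-moment bound: \Pr(y \le t) \le E[\max(c - y, 0)] / (c - t).
\begin{align}
  d_s &\ge c - \sum_j c_{sj} x_j, \qquad d_s \ge 0 \quad \text{for each state } s, \\
  \sum_s p_s d_s &\le \alpha (c - t)
  \quad\Longrightarrow\quad
  \Pr\Big(\textstyle\sum_j c_{sj} x_j \le t\Big) \le \alpha .
\end{align}
```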

    Computational Methods in Modeling Fusion Plasmas

    Fusion provides an attractive potential alternative to fossil fuels for energy. Fusion requires far less fuel than current non-renewable energy processes (virtually a 100% reduction in the mass of fuel required), the fuel sources needed (mainly deuterium and lithium) are highly abundant on Earth, and fusion generates minimal waste products. One of the biggest obstacles to practical fusion energy is containing the reactants long enough for the energy output to significantly exceed the energy input. The equations governing plasma dynamics and confinement are highly nonlinear and do not admit simple analytic solutions in realistic situations. To obtain predictions for various plasma confinement scenarios, it is often necessary to turn to other means, such as computational modeling, to simulate the relevant plasma dynamics. Evaluating the effectiveness and reliability of the computational methods used for simulation then becomes extremely important, especially when the code is subsequently used to predict new physics for the scientific community. In this work, we present an effort to analyze the effectiveness of one of the computational techniques used in the NIMROD code, which Eric Held (USU) and others in the scientific community have helped to develop. This method involves solving the Grad-Shafranov equation, which governs the plasma equilibria that can exist in tokamaks. Here we evaluate the effectiveness of the method and discuss the potential implications of this analysis.
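    For reference, the Grad-Shafranov equation in its standard axisymmetric form (cylindrical coordinates R, Z; poloidal flux ψ; pressure profile p(ψ); poloidal current function F(ψ) = R·B_φ) is shown below. This is the textbook form, not necessarily the exact formulation or normalization used in NIMROD:

```latex
% Standard axisymmetric Grad-Shafranov equation for the poloidal flux \psi(R, Z):
\Delta^{*}\psi \equiv
  R \frac{\partial}{\partial R}\!\left( \frac{1}{R} \frac{\partial \psi}{\partial R} \right)
  + \frac{\partial^{2} \psi}{\partial Z^{2}}
  = -\mu_{0} R^{2} \frac{dp}{d\psi} - F \frac{dF}{d\psi},
  \qquad F(\psi) = R B_{\phi}.
```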

    PERFORMANCE OF RISK-INCOME MODELS OUTSIDE THE ORIGINAL DATA SET

    Selected risk programming solutions (i.e., profit maximization, Target-MOTAD, and MOTAD) are tested in an economic environment outside the data set from which they were developed. Specifically, solutions are derived from either a longer 10-year (1965-74) or a shorter 6-year (1969-74) estimation period and are then tested for consistent risk-income characteristics over a later 10-year period (1975-84). Risk solutions estimated from the earlier periods perform well in the later test period in spite of different economic conditions between time periods. However, the favorable performance may be related to the specific example used in this analysis, and further testing for other farm situations is needed before general conclusions can be reached. (Risk and Uncertainty)
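    As background, a minimal sketch of the Target-MOTAD formulation being compared here, in generic notation that is ours rather than the paper's: choose activity levels x_j to maximize expected income subject to resource limits and to an upper bound λ on the expected shortfall of income below a target T, with one deviation variable d_s per observed state (year):

```latex
% Generic Target-MOTAD linear program (notation assumed for illustration):
\begin{align}
  \max_{x,\,d}\ & \sum_j \bar{c}_j x_j && \text{(expected income)} \\
  \text{s.t.}\  & \sum_j a_{kj} x_j \le b_k \quad \forall k && \text{(resource constraints)} \\
                & d_s \ge T - \sum_j c_{sj} x_j, \quad d_s \ge 0 \quad \forall s && \text{(income shortfall in state } s\text{)} \\
                & \sum_s p_s d_s \le \lambda && \text{(expected shortfall limit)}
\end{align}
```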

    Closed-Shell Hartree-Fock: an Efficient Implementation Based on the Contraction of Integrals in the Primitive Basis

    In this work a highly efficient, closed-shell restricted Hartree-Fock self-consistent-field (SCF) program has been developed. It has been written as a part of the "Quantum Objects Library" (QOL), which is developed at the Institute for Theoretical Chemistry at the University of Cologne. In the implementation presented here, the explicit transformation of the two-electron integrals from the primitive to the radially contracted, angularly transformed basis is avoided. The density matrix is obtained by diagonalizing the Fock matrix in the radial-angular-transformed basis. It is then transformed to the primitive basis and contracted with the integrals to give the Fock matrix, also in the primitive basis. To ensure that solutions of the Fock equations are obtained in the transformed basis, the Fock matrix is transformed back to the radially contracted, angularly transformed basis after the density contraction step. Both one- and two-electron integrals are calculated using the Obara-Saika (OS) ansatz. For the evaluation of the two-electron integrals a code-generating ansatz is used: at compile time, optimized code is generated to be called at runtime. Numerical instabilities inherent in the Obara-Saika scheme were analyzed and eliminated. The code-generating ansatz was also used for the implementation of the density contraction, and contraction codes for both the primitive and the radial-angular-transformed bases were developed. Both the two-electron integral evaluation and the density contraction implementations have been optimized through explicit use of streaming single-instruction-multiple-data (SIMD) extensions (SSE). Integral prescreening has been implemented on the basis of the Schwarz inequality, and the differential density scheme is used. Convergence acceleration has been realized by implementing the direct inversion in the iterative subspace (DIIS) method. The SCF program has been parallelized using the OpenMP application programming interface (API). In final performance comparisons with commercially available programs, competitive results were obtained.
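    The basis bookkeeping described above can be illustrated with a schematic NumPy sketch; the array names, dense storage, and plain einsum contractions are assumptions for illustration only, whereas the actual QOL implementation uses generated, SSE-vectorized contraction kernels with Schwarz screening and the differential density scheme:

```python
import numpy as np

def fock_in_contracted_basis(h_prim, eri_prim, D_contr, T):
    """Illustrative closed-shell Fock build in the primitive basis.

    h_prim   : (P, P)        one-electron integrals, primitive basis (assumed given)
    eri_prim : (P, P, P, P)  two-electron integrals (pq|rs), primitive basis
    D_contr  : (C, C)        density matrix in the contracted/transformed basis
    T        : (P, C)        primitive -> contracted transformation matrix
    """
    # Back-transform the density to the primitive basis: D_prim = T D_contr T^T
    D_prim = T @ D_contr @ T.T

    # Coulomb and exchange contractions directly with the primitive integrals
    J = np.einsum('pqrs,rs->pq', eri_prim, D_prim)
    K = np.einsum('prqs,rs->pq', eri_prim, D_prim)

    # Closed-shell Fock matrix in the primitive basis, then project back
    F_prim = h_prim + 2.0 * J - K
    F_contr = T.T @ F_prim @ T
    return F_contr
```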

    Simulation of the Recent Multidecadal Increase of Atlantic Hurricane Activity Using an 18-km-Grid Regional Model

    In this study, a new modeling framework for simulating Atlantic hurricane activity is introduced. The model is an 18-km-grid nonhydrostatic regional model, run over observed, specified SSTs and nudged toward observed time-varying large-scale atmospheric conditions (Atlantic-domain wavenumbers 0-2) derived from the National Centers for Environmental Prediction (NCEP) reanalyses. Using this perfect large-scale model approach for 27 recent August-October seasons (1980-2006), it is found that the model successfully reproduces the observed multidecadal increase in the number of Atlantic hurricanes and in several other tropical cyclone (TC) indices over this period. The correlation of simulated versus observed hurricane activity by year varies from 0.87 for basin-wide hurricane counts to 0.41 for U.S. landfalling hurricanes. For tropical storm counts, accumulated cyclone energy, and TC power dissipation indices the correlation is ~0.75; for major hurricanes the correlation is 0.69; and for U.S. landfalling tropical storms the correlation is 0.57. The model occasionally simulates hurricane intensities of up to category 4 (~942 mb) in terms of central pressure, although the surface winds (< 47 m s-1) do not exceed category-2 intensity. On interannual time scales, the model reproduces the observed ENSO-Atlantic hurricane covariation reasonably well. Some notable aspects of the highly contrasting 2005 and 2006 seasons are well reproduced, although the simulated activity during the 2006 core season was excessive. The authors conclude that the model appears to be a useful tool for exploring mechanisms of hurricane variability in the Atlantic (e.g., shear versus potential intensity contributions). The model may be capable of making useful simulations/projections of pre-1980 or twentieth-century Atlantic hurricane activity. However, the reliability of these projections will depend on obtaining reliable large-scale atmospheric and SST conditions from sources external to the model.
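    A minimal sketch of the large-scale nudging idea (Newtonian relaxation toward the analysis, retaining only zonal wavenumbers 0-2 of the model-analysis difference), with invented names and a single 2-D field; the actual model applies its own relaxation time scale to specific prognostic variables over the Atlantic domain:

```python
import numpy as np

def nudge_large_scales(model_field, reanalysis_field, dt, tau, max_wavenumber=2):
    """Relax the model toward reanalysis, but only at zonal wavenumbers 0..max_wavenumber.

    model_field, reanalysis_field : 2-D arrays (lat, lon) on the same grid (assumed)
    dt  : model time step (s);  tau : nudging time scale (s)
    """
    # Difference between analysis and model, transformed along the zonal direction
    diff = reanalysis_field - model_field
    spec = np.fft.rfft(diff, axis=-1)

    # Zero out everything above the retained wavenumbers (keep 0, 1, ..., max_wavenumber)
    spec[..., max_wavenumber + 1:] = 0.0
    large_scale_diff = np.fft.irfft(spec, n=diff.shape[-1], axis=-1)

    # Newtonian relaxation toward the large-scale part of the analysis
    return model_field + (dt / tau) * large_scale_diff
```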

    Lateral coupling in baroclinically unstable flows

    Author Posting. © American Meteorological Society, 2008. This article is posted here by permission of the American Meteorological Society for personal use, not for redistribution. The definitive version was published in Journal of Physical Oceanography 38 (2008): 1267-1277, doi:10.1175/2007JPO3906.1. A two-layer quasigeostrophic model in a channel is used to study the influence of lateral displacements of regions of different-signed mean potential vorticity gradient (Πy) on the growth rate and structure of linearly unstable waves. The mean state is very idealized, with a region of positive Πy in the upper layer and a region of negative Πy in the lower layer; elsewhere Πy is zero. The growth rate and structure of the model's unstable waves are quite sensitive to the amount of overlap between the two regions. For large amounts of overlap (more than several internal deformation radii), the channel modes described by Phillips' model are recovered. The growth rate decreases abruptly as the amount of overlap decreases below the internal deformation radius. However, unstable modes are also found for cases in which the two nonzero Πy regions are widely separated. In these cases, the wavenumber of the unstable waves decreases such that the aspect ratio of the wave remains O(1). The waves are characterized by a large-scale barotropic component that has maximum amplitude near one boundary but extends all the way across the channel to the opposite boundary. Near the boundaries, the wave has a mixed barotropic-baroclinic structure with cross-front scales on the order of the internal deformation radius. The perturbation heat flux is concentrated near the nonzero Πy regions, but the perturbation momentum flux extends all the way across the channel. The perturbation fluxes act to reduce the isopycnal slopes near the channel boundaries and to transmit zonal momentum from the region of Πy > 0 to the region on the opposite side of the channel where Πy < 0. These nonzero perturbation momentum fluxes are found even for a mean state that has no lateral shear in the velocity field. This work was supported by NSF Grants OPP-0421904, OCE-0423975 (MAS), and OCE-85108600 (JP).
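    For orientation, in the standard Phillips two-layer channel model the mean potential vorticity gradients take the form below (textbook notation with layer depths H_1, H_2, reduced gravity g', and Coriolis parameter f_0; not necessarily the nondimensionalization used in the paper). The study's idealization localizes the positive upper-layer and negative lower-layer gradients in y and varies their lateral overlap:

```latex
% Phillips two-layer QG channel: mean PV gradients with layer depths H_1, H_2,
% reduced gravity g', Coriolis parameter f_0, and sheared zonal flow U_1, U_2.
\Pi_{1y} = \beta + \frac{f_0^{2}}{g' H_1}\,(U_1 - U_2),
\qquad
\Pi_{2y} = \beta - \frac{f_0^{2}}{g' H_2}\,(U_1 - U_2).
```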

    Novel method for high-throughput colony PCR screening in nanoliter-reactors

    We introduce a technology for the rapid identification and sequencing of conserved DNA elements employing a novel suspension array based on nanoliter (nl) reactors made from alginate. The reactors have a volume of 35 nl and serve as reaction compartments during monoseptic growth of microbial library clones, colony lysis, thermocycling, and screening for sequence motifs via semi-quantitative fluorescence analyses. The nl-reactors were kept in suspension during all high-throughput steps, which allowed the protocol to be performed in a highly space-effective fashion and at negligible expense for consumables and reagents. As a first application, 11 high-quality microsatellites for polymorphism studies in cassava were isolated and sequenced out of a library of 20,000 clones in 2 days. The technology is widely scalable, and we envision that throughputs for nl-reactor-based screenings can be increased to 100,000 and more samples per day, thereby efficiently complementing protocols based on established deep-sequencing technologies.

    The problem of globalization

    There is a general consensus that the contemporary world is best understood through the prism of globalization. This view is shared by most social scientists, politicians, journalists, businesses, and indeed broad sectors of the general population. Opinions differ as to whether globalization is a positive or a negative development, but there is general agreement that whatever is going on is either a symptom or a consequence of globalization. There are many good reasons for this view. The number of passengers on scheduled international flights rose by a factor of six between 1975 and 2000. The number of international tourist arrivals rose by three and a half times during the same period. Much less happily, the number of internationally displaced refugees rose by around 50 percent in the 20 years before 2000. During the quarter of a century after 1975, the duration of international voice telephone calls rose by around 25 times. We could go on multiplying such striking figures.

    Restriction landmark genomic scanning (RLGS) spot identification by second generation virtual RLGS in multiple genomes with multiple enzyme combinations.

    Background: Restriction landmark genomic scanning (RLGS) is one of the most successfully applied methods for the identification of aberrant CpG island hypermethylation in cancer, as well as the identification of tissue-specific methylation of CpG islands. However, a limitation to the utility of this method has been the ability to assign specific genomic sequences to RLGS spots, a process commonly referred to as "RLGS spot cloning." Results: We report the development of a virtual RLGS method (vRLGS) that allows for RLGS spot identification in any sequenced genome and with any enzyme combination. We report significant improvements in predicting DNA fragment migration patterns by incorporating sequence information into the migration models, and demonstrate a median Euclidean distance between actual and predicted spot migration of 0.18 centimeters for the most complex human RLGS pattern. We report the confirmed identification of 795 human and 530 mouse RLGS spots for the most commonly used enzyme combinations. We also developed a method to filter the virtual spots to reduce the number of extra spots seen on a virtual profile for both the mouse and human genomes. We demonstrate the use of this filter to simplify spot cloning and to assist in the identification of spots exhibiting tissue-specific methylation. Conclusion: The new vRLGS system reported here is highly robust for the identification of novel RLGS spots. The migration models developed are not specific to the genome being studied or the enzyme combination being used, making this tool broadly applicable. The identification of hundreds of mouse and human RLGS spot loci confirms the strong bias of RLGS studies to focus on CpG islands and provides a valuable resource to rapidly study their methylation.
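    A toy sketch of the spot-matching criterion implied above: predicted spot coordinates from a migration model are compared with observed coordinates, and a median Euclidean distance summarizes the fit. The function names and the simple log-length migration model are hypothetical; the paper's actual models additionally incorporate sequence information:

```python
import numpy as np

def predict_migration(fragment_lengths_1d, fragment_lengths_2d, coeffs_1d, coeffs_2d):
    """Toy migration model: distance migrated ~ linear in log10(fragment length).

    coeffs_* = (intercept, slope) fitted from known landmarks (assumed given).
    Returns an (N, 2) array of predicted (x, y) spot positions in cm.
    """
    a1, b1 = coeffs_1d
    a2, b2 = coeffs_2d
    x = a1 + b1 * np.log10(fragment_lengths_1d)   # first-dimension migration
    y = a2 + b2 * np.log10(fragment_lengths_2d)   # second-dimension migration
    return np.column_stack([x, y])

def median_spot_error(predicted_xy, observed_xy):
    """Median Euclidean distance (cm) between predicted and observed spot positions."""
    return float(np.median(np.linalg.norm(predicted_xy - observed_xy, axis=1)))
```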