561 research outputs found

    Research on optimization-based design

    Get PDF
    Research on optimization-based design is discussed. Illustrative examples are given for cases involving continuous optimization with discrete variables and optimization with tolerances. Topics covered include the approximation of computationally expensive and noisy functions; electromechanical actuator/control system design using decomposition; and the application of knowledge-based systems and optimization to the design of a valve anti-cavitation device.

    Optimization of coupled systems: A critical overview of approaches

    Get PDF
    A unified overview is given of problem formulation approaches for the optimization of multidisciplinary coupled systems. The overview includes six fundamental approaches upon which a large number of variations may be made. Consistent approach names and a compact approach notation are given. The approaches are formulated to apply to general nonhierarchic systems and are compared from both a computational and a managerial viewpoint. Opportunities for parallelism of both computation and manpower resources are discussed, and recommendations regarding the need for future research are advanced.

    An algorithm for solving the system-level problem in multilevel optimization

    Get PDF
    A multilevel optimization approach which is applicable to nonhierarchic coupled systems is presented. The approach includes a general treatment of design (or behavior) constraints and coupling constraints at the discipline level through the use of norms. Three different types of norms are examined: the max norm, the Kreisselmeier-Steinhauser (KS) norm, and the l_p norm. The max norm is recommended. The approach is demonstrated on a class of hub frame structures which simulate multidisciplinary systems. The max norm is shown to produce system-level constraint functions which are non-smooth. A cutting-plane algorithm is presented which adequately deals with the resulting corners in the constraint functions. The algorithm is tested on hub frames with an increasing number of members (which simulate disciplines), and the results are summarized.
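
    For reference, the three aggregation norms named above can be written as follows (the notation is assumed here, not quoted from the paper); each collapses the vector of discipline-level constraint values g_j into a single system-level function:

        \max_j g_j, \qquad
        KS_\rho(g) = \frac{1}{\rho}\,\ln\sum_j e^{\rho g_j}, \qquad
        \lVert g \rVert_p = \Big(\sum_j \lvert g_j \rvert^{p}\Big)^{1/p}

    The KS norm is a smooth, conservative approximation that approaches the max norm as \rho grows; the recommended max norm is exact but non-smooth, which is what gives rise to the corners handled by the cutting-plane algorithm.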

    Execution of Multidisciplinary Design Optimization Approaches on Common Test Problems

    Get PDF
    A class of synthetic problems for testing multidisciplinary design optimization (MDO) approaches is presented. These test problems are easy to reproduce because all functions are given as closed-form mathematical expressions. They are constructed in such a way that the optimal value of all variables and the objective is unity. The test problems involve three disciplines and allow the user to specify the number of design variables, state variables, coupling functions, design constraints, controlling design constraints, and the strength of coupling. Several MDO approaches were executed on two sample synthetic test problems. These approaches included single-level optimization approaches, collaborative optimization approaches, and concurrent subspace optimization approaches. Execution results are presented, and the robustness and efficiency of these approaches are evaluated for these sample problems.
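
    As a flavor of what such a closed-form synthetic problem can look like, below is a minimal sketch of a single-variable, two-"discipline" coupled problem constructed so that the optimal value of the variable, the coupling functions, and the objective is unity. The coupling functions, the constraint, and the solver call are illustrative assumptions, not the formulation used in the paper.

        import numpy as np
        from scipy.optimize import minimize

        # Illustrative coupled analysis (assumed form): y1 and y2 depend on the
        # design variable x and on each other; the problem is built so that
        # x = y1 = y2 = 1 at the optimum.
        def coupled_analysis(x, iters=50):
            y1, y2 = 1.0, 1.0
            for _ in range(iters):               # Gauss-Seidel fixed-point loop
                y1 = x**2 + 0.2 * (y2 - 1.0)     # "discipline 1" output
                y2 = np.sqrt(max(y1, 1e-12))     # "discipline 2" output
            return y1, y2

        def objective(xv):
            x = float(xv[0])
            _, y2 = coupled_analysis(x)
            return x**2 + y2 - 1.0               # equals 1.0 at the optimum

        def design_constraint(xv):
            y1, _ = coupled_analysis(float(xv[0]))
            return y1 - 1.0                      # require y1 >= 1, active at x = 1

        result = minimize(objective, x0=[2.0],
                          constraints=[{"type": "ineq", "fun": design_constraint}])
        print(result.x, result.fun)              # both approach 1.0

    This corresponds to a simple single-level approach with a nested coupled analysis; the collaborative and concurrent subspace approaches evaluated in the paper decompose problems of this kind differently.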

    Electron affinity of Li: A state-selective measurement

    Get PDF
    We have investigated the threshold of photodetachment of Li^- leading to the formation of the residual Li atom in the 2p ^2P state. The excited residual atom was selectively photoionized via an intermediate Rydberg state, and the resulting Li^+ ion was detected. A collinear laser-ion beam geometry enabled both high resolution and high sensitivity to be attained. We have demonstrated the potential of this state-selective photodetachment spectroscopic method by improving the accuracy of the Li electron affinity by an order of magnitude. From a fit to the Wigner law in the threshold region, we obtained a Li electron affinity of 0.618 049(20) eV. Comment: 5 pages, 6 figures, 22 references.
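
    For context, the Wigner threshold law referred to above gives the near-threshold photodetachment cross section as a power of the energy above threshold (standard form, not quoted from the paper; l is the angular momentum of the detached electron, with l = 0 for the s-wave channel that dominates at this threshold):

        \sigma(E) \propto (E - E_{\mathrm{th}})^{\,l + 1/2}

    Fitting this form to the measured Li^+ signal locates the Li(2p) detachment threshold E_th; the electron affinity then follows from E_th and the known Li 2s-2p excitation energy.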

    One-dimensional Model of a Gamma Klystron

    Full text link
    A new scheme for amplification of coherent gamma rays is proposed. The key elements are crystalline undulators: single crystals with periodically bent crystallographic planes exposed to a high-energy beam of charged particles undergoing channeling inside the crystals. The scheme consists of two such crystals separated by a vacuum gap. The beam passes through the crystals successively. The particles perform undulator motion inside the crystals, following the periodic shape of the crystallographic planes. Gamma rays passing through the crystals parallel to the beam are amplified due to interaction with the particles inside the crystals. The term 'gamma klystron' is proposed for the scheme because its operational principles are similar to those of the optical klystron. A simpler one-crystal scheme is considered as well for the sake of comparison. It is shown that gamma-ray amplification in the klystron scheme can be achieved at considerably lower particle densities than in the one-crystal scheme, provided that the gap between the crystals is sufficiently large. Comment: RevTeX4, 22 pages, 4 figures.

    A new approach for the limit to tree height using a liquid nanolayer model

    Full text link
    Liquids in contact with solids are subjected to intermolecular forces that induce density gradients at the walls. The van der Waals forces make the liquid heterogeneous, the stress tensor is no longer spherical as it is in a homogeneous bulk, and it becomes possible to obtain stable thin liquid films wetting vertical walls up to altitudes that incompressible fluid models do not predict. Application to xylem microtubes makes it possible to understand why the ascent of sap is possible in very tall trees such as sequoias or giant eucalyptus. Comment: the conclusion contains a complementary comment to the Continuum Mechanics and Thermodynamics paper. 21 pages, 4 figures. Continuum Mechanics and Thermodynamics 20, 5 (2008), to appear.
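
    For scale, a classical incompressible benchmark is capillary rise in a xylem-sized tube (Jurin's law); with representative values (an illustrative assumption: water with surface tension \gamma \approx 0.073 N/m, perfect wetting, conduit radius r \approx 20 \mu m) it gives

        h = \frac{2\gamma\cos\theta}{\rho g r} \approx \frac{2 \times 0.073}{10^3 \times 9.81 \times 2 \times 10^{-5}} \approx 0.7\ \text{m},

    far short of the more than 100 m reached by the tallest sequoias, which is the kind of gap the nanolayer model addresses.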

    Global Warming: Forecasts by Scientists versus Scientific Forecasts

    Get PDF
    In 2007, the Intergovernmental Panel on Climate Change’s Working Group One, a panel of experts established by the World Meteorological Organization and the United Nations Environment Programme, issued its Fourth Assessment Report. The Report included predictions of dramatic increases in average world temperatures over the next 92 years and serious harm resulting from the predicted temperature increases. Using forecasting principles as our guide, we asked: Are these forecasts a good basis for developing public policy? Our answer is “no”. To provide forecasts of climate change that are useful for policy-making, one would need to forecast (1) global temperature, (2) the effects of any temperature changes, and (3) the effects of feasible alternative policies. Proper forecasts of all three are necessary for rational policy making. The IPCC WG1 Report was regarded as providing the most credible long-term forecasts of global average temperatures by 31 of the 51 scientists and others involved in forecasting climate change who responded to our survey. We found no references in the 1056-page Report to the primary sources of information on forecasting methods, despite the fact that these are conveniently available in books, articles, and websites. We audited the forecasting processes described in Chapter 8 of the IPCC’s WG1 Report to assess the extent to which they complied with forecasting principles. We found enough information to make judgments on 89 out of a total of 140 forecasting principles. The forecasting procedures that were described violated 72 principles. Many of the violations were, by themselves, critical. The forecasts in the Report were not the outcome of scientific procedures. In effect, they were the opinions of scientists transformed by mathematics and obscured by complex writing. Research on forecasting has shown that experts’ predictions are not useful in situations involving uncertainty and complexity. We have been unable to identify any scientific forecasts of global warming. Claims that the Earth will get warmer have no more credence than saying that it will get colder.

    Measurement of gauge blocks by interferometry

    Get PDF
    The key comparison EURAMET.L-K1.2011 on gauge blocks was carried out in the framework of a EURAMET project starting in 2012 and ending in 2015. It involved the participation of 24 National Metrology Institutes from Europe and Egypt. Thirty-eight gauge blocks of steel and ceramic with nominal central lengths between 0.5 mm and 500 mm were circulated. The comparison was conducted in two loops with two sets of artifacts, and a statistical technique for linking the reference values was applied. As a consequence, the reference value of one loop is influenced by the measurements of the other loop, even though most participants never measured the other loop's artifacts; this influence comes solely from the three "linking laboratories" which measured both sets of artifacts. In total, 44 results were not fully consistent with the reference values. This represents about 10% of the full set of 420 results, which is a considerably high number. At least 12 of these results are clear outliers, and the participants concerned were informed by the pilot as soon as possible. The comparison results help to support the calibration and measurement capabilities (CMCs) of the laboratories involved in the CIPM MRA.
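
    For reference, consistency with the reference values in such key comparisons is commonly judged with a normalized-error criterion of the form below, where x_lab is a laboratory's result, x_ref the reference value, and U the expanded (k = 2) uncertainty of their difference; the exact criterion applied here is defined in the comparison report:

        E_n = \frac{x_{\mathrm{lab}} - x_{\mathrm{ref}}}{U(x_{\mathrm{lab}} - x_{\mathrm{ref}})}, \qquad \lvert E_n \rvert \le 1 \ \text{(consistent)}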