
    Linking objective and subjective modeling in engineering design through arc-elastic dominance

    Engineering design in mechanics is a complex activity that combines objective modeling processes derived from physical analysis with designers' subjective reasoning. This paper introduces arc-elastic dominance as a concept for ranking design solutions according to a combination of objective and subjective models. Objective models aggregate information derived from physics, economics or eco-environmental analysis into a performance indicator. Subjective models yield a confidence indicator for the solutions' feasibility. Arc-elastic dominant design solutions achieve an optimal compromise between gain in performance and degradation in confidence. Owing to the definition of arc-elasticity, this compromise value is expressive and easy for designers to interpret despite the difference in nature between the objective and subjective models. Based on an investigation of the mathematical properties of arc-elasticity, a filtering algorithm for Pareto-efficient solutions is proposed and illustrated through a design knowledge modeling framework, which notably incorporates Harrington's desirability functions and Derringer's aggregation method. The approach is demonstrated through the re-design of a geothermal air-conditioning system.
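
    The ranking idea can be illustrated with a small sketch: arc-elasticity is the ratio of the relative gain in the performance indicator to the relative degradation in the confidence indicator between two Pareto-efficient solutions, evaluated at the midpoint of each pair. The code below is a minimal illustration of that midpoint formula on made-up solutions; the dictionary layout, the example data and the threshold reading in the comments are assumptions, not the paper's exact algorithm.

```python
# Minimal sketch of arc-elastic comparison of Pareto-efficient solutions.
# The data, dictionary layout and threshold reading are assumptions,
# not the paper's exact algorithm.

def pareto_front(solutions):
    """Keep solutions that no other solution beats on both indicators."""
    front = [s for s in solutions
             if not any(o["performance"] >= s["performance"]
                        and o["confidence"] >= s["confidence"]
                        and o is not s
                        for o in solutions)]
    # Order from most confident (safest) to most performant (riskiest).
    return sorted(front, key=lambda s: s["confidence"], reverse=True)

def arc_elasticity(a, b):
    """Midpoint (arc) elasticity of performance with respect to confidence."""
    dp = b["performance"] - a["performance"]
    dc = b["confidence"] - a["confidence"]
    mid_p = (a["performance"] + b["performance"]) / 2
    mid_c = (a["confidence"] + b["confidence"]) / 2
    return (dp / mid_p) / (dc / mid_c)

solutions = [
    {"name": "s1", "performance": 0.55, "confidence": 0.95},
    {"name": "s2", "performance": 0.70, "confidence": 0.80},
    {"name": "s3", "performance": 0.78, "confidence": 0.50},
]

front = pareto_front(solutions)
for a, b in zip(front, front[1:]):
    e = arc_elasticity(a, b)
    # |e| > 1: the relative gain in performance outweighs the relative
    # degradation in confidence when moving from a to b.
    print(f"{a['name']} -> {b['name']}: arc elasticity = {e:.2f}")
```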

    The application of multi-objective robust design methods in ship design

    When designing large complex vessels, the evaluation of a particular design can be both complicated and time consuming. Designers often resort to concept design models, enabling a reduction in both complexity and evaluation time. Various optimisation methods are then typically used to explore the design space, facilitating the selection of optimum or near-optimum designs. It is now possible to incorporate considerations of seakeeping, stability and cost at the earliest stage of the ship design process. However, to ensure that reliable results are obtained, the models used are generally complex and computationally expensive. Methods have been developed which avoid the need for an exhaustive search of the complete design space. One such method is described here, concerned with the application of the theory of Design Of Experiments (DOE) to explore the design space efficiently. The objective of the DOE stage is to produce response surfaces which can then be used by an optimisation module to search the design space. It is assumed that the concept exploration tool, whilst being a simplification of the design problem, is still sufficiently detailed to enable reliable evaluations of a particular design concept. The response surface is used as a representation of the concept exploration tool and, by its nature, can be used to rapidly evaluate a design concept, hence reducing concept exploration time. While the methodology has wide applicability in ship design and production, it is illustrated by its application to the design of a catamaran with respect to seakeeping. The paper presents results exploring the design space for the catamaran. A concept is selected which is robust with respect to the Relative Bow Motion (RBM), heave, pitch and roll at any particular waveheading. The design space is defined by six controllable design parameters (hull length, breadth to draught ratio, distance between demihull centres, coefficient of waterplane, longitudinal centre of flotation and longitudinal centre of buoyancy) and by one noise parameter, the waveheading. A Pareto-optimal set of solutions is obtained using RBM, heave, pitch and roll as criteria. The designer can then select from this set the design which most closely satisfies their requirements. Typical solutions are shown to yield average reductions of over 25% in the objective functions when compared with earlier results obtained using conventional optimisation methods.
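
    The DOE-plus-response-surface workflow can be sketched in a few lines: sample the design space, evaluate each sample with the (expensive) concept-exploration tool, fit a cheap quadratic surrogate, and search the surrogate instead of the tool. The sketch below assumes a hypothetical stand-in function evaluate_rbm for the seakeeping evaluation and uses only two of the six design parameters; it illustrates the general technique, not the paper's specific models.

```python
# Minimal sketch of the DOE -> response surface -> search idea.
# `evaluate_rbm` is a hypothetical stand-in for the expensive
# concept-exploration tool; only two design parameters are used.
import numpy as np
from itertools import product

def evaluate_rbm(length, spacing):
    # Placeholder for the expensive seakeeping evaluation.
    return (length - 70.0) ** 2 / 500.0 + (spacing - 12.0) ** 2 / 20.0

# 1. DOE: a full-factorial sample over the (reduced) design space.
lengths = np.linspace(50, 90, 5)
spacings = np.linspace(8, 16, 5)
X = np.array(list(product(lengths, spacings)))
y = np.array([evaluate_rbm(l, s) for l, s in X])

# 2. Fit a quadratic response surface by least squares.
def features(x):
    l, s = x[:, 0], x[:, 1]
    return np.column_stack([np.ones_like(l), l, s, l * s, l ** 2, s ** 2])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# 3. Search the cheap surrogate instead of the expensive model.
grid = np.array(list(product(np.linspace(50, 90, 200), np.linspace(8, 16, 200))))
pred = features(grid) @ coef
best = grid[np.argmin(pred)]
print("surrogate optimum (hull length, demihull spacing):", best)
```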

    Using numerical plant models and phenotypic correlation space to design achievable ideotypes

    Numerical plant models can predict the effect of trait modifications arising from genetic variation on plant performance by simulating physiological processes and their interaction with the environment. Optimization methods complement those models in designing ideotypes, i.e. ideal values of a set of plant traits resulting in optimal adaptation for given combinations of environment and management, mainly through the maximization of a performance criterion (e.g. yield, light interception). As the use of simulation models gains momentum in plant breeding, numerical experiments must be carefully engineered to provide accurate and attainable results, rooting them in biological reality. Here, we propose a multi-objective optimization formulation that includes a metric of performance, returned by the numerical model, and a metric of feasibility, accounting for correlations between traits based on field observations. We applied this approach to two contrasting models: a process-based crop model of sunflower and a functional-structural plant model of apple trees. In both cases, the method successfully characterized key plant traits and identified a continuum of optimal solutions, ranging from the most feasible to the most efficient. The present study thus provides a successful proof of concept for this enhanced modeling approach, which identified paths for desirable trait modification, including direction and intensity. (25 pages, 5 figures; Plant, Cell and Environment, 2017)
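
    A minimal sketch of the two-criterion formulation is given below: one objective is the performance returned by a (here, stand-in) plant model, the other is a feasibility score that rewards trait combinations consistent with observed trait correlations, measured here with a squared Mahalanobis distance; the Pareto front over both criteria is then extracted. The trait statistics and the simulate_yield function are illustrative assumptions, not values from the study.

```python
# Sketch of the performance/feasibility bi-objective formulation.
# `simulate_yield`, the trait statistics and bounds are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Observed trait statistics (illustrative): mean and covariance of two traits.
mu = np.array([1.0, 0.5])
cov = np.array([[0.04, 0.015], [0.015, 0.02]])
cov_inv = np.linalg.inv(cov)

def simulate_yield(traits):
    # Stand-in for the process-based crop model.
    return traits[0] * 0.8 + traits[1] * 1.2 - 0.3 * traits[0] * traits[1]

def feasibility(traits):
    # Higher when the trait combination stays close to observed correlations
    # (negative squared Mahalanobis distance from the observed distribution).
    d = traits - mu
    return -float(d @ cov_inv @ d)

candidates = rng.uniform([0.5, 0.2], [1.8, 1.0], size=(500, 2))
scores = np.array([[simulate_yield(t), feasibility(t)] for t in candidates])

def non_dominated(points):
    # Keep indices of points not dominated on (performance, feasibility).
    keep = []
    for i, p in enumerate(points):
        dominated = np.any(np.all(points >= p, axis=1) & np.any(points > p, axis=1))
        if not dominated:
            keep.append(i)
    return keep

front = non_dominated(scores)
print(f"{len(front)} Pareto-optimal ideotypes out of {len(candidates)} candidates")
```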

    A robust design methodology suitable for application to one-off products

    Robust design is an activity of fundamental importance when designing large, complex, one-off engineering products. Work is described concerning the application of the theory of design of experiments and stochastic optimization methods to explore and optimize the design space at the concept design stage. The discussion begins with a description of state-of-the-art stochastic techniques and their application to robust design. The content then focuses on a generic methodology capable of manipulating design algorithms that describe a design concept. An example is presented demonstrating the use of the system for the robust design of a catamaran with respect to seakeeping.
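
    The core of such a robust-design loop, evaluating a design algorithm across samples of an uncontrollable noise factor and penalising both poor mean response and sensitivity to that factor, can be sketched as follows. The concept_response function and the mean-plus-two-standard-deviations score are illustrative assumptions, not the methodology's actual formulation.

```python
# Illustrative robust-design sketch over a noise parameter (waveheading).
# `concept_response` and the robustness score are assumptions.
import numpy as np

def concept_response(design, waveheading):
    # Placeholder design algorithm: response depends on the design variable
    # and on the uncontrollable waveheading (noise parameter).
    return (design - 3.0) ** 2 + 0.5 * np.cos(waveheading) * design

def robust_score(design, headings):
    # Penalise both poor mean performance and sensitivity to the noise factor.
    responses = np.array([concept_response(design, h) for h in headings])
    return responses.mean() + 2.0 * responses.std()

headings = np.linspace(0.0, np.pi, 13)   # sampled waveheadings
designs = np.linspace(1.0, 5.0, 41)      # candidate design variable values
best = min(designs, key=lambda d: robust_score(d, headings))
print("most robust design variable:", round(best, 2))
```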

    The Use Of Cultural Algorithms To Learn The Impact Of Climate On Local Fishing Behavior In Cerro Azul, Peru

    Recently it has been found that the earth's oceans are warming at a pace 40% faster than predicted by a United Nations panel a few years ago. As a result, 2019 became the warmest year on record for the earth's oceans. That is because the oceans have acted as a buffer, absorbing 93% of the heat produced by greenhouse gases [40]. The impact of this oceanic warming has already been felt in the periodic warming of the Pacific Ocean through the ENSO process, a cycle of warming and subsequent cooling of the Pacific Ocean that can last for years. The cycle was first documented by Peruvian fishermen in the early 1600s, so it has long been part of the environmental challenges faced by economic agents throughout the world. It has even been suggested that the cycle has increased in frequency over the years, perhaps in response to global warming more broadly. Although the onset of an ENSO cycle might be viewed as a disruption of the fishing economy in a given area, there is some possibility that over time agents have developed strategic responses to these changes so as to reduce the associated economic risk. At the time of our study, Cerro Azul, Peru was emerging from one of the largest ENSOs on record, which was seen as a great opportunity to observe how the collective bodies of fishermen altered their fishing strategies to deal with these more uncertain times. Our results suggest that the collective economic response of the fishermen does demonstrate an ability to respond to the unpredictability of climate change, but at a cost. It is clear that the fishermen have accumulated the collective knowledge needed to produce a coordinated response observable at a higher level. Of course, this knowledge can be used to coordinate activities only if it is communicated socially within the society. Although our data do not provide explicit information about such communication, there is some indirect evidence that the adjustments in strategy are brought about by an increased exchange of experiences among the fishermen.

    Calibrating and Evaluating Dynamic Rule-Based Transit-Signal-Priority Control Systems in Urban Traffic Networks

    Setting traffic controller parameters to perform effectively in real time is a challenging task, as it entails tuning several parameters to best suit predicted traffic conditions. This study presents a framework and method that apply the Response Surface Methodology (RSM) to calibrate the parameters of any control system incorporating advanced traffic management strategies (e.g., the complex integrated traffic control system developed by Ahmed and Hawas). The integrated system is a rule-based heuristic controller that reacts to specific triggering conditions, such as identification of a priority transit vehicle, downstream signal congestion, or incidents, by penalizing the predefined objective function with a set of parameters corresponding to these conditions. The integrated system provides real-time control of actuated signalized intersections with different phase arrangements (split, protected and dual). The premise of the RSM is its ability to handle single or multiple objective functions, some of which may contradict each other. For instance, maximizing transit trips in a typical transit priority system may affect the overall network travel time. The challenging task is to satisfy the requirements of transit and non-transit vehicles simultaneously. The RSM calibrates the parameters of the integrated system by selecting the values that produce optimal measures of effectiveness. The control system was calibrated using extensive simulation-based analyses under high and very high traffic demand scenarios for the split, protected, and dual control types. A simulation-based approach was used, employing the popular TSIS software with code scripts representing the logic of the integrated control system. The simulation environment was used to generate the data needed to carry out the RSM analysis and calibrate the models. The RSM was used to identify the optimal parameter settings for each control type and traffic demand level. It was also used to determine the parameters most influential on the objective function(s) and to develop models of the effects of the significant parameters, and their interactions, on the overall network performance measures. The RSM uses the so-called composite desirability value, together with the simultaneous multi-objective desirabilities (e.g., the desirability of maximizing transit vehicle throughput and minimizing average vehicular travel time) estimated from the responses, to identify the best parameters. This study also demonstrates how to develop "mathematical" models for rough estimation of the performance measures vis-à-vis the various parameter values, including how to validate the optimal settings. The calibrated models proved to be statistically significant. The optimal parameters for each control type and demand level were also checked for robustness, and for whether a universal set of relative parameter values can be used for each control type. For the high traffic demand level, the optimal set of parameters is more robust than that of the very high traffic demand level. In addition, the dual actuated controller's optimal setting under the very high traffic demand scenario is more robust than those of the other control types and shows the best performance.
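
    The composite-desirability step can be illustrated with a short sketch: each response (transit throughput to maximise, travel time to minimise) is mapped to a Derringer-style desirability in [0, 1], and candidate parameter settings are ranked by the geometric mean of the individual desirabilities. The response values, bounds, and setting names below are illustrative, not results from the study.

```python
# Sketch of ranking controller parameter settings by composite desirability.
# Responses, bounds and setting names are illustrative placeholders.
import math

def desirability_max(y, lo, hi):
    # Derringer-style desirability for a response to be maximised.
    return min(max((y - lo) / (hi - lo), 0.0), 1.0)

def desirability_min(y, lo, hi):
    # Desirability for a response to be minimised.
    return min(max((hi - y) / (hi - lo), 0.0), 1.0)

# Simulated responses for three candidate parameter settings:
# (transit throughput [veh/h], average travel time [min]).
candidates = {
    "setting_A": (410.0, 14.2),
    "setting_B": (465.0, 15.6),
    "setting_C": (430.0, 13.8),
}

def composite(throughput, travel_time):
    d1 = desirability_max(throughput, lo=350.0, hi=500.0)
    d2 = desirability_min(travel_time, lo=12.0, hi=18.0)
    return math.sqrt(d1 * d2)   # geometric mean of the two desirabilities

for name, resp in candidates.items():
    print(name, round(composite(*resp), 3))
best = max(candidates, key=lambda k: composite(*candidates[k]))
print("best parameter setting:", best)
```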

    A genetic algorithm for optimal assembly of pairwise forced-choice questionnaires

    The use of multidimensional forced-choice questionnaires has been proposed as a means of improving validity in the assessment of non-cognitive attributes in high-stakes scenarios. However, the reduced precision of trait estimates in this questionnaire format is an important drawback. Accordingly, this article presents an optimization procedure for assembling pairwise forced-choice questionnaires that maximizes posterior marginal reliabilities. The procedure adapts a known genetic algorithm (GA) for combinatorial problems. In a simulation study, the efficiency of the proposed procedure was compared with a quasi-brute-force (BF) search. For this purpose, five-dimensional item pools were simulated to emulate the real problem of generating a forced-choice personality questionnaire under the five-factor model. Three factors were manipulated: (1) the length of the questionnaire, (2) the size of the item pool relative to the questionnaire's length, and (3) the true correlations between traits. The recovery of the person parameters for each assembled questionnaire was evaluated through the squared correlation between estimated and true parameters, the root mean square error between estimated and true parameters, the average difference between estimated and true inter-trait correlations, and the average standard error at each trait level. The proposed GA offered more accurate trait estimates than the BF search within a reasonable computation time in every simulation condition. These improvements were especially pronounced when measuring correlated traits and when the relative item pool sizes were larger. A user-friendly implementation of the algorithm has been made available online.
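
    A minimal genetic-algorithm sketch for this kind of combinatorial assembly problem is shown below: candidate questionnaires are sets of non-overlapping item pairs, and selection, crossover and mutation operate on those pairings. The fitness function here is only a placeholder that rewards mixed-trait pairs; it stands in for, and is much simpler than, the posterior marginal reliability criterion used in the article.

```python
# Minimal GA sketch for assembling pairwise forced-choice blocks. The item
# pool, trait assignment and fitness proxy are illustrative placeholders.
import random

random.seed(1)

N_ITEMS = 40                                   # size of the simulated item pool
N_BLOCKS = 10                                  # number of forced-choice pairs
ITEM_TRAIT = [i % 5 for i in range(N_ITEMS)]   # trait measured by each item

def random_questionnaire():
    items = random.sample(range(N_ITEMS), 2 * N_BLOCKS)
    return [tuple(items[i:i + 2]) for i in range(0, 2 * N_BLOCKS, 2)]

def fitness(questionnaire):
    # Placeholder proxy: reward blocks that pair items from different traits.
    return sum(ITEM_TRAIT[a] != ITEM_TRAIT[b] for a, b in questionnaire)

def crossover(parent_a, parent_b):
    # Keep a prefix of one parent, fill with non-overlapping pairs of the other.
    cut = random.randint(1, N_BLOCKS - 1)
    child = parent_a[:cut]
    used = {i for pair in child for i in pair}
    for pair in parent_b:
        if len(child) == N_BLOCKS:
            break
        if used.isdisjoint(pair):
            child.append(pair)
            used.update(pair)
    while len(child) < N_BLOCKS:               # top up with fresh random pairs
        a, b = random.sample([i for i in range(N_ITEMS) if i not in used], 2)
        child.append((a, b))
        used.update((a, b))
    return child

def mutate(questionnaire, rate=0.1):
    # Occasionally replace one block with a pair of currently unused items.
    if random.random() < rate:
        used = {i for pair in questionnaire for i in pair}
        free = [i for i in range(N_ITEMS) if i not in used]
        questionnaire[random.randrange(N_BLOCKS)] = tuple(random.sample(free, 2))
    return questionnaire

population = [random_questionnaire() for _ in range(30)]
for _ in range(50):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    children = [mutate(crossover(*random.sample(parents, 2))) for _ in range(20)]
    population = parents + children

print("best placeholder fitness:", fitness(max(population, key=fitness)))
```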

    Evaluation of production control strategies for the co-ordination of work-authorisations and inventory management in lean supply chains

    A decision support framework is proposed for assisting managers and executives in using lean production control strategies to coordinate work authorisations and inventory management in supply chains. The framework allows decision makers to evaluate and compare the suitability of various strategies for their system, especially when considering conflicting objectives, such as maximising customer service levels while minimising Work in Process (WIP) in a business environment distressed by variabilities and uncertainties in demand stemming from customer power. The framework also provides decision guidance in selecting and testing optimal settings of the selected policies' control parameters. The framework is demonstrated by application to a four-node serial supply chain operating under three different pull-based supply chain strategies, namely CONWIP, Kanban, and Hybrid Kanban-CONWIP, and exhibiting low, medium, and high variability in customer demand (i.e., coefficients of variation of 25%, 112.5%, and 200%). The framework consists of three phases, namely Modelling, Optimisation and Decision Support, and is applicable to both Simulation-Based and Metamodel-Based Optimisation. The Modelling phase includes conceptual modelling, discrete event simulation modelling and metamodel development. The Optimisation phase applies multi-criteria optimisation methods to generate WIP-Service Level trade-off curves. Curvature and risk analysis of the trade-off curves are used in the Decision Support phase to guide the decision maker in selecting and testing the best settings for the system's control parameters. The inflection point of the curvature function indicates the point at which further increases in Service Level are only achievable by incurring an unacceptably higher cost in terms of average WIP. Risk analysis quantifies the risk associated with designing a supply chain system under specific environmental parameters. This research contributes an efficient framework applicable to real supply chain problems, provides a better understanding of the potential impacts and expected effectiveness of different pull control mechanisms, and offers production and supply chain managers valuable insights into future research opportunities in this field.
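
    The curvature-based reading of the trade-off curve can be sketched with a knee-point heuristic: normalise both axes, estimate first and second derivatives numerically, and locate where curvature peaks, i.e. where each extra unit of Service Level starts to cost disproportionately more WIP. The synthetic trade-off data and the maximum-curvature rule below are illustrative and are not the thesis's exact curvature or risk analysis.

```python
# Knee-point sketch for a WIP vs. service-level trade-off curve.
# The trade-off data are synthetic; the maximum-curvature rule is a
# common heuristic related to, but not identical to, the thesis's analysis.
import numpy as np

# Synthetic trade-off: average WIP rises sharply as service level nears 100%.
service_level = np.linspace(0.80, 0.995, 40)
wip = 5.0 + 2.0 / (1.0 - service_level)

# Normalise both axes so curvature is not dominated by the WIP scale.
x = (service_level - service_level.min()) / np.ptp(service_level)
y = (wip - wip.min()) / np.ptp(wip)

# Numerical first and second derivatives, then curvature at each point.
d1 = np.gradient(y, x)
d2 = np.gradient(d1, x)
curvature = np.abs(d2) / (1.0 + d1 ** 2) ** 1.5

knee = service_level[np.argmax(curvature)]
print(f"beyond a service level of ~{knee:.3f}, additional service is bought "
      f"with disproportionately more WIP")
```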