12,707 research outputs found
Analysis-of-marginal-Tail-Means (ATM): a robust method for discrete black-box optimization
We present a new method, called Analysis-of-marginal-Tail-Means (ATM), for
effective robust optimization of discrete black-box problems. ATM has important
applications to many real-world engineering problems (e.g., manufacturing
optimization, product design, molecular engineering), where the objective to
optimize is black-box and expensive, and the design space is inherently
discrete. One weakness of existing methods is that they are not robust: these
methods perform well under certain assumptions, but yield poor results when
such assumptions (which are difficult to verify in black-box problems) are
violated. ATM addresses this via the use of marginal tail means for
optimization, which combines both rank-based and model-based methods. The
trade-off between rank- and model-based optimization is tuned by first
identifying important main effects and interactions, then finding a good
compromise which best exploits additive structure. By adaptively tuning this
trade-off from data, ATM provides improved robust optimization over existing
methods, particularly in problems with (i) a large number of factors, (ii)
unordered factors, or (iii) experimental noise. We demonstrate the
effectiveness of ATM in simulations and in two real-world engineering problems:
the first on robust parameter design of a circular piston, and the second on
product family design of a thermistor network.
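The abstract does not reproduce the full ATM procedure, but the core quantity it names — a marginal tail mean — can be sketched in a few lines. The sketch below is an illustrative reading only (the function, data, and the specific tail rule are assumptions, not taken from the paper): assuming minimization over a discrete design matrix, it computes, for each level of each factor, the mean of the best alpha-fraction of responses observed at that level.

```python
import numpy as np

def marginal_tail_means(X, y, alpha=0.25):
    """For each factor (column of X) and each of its levels, return the
    mean of the smallest alpha-fraction of responses observed at that
    level (assuming minimization); smaller tail means flag promising levels."""
    result = {}
    for j in range(X.shape[1]):
        levels = {}
        for level in np.unique(X[:, j]):
            y_level = np.sort(y[X[:, j] == level])
            k = max(1, int(np.ceil(alpha * len(y_level))))
            levels[level] = float(y_level[:k].mean())
        result[j] = levels
    return result

# Toy run: two 2-level factors, four observed responses (minimize y)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([3.0, 5.0, 1.0, 2.0])
mtm = marginal_tail_means(X, y, alpha=0.5)
# Pick, per factor, the level with the smallest tail mean
best_levels = {j: min(lv, key=lv.get) for j, lv in mtm.items()}
```

Robustness here comes from the tail mean being a rank-compatible summary: unlike a full-sample mean, it is insensitive to poor responses outside the tail.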
Tailoring the Statistical Experimental Design Process for LVC Experiments
The use of Live, Virtual, and Constructive (LVC) simulation environments is increasingly being examined for potential analytical use, particularly in test and evaluation. LVC simulation environments provide a mechanism for conducting joint mission testing and system-of-systems testing when scale and resource limitations prevent assembling the necessary density and diversity of assets required for these complex and comprehensive tests. The statistical experimental design process is re-examined for potential application to LVC experiments, and several additional considerations are identified to augment the experimental design process for use with LVC. This augmented statistical experimental design process is demonstrated by a case study involving a series of tests of an experimental data link for strike aircraft, using LVC simulation as the test environment. The goal of these tests is to assess the usefulness of the information presented to aircrew members via different datalink capabilities. The statistical experimental design process is used to structure the experiment, leading to the discovery of faulty assumptions and planning mistakes that could potentially invalidate the results of the experiment. Lastly, an aggressive sequential experimentation strategy is presented for LVC experiments when test resources are limited. This strategy depends on a foldover algorithm that we developed for nearly orthogonal arrays to rescue LVC experiments when important factor effects are confounded.
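The authors' foldover algorithm for nearly orthogonal arrays is not given in the abstract. As background, the classic full foldover for two-level designs — append the sign-reversed runs — shows the de-confounding idea such algorithms build on. A minimal sketch, assuming ±1 coding and an illustrative base design (not the authors' method):

```python
import numpy as np

def full_foldover(design):
    """Classic full foldover of a two-level design coded in +/-1:
    append the sign-reversed runs, which de-aliases main effects
    from two-way interactions in the combined design."""
    design = np.asarray(design)
    return np.vstack([design, -design])

# Illustrative 4-run, 3-factor fraction where factor 0 is fully
# aliased with the interaction of factors 1 and 2
base = np.array([
    [ 1,  1,  1],
    [ 1, -1, -1],
    [-1,  1, -1],
    [-1, -1,  1],
])
folded = full_foldover(base)

main = folded[:, 0]
interaction = folded[:, 1] * folded[:, 2]
aliased_before = int(base[:, 0] @ (base[:, 1] * base[:, 2]))  # nonzero: aliased
aliased_after = int(main @ interaction)                        # zero: orthogonal
```

A zero inner product between the main-effect and interaction columns in the folded design means the two effects can now be estimated separately.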
Design efficiency for non-market valuation with choice modelling: how to measure it, what to report and why
We review the basic principles for the evaluation of design efficiency in discrete choice modelling, with a focus on the efficiency of WTP estimates from the multinomial logit model. The discussion is developed under the realistic assumption that researchers can plausibly define a prior on the utility coefficients. Some new measures of design performance in applied studies are proposed and their rationale discussed. An empirical example, based on the generation and comparison of fifteen separate designs from a common set of assumptions, illustrates the relevant considerations in the context of non-market valuation, with particular emphasis placed on C-efficiency. Conclusions are drawn for the practice of reporting in non-market valuation and for future work on design research.
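As background for the efficiency measures discussed, a local D-error for a multinomial logit design under a point prior on the utility coefficients can be sketched as follows. The choice sets and prior below are illustrative assumptions; the C-efficiency measure for WTP estimates that the paper emphasizes is a further refinement not shown here.

```python
import numpy as np

def mnl_d_error(choice_sets, beta):
    """Local D-error of a discrete choice design under a point prior beta:
    det(inverse Fisher information)^(1/K), lower is better. Each choice
    set is an (alternatives x K) attribute matrix."""
    K = len(beta)
    info = np.zeros((K, K))
    for X in choice_sets:
        u = X @ beta
        p = np.exp(u - u.max())
        p /= p.sum()
        # MNL Fisher information contribution: X' (diag(p) - p p') X
        info += X.T @ (np.diag(p) - np.outer(p, p)) @ X
    return np.linalg.det(np.linalg.inv(info)) ** (1.0 / K)

# Two illustrative paired-comparison choice sets with two attributes
sets = [np.array([[1.0, 1.0], [-1.0, -1.0]]),
        np.array([[1.0, -1.0], [-1.0, 1.0]])]
prior = np.array([0.5, 0.2])
d_err = mnl_d_error(sets, prior)
```

Note the role of the prior: under a zero prior all choice probabilities are equal and the information per set is maximal, so a nonzero prior generally raises the D-error of the same design.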
Experimental Designs, Meta-Modeling, and Meta-learning for Mixed-Factor Systems with Large Decision Spaces
Many Air Force studies require a design and analysis process that can accommodate the computational challenges associated with complex systems, simulations, and real-world decisions. For systems with large decision spaces and a mixture of continuous, discrete, and categorical factors, nearly orthogonal-and-balanced (NOAB) designs can be used as efficient, representative subsets of all possible design points for system evaluation, where meta-models are then fitted to act as surrogates to system outputs. The mixed-integer linear programming (MILP) formulations used to construct first-order NOAB designs are extended to solve for low correlation between second-order model terms (i.e., two-way interactions and quadratics). The resulting second-order approaches are shown to improve design performance measures for second-order model parameter estimation and prediction variance, as well as for protection from bias due to model misspecification with respect to second-order terms. Further extensions are developed to construct batch sequential NOAB designs, giving experimenters more flexibility by creating multiple stages of design points using different NOAB approaches, where simultaneous construction of stages is shown to outperform design augmentation overall. To reduce cost and add analytical rigor, meta-learning frameworks are developed for accurate and efficient selection of first-order NOAB designs as well as of meta-models that approximate mixed-factor systems.
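The second-order correlation criterion can be illustrated without the MILP machinery: expand a candidate design to its two-way interaction and quadratic columns and report the maximum absolute pairwise correlation among the model columns. This is a sketch of the quality measure only, not the authors' formulation:

```python
import numpy as np
from itertools import combinations, product

def second_order_max_abs_corr(X):
    """Expand a design to its second-order model columns (main effects,
    two-way interactions, quadratics) and return the maximum absolute
    pairwise correlation among the non-constant columns."""
    cols = [X[:, j] for j in range(X.shape[1])]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
    cols += [X[:, j] ** 2 for j in range(X.shape[1])]
    M = np.column_stack(cols)
    M = M[:, M.std(axis=0) > 1e-12]  # drop constant columns
    C = np.corrcoef(M, rowvar=False)
    np.fill_diagonal(C, 0.0)
    return float(np.abs(C).max())

# A 2^3 full factorial in +/-1 coding is second-order orthogonal
# (its quadratic columns are constant and drop out), so the score is 0
full_fact = np.array(list(product([-1.0, 1.0], repeat=3)))
score = second_order_max_abs_corr(full_fact)
```

A NOAB-style construction aims to push this score toward zero for much smaller, mixed-factor designs where exact orthogonality is unattainable.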
Data Farming: The Meanings and Methods Behind the Metaphor
17 USC 105 interim-entered record; under review. The article of record as published may be found at https://doi.org/10.36819/SW21.002. Operational Research Society Simulation Workshop 2021.
Data farming captures the notion of purposeful data generation from simulation models. The ready availability of computing power has fundamentally changed the way simulation and other computational models can be used to provide insights to decision makers. Large-scale designed experiments let us grow the simulation output efficiently and effectively. We can explore massive input spaces, use statistical and visualization techniques to uncover interesting features of complex response surfaces, and explicitly identify cause-and-effect relationships. Nonetheless, there are many opportunities for research methods that could further enhance this process. I will begin with a brief overview of key differences between physical and simulation experiments, as well as current data farming capabilities and their relationship to emerging techniques in data science and analytics. I will then share some thoughts about opportunities and challenges for further improving the state of the art, and transforming the state of the practice, in this domain.
BLOCKING FACTORIAL DESIGNS IN GREENHOUSE EXPERIMENTS
Experiments in greenhouses usually have to be conducted with very limited resources. This makes it particularly important to control the between-plot variation by appropriate use of blocking. Many greenhouse experiments are naturally laid out in a pattern that makes a class of designs known as semi-Latin squares useful. Their properties have been studied recently by a number of authors, and this work is reviewed. Often, the experimental treatments will have a factorial structure. An example of a 2³ structure is used to show how factorial treatments can be assigned to treatment labels to ensure that the appropriate information is obtained from the experiment.
EXTENDING AND IMPROVING DESIGNS FOR LARGE-SCALE COMPUTER EXPERIMENTS
This research develops methods that increase the inventory of space-filling designs (SFDs) for large-scale computer-based experiments. We present a technique enabling researchers to add sequential blocks of design points effectively and efficiently to existing SFDs. We accomplish this through a quadratically constrained mixed-integer program that augments cataloged or computationally expensive designs by optimally permuting and stacking columns of an initial base design to minimize the maximum absolute pairwise correlation among columns in the new extended design. We extend many classes of SFDs to dimensions that are currently not easily obtainable. Adding new design points provides more degrees of freedom for building metamodels and assessing fit. The resulting extended designs have better correlation and space-filling properties than the original base designs and compare well with other types of SFDs created from scratch in the extended design space. In addition, through massive computer-based experimentation, we compare popular software packages for generating SFDs and provide insight into the methods and relationships among design measures of correlation and space-filling. These results provide experimenters with a broad understanding of SFD software packages, algorithms, and optimality criteria. Further, we provide a probability-distribution model for the maximum absolute pairwise correlation among columns in the widely used maximin Latin hypercube designs.
Lieutenant Colonel, United States Marine Corps. Approved for public release. Distribution is unlimited.
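The two design measures discussed above — maximum absolute pairwise column correlation and space-filling (here taken as minimum pairwise point distance) — can be sketched for a random Latin hypercube design as follows. This is illustrative only; cataloged maximin designs are optimized far more carefully than this crude random search.

```python
import numpy as np

def random_lhd(n, k, rng):
    """Random Latin hypercube: each column is an independent permutation
    of the n cell midpoints (i + 0.5) / n."""
    return np.column_stack([(rng.permutation(n) + 0.5) / n for _ in range(k)])

def design_measures(D):
    """Return (max absolute pairwise column correlation, minimum pairwise
    point distance): lower correlation means nearer orthogonal; a larger
    minimum distance means better space-filling."""
    C = np.corrcoef(D, rowvar=False)
    np.fill_diagonal(C, 0.0)
    diff = D[:, None, :] - D[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    min_dist = dist[np.triu_indices(len(D), k=1)].min()
    return float(np.abs(C).max()), float(min_dist)

# Crude maximin search: keep the random candidate with the largest
# minimum pairwise distance
rng = np.random.default_rng(0)
best = max((random_lhd(16, 4, rng) for _ in range(200)),
           key=lambda D: design_measures(D)[1])
max_corr, min_dist = design_measures(best)
```

The tension between the two criteria is visible even at this scale: maximizing the minimum distance does not by itself drive the column correlations to zero, which is why correlation-aware constructions are studied separately.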
AN OPERATIONAL EFFECTIVENESS ANALYSIS ON MANNED-UNMANNED TEAMING USING WEAPONIZED UNMANNED VEHICLES IN URBAN TERRAIN
In recent years, militaries have strengthened efforts to integrate unmanned technologies to improve manned-unmanned teaming (MUM-T) capabilities. As some countries' fighting-age populations are decreasing, militaries are turning to readily available, cost-efficient, and sophisticated unmanned technologies. MUM-T holds great potential not only to alleviate manpower shortages in militaries, but also to improve combat capabilities. This thesis studies the effectiveness of MUM-T at the frontline, down to infantry teams supporting offensive operations in urban terrain. An agent-based simulation is used to model a MUM-T combat operation with and without an unmanned ground vehicle (UGV) supporting an infantry company. An analysis was conducted on more than 76,800 simulated battles. It was observed that MUM-T concepts could dramatically increase combat effectiveness, as assessed by increased enemy casualties. The UGV reloading time, weapon accuracy, and own force structure were also observed to significantly impact the infantry's lethality and survivability. This analysis concludes that implementation of MUM-T at the small-unit tactical level has great potential to enhance overall combat performance. Moving forward, combat models could be integrated into future military exercises such that the findings from simulations can be verified and validated.
Major, Singapore Army. Approved for public release. Distribution is unlimited.