DAKOTA, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 4.0 developers manual
The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient- and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a developers manual for the DAKOTA software and describes the DAKOTA class hierarchies and their interrelationships. It derives directly from annotation of the actual source code and provides detailed class documentation, including all member functions and attributes.
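The core abstraction described above — iterative analysis methods that operate on any simulation model through a common interface — can be sketched in a few lines of Python. The class and function names below (`Iterator`, `Model`, `ParameterStudy`, `Rosenbrock`) are illustrative stand-ins, not DAKOTA's actual C++ class hierarchy:

```python
from abc import ABC, abstractmethod

class Model(ABC):
    """Abstract interface between an iterative method and a simulation code."""
    @abstractmethod
    def evaluate(self, params):
        ...

class Iterator(ABC):
    """Abstract iterative analysis method; works with any Model."""
    def __init__(self, model):
        self.model = model

    @abstractmethod
    def run(self):
        ...

class ParameterStudy(Iterator):
    """Simple list parameter study: evaluate the model at given points."""
    def __init__(self, model, points):
        super().__init__(model)
        self.points = points

    def run(self):
        return [self.model.evaluate(p) for p in self.points]

class Rosenbrock(Model):
    """A cheap analytic test 'simulation'."""
    def evaluate(self, p):
        x, y = p
        return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

# Any Iterator can drive any Model through the shared interface.
study = ParameterStudy(Rosenbrock(), [(0, 0), (1, 1), (0.5, 0.25)])
results = study.run()
```

Swapping in an optimizer or a sampling method means adding another `Iterator` subclass, while the simulation interface stays untouched — the separation of concerns the report documents in detail.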
DAKOTA, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 4.0 reference manual
The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient- and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a reference manual for the commands specification for the DAKOTA software, providing input overviews, option descriptions, and example specifications.
A graph-based system for network-vulnerability analysis
This paper presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The graph-based tool can identify the set of attack paths that have a high probability of success (or a low effort cost) for the attacker. The system could be used to test the effectiveness of making configuration changes, implementing an intrusion detection system, etc. The analysis system requires three inputs: a database of common attacks, broken into atomic steps; specific network configuration and topology information; and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs or costs representing level-of-effort for the attacker, various graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.
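The shortest-path idea in the last sentence can be sketched concretely: maximizing a product of per-step success probabilities is equivalent to minimizing the sum of their negative logarithms, so Dijkstra's algorithm on `-log(p)` edge weights finds the most probable attack path. The graph, node names, and probabilities below are hypothetical, purely for illustration:

```python
import heapq
import math

def most_probable_attack_path(graph, start, goal):
    """Dijkstra on -log(p) weights: returns the attack path with the
    highest product of per-step success probabilities, and that product.

    graph maps node -> list of (neighbor, success_probability) arcs.
    """
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    visited = set()
    while pq:
        d, node = heapq.heappop(pq)
        if node in visited:
            continue
        visited.add(node)
        if node == goal:
            break
        for nbr, p in graph.get(node, []):
            nd = d - math.log(p)  # weights are nonnegative since p <= 1
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    if goal not in dist:
        return None, 0.0
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    path.reverse()
    return path, math.exp(-dist[goal])

# Hypothetical superset attack graph: nodes are attacker states
# (machine class + privilege level), arcs are atomic attack steps.
g = {
    "outside":       [("dmz_user", 0.8), ("vpn_user", 0.3)],
    "dmz_user":      [("dmz_root", 0.5), ("internal_user", 0.6)],
    "vpn_user":      [("internal_user", 0.9)],
    "internal_user": [("db_admin", 0.4)],
    "dmz_root":      [("db_admin", 0.7)],
}
path, p = most_probable_attack_path(g, "outside", "db_admin")
```

Swapping the probabilities for level-of-effort costs (and dropping the log transform) turns the same search into the paper's low-cost-path analysis.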
Progressive Response Surfaces
Response surface functions are often used as simple and inexpensive replacements for computationally expensive computer models that simulate the behavior of a complex system over some parameter space. Progressive response surfaces are ones that are built up progressively as global information is added from new sample points in the parameter space. As the response surfaces are globally upgraded based on new information, heuristic indications of the convergence of the response surface approximation to the exact (fitted) function can be inferred. Sampling points can be incrementally added in a structured fashion, or in an unstructured fashion. Whatever the approach, at least in early stages of sampling it is usually desirable to sample the entire parameter space uniformly. At later stages of sampling, depending on the nature of the quantity being resolved, it may be desirable to continue sampling uniformly over the entire parameter space (progressive response surfaces), or to switch to a focusing/economizing strategy of preferentially sampling certain regions of the parameter space based on information gained in early stages of sampling (adaptive response surfaces). Here we consider progressive response surfaces where a balanced indication of global response over the parameter space is desired. We use a variant of Moving Least Squares to fit and interpolate structured and unstructured point sets over the parameter space. On a 2-D test problem we compare response surface accuracy for three incremental sampling methods: Progressive Lattice Sampling; simple-random Monte Carlo; and Halton quasi-Monte-Carlo sequences. We are ultimately after a system for constructing efficiently upgradable response surface approximations with reliable error estimates.
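The two ingredients the abstract combines — an incrementally growable quasi-Monte Carlo (Halton) sample set and a Moving Least Squares fit — can be sketched in plain Python. This is a simplified illustration (2-D, linear basis, fixed Gaussian weight width `h`), not the paper's particular MLS variant:

```python
import math

def halton(i, base):
    """i-th element (i >= 1) of the van der Corput sequence in `base`."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(3):
        p = max(range(k, 3), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, 3):
            m = M[r][k] / M[k][k]
            for c in range(k, 4):
                M[r][c] -= m * M[k][c]
    x = [0.0, 0.0, 0.0]
    for k in (2, 1, 0):
        x[k] = (M[k][3] - sum(M[k][c] * x[c] for c in range(k + 1, 3))) / M[k][k]
    return x

def mls_eval(q, pts, vals, h=0.3):
    """Moving Least Squares estimate at query point q: a weighted
    least-squares fit of the linear basis (1, x, y), with Gaussian
    weights centered on q, re-solved for every query point."""
    A = [[0.0] * 3 for _ in range(3)]
    rhs = [0.0] * 3
    for (x, y), f in zip(pts, vals):
        w = math.exp(-((x - q[0]) ** 2 + (y - q[1]) ** 2) / h ** 2)
        b = (1.0, x, y)
        for r in range(3):
            rhs[r] += w * f * b[r]
            for c in range(3):
                A[r][c] += w * b[r] * b[c]
    c0, c1, c2 = solve3(A, rhs)
    return c0 + c1 * q[0] + c2 * q[1]

# The Halton sequence (bases 2 and 3 in 2-D) is naturally progressive:
# appending points keeps the set roughly uniform at every stage.
f = lambda x, y: 2.0 + 3.0 * x - 1.0 * y
pts = [(halton(i, 2), halton(i, 3)) for i in range(1, 21)]
vals = [f(x, y) for x, y in pts]
est = mls_eval((0.5, 0.5), pts, vals)
```

Because the basis is linear, the MLS fit reproduces a linear test function exactly; for nonlinear responses the fit improves as further Halton points are appended, which is the progressive-upgrade behavior the paper studies.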
Reconciled Top-Down and Bottom-Up Hierarchical Multiscale Calibration of BCC Fe Crystal Plasticity
Low-Cost Robust Airfoil Optimization by Variable-Fidelity Models and Stochastic Expansions
In this paper, we present a robust optimization algorithm for low computational cost airfoil design under aleatory uncertainty. Our approach exploits stochastic expansions derived from the Non-Intrusive Polynomial Chaos (NIPC) technique to create response surface approximation (RSA) models utilized in the optimization process. In this work, we employ a combined NIPC expansion approach, where both the design and the uncertain parameters are the input arguments of the RSA model. In order to reduce the computational complexity of the design process, the high-fidelity computational fluid dynamic (CFD) model is replaced by a suitably corrected low-fidelity one, the latter being evaluated using the same CFD solver but with a coarser mesh and relaxed convergence criteria. The model correction is realized at the response level using multi-point output space mapping (OSM). The OSM correction can be obtained without a costly parameter extraction procedure and ensures that the low-fidelity model represents the high-fidelity one with sufficient accuracy. The proposed robust optimization algorithm is applied to the design of transonic airfoils with four deterministic design variables (the airfoil shape parameters and the angle of attack) and one aleatory uncertain variable (the Mach number). In terms of computational cost, the proposed surrogate-based technique outperforms the conventional approach that exclusively uses the high-fidelity model to create the RSA models: the design cost corresponds to only 12 equivalent high-fidelity model evaluations versus 42 for the conventional method.
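The response-level correction idea can be sketched in a simplified, additive form: evaluate the expensive model only at a few base points, fit the high-minus-low difference there, and use the shifted cheap model everywhere else. The 1-D analytic models below are hypothetical stand-ins for the CFD solvers, and this additive variant is a simplification of the paper's multi-point output space mapping:

```python
import math

def fit_line(xs, ys):
    """Least-squares fit y ~ c0 + c1*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    c1 = sxy / sxx
    return my - c1 * mx, c1

def osm_correct(f_lo, f_hi, base_points):
    """Additive multi-point output space mapping (sketch): shift the
    cheap model by a trend fitted to high-minus-low differences at a
    handful of base points, so f_hi is only evaluated there."""
    diffs = [f_hi(x) - f_lo(x) for x in base_points]
    c0, c1 = fit_line(base_points, diffs)
    return lambda x: f_lo(x) + c0 + c1 * x

# Hypothetical stand-ins: the "high-fidelity" response differs from
# the "low-fidelity" one by a smooth trend, which OSM captures from
# just three expensive evaluations.
f_lo = math.sin
f_hi = lambda x: math.sin(x) + 0.2 + 0.1 * x
corrected = osm_correct(f_lo, f_hi, [0.0, 1.0, 2.0])
```

When the fidelity gap is a smooth trend, a few base-point evaluations suffice; this is the mechanism behind the 12-versus-42 evaluation savings reported above.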