    Data Driven Surrogate Based Optimization in the Problem Solving Environment WBCSim

    Large-scale, multidisciplinary engineering design is difficult due to the complexity and dimensionality of these problems. Direct coupling between the analysis codes and the optimization routines can be prohibitively time consuming because of the complexity of the underlying simulation codes. One way of tackling this problem is to construct computationally cheaper approximations of the expensive simulations that mimic the behavior of the simulation model as closely as possible. This paper presents a data-driven, surrogate-based optimization algorithm that uses a trust region based sequential approximate optimization (SAO) framework and a statistical sampling approach based on design of experiment (DOE) arrays. The algorithm is implemented using techniques from two packages, SURFPACK and SHEPPACK, which provide a collection of approximation algorithms to build the surrogates; three DOE techniques (full factorial (FF), Latin hypercube sampling (LHS), and central composite design (CCD)) are used to train the surrogates. The results are compared with the optimization results obtained by directly coupling an optimizer with the simulation code. The biggest concern in using the SAO framework based on statistical sampling is the generation of the required database: as the number of design variables grows, the computational cost of generating the database grows rapidly. A data-driven approach is proposed to tackle this situation, in which the expensive simulation is run if and only if a nearby data point does not exist in the cumulatively growing database. Over time the database matures and is enriched as more optimizations are performed. Results show that the proposed methodology dramatically reduces the total number of calls to the expensive simulation during the optimization process.
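    The "run only if no nearby point exists" idea can be sketched as a simple cached evaluator. This is an illustrative sketch, not the paper's implementation: the class name, the Euclidean distance test, and the tolerance parameter are all assumptions made for demonstration.

```python
import math

class SimulationCache:
    """Cumulatively growing database of (design point, response) pairs.

    Sketch of the data-driven idea: invoke the expensive simulation
    only when no stored design point lies within `tol` of the query.
    All names here are illustrative, not from the paper.
    """

    def __init__(self, simulate, tol=1e-3):
        self.simulate = simulate   # the expensive simulation code
        self.tol = tol             # what counts as "nearby"
        self.points = []           # stored (x, f(x)) pairs
        self.calls = 0             # number of expensive evaluations

    def evaluate(self, x):
        # Reuse a stored response if a nearby point already exists.
        for xp, fp in self.points:
            if math.dist(x, xp) <= self.tol:
                return fp
        # Otherwise run the simulation and enrich the database.
        f = self.simulate(x)
        self.calls += 1
        self.points.append((tuple(x), f))
        return f
```

    As repeated optimizations query overlapping regions of the design space, more evaluations hit the database and the count of expensive simulation calls grows far more slowly than the count of queries.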

    Computational Steering in the Problem Solving Environment WBCSim

    Computational steering allows scientists to interactively control a numerical experiment, adjust parameters of the computation on the fly, and explore “what if” scenarios. It effectively reduces computation time, makes research more efficient, and opens up new product design opportunities. Several problem solving environments (PSEs) feature computational steering, but there is hardly any work explaining how to enable computational steering for PSEs built on legacy simulation codes. This paper describes a practical approach to implementing computational steering for such PSEs, using WBCSim as an example. WBCSim is a Web-based simulation system designed to increase the productivity of wood scientists conducting research on wood-based composites manufacturing processes, and it serves as a prototypical example for the design, construction, and evaluation of small-scale PSEs. Various changes have been made to support computational steering across the three layers (client, server, and developer) comprising the WBCSim system. A detailed description of the WBCSim system architecture is presented, along with a typical scenario of computational steering usage.

    Detecting High-Energy Emission from Gamma-Ray Bursts with EGRET and GLAST

    The research described in this dissertation explores the detection of high-energy emission from gamma-ray bursts (GRBs) with EGRET and GLAST. Data from the EGRET experiment were searched for evidence of ~1-250 MeV emission that preceded or followed gamma-ray bursts on a time scale of hours. This search led to the discovery of a gamma-ray burst with high-energy, post-quiescent emission from the prompt phase that was coincident with lower-energy (keV) emission. To perform detailed event-filtering studies for the GLAST Large Area Telescope (LAT), the flight software event filter was embedded in the standard science analysis environment. The event trigger rate, the reasons it must be reduced, and hardware-level methods of reducing it are studied. Much work was done to improve the performance of the prototype event filter, and additional work was done to develop algorithms that allow the LAT to distinguish Earth albedo photons from celestial gamma rays and to eliminate albedo events from the data stream. It is shown that the background rate can be reduced to meet LAT mission requirements while keeping the gamma-ray acceptance rate high enough to exceed the relevant LAT requirements for those events. Using the onboard event filter, real-time, onboard gamma-ray burst detection was then studied. A detection algorithm had been developed by members of the LAT collaboration, but it required a lower onboard background rate than the basic LAT requirement for downlink, in addition to knowledge of incident gamma-ray directions. Therefore, several methods of reducing the background rate to acceptable levels were provided, and onboard track reconstruction methods were created and tested. GRB detection was tested for two background filters and two track reconstruction methods using simulated bursts with realistic light curves and spectral characteristics. With prototype background cuts, track reconstruction, and burst detection algorithms, the LAT burst detection requirements were exceeded. Suggestions were offered for enhancing burst detection performance in the months before GLAST is launched.