
    Application of State of the Art Modeling Techniques to Predict Flooding and Waves for a Coastal Area within a Protected Bay

    Flood Insurance Rate Maps (FIRMs) are developed by the Federal Emergency Management Agency (FEMA) to provide guidance in establishing the risk to structures and infrastructure from storm surges and associated waves in the coastal zone. The maps are used by state agencies and municipalities to help guide coastal planning and to establish the minimum elevation and construction standards for new or substantially improved structures. A summary of the methods used and a comparison with the results of 2013 FIRM mapping are presented for Warwick, Rhode Island (RI), a coastal community located within Narragansett Bay. Because of its location, Warwick is protected from significant coastal erosion and wave attack, but is subject to surge amplification. Concerns surrounding the FEMA methods used in the 2013 FIRM analysis are put in context with the National Research Council's (NRC) 2009 review of the FEMA coastal mapping program. New mapping is then performed using state-of-the-art, fully coupled surge and wave modeling and data analysis methods to address the NRC concerns. The new maps and methodologies comply with FEMA regulations and guidelines. This new approach makes extensive use of the numerical modeling results from the recent US Army Corps of Engineers North Atlantic Coast Comprehensive Study (NACCS, 2015). Revised flooding maps are presented and compared to the 2013 FIRM maps to provide insight into the differences. The new maps highlight the importance of developing better estimates of surge dynamics, and the advance in nearshore mapping of waves in flood-inundated areas afforded by state-of-the-art, two-dimensional wave transformation models.

    A unified approach to linking experimental, statistical and computational analysis of spike train data

    A fundamental issue in neuroscience is how to identify the multiple biophysical mechanisms through which neurons generate observed patterns of spiking activity. In previous work, we proposed a method for linking observed patterns of spiking activity to specific biophysical mechanisms based on a state space modeling framework and a sequential Monte Carlo, or particle filter, estimation algorithm. We have shown, in simulation, that this approach is able to identify a space of simple biophysical models that were consistent with observed spiking data (and included the model that generated the data), but have yet to demonstrate the application of the method to identify realistic currents from real spike train data. Here, we apply the particle filter to spiking data recorded from rat layer V cortical neurons, and correctly identify the dynamics of a slow, intrinsic current. The underlying intrinsic current is successfully identified in four distinct neurons, even though the cells exhibit two distinct classes of spiking activity: regular spiking and bursting. This approach – linking statistical, computational, and experimental neuroscience – provides an effective technique to constrain detailed biophysical models to specific mechanisms consistent with observed spike train data.
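The core estimation idea in the abstract above can be sketched with a bootstrap particle filter on a toy state-space spiking model. Everything here is hypothetical and much simpler than the authors' biophysical models: the hidden "slow current" follows an AR(1) process and spike counts are Poisson with a log-linear rate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy state-space model (hypothetical, for illustration only):
# hidden slow current x_t follows an AR(1) process, and the observed
# spike count y_t is Poisson with rate exp(a + b * x_t).
T, N = 200, 500                  # time steps, particles
a, b, phi, q = 0.5, 1.0, 0.95, 0.1

# simulate synthetic "recorded" data
x = np.zeros(T)
for t in range(1, T):
    x[t] = phi * x[t - 1] + q * rng.standard_normal()
y = rng.poisson(np.exp(a + b * x))

# bootstrap particle filter
particles = np.zeros(N)
est = np.zeros(T)
for t in range(T):
    # propagate particles through the assumed state dynamics
    particles = phi * particles + q * rng.standard_normal(N)
    # weight by the Poisson spike-count likelihood
    lam = np.exp(a + b * particles)
    w = np.exp(y[t] * np.log(lam) - lam)
    w /= w.sum()
    est[t] = w @ particles       # filtered estimate of the hidden current
    # multinomial resampling to avoid weight degeneracy
    particles = particles[rng.choice(N, N, p=w)]

print(np.corrcoef(x, est)[0, 1])  # filtered estimate tracks the hidden state
```

In the paper the same machinery is applied to real spike trains and to mechanistic (conductance-based) state equations rather than this linear-Gaussian toy.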

    Experimental Design for Sensitivity Analysis, Optimization and Validation of Simulation Models

    This chapter gives a survey on the use of statistical designs for what-if analysis in simulation, including sensitivity analysis, optimization, and validation/verification. Sensitivity analysis is divided into two phases. The first phase is a pilot stage, which consists of screening or searching for the important factors among (say) hundreds of potentially important factors. A novel screening technique is presented, namely sequential bifurcation. The second phase uses regression analysis to approximate the input/output transformation that is implied by the simulation model; the resulting regression model is also known as a metamodel or a response surface. Regression analysis gives better results when the simulation experiment is well designed, using either classical statistical designs (such as fractional factorials) or optimal designs (such as those pioneered by Fedorov, Kiefer, and Wolfowitz). To optimize the simulated system, the analysts may apply Response Surface Methodology (RSM); RSM combines regression analysis, statistical designs, and steepest-ascent hill-climbing. To validate a simulation model, again regression analysis and statistical designs may be applied. Several numerical examples and case studies illustrate how statistical techniques can reduce the ad hoc character of simulation; that is, these statistical techniques can make simulation studies give more general results, in less time. Appendix 1 summarizes confidence intervals for expected values, proportions, and quantiles, in terminating and steady-state simulations. Appendix 2 gives details on four variance reduction techniques, namely common pseudorandom numbers, antithetic numbers, control variates or regression sampling, and importance sampling. Appendix 3 describes jackknifing, which may give robust confidence intervals.
    Keywords: least squares; distribution-free; non-parametric; stopping rule; run-length; Von Neumann; median; seed; likelihood ratio
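The sequential-bifurcation idea mentioned above can be sketched in a few lines. This is a deliberately simplified version, not the chapter's full procedure: it assumes a deterministic simulation whose output is linear in binary factors with known (non-negative) effect signs, so group effects cannot cancel; factor indices and effect sizes are invented for the example.

```python
import numpy as np

k = 128
beta = np.zeros(k)
beta[[5, 40, 99]] = [4.0, 2.5, 6.0]   # three important factors (hypothetical)

def simulate(levels):
    """Toy 'simulation': output at the given 0/1 factor levels."""
    return beta @ levels

def group_effect(lo, hi):
    """Effect of switching factors lo..hi-1 from 0 to 1, others held at 0."""
    levels = np.zeros(k)
    levels[lo:hi] = 1.0
    return simulate(levels) - simulate(np.zeros(k))

def bifurcate(lo, hi, threshold=1.0):
    """Recursively split groups whose aggregate effect exceeds the threshold."""
    if group_effect(lo, hi) <= threshold:
        return []                      # whole group unimportant: one run settles it
    if hi - lo == 1:
        return [lo]                    # isolated an important factor
    mid = (lo + hi) // 2
    return bifurcate(lo, mid, threshold) + bifurcate(mid, hi, threshold)

print(bifurcate(0, k))                 # recovers the important factor indices
```

The appeal of the technique is visible even in this sketch: the three important factors among 128 are found with far fewer simulation runs than one run per factor.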

    Tracking slow modulations in synaptic gain using dynamic causal modelling : validation in epilepsy

    In this work we propose a proof of principle that dynamic causal modelling can identify plausible mechanisms at the synaptic level underlying brain state changes over a timescale of seconds. As a benchmark example for validation we used intracranial electroencephalographic signals in a human subject. These data were used to infer the (effective connectivity) architecture of synaptic connections among neural populations assumed to generate seizure activity. Dynamic causal modelling allowed us to quantify empirical changes in spectral activity in terms of a trajectory in parameter space, identifying key synaptic parameters or connections that cause observed signals. Using recordings from three seizures in one patient, we considered a network of two sources (within and just outside the putative ictal zone). Bayesian model selection was used to identify the intrinsic (within-source) and extrinsic (between-source) connectivity. Having established the underlying architecture, we were able to track the evolution of key connectivity parameters (e.g., inhibitory connections to superficial pyramidal cells) and test specific hypotheses about the synaptic mechanisms involved in ictogenesis. Our key finding was that intrinsic synaptic changes were sufficient to explain seizure onset, where these changes showed dissociable time courses over several seconds. Crucially, these changes spoke to an increase in the sensitivity of principal cells to intrinsic inhibitory afferents and a transient loss of excitatory-inhibitory balance.
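Bayesian model selection, as used in the abstract above, scores candidate models by approximate log evidence and keeps the one that best trades off fit against complexity. DCM itself uses variational free energy; the comparison logic alone can be illustrated with a BIC approximation on a toy regression problem (all data and models here are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data that are truly linear in x, with small Gaussian noise.
n = 200
x = np.linspace(0, 1, n)
y = 2.0 + 3.0 * x + 0.1 * rng.standard_normal(n)

def bic(X, y):
    """BIC-style approximate log evidence for a linear-Gaussian model."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    k = X.shape[1]
    # goodness of fit minus a complexity penalty that grows with k
    return -0.5 * n * np.log(rss / n) - 0.5 * k * np.log(n)

X1 = np.column_stack([np.ones(n), x])               # adequate linear model
X2 = np.column_stack([np.ones(n), x, x**2, x**3])   # needlessly complex model

print(bic(X1, y) > bic(X2, y))  # the simpler, adequate model wins
```

In DCM the "models" differ in which intrinsic and extrinsic connections are allowed to change, but the selection step is the same kind of evidence comparison.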

    Ultrafast dynamic conductivity and scattering rate saturation of photoexcited charge carriers in silicon investigated with a midinfrared continuum probe

    We employ ultra-broadband terahertz-midinfrared probe pulses to characterize the optical response of photoinduced charge-carrier plasmas in high-resistivity silicon in a reflection geometry, over a wide range of excitation densities (10^{15}-10^{19} cm^{-3}) at room temperature. In contrast to conventional terahertz spectroscopy studies, this enables one to directly cover the frequency range encompassing the resultant plasma frequencies. The intensity reflection spectra of the thermalized plasma, measured using sum-frequency (up-conversion) detection of the probe pulses, can be modeled well by a standard Drude model with a density-dependent momentum scattering time of approx. 200 fs at low densities, reaching approx. 20 fs for densities of approx. 10^{19} cm^{-3}, where the increase of the scattering rate saturates. This behavior can be reproduced well with theoretical results based on the generalized Drude approach for the electron-hole scattering rate, where the saturation occurs due to phase-space restrictions as the plasma becomes degenerate. We also study the initial sub-picosecond temporal development of the Drude response, and discuss the observed rise in the scattering time in terms of initial charge-carrier relaxation, as well as the optical response of the photoexcited sample as predicted by finite-difference time-domain simulations.
    Comment: 9 pages, 4 figures
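The standard Drude description used above is easy to evaluate numerically. A minimal sketch of the normal-incidence intensity reflectivity of a carrier plasma follows; the effective-mass value is an assumption chosen only to make the example concrete, not a value taken from the paper.

```python
import numpy as np

# Physical constants and illustrative silicon parameters.
e = 1.602e-19                 # elementary charge (C)
eps0 = 8.854e-12              # vacuum permittivity (F/m)
m_eff = 0.16 * 9.109e-31      # assumed reduced optical mass (kg), hypothetical
eps_inf = 11.7                # background permittivity of silicon

def drude_reflectivity(freq_thz, n_cm3, tau_fs):
    """Normal-incidence intensity reflectivity of a Drude plasma in silicon."""
    w = 2 * np.pi * freq_thz * 1e12          # angular frequency (rad/s)
    n = n_cm3 * 1e6                          # carrier density (m^-3)
    tau = tau_fs * 1e-15                     # momentum scattering time (s)
    wp2 = n * e**2 / (eps0 * m_eff)          # squared plasma frequency
    eps = eps_inf - wp2 / (w**2 + 1j * w / tau)   # Drude dielectric function
    r = (np.sqrt(eps) - 1) / (np.sqrt(eps) + 1)   # Fresnel amplitude coefficient
    return np.abs(r) ** 2

freqs = np.linspace(1, 100, 5)               # THz-midinfrared probe range
R = drude_reflectivity(freqs, 1e18, 50.0)
print(R)
```

Sweeping `n_cm3` over the paper's 10^15-10^19 cm^-3 range moves the plasma edge through the probe window, which is exactly why the ultra-broadband probe is needed to pin down the scattering time.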

    An efficient surrogate model for emulation and physics extraction of large eddy simulations

    In the quest for advanced propulsion and power-generation systems, high-fidelity simulations are too computationally expensive to survey the desired design space, and a new design methodology is needed that combines engineering physics, computer simulations and statistical modeling. In this paper, we propose a new surrogate model that provides efficient prediction and uncertainty quantification of turbulent flows in swirl injectors with varying geometries, devices commonly used in many engineering applications. The novelty of the proposed method lies in the incorporation of known physical properties of the fluid flow as simplifying assumptions for the statistical model. In view of the massive simulation data at hand, which is on the order of hundreds of gigabytes, these assumptions allow for accurate flow predictions in around an hour of computation time. By contrast, existing flow emulators which forgo such simplifications may require more computation time for training and prediction than is needed for conducting the simulation itself. Moreover, by accounting for coupling mechanisms between flow variables, the proposed model can jointly reduce prediction uncertainty and extract useful flow physics, which can then be used to guide further investigations.
    Comment: Submitted to JASA A&C
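The emulation idea above, training a cheap statistical surrogate on a few expensive simulation runs, can be sketched with plain Gaussian-process regression. This is generic GP regression, not the paper's physics-informed surrogate, and `expensive_sim` is a hypothetical stand-in for an LES run:

```python
import numpy as np

def expensive_sim(x):
    """Hypothetical stand-in for an expensive simulation of one output."""
    return np.sin(3 * x) + 0.5 * x

def rbf(a, b, ell=0.3):
    """Squared-exponential covariance between two 1-D input sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

X = np.linspace(0, 2, 8)                # 8 training "geometries"
y = expensive_sim(X)                    # 8 expensive runs
Xs = np.linspace(0, 2, 50)              # cheap prediction grid

K = rbf(X, X) + 1e-8 * np.eye(len(X))   # jitter for numerical stability
Ks = rbf(Xs, X)
mean = Ks @ np.linalg.solve(K, y)       # posterior predictive mean
v = np.linalg.solve(K, Ks.T)
var = 1.0 - np.einsum('ij,ji->i', Ks, v)  # posterior predictive variance

print(float(np.max(np.abs(mean - expensive_sim(Xs)))))  # small emulation error
```

The paper's contribution is precisely what this sketch lacks: physical simplifying assumptions and cross-variable coupling that keep such a surrogate tractable on hundreds of gigabytes of LES output.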