
    Multivariate Steepest Ascent Using Bayesian Reliability

    The path of steepest ascent can be used to optimize a response in an experiment, but problems can occur with multiple responses. Past approaches to this issue, such as Del Castillo's overlapping confidence cones and Mee and Xiao's Pareto optimality, have not considered the correlations of the responses or parameter uncertainty. We propose a new method that uses the Bayesian reliability to calculate this direction. We illustrate the method with four examples: a 2-factor, 2-response experiment where the paths of steepest ascent are similar, confirming that our results match Del Castillo's and Mee and Xiao's; a 2-factor, 2-response experiment with disparate paths of steepest ascent, illustrating the importance of the Bayesian reliability; two simulation examples, showing that parameter uncertainty is accounted for; and a 5-factor, 2-response experiment demonstrating that the method is not dimensionally limited. With a Bayesian reliable point, a direction of multivariate steepest ascent can be found.
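    The abstract does not spell out the computation, but the underlying idea, scoring a candidate ascent direction by the probability that every response improves along it, can be sketched. Below is a minimal illustration, not the authors' exact method: normal coefficient draws for two fitted first-order response surfaces stand in for a posterior, and a direction's "reliability" is the fraction of draws under which both directional derivatives are positive. All numbers are invented for the example.

        # Minimal sketch (not the authors' exact method): score ascent
        # directions by the fraction of coefficient draws under which
        # BOTH responses improve -- a stand-in for "Bayesian reliability".
        import numpy as np

        rng = np.random.default_rng(0)

        def reliability(direction, betas1, betas2):
            """Fraction of draws with positive directional derivative
            for both responses."""
            d = direction / np.linalg.norm(direction)
            up1 = betas1 @ d > 0          # response 1 improves
            up2 = betas2 @ d > 0          # response 2 improves
            return np.mean(up1 & up2)

        # Illustrative draws for a 2-factor experiment: each row is one
        # draw of the first-order slope coefficients of a fitted response.
        betas1 = rng.normal([1.0, 0.5], 0.2, size=(4000, 2))
        betas2 = rng.normal([0.8, 0.9], 0.2, size=(4000, 2))

        # Scan candidate directions on the unit circle; keep the best.
        angles = np.linspace(0, 2 * np.pi, 360, endpoint=False)
        dirs = np.column_stack([np.cos(angles), np.sin(angles)])
        scores = [reliability(d, betas1, betas2) for d in dirs]
        best = dirs[int(np.argmax(scores))]
        print(f"most reliable direction: {best}, reliability={max(scores):.3f}")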

    A control algorithm for autonomous optimization of extracellular recordings

    This paper develops a control algorithm that can autonomously position an electrode so as to find and then maintain an optimal extracellular recording position. The algorithm was developed and tested in a two-neuron computational model representative of the cells found in cerebral cortex. The algorithm is based on a stochastic optimization of a suitably defined signal quality metric and is shown to be capable of finding the optimal recording position along representative sampling directions, as well as maintaining the optimal signal quality in the face of modeled tissue movements. The application of the algorithm to acute neurophysiological recording experiments and its potential implications for chronic recording electrode arrays are discussed.
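    As a rough sketch of the paper's core idea, stochastic-approximation hill climbing on a noisy signal-quality metric can be illustrated in a few lines. The one-dimensional depth coordinate, the synthetic SNR curve, and the gain sequences below are assumptions for the illustration, not the paper's model or tuning.

        # Minimal sketch: climb a noisy signal-quality curve with a
        # Kiefer-Wolfowitz-style finite-difference ascent. The metric
        # and all constants are hypothetical.
        import numpy as np

        rng = np.random.default_rng(1)

        def snr(depth):
            # Hypothetical quality metric: peaks at depth 0.6, noisy readout.
            return np.exp(-((depth - 0.6) ** 2) / 0.2) + 0.05 * rng.normal()

        depth = 0.2                               # initial electrode position
        for k in range(1, 201):
            a = 0.1 / k ** 0.6                    # decaying step size
            c = 0.05 / k ** 0.2                   # decaying probe offset
            g = (snr(depth + c) - snr(depth - c)) / (2 * c)  # noisy gradient
            depth += a * g                        # ascent step
        print(f"estimated optimal depth: {depth:.3f} (true peak at 0.600)")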

    Multidisciplinary Design Optimization for Space Applications

    Multidisciplinary Design Optimization (MDO) has been increasingly studied in aerospace engineering with the main purpose of reducing monetary and schedule costs. The traditional design approach of optimizing each discipline separately and manually iterating to achieve good solutions is replaced by exploiting the interactions between the disciplines and concurrently optimizing every subsystem. The target of the research was the development of a flexible software suite capable of concurrently optimizing the design of a rocket-propelled launch vehicle for multiple objectives. The possibility of combining the advantages of global and local searches has been exploited both in the MDO architecture and in the selected and self-developed optimization methodologies. These have been compared according to computational efficiency and performance criteria. Results have been critically analyzed to identify the most suitable optimization approach for the targeted MDO problem.
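    The thesis' software suite is not reproduced here, but the global-plus-local hybrid it describes can be sketched with off-the-shelf optimizers: a global search locates a promising basin, then a local search refines the incumbent. The objective function, bounds, and dimensionality below are placeholders, not the launch-vehicle model.

        # Minimal sketch of a two-stage global + local optimization.
        import numpy as np
        from scipy.optimize import differential_evolution, minimize

        def mass_proxy(x):
            """Hypothetical multimodal stand-in for a design objective."""
            return np.sum(x**2) + 2.0 * np.sum(np.cos(3.0 * x))

        bounds = [(-2.0, 2.0)] * 4

        # Stage 1: global exploration (polish disabled so the two
        # stages stay explicit).
        glob = differential_evolution(mass_proxy, bounds, polish=False, seed=0)

        # Stage 2: local refinement from the global incumbent.
        loc = minimize(mass_proxy, glob.x, method="Nelder-Mead")
        print(f"global: {glob.fun:.4f}  refined: {loc.fun:.4f}")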

    Unbundling Policy in the United States: Players, Outcomes and Effects

    Building on attempts during the 1980s to establish principles of Open Network Architecture (ONA), unbundling obligations became a cornerstone of the framework for local competition devised by the Telecommunications Act of 1996. Several of the regulations developed by the Federal Communications Commission (FCC), including the impairment test to assess whether a network element had to be unbundled, the TELRIC pricing method, the obligation to re-bundle network elements into service platforms, and the unbundling provisions for broadband networks, were challenged repeatedly in court. In response to multiple defeats of earlier rules, the FCC had to refine its approach and define unbundling obligations more narrowly. Effective as of March 11, 2005, unbundling obligations will essentially be limited to the local copper loop, dedicated interoffice transport on routes connecting small markets, and high-capacity loops in small markets. Carriers presently using unbundled network elements that do not qualify under the new rules will have to transition to alternative solutions within 12-18 months. During this period, the FCC has set higher ceiling prices for these unbundled network elements. The Commission affirmed its 2003 elimination of unbundling obligations in broadband markets.
    Keywords: unbundling; voice; broadband

    Experimental Design for Sensitivity Analysis, Optimization and Validation of Simulation Models

    This chapter gives a survey on the use of statistical designs for what-if analysis in simulation, including sensitivity analysis, optimization, and validation/verification. Sensitivity analysis is divided into two phases. The first phase is a pilot stage, which consists of screening or searching for the important factors among (say) hundreds of potentially important factors. A novel screening technique is presented, namely sequential bifurcation. The second phase uses regression analysis to approximate the input/output transformation that is implied by the simulation model; the resulting regression model is also known as a metamodel or a response surface. Regression analysis gives better results when the simulation experiment is well designed, using either classical statistical designs (such as fractional factorials) or optimal designs (such as those pioneered by Fedorov, Kiefer, and Wolfowitz). To optimize the simulated system, the analysts may apply Response Surface Methodology (RSM); RSM combines regression analysis, statistical designs, and steepest-ascent hill-climbing. To validate a simulation model, again regression analysis and statistical designs may be applied. Several numerical examples and case studies illustrate how statistical techniques can reduce the ad hoc character of simulation; that is, these statistical techniques can make simulation studies give more general results, in less time. Appendix 1 summarizes confidence intervals for expected values, proportions, and quantiles, in terminating and steady-state simulations. Appendix 2 gives details on four variance reduction techniques, namely common pseudorandom numbers, antithetic numbers, control variates or regression sampling, and importance sampling. Appendix 3 describes jackknifing, which may give robust confidence intervals.
    Keywords: least squares; distribution-free; non-parametric; stopping rule; run-length; Von Neumann; median; seed; likelihood ratio
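    The screening phase lends itself to a compact illustration. The sketch below implements a simplified, noise-free variant of sequential bifurcation, assuming a first-order model with known effect signs: groups of factors are simulated together, and only groups whose aggregate effect is non-negligible are split further, so unimportant factors are eliminated in large batches. The toy simulator and its coefficients are invented for the example.

        # Simplified sequential bifurcation sketch (noise-free,
        # first-order model, known effect signs).
        import numpy as np

        K = 100                                   # candidate factors
        true_beta = np.zeros(K)
        true_beta[[7, 42, 88]] = [5.0, 3.0, 8.0]  # only three factors matter

        def simulate(on):
            # Toy simulation response: linear in the switched-on factors.
            x = np.zeros(K)
            x[list(on)] = 1.0
            return float(true_beta @ x)

        def bifurcate(group, threshold=1.0):
            # Split a group only if its aggregate effect is non-negligible.
            effect = simulate(group) - simulate([])
            if effect <= threshold:
                return []                         # whole group screened out
            if len(group) == 1:
                return list(group)                # important factor isolated
            mid = len(group) // 2
            return (bifurcate(group[:mid], threshold)
                    + bifurcate(group[mid:], threshold))

        print("important factors:", bifurcate(list(range(K))))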