
    Modeling restricted enrollment and optimal cost-efficient design in multicenter clinical trials

    Design and forecasting of patient enrollment is among the greatest challenges the clinical research enterprise faces today, as inefficient enrollment can be a major cause of drug development delays. The development of innovative statistical and artificial intelligence technologies for improving the operational efficiency of clinical trials is therefore imperative. This paper describes further developments in an innovative statistical methodology for modeling and forecasting patient enrollment. The underlying technique uses the Poisson-gamma enrollment model developed by Anisimov & Fedorov in previous publications and is extended here to analytic modeling of enrollment at the country/region level. A new analytic technique is developed, based on approximating the enrollment process in a country/region by a Poisson-gamma process with aggregated parameters. Another innovative direction is the development of an analytic technique for modeling enrollment under restrictions (enrollment caps in countries). Some discussion of using historical trials to improve enrollment prediction in new trials is provided. These results are used to solve the problem of optimal, cost-efficient enrollment design: find an allocation of sites/countries that minimizes the global trial cost, given that the probability of reaching the enrollment target on time is no less than some prescribed probability. Different techniques for finding an optimal solution to this high-dimensional optimization problem, for the cases of unrestricted and restricted enrollment and for small and large numbers of countries, are discussed.
    Comment: 22 pages, 3 figures
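
    As a rough illustration of the Poisson-gamma machinery referenced above, the following sketch simulates site-level enrollment with gamma-distributed rates and estimates by Monte Carlo the probability of reaching a target on time. All parameter values (alpha, beta, site count, target, horizon) are illustrative assumptions, not figures from the paper.

```python
# Minimal Monte Carlo sketch of a Poisson-gamma enrollment model.
# Each site i enrolls via a Poisson process with rate lambda_i, where
# lambda_i ~ Gamma(alpha, rate=beta); all numbers below are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def prob_target_reached(n_sites, alpha, beta, target, horizon, n_sims=100_000):
    """Estimate P(total enrollment >= target within `horizon` months)."""
    rates = rng.gamma(shape=alpha, scale=1.0 / beta, size=(n_sims, n_sites))
    counts = rng.poisson(rates * horizon).sum(axis=1)   # trial-level totals
    return (counts >= target).mean()

# Example: 40 sites, mean rate alpha/beta = 0.3 patients/site/month,
# target of 280 patients within 24 months.
p = prob_target_reached(n_sites=40, alpha=1.2, beta=4.0,
                        target=280, horizon=24.0)
print(f"P(enrollment target met on time) ~ {p:.3f}")
```

    For identical sites the sum of site rates is again gamma distributed (a sum of N independent Gamma(alpha, beta) rates is Gamma(N*alpha, beta)), which is the intuition behind approximating a country's enrollment by a Poisson-gamma process with aggregated parameters.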

    Wind Power Probabilistic Prediction and Uncertainty Modeling for Operation of Large-Scale Power Systems

    Over the last decade, large-scale renewable energy generation has been integrated into power systems. Wind power is a widely used and attractive form of renewable generation around the world. However, the high uncertainty of wind power generation leads to unavoidable errors in the wind power prediction process, which makes the optimal operation and control of power systems very challenging. Since wind power prediction error cannot be entirely removed, accurate models of wind power uncertainty can help power system operators mitigate its negative effects on decision making. There are two efficient ways to represent wind power uncertainty: (i) accurate modeling of the wind power prediction error distribution in the form of probability density functions, and (ii) construction of reliable and sharp prediction intervals. Constructing accurate probability density functions and high-quality prediction intervals is difficult because wind power time series are non-stationary. In addition, incorporating probability density functions and prediction intervals into power systems' decision-making problems is challenging.

    In this thesis, the goal is to propose comprehensive frameworks for wind power uncertainty modeling in the form of both probability density functions and prediction intervals, and for the incorporation of each model into power systems' decision-making problems such as look-ahead economic dispatch. To accurately quantify the uncertainty of wind power generation, different approaches are studied, and a comprehensive framework is proposed to construct probability density functions using a mixture of beta kernels. The framework outperforms benchmarks because it validly captures the actual features of the wind power probability density function, such as the main mass, boundaries, high skewness, and fat tails, from the wind power sample moments. Using the proposed framework, a generic convex model is also proposed for chance-constrained look-ahead economic dispatch problems. It allows power system operators to use piecewise linearization techniques to convert the problem into a mixed-integer linear programming problem. Numerical simulations on the IEEE 118-bus test system show that, compared with widely used sequential linear programming approaches, the proposed mixed-integer linear programming model leads to a lower total system cost.

    A framework based on the concept of bandwidth selection for a new and flexible kernel density estimator is proposed for the construction of prediction intervals. Unlike previous related works, the proposed framework uses neither a cost-function-based optimization problem nor point prediction results; rather, a diffusion-based kernel density estimator is used to achieve high-quality prediction intervals for non-stationary wind power time series. The proposed framework also relies on a parallel computing procedure to improve computational efficiency for practical applications in power systems. Simulation results demonstrate the high performance of the proposed framework compared to well-known benchmarks such as bootstrap extreme learning machine, lower upper bound estimation, quantile regression, auto-regressive integrated moving average, and linear programming-based quantile regression.
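
    The following sketch shows a beta-kernel density estimate for a bounded, skewed variable such as normalized wind power, in the spirit of the beta-kernel mixture above. The bandwidth, synthetic sample, and grid are illustrative assumptions; the thesis's full framework, which matches sample moments, is not reproduced here.

```python
# Beta-kernel density estimation on [0, 1]: at each evaluation point u the
# kernel is a Beta(u/b + 1, (1 - u)/b + 1) density, which avoids the boundary
# bias that Gaussian kernels suffer on a bounded support. Data are synthetic.
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(1)
sample = rng.beta(0.9, 2.5, size=2000)    # stand-in for normalized wind power

def beta_kde(grid, sample, b=0.02):
    """Beta-kernel density estimate evaluated on `grid` (bandwidth b)."""
    grid = np.asarray(grid)[:, None]                       # (m, 1)
    return beta.pdf(sample[None, :],                       # broadcast to (m, n)
                    grid / b + 1.0, (1.0 - grid) / b + 1.0).mean(axis=1)

u = np.linspace(0.0, 1.0, 201)
f_hat = beta_kde(u, sample)
print(f"integral of the estimate ~ {f_hat.mean():.3f}")    # should be near 1
```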
    Finally, a new adjustable robust optimization approach is used to incorporate the prediction intervals constructed with the proposed fuzzy and adaptive diffusion-estimator-based framework. To accurately model the correlation and dependence structure of wind farms, especially in high-dimensional cases, C-Vine copula models are used for prediction interval construction. The simulation results show that uncertainty modeling with C-Vine copulas gives system operators a more realistic sense of the overall level of uncertainty in the system; consequently, more conservative results for energy and reserve scheduling are obtained.
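
    Once a predictive density is tabulated, a central prediction interval can be read off its numerically inverted CDF. The sketch below shows this generic construction; it is a stand-in for, and far simpler than, the diffusion-estimator and C-Vine copula machinery described above.

```python
# Generic central (1 - alpha) prediction interval from a tabulated density.
import numpy as np

def central_interval(u, f_hat, alpha=0.1):
    """Quantile pair [alpha/2, 1 - alpha/2] of a density tabulated on a
    uniform grid `u`; `f_hat` need not be normalized in advance."""
    cdf = np.cumsum(f_hat)
    cdf /= cdf[-1]                       # normalize to a proper CDF
    lo = u[np.searchsorted(cdf, alpha / 2.0)]
    hi = u[np.searchsorted(cdf, 1.0 - alpha / 2.0)]
    return lo, hi

# With u, f_hat from the beta-kernel sketch above:
# lo, hi = central_interval(u, f_hat, alpha=0.1)   # 90% prediction interval
```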

    The ROMES method for statistical modeling of reduced-order-model error

    This work presents a technique for statistically modeling the errors introduced by reduced-order models. The method employs Gaussian-process regression to construct a mapping from a small number of computationally inexpensive 'error indicators' to a distribution over the true error. The variance of this distribution can be interpreted as the (epistemic) uncertainty introduced by the reduced-order model. To model normed errors, the method employs existing rigorous error bounds and residual norms as indicators; numerical experiments show that the method leads to a near-optimal expected effectivity, in contrast to typical error bounds. To model errors in general outputs, the method uses dual-weighted residuals, which are amenable to uncertainty control, as indicators. Experiments illustrate that correcting the reduced-order-model output with this surrogate can improve prediction accuracy by an order of magnitude; this contrasts with existing 'multifidelity correction' approaches, which often fail for reduced-order models and suffer from the curse of dimensionality. The proposed error surrogates also lead to a notion of 'probabilistic rigor', i.e., the surrogate bounds the error with specified probability.
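
    A minimal sketch of the ROMES idea follows: regress observed reduced-order-model errors on a cheap indicator with a Gaussian process, then use the predictive mean and variance as a statistical error model. The synthetic indicator/error pairs stand in for residual norms and true errors collected at training parameter samples; they are assumptions, not the paper's data.

```python
# Gaussian-process error surrogate: cheap indicator -> distribution over error.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)

# Training pairs: indicator rho (e.g., log residual norm) vs. observed log error.
rho = np.sort(rng.uniform(-6.0, -1.0, size=40))[:, None]
log_err = rho.ravel() + 0.3 * rng.standard_normal(40)    # correlated, noisy

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(rho, log_err)

# For a new indicator value the GP yields mean and std of the predicted error;
# the std quantifies the epistemic uncertainty introduced by the ROM.
mean, std = gp.predict(np.array([[-3.0]]), return_std=True)
print(f"predicted log error: {mean[0]:.2f} +/- {2.0 * std[0]:.2f}")
```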

    Bayesian Updating, Model Class Selection and Robust Stochastic Predictions of Structural Response

    A fundamental issue when predicting structural response using mathematical models is how to treat both modeling and excitation uncertainty. A general framework for this is presented which uses probability as a multi-valued conditional logic for quantitative plausible reasoning in the presence of uncertainty due to incomplete information. The fundamental probability models that represent the structure's uncertain behavior are specified by the choice of a stochastic system model class: a set of input-output probability models for the structure and a prior probability distribution over this set that quantifies the relative plausibility of each model. A model class can be constructed from a parameterized deterministic structural model by stochastic embedding utilizing Jaynes' Principle of Maximum Information Entropy. Robust predictive analyses use the entire model class, with the probabilistic predictions of each model weighted by its prior probability or, if structural response data are available, by its posterior probability from Bayes' Theorem for the model class. Additional robustness to modeling uncertainty comes from combining the robust predictions of each model class in a set of competing candidates, weighted by the prior or posterior probability of the model class, the latter being computed from Bayes' Theorem. This higher-level application of Bayes' Theorem automatically applies a quantitative Ockham's razor that penalizes the data-fit of more complex model classes that extract more information from the data. Robust predictive analyses involve integrals over high-dimensional spaces that usually must be evaluated numerically. Published applications have used Laplace's method of asymptotic approximation or Markov chain Monte Carlo algorithms.
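
    As a toy illustration of model class selection via Bayes' Theorem, the sketch below computes the evidence p(D | M) of two one-parameter model classes by brute-force quadrature and turns the evidences into posterior model probabilities. The models, priors, noise level, and data are illustrative assumptions; realistic applications use Laplace asymptotics or MCMC because the parameter spaces are high-dimensional.

```python
# Posterior probability of competing model classes via their evidence.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 20)
y = 0.8 * x + 0.1 * rng.standard_normal(x.size)      # data with a linear trend

sigma = 0.1                                  # assumed known noise std
theta = np.linspace(-3.0, 3.0, 2001)         # grid over the single parameter
dtheta = theta[1] - theta[0]
prior = norm.pdf(theta)                      # N(0, 1) prior on the parameter

def evidence(predict):
    """p(D | M) = integral of likelihood(theta) * prior(theta) d theta."""
    resid = y[None, :] - predict(theta[:, None])     # (n_theta, n_data)
    loglik = norm.logpdf(resid, scale=sigma).sum(axis=1)
    return (np.exp(loglik) * prior).sum() * dtheta   # Riemann-sum quadrature

ev_const = evidence(lambda t: t * np.ones_like(x))   # M1: constant mean
ev_linear = evidence(lambda t: t * x[None, :])       # M2: line through origin

# Equal prior plausibility over the two model classes:
print(f"P(M2 | data) ~ {ev_linear / (ev_linear + ev_const):.3f}")
```

    With nested classes of different parameter dimensions, the evidence integral automatically supplies the Ockham penalty mentioned above, since a more complex class spreads its prior over a larger space.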

    Sequential optimization of strip bending process using multiquadric radial basis function surrogate models

    Surrogate models are used within a sequential optimization strategy for forming processes. A sequential improvement (SI) scheme is used to refine the surrogate model in the optimal region. One of the popular surrogate modeling methods for SI is Kriging. However, the global response of Kriging models deteriorates in some cases due to local model refinement within SI. This may be problematic for multimodal optimization problems and for other applications where correct prediction of the global response is needed. In this paper, the deteriorating global behavior of the Kriging surrogate modeling technique is shown for a model of a strip bending process. It is shown that a Radial Basis Function (RBF) surrogate model with Multiquadric (MQ) basis functions performs equally well in terms of optimization efficiency and better in terms of global predictive accuracy. The local point density is taken into account in the model formulation.
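
    Below is a minimal sketch of a multiquadric RBF surrogate of the kind compared against Kriging above, built with SciPy on a synthetic one-dimensional response. The test function and shape parameter are illustrative assumptions, not the strip bending model from the paper.

```python
# Multiquadric RBF surrogate of an "expensive" response on sparse design points.
import numpy as np
from scipy.interpolate import RBFInterpolator

def response(x):
    """Stand-in for an expensive forming simulation (assumed)."""
    return np.sin(3.0 * x) + 0.5 * x**2

x_train = np.linspace(0.0, 2.0, 8)[:, None]     # sparse design of experiments
y_train = response(x_train.ravel())

# epsilon is the MQ shape parameter controlling the flatness of the basis.
surrogate = RBFInterpolator(x_train, y_train, kernel='multiquadric', epsilon=1.0)

x_test = np.linspace(0.0, 2.0, 200)[:, None]
err = np.abs(surrogate(x_test) - response(x_test.ravel()))
print(f"max surrogate error on [0, 2]: {err.max():.4f}")
```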