
    Constrained Optimal Synthesis and Robustness Analysis by Randomized Algorithms

    In this paper, we consider robust control using randomized algorithms. We extend the existing order-statistics distribution theory to the general case in which the population distribution is not assumed to be continuous and the order statistics are subject to certain constraints. In particular, we derive a distributional inequality for the related order statistics. Moreover, we propose two different approaches for finding reliable solutions to robust analysis and optimal synthesis problems under constraints. Furthermore, the minimum computational effort is investigated and bounds on the sample size are derived. Comment: 14 pages, 2 figures
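    As a concrete illustration of the kind of sample-size bound involved, the sketch below computes the standard worst-case bound from the randomized-algorithms literature (not the refined constrained bound derived in the paper): the smallest N for which, with confidence at least 1 - delta, at least one of N i.i.d. samples lands in any set of probability measure at least epsilon.

```python
import math

def sample_size(epsilon: float, delta: float) -> int:
    """Smallest N such that 1 - (1 - epsilon)**N >= 1 - delta: with
    confidence at least 1 - delta, at least one of N i.i.d. samples
    falls in any set of probability measure >= epsilon."""
    return math.ceil(math.log(delta) / math.log(1.0 - epsilon))

# Accuracy level 1% with confidence 1 - 1e-6:
N = sample_size(0.01, 1e-6)
```

    Note that the bound grows only logarithmically in 1/delta, which is why high confidence is cheap while high accuracy (small epsilon) is the dominant cost.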

    Risk Analysis in Robust Control -- Making the Case for Probabilistic Robust Control

    This paper offers a critical view of the "worst-case" approach that is the cornerstone of robust control design. It is our contention that a blind acceptance of worst-case scenarios may lead to designs that are actually more dangerous than designs based on probabilistic techniques with a built-in risk factor. The real issue is one of modeling. If one accepts that no mathematical model of uncertainties is perfect, then a probabilistic approach can lead to more reliable control even if it cannot guarantee stability for all possible cases. Our presentation is based on case analysis. We first establish that worst-case is not necessarily "all-encompassing." In fact, we show that for some uncertain control problems to have a conventional robust control solution it is necessary to make assumptions that leave out some feasible cases. Once we establish that point, we argue that it is not uncommon for the risk of unaccounted cases in worst-case design to be greater than that of the accepted risk in a probabilistic approach. With an example, we quantify the risks and show that worst-case can be significantly more risky. Finally, we join our analysis with existing results on computational complexity and probabilistic robustness to argue that the deterministic worst-case analysis is not necessarily the better tool. Comment: 22 pages, 2 figures

    A Statistical Theory for the Analysis of Uncertain Systems

    This paper addresses the conservativeness and computational complexity of probabilistic robustness analysis. We resolve both issues by defining a new sampling strategy and robustness measure. The new measure is shown to be much less conservative than the existing one. The new sampling strategy enables efficient hierarchical sample-reuse algorithms that significantly reduce the computational complexity and make it independent of the dimension of the uncertainty space. Moreover, we show that there exists a one-to-one correspondence between the new and the existing robustness measures and provide a computationally simple algorithm to derive one from the other. Comment: 32 pages, 15 figures

    Probabilistic Robustness Analysis -- Risks, Complexity and Algorithms

    It is becoming increasingly apparent that probabilistic approaches can overcome the conservatism and computational complexity of the classical worst-case deterministic framework and may lead to designs that are actually safer. In this paper we argue that a comprehensive probabilistic robustness analysis requires a detailed evaluation of the robustness function, and we show that such an evaluation can be performed with essentially any desired accuracy and confidence using algorithms whose complexity is linear in the dimension of the uncertainty space. Moreover, we show that the average memory requirements of such algorithms are absolutely bounded and well within the capabilities of today's computers. In addition to efficiency, our approach permits control over the statistical sampling error and the error due to discretization of the uncertainty radius. For a specific level of tolerance of the discretization error, our techniques provide an efficiency improvement over conventional methods that is inversely proportional to the accuracy level; i.e., our algorithms get better as the demands for accuracy increase. Comment: 28 pages, 5 figures
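    The accuracy/confidence trade-off described above can be sketched with the additive Chernoff (Hoeffding) bound, the standard device for fixing the sample size of such a Monte Carlo evaluation. The toy indicator function and uncertainty distribution below are assumptions for illustration, not taken from the paper.

```python
import math
import random

def chernoff_sample_size(epsilon: float, delta: float) -> int:
    """Additive Chernoff/Hoeffding bound: with this many samples the
    empirical probability is within epsilon of the true probability
    with confidence at least 1 - delta."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * epsilon ** 2))

def estimate_probability(indicator, sampler, epsilon, delta, rng):
    """Monte Carlo estimate of P(indicator(Q) = True) to the prescribed
    accuracy epsilon and confidence 1 - delta."""
    n = chernoff_sample_size(epsilon, delta)
    hits = sum(indicator(sampler(rng)) for _ in range(n))
    return hits / n

# Toy robustness test (assumed example): the closed loop 1/(s + 1 + q)
# is stable iff 1 + q > 0; with q uniform on [-2, 2] the true
# probability of stability is 0.75.
rng = random.Random(0)
p_hat = estimate_probability(lambda q: 1 + q > 0,
                             lambda r: r.uniform(-2.0, 2.0),
                             epsilon=0.02, delta=0.01, rng=rng)
```

    The sample size depends only on epsilon and delta, not on the dimension of the uncertainty q, which is the root of the dimension-independence claims in this line of work.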

    On the Binomial Confidence Interval and Probabilistic Robust Control

    The Clopper-Pearson confidence interval has long been documented as an exact approach in the statistics literature. More recently, this method of interval estimation has been introduced into probabilistic control theory and referred to as non-conservative in the control community. In this note, we clarify that the so-called exact approach is actually conservative. In particular, we derive analytic results demonstrating the extent of its conservatism in the context of probabilistic robustness analysis. This investigation encourages the search for better methods of confidence interval construction for robust control purposes. Comment: 6 pages, 1 figure
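    A minimal sketch of the point at issue, assuming nothing beyond the standard definitions: the Clopper-Pearson interval can be computed by bisection on the exact binomial CDF, and its exact coverage probability then checked to exceed the nominal level, which is precisely the conservatism the note analyzes.

```python
import math

def binom_cdf(k, n, p):
    """Exact binomial CDF P(X <= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k + 1))

def clopper_pearson(k, n, alpha=0.05):
    """Clopper-Pearson interval for k successes in n trials, found by
    bisection on the exact binomial CDF (rather than beta quantiles)."""
    def bisect(pred):
        lo, hi = 0.0, 1.0
        for _ in range(100):
            mid = (lo + hi) / 2.0
            if pred(mid):
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2.0
    # lower endpoint solves P(X >= k | p) = alpha/2; upper solves P(X <= k | p) = alpha/2
    lower = 0.0 if k == 0 else bisect(lambda p: 1 - binom_cdf(k - 1, n, p) < alpha / 2)
    upper = 1.0 if k == n else bisect(lambda p: binom_cdf(k, n, p) > alpha / 2)
    return lower, upper

def exact_coverage(n, p, alpha=0.05):
    """Exact probability that the interval covers the true p: sum the
    binomial pmf over all outcomes k whose interval contains p."""
    total = 0.0
    for k in range(n + 1):
        lo, hi = clopper_pearson(k, n, alpha)
        if lo <= p <= hi:
            total += math.comb(n, k) * p**k * (1 - p)**(n - k)
    return total
```

    For every n and p the exact coverage is at least the nominal 1 - alpha, and because the binomial is discrete it typically exceeds it strictly; that guaranteed-but-strict excess is the conservatism being quantified.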

    Fast Construction of Robustness Degradation Function

    We develop a fast algorithm to construct the robustness degradation function, which describes quantitatively the relationship between the proportion of systems guaranteeing the robustness requirement and the radius of the uncertainty set. This function can be applied to predict whether a controller design based on an inexact mathematical model will perform satisfactorily when implemented on the true system. Comment: 16 pages, 8 figures
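    A minimal Monte Carlo sketch of a robustness degradation function, using an assumed toy second-order system (not from the paper) whose Hurwitz condition is trivial to test:

```python
import random

def is_stable(q1, q2):
    """Assumed toy example: s^2 + (1 + q1)s + (1 + q2) is Hurwitz
    iff both coefficients are positive."""
    return 1 + q1 > 0 and 1 + q2 > 0

def degradation_function(radii, n_samples=2000, seed=1):
    """For each uncertainty radius, Monte Carlo estimate of the
    proportion of systems in the uncertainty ball that satisfy the
    robustness requirement."""
    rng = random.Random(seed)
    proportions = []
    for r in radii:
        hits = 0
        for _ in range(n_samples):
            # draw uniformly from the disk of radius r (rejection sampling)
            while True:
                q1, q2 = rng.uniform(-r, r), rng.uniform(-r, r)
                if q1 * q1 + q2 * q2 <= r * r:
                    break
            hits += is_stable(q1, q2)
        proportions.append(hits / n_samples)
    return proportions
```

    For this toy system the function stays at 1 up to radius 1 (the deterministic robustness margin) and degrades gracefully beyond it, which is exactly the extra information the degradation function adds over a single worst-case margin.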

    Explicit Formula for Constructing Binomial Confidence Interval with Guaranteed Coverage Probability

    In this paper, we derive an explicit formula for constructing a confidence interval for the binomial parameter with guaranteed coverage probability. The formula overcomes the limitation of the normal approximation, which is asymptotic in nature and thus inevitably introduces unknown errors in applications. Moreover, the resulting interval is very tight in comparison with the classic Clopper-Pearson approach in terms of interval width. Based on the rigorous formula, we also obtain approximate formulas with excellent coverage performance. Comment: 20 pages, 27 figures
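    The coverage criterion at stake can be made concrete without reproducing the paper's explicit formula: the sketch below computes the exact coverage probability of any interval rule and shows that the normal-approximation (Wald) interval fails the guarantee, the "unknown error" the abstract refers to.

```python
import math

def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def wald_interval(k, n, z=1.96):
    """Normal-approximation interval: phat +/- z * sqrt(phat(1-phat)/n)."""
    phat = k / n
    hw = z * math.sqrt(phat * (1 - phat) / n)
    return phat - hw, phat + hw

def exact_coverage(interval_rule, n, p):
    """Exact coverage probability of an interval rule at a given p:
    sum the binomial pmf over the outcomes k whose interval contains p."""
    return sum(binom_pmf(k, n, p) for k in range(n + 1)
               if interval_rule(k, n)[0] <= p <= interval_rule(k, n)[1])

# At n = 20, p = 0.2 the Wald interval's exact coverage is about 0.92,
# below the nominal 0.95 -- no guaranteed coverage.
cov = exact_coverage(wald_interval, 20, 0.2)
```

    A rule with guaranteed coverage must keep exact_coverage at or above 1 - alpha for every p in (0, 1), which is the property the paper's formula is constructed to satisfy.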

    Sample Reuse Techniques of Randomized Algorithms for Control under Uncertainty

    Sample reuse techniques have significantly reduced the numerical complexity of probabilistic robustness analysis. Existing results show that, for a nested collection of hyper-spheres, the complexity of the problem of performing N equivalent i.i.d. (independent and identically distributed) experiments for each sphere is absolutely bounded, independent of the number of spheres and depending only on the initial and final radii. In this chapter we elevate sample reuse to a new level of generality and establish that the numerical complexity of performing N equivalent i.i.d. experiments for a chain of sets is absolutely bounded if the sets are nested. Each set need not even be connected, as long as the nesting property holds. Thus, for example, the result permits the integration of deterministic and probabilistic analysis to eliminate regions from an uncertainty set and reduce even further the complexity of some problems. With a more general view, the result enables the analysis of complex decision problems mixing real-valued and discrete-valued random variables. Comment: 13 pages, 1 figure
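    The sample reuse principle for nested sets can be sketched directly: samples drawn uniformly from an outer set are, conditional on falling in a nested inner set, uniformly distributed on that inner set, so they can be reused and only the shortfall redrawn. The nested disks below are an assumed example, not from the chapter.

```python
import random

def reuse_experiments(outer_samples, inner_contains, n_needed, draw_inner, rng):
    """Sample reuse for nested sets: points drawn uniformly from the
    outer set that fall inside the nested inner set are already
    uniform on the inner set, so only the shortfall is drawn fresh."""
    reused = [q for q in outer_samples if inner_contains(q)]
    fresh = [draw_inner(rng) for _ in range(max(0, n_needed - len(reused)))]
    return reused[:n_needed] + fresh

def disk_sampler(radius):
    """Uniform sampling on a disk by rejection from its bounding square."""
    def draw(rng):
        while True:
            x, y = rng.uniform(-radius, radius), rng.uniform(-radius, radius)
            if x * x + y * y <= radius * radius:
                return (x, y)
    return draw

# Nested disks of radius 2 and 1: roughly a quarter of the outer
# samples land in the inner disk and are reused.
rng = random.Random(0)
outer = [disk_sampler(2.0)(rng) for _ in range(1000)]
inner = reuse_experiments(outer, lambda q: q[0]**2 + q[1]**2 <= 1.0,
                          300, disk_sampler(1.0), rng)
```

    Nothing in reuse_experiments requires the inner set to be a sphere or even connected; only the nesting (inner subset of outer) matters, which is the generalization the chapter establishes.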

    Fast Parallel Frequency Sweeping Algorithms for Robust $\mathcal{D}$-Stability Margin

    This paper considers the robust $\mathcal{D}$-stability margin problem under polynomic structured real parametric uncertainty. Building on the work of De Gaston and Safonov (1988), we develop techniques, such as a parallel frequency sweeping strategy and various domain splitting schemes, that significantly reduce the computational complexity and guarantee convergence. Comment: 27 pages, 14 figures

    Parallel Branch and Bound Algorithm for Computing Maximal Structured Singular Value

    In this paper, we develop a parallel branch and bound algorithm that computes the maximal structured singular value $\mu$ without tightly bounding $\mu$ for each frequency, thus significantly reducing the computational complexity. Comment: 10 pages, 4 figures
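    A generic interval branch-and-bound skeleton illustrates the idea of not bounding every subproblem tightly. This is a one-dimensional Lipschitz-bound sketch (an assumption for illustration), not the actual $\mu$ computation: boxes whose cheap upper bound cannot beat the incumbent are pruned without further refinement.

```python
import heapq
import math

def branch_and_bound_max(f, lipschitz, a, b, tol=1e-4):
    """Maximize f on [a, b] by interval branch and bound.  A box's upper
    bound is f(midpoint) + L * width / 2; boxes whose upper bound cannot
    beat the incumbent by more than tol are pruned, so most boxes are
    never bounded tightly."""
    mid = (a + b) / 2.0
    best = f(mid)
    # max-heap via negated upper bounds (heapq is a min-heap)
    heap = [(-(best + lipschitz * (b - a) / 2.0), a, b)]
    while heap:
        neg_ub, lo, hi = heapq.heappop(heap)
        if -neg_ub <= best + tol:
            break  # no remaining box can improve on best by more than tol
        m = (lo + hi) / 2.0
        for left, right in ((lo, m), (m, hi)):
            c = (left + right) / 2.0
            val = f(c)
            best = max(best, val)
            ub = val + lipschitz * (right - left) / 2.0
            if ub > best + tol:
                heapq.heappush(heap, (-ub, left, right))
    return best

# f(x) = x sin(10x) on [0, 2]; |f'| <= 1 + 20 = 21 is a valid Lipschitz
# constant, and the maximum is at the right endpoint: f(2) = 2 sin 20.
peak = branch_and_bound_max(lambda x: x * math.sin(10.0 * x), 21.0, 0.0, 2.0)
```

    The returned value is within tol of the true maximum, yet only boxes near the maximizer are ever refined to fine resolution; per-frequency pruning of this kind is what the paper exploits.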