
    Developing a pressure ulcer risk factor minimum data set and risk assessment framework

    AIM: To agree a draft pressure ulcer risk factor Minimum Data Set to underpin the development of a new evidence-based Risk Assessment Framework. BACKGROUND: A recent systematic review identified the need for a pressure ulcer risk factor Minimum Data Set and the development and validation of an evidence-based pressure ulcer Risk Assessment Framework. This work was undertaken through the Pressure UlceR Programme Of reSEarch (RP-PG-0407-10056), funded by the National Institute for Health Research, which incorporates five phases. This article reports phase two, a consensus study. DESIGN: Consensus study. METHOD: A modified nominal group technique based on the Research and Development/University of California at Los Angeles appropriateness method. This incorporated an expert group, a review of the evidence and the views of a Patient and Public Involvement service user group. Data were collected December 2010-December 2011. FINDINGS: The risk factors and assessment items of the Minimum Data Set (including immobility, pressure ulcer and skin status, perfusion, diabetes, skin moisture, sensory perception and nutrition) were agreed. In addition, a draft Risk Assessment Framework incorporating all Minimum Data Set items was developed, comprising a two-stage assessment process (screening and detailed full assessment) and decision pathways. CONCLUSION: The draft Risk Assessment Framework will undergo further design and pre-testing with clinical nurses to assess and improve its usability. It will then be evaluated in clinical practice to assess its validity and reliability. The Minimum Data Set could be used in future for large-scale risk factor studies informing refinement of the Risk Assessment Framework.
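    The two-stage assessment process described above (screening followed by detailed full assessment) can be sketched as a simple decision pathway. The item names follow the Minimum Data Set listed in the abstract, but the screening rule, triggers and thresholds below are illustrative assumptions, not the published framework.

```python
# Illustrative two-stage pressure ulcer risk assessment pathway.
# Screening items and triggers here are hypothetical assumptions,
# not the validated Risk Assessment Framework.

SCREENING_ITEMS = ["immobility", "pressure_ulcer_or_skin_status", "perfusion"]

FULL_ASSESSMENT_ITEMS = SCREENING_ITEMS + [
    "diabetes", "skin_moisture", "sensory_perception", "nutrition",
]

def screen(patient: dict) -> bool:
    """Stage 1: flag any patient with at least one positive screening item."""
    return any(patient.get(item, False) for item in SCREENING_ITEMS)

def assess(patient: dict) -> str:
    """Stage 2: run the detailed full assessment only for screened-in patients."""
    if not screen(patient):
        return "no further assessment"
    positives = [i for i in FULL_ASSESSMENT_ITEMS if patient.get(i, False)]
    return "care plan required" if len(positives) >= 2 else "monitor"

print(assess({"immobility": True, "nutrition": True}))  # care plan required
print(assess({"diabetes": True}))                       # no further assessment
```

    A real framework would of course grade items rather than treat them as booleans; the point of the sketch is the two-stage screen-then-assess pathway.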

    Robust filtering for bilinear uncertain stochastic discrete-time systems

    Copyright [2002] IEEE. This paper deals with the robust filtering problem for uncertain bilinear stochastic discrete-time systems with estimation error variance constraints. The uncertainties are allowed to be norm-bounded and enter into both the state and measurement matrices. We focus on the design of linear filters such that, for all admissible parameter uncertainties, the error state of the bilinear stochastic system is mean square bounded, and the steady-state variance of the estimation error of each state is not more than the individual prespecified value. It is shown that the design of the robust filters can be carried out by solving a set of algebraic quadratic matrix inequalities. In particular, we establish both the existence conditions and the explicit expression of the desired robust filters. A numerical example is included to show the applicability of the present method.
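    The variance-constraint idea above — requiring each state's steady-state estimation error variance to stay below a prespecified bound — can be illustrated for a nominal (uncertainty-free) linear filter by iterating the error covariance recursion to its fixed point. The matrices, gain and bounds below are hypothetical, and the sketch checks a given filter rather than designing one via matrix inequalities.

```python
import numpy as np

# For error dynamics e_{k+1} = (A - K C) e_k + w_k, iterate the covariance
# recursion P_{k+1} = F P_k F^T + Q to steady state, then compare each
# state's variance against a prespecified bound. All numbers are assumed.

A = np.array([[0.9, 0.1], [0.0, 0.8]])
C = np.array([[1.0, 0.0]])
K = np.array([[0.5], [0.1]])     # filter gain (assumed, not designed here)
Q = 0.01 * np.eye(2)             # combined noise covariance contribution

F = A - K @ C                    # error dynamics matrix (must be stable)
P = np.zeros((2, 2))
for _ in range(500):             # fixed-point iteration of the Lyapunov map
    P = F @ P @ F.T + Q

bounds = np.array([0.05, 0.05])  # prespecified per-state variance bounds
print("steady-state variances:", np.diag(P))
print("bounds satisfied:", bool(np.all(np.diag(P) <= bounds)))
```

    For these illustrative matrices the iteration converges (the spectral radius of F is below one) and both variance bounds are met; the paper's contribution is guaranteeing this for all admissible norm-bounded uncertainties, which the sketch does not attempt.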

    A scenario approach for non-convex control design

    Randomized optimization is an established tool for control design with modulated robustness. While for uncertain convex programs there exist randomized approaches with efficient sampling, this is not the case for non-convex problems. Approaches based on statistical learning theory are applicable to non-convex problems, but they are usually conservative in terms of performance and require high sample complexity to achieve the desired probabilistic guarantees. In this paper, we derive a novel scenario approach for a wide class of random non-convex programs, with a sample complexity similar to that of uncertain convex programs and with probabilistic guarantees that hold not only for the optimal solution of the scenario program, but for all feasible solutions inside a set of a priori chosen complexity. We also address measure-theoretic issues for uncertain convex and non-convex programs. Among the family of non-convex control-design problems that can be addressed via randomization, we apply our scenario approach to randomized Model Predictive Control for chance-constrained nonlinear control-affine systems. Comment: Submitted to IEEE Transactions on Automatic Control.
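    The basic scenario idea — replace a chance constraint with finitely many sampled constraints and inherit a probabilistic guarantee on unseen uncertainty — can be sketched on a toy convex problem. The problem, sample size and validation check below are illustrative; the paper's contribution is extending such guarantees to non-convex programs.

```python
import random

# Toy scenario program: minimize x subject to x >= delta, where
# delta ~ Uniform(0, 1) is uncertain, by enforcing the constraint on
# N sampled scenarios. The scenario solution x* = max(delta_i) then
# violates the constraint on a fresh scenario only with small probability.

random.seed(0)
N = 200                                    # sample complexity (illustrative)
scenarios = [random.uniform(0.0, 1.0) for _ in range(N)]
x_star = max(scenarios)                    # optimal solution of the scenario program

# Empirical check of the probabilistic guarantee on independent samples:
fresh = [random.uniform(0.0, 1.0) for _ in range(10_000)]
violation = sum(d > x_star for d in fresh) / len(fresh)
print(f"x* = {x_star:.4f}, empirical violation probability = {violation:.4f}")
```

    For this one-dimensional problem the expected violation probability is roughly 1/(N+1), which matches the convex scenario theory's dimension-dependent bound.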

    Understanding Deutsch's probability in a deterministic multiverse

    Difficulties over probability have often been considered fatal to the Everett interpretation of quantum mechanics. Here I argue that the Everettian can have everything she needs from `probability' without recourse to indeterminism, ignorance, primitive identity over time or subjective uncertainty: all she needs is a particular *rationality principle*. The decision-theoretic approach recently developed by Deutsch and Wallace claims to provide just such a principle. But, according to Wallace, decision theory is itself applicable only if the correct attitude to a future Everettian measurement outcome is subjective uncertainty. I argue that subjective uncertainty is not to be had, but I offer an alternative interpretation that enables the Everettian to live without uncertainty: we can justify Everettian decision theory on the basis that an Everettian should *care about* all her future branches. The probabilities appearing in the decision-theoretic representation theorem can then be interpreted as the degrees to which the rational agent cares about each future branch. This reinterpretation, however, reduces the intuitive plausibility of one of the Deutsch-Wallace axioms (Measurement Neutrality). Comment: 34 pages (excluding bibliography); no figures. To appear in Studies in the History and Philosophy of Modern Physics, September 2004. Replaced to include changes made during referee and editorial review (abstract extended; arrangement and presentation of material in sections 4.1, 5.3, 5.4 altered significantly; minor changes elsewhere).

    Quantum Probabilities as Behavioral Probabilities

    We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are quantum objects, but shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to evaluating the utilities of the considered prospects, real decision makers also weigh their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans not merely explain qualitatively how human decisions are made, but also predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data. Comment: LaTeX file, 32 pages.
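    The utility-attraction duality described above can be sketched as a decomposition of the behavioral probability of each prospect into a utility factor plus an attraction factor. The logit form of the utility factor and the ±0.25 attraction magnitude below are assumptions for illustration (the latter echoes the non-informative "quarter law" estimate used in this line of work, which this abstract does not spell out).

```python
import math

# Sketch of the utility-attraction decomposition for two prospects:
# behavioral probability p_j = f_j + q_j, where f_j is a utility factor
# (here a logit over utilities -- an assumption) and the attraction
# factors q_j sum to zero. The 0.25 magnitude is an assumed default.

def utility_factor(utilities, beta=1.0):
    weights = [math.exp(beta * u) for u in utilities]
    total = sum(weights)
    return [w / total for w in weights]

def behavioral_probabilities(utilities, more_attractive: int, q=0.25):
    f = utility_factor(utilities)
    p = list(f)
    p[more_attractive] += q        # attraction toward one prospect
    p[1 - more_attractive] -= q    # attraction factors sum to zero
    return [min(max(x, 0.0), 1.0) for x in p]

# Prospect 1 has higher utility, but attraction favours prospect 0,
# so the predicted choice flips relative to utility alone:
p = behavioral_probabilities([1.0, 1.2], more_attractive=0)
print(p)
```

    The sketch shows the qualitative mechanism only: a prospect with lower utility can still be chosen more often when its attraction factor is positive.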

    Anticipating climate change: knowledge use in participatory flood management in the river Meuse

    Given the latest knowledge on climate change, the Dutch government wants to anticipate the increased risk of flooding. For the river Meuse in The Netherlands, the design discharge is estimated to increase from 3800 m³/s to 4600 m³/s. With the existing policy of “Room for the River”, this increase is to be accommodated without raising the dikes. At the same time the floodplains are often claimed for other functions, e.g. new housing or industrial estates. In 2001 the Ministry of Transport, Public Works and Water Management started the study “Integrated assessment of the river Meuse (IVM)” with the objectives of making an inventory of the probable physical effects of a design flood, assuming climate change, on the river Meuse in 2050; investigating possible spatial and technical measures to mitigate these effects; and finally combining various measures to create an integral strategy for flood protection, while at the same time increasing spatial quality. This paper presents the results of research into the decision making process that took place in order to achieve these objectives. Special attention was given to the role of scientific and technical knowledge in the decision making process, e.g. by investigating the effect of the quality of input data on acceptance by stakeholders, and the interactive use of a decision support system to visualise hydraulic effects. Conclusions on successes and pitfalls are drawn from observation and interviews with participants. The study demonstrates how it is possible to integrate the necessary, technically complex knowledge in a political debate with stakeholders on how to deal with flood risk. Furthermore, the experience indicates in what areas improvements could be made.

    Sequential Randomized Algorithms for Convex Optimization in the Presence of Uncertainty

    In this paper, we propose new sequential randomized algorithms for convex optimization problems in the presence of uncertainty. A rigorous analysis of the theoretical properties of the solutions obtained by these algorithms, for full constraint satisfaction and partial constraint satisfaction respectively, is given. The proposed methods enlarge the applicability of existing randomized methods to real-world applications involving a large number of design variables. Since the proposed approach does not provide a priori bounds on the sample complexity, extensive numerical simulations, dealing with an application to hard-disk drive servo design, are provided. These simulations attest to the effectiveness of the proposed solution. Comment: 18 pages. Submitted for publication to IEEE Transactions on Automatic Control.
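    The sequential idea above — solve with a modest sample size, validate the candidate on independent samples, and grow the sample size only when validation fails — can be sketched on a toy problem. The problem, growth schedule and acceptance rule below are illustrative assumptions, not the paper's algorithm.

```python
import random

# Sequential randomized scheme sketch: at each iteration solve a scenario
# program with a growing sample size, then validate the candidate design
# on a fresh independent batch; stop when no validation constraint is
# violated. Toy problem: min x subject to x >= delta, delta ~ Uniform(0,1).

random.seed(1)

def solve_scenario(n):
    """Solve the toy scenario program with n sampled constraints."""
    return max(random.uniform(0.0, 1.0) for _ in range(n))

def sequential_design(n0=20, growth=2, n_valid=500, max_iter=10):
    n = n0
    for k in range(max_iter):
        x = solve_scenario(n)
        # validation step on fresh, independent samples
        violations = sum(random.uniform(0.0, 1.0) > x for _ in range(n_valid))
        if violations == 0:
            return x, k          # candidate accepted
        n *= growth              # increase sample size and retry
    return x, max_iter           # return last candidate if budget exhausted

x, iterations = sequential_design()
print(f"accepted design x = {x:.4f} after {iterations} retries")
```

    The appeal noted in the abstract is visible even here: the scheme spends large sample sizes only when cheap early candidates fail validation, rather than committing to a conservative a priori bound.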