
    Estimating the conditional probabilities of a Bayesian belief network given prior information about interactions between its nodes in a multimodal user authentication system

    An approach, implemented in a developed mathematical model of a multimodal user authentication system, is considered for estimating the conditional probabilities of a Bayesian belief network from non-numeric, imprecise, and incomplete expert information about the user's biometric parameters. Averaged estimates of the probabilistic parameters and their corresponding standard deviations are found, making it possible to estimate the probability that the user working with the system is legitimate.
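The idea above can be sketched in a few lines: several experts give imprecise interval estimates of a conditional probability in the belief network; the midpoints are averaged, a standard deviation summarizes the disagreement, and the averaged value feeds a Bayes update for user legitimacy. All numbers below are hypothetical illustrations, not values from the paper.

```python
from statistics import mean, stdev

# hypothetical expert intervals for P(biometric match | legitimate user)
expert_intervals = [(0.90, 0.98), (0.85, 0.95), (0.92, 0.99)]
midpoints = [(lo + hi) / 2 for lo, hi in expert_intervals]

# averaged estimate of the conditional probability and its spread
p_match_given_legit = mean(midpoints)
uncertainty = stdev(midpoints)

# simple Bayes update with an assumed prior and false-match rate
p_legit = 0.95                 # assumed prior: user is legitimate
p_match_given_impostor = 0.05  # assumed false-match rate

p_match = (p_match_given_legit * p_legit
           + p_match_given_impostor * (1 - p_legit))
posterior = p_match_given_legit * p_legit / p_match

print(round(p_match_given_legit, 3), round(uncertainty, 3), round(posterior, 3))
```

The standard deviation plays the role of the "corresponding standard deviations" mentioned in the abstract: it quantifies how much the expert intervals disagree about the parameter.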

    Comparing stochastic design decision belief models: pointwise versus interval probabilities

    Decision support systems can either directly support a product designer or support an agent operating within a multi-agent system (MAS). Stochastic decision support systems require an underlying belief model that encodes domain knowledge. The underlying belief model has traditionally been a probability distribution function (PDF), which uses pointwise probabilities for all possible outcomes. This can present a challenge during the knowledge elicitation process. To overcome this, it is proposed to test the performance of a credal set belief model. Credal sets (sometimes also referred to as p-boxes) use interval probabilities rather than pointwise probabilities and are therefore easier to elicit from domain experts. The PDF and credal set belief models are compared using a design domain MAS which is able to learn, and thereby refine, the belief model based on its experience. The outcome of the experiment illustrates that there is no significant difference between the PDF-based and credal-set-based belief models in the performance of the MAS.
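The contrast between the two belief models can be illustrated concretely. A pointwise PDF commits the expert to one exact probability per outcome; an interval model only asks for lower and upper bounds, which need only be coherent (some exact assignment within the bounds must sum to 1). The outcomes and numbers below are invented for illustration, not taken from the paper's experiment.

```python
outcomes = ["low", "medium", "high"]

# pointwise belief model: one exact probability per outcome
pdf = {"low": 0.2, "medium": 0.5, "high": 0.3}
assert abs(sum(pdf.values()) - 1.0) < 1e-9

# interval belief model: a (lower, upper) bound per outcome,
# typically easier to elicit from a domain expert
intervals = {"low": (0.1, 0.3), "medium": (0.4, 0.6), "high": (0.2, 0.4)}

# basic coherence check: the bounds admit at least one exact
# probability assignment summing to 1
lo_sum = sum(lo for lo, _ in intervals.values())
hi_sum = sum(hi for _, hi in intervals.values())
coherent = lo_sum <= 1.0 <= hi_sum
print(coherent)
```

Tight bounds on expected values under an interval model generally require optimizing over the whole credal set (a linear program), which is one reason the two representations can behave similarly in practice once a learning agent refines them.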

    Optimal Decision-making in Oil Extraction under Imprecise Information

    The investment problem in oil extraction was examined with probabilities given in interval form. Approaches to investment decision-making with three alternatives and four criteria were investigated, and the alternative to invest in was identified. It is shown that the method based on interval probabilities is more appropriate for carrying out geological and technical measures during investment than decision-making based on classical probabilities.
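A minimal sketch of this style of decision-making: with nonnegative payoffs x_i and each probability p_i known only to lie in [lo_i, hi_i], the expected payoff of an alternative lies in the (loose) interval [sum(lo_i * x_i), sum(hi_i * x_i)], and alternatives can be ranked, for example, by interval midpoint. The alternatives, payoffs, and intervals below are hypothetical; the paper's actual method and criteria may differ.

```python
payoffs = [10.0, 4.0, 0.0]  # good / medium / bad outcome, per unit invested

# hypothetical interval probabilities (lo, hi) over the three outcomes
alternatives = {
    "drill site A": [(0.2, 0.4), (0.3, 0.5), (0.2, 0.4)],
    "drill site B": [(0.1, 0.2), (0.5, 0.7), (0.2, 0.3)],
    "drill site C": [(0.3, 0.5), (0.1, 0.3), (0.3, 0.5)],
}

def expected_interval(intervals, x):
    """Loose bounds on expected payoff for nonnegative payoffs x."""
    lo = sum(l * xi for (l, _), xi in zip(intervals, x))
    hi = sum(h * xi for (_, h), xi in zip(intervals, x))
    return lo, hi

# rank alternatives by the midpoint of their expected-payoff interval
ranked = sorted(alternatives,
                key=lambda a: sum(expected_interval(alternatives[a], payoffs)) / 2,
                reverse=True)
print(ranked)
```

Midpoint ranking is only one of several decision rules for interval-valued expectations; pessimistic (lower-bound) or Hurwicz-style rules weight the bounds differently.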

    A new method for failure modes and effects analysis and its application in a hydrokinetic turbine system

    The traditional failure modes and effects analysis (FMEA) is a conceptual design methodology for dealing with potential failures. FMEA uses the risk priority number (RPN), the product of three ranked factors, to prioritize the risks of different failure modes. The three factors are occurrence, severity, and detection. However, the RPN may not provide a consistent evaluation of risks, for the following reasons: the RPN has a high degree of subjectivity, it is difficult to compare different RPNs, and possible failures may be overlooked in the traditional FMEA method. The objective of this research is to develop a new FMEA methodology that can overcome these drawbacks. The expected cost is adopted to evaluate risks. This not only reduces the subjectivity in RPNs, but also provides a consistent basis for risk analysis. In addition, cause-effect chain structures are used in the new methodology. Such structures are constructed from failure scenarios, which can include all possible end effects (failures) given a root cause. Consequently, the results of the risk analysis are more reliable and accurate. In the new methodology, the occurrence and severity ratings are replaced by expected costs. The detection rating is reflected in failure scenarios by the probabilities of successful or unsuccessful detection of causes or effects. This treatment makes the new methodology more realistic. The new methodology also uses interval variables to accommodate uncertainties due to insufficient data. The new methodology is evaluated and applied to a hydrokinetic turbine system, a horizontal-axis turbine under development at Missouri S&T.
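The two risk measures compared in the abstract can be sketched side by side: the traditional RPN multiplies three ordinal ratings, while the expected-cost approach multiplies an occurrence probability (here an interval, to reflect insufficient data) by a failure cost. All ratings, probabilities, and costs below are hypothetical illustrations, not values from the turbine study.

```python
# hypothetical failure modes with both traditional ratings and
# interval occurrence probabilities plus consequence costs
failure_modes = {
    "blade fracture": {"occurrence": 3, "severity": 9, "detection": 4,
                       "p_interval": (0.001, 0.005), "cost": 50000.0},
    "seal leak":      {"occurrence": 6, "severity": 4, "detection": 2,
                       "p_interval": (0.02, 0.05), "cost": 2000.0},
}

results = {}
for name, fm in failure_modes.items():
    # traditional RPN: product of the three ordinal ratings
    rpn = fm["occurrence"] * fm["severity"] * fm["detection"]
    # interval expected cost: probability interval times consequence cost
    lo, hi = fm["p_interval"]
    results[name] = {"rpn": rpn,
                     "expected_cost": (lo * fm["cost"], hi * fm["cost"])}

for name, r in results.items():
    print(name, r["rpn"], r["expected_cost"])
```

Because expected cost is measured in a common unit (currency) rather than a product of ordinal scales, the comparison between failure modes is more defensible, which is the consistency argument the abstract makes.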

    A multi-step goal programming approach for group decision making with incomplete interval additive reciprocal comparison matrices

    This article presents a goal programming framework to solve group decision making problems where decision-makers’ judgments are provided as incomplete interval additive reciprocal comparison matrices (IARCMs). New properties of multiplicative consistent IARCMs are put forward and used to define consistent incomplete IARCMs. A two-step goal programming method is developed to estimate missing values for an incomplete IARCM. The first step minimizes the inconsistency of the completed IARCMs and keeps the uncertainty ratios of the estimated judgments within an acceptable threshold, and the second step finds the most appropriate estimated missing values among the optimal solutions obtained from the previous step. A weighted geometric mean approach is proposed to aggregate individual IARCMs into a group IARCM by employing the lower bounds of the interval additive reciprocal judgments. A two-step procedure consisting of two goal programming models is established to derive interval weights from the group IARCM. The first model is devised to minimize the absolute difference between the logarithm of the group preference and that of the constructed multiplicative consistent judgment. The second model is developed to generate an interval-valued priority vector by maximizing the uncertainty ratio of the constructed consistent IARCM and incorporating the optimal objective value of the first model as a constraint. Two numerical examples are furnished to demonstrate the validity and applicability of the proposed approach.
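The weighted-geometric-mean aggregation step can be sketched for a single matrix entry. In an interval additive reciprocal matrix, each judgment r_ij = [l_ij, u_ij] lies in [0, 1] with l_ij + u_ji = u_ij + l_ji = 1 and r_ii = [0.5, 0.5]. The sketch below is a rough illustration of geometric-mean aggregation on one entry, not the paper's exact construction (which works from the lower bounds); the judgments and expert weights are hypothetical.

```python
import math

# two decision-makers' interval judgments for the pair (1, 2)
judgments = [(0.6, 0.8), (0.5, 0.7)]  # hypothetical (l, u) values
weights = [0.6, 0.4]                  # expert weights, summing to 1

# weighted geometric mean of the lower and upper bounds
l12 = math.prod(l ** w for (l, _), w in zip(judgments, weights))
u12 = math.prod(u ** w for (_, u), w in zip(judgments, weights))

# additive reciprocity determines the symmetric entry
l21, u21 = 1 - u12, 1 - l12

print(round(l12, 3), round(u12, 3))
```

Since all bounds lie in (0, 1) and the expert weights sum to 1, the aggregated entry stays in (0, 1), and reciprocity can be restored exactly for the symmetric entry.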

    The siren call of probability: dangers associated with using probability for consideration of the future

    Many tools for thinking about the future employ probability. For example, Delphi studies often ask expert participants to assign probabilities to particular future outcomes. Similarly, while some scenario planners reject probability, others insist that assigning probabilities to scenarios is required to make them meaningful. Formal modelling and forecasting methods often also employ probability in one way or another. The paper questions this widespread use of probability as a device for considering the future, firstly showing that objective probability, based on empirically observed frequencies, has some well-known drawbacks when used for this purpose. However, what is less widely acknowledged is that this is also true of the subjective probability used in, for example, Delphi. Subjective probability is less distinct from objective probability than proponents of its use might imply, and it therefore suffers from similar problems. The paper draws on the foundations of probability theory as set out by Kolmogorov, as well as the work of Keynes, Shackle, Aumann, Tversky and Kahneman, and others, to reassert the essential distinction between risk and uncertainty, and to warn about the dangers of inappropriate use of probability for considering the future. The paper sets out some criteria for appropriate use.

    A Pairwise Comparison Matrix Framework for Large-Scale Decision Making

    A Pairwise Comparison Matrix (PCM) is used to compute relative priorities of criteria or alternatives and is an integral component of widely applied decision making tools: the Analytic Hierarchy Process (AHP) and its generalized form, the Analytic Network Process (ANP). However, a PCM suffers from several issues limiting its application to large-scale decision problems, specifically: (1) the curse of dimensionality, that is, a large number of pairwise comparisons must be elicited from a decision maker (DM); and (2) inconsistent and (3) imprecise preferences may be obtained due to the limited cognitive power of DMs. This dissertation proposes a PCM Framework for Large-Scale Decisions to address these limitations in three phases, as follows. The first phase proposes a binary integer program (BIP) to intelligently decompose a PCM into several mutually exclusive subsets using interdependence scores. As a result, the number of pairwise comparisons is reduced and the consistency of the PCM is improved. Since the subsets are disjoint, the most independent pivot element is identified to connect all subsets. This is done to derive the global weights of the elements from the original PCM. The proposed BIP is applied to both AHP and ANP methodologies. However, it is noted that the optimal number of subsets is provided subjectively by the DM and hence is subject to biases and judgement errors. The second phase proposes a trade-off PCM decomposition methodology to decompose a PCM into a number of optimally identified subsets. A BIP is proposed to balance (1) the time savings from reducing pairwise comparisons and the level of PCM inconsistency against (2) the accuracy of the weights. The proposed methodology is applied to the AHP to demonstrate its advantages and is compared to established methodologies. In the third phase, a beta distribution is proposed to generalize a wide variety of imprecise pairwise comparison distributions via a method of moments methodology.
    A Non-Linear Programming model is then developed that calculates PCM element weights which maximize the preferences of the DM and simultaneously minimize the inconsistency. Comparison experiments are conducted using datasets collected from the literature to validate the proposed methodology.
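The basic object this dissertation builds on can be shown in a few lines: deriving priority weights from a PCM. The sketch below uses the standard row geometric mean method commonly used with AHP on a hypothetical 3x3 matrix; it is background for the abstract, not the dissertation's proposed framework.

```python
import math

# hypothetical reciprocal PCM: entry [i][j] says how much more
# important element i is than element j, with pcm[j][i] = 1 / pcm[i][j]
pcm = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]

n = len(pcm)

# row geometric means, normalized to sum to 1, give the priority vector
geo = [math.prod(row) ** (1.0 / n) for row in pcm]
total = sum(geo)
weights = [g / total for g in geo]

print([round(w, 3) for w in weights])
```

For an n-element problem a full PCM needs n(n-1)/2 elicited comparisons, which is the "curse of dimensionality" the first phase attacks by decomposing the matrix into smaller subsets.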