
    The effect of data aggregation interval on voltage results

    For various technical and operational reasons, many power quality surveys are carried out using non-standard data aggregation intervals. The data aggregation interval is the time interval to which rapidly sampled data are reduced by the monitoring instrument for subsequent analysis and reporting. Rationales for using non-standard data aggregation intervals include instrumentation limitations, memory restrictions, a belief that more insight may be obtained from data captured at faster aggregation intervals, and dual use of instrumentation (as is the case for many smart revenue meters). There is much conjecture over the effect the data aggregation interval has on the final outcomes of a power quality survey. IEC 61000-4-30, the international standard describing power quality monitoring methodology, suggests that 10-minute data aggregation intervals are appropriate for routine monitoring of most power quality disturbances, including the magnitude of the supply voltage. This paper investigates the variation observed in supply voltage magnitude monitoring when data is captured at a range of data aggregation intervals.
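
    To make the aggregation step concrete, the sketch below (not taken from the paper) shows how a stream of short-interval RMS voltage readings can be reduced to longer aggregation intervals. Following IEC 61000-4-30, shorter RMS values are combined by taking the square root of the arithmetic mean of their squares; the 1-second readings, 230 V nominal level and noise model are hypothetical.

        import numpy as np

        def aggregate_rms(samples: np.ndarray, sample_period_s: float,
                          interval_s: float) -> np.ndarray:
            """Aggregate a series of RMS voltage readings into longer intervals.

            Per IEC 61000-4-30, shorter-interval RMS values are combined into a
            longer interval as the square root of the arithmetic mean of their
            squares (RMS-of-RMS), not a plain average.
            """
            n = int(interval_s / sample_period_s)   # readings per interval
            usable = (len(samples) // n) * n        # drop a partial tail
            blocks = samples[:usable].reshape(-1, n)
            return np.sqrt(np.mean(blocks ** 2, axis=1))

        # Illustration with synthetic 1 s RMS readings around a 230 V nominal supply.
        rng = np.random.default_rng(0)
        v_1s = 230.0 + rng.normal(0.0, 2.0, size=24 * 3600)  # one day of data

        for interval in (60, 600, 3600):            # 1 min, 10 min, 1 h
            agg = aggregate_rms(v_1s, 1.0, interval)
            print(f"{interval:>5} s: min={agg.min():.2f} V  max={agg.max():.2f} V")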

    IP Scoring Rules: Foundations and Applications


    A two-step fusion process for multi-criteria decision applied to natural hazards in mountains

    Mountain river torrents and snow avalanches cause human and material damage with dramatic consequences. Knowledge about these natural phenomena is often lacking, and expertise is required for decision and risk management purposes, using multi-disciplinary quantitative or qualitative approaches. Expertise is considered as a decision process based on imperfect information coming from more or less reliable and conflicting sources. A methodology is described that mixes the Analytic Hierarchy Process (AHP), a multi-criteria decision-aid method, with information fusion using belief function theory. Fuzzy set and possibility theories allow quantitative and qualitative criteria to be transformed into a common frame of discernment for decision making in the Dempster-Shafer Theory (DST) and Dezert-Smarandache Theory (DSmT) contexts. The main issues are the elicitation of basic belief assignments, the identification and management of conflict, the choice of fusion rules, and the validation of results, as well as the specific need to distinguish between importance, reliability, and uncertainty in the fusion process.
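
    As a minimal illustration of the fusion step, the sketch below implements Dempster's classical rule of combination for two basic belief assignments over a small frame of discernment. It is not the paper's two-step AHP/DST/DSmT methodology; the frame {low, high} and the two expert assignments are hypothetical.

        from itertools import product

        def dempster_combine(m1: dict, m2: dict) -> dict:
            """Combine two basic belief assignments with Dempster's rule.

            Masses are keyed by frozensets of hypotheses; mass falling on
            conflicting (empty) intersections is redistributed by normalisation.
            """
            combined, conflict = {}, 0.0
            for (a, wa), (b, wb) in product(m1.items(), m2.items()):
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + wa * wb
                else:
                    conflict += wa * wb
            if conflict >= 1.0:
                raise ValueError("sources are totally conflicting")
            return {k: v / (1.0 - conflict) for k, v in combined.items()}

        LOW, HIGH = frozenset({"low"}), frozenset({"high"})
        EITHER = LOW | HIGH                              # total ignorance
        expert_a = {LOW: 0.6, EITHER: 0.4}               # hypothetical expert
        expert_b = {HIGH: 0.3, LOW: 0.5, EITHER: 0.2}    # hypothetical expert
        for focal, mass in dempster_combine(expert_a, expert_b).items():
            print(set(focal), round(mass, 3))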

    Generalized basic probability assignments

    Dempster-Shafer theory allows belief functions to be constructed from (precise) basic probability assignments. The present paper extends this idea substantially. By considering sets of basic probability assignments, an appealing constructive approach to general interval probability (general imprecise probabilities) is achieved, which allows for very flexible modelling of uncertain knowledge.
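
    One way to read this construction is as lower and upper envelopes taken over a set of basic probability assignments. The sketch below (an illustration, not the paper's formal development) computes the belief and plausibility of an event for each BPA in a hypothetical set and reports the resulting probability interval.

        def bel(m: dict, event: frozenset) -> float:
            """Belief: total mass on focal sets wholly contained in the event."""
            return sum(w for a, w in m.items() if a <= event)

        def pl(m: dict, event: frozenset) -> float:
            """Plausibility: total mass on focal sets intersecting the event."""
            return sum(w for a, w in m.items() if a & event)

        # A hypothetical set of candidate BPAs over the frame {x, y, z}.
        X, Y, Z = frozenset("x"), frozenset("y"), frozenset("z")
        bpas = [
            {X: 0.5, X | Y: 0.3, X | Y | Z: 0.2},
            {X: 0.4, Y: 0.2, X | Y | Z: 0.4},
        ]
        event = X | Y
        lower = min(bel(m, event) for m in bpas)   # lower envelope over the set
        upper = max(pl(m, event) for m in bpas)    # upper envelope over the set
        print(f"P(x or y) in [{lower:.2f}, {upper:.2f}]")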

    Evaluation of Corporate Sustainability

    As a consequence of the increasing demand for sustainable development in business organizations, the evaluation of corporate sustainability has become a topic of intense focus for academic researchers and business practitioners. Several techniques in the context of multiple criteria decision analysis (MCDA) have been suggested to facilitate the evaluation and analysis of sustainability performance. However, due to the complexity of the evaluation, such as the combination of quantitative and qualitative measures, interrelationships among sustainability criteria, the assessor's hesitation in scoring, or incomplete information, simple techniques may not generate reliable results that reflect the overall sustainability performance of a company. This paper proposes a series of mathematical formulations based upon the evidential reasoning (ER) approach, which can aggregate qualitative judgments with quantitative measurements under various types of complex and uncertain situations. The evaluation of corporate sustainability through the ER model is demonstrated using actual data from three sugar manufacturing companies in Thailand. The proposed model helps managers analyse performance and identify improvement plans and goals, and it simplifies decision making related to sustainable development initiatives. The model can be generalized to a wider area of performance assessment, as well as to any case of multiple criteria analysis.
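
    For orientation, the sketch below implements the recursive aggregation step in the spirit of Yang and Xu's ER algorithm, on which the ER approach builds. It handles incomplete assessments (belief degrees summing to less than one) and criterion weights; the belief matrix, grades and weights are hypothetical, not the Thai sugar company data.

        import numpy as np

        def er_aggregate(beliefs: np.ndarray, weights: np.ndarray):
            """Aggregate criterion-level belief degrees with the recursive ER rule.

            beliefs[i, n] is the degree of belief that criterion i is assessed
            to grade n; rows may sum to less than 1 (incomplete assessments).
            """
            w = weights / weights.sum()             # normalise relative weights
            m = w[0] * beliefs[0]                   # mass assigned to each grade
            mbar = 1.0 - w[0]                       # unassigned: other criteria
            mtil = w[0] * (1.0 - beliefs[0].sum())  # unassigned: incompleteness
            for i in range(1, len(w)):
                mi = w[i] * beliefs[i]
                mbari = 1.0 - w[i]
                mtili = w[i] * (1.0 - beliefs[i].sum())
                mh, mhi = mbar + mtil, mbari + mtili
                # Normalising factor: discard mass on conflicting grade pairs.
                k = 1.0 / (1.0 - (np.outer(m, mi).sum() - (m * mi).sum()))
                m = k * (m * mi + mh * mi + m * mhi)
                mtil = k * (mtil * mtili + mbar * mtili + mtil * mbari)
                mbar = k * (mbar * mbari)
            beta = m / (1.0 - mbar)                 # final belief degrees
            beta_h = mtil / (1.0 - mbar)            # residual ignorance
            return beta, beta_h

        # Hypothetical assessment of one company on three criteria against
        # three grades (poor, average, good).
        B = np.array([[0.1, 0.5, 0.4],
                      [0.0, 0.3, 0.6],    # 10% of this assessment is missing
                      [0.2, 0.4, 0.4]])
        beta, beta_h = er_aggregate(B, np.array([0.5, 0.3, 0.2]))
        print("grades:", beta.round(3), "unassigned:", round(beta_h, 3))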

    Loss Distribution Approach for Operational Risk Capital Modelling under Basel II: Combining Different Data Sources for Risk Estimation

    The management of operational risk in the banking industry has undergone significant changes over the last decade due to substantial changes in the operational risk environment. Globalization, deregulation, the use of complex financial products, and changes in information technology have resulted in exposure to new risks very different from market and credit risks. In response, the Basel Committee on Banking Supervision developed a regulatory framework, referred to as Basel II, that introduced an operational risk category and corresponding capital requirements. Over the past five years, major banks in most parts of the world have received accreditation under the Basel II Advanced Measurement Approach (AMA) by adopting the loss distribution approach (LDA), despite a number of unresolved methodological challenges in its implementation. Different approaches and methods are still under hot debate. In this paper, we review methods proposed in the literature for combining different data sources (internal data, external data and scenario analysis), which is one of the regulatory requirements for the AMA.
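
    As a minimal sketch of the LDA itself (not of the paper's data-combination methods), the code below simulates an annual loss distribution from a Poisson frequency and lognormal severity model and reads off the 99.9% quantile commonly used for AMA capital. All parameters are hypothetical; in practice they would be fitted by combining internal data, external data and scenario analysis.

        import numpy as np

        rng = np.random.default_rng(42)

        lam = 25.0             # Poisson mean of annual loss frequency (hypothetical)
        mu, sigma = 10.0, 2.0  # lognormal severity parameters (hypothetical)

        n_years = 50_000
        annual_loss = np.array([
            rng.lognormal(mu, sigma, size=rng.poisson(lam)).sum()
            for _ in range(n_years)
        ])

        # Basel II AMA capital is commonly read off as the 99.9% quantile
        # of the simulated annual loss distribution.
        print(f"VaR(99.9%) ≈ {np.quantile(annual_loss, 0.999):,.0f}")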

    Uncertainty Analysis of the Adequacy Assessment Model of a Distributed Generation System

    Due to the inherent aleatory uncertainties in renewable generators, reliability/adequacy assessments of distributed generation (DG) systems have focused on the probabilistic modeling of random behaviors, given sufficiently informative data. However, another type of uncertainty (epistemic uncertainty) must be accounted for in the modeling, due to incomplete knowledge of the phenomena and imprecise evaluation of the related characteristic parameters. In circumstances of scarce data, this type of uncertainty calls for alternative methods of representation, propagation, analysis and interpretation. In this study, we make a first attempt to identify, model, and jointly propagate aleatory and epistemic uncertainties in the context of DG system modeling for adequacy assessment. Probability and possibility distributions are used to model the aleatory and epistemic uncertainties, respectively. Evidence theory is used to incorporate the two under a single framework. Based on the plausibility and belief functions of evidence theory, a hybrid propagation approach is introduced. A demonstration is given on a DG system adapted from the IEEE 34-node distribution test feeder. Compared to the pure probabilistic approach, the hybrid propagation is shown to explicitly express the imprecision in the knowledge of the DG parameters in the final adequacy values, and to effectively capture the growth of uncertainties with higher DG penetration levels.
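
    A minimal sketch of such a hybrid propagation follows, assuming a toy adequacy margin model (capacity minus load) rather than the paper's IEEE 34-node case: the aleatory load is sampled by Monte Carlo, the epistemic DG capacity is represented by a triangular possibility distribution processed through alpha-cuts, and per-sample belief and plausibility of adequacy are averaged over the samples. All numbers and the model are hypothetical.

        import numpy as np

        rng = np.random.default_rng(1)

        def alpha_cut(a, b, c, alpha):
            """Alpha-cut interval of a triangular possibility distribution (a, b, c)."""
            return a + alpha * (b - a), c - alpha * (c - b)

        # Aleatory: hourly load, modelled probabilistically (hypothetical numbers).
        # Epistemic: available DG capacity, triangular possibility distribution (MW).
        alphas = (np.arange(10) + 0.5) / 10      # midpoint alpha levels
        n_mc = 5_000
        bel_sum = pl_sum = 0.0

        for _ in range(n_mc):
            load = rng.normal(8.0, 1.5)          # one aleatory sample (MW)
            lo, hi = zip(*(alpha_cut(6.0, 9.0, 12.0, a) for a in alphas))
            lo, hi = np.array(lo), np.array(hi)
            # The margin model (capacity - load) is monotone in capacity, so each
            # alpha-cut maps to the output interval [lo - load, hi - load].
            bel_sum += np.mean(lo - load >= 0.0)  # cut entirely adequate
            pl_sum += np.mean(hi - load >= 0.0)   # cut partially adequate

        print(f"Belief(adequate)       ≈ {bel_sum / n_mc:.3f}")
        print(f"Plausibility(adequate) ≈ {pl_sum / n_mc:.3f}")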