
    Statistical Process Control And Analytical Hierarchy Process Methods For Reducing Earth Resistance

    Protection systems at generators, substations, transmission lines, housing, and other installations must use an earthing system that meets the required standards, especially to withstand lightning disturbances and leakage currents. The maximum permissible earth resistance differs across the parts of the system, from transmission and substations to residential electricity utilization: for example, a maximum of 5 Ω for housing, a maximum of 10 Ω for a small generator system, a maximum of 20 Ω for a large generator, and so on. In areas where the soil resistivity is high, the earth resistance must be reduced until it meets the Indonesian national standard. There are many ways to reduce earth resistance, including lowering the soil resistivity, adding electrodes to the soil, and changing the electrode type and diameter. The aim of this research was to determine the effect of electrode planting depth and of soil type, which is influenced by the water content of the soil, on the earthing resistance, and to apply two simple statistical tools: Statistical Process Control (SPC) and the Analytical Hierarchy Process (AHP). Measurements used a three-point system, with one point for the test electrode and two points for the auxiliary electrodes; earth resistance was measured with a digital earth resistance tester (model 4105A), designed to International Electrotechnical Commission (IEC) standards. The reduction of earth resistance was then analyzed with the SPC and AHP methods.
    The research design answers the stated problems and objectives: to determine the effect of soil type on earth resistance and to analyze how the earth resistance can be reduced to the standard value using the SPC and AHP methods. Potential causes are found with an AHP calculation, which yields a ranked sequence of problems to solve. If the consistency ratio exceeds 10%, the judgment data must be corrected; if it is less than or equal to 10%, the calculation is accepted. The AHP result is then verified by testing the earth resistance.
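The 10% consistency-ratio rule mentioned in this abstract can be sketched in a few lines. This is a minimal illustration of Saaty's consistency ratio for AHP, assuming a hypothetical 3×3 pairwise-comparison matrix and the common row-average approximation of the principal eigenvalue; neither the matrix values nor the helper name comes from the paper.

```python
def consistency_ratio(matrix):
    """Saaty consistency ratio CR = CI / RI for a pairwise-comparison matrix."""
    n = len(matrix)
    # Approximate the priority weights by normalizing columns and averaging rows.
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    norm = [[matrix[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    weights = [sum(norm[i]) / n for i in range(n)]
    # Estimate the principal eigenvalue lambda_max from the weighted row sums.
    weighted_sums = [sum(matrix[i][j] * weights[j] for j in range(n))
                     for i in range(n)]
    lam_max = sum(ws / w for ws, w in zip(weighted_sums, weights)) / n
    ci = (lam_max - n) / (n - 1)                     # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index
    return ci / ri if ri else 0.0

# A perfectly consistent matrix (transitive ratios) gives CR = 0,
# which passes the "less than or equal to 10%" acceptance rule.
m = [[1, 2, 4], [0.5, 1, 2], [0.25, 0.5, 1]]
cr = consistency_ratio(m)
accepted = cr <= 0.10
```

An inconsistent judgment matrix would push CR above 0.10, signalling that the pairwise comparisons should be revised before the ranking is trusted.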

    Indicators and methods for assessing the quality of logistic activity processes

    Purpose: This article aims to identify and evaluate the quality and safety indicators of processes in the logistics system and to solve the problems of product control in the goods' distribution process. Design/Methodology/Approach: To assess the risks and the quality of control methods in goods' distribution processes, the grain-supply process was studied; the risk assessment was tested on it with a fault tree, using a qualitative approach with deductive logic, which made it possible to identify events at the lower levels of the system. Statistical tools were used to evaluate the results when comparing various methods of monitoring product characteristics in the distribution process; comparative tests are required to determine how products should be measured in the goods' distribution logistics system. The study uses the methods of formalization, analysis, measurement, experiment, and comparison. Findings: The considered risk assessment method and the given example allow us to recommend it for product distribution processes of various purposes. A technique is proposed for comparing various control methods based on statistical tools, which can be recommended for various goods' distribution operations. Practical implications: The results of the study can be applied in practice to improve the quality of goods' distribution processes and to reduce risks in various supply chains. Originality/value: The main contribution of this study is to shift the emphasis in assessing goods' distribution processes toward a risk-based approach and the use of various statistical tools in logistics activities.
    Peer-reviewed.
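The deductive fault-tree logic mentioned in this abstract can be illustrated with a toy tree. The OR/AND gates below are the standard fault-tree gates; the basic events (sensor failure, skipped check, damaged packaging) are hypothetical examples for a grain-supply chain, not events taken from the article.

```python
def or_gate(*events):
    # An intermediate or top event occurs if any of its inputs occurs.
    return any(events)

def and_gate(*events):
    # An event behind an AND gate occurs only if all of its inputs occur.
    return all(events)

# Hypothetical basic events at the lower levels of the system.
moisture_sensor_fails = True
manual_check_skipped = False
packaging_damaged = False

# Deduce the top event "non-conforming grain shipped" from the lower levels:
# spoilage goes undetected only if the sensor fails AND the manual check
# is skipped; the top event occurs if that happens OR packaging is damaged.
undetected_spoilage = and_gate(moisture_sensor_fails, manual_check_skipped)
top_event = or_gate(undetected_spoilage, packaging_damaged)
```

Walking the tree downward in this way identifies which combinations of lower-level events ("minimal cut sets") are sufficient to trigger the top-level failure.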

    Statistical Reconstruction of Qutrits

    We discuss a procedure of measurement followed by reproduction of the quantum state of a three-level optical system: a frequency- and spatially degenerate two-photon field. A method of statistical estimation of the quantum state is developed, based on solving the likelihood equation and analyzing the statistical properties of the obtained estimates. Using the root approach to estimating quantum states, the initial two-photon state vector is reproduced from the measured fourth moments of the field. The developed approach to quantum state reconstruction is based on the amplitudes of mutually complementary processes. The classical algorithm of statistical estimation based on the Fisher information matrix is generalized to quantum systems obeying Bohr's complementarity principle. It has been experimentally demonstrated that biphoton-qutrit states can be reconstructed with a fidelity of 0.995-0.999 and higher.
    Comment: Submitted to Physical Review
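The fidelity figure quoted above is the standard overlap F = |⟨ψ|φ⟩|² between the true and reconstructed state vectors. A minimal sketch for qutrit (three-component) state vectors follows; the example amplitudes are illustrative and not taken from the experiment.

```python
import math

def fidelity(psi, phi):
    """Fidelity F = |<psi|phi>|^2 between two normalized pure-state vectors."""
    inner = sum(a.conjugate() * b for a, b in zip(psi, phi))
    return abs(inner) ** 2

# A uniform qutrit state vector (normalized: three equal amplitudes).
psi = [1 / math.sqrt(3)] * 3

# Identical true and reconstructed states give F = 1; a reconstruction
# quality of 0.995-0.999 means the overlap is nearly perfect.
f_perfect = fidelity(psi, psi)
```

Orthogonal states give F = 0, so the fidelity ranges from 0 (no overlap) to 1 (exact reconstruction).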

    Active Sampling-based Binary Verification of Dynamical Systems

    Nonlinear, adaptive, or otherwise complex control techniques are increasingly relied upon to ensure the safety of systems operating in uncertain environments. However, the nonlinearity of the resulting closed-loop system complicates verification that the system does in fact satisfy those requirements at all possible operating conditions. While analytical proof-based techniques and finite abstractions can be used to provably verify the closed-loop system's response at different operating conditions, they often produce conservative approximations due to restrictive assumptions and are difficult to construct in many applications. In contrast, popular statistical verification techniques relax the restrictions and instead rely upon simulations to construct statistical or probabilistic guarantees. This work presents a data-driven statistical verification procedure that instead constructs statistical learning models from simulated training data to separate the set of possible perturbations into "safe" and "unsafe" subsets. Binary evaluations of closed-loop system requirement satisfaction at various realizations of the uncertainties are obtained through temporal logic robustness metrics, which are then used to construct predictive models of requirement satisfaction over the full set of possible uncertainties. As the accuracy of these predictive statistical models is inherently coupled to the quality of the training data, an active learning algorithm selects additional sample points in order to maximize the expected change in the data-driven model and thus, indirectly, minimize the prediction error. Various case studies demonstrate the closed-loop verification procedure and highlight improvements in prediction error over both existing analytical and statistical verification techniques.
    Comment: 23 pages
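The core idea above, labelling sampled perturbations as safe or unsafe from a robustness metric and then predicting labels at unsampled points, can be sketched with a toy system. The quadratic robustness function and the 1-nearest-neighbour predictor are illustrative stand-ins for the paper's temporal logic metrics and statistical learning models, not its actual method.

```python
def robustness(theta):
    # Toy stand-in for a temporal-logic robustness metric: positive (safe)
    # while the perturbation theta keeps the response inside its bound.
    return 1.0 - theta ** 2

# Binary training labels at a handful of simulated perturbations:
# safe iff the robustness metric is non-negative.
training = [(t, robustness(t) >= 0.0) for t in [-1.5, -0.5, 0.0, 0.5, 1.5]]

def predict_safe(theta):
    # Predict the label of a new perturbation from the closest
    # simulated training sample (1-nearest-neighbour classification).
    nearest = min(training, key=lambda sample: abs(sample[0] - theta))
    return nearest[1]
```

A perturbation near the safe interior is predicted safe, one near the sampled failures unsafe; the prediction error shrinks as more (well-chosen) simulations are added to `training`.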

    Closed-Loop Statistical Verification of Stochastic Nonlinear Systems Subject to Parametric Uncertainties

    This paper proposes a statistical verification framework using Gaussian processes (GPs) for simulation-based verification of stochastic nonlinear systems with parametric uncertainties. Given a small number of stochastic simulations, the proposed framework constructs a GP regression model and predicts the system's performance over the entire set of possible uncertainties. Included in the framework is a new metric to estimate the confidence in those predictions based on the variance of the GP's cumulative distribution function. This variance-based metric forms the basis of active sampling algorithms that aim to minimize prediction error through careful selection of simulations. In three case studies, the new active sampling algorithms demonstrate up to a 35% improvement in prediction error over other approaches and are able to correctly identify regions with low prediction confidence through the variance metric.
    Comment: 8 pages, submitted to ACC 201
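The active-sampling loop described above, querying the next simulation where prediction confidence is lowest, can be sketched with a cheap stand-in for the GP posterior variance: distance to the nearest already-simulated point, which is largest exactly where the model has seen no data. The candidate set and sampled points below are illustrative, not from the paper.

```python
def next_sample(candidates, sampled):
    # Pick the candidate farthest from every previously simulated point,
    # i.e. where the (proxy) predictive variance is highest.
    return max(candidates, key=lambda c: min(abs(c - s) for s in sampled))

# Two simulations have been run; four candidate parameter values remain.
sampled = [0.0, 1.0]
candidates = [0.1, 0.5, 0.9, 2.0]
chosen = next_sample(candidates, sampled)  # the least-covered candidate
```

A real GP-based loop would rank candidates by the posterior variance from the fitted regression model instead of raw distance, then refit the GP after each new simulation.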