    An integrated aerospace requirement setting and risk analysis tool for life cycle cost reduction and system design improvement

    In the early conceptual stage of the service-oriented model, decisions regarding the design of a new technical product are largely influenced by Service Requirements. Those decisions therefore have to merge both technical and business aspects to obtain the desired product reliability and a reduced Whole Life Cost (WLC). It is therefore critical at that phase to define the risk of potential non-compliance with Service Requirements in order to ensure the right design choices, as these decisions have a large impact on the overall product and service development. This paper presents the outcome of a research project investigating the different approaches used by companies to analyse Service Requirements in order to achieve a reduced Life Cycle Cost (LCC). Analysis based on the Weibull distribution and the Monte Carlo principle is proposed here; according to the literature review conducted, these are the most widely used techniques in product reliability studies. Building on those techniques, a methodology and its software tool for evaluating the risk of failing to deliver a new product against Service Requirements are presented. This work is part of an ongoing research project which, apart from analysing the gap between current Service Requirements achievements and the design targets for a new aircraft engine, also facilitates an optimisation of those requirements at the minimum risk of non-conformity.
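    To make the proposed analysis concrete, the following is a minimal sketch (in plain Python, not the paper's tool) of how Monte Carlo sampling from a Weibull time-to-failure model can estimate the risk of non-compliance with a service-life requirement; the shape, scale and target values are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical Weibull parameters for a component's time-to-failure
# (shape k, scale lambda in flight hours); values are illustrative only.
shape_k, scale_lam = 1.8, 12_000.0

# Illustrative Service Requirement: the component should survive at least
# 8,000 flight hours between unscheduled removals.
required_life = 8_000.0

n_samples = 100_000
lifetimes = scale_lam * rng.weibull(shape_k, size=n_samples)

# Risk of non-compliance = estimated probability of failing before the target.
risk = np.mean(lifetimes < required_life)
print(f"Estimated risk of missing the service requirement: {risk:.3f}")
```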

    SRAT-Distribution Voltage Sags and Reliability Assessment Tool

    Interruptions to supply and sags of distribution system voltage are the main causes of customer complaints. There is a need for analysis of supply reliability and voltage sags that relates system performance to network structure and equipment design parameters. Such analysis can also predict voltage dips and relate traditional reliability and momentary-outage measures to the properties of protection systems and to network impedances. Existing reliability analysis software often requires substantial training, lacks automated facilities, and suffers from limited data availability, so it requires time-consuming manual intervention for the study of large networks. A user-friendly sag and reliability assessment tool (SRAT) has been developed based on existing impedance data, protection characteristics, and a model of failure probability. The new features included in SRAT are (a) efficient reliability and sag assessments for a radial network with limited loops, (b) reliability evaluation associated with realistic protection and restoration schemes, (c) inclusion of momentary outages in the same model as permanent outage evaluation, (d) evaluation of sag transfer through the meshed subtransmission network, and (e) a simplified probability distribution model determined from available fault records. Examples based on an Australian distribution network are used to illustrate the application of the tool.
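    As a rough illustration of the impedance-based sag assessment that SRAT automates, the sketch below estimates the retained voltage at a bus for faults along a radial feeder using a simple voltage-divider approximation; the impedance values and the 0.9 pu sag threshold are illustrative assumptions, not SRAT data.

```python
import numpy as np

# Illustrative source and feeder impedances (ohms); not taken from SRAT's data.
z_source = 0.5 + 1.2j          # upstream (subtransmission) impedance
feeder_z_per_km = 0.3 + 0.35j  # radial feeder impedance per km
fault_positions_km = np.array([1, 3, 5, 10, 15])

# Retained voltage at the substation bus during a bolted fault at distance d:
# a simple voltage-divider approximation commonly used in sag studies.
z_fault_path = feeder_z_per_km * fault_positions_km
retained = np.abs(z_fault_path / (z_source + z_fault_path))

for d, v in zip(fault_positions_km, retained):
    flag = "sag (<0.9 pu)" if v < 0.9 else "ok"
    print(f"fault at {d:>2} km: retained voltage {v:.2f} pu -> {flag}")
```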

    Open TURNS: An industrial software for uncertainty quantification in simulation

    The need to assess robust performance for complex systems and to answer tighter regulatory processes (security, safety, environmental control, health impacts, etc.) has led to the emergence of a new industrial simulation challenge: taking uncertainties into account when dealing with complex numerical simulation frameworks. A generic methodology has therefore emerged from the joint effort of several industrial companies and academic institutions. EDF R&D, Airbus Group and Phimeca Engineering started a collaboration at the beginning of 2005, joined by IMACS in 2014, to develop an open-source software platform dedicated to uncertainty propagation by probabilistic methods, named OpenTURNS for Open source Treatment of Uncertainty, Risk 'N Statistics. OpenTURNS addresses the specific industrial challenges attached to uncertainties, namely transparency, genericity, modularity and multi-accessibility. This paper focuses on OpenTURNS and presents its main features: OpenTURNS is open-source software under the LGPL license, presented as a C++ library with a Python TUI, and it runs under Linux and Windows environments. All the methodological tools are described in the different sections of this paper: uncertainty quantification, uncertainty propagation, sensitivity analysis and metamodeling. A section also explains the generic wrapper mechanism used to link OpenTURNS to any external code. The paper illustrates the methodological tools as much as possible on an educational example that simulates the height of a river and compares it to the height of a dyke that protects industrial facilities. Finally, it gives an overview of the main developments planned for the next few years.
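    The sketch below mirrors the spirit of the river/dyke educational example using plain NumPy rather than the OpenTURNS API itself; the input distributions, the simplified hydraulic formula and the crest level are assumptions for illustration only, not the benchmark's exact definition.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Illustrative input distributions for a river/dyke overflow study.
Q  = rng.gumbel(1013.0, 558.0, n)          # river flow rate (m^3/s)
Ks = rng.normal(30.0, 7.5, n)              # Strickler friction coefficient
Zv = rng.triangular(49.0, 50.0, 51.0, n)   # downstream riverbed level (m)
Zm = rng.triangular(54.0, 55.0, 56.0, n)   # upstream riverbed level (m)

Q = np.clip(Q, 1.0, None)                  # keep physically meaningful values
Ks = np.clip(Ks, 1.0, None)

L, B = 5000.0, 300.0                       # reach length and width (m)
H = (Q / (Ks * B * np.sqrt((Zm - Zv) / L))) ** 0.6   # water height (m)

crest_level = 54.0                         # illustrative riverbank + dyke crest (m)
overflow = (Zv + H) > crest_level
print(f"Estimated overflow probability: {overflow.mean():.4f}")
```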

    Quality measures for ETL processes: from goals to implementation

    Extraction-transformation-loading (ETL) processes play an increasingly important role in supporting modern business operations. These business processes are centred around artifacts with high variability and diverse lifecycles, which correspond to key business entities. The apparent complexity of these activities has been examined through the prism of business process management, mainly focusing on functional requirements and performance optimization. However, the quality dimension has not yet been thoroughly investigated, and there is a need for a more human-centric approach to bring these processes closer to business users' requirements. In this paper, we take a first step in this direction by defining a sound model of ETL process quality characteristics and quantitative measures for each characteristic, based on the existing literature. Our model shows dependencies among quality characteristics and can provide the basis for subsequent analysis using goal modeling techniques. We showcase the use of goal modeling for ETL process design through a use case, where we employ a goal model that includes quantitative components (i.e., indicators) for the evaluation and analysis of alternative design decisions.
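    A minimal sketch of the idea of attaching quantitative indicators to quality characteristics is shown below; the characteristic names, targets and measurements are hypothetical and only illustrate how an alternative ETL design could be evaluated against such a model.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    name: str
    target: float
    higher_is_better: bool = True

    def satisfied(self, value: float) -> bool:
        return value >= self.target if self.higher_is_better else value <= self.target

@dataclass
class Characteristic:
    name: str
    indicators: list = field(default_factory=list)
    depends_on: list = field(default_factory=list)   # names of related characteristics

# Hypothetical quality model with quantitative indicators per characteristic.
quality_model = [
    Characteristic("Freshness",   [Indicator("max_load_latency_min", 30, higher_is_better=False)]),
    Characteristic("Reliability", [Indicator("successful_runs_ratio", 0.99)]),
    Characteristic("Performance", [Indicator("rows_per_second", 50_000)], depends_on=["Reliability"]),
]

# Measurements for one candidate ETL design alternative (illustrative numbers).
measured = {"max_load_latency_min": 22, "successful_runs_ratio": 0.995, "rows_per_second": 41_000}

for ch in quality_model:
    ok = all(ind.satisfied(measured[ind.name]) for ind in ch.indicators)
    print(f"{ch.name:<12} {'meets targets' if ok else 'misses targets'}")
```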

    Application of six sigma methodology to reduce defects of a grinding process

    Six Sigma is a data-driven leadership approach using specific tools and methodologies that lead to fact-based decision making. This paper deals with the application of the Six Sigma methodology to reducing defects in a fine grinding process at an automotive company in India. The DMAIC (Define–Measure–Analyse–Improve–Control) approach has been followed to solve the underlying problem of reducing process variation and improving the process yield. The paper explores how a manufacturing process can use a systematic methodology to move towards a world-class quality level. The application of the Six Sigma methodology reduced defects in the fine grinding process from 16.6% to 1.19%. The DMAIC methodology has had a significant financial impact on the profitability of the company in terms of reduction in scrap cost, man-hour savings on rework and increased output. A saving of approximately US$2.4 million per annum was reported from this project.
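    The reported improvement can be related to conventional Six Sigma metrics; the short sketch below converts a defect rate into DPMO and an approximate sigma level, assuming one defect opportunity per unit and the conventional 1.5-sigma long-term shift.

```python
from statistics import NormalDist

def sigma_level(defect_rate: float, shift: float = 1.5) -> float:
    # z-value of the process yield plus the conventional long-term shift
    return NormalDist().inv_cdf(1.0 - defect_rate) + shift

# Defect rates before and after the project, as reported in the abstract.
for label, rate in [("before improvement", 0.166), ("after improvement", 0.0119)]:
    dpmo = rate * 1_000_000   # assumes one defect opportunity per unit
    print(f"{label}: DPMO = {dpmo:,.0f}, sigma level ~ {sigma_level(rate):.2f}")
```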

    Using quality models in software package selection

    The growing importance of commercial off-the-shelf software packages requires adapting some software engineering practices, such as requirements elicitation and testing, to this emergent framework. Some specific new activities also arise, among which the selection of software packages plays a prominent role. All the methodologies that have been proposed recently for choosing software packages compare user requirements with the packages' capabilities. There are different types of requirements, such as managerial, political, and, of course, quality requirements. Quality requirements are often difficult to check. This is partly due to their nature, but there is another reason that can be mitigated, namely the lack of structured and widespread descriptions of package domains (that is, categories of software packages such as ERP systems, graphical or data structure libraries, and so on). This absence hampers the accurate description of software packages and the precise statement of quality requirements, and consequently overall package selection and confidence in the result of the process. Our methodology for building structured quality models helps to address this drawback.

    Big Data Analytics for QoS Prediction Through Probabilistic Model Checking

    As competitiveness increases, being able to guarantee the QoS of delivered services is key to business success. The ability to continuously monitor the workflow providing a service and to recognize breaches of the agreed QoS level in a timely manner is thus of paramount importance. Ideally, one would anticipate, i.e. predict, a breach and act to avoid it, or at least to mitigate its effects. In this paper we propose a model-checking-based approach to predicting the QoS of a formally described process. Continuous model checking is enabled by the use of a parametrized model of the monitored system, in which the actual parameter values are continuously evaluated and updated by means of big data tools. The paper also describes a prototype implementation of the approach and shows its usage in a case study.
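    The sketch below illustrates the underlying idea with a toy discrete-time Markov chain of a service workflow, computing the probability of reaching an SLA-breach state within a time horizon; the states, transition probabilities and horizon are illustrative assumptions, whereas the paper's actual approach relies on a full probabilistic model checker fed by big data tools.

```python
import numpy as np

# States: 0=start, 1=processing, 2=completed (OK), 3=SLA breach (absorbing).
def breach_probability(p_retry: float, p_fail: float, horizon: int) -> float:
    P = np.array([
        [0.0, 1.0,     0.0,                     0.0],
        [0.0, p_retry, 1.0 - p_retry - p_fail,  p_fail],
        [0.0, 0.0,     1.0,                     0.0],
        [0.0, 0.0,     0.0,                     1.0],
    ])
    dist = np.array([1.0, 0.0, 0.0, 0.0])   # start with all mass in the start state
    for _ in range(horizon):
        dist = dist @ P
    return dist[3]                           # probability mass in the breach state

# Parameter values are illustrative; in the paper's setting they would be
# re-estimated continuously from aggregated monitoring data.
print(f"P(breach within 20 steps) = {breach_probability(0.30, 0.02, 20):.3f}")
```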

    Modelling of reduced GB transmission system in PSCAD/EMTDC

    Energy and environmental issues are two of the greatest challenges facing the world today. In response to energy needs and environmental concerns, renewable energy technologies are now considered the future technologies of choice. Renewable energy is produced from natural sources that are clean and free; however, it is widely accepted that renewable energy is not a solution without challenges. An example of this can be seen in the UK, where there is much interest amongst generation developers in the construction of new large-scale onshore and offshore wind farms, especially in Scotland. The stability of electric power systems is also an important issue: having full knowledge of the system and being able to predict its behaviour under different situations is an important objective. As a result, several industrial-grade power system simulator tools have been developed to estimate the behaviour of the electric power system under given conditions. This paper presents a reduced Great Britain (GB) system model for stability analysis using PSCAD/EMTDC. The reduced model is based upon a future GB transmission system model and hence contains different types and mixes of generation, HVDC transmission lines and additional interconnection. The model is based on the reduced DIgSILENT PowerFactory model developed by National Grid.

    Investigating Automatic Static Analysis Results to Identify Quality Problems: an Inductive Study

    Background: Automatic static analysis (ASA) tools examine source code to discover "issues", i.e. code patterns that are symptoms of bad programming practices and that can lead to defective behavior. Studies in the literature have shown that these tools find defects earlier than other verification activities, but they produce a substantial number of false-positive warnings. For this reason, an alternative approach is to use the set of ASA issues to identify defect-prone files and components rather than focusing on the individual issues. Aim: We conducted an exploratory study to investigate whether ASA issues can be used as early indicators of faulty files and components and, for the first time, whether they point to a decay of specific software quality attributes, such as maintainability or functionality. Our aim is to understand the critical parameters and feasibility of such an approach to feed into future research on more specific quality and defect prediction models. Method: We analyzed an industrial C# web application using the ReSharper ASA tool and explored whether significant correlations exist in this data set. Results: We found promising results when predicting defect-prone files. A set of specific ReSharper categories are better indicators of faulty files than common software metrics or the collection of issues across all categories, and these categories correlate with different software quality attributes. Conclusions: Our advice for future research is to perform the analysis at file rather than component level and to evaluate the generalizability of categories. We also recommend using larger datasets, as we learned that data sparseness can lead to challenges in the proposed analysis process.
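    A simplified version of this kind of analysis is sketched below: correlating per-file ASA issue counts with defect counts and flagging the most issue-laden files; the data are made up for illustration and do not reproduce the study's results.

```python
import numpy as np
from scipy.stats import spearmanr

# Illustrative per-file data: issue counts reported by an ASA tool vs. defects
# found later in the same files (values are invented for the sketch).
issues_per_file  = np.array([12, 3, 27, 0, 8, 15, 1, 22, 5, 9])
defects_per_file = np.array([ 4, 1,  6, 0, 2,  5, 0,  7, 1, 3])

# Rank correlation between issue counts and defect counts.
rho, p_value = spearmanr(issues_per_file, defects_per_file)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")

# Flag files as defect-prone if their issue count is in the top quartile.
threshold = np.percentile(issues_per_file, 75)
flagged = np.where(issues_per_file >= threshold)[0]
print("Files flagged as defect-prone:", flagged.tolist())
```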