
    Testing a Framework for the Quality of Process Models - A Case Study

    Process modeling can be regarded as the currently most popular form of conceptual modeling. Research evidence illustrates how process modeling is applied across the different information system life cycle phases for a range of applications, such as configuration of Enterprise Systems, workflow management, or software development. However, a detailed discussion of the critical factors of process model quality is still missing. This paper proposes a framework consisting of six quality factors, derived from a comprehensive literature review. It then presents a case study of a utility provider that had designed various business process models for the selection of an Enterprise System. The paper summarizes potential means of conducting a successful process modeling initiative and evaluates the described modeling approach within the Guidelines of Modeling (GoM) framework. An outlook presents the lessons learnt and concludes with insights into the next phases of this study.

    Assessment and optimization of environmental systems using data analysis and simulation.

    For most environmental systems, specifically wastewater treatment plants and aquifers, a significant number of performance variables are recorded on a time-series basis. Because the variables are interrelated, it is often difficult to assess overarching trends and quantify temporal operational performance. The objective of this research study was to provide an effective means for comprehensive temporal evaluation of environmental systems. The proposed methodology used several multivariate data analysis and statistical techniques to present an assessment framework for water quality monitoring programs as well as for optimization of treatment plants and aquifer systems. The developed procedure combined statistical and data analysis algorithms including correlation techniques, factor analysis and principal component analysis, and multivariate stepwise regression analysis. These methodologies were used to develop a series of independent indexes to quantify the composition of wastewater and groundwater. By developing a stepwise data analysis approach, a baseline was also introduced to discover the key operational parameters that significantly affect the performance of environmental systems. Moreover, a comprehensive approach was introduced to develop numerical models for forecasting key operational and quality parameters, which can be used for future simulation and scenario analysis. The developed methodology and frameworks were successfully applied to four case studies: three wastewater treatment plants and an aquifer system.
In the first case study, the approach was applied to the Floyds Fork water quality treatment center in Louisville, KY. The objective was to establish simple and reliable predictive models correlating target variables with specific measured parameters. The study presented a multivariate statistical and data analysis of the wastewater physicochemical parameters to provide a baseline for temporal assessment of the treatment plant. Fifteen quality and quantity parameters were analyzed using data recorded from 2010 to 2016. To determine the overall quality condition of raw and treated wastewater, a Wastewater Quality Index (WWQI) was developed. To characterize treatment process performance, the interdependencies between the variables were determined using Principal Component Analysis (PCA). The five extracted components adequately represented the organic, nutrient, oxygen-demanding, and ion activity loadings of the influent and effluent streams. The study also utilized the model to predict quality parameters such as Biological Oxygen Demand (BOD), Total Phosphorus (TP), and the WWQI. Accuracies ranging from 71% to 97% were achieved when fitting the models on the training dataset, and relative prediction errors of less than 9% were achieved on the testing dataset. The techniques and procedures presented in this case study provide an assessment framework for wastewater treatment monitoring programs.
The second case study focused on assessing methane production of a novel combined system for treatment of high-strength organic wastewater. The studied pilot plant comprised a Rotating Biological Contactor (RBC) operated under anaerobic conditions, in conjunction with a Moving Bed Biofilm Reactor (MBBR) as the combined aerobic process. Various operational parameters were tested to maximize Chemical Oxygen Demand (COD) removal performance and methane production from treating high-strength synthetic wastewater. The identified optimal parameters were a hydraulic retention time of 5 days, a disk rotational speed of 7 rpm, and an organic loading rate of 2 kg COD/m3/d. Under these conditions, the combined system achieved high removal efficiency (98% from an influent COD of 10,000 mg/L) with the additional benefit of methane production (116.60 L/d from a 46-liter AnRBC reactor). The results of this case study confirmed the effectiveness of the integrated hybrid system in achieving both high removal efficiency and methane production, and the system was therefore recommended for treating high-strength organic wastewater.
The third case study assessed the feasibility of using a contact stabilization process for secondary treatment of refinery wastewater through a step-by-step analysis. The studied pilot plant comprised a contact-stabilization activated sludge process in conjunction with a clarification reactor. Various operational parameters were tested to minimize excess sludge production and maximize removal performance when treating petroleum refinery wastewater. The mixed liquor dissolved oxygen (DO) and the activated return sludge (RS) rate were selected as the key operational parameters. The results indicated that the system performed best under an applied aeration of 3.7 mg oxygen per liter of mixed liquor and 46% return sludge. This operational combination resulted in a COD removal efficiency of 78% with a daily biomass production of 1.42 kg/day. Based on these results, the contact stabilization activated sludge process was suggested as an effective alternative for secondary treatment of wastewater from petroleum refineries.
The last case study combined probabilistic and deterministic approaches for assessing an aquifer's water quality. The probabilistic approach used multivariate statistical analysis to classify the groundwater's physicochemical characteristics. Building on those results, the deterministic approach used hydrochemistry analyses for a more comprehensive assessment of groundwater suitability for different applications. For this purpose, a large geologic basin under arid weather conditions was evaluated. The ultimate objective was to identify: 1) a groundwater classification scheme, 2) the processes governing the groundwater chemistry, 3) the hydrochemical characteristics of the groundwater, and 4) the suitability of the groundwater for drinking and agricultural purposes. Based on the multivariate statistical analysis, chloride salt dissolution was identified within the aquifer. Further application of the deterministic approach revealed degradation of groundwater quality throughout the basin, possibly due to saltwater intrusion. By developing a water quality index and a multi-hazard risk assessment methodology, the suitability of the groundwater for human consumption and irrigation was assessed. The combined consideration of deterministic and probabilistic approaches provided an effective means for comprehensive evaluation of groundwater quality across different aquifers or within one.
The procedures and methodologies presented in this research study provide environmental analysts and governmental decision makers with a comprehensive tool to evaluate current and future quality conditions within any given wastewater treatment plant and/or aquifer system.
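The multivariate workflow described above (standardizing the measured parameters, extracting principal components, and rolling parameters up into a quality index) can be illustrated with a short sketch. This is an assumption-laden illustration rather than the study's actual procedure: the column names, weights, and index formula below are invented for the example.

```python
# Illustrative sketch of a PCA screening plus a simple weighted quality index.
# Column names, weights, and the index formula are assumptions, not the study's WWQI.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

def pca_screen(df: pd.DataFrame, n_components: int = 2):
    """Standardize the measured parameters and extract principal components."""
    X = StandardScaler().fit_transform(df.values)
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(X)
    loadings = pd.DataFrame(pca.components_.T, index=df.columns)
    return scores, loadings, pca.explained_variance_ratio_

def simple_quality_index(df: pd.DataFrame, weights: dict) -> pd.Series:
    """Hypothetical weighted-sum index (0-100) over min-max normalized parameters."""
    norm = (df - df.min()) / (df.max() - df.min())
    w = pd.Series(weights).reindex(df.columns).fillna(0.0)
    return 100.0 * (norm * w).sum(axis=1) / w.sum()

# Made-up influent samples (mg/L except pH), purely for demonstration.
data = pd.DataFrame({
    "BOD": [210, 180, 250], "COD": [480, 410, 560],
    "TSS": [220, 190, 260], "TP": [6.1, 5.4, 7.0], "pH": [7.2, 7.0, 7.4],
})
scores, loadings, evr = pca_screen(data, n_components=2)
wwqi = simple_quality_index(data, {"BOD": 0.3, "COD": 0.3, "TSS": 0.2, "TP": 0.2})
```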

    Empirical Analysis of the Test Maturity Model Integration (TMMi)

    Testing is an essential part of the software development lifecycle for gaining trust in the quality of the delivered product. Concerns have been raised over the maturity of existing test processes and techniques, and the desire to improve has been acknowledged. Unfortunately, the decisions on how the improvement should be carried out are not straightforward. This poses a variety of problems because, even though test process improvement models are available on the market, the guidelines on how to use them are unsatisfactory. Furthermore, insufficient information is available on how to choose the best model. The current paper describes a literature study conducted on existing test process improvement models. As a result of this research, the potentially best model was chosen from the perspective of Playtech Estonia's Casino unit. To evaluate the choice, we conducted a single-object case study of test process assessment and improvement based on the selected Test Maturity Model integration (TMMi) in the Casino unit. As part of the assessment process, a survey was carried out and staff interviews were performed to obtain an understanding of test process maturity. Based on the outcome, improvements were proposed to the Casino unit's testing activities.
We provide an overview of the test process maturity concept of the framework and suggest how the described best practices could be considered in the organisation's improvement process. As a result of the assessment and improvement case study, we show the importance of understanding the reasons and objectives for test process improvement in light of the needs of the organisation. Identifying the best model depends primarily on defining the organisation-side requirements for an improvement framework. An evaluation of the performance and suitability of TMMi in a software development organisation is presented based on the case study experience, also raising concerns about its applicability in agile environments. Improvement possibilities for TMMi are described; these have also been forwarded to the model publisher, the TMMi Foundation, and contribute to the potential enhancement of the framework.
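As a rough illustration of the questionnaire-based assessment step, the sketch below aggregates per-question achievement ratings into process-area scores and flags the weakest areas. The four-point rating scale, the threshold, and the example process areas are assumptions for illustration only; they are not the TMMi Foundation's official rating rules or the study's actual data.

```python
# Hypothetical aggregation of assessment questionnaire answers per process area.
# The achievement scale values and the 85% threshold are illustrative assumptions.
RATING = {"fully": 1.0, "largely": 0.7, "partially": 0.3, "not": 0.0}

def process_area_score(answers: list[str]) -> float:
    """Average achievement over all questions belonging to one process area."""
    return sum(RATING[a] for a in answers) / len(answers)

def weakest_areas(survey: dict[str, list[str]], threshold: float = 0.85) -> list[str]:
    """Return process areas whose average score falls below the threshold."""
    return sorted(a for a, ans in survey.items() if process_area_score(ans) < threshold)

survey = {
    "Test Policy and Strategy": ["fully", "fully", "largely"],
    "Test Planning": ["partially", "largely", "not"],
    "Test Monitoring and Control": ["fully", "fully", "largely"],
}
print(weakest_areas(survey))  # -> ['Test Planning']
```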

    Inferential Model Predictive Control Using Statistical Tools

    With an ever-increasing emphasis on reducing costs and improving quality control, the application of advanced process control in the bulk chemical and petrochemical industry is steadily rising. Two major areas of development are model-based control strategies and process sensors. This study deals with the application of multivariate statistical techniques for developing soft sensors in an inferential model predictive control framework. McAvoy (2003) proposed model predictive statistical process control (MP-SPC), a principal component (PC) score control methodology. MP-SPC was found to be very effective in reducing the variability of the quality variables without using any real-time, on-line quality or disturbance measurements. This work extends McAvoy's formulation to incorporate multiple manipulated variables and demonstrates the controller's performance under different disturbance scenarios and for an additional case study. Moreover, implementation issues critical to the success of the formulations considered, such as controller tuning, measurement selection, and model identification, are also studied. A key feature is the emphasis on confirming the consistency of the cross-correlation between the selected measurements and the quality variable before on-line implementation, and that between the scores and the quality variables after on-line implementation. An analysis of the controller's performance in dealing with disturbances of different frequencies, sizes, and directions, as well as non-stationarities in the disturbance, reveals the robustness of the approach. The penalty on manipulated variable moves is the most effective tuning parameter. A unique scheme developed in this study takes advantage of the information contained in historical databases, combined with plant testing, to generate collinear PC score models. The proposed measurement selection algorithm ranks measurements that have a consistent cross-correlation with the quality variable according to their cross-correlation coefficient and lead time. Higher-ranked variables are chosen as long as they make sufficiently large contributions to the PC score model. Several approaches for identifying dynamic score models are proposed; all put greater emphasis on short-term predictions. Two approaches utilize the statistics associated with the PC score models: the Hotelling's T² statistic and the Q-residual information may be used to remove outliers during pre-processing or may be incorporated as sample weights. The process dynamics and controller performance results presented in this study are simulations based on well-known, industrially benchmarked test-bed models: the Tennessee Eastman challenge process and the azeotropic distillation tower of the Vinyl Acetate monomer process.
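The outlier-screening statistics mentioned above, Hotelling's T² and the Q residual of a principal component model, can be sketched as follows. This is a generic illustration under assumed data, not the MP-SPC controller formulation itself; the 99th-percentile cut-offs are arbitrary choices for the example.

```python
# Hotelling's T^2 and Q (squared prediction error) statistics of a PC model,
# used here to flag outlier samples before identifying score models.
import numpy as np

def pca_fit(X: np.ndarray, n_pc: int):
    """Mean-center X and return the mean, loadings P, and score variances."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_pc].T                      # loadings for the retained PCs
    scores = Xc @ P
    return X.mean(axis=0), P, scores.var(axis=0, ddof=1)

def t2_and_q(X: np.ndarray, mean, P, score_var):
    """Hotelling's T^2 and Q residual for each row of X."""
    Xc = X - mean
    T = Xc @ P                           # PC scores
    t2 = np.sum((T ** 2) / score_var, axis=1)
    q = np.sum((Xc - T @ P.T) ** 2, axis=1)
    return t2, q

rng = np.random.default_rng(0)
X_hist = rng.normal(size=(200, 6))       # stand-in for a historical database
mean, P, var = pca_fit(X_hist, n_pc=2)
t2, q = t2_and_q(X_hist, mean, P, var)
# e.g. drop samples beyond the empirical 99th percentile of either statistic
keep = (t2 < np.quantile(t2, 0.99)) & (q < np.quantile(q, 0.99))
```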

    A Model-Driven Approach for Business Process Management

    Business Process Management is a common mechanism recommended by a large number of standards for the management of companies and organizations. In software companies this practice is increasingly accepted, and companies have to adopt it if they want to remain competitive. However, effectively defining these processes, and especially maintaining and executing them, is not always an easy task. This paper presents an approach based on the Model-Driven paradigm for Business Process Management in software companies. This solution offers a suitable mechanism that was implemented successfully in different companies with a CASE tool named NDTQ-Framework.
Funding: Ministerio de Educación y Ciencia TIN2010-20057-C03-02; Junta de Andalucía TIC-578.

    A requirements engineering framework for integrated systems development for the construction industry

    Computer Integrated Construction (CIC) systems are computer environments through which collaborative working can be undertaken. Although many CIC systems have been developed to demonstrate communication and collaboration within construction projects, the uptake of CICs by the industry is still inadequate. This is mainly because the research methodologies of CIC development projects are insufficient to bridge the technology transfer gap. Therefore, defining comprehensive methodologies for the development of these systems and their effective implementation on real construction projects is vital. Requirements Engineering (RE) can contribute to the effective uptake of these systems because it drives systems development for the targeted audience. This paper proposes a requirements engineering approach for industry-driven CIC systems development. While several CIC systems are investigated to build broad and deep contextual knowledge in the area, the EU-funded research project DIVERCITY (Distributed Virtual Workspace for Enhancing Communication within the Construction Industry) is analysed as the main case study, because its requirements engineering approach has the potential to determine a framework for the adaptation of requirements engineering that contributes towards the uptake of CIC systems.

    Integration of 3D Feedback Control Systems for Fabrication of Engineered Assemblies for Industrial Construction Projects

    A framework and methods are presented in this thesis to support the integration of 3D feedback control systems to improve dimensional conformance during fabrication of engineered assemblies such as process piping, structural steel, vessels, tanks, and associated instrumentation for industrial construction projects. Fabrication includes processes such as cutting, bending, fitting, welding, and connecting; companies specializing in these processes are known as fabricators, fabrication shops, or fab shops. Typically, fab shops do not use 3D feedback control systems in their measurement and quality control processes. Instead, most measurements are made using manual tools such as tape measures, callipers, bubble levels, straight edges, squares, and templates. Inefficiency and errors ensue, costing the industry tens of billions of dollars per year globally. Improvement is impeded by a complex fabrication industry system dependent on deeply embedded existing processes, inflexible supply chains, and siloed information environments. The goal of this thesis is to address these impediments by developing and validating a new implementation framework, including several specific methods. To accomplish this goal, several research objectives must be met:
1. Determine whether 3D dimensional control methods are possible for fab shops that do not have access to 3D models corresponding to shop drawings, thus serving as a step toward deploying more integrated, sophisticated, and higher-performing control systems.
2. Discover ways to resolve the incompatibility between the information requested by fabrication workers and the output information delivered by state-of-the-art 3D inspection systems.
3. Conduct a credible cost-benefit analysis to understand the benefits required to justify the implementation costs, such as training, process change management, and capital expenditures for 3D data acquisition units for fab shops.
4. Investigate ways to compare the quality and accuracy of dimensional control data sourced from modern point cloud processing methods, conventional surveying methods, and hand tools.
Methodologies used in this research include: (1) an initial literature review to understand the knowledge gaps, coupled with informal interviews of practitioners from industrial research partners, which was revisited throughout the development of the dissertation; (2) development of a conceptual framework for 3D fabrication control based on 3D imaging; (3) development and validation of algorithms to address key impediments to implementation of the framework; (4) experiments in the fab shop environment to validate elements of the framework; and (5) analysis to develop conclusions, identify weaknesses in the research, understand its contributions, and make recommendations. By developing and testing this framework, it was discovered that three stages of evolution are necessary for implementation:
1. Utilization of 3D digital templates to enable simple scan-vs-3D-model workflows for shops without access to 3D design models.
2. Development of a new language and framework for dimensional control that works through current ways of thinking about and communicating quality control information.
3. Redefinition of quality control processes based on state-of-the-art tools and technologies, including automated dimensional control systems.
With respect to the first stage, and to address the lack of access to 3D models, a framework for developing 3D digital template models was created for inspecting received parts. The framework was used to develop a library of 600 3D models of piping parts. The library was leveraged to deploy a 3D quality control system that was then tested in an industrial-scale case study. The results of the case study were used to develop a discrete event simulation model. The simulation results and the subsequent cost-benefit analysis show that investment in integrating scan-vs-3D-model quality control systems can yield significant cost savings and a payback period of less than two years. With respect to the second stage, and to bridge the gap between what 3D inspection systems can offer and what fabrication workers expect, the concept of Termination Points was further defined and a framework for measuring and classifying them was developed. The framework was used to develop applications and tools based on the provided set of definitions. Those applications and tools were further analyzed, and the results are reported in each chapter. It is concluded that the methods developed based on the framework can achieve sufficient accuracy and can add significant value for fabrication quality control.
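A scan-vs-3D-model dimensional check of the kind described above can be sketched, under assumptions, as a nearest-neighbour comparison between as-built scan points and nominal points sampled from a template model. The tolerance value and the toy pipe geometry below are stand-ins, not data from the thesis.

```python
# Illustrative scan-vs-model check: compare an as-built point cloud against
# nominal points sampled from a 3D template model and flag out-of-tolerance points.
import numpy as np
from scipy.spatial import cKDTree

def deviation_report(scan_pts: np.ndarray, model_pts: np.ndarray, tol_mm: float = 3.0):
    """Nearest-neighbour distance from each scan point to the nominal model."""
    dist, _ = cKDTree(model_pts).query(scan_pts)
    return {
        "max_dev_mm": float(dist.max()),
        "mean_dev_mm": float(dist.mean()),
        "pct_out_of_tol": float(100.0 * np.mean(dist > tol_mm)),
    }

# Toy example: a nominal straight pipe axis vs. a slightly bowed as-built scan.
z = np.linspace(0.0, 1000.0, 200)
model_pts = np.column_stack([np.zeros_like(z), np.zeros_like(z), z])
scan_pts = model_pts + np.column_stack([2.0 * np.sin(z / 1000.0 * np.pi),
                                        np.zeros_like(z), np.zeros_like(z)])
print(deviation_report(scan_pts, model_pts, tol_mm=1.5))
```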

    Uncertainty Analysis for Data-Driven Chance-Constrained Optimization

    In this contribution, our previously developed framework for data-driven chance-constrained optimization is extended with an uncertainty analysis module. The module quantifies uncertainty in the output variables of rigorous simulations. It chooses the most accurate parametric continuous probability distribution model by minimizing the deviation between model and data, and a constraint is added to favour less complex models while enforcing a minimum required goodness of fit. The module builds on the more than 100 probability distribution models provided by the SciPy package in Python; a rigorous case study is conducted to select the four most relevant models for the application at hand. The applicability and precision of the uncertainty analyser module are investigated for an impact factor calculation in life cycle impact assessment, to quantify the uncertainty in the results. Furthermore, the extended framework is verified with data from a first-principles process model of a chlor-alkali plant, demonstrating the increased precision of the uncertainty description of the output variables and resulting in a 25% increase in accuracy of the chance-constraint calculation.
Funding: BMWi, 0350013A, ChemEFlex - Umsetzbarkeitsanalyse zur Lastflexibilisierung elektrochemischer Verfahren in der Industrie; Teilvorhaben: Modellierung der Chlor-Alkali-Elektrolyse sowie anderer Prozesse und deren Bewertung hinsichtlich Wirtschaftlichkeit und möglicher Hemmnisse. DFG, 414044773, Open Access Publizieren 2019 - 2020 / Technische Universität Berlin.
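The distribution-selection step described above can be sketched with SciPy: fit a handful of candidate parametric models, measure the deviation between each fitted model and the data, and penalize models with more parameters. The candidate list, the Kolmogorov-Smirnov deviation measure, and the penalty weight below are assumptions for illustration, not the paper's exact constraint formulation.

```python
# Minimal sketch of selecting a parametric distribution for sampled output data:
# fit scipy.stats candidates and pick the one minimizing deviation + complexity.
import numpy as np
from scipy import stats

CANDIDATES = ["norm", "lognorm", "gamma", "weibull_min"]   # assumed shortlist

def select_distribution(samples: np.ndarray, penalty: float = 0.01):
    best = None
    for name in CANDIDATES:
        dist = getattr(stats, name)
        params = dist.fit(samples)                     # MLE parameter estimates
        ks_stat, _ = stats.kstest(samples, name, args=params)
        score = ks_stat + penalty * len(params)        # deviation + complexity term
        if best is None or score < best[0]:
            best = (score, name, params)
    return best

rng = np.random.default_rng(1)
samples = rng.lognormal(mean=0.0, sigma=0.4, size=500)  # stand-in simulation output
score, name, params = select_distribution(samples)
print(name, params)
```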

    Non-functional requirements: size measurement and testing with COSMIC-FFP

    The non-functional requirements (NFRs) of software systems are well known to add a degree of uncertainty to the process of estimating the cost of any project. This paper contributes to more precise project size measurement by incorporating NFRs into the functional size quantification process. We report on an initial solution proposed to deal with the problem of quantitatively assessing the NFR modeling process early in the project, and of generating test cases for NFR verification purposes. The NFR framework has been chosen for the integration of NFRs into the requirements modeling process and for their quantitative assessment. Our proposal is based on the functional size measurement method COSMIC-FFP, adopted in 2003 as the ISO/IEC 19761 standard. In this paper we also extend the use of COSMIC-FFP for NFR testing purposes. This is an essential step for improving NFR development and testing effort estimates, and consequently for managing the scope of NFRs. We discuss the merits of the proposed approach and the open questions related to its design.
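As a toy illustration of COSMIC-style functional size measurement, the sketch below counts one CFP per identified data movement (Entry, eXit, Read, Write) for a single functional process. The example functional process derived from an NFR is hypothetical and not taken from the paper.

```python
# Toy illustration of COSMIC-style counting: each identified data movement
# (Entry, eXit, Read, Write) contributes one CFP to the functional size.
from collections import Counter

MOVEMENT_TYPES = {"E", "X", "R", "W"}   # Entry, eXit, Read, Write

def cosmic_size(data_movements: list[str]) -> int:
    """Functional size in CFP = number of identified data movements."""
    assert set(data_movements) <= MOVEMENT_TYPES
    return len(data_movements)

# Hypothetical functional process derived from an NFR: "log every failed login".
movements = ["E",            # entry: failed-login event
             "R",            # read: retry-threshold configuration
             "W",            # write: audit-log record
             "X"]            # exit: alert to the operator
print(cosmic_size(movements), "CFP")  # -> 4 CFP
print(Counter(movements))
```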