4,163 research outputs found

    In-cylinder combustion-based virtual emissions sensing

    Get PDF
    The development of a real-time, on-board measurement of exhaust emissions from heavy-duty engines would offer tremendous advantages in on-board diagnostics and engine control. In the absence of suitable measurement hardware, an alternative is the development of software-based predictive methods. This study demonstrates the feasibility of using in-cylinder pressure-based variables as the inputs to predictive neural networks that are then used to predict engine-out exhaust gas emissions. Specifically, a large steady-state engine operation data matrix provides the necessary information for training a successful predictive network while at the same time eliminating errors produced by the dispersive and time-delay effects of the emissions measurement system, which includes the exhaust system, the dilution tunnel, and the emissions analyzers. The steady-state training conditions allow for the correlation of time-averaged in-cylinder combustion variables to the engine-out gaseous emissions. A back-propagation neural network is then capable of learning the relationships between these variables and the measured gaseous emissions, with the ability to interpolate between steady-state points in the matrix. The networks were then validated using the transient Federal Test Procedure cycle and in-cylinder combustion parameters gathered in real time through the use of an acquisition system based on a digital signal processor. The predictive networks for NOx and CO2 proved highly successful, while those for HC and CO were not as effective. Problems with the HC and CO networks included very low measured levels and validation data that fell beyond the training matrix boundary during transient engine operation.
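    The mapping described above, from time-averaged pressure-derived combustion variables to engine-out emissions, can be pictured as a small feed-forward (back-propagation) regression network. The sketch below is a minimal illustration only: the feature set, network size, and data are placeholder assumptions, not the configuration used in the study.

```python
# Minimal sketch: map time-averaged in-cylinder combustion variables to
# engine-out NOx with a back-propagation (MLP) regressor. Feature names,
# network size, and data shapes are illustrative assumptions only.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical steady-state training matrix: one row per engine operating
# point, columns are cycle-averaged pressure-derived variables,
# e.g. [peak pressure, location of peak pressure, IMEP, speed, fuelling].
X_train = np.random.rand(200, 5)          # placeholder for measured data
y_train = np.random.rand(200)             # placeholder engine-out NOx

model = make_pipeline(
    StandardScaler(),                     # normalise inputs before training
    MLPRegressor(hidden_layer_sizes=(16, 16),
                 activation="tanh",
                 solver="adam",
                 max_iter=5000,
                 random_state=0),
)
model.fit(X_train, y_train)

# Validation against transient-cycle data acquired in real time would then
# reuse the same pipeline on per-cycle averaged features.
X_transient = np.random.rand(50, 5)       # placeholder transient-cycle features
nox_pred = model.predict(X_transient)
```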

    Workshop - Systems Design Meets Equation-based Languages

    Get PDF

    Effective Utilization of Historical Data to Increase Organizational Performance: Focus on Sales/Tendering and Projects

    Get PDF
    Master's thesis in Offshore Technology. In the oil and gas industry, this topic historically received little attention because cost was not a major concern. However, the dramatic drop in oil prices to below US$40 per barrel at the end of 2015, more than 60 percent below the levels of previous years, made it clear that the sector is going through one of the most transformative periods in its history. This situation has created new challenges for oil and gas company leaders, forcing them to change their business strategies. Operating companies in the oil and gas industry have focused on reducing costs and increasing organizational performance. Accordingly, supplier companies need to sharpen their focus on efficiency and the optimization of resources in order to sustain and grow in a competitive market. This demands better control of estimates and costs in the future sales/tendering process. As one Operations Manager put it, "An informed organization saves cost and wins faster." The only way for an organization to obtain reliable information is by analyzing what happened in the past and what was learned from it; in other words, by utilizing historical data from previous projects and developing benchmarking metrics. Use of historical data can improve estimation and scheduling, support strategic planning, and improve organizational processes. Historical project data can support strategic business decisions in any organization and can provide a distinct advantage over competitors. It can help management decide which projects are right for the future of the company and which should be avoided, and it can help the organization learn from past mistakes and win future bids by not repeating them. Most top managers understand the importance of having and using historical project information. The problem is that very few companies have the methodologies, procedures, and systems in place to use this information effectively to improve their project processes and to support the estimation, scheduling, and control of future projects (opportunities). The present work focuses on historical data, the estimation process, and lessons learned as means of enhancing organizational performance. The work includes a case study and a number of expert interviews conducted at ABB, and it discusses how to collect, normalize, and analyze historical project data to develop practical information. Three models have been developed: a project estimation process with a feedback loop, a lessons-learned process model, and a historical data utilization process. Recommendations are made to use historical data to establish references for the sales/tendering department for future estimates, which can reduce dependence on manual or single-person judgment and improve the estimation process. Suggestions are also made for establishing a lessons-learned process that can improve organizational performance. The results of the analysis show that, by applying the recommended processes, organizations can achieve efficiency through easy access to and storage of a historical database, easy access to lessons learned, and measurable KPIs. Moreover, using key variables such as project complexity and severity of requirements in the estimation and historical data processes can support better data analysis and utilization.

    Efficient scheduling of batch processes in continuous processing lines

    Get PDF
    This thesis focuses mainly on the development of efficient formulations for scheduling in industrial environments. In addition, decisions more closely related to advanced process control or production planning are incorporated into the scheduling; in this way, the schedule obtained is more efficient than it would be if these additional constraints were not considered. The formulations emphasize online implementation, as they are intended for use in real plants. The most common scheduling problems encountered in industrial environments are the assignment of tasks to units, the distribution of production among parallel units, and the distribution of shared resources among concurrent processes. Most of the advances in this work are the result of collaborative effort. (Departamento de Ingeniería de Sistemas y Automática; Doctorado en Ingeniería Industrial)
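    One of the scheduling sub-problems named above, assigning tasks to parallel units, is commonly posed as a small mixed-integer program. The sketch below is a hypothetical illustration using the PuLP modeling library; the task durations, unit names, and makespan objective are assumptions and do not reproduce the thesis formulations.

```python
# Minimal sketch: assign tasks to parallel units so that the largest unit
# workload (makespan) is minimised. All data below are invented placeholders.
import pulp

tasks = {"T1": 4, "T2": 2, "T3": 3, "T4": 5}   # hypothetical durations [h]
units = ["U1", "U2"]

prob = pulp.LpProblem("task_to_unit_assignment", pulp.LpMinimize)
x = pulp.LpVariable.dicts("assign", (tasks, units), cat="Binary")
makespan = pulp.LpVariable("makespan", lowBound=0)

prob += makespan                                   # objective: minimise makespan
for t in tasks:                                    # each task goes to exactly one unit
    prob += pulp.lpSum(x[t][u] for u in units) == 1
for u in units:                                    # each unit's workload bounds the makespan
    prob += pulp.lpSum(tasks[t] * x[t][u] for t in tasks) <= makespan

prob.solve()
schedule = {u: [t for t in tasks if x[t][u].value() == 1] for u in units}
print(schedule, "makespan =", makespan.value())
```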

    Multilateral well modeling from compartmentalized reservoirs

    Get PDF
    The existence of compartmentalization in oil and gas fields has been a major industry challenge for decades. This phenomenon introduces complexity and uncertainty into the prediction of well productivities and the overall hydrocarbon recovery factor. In the past, multiple vertical wells were drilled to increase recovery. Recent advances in multilateral and completion technology have made multilateral wells a viable alternative for producing multiple reservoir compartments, particularly offshore. Reservoir compartments may each possess a unique set of characteristics, such as average pressure, thickness, permeability, and porosity distribution. This variation in properties introduces complex dynamics when forecasting commingled production and the overall recovery factor from complex reservoir structures produced with advanced well systems. Existing analytical and semi-analytical productivity models, because of the simplifying assumptions made in their development, are only suitable for first approximations and early estimates in field applications. In this work, a comparison is made between the widely used finite-difference numerical method and the finite-volume numerical discretization method for field productivity predictions. The MATLAB Reservoir Simulation Toolbox (MRST), a collection of open-source codes based on the finite-volume discretization methodology, is used to develop the reservoir compartment and multilateral well model used in this study. We investigate the pressure-drop behavior over time through the lateral for a conventional well completion and compare it with the behavior for a smart well completion with downhole flow control devices for flow control and optimization. Several cases of compartmentalized reservoirs with faults of varying orientation and sealing capacity are then investigated. The production profiles obtained for the base reservoir case in MRST are compared with those from the IMEX simulator in CMG (Computer Modeling Group), a commercial reservoir simulator based on the finite-difference numerical discretization method. The results show a more accurate production profile prediction with the finite-volume method than with the finite-difference method, as expected. The capability of the simulation toolbox, as well as the importance of using an improved and more efficient numerical discretization scheme for solving an increasingly complex array of reservoir structures and advanced well geometries with multiphase fluid flow, is demonstrated. Finally, an adjoint gradient-based optimization method implemented in the toolbox is used to investigate the optimization potential of smart well completions versus a conventional well completion, with net present value as the objective function. The results show that an investment in smart completions for the multilateral well ultimately yields a higher net cash flow and net present value than a conventional well of equivalent length designed without smart completions.
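    The economic comparison in the final step uses net present value as the objective function. The following sketch shows only the basic discounted-cash-flow arithmetic behind such a comparison; the production forecasts, prices, costs, and discount rate are invented placeholders, not results from the study.

```python
# Minimal sketch of a net-present-value comparison between two completion
# designs. All rates, prices, and costs below are illustrative assumptions.
import numpy as np

def npv(rates_stb_d, oil_price_usd, opex_usd_d, capex_usd,
        discount_rate=0.10, days_per_step=365):
    """Discounted cash flow over yearly production steps minus initial capex."""
    cash_flow = (rates_stb_d * oil_price_usd - opex_usd_d) * days_per_step
    years = np.arange(1, len(rates_stb_d) + 1)
    return -capex_usd + np.sum(cash_flow / (1 + discount_rate) ** years)

# Hypothetical 5-year average-rate forecasts for the two designs.
smart_rates = np.array([1200, 1000, 850, 700, 600])        # STB/day
conventional_rates = np.array([1100, 850, 650, 500, 400])  # STB/day

print(npv(smart_rates, 60.0, 5_000, 12e6))        # smart completion (higher capex)
print(npv(conventional_rates, 60.0, 5_000, 9e6))  # conventional completion
```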

    Predictive Scale-Bridging Simulations through Active Learning

    Full text link
    Throughout computational science, there is a growing need to utilize the continual improvements in raw computational horsepower to achieve greater physical fidelity through scale-bridging, rather than through brute-force increases in the number of mesh elements. For instance, quantitative predictions of transport in nanoporous media, critical to hydrocarbon extraction from tight shale formations, are impossible without accounting for molecular-level interactions. Similarly, inertial confinement fusion simulations rely on numerical diffusion to simulate molecular effects such as non-local transport and mixing without truly accounting for molecular interactions. With these two disparate applications in mind, we develop a novel capability which uses an active learning approach to optimize the use of local fine-scale simulations for informing coarse-scale hydrodynamics. Our approach addresses three challenges: forecasting the continuum coarse-scale trajectory in order to speculatively execute new fine-scale molecular dynamics calculations, dynamically updating the coarse scale from the fine-scale calculations, and quantifying the uncertainty in the neural network models.
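    The active-learning pattern described above can be pictured as a loop: train a surrogate on the fine-scale results gathered so far, forecast candidate coarse-scale states, and launch a new fine-scale calculation only where the surrogate's uncertainty is largest. The sketch below is a generic illustration of that pattern; the ensemble-based uncertainty proxy, the run_fine_scale placeholder, and all numbers are assumptions rather than the authors' implementation.

```python
# Minimal sketch of an active-learning loop: an ensemble surrogate stands in
# for fine-scale physics, and a new fine-scale run is triggered only where
# the ensemble disagrees most. run_fine_scale() is a toy placeholder.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def run_fine_scale(state):
    """Placeholder for an expensive fine-scale (e.g. molecular dynamics) run."""
    return np.sin(state).sum() + 0.01 * np.random.randn()

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(20, 3))            # initial coarse-scale states
y = np.array([run_fine_scale(s) for s in X])    # initial fine-scale results

surrogate = RandomForestRegressor(n_estimators=50, random_state=0)

for step in range(10):
    surrogate.fit(X, y)
    # Speculative coarse-scale trajectory: candidate states we expect to visit.
    candidates = rng.uniform(-1, 1, size=(100, 3))
    # Uncertainty proxy: spread of per-tree predictions across the ensemble.
    per_tree = np.stack([t.predict(candidates) for t in surrogate.estimators_])
    uncertainty = per_tree.std(axis=0)
    worst = candidates[np.argmax(uncertainty)]
    # Execute a new fine-scale calculation only where the surrogate is unsure,
    # then fold the result back into the training set.
    X = np.vstack([X, worst])
    y = np.append(y, run_fine_scale(worst))
```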

    NASA SBIR abstracts of 1991 phase 1 projects

    Get PDF
    The objectives of 301 projects placed under contract by the Small Business Innovation Research (SBIR) program of the National Aeronautics and Space Administration (NASA) are described. These projects were selected competitively from among proposals submitted to NASA in response to the 1991 SBIR Program Solicitation. The basic document consists of edited, non-proprietary abstracts of the winning proposals submitted by small businesses. The abstracts are presented under the 15 technical topics within which Phase 1 proposals were solicited. Each project was assigned a sequential identifying number from 001 to 301, in order of its appearance in the body of the report. Appendixes are included that provide additional information about the SBIR program and permit cross-referencing of the 1991 Phase 1 projects by company name, location by state, principal investigator, the NASA Field Center responsible for managing each project, and NASA contract number.