Economic modeling of carbon dioxide integrated pipeline network for enhanced oil recovery and geologic sequestration in the Texas Gulf Coast region
Naturally occurring CO2 is transported via pipelines to oil fields in West Texas to enhance production. A similar pipeline system
is proposed for the Gulf Coast region of Texas. The CO2 would come from anthropogenic sources. Using GIS data, oil fields and
CO2 sources are selected and a pipeline route is designed, taking into consideration rights of way and environmental sensitivities.
We modified several pipeline cost models from the literature to capture recent construction cost escalations. Our resulting cost
estimates agree with mid-to-high range cost quotes for pipelines reported to the Federal Energy Regulatory Commission by the
companies.
Bureau of Economic Geology
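The cost-escalation adjustment described above can be sketched as follows. The power-law base cost function, its coefficients, and the escalation factor are illustrative assumptions, not the authors' actual models:

```python
# Hypothetical sketch of a pipeline capital-cost model with an
# escalation multiplier, loosely following the approach described
# above. All coefficients are illustrative, not the paper's values.

def base_pipeline_cost(diameter_in, length_mi, a=70_000.0, b=1.0):
    """Base capital cost ($) as a power law in diameter and length."""
    return a * (diameter_in ** b) * length_mi

def escalated_cost(diameter_in, length_mi, escalation=1.5):
    """Apply a construction-cost escalation factor to the base model."""
    return base_pipeline_cost(diameter_in, length_mi) * escalation

# Example: a 12-inch, 100-mile line with 50% cost escalation.
cost = escalated_cost(12, 100)
```

In practice the escalation factor would be calibrated against recent FERC cost filings rather than fixed as a constant.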
Time--Distance Helioseismology Data Analysis Pipeline for Helioseismic and Magnetic Imager onboard Solar Dynamics Observatory (SDO/HMI) and Its Initial Results
The Helioseismic and Magnetic Imager onboard the Solar Dynamics Observatory
(SDO/HMI) provides continuous full-disk observations of solar oscillations. We
develop a data-analysis pipeline based on the time-distance helioseismology
method to measure acoustic travel times using HMI Doppler-shift observations,
and infer solar interior properties by inverting these measurements. The
pipeline is used for routine production of near-real-time full-disk maps of
subsurface wave-speed perturbations and horizontal flow velocities for depths
ranging from 0 to 20 Mm, every eight hours. In addition, Carrington synoptic
maps for the subsurface properties are made from these full-disk maps. The
pipeline can also be used for selected target areas and time periods. We
explain details of the pipeline organization and procedures, including
processing of the HMI Doppler observations, measurements of the travel times,
inversions, and constructions of the full-disk and synoptic maps. Some initial
results from the pipeline, including full-disk flow maps, sunspot subsurface
flow fields, and the interior rotation and meridional flow speeds, are
presented.
Comment: Accepted by Solar Physics topical issue 'Solar Dynamics Observatory'
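A minimal illustration of the travel-time measurement idea: cross-correlate two oscillation signals and read the travel time off the lag of the correlation peak. The HMI pipeline actually fits cross-covariance functions with far more care; the signal shapes and sampling here are invented for illustration:

```python
# Toy travel-time measurement by cross-correlation; the real HMI
# pipeline fits cross-covariance functions, this shows only the core idea.
import numpy as np

def travel_time_lag(sig_a, sig_b):
    """Return the lag (in samples) by which sig_b trails sig_a,
    taken from the peak of their cross-correlation."""
    n = len(sig_a)
    cc = np.correlate(sig_a, sig_b, mode="full")
    # for equal-length inputs, index n-1 is zero lag;
    # a positive result means sig_b is delayed relative to sig_a
    return (n - 1) - int(np.argmax(cc))

t = np.arange(256)
pulse = np.exp(-0.5 * ((t - 60) / 8.0) ** 2)    # wave packet at t = 60
delayed = np.exp(-0.5 * ((t - 75) / 8.0) ** 2)  # same packet, 15 samples later
```

Here `travel_time_lag(pulse, delayed)` recovers the 15-sample delay; converting samples to seconds requires the observing cadence.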
Analyzing and Developing Aspects of the Artist Pipeline for Clemson University Art
Major digital production facilities such as Sony Pictures Imageworks, Pixar Animation Studios, Walt Disney Animation Studios, and Epic Games use a production system called a pipeline. The term “pipeline” refers to the structure and process of data flow between the various phases of production, from story to final edit. This paper examines current production pipeline practices in the Digital Production Arts program at Clemson University and proposes updates and modifications to the workflow. Additionally, this thesis suggests tools intended to improve the pipeline with artist-friendly interfaces, customizable integration between software, and remote-production capabilities.
A new approach to the optimization of the extraction of astrometric and photometric information from multi-wavelength images in cosmological fields
This paper describes a new approach to the optimization of information
extraction in multi-wavelength image cubes of cosmological fields. The
objective is to create a framework for the automatic identification and tagging
of sources according to various criteria (isolated source, partially
overlapped, fully overlapped, cross-matched, etc) and to set the basis for the
automatic production of the SEDs (spectral energy distributions) for all
objects detected in the many multi-wavelength images in cosmological fields. In
order to do so, a processing pipeline is designed that combines Voronoi
tessellation, Bayesian cross-matching, and active contours to create a
graph-based representation of the cross-match probabilities. This pipeline
produces a set of SEDs with quality tags suitable for the application of
already-proven data mining methods. The pipeline briefly described here is also
applicable to other astrophysical scenarios such as star-forming regions.
Comment: GREAT Workshop. This paper will be published by Springer as part of
the proceedings for the GREAT Workshop
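The cross-matching step above can be illustrated with a toy positional match score: under a Gaussian positional-error model, closer counterparts get higher match likelihood, and a threshold turns the score into a tag. The error model, threshold value, and tag names are assumptions for illustration, not the pipeline's actual criteria:

```python
# Illustrative positional cross-match scoring, loosely in the spirit of
# the Bayesian cross-matching step described above. The Gaussian error
# model and the 0.6 threshold are assumptions, not the paper's method.
import math

def match_probability(d_arcsec, sigma_arcsec):
    """Unnormalised Gaussian match likelihood for a separation d."""
    return math.exp(-0.5 * (d_arcsec / sigma_arcsec) ** 2)

def tag_source(d_arcsec, sigma_arcsec, threshold=0.6):
    """Tag a candidate pair as 'cross-matched' or 'unmatched'."""
    p = match_probability(d_arcsec, sigma_arcsec)
    return "cross-matched" if p >= threshold else "unmatched"
```

In the real pipeline such pairwise scores would become edge weights in the graph-based representation of cross-match probabilities.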
THE EXPLOSION EFFECT: A Custom Volume FX Production Pipeline
Explosions are very important elements in film FX production. This paper introduces a custom volume FX production pipeline and shows how to use it to produce explosion FX. A detailed explanation is given of the pipeline's artistic and technical aspects.
Order fulfillment in high variety production environments
Providing high levels of product variety and product customization is challenging for many companies. This paper presents a new classification of the production and order fulfillment approaches available to manufacturing companies that offer high variety and/or product customization. Six categories of approaches are identified and described. An important emerging approach, open pipeline planning, is highlighted for high-variety manufacturing environments. It allows a customer order to be fulfilled from anywhere in the system, enabling greater responsiveness in Build-to-Forecast systems. The links between the open pipeline approach, decoupling concepts, and postponement strategies are discussed, and the relevance of the approach to the volume automotive sector is highlighted. Results from a simulation study are presented, illustrating the potential benefits when products can be reconfigured in an open pipeline system. The application of open pipeline concepts to different manufacturing domains is discussed, and the operating characteristics of most relevance are highlighted. Beyond automotive, sectors such as machinery and instrumentation, computer servers, telecommunications, and electronic equipment may benefit from an open pipeline planning approach. When properly designed, these systems can significantly enhance order fulfillment performance.
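The core of open-pipeline fulfilment, matching a customer order to any partially built unit in the system, can be sketched as below. The data layout, feature names, and reconfiguration-count cost model are invented for illustration, not taken from the paper:

```python
# Toy open-pipeline fulfilment: an order may be satisfied by any unit
# already in the pipeline, choosing the one that needs the fewest
# feature reconfigurations. Specs and costs are illustrative assumptions.

def reconfig_cost(unit_spec, order_spec):
    """Number of features that must be changed to match the order."""
    return sum(1 for k in order_spec if unit_spec.get(k) != order_spec[k])

def fulfil(order_spec, pipeline):
    """Pick the in-pipeline unit with minimal reconfiguration cost."""
    return min(pipeline, key=lambda u: reconfig_cost(u, order_spec))

pipeline = [
    {"colour": "red", "engine": "1.6", "trim": "base"},
    {"colour": "blue", "engine": "2.0", "trim": "sport"},
]
order = {"colour": "blue", "engine": "2.0", "trim": "base"}
best = fulfil(order, pipeline)  # the blue 2.0 unit: only the trim differs
```

A real implementation would also weight reconfiguration costs by stage of completion, since late-stage changes are typically more expensive.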
Capacity Assessment Of The System Of Gas Pipelines, Receiving And Transporting Gas Of Inland Production
Today, the majority of gas fields in Ukraine are in the final stages of development, characterized by a significant decrease in wellhead pressure and an increased gas-water factor. As wellhead pressure falls, the problem arises of ensuring the design capacity of the gas production system as a whole. The main function of the gas pipeline system of Ukraine's gas-producing company is to collect gas from the deposits and transport natural gas to consumers. Given the goal of ensuring Ukraine's energy independence, as well as the program to increase domestic gas production, assessing the capacity of the gas pipeline system that performs the collection and transportation function remains a relevant question. As part of the research, the current state of the gas collection and transportation system is analyzed, and the workload of pipeline sections along the chain from wellhead to consumer is investigated. It is established that the initial sections of the gas production system are fully loaded. Sections that can take additional flow are identified; redistributing gas to them will reduce the output pressure at the wellheads and stabilize hydrocarbon production. The research shows that one alternative method of increasing the capacity of the initial sections of the gas production system is to increase the equivalent diameter and length of the system by building new gas pipelines.
It is also found that periodic cleaning of existing pipeline sections prevents a decrease in capacity. Reducing the backpressure of the system is possible only in conjunction with unloading the system: changing flow directions, creating centralized gas collection points, and retrofitting existing booster compressor stations. Data on the load of the gas transmission system will allow the gas-producing company to plan the distribution of gas to sections with available free capacity while increasing its own gas production. As a result, distributing gas to partially loaded sections prevents excessive pressure losses in the system and provides optimal operating conditions.
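The effect of building a parallel line on section capacity can be sketched with a Weymouth-type flow relation: parallel lines between the same nodes add their capacities, which is equivalent to raising the section's equivalent diameter. The constant and units below are schematic placeholders, not design values:

```python
# Schematic Weymouth-type capacity estimate showing why looping a
# section (adding a parallel line) raises its equivalent capacity.
# The constant c and all units are placeholders, not design values.

def weymouth_flow(d, length, p1, p2, c=1.0):
    """Flow through one line: Q = c * d^(8/3) * sqrt((p1^2 - p2^2) / L)."""
    return c * d ** (8.0 / 3.0) * ((p1 ** 2 - p2 ** 2) / length) ** 0.5

def looped_flow(d_main, d_loop, length, p1, p2, c=1.0):
    """Parallel lines between the same two nodes: capacities add."""
    return (weymouth_flow(d_main, length, p1, p2, c)
            + weymouth_flow(d_loop, length, p1, p2, c))
```

Looping with a line of equal diameter doubles the section capacity at the same pressure drop, or, equivalently, allows the same flow at lower wellhead backpressure.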
Prototyping Virtual Data Technologies in ATLAS Data Challenge 1 Production
For efficiency of the large production tasks distributed worldwide, it is
essential to provide shared production management tools comprised of
integratable and interoperable services. To enhance the ATLAS DC1 production
toolkit, we introduced and tested a Virtual Data services component. For each
major data transformation step identified in the ATLAS data processing pipeline
(event generation, detector simulation, background pile-up and digitization,
etc) the Virtual Data Cookbook (VDC) catalogue encapsulates the specific data
transformation knowledge and the validated parameters settings that must be
provided before the data transformation invocation. To provide for local-remote
transparency during DC1 production, the VDC database server delivered in a
controlled way both the validated production parameters and the templated
production recipes for thousands of the event generation and detector
simulation jobs around the world, simplifying the production management
solutions.
Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics
(CHEP03), La Jolla, CA, USA, March 2003, 5 pages, 3 figures, PDF. PSN TUCP01
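A cookbook-style catalogue of the kind described above can be sketched as a mapping from each transformation step to its validated parameters and a templated recipe. The step names, parameters, and interface below are a simplified invention, not the real VDC schema:

```python
# Simplified sketch of a cookbook-style catalogue: each transformation
# step maps to a validated parameter set and a templated recipe.
# Step names and parameters are illustrative, not the actual VDC schema.

COOKBOOK = {
    "event-generation": {
        "recipe": "generate --events {n_events} --seed {seed}",
        "params": {"n_events": 1000, "seed": 42},
    },
    "detector-simulation": {
        "recipe": "simulate --input {input} --geometry {geometry}",
        "params": {"input": "evgen.root", "geometry": "ATLAS-DC1"},
    },
}

def render_recipe(step, overrides=None):
    """Fill a templated recipe with validated parameters, plus any
    per-job overrides (e.g. a different random seed per job)."""
    entry = COOKBOOK[step]
    params = {**entry["params"], **(overrides or {})}
    return entry["recipe"].format(**params)

cmd = render_recipe("event-generation", {"seed": 7})
```

Serving such validated recipes from a central database, rather than hard-coding them at each site, is what provides the local-remote transparency mentioned above.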
Decision Support System for Design and Evaluation of Pipeline Projects
Petronet India Limited (PIL) was created to give impetus to investments in pipeline projects for transportation of petroleum products in the country. Since these projects have a long life and require large investments, correct assessment of location, capacity, and financial viability is of critical importance. This paper is based on a study undertaken for PIL to evaluate several of their pipeline projects. The study resulted in the creation of a comprehensive software package capable of operational and financial evaluation of pipeline projects based on a countrywide view of production and distribution of petroleum products. The core of the package is an LP-based optimization model. The package can perform sensitivity analysis to investigate the impact of uncertainty on the proposed project due to changes in the values of key factors, including the distribution network and capacities, refining capacities, and the pattern of demand.
• A model is developed for identification of viable pipeline projects, taking into account demand and future capacity additions to the production and distribution network for petroleum products.
• The model can be used for financial evaluation of such projects, based on appropriate assumptions, to forecast the investments required as well as the net cash flows from the project.
• The solution procedure is implemented in the form of a software package that allows the decision maker to experiment with assumptions and generate solutions with ease and little manual intervention.
• The software package is further embellished to provide additional information to the decision maker in the form of reports detailing the movement of products and the mode combinations used.
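An LP-based movement model of this kind reduces, at its smallest, to a transportation problem: ship product from refineries to demand centres at minimum cost subject to supply and demand constraints. The sketch below uses SciPy's `linprog` (assumed available); the supplies, demands, and costs are made-up toy numbers, and the real package's model is far richer:

```python
# Minimal LP sketch of a product-movement model: ship from refineries
# to demand centres at minimum cost. All numbers are illustrative.
from scipy.optimize import linprog

# decision variables: x11, x12, x21, x22 = tonnage from refinery i to centre j
cost = [1.0, 2.0, 3.0, 1.0]       # unit transport costs c11, c12, c21, c22
A_ub = [[1, 1, 0, 0],             # refinery 1 supply <= 10
        [0, 0, 1, 1]]             # refinery 2 supply <= 20
b_ub = [10, 20]
A_eq = [[1, 0, 1, 0],             # centre 1 demand = 15
        [0, 1, 0, 1]]             # centre 2 demand = 15
b_eq = [15, 15]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq)
# res.x holds the optimal flows; res.fun the minimum total transport cost
```

Sensitivity analysis of the kind described above amounts to re-solving this LP with perturbed capacities, costs, or demands and comparing the resulting objective values.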
An Open-Source and Portable MLOps Pipeline for Continuous Training and Continuous Deployment
Machine Learning Operations (MLOps), derived from DevOps, aims to unify the development, deployment, and maintenance of machine learning (ML) models. Continuous training (CT) automatically retrains ML models, and continuous deployment (CD) automatically deploys the retrained models to production. Therefore, they are essential for maintaining ML model performance in dynamic production environments. The existing proprietary solutions suffer from drawbacks such as a lack of transparency and potential vendor lock-in. Additionally, current MLOps pipelines built using open-source tools still lack flexible CT and CD for ML models. This study proposes a cloud-agnostic and open-source MLOps pipeline that enables users to retrain and redeploy their ML models flexibly. We applied the Design Science methodology, consisting of identifying the problem, defining the solution objectives, and implementing, demonstrating, and evaluating the solution.
The resulting solution is an MLOps pipeline called the CTCD-e MLOps pipeline. We formed a conceptual model of the needed functionalities of our MLOps pipeline and implemented the pipeline using only open-source tools. The CTCD-e MLOps pipeline runs atop Kubernetes. It can autonomously adapt ML models to dynamic production data by automatically starting to retrain ML models when their performance degrades. It can also automatically A/B test the performance of the retrained models in production, fully deploying them only when they outperform their predecessors. Our demonstration and evaluation of the CTCD-e MLOps pipeline show that it is cloud-agnostic and can also be installed in on-premises environments. Additionally, the CTCD-e MLOps pipeline enables its users to flexibly configure model retraining and redeployment, as well as production A/B testing of the retrained models, based on various requirements.
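The two decisions at the heart of the CT/CD loop described above can be sketched as small predicates: trigger retraining when the live metric degrades past a tolerance, and promote the retrained challenger only when it beats the incumbent in the A/B test. The metric semantics (higher is better) and threshold values are assumptions, not the paper's configuration:

```python
# Sketch of the CT/CD decisions described above. Assumes a metric where
# higher is better; the tolerance value is an illustrative assumption.

def should_retrain(live_metric, baseline_metric, tolerance=0.05):
    """Trigger retraining when live performance drops more than
    `tolerance` below the baseline recorded at deployment time."""
    return live_metric < baseline_metric - tolerance

def should_promote(challenger_metric, incumbent_metric):
    """Fully deploy the retrained model only when its production A/B
    metric beats the currently deployed incumbent."""
    return challenger_metric > incumbent_metric
```

In the pipeline itself these checks would run automatically against monitored production metrics rather than being called by hand.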