
    Synchronized sweep algorithms for scalable scheduling constraints

    This report introduces a family of synchronized sweep-based filtering algorithms for handling scheduling problems involving resource and precedence constraints. The key idea is to filter all constraints of a scheduling problem in a synchronized way in order to scale better. In addition to the normal filtering mode, the algorithms can run in a greedy mode, in which case they perform a greedy assignment of start and end times. The filtering mode achieves a significant speed-up over the decomposition into independent cumulative and precedence constraints, while the greedy mode can handle up to 1 million tasks with 64 resource constraints and 2 million precedences. These algorithms were implemented in both CHOCO and SICStus.
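The greedy mode described above can be illustrated with a small sketch: assign each task the earliest start time that respects both its precedences and a single cumulative resource capacity, sweeping forward over a time-indexed usage profile. This is an illustration of the general idea only, not the paper's actual algorithm; all names and the time-indexed profile are assumptions.

```python
# Hypothetical sketch of a greedy start-time assignment under one
# cumulative resource and precedence constraints.
def greedy_schedule(tasks, precedences, capacity, horizon):
    """tasks: {name: (duration, demand)}; precedences: (a, b) pairs, a before b."""
    preds = {t: [] for t in tasks}
    for a, b in precedences:
        preds[b].append(a)
    usage = [0] * horizon          # resource consumption per time unit
    start, end = {}, {}

    def schedule(t):
        if t in start:
            return
        for p in preds[t]:          # schedule all predecessors first
            schedule(p)
        dur, dem = tasks[t]
        s = max((end[p] for p in preds[t]), default=0)
        # sweep forward until the demand fits under capacity everywhere
        while any(usage[u] + dem > capacity for u in range(s, s + dur)):
            s += 1
        for u in range(s, s + dur):
            usage[u] += dem
        start[t], end[t] = s, s + dur

    for t in tasks:
        schedule(t)
    return start

tasks = {"a": (2, 2), "b": (3, 2), "c": (1, 3)}
plan = greedy_schedule(tasks, [("a", "c")], capacity=3, horizon=10)
# "c" must wait for "a" and for enough free capacity
```

Because each task is placed once and never revisited, this kind of pass runs in time roughly linear in tasks and time points, which is what makes a greedy mode attractive at the million-task scale reported above.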

    Self-decomposable Global Constraints

    Scalability is becoming increasingly critical to decision support technologies. In order to address this issue in Constraint Programming, we introduce the family of self-decomposable constraints. These constraints can be satisfied by applying their own filtering algorithms to variable subsets only. We introduce a generic framework which dynamically decomposes propagation by filtering over variable subsets. Our experiments with the CUMULATIVE constraint illustrate the practical relevance of self-decomposition.
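The idea of filtering over variable subsets can be made concrete with a toy example: a bounds-filtering routine for a "sum of variables ≤ capacity" constraint that, instead of always touching every variable, is applied to a chosen subset only. This is an assumed, simplified stand-in for the paper's framework, not its CUMULATIVE implementation.

```python
# Toy self-decomposition sketch: the constraint's own filtering is run
# on a subset of its variables rather than the full variable set.
def filter_sum_le(domains, cap, subset=None):
    """domains: {var: (lo, hi)}; tighten upper bounds so sum(vars) <= cap,
    but only for the variables listed in `subset`."""
    vars_ = subset or list(domains)
    lo_total = sum(lo for lo, _ in domains.values())
    out = dict(domains)
    for v in vars_:
        lo, hi = domains[v]
        slack = cap - (lo_total - lo)      # max value v can still take
        out[v] = (lo, min(hi, slack))
    return out

doms = {"x": (0, 9), "y": (2, 9), "z": (3, 9)}
pruned = filter_sum_le(doms, cap=10, subset=["x"])   # filter only x
# x's upper bound tightens to 10 - (2 + 3) = 5; y and z are untouched
```

Restricting propagation to the subset whose domains actually changed is what lets such a scheme scale: the cost of one filtering call no longer grows with the full arity of the constraint.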

    Improved Well Boundary Conditions: Automated Adaptation of Numerical Well Controls in Reservoir Simulation Models

    Wells in reservoir simulation models are conventionally set using constant boundary conditions, which results in producers being shut in due to high water or gas production. In actual field operations, well flow rates and pressures are adjusted to control high water and gas production. This thesis introduces a novel yet simple method that automates the adaptation of well conditions to the dynamic wellbore or near-wellbore (reservoir) performance. The adaptive-conditions algorithm was incorporated into a purpose-built 3D, three-phase, black-oil reservoir simulator. Several 1D wellbore model candidates were compared against 3D computational fluid dynamics models in predicting published two-phase pipe flow results. Improvement in oil recovery, an increase in well operational lifetime, and a reduction in produced water and gas were all observed when using adaptive well controls. These gains support a more positive outlook when assessing business plans, compared with the cost associated with early abandonment of wells. One of the many advantages of this methodology is the reduction in the number of optimization variables due to the elimination of rate control steps. Methods of maximizing reservoir net present value via production rate optimization are limited: the optimization problem requires setting the variables beforehand, which restricts solutions to a predefined number of rate changes at exact, specific times. Integrating adaptive well controls into an optimization study increased convergence rates and enhanced the optimized solution, because the well rates can automatically adapt to reservoir and well performance, giving effectively unbounded access to rate changes that would have been numerically expensive to include in rate optimization setups. Comparison of convergence times showed that optimization runs using adaptive well conditions converged earlier than the base case. Furthermore, the adaptive case required less than half the number of generations to produce an improved maximum NPV compared to the base case. Several studies were performed using a three-phase reservoir model with six production wells and seven water injectors. In all optimization cases, the maximum NPV of the respective base model was consistently lower than the NPV of the first generation for all the different adaptive-rate models.
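The contrast between constant and adaptive well conditions can be sketched as a single control step: under constant boundary conditions a producer exceeding a water-cut limit is simply shut in, whereas an adaptive control reduces the rate first and only shuts in as a last resort. The function, thresholds, and parameter names below are illustrative assumptions, not the thesis's actual algorithm.

```python
# Illustrative sketch (assumed, not the thesis's method) of adapting a
# producer's control instead of shutting it in on high water cut.
def adapt_well_control(target_rate, water_cut,
                       wc_limit=0.9, cut_factor=0.5, min_rate=10.0):
    """Return (new_rate, shut_in) for one control step."""
    if water_cut <= wc_limit:
        return target_rate, False          # constant-rate control still fine
    reduced = target_rate * cut_factor     # adapt the rate instead of shutting in
    if reduced < min_rate:
        return 0.0, True                   # shut in only as a last resort
    return reduced, False

rate, shut = adapt_well_control(200.0, water_cut=0.95)
# the well keeps producing at a reduced rate rather than being shut in
```

Embedding such a rule inside the simulator's time-stepping is what removes explicit rate-change variables from the optimization problem: the controls respond to simulated performance rather than being enumerated in advance.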

    Aeronautical Engineering: A special bibliography with indexes, supplement 55

    This bibliography lists 260 reports, articles, and other documents introduced into the NASA scientific and technical information system in February 1975.

    3D city scale reconstruction using wide area motion imagery

    3D reconstruction is one of the most challenging yet most essential areas of computer vision, with applications ranging from remote sensing to medical imaging and multimedia. Wide Area Motion Imagery is a field that has gained traction in recent years. It consists of using an airborne, large-field-of-view sensor that typically covers over a square kilometer in each captured image. This data is particularly valuable for analysis, but the amount of information is overwhelming for any human analyst. Algorithms to efficiently and automatically extract information are therefore needed, and 3D reconstruction plays a critical part in this, along with detection and tracking. This dissertation presents novel reconstruction algorithms to compute a 3D probabilistic space, a set of experiments to efficiently extract photo-realistic 3D point clouds, and a range of transformations for possible applications of the generated 3D data to filtering, data compression, and mapping. The algorithms have been successfully tested on our own datasets provided by Transparent Sky, and this thesis also proposes methods to evaluate accuracy, completeness, and photo-consistency. The generated data has been successfully used to improve detection and tracking performance, and it enables data compression, extrapolation by generating synthetic images from new points of view, and data augmentation with the inferred occlusion areas.

    Consistent Density Scanning and Information Extraction From Point Clouds of Building Interiors

    Over the last decade, 3D range scanning systems have improved considerably, enabling designers to capture large and complex domains such as building interiors. The captured point cloud is processed to extract specific Building Information Models, where the main research challenge is to simultaneously handle huge and cohesive point clouds representing multiple objects, occluded features, and vast geometric diversity. These domain characteristics increase the data complexity and thus make it difficult to extract accurate information models from the captured point clouds. The research work presented in this thesis improves the information extraction pipeline with the development of novel algorithms for consistent-density scanning and automated information extraction for building interiors. A restricted density-based scan planning methodology computes the number of scans needed to cover large linear domains while ensuring the desired data density and reducing rigorous post-processing of the data sets. The research further develops effective algorithms to transform the captured data into information models in terms of domain features (layouts), meaningful data clusters (segmented data), and specific shape attributes (occluded boundaries) with better practical utility. Initially, a direct point-based simplification and layout extraction algorithm is presented that can handle cohesive point clouds through adaptive simplification and an accurate layout extraction approach without generating an intermediate model. Further, three information extraction algorithms are presented that transform point clouds into meaningful clusters. The novelty of these algorithms lies in the fact that they work directly on point clouds by exploiting their inherent characteristics. First, a rapid data clustering algorithm is presented to quickly identify objects in the scanned scene using a robust hue, saturation, and value (HSV) color model for better scene understanding. A hierarchical clustering algorithm is then developed to handle the vast geometric diversity, ranging from planar walls to complex freeform objects. The shape-adaptive parameters help to segment planar as well as complex interiors, whereas combining color- and geometry-based segmentation criteria improves clustering reliability and identifies unique clusters within geometrically similar regions. Finally, a progressive scan-line-based, side-ratio-constraint algorithm is presented to identify occluded boundary data points by investigating their spatial discontinuity.
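The appeal of an HSV color model for rapid clustering can be shown with a minimal sketch: hue is more stable under illumination changes than raw RGB, so grouping colored points by quantized hue gives a quick first segmentation. The point format, bucket width, and function name below are assumptions for illustration, not the thesis's actual algorithm.

```python
# Minimal sketch: quick grouping of colored 3D points by quantized hue.
import colorsys
from collections import defaultdict

def cluster_by_hue(points, hue_buckets=12):
    """points: list of (x, y, z, r, g, b) with r, g, b in 0..255."""
    clusters = defaultdict(list)
    for x, y, z, r, g, b in points:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        # quantize the hue circle into a fixed number of buckets
        clusters[int(h * hue_buckets) % hue_buckets].append((x, y, z))
    return dict(clusters)

points = [(0, 0, 0, 255, 0, 0),    # red
          (1, 0, 0, 250, 10, 5),   # near-red: same hue bucket as red
          (0, 1, 0, 0, 0, 255)]    # blue
groups = cluster_by_hue(points)
```

A pass like this is linear in the number of points, which is why a color-based first cut is attractive before the more expensive geometry-based hierarchical clustering described above.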

    Shuttle Ku-band and S-band communications implementation study

    Various aspects of the shuttle orbiter S-band network communication system, the S-band payload communication system, and the Ku-band communication system are considered. A method is proposed for obtaining more accurate S-band antenna patterns of the actual shuttle orbiter vehicle during flight, because the preliminary antenna patterns obtained using mock-ups are unrealistic in that they do not include the effects of additional appendages such as wings and tail structures. The Ku-band communication system is discussed, especially the TDRS antenna pointing accuracy with respect to the orbiter, along with the modifications required, and the resulting performance characteristics, for the convolutionally encoded high-data-rate return link to maintain bit synchronizer lock on the ground. The TDRS user constraints on data bit clock jitter and data asymmetry for unbalanced QPSK with noisy phase references are included. The S-band payload communication system study is outlined, including the advantages and experimental results of a peak regulator design built and evaluated by Axiomatrix for the bent-pipe link versus the existing RMS-type regulator. The nominal sweep rate of 250 Hz/s for the deep-space transponder and the effects of phase noise on communication system performance are analyzed.