12 research outputs found

    Analysis of cuttings concentration experimental data using exploratory data analysis

    Cuttings transportation is a complex phenomenon involving many interacting variables. Experimental investigations of cuttings transport have been carried out by different research groups for decades, and the varying findings reported point to the need for a methodical data analysis approach. In the current paper, six experimental datasets (702 observations) are analyzed using exploratory data analysis (EDA) in a two-fold manner: univariate and multivariate analysis. Univariate analysis shows the asymmetry in the distribution of each experimental parameter, indicating the need for a nonparametric modeling approach. Multivariate analysis shows the interaction of the experimental parameters among themselves and their influence on downhole cuttings concentration (Cc) using 6D scatter plots and correlation coefficients (Kendall's τ). EDA of the current experimental data reveals the following major findings:
    • Smaller Cc in concentric vertical wells compared to concentric non-vertical wells.
    • Drilling fluid flow rate is a dominant operational parameter in vertical wellbore cleaning, while drill string rotation (RPM) is dominant in non-vertical wellbore cleaning.
    • Little impact of RPM on the cleaning of concentric vertical wells and of negatively eccentric deviated/highly deviated wells. However, RPM together with drilling fluid flow rate provides better cleaning of non-vertical wells with positive eccentricity.
    • RPM has a higher influence on cuttings transport in a narrow annulus than in a wide annulus.
    • Assuming a drilling fluid of sufficient viscosity and drill string rotation present, a low-viscosity fluid under turbulent flow and a high-viscosity fluid under laminar flow provide better hole cleaning. Further, Kendall's τ indicates that apparent viscosity plays a more significant role in cleaning deviated wellbores than at other inclinations for the current dataset.
    • Drilling fluid flow rate has more influence on the transport of heavier and larger cuttings, while RPM has more influence on the transport of lighter and smaller cuttings.
    • Better hole cleaning by heavier drilling fluids than by lighter fluids.
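To make the correlation analysis described above concrete, Kendall's τ between one operational parameter and cuttings concentration can be computed as sketched below. The data here are synthetic stand-ins for illustration only, not the paper's 702 observations.

```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(0)

# hypothetical example: cuttings concentration decreasing with flow rate
flow_rate = rng.uniform(200.0, 800.0, 50)                 # flow rate (L/min)
cc = 20.0 - 0.02 * flow_rate + rng.normal(0.0, 1.0, 50)   # cuttings concentration (%)

# Kendall's tau is rank-based and needs no distributional assumptions,
# which matches the nonparametric approach suggested by the univariate EDA
tau, p_value = kendalltau(flow_rate, cc)
print(f"Kendall's tau = {tau:.2f} (p = {p_value:.2g})")
```

A strongly negative τ here would indicate that higher flow rates accompany lower downhole cuttings concentration, i.e. better hole cleaning.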

    Rheological characterization of Polyanionic Cellulose solutions with application to drilling fluids and cuttings transport modeling

    In petroleum drilling, aqueous Polyanionic Cellulose (PAC) solutions are often used as a drilling fluid model system in experimental laboratory studies of cuttings transport. Cuttings transport refers to the transportation of drilled-off solids out of the wellbore. In these studies, PAC solutions are typically assumed to be purely viscous, i.e. to show no time-dependent/thixotropic or viscoelastic properties. In this study, a rheological characterization of PAC has been performed in combination with an evaluation of the time scales characterizing the fluid, to verify the conventional assumption of a purely viscous fluid. It is found that PAC solutions are generally not purely viscous; because of their polymeric microstructure, they feature viscoelastic behavior, such as normal stress differences, on time scales of the order of 0.01 to 1 s, as well as thixotropic behavior on larger time scales of the order of 10 to 1000 s. If the fluid is simplified to a purely viscous one, the uncertainty in representing the measured apparent shear viscosity may increase by ≈ 75 to 90%, depending on the relevant time scale. When obtaining flow curves, a sufficiently long measurement point duration (the sample time for a particular torque reading) is required to ensure that the liquid microstructure has reached its dynamic equilibrium at the desired shear rate. Due to their polymeric nature, PAC solutions feature Newtonian viscosity plateaus at both low and high shear rates. For modeling purposes, the application of a Cross/Carreau material function is recommended because it both best describes the flow curve data and minimizes extrapolation errors compared to the conventionally used Power Law material function.
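The recommended Cross material function can be fitted to flow-curve data along the following lines. This is a minimal sketch: the parameter values and the noisy "measurements" are illustrative assumptions, not the paper's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def cross_model(shear_rate, eta0, eta_inf, lam, n):
    """Cross material function: Newtonian plateaus eta0 (low shear) and
    eta_inf (high shear), time constant lam, exponent n."""
    return eta_inf + (eta0 - eta_inf) / (1.0 + (lam * shear_rate) ** n)

# hypothetical flow-curve data (shear rate in 1/s, viscosity in Pa*s)
shear = np.logspace(-2, 3, 30)
rng = np.random.default_rng(1)
measured = cross_model(shear, 0.8, 0.003, 2.0, 0.7) \
    * (1.0 + 0.02 * rng.standard_normal(shear.size))

# least-squares fit; positive bounds keep the fractional power well defined
popt, _ = curve_fit(cross_model, shear, measured,
                    p0=[1.0, 0.001, 1.0, 0.5],
                    bounds=(1e-6, [10.0, 1.0, 100.0, 2.0]))
print("fitted [eta0, eta_inf, lambda, n]:", popt)
```

Unlike a Power Law fit, this form reproduces both Newtonian plateaus, which limits extrapolation error outside the measured shear-rate range.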

    Requirement Engineering for a Small Project with Pre-Specified Scope

    This paper describes the requirements engineering process developed for the EU-funded project Analysis of MassIve DataSTreams (AMIDST). The process adopts a use case based approach to requirements engineering (RE) that is tailored to the specific characteristics of the AMIDST project. These characteristics include a relatively small group of developers, a pre-defined project scope, stakeholders from different industries, and the development of a general software framework that can be instantiated according to the needs of the stakeholders. The resulting methodology is sufficiently general to be of relevance for similar development projects.

    New Doppler-Based Imaging Methods in Echocardiography with Applications in Blood/Tissue Segmentation

    Part 1: The bandwidth of the ultrasound Doppler signal is proposed as a classification function for blood and tissue signal in transthoracic echocardiography of the left ventricle. The new echocardiographic mode, Bandwidth Imaging, utilizes the difference in motion between tissue and blood. Specifically, Bandwidth Imaging is the absolute value of the normalized autocorrelation function at lag one, and is therefore linearly dependent on the square of the bandwidth estimated from the Doppler spectrum. A 2-tap Finite Impulse Response high-pass filter is used prior to the autocorrelation calculation to account for the high level of DC clutter noise in the apical regions. Reasonable pulse strategies are discussed and several Bandwidth Imaging images are included. An in vivo experiment is presented in which the apparent error rate of Bandwidth Imaging is compared with that of Second-Harmonic Imaging on 15 healthy men. The apparent error rate is calculated from the signal in all myocardial wall segments defined in [Cer02]. The ground truth for the position of the myocardial wall segments is determined by manual tracing of the endocardium in Second-Harmonic Imaging. A hypothesis test of Bandwidth Imaging having a lower apparent error rate than Second-Harmonic Imaging is proved for a p-value of 0.94 in 3 segments at end diastole and 1 segment at end systole on non-averaged data. When the data are averaged with a structural element of 5 radial, 3 lateral and 4 temporal samples, the number of segments increases to 9 at end diastole and 6 at end systole. These segments are mostly located in the apical and anterior wall regions. Further, a global measure GM is defined as the proportion of misclassified area in the regions close to the endocardium in an image. The hypothesis test of Second-Harmonic Imaging having lower GM than Bandwidth Imaging is proved for a p-value of 0.94 in the four-chamber view at end systole for any type of averaging.
On the other hand, the hypothesis test of Bandwidth Imaging having lower GM than Second-Harmonic Imaging is proved for a p-value of 0.94 in the long-axis view at end diastole for any type of averaging. Moreover, if the images are averaged with the above structural element, the test indicates that Bandwidth Imaging has a lower apparent error rate than Second-Harmonic Imaging in all views and times (end diastole or end systole), except in the four-chamber view at end systole. This experiment indicates that Bandwidth Imaging can supply additional information to automatic border detection routines for the endocardium. Part 2: Knowledge Based Imaging is suggested as a method to distinguish blood from tissue signal in transthoracic echocardiography. This method utilizes the maximum likelihood function to classify blood and tissue signal. Knowledge Based Imaging uses the same pulse strategy as Bandwidth Imaging, but is significantly more difficult to implement. Therefore, Knowledge Based Imaging and Bandwidth Imaging are compared with Fundamental Imaging in a computer simulation based on a parametric model of the signal. The apparent error rate is calculated for every reasonable tissue to blood signal ratio, tissue to white noise ratio and clutter to white noise ratio. Fundamental Imaging classifies well when the tissue to blood signal ratio is high and the tissue to white noise ratio is higher than the clutter to white noise ratio. Knowledge Based Imaging also classifies well in this environment. In addition, Knowledge Based Imaging classifies well whenever the blood to white noise ratio is above 30 dB. This is the case even when the clutter to white noise ratio is higher than the tissue to white noise ratio and the tissue to blood signal ratio is zero. Bandwidth Imaging performs similarly to Knowledge Based Imaging, but the blood to white noise ratio has to be 20 dB higher for a reasonable classification. The choice of high-pass filter coefficient prior to the Bandwidth Imaging calculation is also discussed on the basis of the simulations.
Some images for different parameter settings of Knowledge Based Imaging are visually compared with Second-Harmonic Imaging, Fundamental Imaging and Bandwidth Imaging. Changing the parameters of Knowledge Based Imaging can make the image look similar to both Bandwidth Imaging and Fundamental Imaging.
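The Bandwidth Imaging quantity described in Part 1 (the magnitude of the normalized lag-one autocorrelation, computed after a 2-tap high-pass clutter filter) can be sketched as follows. The synthetic slow-time signals and the `hp_coeff` parameter are illustrative assumptions, not the thesis' signal model.

```python
import numpy as np

def bandwidth_imaging(iq, hp_coeff=1.0):
    """|normalized autocorrelation| at lag one of the slow-time signal,
    after a 2-tap FIR high-pass filter that suppresses DC clutter."""
    x = iq[1:] - hp_coeff * iq[:-1]            # 2-tap high-pass clutter filter
    r0 = np.mean(np.abs(x) ** 2)               # lag-0 autocorrelation (power)
    r1 = np.mean(x[1:] * np.conj(x[:-1]))      # lag-1 autocorrelation
    return np.abs(r1) / r0

rng = np.random.default_rng(2)
n = 64
# narrowband "tissue": slow, coherent motion -> value near 1
tissue = np.exp(1j * 2.0 * np.pi * 0.01 * np.arange(n))
# broadband "blood": decorrelated scatterers -> markedly lower value
blood = rng.standard_normal(n) + 1j * rng.standard_normal(n)
print(bandwidth_imaging(tissue), bandwidth_imaging(blood))
```

The gap between the two values is what the classification function exploits: a narrow Doppler bandwidth (tissue) keeps the lag-one correlation high, while broadband blood signal lowers it.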

    Multi-dimensional semi-analytical model for axial stick–slip of a rod sliding on a surface with Coulomb friction

    A multi-dimensional lumped element model of a long non-rotating rod that moves on a slick surface with both dynamic and static Coulomb friction is outlined. The rod is accelerated to a constant velocity, and the free end of the rod experiences the effect of stick and slip. This article describes a new modeling approach, in which the model switches between different linear semi-analytical sub-models depending on how much of the rod is moving. Fundamental understanding of the stick–slip effect is revealed, and a potential shortcoming of the model is also discussed. The model is computationally effective and may be suitable for real-time applications in, for instance, oil-well drilling.
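The stick/slip switching described above can be illustrated with a single lumped element: one mass pulled through a spring whose far end moves at constant velocity, with static and dynamic Coulomb friction. This is a much-reduced sketch of the mechanism, not the article's multi-dimensional semi-analytical model, and all parameter values are arbitrary normalized assumptions.

```python
import numpy as np

def stick_slip(t_end=60.0, dt=1e-3, k=1.0, m=1.0, v_drive=0.1,
               f_static=0.6, f_dynamic=0.4):
    """One mass dragged via a spring by a constant-velocity drive point,
    with Coulomb friction (all quantities in normalized units)."""
    n = int(t_end / dt)
    x, v = 0.0, 0.0
    xs = np.empty(n)
    for i in range(n):
        spring = k * (v_drive * i * dt - x)       # force from the driven end
        if v == 0.0 and abs(spring) <= f_static:
            pass                                  # stick: static friction holds
        else:
            direction = np.sign(v) if v != 0.0 else np.sign(spring)
            a = (spring - direction * f_dynamic) / m
            v_next = v + dt * a                   # slip: dynamic friction
            if v != 0.0 and v_next * v < 0.0:
                v_next = 0.0                      # velocity crossed zero: may re-stick
            v = v_next
        x += dt * v
        xs[i] = x
    return xs

xs = stick_slip()
print(f"final displacement: {xs[-1]:.3f}")
```

The mass stays stuck while the spring force builds up to the static friction limit, then slips forward under the lower dynamic friction and re-sticks, producing the characteristic sawtooth motion; the article's model couples many such elements and solves each regime semi-analytically instead of by time stepping.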

    Gibbs-like phenomenon inherent in a lumped element model of a rod

    The underlying assumption of a lumped element model is that a spatially distributed physical system can be approximated by a topology of discrete entities. The impact of this assumption is illustrated by a model of a finitely long elastic rod with uniform cross section. The model involves a cascade of masses and springs, where the boundary itself is driven by a step function. Previous authors have found closed-form solutions to related problems using the Laplace transform, while in this article closed-form solutions are obtained by eigenvalue decomposition, which also illuminates the extension to a rod with non-uniform cross section. The closed-form solution is compared to a closed-form solution of a distributed parameter model. Both solutions involve a sum of a forward and a backward moving wave that travels with the speed of sound. In the case of the distributed parameter model, these waves are perfect square waves, while in the case of the lumped element model, they are imperfect square waves subject to "Gibbs-like" ringing. Properties of this phenomenon are described. It is also shown that the phenomenon disappears when using a continuous step function and a model with sufficiently many elements.
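The eigenvalue-decomposition approach for a uniform mass-spring cascade driven by a step can be sketched as below. The unit scaling and the fixed-free boundary handling are simplifying assumptions for illustration, not the article's exact formulation.

```python
import numpy as np

def rod_step_response(n=40, t=0.5):
    """Displacements of n unit masses coupled by unit springs, left end
    attached to a boundary driven by a unit step, right end free."""
    # tridiagonal stiffness matrix of the cascade
    K = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    K[-1, -1] = 1.0                  # free end
    w2, V = np.linalg.eigh(K)        # squared eigenfrequencies, mode shapes
    f = np.zeros(n)
    f[0] = 1.0                       # step boundary enters via the first spring
    # modal step response from rest: static deflection minus cosine transients
    q = (V.T @ f) / w2 * (1.0 - np.cos(np.sqrt(w2) * t))
    return V @ q

u = rod_step_response()
print(u[:5])
```

Superposing the cosine modes reconstructs the traveling wave fronts; because the cascade has only finitely many modes, the fronts show the "Gibbs-like" ringing the article analyzes, which fades as `n` grows or the step is smoothed.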

    Automatic lithology prediction from well logging using kernel density estimation

    Technologies for real-time data measurement during drilling operations have attracted the attention of the petroleum industry in recent years, especially with the benefit of real-time formation evaluation through logging-while-drilling technology. It is expected that most logging data will be recorded during real-time operation; hence, an automated lithology prediction tool will be essential. An automatic method to predict lithology from borehole geophysical data was developed. The task was solved as a multivariate classification problem with multidimensional explanatory variables. The learning algorithm combines kernel density estimates with a classification rule based on these estimates. The goal of this work is to test the method on a univariate variable and validate the prediction accuracy by calculating the misclassification rates. In addition, the results will be established as a baseline for application in practice and for future development of multivariate analysis. Gamma-ray from wireline logging is selected as the variable to describe the two lithology groups of shale and non-shale. Data from six wells on the Norwegian Continental Shelf were extracted and examined with the aid of exploratory data analysis and hypothesis testing, and then divided into training and test data sets. The selected algorithm processed the training data into models, and each element of the test data was then assigned to the models to obtain a prediction. The results were validated against cuttings data, and the models were shown to predict the lithology effectively, with misclassification rates of less than 15% at best and ±31% on average. Moreover, the results confirmed that the method is a promising prospect as a lithology prediction tool, especially in real-time operation, because the non-parametric approach allows real-time modeling with fewer data assumptions.
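A minimal version of this kernel-density classification rule for a univariate gamma-ray variable can be sketched as follows. The API-unit distributions are synthetic assumptions rather than the six wells' data, and equal class priors are assumed.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)

# hypothetical gamma-ray training data (API units): shale reads high,
# non-shale (e.g. sandstone) reads low
shale = rng.normal(110.0, 15.0, 300)
non_shale = rng.normal(45.0, 12.0, 300)

# one kernel density estimate per lithology group
kde_shale = gaussian_kde(shale)
kde_non_shale = gaussian_kde(non_shale)

def predict(gamma_ray):
    """Assign a reading to the group with the larger estimated density
    (equal priors assumed)."""
    if kde_shale(gamma_ray)[0] > kde_non_shale(gamma_ray)[0]:
        return "shale"
    return "non-shale"

print(predict(100.0), predict(50.0))
```

Because the densities are estimated non-parametrically, no distributional form is imposed on the log response, which is what makes the approach attractive for real-time use with few data assumptions.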