12,114 research outputs found

    Microfocal X-Ray Computed Tomography Post-Processing Operations for Optimizing Reconstruction Volumes of Stented Arteries During 3D Computational Fluid Dynamics Modeling

    Restenosis caused by neointimal hyperplasia (NH) remains an important clinical problem after stent implantation. Restenosis varies with stent geometry, and idealized computational fluid dynamics (CFD) models have indicated that geometric properties of the implanted stent may differentially influence NH. However, 3D studies capturing the in vivo flow domain within stented vessels have not been conducted at a resolution sufficient to detect subtle alterations in vascular geometry caused by the stent and the subsequent temporal development of NH. We present the details and limitations of a series of post-processing operations used in conjunction with microfocal X-ray CT imaging and reconstruction to generate geometrically accurate flow domains within the localized region of a stent several weeks after implantation. Microfocal X-ray CT reconstruction volumes were subjected to an automated program that performs arterial thresholding, spatial orientation, and surface smoothing of stented and unstented rabbit iliac arteries several weeks after antegrade implantation. A transfer function was obtained for the current post-processing methodology applied to reconstructed 16 mm stents implanted in rabbit iliac arteries for up to 21 days after implantation and resolved at circumferential and axial resolutions of 32 and 50 μm, respectively. The results indicate that the techniques presented are sufficient to resolve distributions of wall shear stress (WSS) with 80% accuracy in segments containing 16 surface perturbations over a 16 mm stented region. These methods will be used to test the hypothesis that reductions in normalized WSS and increases in the spatial disparity of WSS immediately after stent implantation may spatially correlate with the temporal development of NH within the stented region.
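
    As an illustration of the kind of pipeline described above, the following Python sketch chains arterial thresholding, principal-axis orientation, and surface smoothing of a reconstructed volume. The threshold rule, axis-alignment step, and smoothing kernel are illustrative assumptions, not the authors' automated program.

```python
# Hypothetical sketch (assumed function names and parameters) of the kind of
# post-processing chain described above: threshold the reconstructed volume,
# find the vessel's principal axis, and smooth the segmented surface.
import numpy as np
from scipy import ndimage


def segment_lumen(volume, threshold=None):
    """Binary-threshold a reconstructed micro-CT volume to isolate the artery."""
    if threshold is None:
        # Simple global threshold from the intensity statistics (assumption).
        threshold = volume.mean() + volume.std()
    return volume > threshold


def principal_axis(mask):
    """Estimate the vessel axis as the principal component of the voxel cloud."""
    coords = np.argwhere(mask).astype(float)
    centered = coords - coords.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0]  # unit vector; the volume would then be rotated onto this axis


def smooth_surface(mask, sigma=1.0):
    """Gaussian smoothing of the binary mask before surface extraction."""
    return ndimage.gaussian_filter(mask.astype(float), sigma=sigma) > 0.5
```

    In practice, the smoothed binary mask would then be converted to a surface mesh to define the CFD flow domain within the stented segment.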

    Lateral conduction effects on heat-transfer data obtained with the phase-change paint technique

    A computerized tool, CAPE (Conduction Analysis Program using Eigenvalues), has been developed to account for lateral heat conduction in wind tunnel models during data reduction of the phase-change paint technique. The tool also accounts for the effects of finite thickness (thin wings) and surface curvature. A special reduction procedure using just one time of melt is also possible on leading edges. A novel iterative numerical scheme was used, with discretized spatial coordinates but analytic integration in time, to solve the inverse conduction problem involved in the data reduction. A yes-no chart is provided that tells the test engineer when the various corrections are large enough that CAPE should be used. The accuracy of the phase-change paint technique in the presence of finite thickness and lateral conduction is also investigated.
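
    The eigenvalue idea named above can be sketched as follows: discretize the conduction operator in space, then integrate the resulting linear ODE system analytically in time. The 1D slab geometry, boundary conditions, and parameters below are simplifying assumptions; CAPE's actual iteration and corrections are not reproduced.

```python
# Minimal sketch, assuming a 1D slab with fixed-temperature ends, of an
# eigenvalue-based conduction solver: discretize space with a second-difference
# Laplacian, then integrate dT/dt = alpha * L @ T exactly in the eigenbasis
# (no time-stepping error).
import numpy as np


def conduction_step(T0, alpha, dx, t):
    """Temperature field at time t for initial field T0 (thermal diffusivity alpha)."""
    n = T0.size
    L = (np.diag(-2.0 * np.ones(n)) +
         np.diag(np.ones(n - 1), 1) +
         np.diag(np.ones(n - 1), -1)) / dx**2
    lam, V = np.linalg.eigh(L)                # symmetric: orthonormal eigenvectors
    c = V.T @ T0                              # expand T0 in the eigenbasis
    return V @ (np.exp(alpha * lam * t) * c)  # analytic integration in time


# Example: 50-node slab, uniform initial temperature.
T = conduction_step(np.ones(50), alpha=1.0e-4, dx=1.0e-3, t=0.5)
```

    In a CAPE-style data reduction, a forward model of this kind would be inverted iteratively for the heat-transfer coefficient that reproduces the observed time of melt.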

    The Surface Laplacian Technique in EEG: Theory and Methods

    This paper reviews the method of surface Laplacian differentiation to study EEG. We focus on topics that are helpful for a clear understanding of the underlying concepts and their efficient implementation, which is especially important for EEG researchers unfamiliar with the technique. The popular methods of finite differences and splines are reviewed in detail. The former has the advantage of simplicity and low computational cost, but its estimates are prone to a variety of errors due to discretization. The latter eliminates all issues related to discretization and incorporates a regularization mechanism to reduce spatial noise, but at the cost of increased mathematical and computational complexity. These and several other issues deserving further development are highlighted, some of which we address to the extent possible. Here we develop a set of discrete approximations for Laplacian estimates at peripheral electrodes and a possible solution to the problem of multiple-frame regularization. We also provide the mathematical details of finite difference approximations that are missing in the literature, and discuss the problem of computational performance, which is particularly important in the context of EEG splines, where data sets can be very large. Along this line, the matrix representation of the surface Laplacian operator is carefully discussed and some figures are given illustrating the advantages of this approach. In the final remarks, we briefly sketch a possible way to incorporate finite-size electrodes into Laplacian estimates that could guide further developments.
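
    For readers unfamiliar with the finite-difference variant, a minimal sketch is shown below: each channel minus the mean of its neighbours (a Hjorth-style nearest-neighbour Laplacian). The toy montage and neighbour lists are assumptions for illustration only.

```python
# Minimal sketch of a nearest-neighbour finite-difference (Hjorth-style)
# surface Laplacian: each channel minus the mean of its neighbours.
import numpy as np


def hjorth_laplacian(eeg, neighbours):
    """eeg: (channels, samples) array; neighbours: dict channel -> list of channel indices."""
    lap = np.empty_like(eeg, dtype=float)
    for ch, nbrs in neighbours.items():
        lap[ch] = eeg[ch] - eeg[nbrs].mean(axis=0)
    return lap


# Toy 4-channel example with an assumed neighbour structure.
rng = np.random.default_rng(0)
data = rng.standard_normal((4, 1000))
nbrs = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
laplacian = hjorth_laplacian(data, nbrs)
```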

    Multipath and Interference Error Reduction in GPS Using Antenna Arrays

    The Global Positioning System (GPS) is a worldwide satellite-based positioning system that provides any user with three-dimensional position, velocity, and time information. The measured pseudorange is affected by multipath propagation, which is probably the major source of error for high-precision systems. After a presentation of GPS and the basic techniques employed to perform pseudorange measurements, the influence of multipath components on the pseudorange measurement is explained. Like any radio system, GPS is also exposed to errors caused by interference, and many civil applications require interference-robust receivers for reasons of safety. In this paper, several array signal processing techniques for reducing code measurement errors due to multipath propagation and interference are presented. First, a non-adaptive beamformer is used. Second, a variant of MUSIC and a maximum likelihood estimator are used to estimate the directions of arrival (DOA) of the reflections and the interferers, and a weight vector that removes these signals is then calculated. Third, beamforming with a temporal reference is presented; the reference is not the GPS signal itself but the output of a filter matched to the code. An interesting feature of the proposed techniques is that they can be applied to an array of arbitrary geometry.
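
    As a sketch of the subspace-based DOA step, the following Python computes a MUSIC pseudo-spectrum for a uniform linear array. The array geometry, half-wavelength spacing, and angle grid are assumptions for illustration; the paper's GPS-specific weight-vector design is not reproduced.

```python
# Illustrative MUSIC pseudo-spectrum for a uniform linear array of m sensors.
import numpy as np


def music_spectrum(snapshots, n_sources, d=0.5):
    """snapshots: (sensors, samples) complex array; d: element spacing in wavelengths."""
    m, n = snapshots.shape
    R = snapshots @ snapshots.conj().T / n        # sample covariance matrix
    _, vecs = np.linalg.eigh(R)                   # eigenvalues in ascending order
    En = vecs[:, : m - n_sources]                 # noise-subspace eigenvectors
    k = np.arange(m)
    angles = np.linspace(-90.0, 90.0, 361)
    spectrum = np.empty_like(angles)
    for i, theta in enumerate(np.deg2rad(angles)):
        a = np.exp(2j * np.pi * d * k * np.sin(theta))         # steering vector
        spectrum[i] = 1.0 / np.linalg.norm(En.conj().T @ a) ** 2
    return angles, spectrum
```

    Peaks of the pseudo-spectrum indicate candidate directions of arrival; steering vectors at those peaks can then be used to build a weight vector that nulls the reflections and interferers, as outlined above.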

    Space-Varying Coefficient Models for Brain Imaging

    The methodological development and the application in this paper originate from diffusion tensor imaging (DTI), a powerful nuclear magnetic resonance technique enabling diagnosis and monitoring of several diseases as well as reconstruction of neural pathways. We reformulate the current analysis framework of separate voxelwise regressions as a 3D space-varying coefficient model (VCM) for the entire set of DTI images recorded on a 3D grid of voxels. By borrowing strength from spatially adjacent voxels, smoothing noisy observations, and estimating diffusion tensors at any location within the brain, the three-step cascade of standard data processing is thus replaced by a single, simultaneous step. We conceptualize two VCM variants based on B-spline basis functions: a full tensor product approach and a sequential approximation, rendering the VCM numerically and computationally feasible even for the huge dimension of the joint model in a realistic setup. A simulation study shows that both approaches outperform the standard method of voxelwise regressions with subsequent regularization. Owing to its greater efficiency, we apply the sequential method to a clinical DTI data set and demonstrate the inherent ability to increase the rigid grid resolution by evaluating the incorporated basis functions at intermediate points. In conclusion, the suggested fitting methods clearly improve on the current state of the art, but improvement of local adaptivity remains desirable.
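
    The tensor-product idea can be sketched as follows: a smoothly varying coefficient over a 3D voxel grid is expanded in a tensor product of 1D B-spline bases and fitted by penalized least squares. The grid sizes, knot counts, and ridge penalty below are illustrative assumptions, and this is a toy varying-intercept version rather than the paper's DTI-specific model.

```python
# Hedged sketch of a tensor-product B-spline space-varying coefficient fit.
import numpy as np
from scipy.interpolate import BSpline


def bspline_basis(x, n_basis=8, degree=3):
    """Evaluate a clamped B-spline basis (n_basis functions) at the points x."""
    lo, hi = x.min() - 1e-9, x.max() + 1e-9
    knots = np.concatenate([np.repeat(lo, degree),
                            np.linspace(lo, hi, n_basis - degree + 1),
                            np.repeat(hi, degree)])
    return BSpline.design_matrix(x, knots, degree).toarray()


# Toy 10 x 10 x 10 voxel grid with a noisy, spatially varying signal.
nx = ny = nz = 10
xs, ys, zs = (np.linspace(0.0, 1.0, n) for n in (nx, ny, nz))
Bx, By, Bz = bspline_basis(xs), bspline_basis(ys), bspline_basis(zs)
B = np.einsum('ip,jq,kr->ijkpqr', Bx, By, Bz).reshape(nx * ny * nz, -1)  # tensor product

beta_true = np.sin(2.0 * np.pi * xs)[:, None, None] * np.ones((nx, ny, nz))
y = beta_true.ravel() + 0.1 * np.random.default_rng(1).standard_normal(nx * ny * nz)

lam = 1e-3  # ridge penalty standing in for a proper smoothness penalty
theta = np.linalg.solve(B.T @ B + lam * np.eye(B.shape[1]), B.T @ y)
beta_hat = (B @ theta).reshape(nx, ny, nz)  # smooth space-varying coefficient field
```

    Evaluating the fitted basis at coordinates between the original voxels then yields estimates at arbitrary locations, which is the grid-refinement property mentioned above.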