
    Two-level pipelined systolic array graphics engine

    The authors report a VLSI design of an advanced systolic array graphics (SAG) engine built from pipelined functional units which can generate realistic images interactively for high-resolution displays. They introduce a structured frame store system as an environment for the advanced SAG engine and present its principles and architecture. Pipelined functional units are introduced into the SAG engine to meet the performance requirements. This is done by a formal approach in which the original systolic array is represented at bit level by a finite, vertex-weighted, edge-weighted, directed graph. Two architectures built from pipelined functional units are described. A prototype containing nine processing elements was fabricated in a 1.6-µm CMOS technology.
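
    The formal approach mentioned above rests on a graph model of the circuit. Below is a minimal, illustrative sketch (not the authors' design) of such a finite, vertex-weighted, edge-weighted, directed graph, where vertex weights stand for the delays of bit-level functional elements and edge weights for pipeline register counts on the connections; all names and values are assumptions for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class BitLevelGraph:
    """Vertex-weighted, edge-weighted, directed graph of a bit-level circuit."""
    vertex_delay: dict = field(default_factory=dict)   # element -> propagation delay
    edges: dict = field(default_factory=dict)          # (u, v) -> pipeline register count

    def add_element(self, name, delay):
        self.vertex_delay[name] = delay

    def add_connection(self, u, v, registers=0):
        self.edges[(u, v)] = registers

    def combinational_delay(self, path):
        """Delay accumulated along a path whose edges carry no registers."""
        assert all(self.edges[(u, v)] == 0 for u, v in zip(path, path[1:]))
        return sum(self.vertex_delay[v] for v in path)

# Hypothetical example: a full adder feeding an unregistered multiplier stage.
g = BitLevelGraph()
g.add_element("fa", delay=2)
g.add_element("mul", delay=5)
g.add_connection("fa", "mul", registers=0)
print(g.combinational_delay(["fa", "mul"]))   # 7 time units on this path
```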

    How to mesh up Ewald sums (I): A theoretical and numerical comparison of various particle mesh routines

    Standard Ewald sums, which calculate e.g. the electrostatic energy or the force in periodically closed systems of charged particles, can be efficiently sped up by use of the Fast Fourier Transform (FFT). In this article we investigate three algorithms for the FFT-accelerated Ewald sum which have attracted widespread attention, namely the so-called particle-particle-particle-mesh (P3M), particle mesh Ewald (PME) and smooth PME methods. We present a unified view of the underlying techniques and the various ingredients that comprise those routines. Additionally, we offer detailed accuracy measurements, which shed some light on the influence of several tuning parameters and also show that the existing methods -- although similar in spirit -- exhibit remarkable differences in accuracy. We propose combinations of the individual components, mostly relying on the P3M approach, which we regard as the most flexible.
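
    For orientation, here is a minimal Python sketch (not the paper's code) of the splitting that all three mesh routines share: a short-ranged, erfc-screened sum evaluated directly in real space, plus a smooth long-ranged part handled on a mesh with the FFT. The nearest-grid-point charge assignment, mesh size, and parameter values are illustrative assumptions; P3M/PME use higher-order assignment schemes and optimised influence functions.

```python
import numpy as np
from scipy.special import erfc

def real_space_energy(pos, q, box, alpha, r_cut):
    """Direct sum of the short-ranged, erfc-screened pair interactions."""
    n = len(q)
    energy = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            d = pos[i] - pos[j]
            d -= box * np.round(d / box)          # minimum-image convention
            r = np.linalg.norm(d)
            if r < r_cut:
                energy += q[i] * q[j] * erfc(alpha * r) / r
    return energy

def reciprocal_space_energy(pos, q, box, alpha, mesh=32):
    """Mesh-based long-ranged part: assign charges to a grid, FFT, apply the
    Gaussian-screened Green's function, and sum over the k-vectors."""
    rho = np.zeros((mesh, mesh, mesh))
    for r, charge in zip(pos, q):                 # nearest-grid-point assignment
        idx = tuple(np.floor(r / box * mesh).astype(int) % mesh)
        rho[idx] += charge
    rho_k = np.fft.fftn(rho)
    k = 2 * np.pi * np.fft.fftfreq(mesh, d=box / mesh)
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[0, 0, 0] = np.inf                          # drop the k = 0 term
    green = 4 * np.pi / k2 * np.exp(-k2 / (4 * alpha**2))
    volume = box**3
    return np.sum(green * np.abs(rho_k)**2) / (2 * volume)
```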

    Image Reconstruction from Undersampled Confocal Microscopy Data using Multiresolution Based Maximum Entropy Regularization

    We consider the problem of reconstructing 2D images from randomly under-sampled confocal microscopy samples. The well-known and widely celebrated total variation regularization, which is the L1 norm of derivatives, turns out to be unsuitable for this problem; it is unable to handle noise and under-sampling together. This issue is linked with the phase transition phenomenon observed in compressive sensing research, which is essentially the break-down of total variation methods when the sampling density drops below a certain threshold. The severity of this breakdown is determined by the so-called mutual incoherence between the derivative operators and the measurement operator. In our problem, the mutual incoherence is low, and hence total variation regularization gives serious artifacts in the presence of noise even when the sampling density is not very low. There have been very few attempts to develop regularization methods that perform better than total variation regularization for this problem. We develop a multi-resolution based regularization method that is adaptive to image structure. In our approach, the desired reconstruction is formulated as a series of coarse-to-fine multi-resolution reconstructions; for the reconstruction at each level, the regularization is constructed to be adaptive to the image structure, where the information for adaptation is obtained from the reconstruction at the coarser resolution level. This adaptation is achieved using the maximum entropy principle, where the required adaptive regularization is determined as the maximizer of entropy subject to the information extracted from the coarse reconstruction as constraints. We demonstrate the superiority of the proposed regularization method over existing ones using several reconstruction examples.
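
    As a point of reference for the total-variation baseline discussed above, the following is a minimal, illustrative Python sketch (not the paper's multiresolution maximum-entropy method): a few gradient-descent steps on a smoothed-TV regularised least-squares objective for a randomly masked image. The binary mask, step size, and regularisation weight are assumed values.

```python
import numpy as np

def tv_grad(x, eps=1e-8):
    """Gradient of a smoothed, isotropic total-variation penalty on image x."""
    dx = np.diff(x, axis=0, append=x[-1:, :])     # forward differences
    dy = np.diff(x, axis=1, append=x[:, -1:])
    mag = np.sqrt(dx**2 + dy**2 + eps)
    div_x = np.diff(dx / mag, axis=0, prepend=(dx / mag)[:1, :])
    div_y = np.diff(dy / mag, axis=1, prepend=(dy / mag)[:, :1])
    return -(div_x + div_y)                        # -div(grad x / |grad x|)

def tv_reconstruct(y, mask, lam=0.05, step=0.5, n_iter=200):
    """Gradient descent on 0.5*||mask*x - y||^2 + lam*TV(x), where y holds the
    zero-filled, randomly under-sampled measurements and mask is binary."""
    x = y.copy()
    for _ in range(n_iter):
        data_grad = mask * x - y                   # gradient of the data term
        x -= step * (data_grad + lam * tv_grad(x))
    return x
```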

    Refraction-corrected ray-based inversion for three-dimensional ultrasound tomography of the breast

    Ultrasound tomography (UST) has seen a revival of interest in the past decade, especially for breast imaging, due to improvements in both ultrasound and computing hardware. In particular, three-dimensional ultrasound tomography, a fully tomographic method in which the medium to be imaged is surrounded by ultrasound transducers, has become feasible. In this paper, a comprehensive derivation and study of a robust framework for large-scale bent-ray ultrasound tomography in 3D for a hemispherical detector array is presented. Two ray-tracing approaches are derived and compared. More significantly, the problem of linking the rays between emitters and receivers, which is challenging in 3D due to the high number of degrees of freedom for the trajectory of rays, is analysed both as a minimisation and as a root-finding problem. The ray-linking problem is parameterised for a convex detection surface, and three robust, accurate, and efficient ray-linking algorithms are formulated and demonstrated. To stabilise these methods, novel adaptive-smoothing approaches are proposed that control the conditioning of the update matrices to ensure accurate linking. The nonlinear UST problem of estimating the sound speed is recast as a series of linearised subproblems, each solved using the above algorithms within a steepest-descent scheme. The whole imaging algorithm is demonstrated to be robust and accurate on realistic data simulated using a full-wave acoustic model and an anatomical breast phantom, incorporating the errors due to time-of-flight picking that would be present with measured data. This method can be used to provide low-artefact, quantitatively accurate 3D sound speed maps. In addition to being useful in their own right, such 3D sound speed maps can be used to initialise full-wave inversion methods, or as an input to photoacoustic tomography reconstructions.
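
    To make the root-finding view of ray linking concrete, here is a minimal, illustrative Python sketch (not the paper's implementation, which derives proper bent-ray tracing and adaptive smoothing): linking is posed as finding the two launch angles of a ray from an emitter such that the ray's exit point on an assumed hemispherical detection surface coincides with the receiver. The straight-line tracer, hemisphere radius, and step sizes are placeholders.

```python
import numpy as np
from scipy.optimize import least_squares

R = 0.11  # assumed hemisphere radius in metres

def to_surface_angles(p):
    """Spherical angles (polar, azimuth) of a point on the detection surface."""
    r = np.linalg.norm(p)
    return np.array([np.arccos(p[2] / r), np.arctan2(p[1], p[0])])

def trace_to_surface(emitter, angles, sound_speed, ds=1e-3, max_steps=2000):
    """Step a ray (straight here, for brevity) until it leaves the hemisphere.
    A bent-ray tracer would update the direction from grad(sound_speed)."""
    theta, phi = angles
    d = np.array([np.sin(theta) * np.cos(phi),
                  np.sin(theta) * np.sin(phi),
                  np.cos(theta)])
    p = np.asarray(emitter, dtype=float).copy()
    for _ in range(max_steps):
        p = p + ds * d
        if np.linalg.norm(p) >= R:                 # crossed the detection surface
            return p
    return p

def link_ray(emitter, receiver, sound_speed, angles0):
    """Root-finding over launch angles: drive the exit point to the receiver."""
    target = to_surface_angles(receiver)
    def mismatch(angles):
        exit_point = trace_to_surface(emitter, angles, sound_speed)
        return to_surface_angles(exit_point) - target
    return least_squares(mismatch, angles0).x
```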