
    Array languages and the N-body problem

    This paper describes the contributions to the SICSA multicore challenge on many-body planetary simulation made by a compiler group at the University of Glasgow. Our group is part of the Computer Vision and Graphics research group, and we have for some years been developing array compilers because we think these are a good tool both for expressing graphics algorithms and for exploiting the parallelism that computer vision applications require. We shall describe experiments using two languages on two different platforms, and we shall compare the performance of these with reference C implementations running on the same platforms. Finally, we shall draw conclusions both about the viability of the array language approach as compared to other approaches used in the challenge and also about the strengths and weaknesses of the two very different processor architectures we used.
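
    The core computation in this challenge is easy to state in whole-array form. As a rough illustration of the style only (a NumPy sketch, not the paper's own array-language code; all sizes and parameter values are made-up assumptions), the pairwise gravitational update can be written without any explicit element-level loops:

```python
# Minimal sketch (not the paper's code): an N-body acceleration step expressed
# with whole-array operations, in the spirit of the array-language approach.
# The system size, softening length and timestep are illustrative assumptions.
import numpy as np

def accelerations(pos, mass, G=1.0, eps=1e-3):
    """Pairwise gravitational accelerations via array expressions (no explicit loops)."""
    # Displacement vectors between every pair of bodies: shape (n, n, 3)
    d = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]
    # Softened squared distances, shape (n, n)
    r2 = (d ** 2).sum(axis=-1) + eps ** 2
    inv_r3 = r2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)          # no self-interaction
    # Sum the contributions from all other bodies: shape (n, 3)
    return G * (d * (mass[np.newaxis, :, None] * inv_r3[:, :, None])).sum(axis=1)

# Example: one leapfrog step for a small random system
n = 128
pos = np.random.randn(n, 3)
vel = np.zeros((n, 3))
mass = np.ones(n)
dt = 1e-3
vel += 0.5 * dt * accelerations(pos, mass)
pos += dt * vel
vel += 0.5 * dt * accelerations(pos, mass)
```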

    Can we meaningfully speak of changes in price under the regime of changes in techniques?

    This paper presents a simulation exercise on Sraffa's system under various types of technical change to show that the direction of changes in the prices of commodities is contingent on the choice of the numeraire. Thus, such a comparison of prices across two systems turns out to be meaningless. This result points to the arbitrary nature of the neoclassical supply functions, as they inevitably compare prices across several Sraffa systems on the basis of an arbitrarily chosen numeraire. We anticipated such a result from our reading of Sraffa as part of his 'prelude to a critique of economic theory'.
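
    The dependence on the numeraire can be seen in a toy computation. The following sketch is illustrative only and is not the paper's model or data: the input matrices, labour vector, profit rate and numeraire bundles are made up. It solves a two-commodity Sraffa price system before and after a technical change under two different numeraires, so the direction of each price change can be compared across the two normalisations.

```python
# Minimal sketch (illustrative, not the paper's simulation): solve a small
# Sraffa price system p = (1 + r) p A + w l, normalised by a chosen numeraire
# bundle d (p . d = 1), and compare prices before and after a technical change.
import numpy as np

def sraffa_prices(A, l, r, d):
    """Prices of production for input matrix A, labour vector l and profit rate r,
    normalised so that the numeraire bundle d has price 1."""
    n = len(l)
    # p (I - (1+r) A) = w l ; set w = 1 provisionally, then rescale by the numeraire.
    p = np.linalg.solve((np.eye(n) - (1.0 + r) * A).T, l)
    return p / p.dot(d)

A_old = np.array([[0.20, 0.30],
                  [0.10, 0.25]])
A_new = np.array([[0.15, 0.30],      # hypothetical technical change in industry 1
                  [0.12, 0.25]])
l = np.array([1.0, 0.8])
r = 0.10

for d in (np.array([1.0, 0.0]),      # commodity 1 as numeraire
          np.array([0.0, 1.0])):     # commodity 2 as numeraire
    p_old = sraffa_prices(A_old, l, r, d)
    p_new = sraffa_prices(A_new, l, r, d)
    print("numeraire", d, "old prices", p_old.round(4), "new prices", p_new.round(4))
```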

    Non-classical computing: feasible versus infeasible

    Physics sets certain limits on what is and is not computable. These limits are very far from having been reached by current technologies. Whilst proposals for hypercomputation are almost certainly infeasible, there are a number of non-classical approaches that do hold considerable promise. There is a range of possible architectures that could be implemented on silicon and that are distinctly different from the von Neumann model. Beyond this, quantum simulators, which are the quantum equivalent of analogue computers, may be constructable in the near future.

    Mainstream parallel array programming on cell

    We present the E♯ compiler and runtime library for the ‘F’ subset of the Fortran 95 programming language. ‘F’ provides first-class support for arrays, allowing E♯ to implicitly evaluate array expressions in parallel using the SPU coprocessors of the Cell Broadband Engine. We present performance results from four benchmarks that all demonstrate absolute speedups over equivalent ‘C’ or Fortran versions running on the PPU host processor. A significant benefit of this straightforward approach is that a serial implementation of any code is always available, providing code longevity and a familiar development paradigm.
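
    As a loose illustration of implicitly parallel array expressions (a Python sketch, not the compiler or runtime described in the paper; the expression, chunking scheme and worker count are assumptions), an elementwise array expression can be evaluated by splitting it across worker processes while the user writes only the whole-array form:

```python
# Minimal sketch (illustrative only): evaluating an elementwise array expression
# in parallel by splitting it into chunks, roughly what a compiler for an array
# language can do implicitly when it maps array expressions onto coprocessors.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def kernel(chunk):
    a, b = chunk
    # The whole-array expression: no element-level loop is written by the user.
    return np.sqrt(a * a + b * b)

def parallel_eval(a, b, workers=4):
    a_parts = np.array_split(a, workers)
    b_parts = np.array_split(b, workers)
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(kernel, zip(a_parts, b_parts)))
    return np.concatenate(results)

if __name__ == "__main__":
    a = np.random.rand(1_000_000)
    b = np.random.rand(1_000_000)
    serial = np.sqrt(a * a + b * b)      # a serial implementation is always available
    parallel = parallel_eval(a, b)
    print(np.allclose(serial, parallel))
```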

    Compressed sensing electron tomography using adaptive dictionaries: a simulation study

    Electron tomography (ET) is an increasingly important technique for examining the three-dimensional morphologies of nanostructures. ET involves the acquisition of a set of 2D projection images that are reconstructed into a volumetric image by solving an inverse problem. However, due to limitations in the acquisition process, this inverse problem is considered ill-posed (i.e., no unique solution exists). Furthermore, the reconstruction usually suffers from missing wedge artifacts (e.g., star, fan, blurring, and elongation artifacts). Compressed sensing (CS) has recently been applied to ET and has shown promising results for reducing missing wedge artifacts caused by limited-angle sampling. CS uses a nonlinear reconstruction algorithm that employs image sparsity as a priori knowledge to improve the accuracy of density reconstruction from a relatively small number of projections compared to other reconstruction techniques. However, the performance of CS recovery depends heavily on the degree of sparsity of the reconstructed image in the selected transform domain. Prespecified transformations such as spatial gradients provide sparse image representations, while synthesising the sparsifying transform based on the properties of the particular specimen may give even sparser results and can extend the application of CS to specimens that cannot be sparsely represented with other transforms such as total variation (TV). In this work, we show that CS reconstruction in ET can be significantly improved by tailoring the sparsity representation using a sparse dictionary learning principle.
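
    As a small illustration of the dictionary-learning building block (a scikit-learn sketch, not the reconstruction algorithm proposed in the paper; the patch size, number of atoms and sparsity penalty are arbitrary assumptions), a sparsifying dictionary can be learned from image patches and then used to sparse-code new patches:

```python
# Minimal sketch (not the paper's algorithm): learn a dictionary of patch atoms
# and sparse-code patches against it. A random image stands in for training data.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import extract_patches_2d

rng = np.random.RandomState(0)
image = rng.rand(128, 128)                      # stand-in for a training reconstruction

patches = extract_patches_2d(image, (8, 8), max_patches=2000, random_state=0)
X = patches.reshape(len(patches), -1)
X -= X.mean(axis=1, keepdims=True)              # remove the DC component of each patch

dico = MiniBatchDictionaryLearning(n_components=64, alpha=1.0, batch_size=32,
                                   random_state=0)
D = dico.fit(X).components_                     # learned dictionary, shape (64, 64)

codes = dico.transform(X[:10])                  # sparse codes for a few patches
print(D.shape, codes.shape, np.mean(codes != 0))
```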

    Confocal microscopic image sequence compression using vector quantization and 3D pyramids

    The 3D pyramid compressor project at the University of Glasgow has developed a compressor for images obtained from CLSM devices. The proposed method, which combines an image pyramid coder with vector quantization techniques, performs well at compressing confocal volume image data. An experiment was conducted on several kinds of CLSM data, comparing the presented compressor with other well-known volume data compressors such as MPEG-1. The results showed that the 3D pyramid compressor gave higher subjective and objective quality of reconstructed images at the same compression ratio and produced more acceptable results when image processing filters were applied to the reconstructed images.
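
    As a rough sketch of the vector quantization component only (illustrative, not the project's compressor; the block size and codebook size are arbitrary assumptions), a volume can be cut into small 3D blocks, each of which is stored as the index of its nearest codebook entry:

```python
# Minimal sketch (illustrative): vector quantization of 3D blocks from a volume,
# the core idea behind combining a pyramid coder with VQ. Random data stands in
# for a confocal image stack.
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(0)
volume = rng.random((32, 64, 64)).astype(np.float32)   # stand-in for a confocal stack

# Cut the volume into 2x4x4 blocks and flatten each block into a vector.
bz, by, bx = 2, 4, 4
blocks = (volume
          .reshape(32 // bz, bz, 64 // by, by, 64 // bx, bx)
          .transpose(0, 2, 4, 1, 3, 5)
          .reshape(-1, bz * by * bx))

# Train a 256-entry codebook; each block is then stored as a single byte index.
codebook, labels = kmeans2(blocks, 256, minit='++')
reconstructed_blocks = codebook[labels]                # decoded block vectors

bits_raw = volume.size * 32
bits_compressed = labels.size * 8 + codebook.size * 32
print("compression ratio ~", bits_raw / bits_compressed)
```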

    Linear chemically sensitive electron tomography using DualEELS and dictionary-based compressed sensing

    We have investigated the use of DualEELS in elementally sensitive tilt series tomography in the scanning transmission electron microscope. A procedure is implemented using deconvolution to remove the effects of multiple scattering, followed by normalisation by the zero loss peak intensity. This is performed to produce a signal that is linearly dependent on the projected density of the element in each pixel. This method is compared with one that does not include deconvolution (although normalisation by the zero loss peak intensity is still performed). Additionally, we compare the 3D reconstruction using a new compressed sensing algorithm, DLET, with the well-established SIRT algorithm. VC precipitates, which are extracted from a steel on a carbon replica, are used in this study. It is found that the use of this linear signal results in a very even density throughout the precipitates. However, when deconvolution is omitted, a slight density reduction is observed in the cores of the precipitates (a so-called cupping artefact). Additionally, it is clearly demonstrated that the 3D morphology is much better reproduced using the DLET algorithm, with very little elongation in the missing wedge direction. It is therefore concluded that reliable elementally sensitive tilt tomography using EELS requires the appropriate use of DualEELS together with a suitable reconstruction algorithm, such as the compressed sensing based reconstruction algorithm used here, to make the best use of the limited data volume and signal-to-noise ratio inherent in core-loss EELS.
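
    For context, the well-established SIRT baseline mentioned above follows a simple scaled iteration. The sketch below is a generic SIRT implementation with a random stand-in projection matrix (it is not DLET and not the paper's code; in real ET the matrix would encode the tilt-series geometry):

```python
# Minimal sketch of the standard SIRT iteration with row/column scaling.
import numpy as np

def sirt(A, b, n_iter=100):
    """Simultaneous Iterative Reconstruction Technique: x <- x + C A^T R (b - A x)."""
    row_sums = A.sum(axis=1)
    col_sums = A.sum(axis=0)
    R = 1.0 / np.where(row_sums > 0, row_sums, 1.0)   # inverse row sums
    C = 1.0 / np.where(col_sums > 0, col_sums, 1.0)   # inverse column sums
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + C * (A.T @ (R * (b - A @ x)))
        x = np.clip(x, 0.0, None)                     # enforce non-negative density
    return x

# Tiny synthetic example: recover a sparse "specimen" from noisy projections.
rng = np.random.default_rng(0)
A = rng.random((80, 64))                              # 80 measurements, 64 voxels
x_true = np.zeros(64)
x_true[rng.choice(64, 6, replace=False)] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(80)
x_rec = sirt(A, b, n_iter=200)
print("relative error:", np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))
```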

    Potential drug interactions and duplicate prescriptions among ambulatory cancer patients: a prevalence study using an advanced screening method

    Background: The pharmacotherapeutic treatment of patients with cancer is generally associated with multiple side-effects. Drug interactions and duplicate prescriptions between anti-cancer drugs, or interactions with medication used to treat comorbidity, can reinforce or intensify side-effects. The aim of the present study is to gain more insight into the prevalence of drug interactions and duplicate prescriptions among patients being treated in the outpatient day care departments for oncology and hematological illnesses. For the first time, the prevalence of drug interactions with OTC drugs in cancer patients will be studied. Possible risk factors for the occurrence of these drug-related problems will also be studied.
    Methods/Design: A multicenter cross-sectional observational study of the epidemiology of drug interactions and duplicate prescriptions is performed among all oncology and hemato-oncology patients treated with systemic anti-cancer drugs at the oncology and hematology outpatient day care departments of the VU University Medical Center and the Zaans Medical Center.
    Discussion: In this article, the prevalence of potential drug interactions in outpatient day-care patients treated with anti-cancer agents is studied using a novel, more extensive screening method. If this study shows a high prevalence of drug interactions, clinical pharmacists and oncologists must collaborate to develop a pharmaceutical screening programme, including an automated electronic warning system, to support drug prescribing for ambulatory cancer patients. This programme could minimize the occurrence of drug-related problems such as drug interactions and duplicate prescriptions, thereby increasing quality of life.
    Trial registration: This study is registered, number NTR2238.
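
    As a purely hypothetical illustration of what an automated screening step might look like (the drug names, interaction pairs and logic below are made up for illustration and are not the study's screening method), a medication list can be checked against an interaction table for interacting pairs and duplicate prescriptions:

```python
# Hypothetical sketch: flag potential interactions and duplicate prescriptions
# for one patient against a small, made-up interaction table.
from collections import Counter

INTERACTIONS = {                       # illustrative pairs, keyed alphabetically
    ("capecitabine", "warfarin"): "increased bleeding risk",
    ("methotrexate", "omeprazole"): "reduced methotrexate clearance",
}

def screen(prescriptions):
    """Return (interaction warnings, duplicate prescriptions) for a medication list."""
    drugs = [p.lower() for p in prescriptions]
    warnings = []
    for i, a in enumerate(drugs):
        for b in drugs[i + 1:]:
            key = tuple(sorted((a, b)))
            if key in INTERACTIONS:
                warnings.append(f"{a} + {b}: {INTERACTIONS[key]}")
    duplicates = [d for d, n in Counter(drugs).items() if n > 1]
    return warnings, duplicates

print(screen(["Capecitabine", "Warfarin", "Omeprazole", "Omeprazole"]))
```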