
    Spectral element methods: Algorithms and architectures

    Spectral element methods are high-order weighted residual techniques for partial differential equations that combine the geometric flexibility of finite element methods with the rapid convergence of spectral techniques. These methods are described for the simulation of incompressible fluid flows, with special emphasis on the implementation of spectral element techniques on medium-grained parallel processors. Two parallel architectures are considered: the first, a commercially available message-passing hypercube system; the second, a developmental reconfigurable architecture based on Geometry-Defining Processors. High parallel efficiency is obtained in hypercube spectral element computations, indicating that load balancing and communication issues can be successfully addressed by coupling a high-order technique with a medium-grained processor architecture.
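
    The high-order ingredient these methods rest on can be shown in a few lines. Below is a minimal Python sketch (not from the paper, whose implementation targets parallel hardware) of the Gauss-Lobatto-Legendre nodes and quadrature weights that underlie a typical spectral element discretization.

        # Gauss-Lobatto-Legendre (GLL) nodes and weights on [-1, 1]:
        # one standard building block of a spectral element method.
        import numpy as np
        from numpy.polynomial import legendre as leg

        def gll_nodes_weights(n):
            """Return the n+1 GLL nodes and weights for polynomial order n."""
            # Interior nodes are the roots of P_n'(x); the endpoints are -1 and 1.
            Pn = leg.Legendre.basis(n)
            nodes = np.concatenate(([-1.0], Pn.deriv().roots(), [1.0]))
            # Standard GLL weights: w_i = 2 / (n (n + 1) P_n(x_i)^2).
            weights = 2.0 / (n * (n + 1) * Pn(nodes) ** 2)
            return nodes, weights

        nodes, weights = gll_nodes_weights(8)
        # GLL quadrature is exact for polynomials up to degree 2n - 1:
        print(weights @ nodes**6)  # ~ 2/7, the integral of x^6 over [-1, 1]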

    Large-scale compression of genomic sequence databases with the Burrows-Wheeler transform

    Motivation: The Burrows-Wheeler transform (BWT) is the foundation of many algorithms for compression and indexing of text data, but the cost of computing the BWT of very large string collections has prevented these techniques from being widely applied to the large sets of sequences often encountered as the outcome of DNA sequencing experiments. In previous work, we presented a novel algorithm that allows the BWT of human-genome-scale data to be computed on very moderate hardware, thus enabling us to investigate the BWT as a tool for the compression of such datasets.
    Results: We first used simulated reads to explore the relationship between the level of compression and the error rate, the length of the reads, and the level of sampling of the underlying genome, and to compare choices of second-stage compression algorithm. We demonstrate that compression may be greatly improved by a particular reordering of the sequences in the collection and give a novel `implicit sorting' strategy that enables these benefits to be realised without the overhead of sorting the reads. With these techniques, a 45x coverage of real human genome sequence data compresses losslessly to under 0.5 bits per base, allowing the 135.3 Gbp of sequence to fit into only 8.2 Gbytes of space (trimming a small proportion of low-quality bases from the reads improves the compression still further). This is more than 4 times smaller than the size achieved by a standard BWT-based compressor (bzip2) on the untrimmed reads, but an important further advantage of our approach is that it facilitates the building of compressed full-text indexes such as the FM-index on large-scale DNA sequence collections.
    Comment: The version here is as submitted to Bioinformatics and is the same as the previously archived version. This submission registers the fact that the advance access version is now available at http://bioinformatics.oxfordjournals.org/content/early/2012/05/02/bioinformatics.bts173.abstract . Bioinformatics should be considered the original place of publication of this article; please cite accordingly.
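
    For reference, the transform itself is compact enough to state as a naive Python sketch on a toy string; the paper's contribution is an external-memory construction for collections of billions of reads, which looks nothing like this.

        # Naive Burrows-Wheeler transform: sort all rotations of the input
        # (with a unique sentinel appended) and take the last column of the
        # sorted rotation matrix. O(n^2 log n); for illustration only.
        def bwt(s, sentinel="$"):
            s += sentinel
            rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
            return "".join(rot[-1] for rot in rotations)

        print(bwt("ACGTACGT"))  # equal symbols cluster, which aids compression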

    Relating multi-sequence longitudinal intensity profiles and clinical covariates in new multiple sclerosis lesions

    Structural magnetic resonance imaging (MRI) can be used to detect lesions in the brains of multiple sclerosis (MS) patients. The formation of these lesions is a complex process involving inflammation, tissue damage, and tissue repair, all of which are visible on MRI. Here we characterize the lesion formation process on longitudinal, multi-sequence structural MRI from 34 MS patients and relate the longitudinal changes we observe within lesions to therapeutic interventions. In this article, we first outline a pipeline to extract voxel-level, multi-sequence longitudinal profiles from four MRI sequences within lesion tissue. We then propose two models to relate clinical covariates to the longitudinal profiles. The first model is a principal component analysis (PCA) regression model, which collapses the information from all four profiles into a scalar value. We find that the score on the first PC identifies areas of slow, long-term intensity changes within the lesion at a voxel level, as validated by two experienced clinicians, a neuroradiologist and a neurologist. On a quality scale of 1 to 4 (4 being the highest), the neuroradiologist gave the score on the first PC a median rating of 4 (95% CI: [4,4]), and the neurologist gave it a median rating of 3 (95% CI: [3,3]). In the PCA regression model, we find that treatment with disease modifying therapies (p-value < 0.01), steroids (p-value < 0.01), and being closer to the boundary of abnormal signal intensity (p-value < 0.01) are associated with a return of a voxel to intensity values closer to those of normal-appearing tissue. The second model is a function-on-scalar regression, which allows for assessment of the individual time points at which the covariates are associated with the profiles. In the function-on-scalar regression, both age and distance to the boundary were found to have a statistically significant association with the profiles.
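
    A minimal Python sketch of the first model's core step, on synthetic stand-in data (the variable names and dimensions are placeholders, not the paper's): collapse each voxel's stacked multi-sequence profile to a first-principal-component score, then regress that score on clinical covariates.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        profiles = rng.normal(size=(500, 40))    # 500 voxels x 40 stacked time points
        covariates = rng.normal(size=(500, 3))   # e.g. therapy, steroids, boundary distance

        # Collapse each profile to a scalar: its score on the first PC.
        scores = PCA(n_components=1).fit_transform(profiles).ravel()
        # Relate the scores to the covariates with a linear regression.
        model = LinearRegression().fit(covariates, scores)
        print(model.coef_)  # per-covariate association with the PC score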

    Precision determination of absolute neutron flux

    A technique for establishing the total neutron rate of a highly collimated monochromatic cold neutron beam was demonstrated using an alpha-gamma counter. The method involves only the counting of measured rates and is independent of neutron cross sections, decay chain branching ratios, and neutron beam energy. For the measurement, a target of 10B-enriched boron carbide totally absorbed the neutrons in a monochromatic beam, and the rate of absorbed neutrons was determined by counting 478 keV gamma rays from neutron capture on 10B with calibrated high-purity germanium detectors. A second measurement based on Bragg diffraction from a perfect silicon crystal was performed to determine the mean de Broglie wavelength of the beam to a precision of 0.024%. With these measurements, the detection efficiency of a neutron monitor based on neutron absorption on 6Li was determined to an overall uncertainty of 0.058%. We discuss the principle of the alpha-gamma method and present details of how the measurement was performed, including the systematic effects. We also describe how this method may be used for applications in neutron dosimetry and metrology, fundamental neutron physics, and neutron cross section measurements.
    Comment: 44 pages
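
    In heavily simplified form (illustrative symbols, not the paper's notation, and omitting the corrections that make the alpha-gamma method independent of branching ratios), the bookkeeping reduces to two measured-rate ratios:

        % R_gamma: measured 478 keV gamma rate; epsilon_gamma: calibrated
        % HPGe detection efficiency; R_mon: rate recorded by the 6Li monitor.
        \[
          R_n \approx \frac{R_\gamma}{\epsilon_\gamma},
          \qquad
          \epsilon_{\mathrm{mon}} = \frac{R_{\mathrm{mon}}}{R_n} .
        \]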

    Scapegoat: John Dewey and the character education crisis

    Many conservatives, including some conservative scholars, blame the ideas and influence of John Dewey for what has frequently been called a crisis of character, a catastrophic decline in moral behavior in the schools and society of North America. Dewey’s critics claim that he is responsible for the undermining of the kinds of instruction that could lead to the development of character and the strengthening of the will, and that his educational philosophy and example exert a ubiquitous and disastrous influence on students’ conceptions of moral behavior. This article sets forth the views of some of these critics and juxtaposes them with what Dewey actually believed and wrote regarding character education. The juxtaposition demonstrates that Dewey neither called for nor exemplified the kinds of character-eroding pedagogy his critics accuse him of championing; in addition, this paper highlights the ways in which Dewey argued consistently and convincingly that the pedagogical approaches advocated by his critics are the real culprits in the decline of character and moral education.

    Neural networks-based regularization for large-scale medical image reconstruction

    In this paper we present a generalized Deep Learning-based approach for solving ill-posed large-scale inverse problems occurring in medical image reconstruction. Recently, Deep Learning methods using iterative neural networks (NNs) and cascaded NNs have been reported to achieve state-of-the-art results with respect to various quantitative quality measures such as PSNR, NRMSE, and SSIM across different imaging modalities. However, because these approaches apply the forward and adjoint operators repeatedly within the network architecture, the network must process whole images or volumes at once, which for some applications is computationally infeasible. In this work, we follow a different reconstruction strategy by strictly separating the application of the NN, the regularization of the solution, and the consistency with the measured data. The regularization is given in the form of an image prior obtained as the output of a previously trained NN, which is used in a Tikhonov regularization framework. By doing so, more complex and sophisticated network architectures can be used for the removal of artefacts or noise than is usually the case in iterative NNs. Due to the large scale of the considered problems and the resulting computational complexity of the employed networks, the priors are obtained by processing the images or volumes as patches or slices. We evaluated the method for 3D cone-beam low-dose CT and undersampled 2D radial cine MRI and compared it to a total variation-minimization-based reconstruction algorithm as well as to a method with regularization based on learned overcomplete dictionaries. The proposed method outperformed all the reported methods with respect to all chosen quantitative measures and further accelerates the regularization step in the reconstruction by several orders of magnitude.
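
    The decoupled strategy can be sketched in a few lines of Python on a toy problem (A, y, and the "network output" below are synthetic placeholders): a previously trained NN supplies a prior image x_nn, and reconstruction solves the Tikhonov problem min_x ||Ax - y||^2 + lam ||x - x_nn||^2.

        import numpy as np

        rng = np.random.default_rng(1)
        A = rng.normal(size=(80, 64))               # stand-in forward operator
        x_true = rng.normal(size=64)
        y = A @ x_true                              # measured data
        x_nn = x_true + 0.05 * rng.normal(size=64)  # pretend NN-denoised prior

        lam = 1.0
        # Normal equations of the Tikhonov problem:
        # (A^T A + lam I) x = A^T y + lam x_nn
        lhs = A.T @ A + lam * np.eye(64)
        rhs = A.T @ y + lam * x_nn
        x_rec = np.linalg.solve(lhs, rhs)
        print(np.linalg.norm(x_rec - x_true))       # small error on this toy problem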

    The SMC SNR 1E0102.2-7219 as a Calibration Standard for X-ray Astronomy in the 0.3-2.5 keV Bandpass

    The flight calibration of the spectral response of CCD instruments below 1.5 keV is difficult in general because of the lack of strong lines in the on-board calibration sources typically available. We have been using 1E 0102.2-7219, the brightest supernova remnant in the Small Magellanic Cloud, to evaluate the response models of the ACIS CCDs on the Chandra X-ray Observatory (CXO), the EPIC CCDs on the XMM-Newton Observatory, the XIS CCDs on the Suzaku Observatory, and the XRT CCD on the Swift Observatory. E0102 has strong lines of O, Ne, and Mg below 1.5 keV and little or no Fe emission to complicate the spectrum. The spectrum of E0102 has been well characterized using high-resolution grating instruments, namely the XMM-Newton RGS and the CXO HETG, through which a consistent spectral model has been developed that can then be used to fit the lower-resolution CCD spectra. We have also used the measured intensities of the lines to investigate the consistency of the effective area models for the various instruments around the bright O (~570 eV and 654 eV) and Ne (~910 eV and 1022 eV) lines. We find that the measured fluxes of the O VII triplet, the O VIII Ly-alpha line, the Ne IX triplet, and the Ne X Ly-alpha line generally agree to within +/-10% for all instruments, with 28 of our 32 fitted normalizations within +/-10% of the RGS-determined value. The maximum discrepancies, computed as the percentage difference between the lowest and highest normalization for any instrument pair, are 23% for the O VII triplet, 24% for the O VIII Ly-alpha line, 13% for the Ne IX triplet, and 19% for the Ne X Ly-alpha line. If only the CXO and XMM are compared, the maximum discrepancies are 22% for the O VII triplet, 16% for the O VIII Ly-alpha line, 4% for the Ne IX triplet, and 12% for the Ne X Ly-alpha line.
    Comment: 16 pages, 11 figures, to be published in Proceedings of the SPIE 7011: Space Telescopes and Instrumentation II: Ultraviolet to Gamma Ray 200
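
    The discrepancy figure quoted above is a simple ratio; a short Python sketch with made-up normalizations (not the paper's measured values) shows the computation:

        # Percentage difference between the lowest and highest fitted
        # line normalization across instruments, for one emission line.
        norms = {"RGS": 1.00, "ACIS": 0.95, "EPIC-pn": 1.08, "XIS": 0.91}
        lo, hi = min(norms.values()), max(norms.values())
        print(f"max discrepancy: {100.0 * (hi - lo) / lo:.1f}%")  # 18.7% here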

    Public Participation Organizations and Open Policy: A Constitutional Moment for British Democracy?

    This article builds on work in Science and Technology Studies and cognate disciplines concerning the institutionalization of public engagement and participation practices. It describes and analyses ethnographic qualitative research into one “organization of participation,” the UK government-funded Sciencewise program. Sciencewise’s interactions with broader political developments are explored, including the emergence of “open policy” as a key policy object in the UK context. The article considers what the new imaginary of openness means for institutionalized forms of public participation in science policymaking, asking whether this is illustrative of a “constitutional moment” in relations between society and science policymaking.

    Deweyan tools for inquiry and the epistemological context of critical pedagogy

    This article develops the notion of resistance as articulated in the literature of critical pedagogy as being both culturally sponsored and cognitively manifested. To do so, the authors draw upon John Dewey's conception of tools for inquiry. Dewey provides a way to conceptualize student resistance not as a form of willful disputation, but instead as a function of socialization into cultural models of thought that actively truncate inquiry. In other words, resistance can be construed as the cognitive and emotive dimensions of the ongoing failure of institutions to provide ideas that help individuals both recognize social problems and imagine possible solutions. Focusing on Dewey's epistemological framework, specifically tools for inquiry, provides a way to grasp this problem. It also affords some innovative solutions; for instance, it helps conceive of possible links between the regular curriculum and the study of specific social justice issues, a relationship that is often under-examined. The aims of critical pedagogy depend upon students developing dexterity with the conceptual tools they use to make meaning of the evidence they confront; these are background skills that the regular curriculum can be made to serve even outside social justice-focused curricula. Furthermore, the article concludes that because such inquiry involves the exploration and potential revision of students' world-ordering beliefs, developing flexibility in how one thinks may be better achieved within academic subjects and topics that are not so intimately connected to students' current social lives, especially where students may be directly implicated.