    SIRF: Synergistic Image Reconstruction Framework

    The combination of positron emission tomography (PET) with magnetic resonance (MR) imaging opens the way to more accurate diagnosis and improved patient management. At present, the data acquired by PET-MR scanners are essentially processed separately, but the opportunity to improve the accuracy of the tomographic reconstruction via synergy between the two imaging techniques is an active area of research. In this paper, we present Release 2.1.0 of the CCP-PETMR Synergistic Image Reconstruction Framework (SIRF) software suite, an open-source software platform for efficient implementation and validation of novel reconstruction algorithms. SIRF provides user-friendly Python and MATLAB interfaces built on top of C++ libraries, and draws on established PET and MR reconstruction packages: currently Software for Tomographic Image Reconstruction (STIR) for PET, Gadgetron and ISMRMRD for MR, and NiftyReg for image registration. The software aims to be capable of reconstructing images from acquired scanner data, whilst being simple enough to be used for educational purposes.
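
    To give a flavour of the framework, the sketch below follows the style of the SIRF Python demos: it loads PET raw data, builds a ray-tracing acquisition model and a Poisson log-likelihood objective, and runs a few sub-iterations of STIR's OSMAPOSL (OSEM-type) reconstructor through SIRF. The file name and parameter values are placeholders, and the exact calls should be checked against the SIRF 2.x documentation; this is a minimal illustration, not a verbatim excerpt of the package's own examples.

        import sirf.STIR as pet

        # Measured PET data in Interfile format; the file name is a placeholder.
        acq_data = pet.AcquisitionData('my_sinogram.hs')

        # Uniform initial image with the geometry implied by the acquisition data.
        init_image = acq_data.create_uniform_image(1.0)

        # Forward model based on STIR's ray-tracing projection matrix.
        acq_model = pet.AcquisitionModelUsingRayTracingMatrix()

        # Poisson log-likelihood objective function linking data and model.
        obj_fun = pet.make_Poisson_loglikelihood(acq_data)
        obj_fun.set_acquisition_model(acq_model)

        # Ordered-subsets reconstruction (OSMAPOSL) provided by STIR via SIRF.
        recon = pet.OSMAPOSLReconstructor()
        recon.set_objective_function(obj_fun)
        recon.set_num_subsets(4)
        recon.set_num_subiterations(8)
        recon.set_input(acq_data)
        recon.set_up(init_image)
        recon.set_current_estimate(init_image)
        recon.process()

        reconstructed = recon.get_output()   # SIRF ImageData object

    The MATLAB interface exposes essentially the same objects, so an equivalent script looks very similar.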

    Optimal values of rovibronic energy levels for triplet electronic states of molecular deuterium

    An optimal set of 1050 rovibronic energy levels for 35 triplet electronic states of $D_2$ has been obtained by a statistical analysis of all available wavenumbers of triplet-triplet rovibronic transitions studied in emission, absorption, laser and anticrossing spectroscopic experiments by various authors. We used a new method of analysis (Lavrov and Ryazanov, JETP Letters, 2005) that needs no a priori assumptions about molecular structure, being based on only two fundamental principles: the Rydberg-Ritz combination principle and maximum likelihood. The method makes it possible to obtain RMS estimates of the uncertainties of the experimental wavenumbers independently of those reported in the original papers. Of the 3822 published wavenumber values, 234 were found to be spurious, while the remaining data can be divided into 20 subsets (samples) of uniformly precise data with close-to-normal distributions of random errors within each sample. New experimental wavenumber values for 125 questionable lines were obtained in the present work. Optimal values of the rovibronic levels were then derived from a data set of 3713 wavenumber values (3588 old and 125 new). The unknown shift between levels of ortho- and para-deuterium was found by a least-squares analysis of the $a^3\Sigma_g^+$, $v=0$, $N = 0 \div 18$ rovibronic levels with odd and even values of $N$. All energy levels are given relative to the lowest vibro-rotational level ($v=0$, $N=0$) of the $a^3\Sigma_g^+$ electronic state, and are presented in tabular form together with the standard deviations of their empirical determination. The new energy level values differ significantly from those available in the literature.
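
    The statistical machinery of Lavrov and Ryazanov's method is not reproduced here, but the underlying Rydberg-Ritz idea can be illustrated with a toy calculation: each measured wavenumber is the difference of two term energies, so the whole set of levels follows from a weighted least-squares fit to all measured lines, with one level fixed as the energy zero. The transitions, uncertainties and level count below are invented placeholders; only the structure of the fit is meaningful.

        import numpy as np

        # Toy data: (upper level, lower level, wavenumber / cm^-1, sigma / cm^-1).
        # These numbers are invented for illustration, not taken from the paper.
        transitions = [
            (1, 0, 1000.02, 0.05),
            (2, 0, 1810.01, 0.05),
            (2, 1,  809.95, 0.05),
            (3, 1, 1500.10, 0.10),
            (3, 2,  690.20, 0.10),
        ]
        n_levels = 4

        # Rydberg-Ritz: nu_k = E_upper(k) - E_lower(k), written as A @ E = nu.
        A = np.zeros((len(transitions), n_levels))
        nu = np.zeros(len(transitions))
        w = np.zeros(len(transitions))
        for k, (up, lo, value, sigma) in enumerate(transitions):
            A[k, up], A[k, lo] = 1.0, -1.0
            nu[k] = value
            w[k] = 1.0 / sigma                     # weight each line by 1/sigma

        # Fix level 0 as the energy zero (drop its column) and solve the
        # weighted least-squares problem for the remaining level energies.
        Aw = A[:, 1:] * w[:, None]
        energies = np.linalg.lstsq(Aw, nu * w, rcond=None)[0]

        print("Level energies relative to level 0 (cm^-1):", energies)

    The actual analysis additionally re-estimates the experimental uncertainties, detects spurious lines and partitions the data into uniformly precise samples, none of which is attempted in this sketch.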

    Data Descriptor: A global multiproxy database for temperature reconstructions of the Common Era

    Reproducible climate reconstructions of the Common Era (1 CE to present) are key to placing industrial-era warming into the context of natural climatic variability. Here we present a community-sourced database of temperature-sensitive proxy records from the PAGES2k initiative. The database gathers 692 records from 648 locations, including all continental regions and major ocean basins. The records are from trees, ice, sediment, corals, speleothems, documentary evidence, and other archives. They range in length from 50 to 2000 years, with a median of 547 years, while temporal resolution ranges from biweekly to centennial. Nearly half of the proxy time series are significantly correlated with HadCRUT4.2 surface temperature over the period 1850-2014. Global temperature composites show a remarkable degree of coherence between high- and low-resolution archives, with broadly similar patterns across archive types, terrestrial versus marine locations, and screening criteria. The database is suited to investigations of global and regional temperature variability over the Common Era, and is shared in the Linked Paleo Data (LiPD) format, including serializations in Matlab, R and Python.

    Since the pioneering work of D'Arrigo and Jacoby [1-3], as well as Mann et al. [4,5], temperature reconstructions of the Common Era have become a key component of climate assessments [6-9]. Such reconstructions depend strongly on the composition of the underlying network of climate proxies [10], and it is therefore critical for the climate community to have access to a community-vetted, quality-controlled database of temperature-sensitive records stored in a self-describing format. The Past Global Changes (PAGES) 2k consortium, a self-organized, international group of experts, recently assembled such a database and used it to reconstruct surface temperature over continental-scale regions [11] (hereafter 'PAGES2k-2013').

    This data descriptor presents version 2.0.0 of the PAGES2k proxy temperature database (Data Citation 1). It augments the PAGES2k-2013 collection of terrestrial records with marine records assembled by the Ocean2k working group at centennial [12] and annual [13] time scales. In addition to these previously published data compilations, this version includes substantially more records, extensive new metadata, and validation. Furthermore, the selection criteria for records included in this version are applied more uniformly and transparently across regions, resulting in a more cohesive data product.

    This data descriptor describes the contents of the database and the criteria for inclusion, and quantifies the relation of each record to instrumental temperature. In addition, the paleotemperature time series are summarized as composites to highlight the most salient decadal- to centennial-scale behaviour of the dataset and to check mutual consistency between paleoclimate archives. We provide extensive Matlab code to probe the database, processing, filtering and aggregating it in various ways to investigate temperature variability over the Common Era. The unique approach to data stewardship and code-sharing employed here is designed to enable an unprecedented scale of investigation of the temperature history of the Common Era, by the scientific community and citizen-scientists alike.
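
    As a small illustration of the screening idea mentioned above (correlating each proxy record with instrumental temperature over 1850-2014), the sketch below correlates a synthetic annual proxy series with a synthetic temperature series. In real use the proxy would be read from a LiPD file and the instrumental series taken from HadCRUT4; only a simple Pearson test is shown here, whereas the database's actual screening involves additional choices (for example, handling of temporal resolution and serial correlation) that are not attempted in this sketch.

        import numpy as np
        from scipy.stats import pearsonr

        # Synthetic stand-ins for an annual proxy record and an instrumental
        # temperature series over 1850-2014 (real use: LiPD record + HadCRUT4).
        rng = np.random.default_rng(0)
        years = np.arange(1850, 2015)
        temperature = 0.005 * (years - 1850) + 0.2 * rng.standard_normal(years.size)
        proxy = 2.0 * temperature + 0.5 * rng.standard_normal(years.size)

        # Use only years where both series have values (real proxies have gaps).
        mask = np.isfinite(proxy) & np.isfinite(temperature)
        r, p = pearsonr(proxy[mask], temperature[mask])

        print(f"Pearson r = {r:.2f}, p = {p:.3g}")
        passes_screening = p < 0.05   # simplistic criterion, for illustration only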