SIRF: Synergistic Image Reconstruction Framework
The combination of positron emission tomography (PET) with magnetic resonance (MR) imaging opens the way to more accurate diagnosis and improved patient management. At present, the data acquired by PET-MR scanners are essentially processed separately, but the opportunity to improve the accuracy of the tomographic reconstruction via synergy of the two imaging techniques is an active area of research. In this paper, we present Release 2.1.0 of the CCP-PETMR Synergistic Image Reconstruction Framework (SIRF) software suite, providing an open-source software platform for efficient implementation and validation of novel reconstruction algorithms. SIRF provides user-friendly Python and MATLAB interfaces built on top of C++ libraries. SIRF uses advanced PET and MR reconstruction software packages and tools: currently, for PET this is Software for Tomographic Image Reconstruction (STIR); for MR, Gadgetron and ISMRMRD; and for image registration, NiftyReg. The software aims to be capable of reconstructing images from acquired scanner data, whilst being simple enough to be used for educational purposes. The most recent version of the software can be downloaded from http://www.ccppetmr.ac.uk/downloads and https://github.com/CCPPETMR/.
Program summary:
Program Title: Synergistic Image Reconstruction Framework (SIRF)
Program Files DOI: http://dx.doi.org/10.17632/s45f5jh55j.1
Licensing provisions: GPLv3 and Apache-2.0
Programming languages: C++, C, Python, MATLAB
Nature of problem: In current practice, data acquired by PET-MR scanners are processed separately. Methods for improving the accuracy of the tomographic reconstruction using the synergy of the two imaging techniques are actively being investigated by the PET-MR research and development community; however, practical application is heavily reliant on software.
Open-source software available to the PET-MR community, such as the PET package STIR (Thielemans et al., 2012) and the MR package Gadgetron (Hansen and Sørensen, 2013), provides a basis for new synergistic PET-MR software. However, these two software packages are independent and have very different software architectures. They are mostly written in C++, but many researchers in the PET-MR community are more familiar with script-style languages, such as Python and MATLAB, which enable rapid prototyping of novel reconstruction algorithms. In the current situation it is difficult for researchers to exploit any synergy between PET and MR data; furthermore, techniques from one field cannot easily be applied in the other.
Solution method: In SIRF, the bulk of computation is performed by available advanced open-source reconstruction and registration software (currently STIR, Gadgetron and NiftyReg) that can use multithreading and GPUs. The SIRF C++ code provides a thin layer on top of these existing libraries, with unified data containers and access mechanisms. This C++ layer provides the basis for a simple and intuitive Python and MATLAB interface, enabling users to quickly develop and test their reconstruction algorithms using these scripting languages only. At the same time, advanced users proficient in C++ can directly utilise wider SIRF functionality via the SIRF C++ libraries that we provide.
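The kind of algorithm a user would prototype through such a scripting interface can be illustrated with a toy maximum-likelihood expectation-maximization (MLEM) loop, the classic iterative reconstruction for emission tomography. This is a minimal stand-alone sketch, not SIRF's actual API: the `mlem` function, its list-of-lists system matrix, and the toy dimensions are all illustrative choices.

```python
# Toy MLEM (ML-EM) reconstruction loop for emission tomography.
# NOT the SIRF API: a minimal stdlib-only stand-in for the kind of
# algorithm one would prototype via SIRF's Python layer.

def mlem(system_matrix, measured, n_iters=50):
    """system_matrix[i][j]: probability that an emission in voxel j
    is detected in bin i; measured[i]: counts in detector bin i."""
    n_bins = len(system_matrix)
    n_vox = len(system_matrix[0])
    # sensitivity image: column sums of the system matrix
    sens = [sum(system_matrix[i][j] for i in range(n_bins))
            for j in range(n_vox)]
    x = [1.0] * n_vox  # uniform non-negative initial image
    for _ in range(n_iters):
        # forward-project the current estimate
        proj = [sum(system_matrix[i][j] * x[j] for j in range(n_vox))
                for i in range(n_bins)]
        # back-project the measured/estimated ratio, multiplicative update
        for j in range(n_vox):
            ratio_bp = sum(system_matrix[i][j] * measured[i] / proj[i]
                           for i in range(n_bins) if proj[i] > 0)
            x[j] *= ratio_bp / sens[j]
    return x
```

With noiseless, consistent data the iterates converge to the true activity; in SIRF the forward/back-projection steps would instead be delegated to STIR's acquisition models.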
Optimal values of rovibronic energy levels for triplet electronic states of molecular deuterium
An optimal set of 1050 rovibronic energy levels for 35 triplet electronic states of molecular deuterium has been obtained by means of a statistical analysis of all available wavenumbers of triplet-triplet rovibronic transitions studied in emission, absorption, laser and anticrossing spectroscopic experiments by various authors. We used a new method of analysis (Lavrov, Ryazanov, JETP Letters, 2005), which needs no a priori assumptions concerning the molecular structure, being based on only two fundamental principles: the Rydberg-Ritz combination principle and maximum likelihood. The method provides the opportunity to obtain RMS estimates of the uncertainties of the experimental wavenumbers independently of those presented in the original papers. 234 of the 3822 published wavenumber values were found to be spurious, while the remaining data may be divided into 20 subsets (samples) of uniformly precise data having close-to-normal distributions of random errors within the samples. New experimental wavenumber values of 125 questionable lines were obtained in the present work. Optimal values of the rovibronic levels were obtained from the experimental data set consisting of 3713 wavenumber values (3588 old and 125 new). The unknown shift between levels of ortho- and para-deuterium was found by least-squares analysis of rovibronic levels with odd and even rotational quantum numbers. All the energy levels were obtained relative to the lowest vibro-rotational level of the lowest triplet electronic state, and are presented in tabular form together with the standard deviations of the empirical determination. New energy-level values differ significantly from those available in the literature. (Comment: 46 pages, 9 figures.)
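The core of the Rydberg-Ritz step above is linear: each measured transition wavenumber is the difference of two term energies, nu = E_upper - E_lower, so level energies follow from a least-squares fit with one level pinned to zero as the reference. The sketch below shows that idea on toy numbers (the function name, the unweighted fit, and the three-level example are illustrative; the paper's method additionally re-estimates the measurement uncertainties and tests for spurious lines).

```python
# Least-squares term-energy fit from transition wavenumbers (Rydberg-Ritz):
# each transition contributes one linear equation E[up] - E[lo] = nu.
# Level 0 is fixed at zero as the energy reference. Toy data, unweighted.

def fit_levels(transitions, n_levels):
    """transitions: list of (upper, lower, wavenumber) with integer level
    indices; returns fitted energies E[0..n_levels-1], E[0] = 0."""
    n = n_levels - 1  # unknowns E[1..n_levels-1]
    # build the normal equations (A^T A) x = A^T b
    ata = [[0.0] * n for _ in range(n)]
    atb = [0.0] * n
    for up, lo, nu in transitions:
        row = [0.0] * n
        if up > 0:
            row[up - 1] += 1.0
        if lo > 0:
            row[lo - 1] -= 1.0
        for i in range(n):
            for j in range(n):
                ata[i][j] += row[i] * row[j]
            atb[i] += row[i] * nu
    # solve by Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        atb[col], atb[piv] = atb[piv], atb[col]
        for r in range(col + 1, n):
            f = ata[r][col] / ata[col][col]
            for c in range(col, n):
                ata[r][c] -= f * ata[col][c]
            atb[r] -= f * atb[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = atb[r] - sum(ata[r][c] * x[c] for c in range(r + 1, n))
        x[r] = s / ata[r][r]
    return [0.0] + x
```

Because a combination line such as the 2-to-1 transition overdetermines the system, inconsistent or spurious wavenumbers show up as large residuals in such a fit, which is the handle the statistical analysis uses.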
Design and Optimization of Large Accelerator Systems through High-Fidelity Electromagnetic Simulations
SciDAC1, with its support for the 'Advanced Computing for 21st Century Accelerator Science and Technology' (AST) project, witnessed dramatic advances in electromagnetic (EM) simulations for the design and optimization of important accelerators across the Office of Science. In SciDAC2, EM simulations continue to play an important role in the 'Community Petascale Project for Accelerator Science and Simulation' (ComPASS), through close collaborations with SciDAC CETs/Institutes in computational science. Existing codes will be improved and new multi-physics tools will be developed to model large accelerator systems with unprecedented realism and high accuracy using computing resources at petascale. These tools target the most challenging problems facing the ComPASS project. Supported by advances in computational science research, they have been successfully applied to the International Linear Collider (ILC) and the Large Hadron Collider (LHC) in High Energy Physics (HEP), the JLab 12-GeV Upgrade in Nuclear Physics (NP), as well as the Spallation Neutron Source (SNS) and the Linac Coherent Light Source (LCLS) in Basic Energy Sciences (BES)
COMPASS, the COMmunity Petascale project for Accelerator Science and Simulation, a broad computational accelerator physics initiative
Accelerators are the largest and most costly scientific instruments of the Department of Energy, with uses across a broad range of science, including colliders for particle physics and nuclear science and light sources and neutron sources for materials studies. COMPASS, the Community Petascale Project for Accelerator Science and Simulation, is a broad, four-office (HEP, NP, BES, ASCR) effort to develop computational tools for the prediction and performance enhancement of accelerators. The tools being developed can be used to predict the dynamics of beams in the presence of optical elements and space-charge forces, to calculate the electromagnetic modes and wake fields of cavities, and to model the cooling induced by comoving beams and the acceleration of beams by intense fields in plasmas generated by beams or lasers. In SciDAC-1, the computational tools had multiple successes in predicting the dynamics of beams and beam generation. In SciDAC-2 these tools will be petascale-enabled to allow the inclusion of an unprecedented level of physics for detailed prediction
Data Descriptor: A global multiproxy database for temperature reconstructions of the Common Era
Reproducible climate reconstructions of the Common Era (1 CE to present) are key to placing industrial-era warming into the context of natural climatic variability. Here we present a community-sourced database of temperature-sensitive proxy records from the PAGES2k initiative. The database gathers 692 records from 648 locations, including all continental regions and major ocean basins. The records are from trees, ice, sediment, corals, speleothems, documentary evidence, and other archives. They range in length from 50 to 2000 years, with a median of 547 years, while temporal resolution ranges from biweekly to centennial. Nearly half of the proxy time series are significantly correlated with HadCRUT4.2 surface temperature over the period 1850-2014. Global temperature composites show a remarkable degree of coherence between high- and low-resolution archives, with broadly similar patterns across archive types, terrestrial versus marine locations, and screening criteria. The database is suited to investigations of global and regional temperature variability over the Common Era, and is shared in the Linked Paleo Data (LiPD) format, including serializations in Matlab, R and Python.
Since the pioneering work of D'Arrigo and Jacoby1-3, as well as Mann et al.4,5, temperature reconstructions of the Common Era have become a key component of climate assessments6-9. Such reconstructions depend strongly on the composition of the underlying network of climate proxies10, and it is therefore critical for the climate community to have access to a community-vetted, quality-controlled database of temperature-sensitive records stored in a self-describing format.
The Past Global Changes (PAGES) 2k consortium, a self-organized, international group of experts, recently assembled such a database, and used it to reconstruct surface temperature over continental-scale regions11 (hereafter 'PAGES2k-2013'). This data descriptor presents version 2.0.0 of the PAGES2k proxy temperature database (Data Citation 1). It augments the PAGES2k-2013 collection of terrestrial records with marine records assembled by the Ocean2k working group at centennial12 and annual13 time scales. In addition to these previously published data compilations, this version includes substantially more records, extensive new metadata, and validation. Furthermore, the selection criteria for records included in this version are applied more uniformly and transparently across regions, resulting in a more cohesive data product. This data descriptor describes the contents of the database and the criteria for inclusion, and quantifies the relation of each record with instrumental temperature. In addition, the paleotemperature time series are summarized as composites to highlight the most salient decadal- to centennial-scale behaviour of the dataset and to check mutual consistency between paleoclimate archives. We provide extensive Matlab code to probe the database: processing, filtering and aggregating it in various ways to investigate temperature variability over the Common Era. The unique approach to data stewardship and code-sharing employed here is designed to enable an unprecedented scale of investigation of the temperature history of the Common Era, by the scientific community and citizen-scientists alike
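The compositing step described above (standardize each proxy series, then average across records) can be sketched in a few lines. This is a hedged illustration, not the published Matlab pipeline: the dict-of-year-to-value representation, the function names, and the toy series are all assumptions; the real database carries LiPD metadata, screening criteria, and far more records.

```python
# Sketch of proxy compositing: z-score each record over a common
# reference period, then average the standardized records per year.
# Toy stand-in for the published PAGES2k Matlab processing code.

def standardize(series, ref_years):
    """z-score a {year: value} series over the reference years it covers."""
    vals = [series[y] for y in ref_years if y in series]
    mean = sum(vals) / len(vals)
    var = sum((v - mean) ** 2 for v in vals) / len(vals)
    sd = var ** 0.5 or 1.0  # guard against a constant series
    return {y: (v - mean) / sd for y, v in series.items()}

def composite(records, ref_years):
    """Average the standardized records year by year, using whichever
    records cover each year (records need not span the same interval)."""
    zs = [standardize(r, ref_years) for r in records]
    years = sorted({y for z in zs for y in z})
    return {y: sum(z[y] for z in zs if y in z) /
               sum(1 for z in zs if y in z)
            for y in years}
```

Averaging in standardized units is what lets archives of very different amplitude and resolution be compared on a common footing, which is how the coherence between high- and low-resolution archives noted above is assessed.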