Recent Progress in STIR 5.0
STIR is open-source software for Emission Tomography data manipulation and image reconstruction, covering both PET and SPECT. This work highlights recent additions to the STIR code base, namely the ability to read General Electric (GE) Raw Data Format 9 (RDF9) files, the incorporation of GPU operators for forward and back projection, and work towards quantitative imaging for both PET and SPECT.
Versatile regularisation toolkit for iterative image reconstruction with proximal splitting algorithms
Ill-posed image recovery requires regularisation to ensure stability. The presented open-source regularisation toolkit consists of state-of-the-art variational algorithms that can be embedded in a plug-and-play fashion
into the general framework of proximal splitting methods. The packaged regularisers aim to satisfy various prior expectations about the investigated objects, e.g., their structural characteristics or smooth or non-smooth surface morphology.
The flexibility of the toolkit helps with the design of more advanced model-based iterative reconstruction methods
for different imaging modalities while operating with simpler building blocks. The toolkit is written for CPU and
GPU architectures and wrapped for Python/MATLAB. We demonstrate the functionality of the toolkit in application
to Positron Emission Tomography (PET) and X-ray synchrotron computed tomography (CT).
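The plug-and-play pattern described above can be sketched in a few lines. This is an illustrative example, not the toolkit's actual API: a proximal-gradient iteration for min_x 0.5||Ax - b||^2 + tau*g(x), in which the regulariser g enters only through its proximal operator, here soft-thresholding for the L1 norm; the function names, operator A, and parameters are assumptions for the sketch.

```python
import numpy as np

def prox_l1(v, tau):
    # Proximal operator of tau * ||x||_1 (soft-thresholding):
    # a simple stand-in for any packaged regulariser's prox.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient(A, b, reg_prox, tau, n_iter=200):
    # Minimise 0.5 * ||A x - b||^2 + tau * g(x); the regulariser is
    # "plugged in" purely through its proximal operator reg_prox.
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)             # gradient of the data-fidelity term
        x = reg_prox(x - step * grad, step * tau)
    return x
```

Swapping `prox_l1` for a different proximal operator changes the prior without touching the solver, which is the modularity such a toolkit exploits.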
Validation of frequency and mode extraction calculations from time-domain simulations of accelerator cavities
The recently developed frequency extraction algorithm [G.R. Werner and J.R.
Cary, J. Comp. Phys. 227, 5200 (2008)] that enables a simple FDTD algorithm to
be transformed into an efficient eigenmode solver is applied to a realistic
accelerator cavity modeled with embedded boundaries and Richardson
extrapolation. Previously, the frequency extraction method was shown to be
capable of distinguishing M degenerate modes by running M different simulations
and to permit mode extraction with minimal post-processing effort that only
requires solving a small eigenvalue problem. Realistic calculations for an
accelerator cavity are presented in this work to establish the validity of the
method for realistic modeling scenarios and to illustrate the complexities of
the computational validation process. The method is found to be able to extract
the frequencies with error of less than a part in 10^5. The corrected
experimental and computed values differ by about one part in 10^4, which is
accounted for (in largest part) by machining errors. The extraction of
frequencies and modes from accelerator cavities gives engineers and
physicists an understanding of how potential cavity performance depends on
shape, without incurring manufacture and measurement costs.
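The idea of reducing frequency extraction to a small eigenvalue problem can be illustrated with a toy Prony-style (linear-prediction) example. This is not the Werner-Cary filter-diagonalization algorithm itself, and the signal and frequencies are invented for the sketch: recurrence coefficients are fit by least squares, and the mode frequencies come from the roots of the characteristic polynomial, which `np.roots` finds as the eigenvalues of a small companion matrix.

```python
import numpy as np

def extract_frequencies(signal, n_modes):
    # Fit a linear recurrence of order p = 2*n_modes (each real sinusoid
    # contributes a conjugate pair of roots), then read the frequencies
    # off the roots of the characteristic polynomial -- np.roots solves
    # this as a small companion-matrix eigenvalue problem.
    p = 2 * n_modes
    N = len(signal)
    A = np.column_stack([signal[p - j - 1:N - j - 1] for j in range(p)])
    b = signal[p:]
    c, *_ = np.linalg.lstsq(A, -b, rcond=None)    # recurrence coefficients
    roots = np.roots(np.concatenate(([1.0], c)))  # companion-matrix eigenvalues
    # Each mode appears twice (one conjugate pair of roots per sinusoid).
    return np.sort(np.abs(np.angle(roots)) / (2.0 * np.pi))
```

For a noiseless two-mode signal the recovered frequencies are exact to machine precision; realistic cavity data adds the complications of damping, noise, and near-degenerate modes that the full method addresses.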
Design and Optimization of Large Accelerator Systems through High-Fidelity Electromagnetic Simulations
SciDAC1, with its support for the 'Advanced Computing for 21st Century Accelerator Science and Technology' (AST) project, witnessed dramatic advances in electromagnetic (EM) simulations for the design and optimization of important accelerators across the Office of Science. In SciDAC2, EM simulations continue to play an important role in the 'Community Petascale Project for Accelerator Science and Simulation' (ComPASS), through close collaborations with SciDAC CETs/Institutes in computational science. Existing codes will be improved and new multi-physics tools will be developed to model large accelerator systems with unprecedented realism and high accuracy using computing resources at petascale. These tools target the most challenging problems facing the ComPASS project. Supported by advances in computational science research, they have been successfully applied to the International Linear Collider (ILC) and the Large Hadron Collider (LHC) in High Energy Physics (HEP), the JLab 12-GeV Upgrade in Nuclear Physics (NP), as well as the Spallation Neutron Source (SNS) and the Linac Coherent Light Source (LCLS) in Basic Energy Sciences (BES).
Optimal values of rovibronic energy levels for triplet electronic states of molecular deuterium
An optimal set of 1050 rovibronic energy levels for 35 triplet electronic states
of D2 has been obtained by means of a statistical analysis of all available
wavenumbers of triplet-triplet rovibronic transitions studied in emission,
absorption, laser, and anticrossing spectroscopic experiments by various
authors. We used a new method of analysis (Lavrov, Ryazanov, JETP Letters,
2005), which does not need any a priori assumptions concerning the
molecular structure, being based on only two fundamental principles:
Rydberg-Ritz and maximum likelihood. The method provides the opportunity to
obtain RMS estimates of the uncertainties of the experimental wavenumbers
independently of those presented in the original papers. 234 of 3822 published
wavenumber values were found to be spurious, while the remaining set of the
data may be divided into 20 subsets (samples) of uniformly precise data having
close-to-normal distributions of random errors within the samples. New
experimental wavenumber values for 125 questionable lines were obtained in the
present work. Optimal values of the rovibronic levels were obtained from the
experimental data set consisting of 3713 wavenumber values (3588 old and 125
new). The unknown shift between levels of ortho- and para-deuterium was found
by a least-squares analysis of rovibronic levels with odd and even values of the
rotational quantum number. All the energy levels were obtained relative to the
lowest vibro-rotational level of the lowest of these electronic states and are
presented in tabular form together with the standard deviations of their
empirical determination. The new energy level values differ significantly from
those available in the literature.
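The Rydberg-Ritz step of such an analysis, in which every measured wavenumber is the difference of two level term values, reduces to a sparse least-squares problem. Below is a minimal sketch with invented level energies, line list, and noise level; the real analysis additionally weights lines by their estimated uncertainties and screens spurious values.

```python
import numpy as np

# Toy Rydberg-Ritz fit: each measured wavenumber is a level difference,
# nu = E_upper - E_lower.  All values here are invented for illustration.
true_E = np.array([0.0, 100.0, 250.0, 420.0])     # term values, cm^-1
lines = [(1, 0), (2, 0), (2, 1), (3, 1), (3, 2)]  # (upper, lower) level pairs
rng = np.random.default_rng(0)
nu = np.array([true_E[u] - true_E[l] for u, l in lines])
nu += rng.normal(0.0, 0.01, len(lines))           # measurement noise

# Design matrix: +1 on the upper level, -1 on the lower level.
D = np.zeros((len(lines), len(true_E)))
for i, (u, l) in enumerate(lines):
    D[i, u], D[i, l] = 1.0, -1.0

# Energies are only determined up to a common shift, so fix E[0] = 0
# and solve for the remaining levels by least squares.
E_rest, *_ = np.linalg.lstsq(D[:, 1:], nu, rcond=None)
E = np.concatenate(([0.0], E_rest))
```

Because levels appear in many lines, the redundancy both averages down random errors and exposes inconsistent (spurious) wavenumbers through their residuals.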
COMPASS, the COMmunity Petascale project for Accelerator Science and Simulation, a broad computational accelerator physics initiative
Accelerators are the largest and most costly scientific instruments of the Department of Energy, with uses across a broad range of science, including colliders for particle physics and nuclear science and light sources and neutron sources for materials studies. COMPASS, the Community Petascale Project for Accelerator Science and Simulation, is a broad, four-office (HEP, NP, BES, ASCR) effort to develop computational tools for the prediction and performance enhancement of accelerators. The tools being developed can be used to predict the dynamics of beams in the presence of optical elements and space-charge forces, the electromagnetic modes and wake fields of cavities, the cooling induced by comoving beams, and the acceleration of beams by intense fields in plasmas generated by beams or lasers. In SciDAC-1, the computational tools had multiple successes in predicting the dynamics of beams and beam generation. In SciDAC-2 these tools will be petascale-enabled to allow the inclusion of an unprecedented level of physics for detailed prediction.
Data Descriptor: A global multiproxy database for temperature reconstructions of the Common Era
Reproducible climate reconstructions of the Common Era (1 CE to present) are key to placing industrial-era warming into the context of natural climatic variability. Here we present a community-sourced database of temperature-sensitive proxy records from the PAGES2k initiative. The database gathers 692 records from 648 locations, including all continental regions and major ocean basins. The records are from trees, ice, sediment, corals, speleothems, documentary evidence, and other archives. They range in length from 50 to 2000 years, with a median of 547 years, while temporal resolution ranges from biweekly to centennial. Nearly half of the proxy time series are significantly correlated with HadCRUT4.2 surface temperature over the period 1850-2014. Global temperature composites show a remarkable degree of coherence between high- and low-resolution archives, with broadly similar patterns across archive types, terrestrial versus marine locations, and screening criteria. The database is suited to investigations of global and regional temperature variability over the Common Era, and is shared in the Linked Paleo Data (LiPD) format, including serializations in Matlab, R and Python.

Since the pioneering work of D'Arrigo and Jacoby1-3, as well as Mann et al.4,5, temperature reconstructions of the Common Era have become a key component of climate assessments6-9. Such reconstructions depend strongly on the composition of the underlying network of climate proxies10, and it is therefore critical for the climate community to have access to a community-vetted, quality-controlled database of temperature-sensitive records stored in a self-describing format.
The Past Global Changes (PAGES) 2k consortium, a self-organized, international group of experts, recently assembled such a database and used it to reconstruct surface temperature over continental-scale regions11 (hereafter, 'PAGES2k-2013').

This data descriptor presents version 2.0.0 of the PAGES2k proxy temperature database (Data Citation 1). It augments the PAGES2k-2013 collection of terrestrial records with marine records assembled by the Ocean2k working group at centennial12 and annual13 time scales. In addition to these previously published data compilations, this version includes substantially more records, extensive new metadata, and validation. Furthermore, the selection criteria for records included in this version are applied more uniformly and transparently across regions, resulting in a more cohesive data product.

This data descriptor describes the contents of the database and the criteria for inclusion, and quantifies the relation of each record to instrumental temperature. In addition, the paleotemperature time series are summarized as composites to highlight the most salient decadal- to centennial-scale behaviour of the dataset and to check mutual consistency between paleoclimate archives. We provide extensive Matlab code to probe the database: processing, filtering, and aggregating it in various ways to investigate temperature variability over the Common Era. The unique approach to data stewardship and code-sharing employed here is designed to enable an unprecedented scale of investigation of the temperature history of the Common Era, by the scientific community and citizen-scientists alike.
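The correlation screening mentioned above can be sketched as follows. This is a simplified illustration only (the function name and threshold are assumptions, not the database's code); the actual screening uses significance tests that account for each record's temporal resolution and autocorrelation.

```python
import numpy as np

def screen_proxies(proxies, temp, r_crit=0.3):
    # Keep proxy series whose Pearson correlation with an instrumental
    # temperature series over the common period exceeds a critical value.
    kept = {}
    for name, series in proxies.items():
        r = np.corrcoef(series, temp)[0, 1]
        if abs(r) >= r_crit:
            kept[name] = r
    return kept
```

A record that merely oscillates around its mean is rejected, while one that tracks the instrumental trend is retained along with its correlation, which can then feed into compositing weights.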