Light Hadron Masses from Lattice QCD
This article reviews lattice QCD results for the light hadron spectrum. We
give an overview of different formulations of lattice QCD, with discussions on
the fermion doubling problem and improvement programs. We summarize recent
developments in algorithms and analysis techniques that render calculations
with light, dynamical quarks feasible on present-day computer resources.
Finally, we summarize spectrum results for ground state hadrons and resonances
using various actions.
Comment: 53 pages, 24 figures, one table; Rev.Mod.Phys. (published version); v2: corrected typos
Geometric Numerical Integration (hybrid meeting)
The topics of the workshop included interactions between geometric numerical integration and numerical partial differential equations; geometric aspects of stochastic differential equations; interaction with optimisation and machine learning; new applications of geometric integration in physics; and problems of discrete geometry, integrability, and algebraic aspects.
Data-driven reduction strategies for Bayesian inverse problems
A persistent central challenge in computational science and engineering (CSE), with both national and global security implications, is the efficient solution of large-scale Bayesian inverse problems. These problems range from estimating material parameters in subsurface simulations to estimating phenomenological parameters in climate models. Despite recent progress, our ability to quantify uncertainties and solve large-scale inverse problems lags well behind our ability to develop the governing forward simulations.
Inverse problems present unique computational challenges that are only magnified as we include larger observational data sets and demand higher-resolution parameter estimates. Even with the current state-of-the-art, solving deterministic large-scale inverse problems is prohibitively expensive. Large-scale uncertainty quantification (UQ), cast in the Bayesian inversion framework, is thus rendered intractable. To conquer these challenges, new methods that target the root causes of computational complexity are needed.
In this dissertation, we propose data-driven strategies for overcoming this “curse of dimensionality.” First, we address the computational complexity induced in large-scale inverse problems by high-dimensional observational data. We propose a randomized misfit approach (RMA), which uses random projections—quasi-orthogonal, information-preserving transformations—to map the high-dimensional data-misfit vector to a low-dimensional space. We provide the first theoretical explanation for why randomized misfit methods are successful in practice with a small reduced data-misfit dimension (n = O(1)).
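The random-projection idea behind the RMA can be illustrated with a small sketch. All dimensions, names, and the choice of a Gaussian sketching matrix below are illustrative assumptions, not details taken from the dissertation: a random matrix maps a high-dimensional misfit vector to a low-dimensional one while approximately preserving its squared norm (a Johnson-Lindenstrauss-type guarantee).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical high-dimensional data-misfit vector: m observations,
# reduced to an n-dimensional sketch (m, n chosen only for illustration).
m, n = 100_000, 50
misfit = rng.standard_normal(m)

# Gaussian sketching matrix: rows are quasi-orthogonal in expectation,
# and the 1/sqrt(n) scaling preserves squared norms on average.
S = rng.standard_normal((n, m)) / np.sqrt(n)
reduced = S @ misfit

# The reduced misfit's squared norm approximates the full one.
full_sq = np.dot(misfit, misfit)
red_sq = np.dot(reduced, reduced)
print(f"relative norm error: {abs(red_sq - full_sq) / full_sq:.3f}")
```

The relative error of the sketched norm concentrates around sqrt(2/n), which is why even a modest reduced dimension can be adequate for a misfit functional.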
Next, we develop the randomized geostatistical approach (RGA) for Bayesian subsurface inverse problems with high-dimensional data. We show that the RGA is able to resolve transient groundwater inverse problems with noisy observed data dimensions up to 10^7, whereas a comparison method fails due to out-of-memory errors.
Finally, we address the solution of Bayesian inverse problems with spatially localized data. The motivation is CSE applications that would gain from high-fidelity estimation over a smaller data-local domain, versus expensive and uncertain estimation over the full simulation domain. We propose several truncated domain inversion methods using domain decomposition theory to build model-informed artificial boundary conditions. Numerical investigations of MAP estimation and sampling demonstrate improved fidelity and fewer partial differential equation (PDE) solves with our truncated methods.
Computational Science, Engineering, and Mathematics
Large Scale Inverse Problems
This book is the second volume of a three-volume series recording the "Radon Special Semester 2011 on Multiscale Simulation & Analysis in Energy and the Environment" that took place in Linz, Austria, October 3-7, 2011. This volume addresses the common ground in the mathematical and computational procedures required for large-scale inverse problems and data assimilation in forefront applications. The solution of inverse problems is fundamental to a wide variety of applications such as weather forecasting, medical tomography, and oil exploration. Regularisation techniques are needed to ensure solutions of sufficient quality to be useful and soundly theoretically based. This book addresses the common techniques required for all the applications, and is thus truly interdisciplinary. This collection of survey articles focusses on the large inverse problems commonly arising in simulation and forecasting in the earth sciences.
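The need for regularisation noted here can be shown with a minimal Tikhonov example. The forward operator, grid, and parameter values below are invented for illustration and are far simpler than the applications the book treats: a smoothing operator with tiny singular values amplifies measurement noise catastrophically under naive inversion, while a small quadratic penalty stabilizes the solution.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ill-conditioned forward operator (illustrative only): a Gaussian
# smoothing kernel on a 1-D grid, whose small singular values make
# direct inversion unstable.
k = 60
x_grid = np.linspace(0.0, 1.0, k)
A = np.exp(-80.0 * (x_grid[:, None] - x_grid[None, :]) ** 2)
x_true = np.sin(2.0 * np.pi * x_grid)
y = A @ x_true + 1e-3 * rng.standard_normal(k)  # noisy data

# Naive inversion amplifies the noise; Tikhonov regularisation adds a
# penalty alpha * ||x||^2, i.e. solves (A^T A + alpha I) x = A^T y.
alpha = 1e-4
x_naive = np.linalg.solve(A, y)
x_tik = np.linalg.solve(A.T @ A + alpha * np.eye(k), A.T @ y)

err_naive = np.linalg.norm(x_naive - x_true) / np.linalg.norm(x_true)
err_tik = np.linalg.norm(x_tik - x_true) / np.linalg.norm(x_true)
print(f"naive: {err_naive:.2e}, Tikhonov: {err_tik:.2e}")
```

The regularised solution trades a small bias for a dramatic reduction in noise amplification; choosing alpha well is itself a central topic in the regularisation literature.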
BIOMEDICAL IMAGE RESOLUTION IMPROVEMENTS BY COMBINED USE OF FOCAL MODULATION, PUPIL ENGINEERING, AND SPARSITY PRIORS.
Ph.D. (Doctor of Philosophy)
Improvements in the robustness and accuracy of bioluminescence tomographic reconstructions of distributed sources within small animals
High-quality three-dimensional bioluminescence tomographic (BLT) images, if available, would constitute a major advance and provide much more useful information than the two-dimensional bioluminescence images that are frequently used today. To date, high-quality BLT images have not been available, largely because of the poor quality of the data being input into the reconstruction process. Many significant confounds are not routinely corrected for, and the noise in these data is unnecessarily large and poorly distributed. Moreover, many of the design choices affecting image quality are not well considered, including choices regarding the number and type of filters used when making multispectral measurements and choices regarding the frequency and uniformity of the sampling of both the range and domain of the BLT inverse problem. Finally, progress in BLT image quality is difficult to gauge owing to a lack of realistic gold-standard references that engage the full complexity and uncertainty within a small-animal BLT imaging experiment.
Within this dissertation, I address all of these issues. I develop a Cerenkov-based gold standard wherein a Positron Emission Tomography (PET) image can be used to gauge improvements in the accuracy of BLT reconstruction algorithms. In the process of creating this reference, I discover and describe corrections for several confounds that, if left uncorrected, would introduce artifacts into the BLT images. These include corrections for the angle of the animal’s skin surface relative to the camera, for the height of each point on the skin surface relative to the focal plane, and for the variation in bioluminescence intensity as a function of luciferin concentration over time. With these corrections applied, I go on to derive equations and algorithms that minimize the noise in the final images under the constraints of a multispectral BLT data acquisition. These equations and algorithms allow an optimal choice of filters to be made and the acquisition time to be optimally distributed among those filtered measurements. These optimizations make use of Barrett’s and Moore-Penrose pseudoinverse matrices, which also come into play in a paradigm I describe that can be used to guide choices regarding sampling of the domain and range.
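The role of the Moore-Penrose pseudoinverse in an underdetermined reconstruction problem can be sketched briefly. The system matrix, dimensions, and source below are generic stand-ins, not the dissertation's imaging model: with fewer measurements than source voxels, the pseudoinverse selects the minimum-norm solution that exactly reproduces the data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative stand-in for a tomographic system matrix: fewer
# filtered measurements than unknown source voxels (underdetermined).
n_meas, n_src = 30, 100
H = rng.standard_normal((n_meas, n_src))
source_true = np.zeros(n_src)
source_true[40:45] = 1.0          # a small localized source
data = H @ source_true            # noiseless synthetic data

# The Moore-Penrose pseudoinverse yields the minimum-norm solution of
# the underdetermined system H x = data.
x_mn = np.linalg.pinv(H) @ data

# The reconstruction fits the data exactly (up to round-off), even
# though the source itself is not uniquely determined by the data.
residual = np.linalg.norm(H @ x_mn - data)
print(f"data residual: {residual:.2e}")
```

Because many sources are consistent with the same measurements, the pseudoinverse's minimum-norm choice is one principled selection rule; the dissertation's use of such matrices to guide sampling of the domain and range builds on this same structure.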