New Horizons in Time-Domain Diffuse Optical Spectroscopy and Imaging
Jöbsis was the first to describe the in vivo application of near-infrared spectroscopy (NIRS), also called diffuse optical spectroscopy (DOS). NIRS was originally designed for the clinical monitoring of tissue oxygenation, and it has since also become a useful tool for neuroimaging studies (functional near-infrared spectroscopy, fNIRS). However, difficulties in the selective and quantitative measurement of tissue hemoglobin (Hb), which have been central to the NIRS field for over 40 years, remain unsolved. To overcome these problems, time-domain (TD) and frequency-domain (FD) measurements have been explored. A wide range of NIRS instruments is now available, including widely used commercial instruments for continuous-wave (CW) measurements based on the modified Beer–Lambert law (steady-state domain measurements). Among these approaches, TD measurement is the most promising, although compared with CW and FD measurements, TD measurements are less common, owing to the need for large and expensive instruments with poor temporal resolution and limited dynamic range. However, thanks to technological developments, TD measurements are increasingly being used in research and in various clinical settings. This Special Issue highlights issues at the cutting edge of TD DOS and diffuse optical tomography (DOT). It covers all aspects of TD measurements, including advances in hardware, methodology, the theory of light propagation, and clinical applications.
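The modified Beer–Lambert law mentioned above relates attenuation changes at two (or more) wavelengths to changes in oxy- and deoxyhemoglobin concentration. A minimal sketch follows; the extinction coefficients, source–detector distance, and differential pathlength factor (DPF) are illustrative placeholder values, not calibrated ones.

```python
# Minimal sketch of the modified Beer-Lambert law (MBLL) used in CW-NIRS:
#   delta_OD(lambda) = [eps_HbO2 * dHbO2 + eps_Hb * dHb] * d * DPF
# All numeric values below are illustrative placeholders.

def mbll_concentration_changes(d_od, eps, distance_cm, dpf):
    """Solve the 2x2 MBLL system for (dHbO2, dHb) from two wavelengths."""
    (a, b), (c, e) = eps            # extinction coefficients per wavelength
    l = distance_cm * dpf           # effective optical pathlength
    det = (a * e - b * c) * l
    d1, d2 = d_od                   # measured attenuation changes
    d_hbo2 = (e * d1 - b * d2) / det
    d_hb = (-c * d1 + a * d2) / det
    return d_hbo2, d_hb

# Illustrative use: attenuation changes measured at 760 nm and 850 nm
eps = ((1.4, 3.8),   # 760 nm: (HbO2, Hb) extinction coefficients
       (2.5, 1.8))   # 850 nm: (HbO2, Hb)
dHbO2, dHb = mbll_concentration_changes((0.02, 0.03), eps,
                                        distance_cm=3.0, dpf=6.0)
```

Reinserting the recovered concentrations into the forward model reproduces the measured attenuation changes, which is a quick sanity check on the inversion.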
Exploring QCD matter in extreme conditions with Machine Learning
In recent years, machine learning has emerged as a powerful computational tool and a novel problem-solving perspective for physics, offering new avenues for studying the properties of strongly interacting QCD matter under extreme conditions. This review article provides an overview of the current state of this intersection of fields, focusing on the application of machine learning to theoretical studies in high-energy nuclear physics. It covers diverse aspects, including heavy-ion collisions, lattice field theory, and neutron stars, and discusses how machine learning can be used to explore and facilitate the physics goals of understanding QCD matter. The review also surveys common methodological themes, ranging from data-driven to physics-driven perspectives. We conclude by discussing the challenges and future prospects of machine learning applications in high-energy nuclear physics, underscoring the importance of incorporating physics priors into the purely data-driven learning toolbox. This review highlights the critical role of machine learning as a valuable computational paradigm for advancing physics exploration in high-energy nuclear physics. Comment: 146 pages, 53 figures
Simulated Annealing
The book contains 15 chapters presenting recent contributions of top researchers working with Simulated Annealing (SA). Although it represents a small sample of the research activity on SA, the book will certainly serve as a valuable tool for researchers interested in getting involved in this multidisciplinary field. In fact, one of its salient features is that it is highly multidisciplinary in its application areas, assembling experts from the fields of Biology, Telecommunications, Geology, Electronics, and Medicine.
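For readers new to the field, the core SA loop is short enough to sketch. This is a generic textbook-style version, not an algorithm from the book: propose a random move, always accept improvements, accept worsening moves with probability exp(-Δ/T), and cool the temperature geometrically. The test function, step size, and cooling schedule are arbitrary illustrative choices.

```python
import math
import random

def simulated_annealing(cost, x0, step=0.5, t0=1.0, cooling=0.995,
                        iters=5000, seed=0):
    """Generic SA on a 1-D continuous variable; returns the best point found."""
    rng = random.Random(seed)
    x, c = x0, cost(x0)
    best_x, best_c = x, c
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)      # random neighbor
        cand_c = cost(cand)
        # Metropolis criterion: always accept downhill, sometimes uphill.
        if cand_c < c or rng.random() < math.exp((c - cand_c) / t):
            x, c = cand, cand_c
            if c < best_c:
                best_x, best_c = x, c
        t *= cooling                             # geometric cooling schedule
    return best_x, best_c

# Toy usage: a multimodal objective where plain gradient descent gets stuck
f = lambda x: x * x + 2 * math.sin(5 * x)
x_min, c_min = simulated_annealing(f, x0=4.0)
```

The uphill-acceptance probability exp(-Δ/T) is what lets SA escape local minima early on, while the cooling schedule makes it increasingly greedy as the run progresses.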
Radiative Transfer Using Path Integrals for Multiple Scattering in Participating Media
The theory of light transport forms the basis on which many computer graphics renderers are implemented. The more general theory of radiative transfer has applications in the wider scientific community, including ocean and atmospheric science, medicine, and even geophysics. Accurately capturing the multiple-scattering physics of light transport is an issue of great concern. Multiple scattering is responsible for indirect lighting, which is desired for images where high realism is the goal. Additionally, multiple scattering is quite important for scientific applications, as it is a routine phenomenon. Computationally, it is a difficult process to model. Many have developed solutions for hard-surface scenes in which light is assumed to travel in straight paths, that is, scenes without participating media. However, multiple scattering in participating media is still an open question, especially in developing robust and general techniques for particularly difficult scenes.
Radiative transfer can be expressed mathematically as a Feynman path integral (FPI), and we give background on how the transport kernel of the volume rendering equation can be written in terms of an FPI. To move this model into a numerical setting, we need numerical methods to solve it. We start by focusing on the spatial and angular integrals of the volume rendering equation, and show a way to generate seed paths without regard to whether they are cast from the emitter or the sensor. Seed paths are converted into a discretized form, and we use an existing numerical method to tackle the FPI. A modified version of this technique shows how to reduce the running time from quadratic to linear. We then perform experimental analysis of the path integral calculation. The entire numerical method is put to a full-scale test on a distributed computing platform to calculate beam spread functions and compare the results to experimental data.
The dissertation is laid out as follows. In Chapter 1, we introduce the basic concepts of light propagation for computer graphics, multiple scattering, and volume rendering. Chapter 2 offers background on the subject of FPIs and some mathematical techniques used in their numerical integration for this work. Chapter 3 is a survey of radiative transfer and multiple scattering as studied in computer graphics and elsewhere. Chapter 4 is a full description of the current methodology. In Section 4.1, we describe the sensor and emitter geometries used for our experiments. We propose a new algorithm for creating seed paths to use in the numerical integration of the FPI in Section 4.2. Section 4.3 introduces past work on the numerical integration, formalizes it, and improves upon its running time. Section 4.4 presents some analysis of the path weighting. In Chapters 5 and 6, we run experiments using the numerical methods. The first characterizes the calculation of the path integral itself using arbitrary spatial parameters, and shows repeatability and unbiased calculation given enough samples. In the second, we calculate beam spread functions, a basic property of scattering media, and compare the calculations to experimentally acquired data. Chapter 7 presents a summary of contributions, a summary of conclusions, and future directions for the research.
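The multiple-scattering bookkeeping that makes participating media expensive can be illustrated with a standard Monte Carlo random walk (not the FPI method of the dissertation): free-path lengths are sampled from the exponential transmittance exp(-σ_t s), and each interaction either absorbs the photon or scatters it. The slab geometry, isotropic phase function, and coefficients below are arbitrary illustrative choices.

```python
import math
import random

# Illustrative Monte Carlo random walk in a homogeneous slab.
# sigma_t: extinction coefficient; albedo = sigma_s / sigma_t.
# All parameter values are arbitrary, chosen only to show the bookkeeping.

def trace_photon(rng, sigma_t=1.0, albedo=0.9, slab_depth=2.0):
    """Return ('transmit'|'reflect'|'absorb', number of scattering events)."""
    z, uz, bounces = 0.0, 1.0, 0          # depth, direction cosine, order
    while True:
        # Free path sampled from the exponential transmittance law
        s = -math.log(1.0 - rng.random()) / sigma_t
        z += uz * s
        if z >= slab_depth:
            return "transmit", bounces
        if z < 0.0:
            return "reflect", bounces
        if rng.random() > albedo:         # interaction: absorbed?
            return "absorb", bounces
        uz = 2.0 * rng.random() - 1.0     # isotropic scatter (new cosine)
        bounces += 1

rng = random.Random(1)
events = [trace_photon(rng) for _ in range(20000)]
multi = sum(1 for _, n in events if n >= 2) / len(events)
```

Even in this toy slab, a substantial fraction of photons scatter two or more times, which is why single-scattering approximations break down and more general techniques are needed.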
Hybrid Inflation: Multi-field Dynamics and Cosmological Constraints
The dynamics of hybrid models is usually approximated by the evolution of a scalar field slowly rolling along a nearly flat valley. Inflation ends with a waterfall phase, due to a tachyonic instability; this final phase is usually assumed to be nearly instantaneous. In this thesis, we go beyond these approximations and analyze the exact two-field dynamics of hybrid models. Several effects are put in evidence: 1) possible slow-roll violations along the valley prevent inflation at small field values; for super-Planckian fields, the scalar spectrum of the original model is red, in agreement with observations. 2) The initial field values are not fine-tuned along the valley but also occupy a considerable part of the field space exterior to it, forming a structure with fractal boundaries. Using Bayesian methods, their distribution over the whole parameter space is studied, and natural bounds on the potential parameters are derived. 3) For the original model, inflation is found to continue for more than 60 e-folds along waterfall trajectories in some part of the parameter space. The scalar power spectrum of adiabatic perturbations is modified and is generically red, possibly in agreement with CMB observations. Topological defects are conveniently stretched outside the observable Universe. 4) The analysis of the initial conditions is extended to the case of a closed Universe, in which the initial singularity is replaced by a classical bounce. In the third part of the thesis, we study how the present CMB constraints on the cosmological parameters could be improved by observation of the 21-cm cosmic background with future giant radio telescopes. Forecasts are determined for a characteristic Fast Fourier Transform Telescope, using both Fisher-matrix and MCMC methods. Comment: 218 pages, PhD thesis, June 201
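The Fisher-matrix forecasting mentioned in the last abstract works by summing parameter-derivative outer products weighted by the inverse noise, then reading marginalized 1-sigma errors off the diagonal of the inverse Fisher matrix. A minimal sketch on a toy two-parameter model mu(x) = a + b·x follows; the sample points and noise level are arbitrary illustrative values, not a real 21-cm survey configuration.

```python
# Toy Fisher-matrix forecast for mu(x) = a + b * x with Gaussian noise sigma.
# F_ij = sum_k (d mu/d theta_i)(d mu/d theta_j) / sigma^2, with
# derivatives (1, x_k); marginalized errors are sqrt of diag(F^-1).
# Data points and sigma below are illustrative placeholders.

def fisher_forecast(xs, sigma):
    """Return 1-sigma marginalized errors on (a, b)."""
    faa = sum(1.0 for _ in xs) / sigma**2
    fab = sum(x for x in xs) / sigma**2
    fbb = sum(x * x for x in xs) / sigma**2
    det = faa * fbb - fab * fab        # invert the 2x2 Fisher matrix
    cov_aa, cov_bb = fbb / det, faa / det
    return cov_aa**0.5, cov_bb**0.5

err_a, err_b = fisher_forecast([0.0, 1.0, 2.0, 3.0], sigma=0.1)
```

Unlike MCMC, which maps the full posterior, the Fisher forecast assumes a Gaussian likelihood around the fiducial model, which is why theses such as this one typically report both.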