Efficient Computation of PDF-Based Characteristics from Diffusion MR Signal
We present a general method for the computation of PDF-based characteristics of the tissue micro-architecture in MR imaging. The approach relies on the approximation of the MR signal by a series expansion based on spherical harmonics and Laguerre-Gaussian functions, followed by a simple projection step that is performed efficiently in a finite-dimensional space. The resulting algorithm is generic and flexible, and is able to compute a large set of useful characteristics of the local tissue structure. We illustrate the effectiveness of this approach by showing results on synthetic and real MR datasets acquired in a clinical time-frame.
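The expand-then-project pattern described above can be sketched on a toy problem. The basis below is only the spherical-harmonic part (orders 0 and 2, in Cartesian form) and all numbers are illustrative, not the paper's full Laguerre-Gaussian construction:

```python
import numpy as np

# Hedged sketch: approximate a signal sampled on the sphere by a truncated
# real spherical-harmonic series, then read a scalar characteristic off the
# coefficients by a linear projection -- here the spherical mean, which
# depends only on the order-0 coefficient.
rng = np.random.default_rng(0)
n_dirs = 64
g = rng.normal(size=(n_dirs, 3))
g /= np.linalg.norm(g, axis=1, keepdims=True)  # unit sampling directions
x, y, z = g.T

# Real spherical harmonics of orders l = 0 and l = 2 (Cartesian form).
Y = np.column_stack([
    0.5 * np.sqrt(1 / np.pi) * np.ones(n_dirs),    # Y_0,0
    0.5 * np.sqrt(15 / np.pi) * x * y,             # Y_2,-2
    0.5 * np.sqrt(15 / np.pi) * y * z,             # Y_2,-1
    0.25 * np.sqrt(5 / np.pi) * (3 * z**2 - 1),    # Y_2,0
    0.5 * np.sqrt(15 / np.pi) * x * z,             # Y_2,1
    0.25 * np.sqrt(15 / np.pi) * (x**2 - y**2),    # Y_2,2
])

# Synthetic "signal": isotropic part plus an order-2 perturbation.
signal = 1.0 * Y[:, 0] + 0.3 * Y[:, 3]

# Series expansion by least squares, then the projection step: the
# spherical mean is a linear functional of the coefficient vector.
coeffs, *_ = np.linalg.lstsq(Y, signal, rcond=None)
spherical_mean = coeffs[0] * 0.5 * np.sqrt(1 / np.pi)
print(coeffs.round(3))
```

Because the projection is a fixed linear map on a small coefficient vector, any characteristic expressible this way is cheap to evaluate once the expansion is fitted.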
Probing white-matter microstructure with higher-order diffusion tensors and susceptibility tensor MRI.
Diffusion MRI has become an invaluable tool for studying white matter microstructure and brain connectivity. The emergence of quantitative susceptibility mapping and susceptibility tensor imaging (STI) has provided another unique tool for assessing the structure of white matter. In the highly ordered white matter structure, diffusion MRI measures hindered water mobility induced by various tissue and cell membranes, while susceptibility is sensitive to the molecular composition and axonal arrangement. Integrating these two methods may produce new insights into the complex physiology of white matter. In this study, we investigated the relationship between diffusion and magnetic susceptibility in white matter. Experiments were conducted on phantoms and on human brains in vivo. Diffusion properties were quantified with the diffusion tensor model and also with a higher-order tensor model based on the cumulant expansion. The frequency shift and susceptibility tensor were measured with quantitative susceptibility mapping and susceptibility tensor imaging. These diffusion and susceptibility quantities were compared and correlated in regions of single fiber bundles and in regions of multiple fiber orientations. Relationships were established, with similarities and differences identified. It is believed that diffusion MRI and susceptibility MRI provide complementary information about the microstructure of white matter. Together, they allow a more complete assessment of healthy and diseased brains.
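The diffusion tensor model mentioned above can be illustrated with a minimal synthetic fit. The data, b-value, and tensor below are invented for the sketch; this is the standard log-linear DTI fit, not the paper's pipeline:

```python
import numpy as np

# Sketch: fit a diffusion tensor D from signals S = S0 * exp(-b g^T D g)
# by log-linearized least squares, then compute fractional anisotropy.
rng = np.random.default_rng(1)
D_true = np.diag([1.7e-3, 0.3e-3, 0.3e-3])  # prolate tensor, mm^2/s
b = 1000.0                                   # s/mm^2
S0 = 1.0

g = rng.normal(size=(30, 3))
g /= np.linalg.norm(g, axis=1, keepdims=True)   # unit gradient directions

# Noiseless synthetic signals for each gradient direction.
S = S0 * np.exp(-b * np.einsum('ij,jk,ik->i', g, D_true, g))

# Design matrix for the 6 unique tensor elements (Dxx,Dyy,Dzz,Dxy,Dxz,Dyz).
X = np.column_stack([g[:, 0]**2, g[:, 1]**2, g[:, 2]**2,
                     2 * g[:, 0] * g[:, 1],
                     2 * g[:, 0] * g[:, 2],
                     2 * g[:, 1] * g[:, 2]])
d, *_ = np.linalg.lstsq(-b * X, np.log(S / S0), rcond=None)
D_fit = np.array([[d[0], d[3], d[4]],
                  [d[3], d[1], d[5]],
                  [d[4], d[5], d[2]]])

# Fractional anisotropy from the eigenvalues.
ev = np.linalg.eigvalsh(D_fit)
fa = np.sqrt(1.5 * np.sum((ev - ev.mean())**2) / np.sum(ev**2))
print(round(float(fa), 3))
```

Higher-order (cumulant-expansion) models extend the same regression with additional kurtosis terms in the design matrix.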
Graph Spectral Image Processing
The recent advent of graph signal processing (GSP) has spurred intensive studies
of signals that live naturally on irregular data kernels described by graphs
(e.g., social networks, wireless sensor networks). Though a digital image
contains pixels that reside on a regularly sampled 2D grid, if one can design
an appropriate underlying graph connecting pixels with weights that reflect the
image structure, then one can interpret the image (or image patch) as a signal
on a graph, and apply GSP tools for processing and analysis of the signal in
graph spectral domain. In this article, we overview recent graph spectral
techniques in GSP specifically for image/video processing. The topics covered
include image compression, image restoration, image filtering and image
segmentation.
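The patch-as-graph-signal idea above can be sketched end to end on a toy patch. The 4-connectivity, the Gaussian weight on intensity differences, and the spectral cutoff are illustrative choices, not taken from the article:

```python
import numpy as np

# Sketch: build an intensity-weighted grid graph over a 4x4 "patch", form
# the graph Laplacian, and low-pass filter the patch in the graph spectral
# (Laplacian eigenvector) domain.
n = 4
patch = np.arange(n * n, dtype=float).reshape(n, n)  # toy intensities

def node(i, j):
    return i * n + j

W = np.zeros((n * n, n * n))
sigma = 5.0
for i in range(n):
    for j in range(n):
        for di, dj in ((0, 1), (1, 0)):          # 4-connectivity
            ii, jj = i + di, j + dj
            if ii < n and jj < n:
                # Edge weight reflects intensity similarity.
                w = np.exp(-(patch[i, j] - patch[ii, jj])**2 / (2 * sigma**2))
                W[node(i, j), node(ii, jj)] = w
                W[node(ii, jj), node(i, j)] = w

L = np.diag(W.sum(axis=1)) - W          # combinatorial graph Laplacian
evals, U = np.linalg.eigh(L)

x = patch.ravel()
x_hat = U.T @ x                          # graph Fourier transform
h = (evals < 1.0).astype(float)          # ideal low-pass spectral filter
x_smooth = U @ (h * x_hat)               # filtered patch
print(x_smooth.reshape(n, n).round(2))
```

The filtered signal necessarily has a smaller Laplacian quadratic form (graph smoothness) than the input, which is the spectral-domain view of denoising.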
Data augmentation in Rician noise model and Bayesian Diffusion Tensor Imaging
Mapping white matter tracts is an essential step towards understanding brain
function. Diffusion Magnetic Resonance Imaging (dMRI) is the only noninvasive
technique which can detect in vivo anisotropies in the 3-dimensional diffusion
of water molecules, which correspond to nervous fibers in the living brain. In
this process, spectral data from the displacement distribution of water
molecules is collected by a magnetic resonance scanner. From the statistical
point of view, inverting the Fourier transform from such sparse and noisy
spectral measurements leads to a non-linear regression problem. Diffusion
tensor imaging (DTI) is the simplest modeling approach postulating a Gaussian
displacement distribution at each volume element (voxel). Typically the
inference is based on a linearized log-normal regression model that can fit the
spectral data at low frequencies. However such approximation fails to fit the
high frequency measurements which contain information about the details of the
displacement distribution but have a low signal to noise ratio. In this paper,
we directly work with the Rice noise model and cover the full range of
b-values. Using data augmentation to represent the likelihood, we reduce the
non-linear regression problem to the framework of generalized linear models.
Then we construct a Bayesian hierarchical model in order to perform estimation
and regularization of the tensor field simultaneously. Finally, the Bayesian
paradigm is implemented using Markov chain Monte Carlo.
Comment: 37 pages, 3 figures
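The Rician bias that motivates working with the Rice likelihood, rather than the log-Gaussian approximation, is easy to demonstrate by simulation. The signal levels and noise scale below are assumed for illustration:

```python
import numpy as np

# Magnitude MR data follow a Rice distribution:
#   M = sqrt((A + n1)^2 + n2^2),  n1, n2 ~ N(0, sigma^2).
# At high SNR, M is approximately Gaussian around A; at low SNR (high
# b-values), the magnitude operation biases M upward, which is what breaks
# the linearized log-normal regression.
rng = np.random.default_rng(42)
sigma = 1.0
n = 200_000
bias = {}
for A in (10.0, 0.5):                      # high vs low underlying signal
    n1 = rng.normal(0, sigma, n)
    n2 = rng.normal(0, sigma, n)
    M = np.sqrt((A + n1)**2 + n2**2)       # Rician magnitude samples
    bias[A] = M.mean() - A
    print(f"A={A:4.1f}  mean(M)={M.mean():.3f}  bias={bias[A]:+.3f}")
```

The low-SNR bias is large relative to the signal itself (for A = 0, the Rice law degenerates to a Rayleigh distribution with mean sigma * sqrt(pi/2), not zero), so a correct likelihood matters precisely for the high-frequency measurements discussed above.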
The LONI QC System: A Semi-Automated, Web-Based and Freely-Available Environment for the Comprehensive Quality Control of Neuroimaging Data.
Quantifying, controlling, and monitoring image quality is an essential prerequisite for ensuring the validity and reproducibility of many types of neuroimaging data analyses. Implementation of quality control (QC) procedures is key to ensuring that neuroimaging data are of high quality and remain valid in subsequent analyses. We introduce the QC system of the Laboratory of Neuro Imaging (LONI): a web-based system featuring a workflow for the assessment of brain imaging data of various modalities and contrasts. The design allows users to anonymously upload imaging data to the LONI-QC system. The system then computes an exhaustive set of QC metrics, which aid users in performing a standardized QC by generating a range of scalar and vector statistics. These procedures are performed in parallel on a large compute cluster. Finally, the system offers an automated QC procedure for structural MRI, which can flag each QC metric as 'good' or 'bad.' Validation using various sets of data acquired from a single scanner and from multiple sites demonstrated the reproducibility of our QC metrics, and the sensitivity and specificity of the proposed Auto QC to 'bad'-quality images in comparison to visual inspection. To the best of our knowledge, LONI-QC is the first online QC system that computes numerous QC metrics and performs both visual and automated image QC of multi-contrast and multi-modal brain imaging data. The LONI-QC system has been used to assess the quality of large neuroimaging datasets acquired as part of various multi-site studies, such as the Transforming Research and Clinical Knowledge in Traumatic Brain Injury (TRACK-TBI) Study and the Alzheimer's Disease Neuroimaging Initiative (ADNI).
LONI-QC's functionality is freely available to users worldwide, and its adoption by imaging researchers is likely to contribute substantially to upholding high standards of brain image data quality and to implementing these standards across the neuroimaging community.
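The flavor of a scalar QC metric with a good/bad flag can be sketched as follows. The metric definition, mask, and threshold are assumptions for illustration only, not LONI-QC's actual computations:

```python
import numpy as np

# Illustrative QC sketch: estimate a signal-to-noise metric for a 3D
# volume from a crude foreground mask and an assumed background ("air")
# corner, then flag the volume against a threshold.
def snr_qc(volume, snr_threshold=20.0):
    background = volume[:5, :5, :5]                 # assumed air region
    foreground = volume[volume > volume.mean()]     # crude tissue mask
    snr = foreground.mean() / (background.std() + 1e-12)
    return snr, ('good' if snr >= snr_threshold else 'bad')

rng = np.random.default_rng(7)
vol = rng.normal(1000.0, 10.0, size=(64, 64, 32))       # synthetic tissue
vol[:5, :5, :5] = rng.normal(0.0, 5.0, size=(5, 5, 5))  # synthetic air
snr, flag = snr_qc(vol)
print(f"SNR={snr:.1f} -> {flag}")
```

A real QC pipeline would compute many such metrics per volume and calibrate thresholds per modality, which is what makes an automated flag comparable to visual inspection.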
AI-Generated Incentive Mechanism and Full-Duplex Semantic Communications for Information Sharing
The next generation of Internet services, such as the Metaverse, relies on mixed
reality (MR) technology to provide immersive user experiences. However, the
limited computation power of MR headset-mounted devices (HMDs) hinders the
deployment of such services. Therefore, we propose an efficient information
sharing scheme based on full-duplex device-to-device (D2D) semantic
communications to address this issue. Our approach enables users to avoid heavy
and repetitive computational tasks, such as artificial intelligence-generated
content (AIGC) in the view images of all MR users. Specifically, a user can
transmit the generated content and semantic information extracted from their
view image to nearby users, who can then use this information to obtain the
spatial matching of computation results under their view images. We analyze the
performance of full-duplex D2D communications, including the achievable rate
and bit error probability, by using generalized small-scale fading models. To
facilitate semantic information sharing among users, we design a contract
theoretic AI-generated incentive mechanism. The proposed diffusion model
generates the optimal contract design, outperforming two deep reinforcement
learning algorithms, i.e., proximal policy optimization and soft actor-critic
algorithms. Numerical experiments demonstrate the effectiveness of the
proposed methods. The code for this paper is available at
https://github.com/HongyangDu/SemSharing
Comment: Accepted by IEEE JSA
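The achievable-rate analysis mentioned above can be illustrated with a Monte Carlo estimate. Rayleigh fading and the 10 dB SNR below stand in for the paper's generalized small-scale fading models:

```python
import numpy as np

# Sketch: ergodic achievable rate E[log2(1 + SNR * |h|^2)] of a D2D link
# under Rayleigh fading, estimated by Monte Carlo.
rng = np.random.default_rng(3)
snr_db = 10.0
snr = 10 ** (snr_db / 10)

n = 500_000
h = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)  # CN(0, 1)
rate = np.log2(1 + snr * np.abs(h)**2).mean()    # bits/s/Hz

print(f"ergodic rate at {snr_db} dB: {rate:.2f} bit/s/Hz")
```

Swapping in a Nakagami-m or other generalized fading model only changes how `h` is drawn; the rate and bit-error-probability estimators keep the same form.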
Laboratory Simulation of Turbulent-Like Flows
Most turbulence studies to date are based on statistical modeling; however, the spatio-temporal flow structure of turbulence is still largely unexplored. Turbulence has been established to have a multi-scale instantaneous streamline structure which influences the energy spectrum and other properties such as dissipation and mixing.
In an attempt to further understand the fundamental nature of turbulence and its consequences for efficient mixing, a new class of flows, so-called "turbulent-like" flows, is introduced and their spatio-temporal structure characterised. These flows are generated in the laboratory using a shallow layer of brine and controlled by multi-scale electromagnetic forces resulting from a combination of an electric current and a magnetic field created by a fractal permanent-magnet distribution. These flows are laminar, yet turbulent-like, in that they have a multi-scale streamline topology in the shape of "cat's eyes" within "cat's eyes" (or 8's within 8's), similar to the known schematic streamline structure of two-dimensional turbulence. Unsteadiness is introduced to the flows by means of a time-dependent electrical current.
Particle Tracking Velocimetry (PTV) measurements are performed. The technique developed provides highly resolved Eulerian velocity fields in space and time. The analysis focuses on the impact of the forcing frequency, mean intensity and amplitude on various Eulerian and Lagrangian properties of the flows, e.g. the energy spectrum and fluid-element dispersion statistics. Other statistics, such as the integral length and time scales, are also extracted to characterise the unsteady multi-scale flows.
The research outcome provides an analysis of laboratory-generated unsteady multi-scale flows, which serve as a tool for the controlled study of complex flow properties related to turbulence and mixing, with potential applications as efficient mixers as well as in geophysical, environmental and industrial fields.
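The energy-spectrum computation central to such analyses can be sketched on a synthetic velocity field. The single-mode flow and grid size below are invented so that the expected spectral peak is known in advance:

```python
import numpy as np

# Sketch: shell-averaged kinetic energy spectrum E(k) of a 2D velocity
# field via FFT, the kind of Eulerian statistic extracted from PTV fields.
n = 128
Lbox = 2 * np.pi
x = np.linspace(0, Lbox, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing='ij')

# Single-mode test flow: all kinetic energy sits at wavenumber |k| = 4.
u = np.sin(4 * Y)
v = np.sin(4 * X)

uh = np.fft.fft2(u) / n**2
vh = np.fft.fft2(v) / n**2
e2d = 0.5 * (np.abs(uh)**2 + np.abs(vh)**2)   # spectral energy density

kx = np.fft.fftfreq(n) * n                    # integer wavenumbers
KX, KY = np.meshgrid(kx, kx, indexing='ij')
kshell = np.rint(np.sqrt(KX**2 + KY**2)).astype(int)

E = np.bincount(kshell.ravel(), weights=e2d.ravel())  # shell-summed E(k)
print(int(np.argmax(E)))   # wavenumber shell carrying the energy
```

By Parseval's theorem the shell sums add up to the mean kinetic energy of the field, which gives a built-in sanity check for the multi-scale forced flows described above.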
An adaptive grid refinement strategy for the simulation of negative streamers
The evolution of negative streamers during electric breakdown of a
non-attaching gas can be described by a two-fluid model for electrons and
positive ions. It consists of continuity equations for the charged particles
including drift, diffusion and reaction in the local electric field, coupled to
the Poisson equation for the electric potential. The model generates field
enhancement and steep propagating ionization fronts at the tip of growing
ionized filaments. An adaptive grid refinement method for the simulation of
these structures is presented. It uses finite volume spatial discretizations
and explicit time stepping, which allows the decoupling of the grids for the
continuity equations from those for the Poisson equation. Standard refinement
methods in which the refinement criterion is based on local error monitors fail
due to the pulled character of the streamer front that propagates into a
linearly unstable state. We present a refinement method which deals with all
these features. Tests on one-dimensional streamer fronts as well as on
three-dimensional streamers with cylindrical symmetry (hence effectively 2D for
numerical purposes) are carried out successfully. Results on fine grids are
presented; they show that such an adaptive grid method is needed to capture the
streamer characteristics well. This refinement strategy enables us to
adequately compute negative streamers in pure gases in the parameter regime
where a physical instability appears: branching streamers.
Comment: 46 pages, 19 figures, to appear in J. Comp. Phys.
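The basic mechanics of flagging cells for refinement around a steep front can be sketched in one dimension. The jump-based criterion and tanh profile below are a generic illustration, not the paper's actual refinement monitor (which is designed specifically for pulled fronts):

```python
import numpy as np

# Sketch: flag finite-volume cells for refinement where the jump of the
# solution across a cell face exceeds a tolerance, on a steep
# ionization-front-like profile.
def refine_flags(u, tol=0.1):
    """Flag cell i when |u[i+1] - u[i]| > tol * max|u| on either face."""
    jump = np.abs(np.diff(u))
    thresh = tol * np.abs(u).max()
    flags = np.zeros(u.size, dtype=bool)
    flags[:-1] |= jump > thresh
    flags[1:] |= jump > thresh
    return flags

x = np.linspace(0, 10, 200)
front = 0.5 * (1 - np.tanh(5 * (x - 4)))   # steep front near x = 4
flags = refine_flags(front, tol=0.05)

# Refinement clusters in a narrow band around the front position.
print(float(x[flags].min()), float(x[flags].max()))
```

For a pulled front such a solution-based monitor lags the leading edge, since the dynamics are driven by the exponentially small region ahead of the front; that is the failure mode the paper's refinement strategy is built to avoid.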