Hyperspectral Unmixing Overview: Geometrical, Statistical, and Sparse Regression-Based Approaches
Imaging spectrometers measure electromagnetic energy scattered in their
instantaneous field of view in hundreds or thousands of spectral channels with
higher spectral resolution than multispectral cameras. Imaging spectrometers
are therefore often referred to as hyperspectral cameras (HSCs). Higher
spectral resolution enables material identification via spectroscopic analysis,
which facilitates countless applications that require identifying materials in
scenarios unsuitable for classical spectroscopic analysis. Due to the low spatial
resolution of HSCs, microscopic material mixing, and multiple scattering, the
spectra measured by HSCs are mixtures of the spectra of the materials in a scene.
Accurate estimation therefore requires unmixing. Pixels are assumed to be mixtures of a
few materials, called endmembers. Unmixing involves estimating all or some of:
the number of endmembers, their spectral signatures, and their abundances at
each pixel. Unmixing is a challenging, ill-posed inverse problem because of
model inaccuracies, observation noise, environmental conditions, endmember
variability, and data set size. Researchers have devised and investigated many
models searching for robust, stable, tractable, and accurate unmixing
algorithms. This paper presents an overview of unmixing methods from the time
of Keshava and Mustard's unmixing tutorial [1] to the present. Mixing models
are first discussed. Signal-subspace, geometrical, statistical, sparsity-based,
and spatial-contextual unmixing algorithms are described. Mathematical problems
and potential solutions are described. Algorithm characteristics are
illustrated experimentally.
Comment: This work has been accepted for publication in the IEEE Journal of
Selected Topics in Applied Earth Observations and Remote Sensing.
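The linear mixing model described in this abstract (each pixel spectrum as a nonnegative, sum-to-one combination of endmember spectra) can be sketched in a few lines. This is a minimal illustration, not any specific algorithm from the paper: the endmember matrix and noise level are synthetic, and the sum-to-one constraint is folded in as a heavily weighted extra row, a common trick for fully constrained least squares.

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic endmember matrix E: each column is one material's spectrum
# (bands x endmembers); a mixed pixel is y = E @ a + noise.
rng = np.random.default_rng(0)
E = np.abs(rng.normal(size=(50, 3)))       # 50 spectral bands, 3 endmembers
a_true = np.array([0.6, 0.3, 0.1])         # abundances sum to one
y = E @ a_true + 0.001 * rng.normal(size=50)

# Fold the sum-to-one constraint into the least-squares system by
# appending a heavily weighted row of ones; NNLS enforces a >= 0.
delta = 100.0
E_aug = np.vstack([E, delta * np.ones((1, 3))])
y_aug = np.append(y, delta)
a_est, _ = nnls(E_aug, y_aug)
```

With low noise, `a_est` recovers the true abundances closely; real unmixing must additionally estimate the endmembers themselves, which is where the geometrical, statistical, and sparse-regression approaches surveyed here come in.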
Sensor Signal and Information Processing II [Editorial]
This Special Issue compiles a set of innovative developments on the use of sensor signals and information processing. In particular, these contributions report original studies on a wide variety of sensor signals, including wireless communication, machinery, ultrasound, imaging, and internet data, and on information processing methodologies such as deep learning, machine learning, compressive sensing, and variational Bayesian methods. All these contributions have one point in common: the algorithms incorporate some form of computational intelligence as part of their core problem-solving framework. They have the capacity to generalize and discover knowledge for themselves, learning new information whenever unseen data are captured.
Denoising scanner effects from multimodal MRI data using linked independent component analysis
Pooling magnetic resonance imaging (MRI) data across research studies, or utilizing shared data from imaging repositories, presents exceptional opportunities to advance and enhance reproducibility of neuroscience research. However, scanner confounds hinder pooling data collected on different scanners or across software and hardware upgrades on the same scanner, even when all acquisition protocols are harmonized. These confounds reduce power and can lead to spurious findings. Unfortunately, methods to address this problem are scant. In this study, we propose a novel denoising approach that uses data-driven linked independent component analysis (LICA) to identify scanner-related effects for removal from multimodal MRI data. We tested the proposed method on multi-study data collected on a single 3T scanner, pre- and post-software and major hardware upgrades and using different acquisition parameters. The proposed denoising method shows a greater reduction of scanner-related variance than standard GLM confound regression or ICA-based single-modality denoising. Although we did not test it here, for combining data across different scanners, LICA should prove even better at identifying scanner effects, as between-scanner variability is generally much larger than within-scanner variability. Our method holds great promise for denoising scanner effects in multi-study and large-scale multi-site studies that may be confounded by scanner differences.
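The GLM confound-regression baseline that this abstract compares against can be sketched compactly: fit nuisance regressors (e.g. estimated scanner-effect time courses) to each voxel or feature by ordinary least squares and subtract the fitted confound part. This is a generic illustration of that baseline, not the LICA method itself; the data shapes and regressors are hypothetical.

```python
import numpy as np

def regress_out(data, confounds):
    """GLM confound regression: remove the variance explained by
    confound regressors from each column of `data`.

    data      : (time x features) array, e.g. voxel time series
    confounds : (time x regressors) array, e.g. scanner-effect estimates
    """
    # Fit confounds plus an intercept per feature, then subtract only
    # the confound-explained part so the mean signal is preserved.
    X = np.column_stack([confounds, np.ones(confounds.shape[0])])
    beta, *_ = np.linalg.lstsq(X, data, rcond=None)
    return data - X[:, :-1] @ beta[:-1]

# Hypothetical example: two features contaminated by one confound.
t = np.arange(100)
conf = np.sin(t / 10.0)[:, None]
data = conf @ np.array([[2.0, -1.0]]) + 5.0
cleaned = regress_out(data, conf)
```

After cleaning, the confound-driven variance is gone while the mean level (here 5.0) remains, which is the behavior LICA-based denoising is being compared against in the abstract.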
Task-Related, Low-Frequency Task-Residual, and Resting State Activity in the Default Mode Network Brain Regions
The hypothesis of a default mode network (DMN) of brain function is based on observations of task-independent decreases of brain activity as participants engage in tasks, in contrast to resting. On the other hand, studies have also shown that DMN regions activate rather than deactivate in response to task-related events. Thus, does the DMN “deactivate” during effort as compared to resting? We hypothesized that, with high-frequency event-related signals removed, the task-residual activity of the DMN would decrease as compared to resting. We addressed this hypothesis with two approaches. First, we examined DMN activity during resting, task-residual, and task conditions in the stop signal task using independent component analysis (ICA). Second, we compared the fractional amplitude of low-frequency fluctuation (fALFF) signals of the DMN in resting, task-residual, and task data. In the ICA results from 76 subjects, the precuneus and posterior cingulate cortex (PCC) showed increased activation during task as compared to resting and task residuals, indicating DMN responses to task events. The precuneus but not the PCC showed decreased activity during task residuals as compared to resting. The latter finding was mirrored by fALFF, which decreased in the precuneus during task residuals as compared to resting and task. These results suggest that the low-frequency blood oxygen level-dependent signals of the precuneus may represent a useful index of effort during cognitive performance.
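The fALFF measure used here has a simple definition: the summed spectral amplitude in a low-frequency band (conventionally 0.01–0.08 Hz) divided by the summed amplitude over the whole frequency range. A minimal sketch, with the band edges and a synthetic time series as illustrative assumptions rather than anything from this study:

```python
import numpy as np

def falff(ts, tr, band=(0.01, 0.08)):
    """Fractional amplitude of low-frequency fluctuation: ratio of
    FFT amplitude inside `band` (Hz) to amplitude over the full
    spectrum. `tr` is the sampling interval (repetition time) in s."""
    freqs = np.fft.rfftfreq(len(ts), d=tr)
    amp = np.abs(np.fft.rfft(ts - ts.mean()))   # demean to drop DC
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return amp[in_band].sum() / amp[1:].sum()   # skip the DC bin

# Hypothetical signals sampled at TR = 2 s for 300 volumes:
t = np.arange(300) * 2.0
slow = np.sin(2 * np.pi * 0.03 * t)   # inside the low-frequency band
fast = np.sin(2 * np.pi * 0.20 * t)   # well above the band
```

A slow oscillation yields a fALFF near 1 and a fast one near 0, which is why the measure indexes how much of a region's signal power is low-frequency.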
Modern optical astronomy: technology and impact of interferometry
The present `state of the art' and the path to future progress in high
spatial resolution imaging interferometry are reviewed. The review begins with a
treatment of the fundamentals of stellar optical interferometry; the origin,
properties, and optical effects of turbulence in the Earth's atmosphere; the
passive methods applied on a single telescope to overcome atmospheric
image degradation, such as speckle interferometry; and various other techniques.
These topics include differential speckle interferometry, speckle spectroscopy
and polarimetry, phase diversity, wavefront shearing interferometry,
phase-closure methods, dark speckle imaging, as well as the limitations imposed
by the detectors on the performance of speckle imaging. A brief account is
given of adaptive optics (AO), a technological innovation that compensates
for such atmospheric effects on the image in real time. A major advance
is the transition from single-aperture to dilute-aperture interferometry
using multiple telescopes; the review therefore covers
recent developments involving ground-based and space-based optical arrays.
Emphasis is placed on the problems specific to delay-lines, beam recombination,
polarization, dispersion, fringe-tracking, bootstrapping, coherencing and
cophasing, and recovery of the visibility functions. The role of AO in
enhancing visibilities is also discussed. The applications of interferometry,
such as imaging, astrometry, and nulling are described. The mathematical
intricacies of the various `post-detection' image-processing techniques are
examined critically. The review concludes with a discussion of the
astrophysical importance and the perspectives of interferometry.
Comment: 65 pages, LaTeX file including 23 figures. Reviews of Modern Physics,
2002, to appear in the April issue.
FUNCTIONAL NETWORK CONNECTIVITY IN HUMAN BRAIN AND ITS APPLICATIONS IN AUTOMATIC DIAGNOSIS OF BRAIN DISORDERS
The human brain is one of the most complex systems known to mankind. Over the past 3500 years, mankind has constantly investigated this remarkable system in order to understand its structure and function. The emergence of neuroimaging techniques such as functional magnetic resonance imaging (fMRI) has opened a non-invasive, in-vivo window into brain function. Moreover, fMRI has made it possible to study brain disorders such as schizophrenia from an angle unknown to researchers before. Human brain function can be divided into two categories: functional segregation and integration. It is well understood that each region in the brain is specialized in certain cognitive or motor tasks. The information processed in these specialized regions at different temporal and spatial scales must be integrated in order to form a unified cognition or behavior. One way to assess functional integration is by measuring functional connectivity (FC) among specialized regions in the brain. Recently, there has been growing interest in studying the FC among brain functional networks. This type of connectivity, which can be considered a higher level of FC, is termed functional network connectivity (FNC) and measures the statistical dependencies among brain functional networks. Each functional network may consist of multiple remote brain regions. Four studies related to FNC are presented in this work. First, FNC is compared during the resting state and the auditory oddball task (AOD); most previous FNC studies have focused on either resting-state or task-based data but have not directly compared the two. Second, we propose an automatic diagnosis framework based on resting-state FNC features for mental disorders such as schizophrenia. Third, we investigate the proper preprocessing of fMRI time series for FNC studies; specifically, the impact of autocorrelated time series on FNC is comprehensively assessed in theory, simulation, and real fMRI data.
Finally, the notion of autoconnectivity is proposed as a new perspective on human brain function. It is shown that autoconnectivity is cognitive-state and mental-state dependent, and we discuss how this source of information, previously believed to originate from physical and physiological noise, can be used to discriminate schizophrenia patients with high accuracy.
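In its simplest form, the FNC described above is the matrix of pairwise Pearson correlations among network time courses (e.g. ICA component time courses). A minimal sketch with synthetic time courses standing in for real network data; the autocorrelation corrections this work investigates are not shown:

```python
import numpy as np

def fnc_matrix(timecourses):
    """Functional network connectivity: pairwise Pearson correlations
    among network time courses. `timecourses` is (time x networks),
    e.g. one column per ICA component; returns (networks x networks)."""
    return np.corrcoef(timecourses, rowvar=False)

# Hypothetical example: networks 0 and 1 strongly coupled, 2 independent.
rng = np.random.default_rng(1)
a = rng.normal(size=500)
b = a + 0.1 * rng.normal(size=500)
c = rng.normal(size=500)
fnc = fnc_matrix(np.column_stack([a, b, c]))
```

The off-diagonal entries of `fnc` are the connectivity features that, in the second study above, feed the automatic-diagnosis classifier; autocorrelated time series bias exactly these correlation estimates, motivating the preprocessing study.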
Computer-Aided, Multi-Modal, and Compression Diffuse Optical Studies of Breast Tissue
Diffuse optical tomography and spectroscopy permit non-invasive measurement of important physiological parameters through ~10 cm of tissue. I have applied these techniques in measurements of the human breast and breast cancer. My thesis integrates three loosely connected themes in this context: multi-modal breast cancer imaging, automated data analysis of breast cancer images, and microvascular hemodynamics of the breast under compression. For the first theme, I describe the construction, testing, and initial clinical use of two generations of imaging systems for simultaneous diffuse optical and magnetic resonance imaging. The second project develops a statistical analysis of optical breast data from many spatial locations in a population of cancers to derive a novel optical signature of malignancy; I then apply this data-derived signature to localize cancer in additional subjects. Finally, I construct and deploy diffuse optical instrumentation to measure blood content and blood flow during breast compression; beyond optics, this research has implications for any method employing breast compression, e.g., mammography.