Nuclear Disarmament Verification via Resonant Phenomena
Nuclear disarmament treaties are not sufficient in and of themselves to
neutralize the existential threat of nuclear weapons. Technologies are
necessary for verifying the authenticity of the nuclear warheads undergoing
dismantlement before counting them toward a treaty partner's obligations.
This work presents a novel concept that leverages isotope-specific nuclear
resonance phenomena to authenticate a warhead's fissile components by
comparing them to a previously authenticated template. All information is
encrypted in the physical domain, in a manner that amounts to a physical
zero-knowledge proof system. Using Monte Carlo simulations, the system is
shown to reveal no isotopic or geometric information about the weapon while
readily detecting hoaxing attempts. This nuclear technique can dramatically
increase the reach and trustworthiness of future nuclear disarmament
treaties.
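The template-comparison step lends itself to a statistical illustration. The toy sketch below (all transmission curves, bin counts, and photon numbers are invented placeholders, not values from this work) simulates Poisson detector counts for a template, a genuine item, and a hoax, then compares count spectra with a Pearson chi-square statistic. The physical-domain encryption that makes the real protocol zero-knowledge is not modeled here.

```python
import numpy as np

rng = np.random.default_rng(0)

def detector_counts(transmission, n_photons, rng):
    """Poisson-distributed counts per energy bin for a given transmission curve."""
    return rng.poisson(transmission * n_photons)

# Hypothetical transmission curves over 16 energy bins (placeholders).
template_T = np.linspace(0.2, 0.6, 16)  # authenticated template
genuine_T  = template_T                 # genuine warhead: same physics
hoax_T     = template_T * 0.9           # hoax: slightly different attenuation

n = 100_000
c_template = detector_counts(template_T, n, rng)

def chi2_statistic(counts_a, counts_b):
    """Pearson chi-square comparing two Poisson count spectra bin by bin."""
    expected = (counts_a + counts_b) / 2
    return np.sum((counts_a - expected) ** 2 / expected
                  + (counts_b - expected) ** 2 / expected)

chi2_genuine = chi2_statistic(c_template, detector_counts(genuine_T, n, rng))
chi2_hoax    = chi2_statistic(c_template, detector_counts(hoax_T, n, rng))

# A genuine item yields chi2 on the order of the number of bins;
# even this mild hoax yields a value orders of magnitude larger.
print(chi2_genuine, chi2_hoax)
```

The statistic for the genuine item fluctuates around the number of bins, so a simple threshold separates genuine items from hoaxes even at modest count levels.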
Hyperspectral Unmixing Overview: Geometrical, Statistical, and Sparse Regression-Based Approaches
Imaging spectrometers measure electromagnetic energy scattered in their
instantaneous field of view in hundreds or thousands of spectral channels with
higher spectral resolution than multispectral cameras. Imaging spectrometers
are therefore often referred to as hyperspectral cameras (HSCs). Higher
spectral resolution enables material identification via spectroscopic analysis,
which facilitates countless applications that require identifying materials in
scenarios unsuitable for classical spectroscopic analysis. Due to the low
spatial resolution of HSCs, microscopic material mixing, and multiple
scattering, the spectra measured by HSCs are mixtures of the spectra of the
materials in a scene. Accurate estimation of those materials therefore
requires unmixing. Pixels are assumed to be mixtures of a
few materials, called endmembers. Unmixing involves estimating all or some of:
the number of endmembers, their spectral signatures, and their abundances at
each pixel. Unmixing is a challenging, ill-posed inverse problem because of
model inaccuracies, observation noise, environmental conditions, endmember
variability, and data set size. Researchers have devised and investigated many
models searching for robust, stable, tractable, and accurate unmixing
algorithms. This paper presents an overview of unmixing methods from the time
of Keshava and Mustard's unmixing tutorial [1] to the present. Mixing models
are first discussed. Signal-subspace, geometrical, statistical, sparsity-based,
and spatial-contextual unmixing algorithms are described. Mathematical problems
and potential solutions are described. Algorithm characteristics are
illustrated experimentally.

Comment: This work has been accepted for publication in the IEEE Journal of
Selected Topics in Applied Earth Observations and Remote Sensing.
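Under the linear mixing model discussed in such surveys, a pixel's spectrum is x = Ea + n, where the columns of E hold the endmember signatures and the abundances a are nonnegative and sum to one. A minimal sketch of abundance estimation under these constraints, using synthetic placeholder signatures (not real spectra) and the standard sum-to-one row augmentation combined with nonnegative least squares, rather than any specific algorithm from the survey:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)

# Synthetic endmember signatures (bands x endmembers); placeholders, not real spectra.
n_bands, n_end = 50, 3
E = np.abs(rng.normal(size=(n_bands, n_end)))

# True abundances: nonnegative and summing to one (the usual physical constraints).
a_true = np.array([0.6, 0.3, 0.1])
x = E @ a_true + 0.001 * rng.normal(size=n_bands)  # mixed pixel + noise

# Fully constrained least squares via the standard augmentation trick:
# append a heavily weighted row of ones to softly enforce sum(a) == 1,
# while nnls enforces a >= 0.
delta = 100.0
E_aug = np.vstack([E, delta * np.ones((1, n_end))])
x_aug = np.append(x, delta)
a_hat, _ = nnls(E_aug, x_aug)

print(a_hat)  # close to [0.6, 0.3, 0.1]
```

In practice E is itself unknown, which is where the geometrical, statistical, and sparse regression-based endmember extraction methods surveyed here come in.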
Fast Iterative Reconstruction for Multi-spectral CT by a Schmidt Orthogonal Modification Algorithm (SOMA)
Multi-spectral CT (MSCT) is increasingly used in industrial non-destructive
testing and medical diagnosis because of its outstanding performance, such as
material distinguishability. The process of obtaining MSCT data can be
modeled as a system of nonlinear equations, and basis material decomposition
comes down to the inverse problem of those nonlinear equations. For data
acquired under different spectra, geometrically inconsistent parameters cause
geometrically inconsistent rays, which lead to mismatched nonlinear
equations. How to solve the mismatched nonlinear equations accurately and
quickly is an open problem. This paper proposes a general iterative method to
invert the mismatched nonlinear equations and develops a Schmidt
orthogonalization scheme to accelerate convergence. The validity of the
proposed method is verified by MSCT basis material decomposition experiments.
The results show that the proposed method can decompose the basis material
images accurately and greatly improve the convergence speed.
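For intuition, the nonlinear equations of basis material decomposition take, for each spectrum s, the polychromatic Beer-Lambert form p_s = -ln(Σ_k w_{s,k} exp(-(μ1_{s,k} t1 + μ2_{s,k} t2))). The sketch below solves a matched toy instance of this system with plain Newton iteration; all spectral weights and attenuation coefficients are invented placeholders, and the paper's Schmidt-orthogonalization acceleration for the mismatched case is not reproduced here.

```python
import numpy as np

# Toy two-spectrum, two-material polychromatic model (all numbers are placeholders).
# For spectrum s: p_s = -ln(sum_k w[s,k] * exp(-(mu1[s,k]*t1 + mu2[s,k]*t2))).
w   = np.array([[0.5, 0.5], [0.4, 0.6]])  # normalized spectral weights
mu1 = np.array([[0.8, 0.5], [0.6, 0.3]])  # material-1 attenuation per energy bin
mu2 = np.array([[0.3, 0.2], [0.9, 0.7]])  # material-2 attenuation per energy bin

def forward(t):
    """Polychromatic projections for material thicknesses t = (t1, t2)."""
    t1, t2 = t
    atten = np.exp(-(mu1 * t1 + mu2 * t2))
    return -np.log(np.sum(w * atten, axis=1))

def jacobian(t):
    """Analytic Jacobian d p_s / d t_m of the forward model."""
    t1, t2 = t
    atten = w * np.exp(-(mu1 * t1 + mu2 * t2))
    denom = np.sum(atten, axis=1)
    d1 = np.sum(mu1 * atten, axis=1) / denom
    d2 = np.sum(mu2 * atten, axis=1) / denom
    return np.column_stack([d1, d2])

t_true = np.array([2.0, 1.0])
p = forward(t_true)  # simulated noiseless measurements

# Newton iteration on the nonlinear system F(t) = forward(t) - p = 0.
t = np.zeros(2)
for _ in range(20):
    r = forward(t) - p
    t = t - np.linalg.solve(jacobian(t), r)

print(t)  # converges to the true thicknesses [2.0, 1.0]
```

In the mismatched setting addressed by the paper, the equations for different spectra refer to inconsistent ray geometries, so a plain Newton scheme like this no longer applies directly.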
Accurate molecular imaging of small animals taking into account animal models, handling, anaesthesia, quality control and imaging system performance
Small-animal imaging has become an important technique for the development of new radiotracers, drugs and therapies. Many laboratories now have a combination of different small-animal imaging systems, which are used by biologists, pharmacists, medical doctors and physicists. The aim of this paper is to give an overview of the important factors in the design of a small-animal nuclear medicine and imaging experiment. Different experts each summarize one specific aspect important to the good design of a small-animal experiment.
New approach to calculating the fundamental matrix
Estimating the fundamental matrix (F) determines the epipolar geometry and establishes a geometric relation between two images of the same scene or consecutive video frames. The literature offers many techniques for robust estimation, such as RANSAC (random sample consensus), least median of squares (LMedS), and M-estimators. This article compares different feature detectors (Harris, FAST, SIFT, and SURF) in terms of the number of detected points, the number of correct matches, and the speed of computing F. Our method first extracts descriptors with SURF, chosen over the alternatives for its robustness; it then applies a uniqueness threshold to retain the best points, normalizes them, and ranks them according to a weighting function over the different image regions; finally, it estimates F with an eight-point M-estimator, and we measure the average error and the computation speed for F. Experimental simulations on real images under various viewpoint changes (for example rotation, lighting, and moving objects) show good agreement in terms of the computation speed of the fundamental matrix and an acceptable average error. The simulation results suggest that this technique is suitable for real-time applications.
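The eight-point estimation step can be sketched with the classic normalized (Hartley) eight-point algorithm: normalize the points, solve the linear epipolar constraints by SVD, enforce rank 2, and undo the normalization. The M-estimator weighting and region ranking described above are omitted, and the scene, cameras, and correspondences below are synthetic placeholders, not data from this article.

```python
import numpy as np

rng = np.random.default_rng(2)

def normalize(pts):
    """Hartley normalization: center on centroid, scale mean distance to sqrt(2)."""
    c = pts.mean(axis=0)
    s = np.sqrt(2) / np.mean(np.linalg.norm(pts - c, axis=1))
    T = np.array([[s, 0, -s * c[0]], [0, s, -s * c[1]], [0, 0, 1]])
    pts_h = np.column_stack([pts, np.ones(len(pts))])
    return (T @ pts_h.T).T, T

def eight_point(x1, x2):
    """Normalized eight-point estimate of F such that x2_h^T F x1_h ≈ 0."""
    n1, T1 = normalize(x1)
    n2, T2 = normalize(x2)
    # Each correspondence gives one linear constraint on the 9 entries of F.
    A = np.column_stack([n2[:, 0:1] * n1, n2[:, 1:2] * n1, n1])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    U, S, Vt = np.linalg.svd(F)        # enforce the rank-2 constraint
    F = U @ np.diag([S[0], S[1], 0]) @ Vt
    F = T2.T @ F @ T1                  # undo the normalization
    return F / np.linalg.norm(F)

# Synthetic scene: random 3D points seen by two cameras (placeholders).
X = rng.uniform(-1, 1, size=(30, 3)) + np.array([0, 0, 5])
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[1.0], [0.2], [0.0]])])

def project(P, X):
    Xh = np.column_stack([X, np.ones(len(X))])
    x = (P @ Xh.T).T
    return x[:, :2] / x[:, 2:3]

x1, x2 = project(P1, X), project(P2, X)
F = eight_point(x1, x2)

# Epipolar residuals x2_h^T F x1_h should be near zero for all correspondences.
x1h = np.column_stack([x1, np.ones(len(x1))])
x2h = np.column_stack([x2, np.ones(len(x2))])
residuals = np.abs(np.sum(x2h * (x1h @ F.T), axis=1))
print(residuals.max())
```

With noisy real correspondences, a robust wrapper such as RANSAC or an M-estimator, as discussed above, would reweight or reject outliers around this linear core.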