Sensitivity of Neutrino Mass Experiments to the Cosmic Neutrino Background
The KATRIN neutrino experiment is a next-generation tritium beta decay
experiment aimed at measuring the mass of the electron neutrino to better than
200 meV at 90% C.L. Due to its intense tritium source, KATRIN can also serve as
a possible target for the neutrino capture process ν_e + ³H → ³He⁺ + e⁻. The
latter process, possessing no energy threshold, is sensitive to the Cosmic
Neutrino Background (CνB). In this paper, we explore the potential sensitivity
of the KATRIN experiment to the relic neutrino density. The KATRIN experiment
is sensitive to a CνB over-density ratio of 2.0 × 10^9 over standard
concordance model predictions (at 90% C.L.), addressing the validity of certain
speculative cosmological models.
Solving for Micro- and Macro- Scale Electrostatic Configurations Using the Robin Hood Algorithm
We present a novel technique by which highly-segmented electrostatic
configurations can be solved. The Robin Hood method is a matrix-inversion
algorithm optimized for solving high density boundary element method (BEM)
problems. We illustrate the capabilities of this solver by studying two
distinct geometry scales: (a) the electrostatic potential of a large volume
beta-detector and (b) the field enhancement present at the surface of electrode
nano-structures. Geometries with O(10^5) elements are easily modeled and solved
without loss of accuracy. The technique has recently been expanded to include
dielectrics and magnetic materials.
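The core idea of such a greedy-relaxation solver can be sketched in a few lines. This is a schematic illustration only, not the published algorithm (the actual Robin Hood method additionally redistributes charge between elements; the influence matrix, tolerance, and demo values below are invented): at each step, only the boundary element with the largest potential error is updated, avoiding any explicit matrix inversion.

```python
import numpy as np

def robin_hood_solve(A, v_target, tol=1e-8, max_iter=100000):
    """Schematic greedy relaxation for the BEM system A @ q = v_target,
    where A[i, j] is the potential at element i due to a unit charge on
    element j. Only the worst element is updated per step, so each
    iteration costs O(n) instead of O(n^2)."""
    n = len(v_target)
    q = np.zeros(n)
    v = A @ q                      # running potentials, updated incrementally
    for _ in range(max_iter):
        r = v_target - v           # potential residual at every element
        i = np.argmax(np.abs(r))   # element with the largest error
        if abs(r[i]) < tol:
            break
        dq = r[i] / A[i, i]        # charge correction from the self-term
        q[i] += dq
        v += dq * A[:, i]          # cheap update of all element potentials
    return q

# Tiny demo: 3 "elements" with a diagonally dominant influence matrix.
A = np.array([[2.0, 0.3, 0.1],
              [0.3, 2.0, 0.2],
              [0.1, 0.2, 2.0]])
v_target = np.array([1.0, 1.0, 0.0])
q = robin_hood_solve(A, v_target)
print(np.allclose(A @ q, v_target, atol=1e-6))  # True
```

The appeal for high-density BEM problems is the low per-iteration cost and minimal memory traffic, which is what allows element counts in the hundreds of thousands.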
Violation of the Leggett-Garg Inequality in Neutrino Oscillations
The Leggett-Garg inequality, an analogue of Bell's inequality involving
correlations of measurements on a system at different times, stands as one of
the hallmark tests of quantum mechanics against classical predictions. The
phenomenon of neutrino oscillations should adhere to quantum-mechanical
predictions and provide an observable violation of the Leggett-Garg inequality.
We demonstrate how oscillation phenomena can be used to test for violations of
the classical bound by performing measurements on an ensemble of neutrinos at
distinct energies, as opposed to a single neutrino at distinct times. A study
of the MINOS experiment's data shows a significant violation of the classical
bound over a distance of 735 km, representing the longest distance over which
either the Leggett-Garg inequality or Bell's inequality has been tested.
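The quantum violation of the classical bound can be illustrated with a minimal two-flavor toy model (an assumption for illustration; the MINOS analysis itself is more involved). For a dichotomic flavor observable on an oscillating two-level system, the two-time correlator is C(φ) = cos φ, where φ is the oscillation phase accrued between measurements, and the three-time Leggett-Garg parameter K₃ = C₁₂ + C₂₃ − C₁₃ obeys K₃ ≤ 1 classically:

```python
import numpy as np

def k3(phi):
    """Three-time Leggett-Garg parameter K3 = C12 + C23 - C13 for equal
    phase steps phi, using C(phi) = cos(phi) for a two-level system.
    Classically K3 <= 1."""
    return 2.0 * np.cos(phi) - np.cos(2.0 * phi)

phi = np.linspace(0.0, np.pi, 1000)
k = k3(phi)
print(k.max())  # ~1.5, reached near phi = pi/3: violates K3 <= 1
```

The maximum K₃ = 1.5 at φ = π/3 is the standard quantum-mechanical bound for a two-level system; sampling at distinct energies rather than distinct times maps onto the same phase structure.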
New approach to 3D electrostatic calculations for micro-pattern detectors
We demonstrate practically approximation-free electrostatic calculations of
micromesh detectors that can be extended to any other type of micropattern
detector. Using a newly developed boundary element method, called the Robin
Hood method, we can easily handle objects with a huge number of boundary
elements (hundreds of thousands) without any compromise in numerical accuracy.
In this
paper we show how such calculations can be applied to Micromegas detectors by
comparing electron transparencies and gains for four different types of meshes.
We demonstrate inclusion of dielectric material by calculating the electric
field around different types of dielectric spacers.
Principal components technique analysis for vegetation and land use discrimination
An automatic pre-processing technique called Principal Components (PRINCO) was evaluated for analyzing LANDSAT digitized data on land use and vegetation cover over the Brazilian cerrados. The chosen pilot area, scene 223/67 of MSS/LANDSAT 3, was classified on a GE Image-100 System using a maximum-likelihood algorithm (MAXVER). The same procedure was applied to the PRINCO-treated image. PRINCO consists of a linear transformation performed on the original bands in order to eliminate the information redundancy of the LANDSAT channels. After PRINCO, only two channels were used, thus reducing computer effort. The grey levels of the original channels and the PRINCO channels for the five identified classes (grassland, "cerrado", burned areas, anthropic areas, and gallery forest) were obtained through the MAXVER algorithm, which also provided the average performance for both cases. To evaluate the results, the Jeffreys-Matusita distance (JM-distance) between classes was computed. The classification matrix obtained through MAXVER after PRINCO pre-processing showed approximately the same average performance in class separability.
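The PRINCO step described above is a principal-components transformation of the band space. A minimal sketch of the idea follows (the synthetic data, band count, and variance threshold are invented for illustration; the actual PRINCO processing chain is not specified in the abstract): each pixel is treated as a vector of band values, the bands are decorrelated via the eigenvectors of their covariance matrix, and only the two leading components are kept.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pixels = 10_000

# Synthetic, strongly correlated 4-band data standing in for MSS bands:
# a shared scene signal plus small independent per-band noise.
base = rng.normal(size=n_pixels)
bands = np.stack([base + 0.1 * rng.normal(size=n_pixels)
                  for _ in range(4)], axis=1)

mean = bands.mean(axis=0)
cov = np.cov(bands - mean, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)            # ascending eigenvalues
order = np.argsort(eigval)[::-1]                # sort descending
pcs = (bands - mean) @ eigvec[:, order[:2]]     # keep 2 components

explained = eigval[order[:2]].sum() / eigval.sum()
print(pcs.shape, explained)  # (10000, 2), explained variance close to 1
```

Because LANDSAT channels are highly correlated, almost all of the variance ends up in the first two components, which is why classification on two PRINCO channels can match the four-channel result at lower computational cost.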
Atmospheric correction analysis on LANDSAT data over the Amazon region
The natural resources of the Amazon Region were studied in two ways and the results compared. A LANDSAT scene and its attributes were selected, and a maximum-likelihood classification was made. The scene was then atmospherically corrected, taking into account Amazonic peculiarities revealed by ground truth from the same area, and classified again. Comparison shows that the classification improves with the atmospherically corrected images.
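The abstract does not name the correction scheme, so as a hedged illustration here is simple dark-object subtraction (DOS), a common first-order haze correction for LANDSAT imagery: the darkest observed value in each band is attributed to atmospheric path radiance and subtracted before classification.

```python
import numpy as np

def dark_object_subtraction(bands):
    """First-order haze correction. bands: array of shape
    (n_pixels, n_bands) of digital numbers. The per-band minimum is
    taken as the atmospheric path-radiance estimate and removed."""
    haze = bands.min(axis=0)            # per-band dark-object estimate
    return np.clip(bands - haze, 0, None)

# Invented demo values: 3 pixels, 2 bands.
dn = np.array([[12.0, 40.0],
               [30.0, 55.0],
               [12.0, 38.0]])
corrected = dark_object_subtraction(dn)
print(corrected)  # band minima (12 and 38) are shifted to zero
```

Removing the additive haze offset restores contrast between spectrally similar classes, which is consistent with the improved classification the abstract reports for the corrected images.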