Neutrino masses and mixings in a Minimal S_3-invariant Extension of the Standard Model
The mass matrices of the charged leptons and neutrinos that were derived in the framework of a Minimal S_3-invariant Extension of the Standard Model are here reparametrized in terms of their eigenvalues. The neutrino mixing matrix, V_PMNS, is then computed, and exact, explicit analytical expressions for the neutrino mixing angles as functions of the masses of the neutrinos and charged leptons are obtained. The reactor, theta_13, and the atmospheric, theta_23, mixing angles are found to be functions only of the masses of the charged leptons. The numerical values of theta_13^{th} and theta_23^{th} computed from our theoretical expressions are in excellent agreement with the latest experimental determinations. The solar mixing angle, theta_12^{th}, is found to be a function of both the charged lepton and neutrino masses, as well as of a Majorana phase phi_nu. A comparison of our theoretical expression for the solar angle theta_12^{th} with the latest experimental value theta_12^{exp} ~ 34 deg allowed us to fix the scale and origin of the neutrino mass spectrum and to obtain the mass values |m_nu1| = 0.0507 eV, |m_nu2| = 0.0499 eV, and |m_nu3| = 0.0193 eV, in very good agreement with the observations of neutrino oscillations, the bounds extracted from neutrinoless double beta decay, and the precision cosmological measurements of the CMB.

Comment: To appear in the Proceedings of the XXIX Symposium on Nuclear Physics, Cocoyoc, Mexico, January 2006. Some typographical errors in formulae corrected.
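The quoted mass values can be checked for internal consistency by computing the mass-squared differences they imply. The snippet below is a quick sketch using only the three masses stated in the abstract; the "ballpark" comparison figures in the comments are standard oscillation-data magnitudes, not values taken from this paper.

```python
# Mass-squared differences implied by the abstract's quoted neutrino masses:
# |m_nu1| = 0.0507 eV, |m_nu2| = 0.0499 eV, |m_nu3| = 0.0193 eV.
m1, m2, m3 = 0.0507, 0.0499, 0.0193  # eV

dm2_21 = m2**2 - m1**2  # "solar" splitting
dm2_31 = m3**2 - m1**2  # "atmospheric" splitting

print(f"dm^2_21 = {dm2_21:+.3e} eV^2")  # magnitude ~8e-5 eV^2 (solar scale)
print(f"dm^2_31 = {dm2_31:+.3e} eV^2")  # magnitude ~2.2e-3 eV^2 (atmospheric scale)
```

Note that m3 is the smallest of the three values, so the quoted spectrum corresponds to an inverted mass ordering.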
A Double Error Dynamic Asymptote Model of Associative Learning
In this paper, a formal model of associative learning is presented that incorporates representational and computational mechanisms which, as a coherent corpus, empower it to make accurate predictions of a wide variety of phenomena that have so far eluded a unified account in learning theory. In particular, the Double Error Dynamic Asymptote (DDA) model introduces: 1) a fully connected network architecture in which stimuli are represented as temporally clustered elements that associate to each other, so that elements of one cluster engender activity on other clusters, which naturally implements neutral-stimulus associations and mediated learning; 2) a predictor error term within the traditional error-correction rule (the double error), which reduces the rate of learning for expected predictors; 3) a revaluation associability rate that operates on the assumption that outcome predictiveness is tracked over time, so that prolonged uncertainty is learned, reducing the level of attention to initially surprising outcomes; and, critically, 4) a biologically plausible variable asymptote, which encapsulates the principle of Hebbian learning, leading to stronger associations for similar levels of cluster activity. The outputs of a set of simulations of the DDA model are presented along with empirical results from the literature. Finally, the predictive scope of the model is discussed.
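The "double error" idea of mechanism 2) can be illustrated with a toy update rule: the classic summed outcome error is additionally scaled by a predictor-side error, so cues that are themselves well predicted update more slowly. The function below is an illustrative sketch under that reading; the variable names and exact functional form are assumptions, not the published DDA equations.

```python
def double_error_update(V, alpha, lam, predictor_expectation):
    """One trial of a delta-rule update with an extra predictor error term.

    V: current associative strengths of the cues present on this trial
    alpha: learning-rate parameter
    lam: outcome asymptote on this trial (e.g. 1.0 if the outcome occurs, 0.0 if not)
    predictor_expectation: for each cue, how strongly it was itself predicted
        by the other active cues (0.0 = surprising, 1.0 = fully expected)
    """
    # Classic summed outcome error (Rescorla-Wagner style).
    outcome_error = lam - sum(V)
    new_V = []
    for v, expected in zip(V, predictor_expectation):
        # Second error term: an expected predictor contributes a smaller
        # error of its own, so its learning rate is effectively reduced.
        predictor_error = 1.0 - expected
        new_V.append(v + alpha * predictor_error * outcome_error)
    return new_V

# Two novel cues paired with an outcome; cue B is itself 80% predicted by A,
# so it acquires strength more slowly than the surprising cue A.
print(double_error_update([0.0, 0.0], 0.3, 1.0, [0.0, 0.8]))
```

On this sketch, the first (surprising) cue gains the full alpha-scaled increment while the second (expected) cue gains only a fifth of it, which is the qualitative behavior the abstract attributes to the double error term.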