Testing matter effects in propagation of atmospheric and long-baseline neutrinos
We quantify our current knowledge of the size and flavor structure of the
matter effects in the evolution of atmospheric and long-baseline neutrinos
based solely on the analysis of the corresponding neutrino data. To this aim we
generalize the matter potential of the Standard Model by rescaling its
strength, rotating it away from the e-e sector, and rephasing it with respect
to the vacuum term. This phenomenological parametrization can be easily
translated in terms of non-standard neutrino interactions in matter. We show
that in the most general case, the strength of the potential cannot be
determined solely by atmospheric and long-baseline data. However, its flavor composition is strongly constrained, and the present determination of the neutrino masses and mixing is robust in its presence. We also present an
update of the constraints arising from this analysis in the particular case in
which no potential is present in the e-mu and e-tau sectors. Finally, we quantify to what degree this scenario can alleviate the tension between the oscillation results for neutrinos and antineutrinos in the MINOS experiment, and show the relevance of the high-energy part of the spectrum measured at MINOS.
Comment: PDFLaTeX file using JHEP3 class, 25 pages, 7 figures included. Accepted for publication in JHEP.
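As a rough illustration of the parametrization described above (a minimal sketch in the standard NSI notation, not the paper's own equations), the generalized matter term can be written in the flavor basis as

H = \frac{1}{2E}\, U\, \mathrm{diag}\!\left(0,\, \Delta m^2_{21},\, \Delta m^2_{31}\right) U^\dagger
  + \sqrt{2}\, G_F N_e
  \begin{pmatrix}
    1+\varepsilon_{ee} & \varepsilon_{e\mu} & \varepsilon_{e\tau} \\
    \varepsilon_{e\mu}^{*} & \varepsilon_{\mu\mu} & \varepsilon_{\mu\tau} \\
    \varepsilon_{e\tau}^{*} & \varepsilon_{\mu\tau}^{*} & \varepsilon_{\tau\tau}
  \end{pmatrix},

where the Standard Model case corresponds to \varepsilon_{\alpha\beta}=0; rescaling the overall strength, rotating the potential away from the e-e entry and rephasing it relative to the vacuum term then map directly onto this \varepsilon_{\alpha\beta} matrix.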
A missense TGFB2 variant p.(Arg320Cys) causes a paradoxical and striking increase in aortic TGFB1/2 expression.
Loeys-Dietz syndrome (LDS) is an autosomal dominant connective tissue disorder with a range of cardiovascular, skeletal, craniofacial and cutaneous manifestations. LDS type 4 is caused by mutations in TGFβ ligand 2 (TGFB2) and, based on the family pedigrees described to date, appears to have a milder clinical phenotype, often presenting with isolated aortic disease. We sought to investigate its molecular basis in a new pedigree. We identified a missense variant p.(Arg320Cys) (NM_003238.3) in a highly evolutionarily conserved region of TGFB2 in a new LDS type 4 pedigree with multiple cases of aortic aneurysms and dissections. There was striking upregulation of TGFB1 and TGFB2 expression on immunofluorescent staining and western blotting of aortic tissue from the index case, confirming the functional importance of the variant. This case highlights the striking paradox of predicted loss-of-function mutations in TGFB2 causing enhanced TGFβ signaling in this emerging familial aortopathy.
Raya Al Maskari has a PhD studentship funded by the Omani government. This is the author accepted manuscript. The final version is available from Nature Publishing Group via https://doi.org/10.1038/ejhg.2016.14
Non-standard interactions versus non-unitary lepton flavor mixing at a neutrino factory
The impact of heavy mediators on neutrino oscillations is typically described
by non-standard four-fermion interactions (NSIs) or non-unitarity (NU). We
focus on leptonic dimension-six effective operators which do not produce
charged lepton flavor violation. These operators lead to particular
correlations among neutrino production, propagation, and detection non-standard
effects. We point out that these NSIs and NU phenomenologically lead, in fact,
to very similar effects for a neutrino factory, for completely different
fundamental reasons. We discuss how the parameters and probabilities are
related in this case, and compare the sensitivities. We demonstrate that the
NSIs and NU can, in principle, be distinguished for large enough effects, using the example of non-standard effects in the mu-tau sector, which basically corresponds to differentiating between scalars and fermions as heavy mediators at leading order. However, we find that a near detector at superbeams
could provide very synergistic information, since the correlation between
source and matter NSIs is broken for hadronic neutrino production, while NU is
a fundamental effect present at any experiment.
Comment: 32 pages, 5 figures. Final version published in JHEP. v3: Typo in Eq. (27) corrected.
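For orientation, a standard way to parametrize the non-unitarity discussed above (a sketch under a common convention, not a formula quoted from the paper) is to replace the unitary mixing matrix U by N = (\mathbb{1} - \eta) U, with \eta a small Hermitian matrix, so that, up to normalization,

A(\nu_\alpha \to \nu_\beta) \;\propto\; \left[\, N\, e^{-i H L}\, N^\dagger \,\right]_{\beta\alpha},

where H is the effective Hamiltonian in matter. The same \eta then modifies production, matter propagation and detection in a correlated way, which is why NU mimics a particular combination of source, matter and detector NSIs; the characteristic difference is the zero-distance effect, (N N^\dagger)_{\beta\alpha} \neq \delta_{\beta\alpha}, present for NU at any baseline.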
Large-Theta(13) Perturbation Theory of Neutrino Oscillation for Long-Baseline Experiments
The Cervera et al. formula, the best-known approximate formula for the neutrino oscillation probability in long-baseline experiments, can be regarded as a second-order perturbative formula with small expansion parameter \epsilon \equiv \Delta m^2_{21} / \Delta m^2_{31} \simeq 0.03 under the assumption s_{13} \simeq \epsilon. If \theta_{13} is large, as suggested by a candidate \nu_{e} event at T2K as well as the recent global analyses, higher-order corrections in s_{13} to the formula would be needed for better accuracy. We compute the corrections systematically by formulating a perturbative framework in which \theta_{13} is counted as s_{13} \sim \sqrt{\epsilon} \simeq 0.18, which guarantees its validity in a wide range of \theta_{13} below the Chooz limit. We show on general grounds that the correction terms must be of order \epsilon^2. Yet, they nicely fill the mismatch between the approximate and the exact formulas at low energies and relatively long baselines. General theorems are derived which serve for a better understanding of the \delta-dependence of the oscillation probability. Some interesting implications of the large-\theta_{13} hypothesis are discussed.
Comment: Fig. 2 added, 23 pages. Matches the published version.
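To make the power counting concrete (a sketch of the bookkeeping implied by the abstract, not a quotation of the paper's formulas): with

\epsilon \equiv \frac{\Delta m^2_{21}}{\Delta m^2_{31}} \simeq 0.03, \qquad s_{13} \sim \sqrt{\epsilon} \simeq 0.18,

terms such as s_{13}^2\,\epsilon and s_{13}^4, which would be of order \epsilon^3 and \epsilon^4 under the Cervera et al. counting s_{13} \simeq \epsilon, are promoted to order \epsilon^2 and must therefore be kept; this is the origin of the \epsilon^2 corrections mentioned above.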
Effective Rheology of Bubbles Moving in a Capillary Tube
We calculate the average volumetric flux versus pressure drop of bubbles moving in a single capillary tube with varying diameter, finding a square-root relation by mapping the flow equations onto those of a driven overdamped pendulum. The calculation is based on a derivation of the equation of motion of a bubble train, taking into account the capillary forces and the entropy production associated with the viscous flow. We also calculate the configurational probability of the positions of the bubbles.
Comment: 4 pages, 1 figure.
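As background, the square-root relation follows from a standard property of the driven overdamped pendulum (a generic sketch of the mapping suggested by the abstract, with the identification of parameters only indicative): for

\dot\theta = a - b\sin\theta, \qquad |a| > |b|, \qquad \langle \dot\theta \rangle = \sqrt{a^2 - b^2},

so once the pressure drop plays the role of the drive a and the capillary forces that of the pinning amplitude b, the average flux just above the depinning threshold grows as the square root of the excess drive.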
Sterile neutrino portal to Dark Matter I: the U(1)_{B-L} case
In this paper we explore the possibility that the sterile neutrino and Dark Matter sectors in the Universe have a common origin. We study the consequences of this assumption in the simple case of coupling the dark sector to the Standard Model via a global U(1)_{B-L}, broken spontaneously by a dark scalar. This dark scalar provides masses to the dark fermions and communicates with the Higgs via a Higgs portal coupling. We find an interesting interplay between Dark Matter annihilation to dark scalars (the CP-even one, which mixes with the Higgs, and the CP-odd one, which becomes a Goldstone boson, the Majoron) and to heavy neutrinos, as well as collider probes via the coupling to the Higgs. Moreover, Dark Matter annihilation into sterile neutrinos and their subsequent decay to gauge bosons and quarks, charged leptons or neutrinos leads to indirect detection signatures close to current bounds on the gamma-ray flux from the galactic center and dwarf galaxies.
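A schematic way to picture the setup described above (our own shorthand based on the abstract, not the paper's Lagrangian): a B-L-charged dark scalar \phi couples to the dark fermion \chi and to the sterile neutrinos N, and to the Standard Model only through the Higgs portal,

\mathcal{L} \;\supset\; -\, y_\chi\, \phi\, \bar\chi \chi \;-\; \tfrac{1}{2}\, y_N\, \phi\, \overline{N^c} N \;+\; \mathrm{h.c.} \;-\; \lambda_{H\phi}\, (\phi^{*} \phi)(H^\dagger H),

so that when \phi acquires a vacuum expectation value it generates the dark fermion and sterile neutrino masses, while \lambda_{H\phi} controls the Higgs-portal phenomenology at colliders.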
Sterile neutrino portal to Dark Matter II: exact dark symmetry
We analyze a simple extension of the Standard Model (SM) with a dark sector composed of a scalar and a fermion, both singlets under the SM gauge group but charged under a dark sector symmetry group. Sterile neutrinos, which are singlets under both groups, mediate the interactions between the dark sector and the SM particles, and generate masses for the active neutrinos via the seesaw mechanism. We explore the parameter space region where the observed Dark Matter relic abundance is determined by the annihilation into sterile neutrinos, both for fermion and scalar Dark Matter particles. The scalar Dark Matter case provides an interesting alternative to the usual Higgs portal scenario. We also study the constraints from direct Dark Matter searches and the prospects for indirect detection via sterile neutrino decays to leptons, which may be able to rule out Dark Matter masses below and around 100 GeV.
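For context, the parameter-space scan described above can be read against the textbook freeze-out estimate (a generic benchmark, not a number taken from the paper):

\Omega_{\rm DM} h^2 \;\approx\; 0.12 \left( \frac{3\times 10^{-26}\ \mathrm{cm^3\,s^{-1}}}{\langle \sigma v \rangle} \right),

so reproducing the observed abundance through annihilation into sterile neutrino pairs requires the thermally averaged cross section at freeze-out, for either the fermion or the scalar Dark Matter candidate, to be roughly of this order.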
Perspectives on utilization of edible coatings and nano-laminate coatings for extension of postharvest storage of fruits and vegetables
It is known that in developing countries a large share of fruit and vegetable losses occurs at the postharvest and processing stages due to poor or scarce storage technology and mishandling during harvest. The use of new and innovative technologies for reducing postharvest losses is a need that has not yet been fully met. The use of edible coatings (mainly based on biopolymers) as a postharvest technique for agricultural commodities offers biodegradable alternatives for solving problems (e.g., microbial growth) during produce storage. However, biopolymer-based coatings can present disadvantages such as poor mechanical properties (e.g., lipids) or poor water vapor barrier properties (e.g., polysaccharides), thus requiring the development of new alternatives to overcome these drawbacks. Recently, nanotechnology has emerged as a promising tool in the food processing industry, providing new insights into postharvest technologies for produce storage. Nanotechnological approaches can contribute through the design of functional packaging materials with lower amounts of bioactive ingredients, better gas barrier and mechanical properties, and reduced impact on the sensorial qualities of fruits and vegetables. This work reviews some of the main factors involved in postharvest losses and new technologies for extending the postharvest storage of fruits and vegetables, focusing on prospective uses of edible coatings and nano-laminate coatings.
María L. Flores-López thanks the Mexican Science and Technology Council (CONACYT, Mexico) for PhD fellowship support (CONACYT Grant Number: 215499/310847). Miguel A. Cerqueira (SFRH/BPD/72753/2010) is the recipient of a fellowship from the Fundação para a Ciência e Tecnologia (FCT, POPH-QREN and FSE Portugal). The authors also thank the FCT Strategic Project of the UID/BIO/04469/2013 unit, the project RECI/BBB-EBI/0179/2012 (FCOMP-01-0124-FEDER-027462) and the project "BioInd - Biotechnology and Bioengineering for improved Industrial and AgroFood processes," REF. NORTE-07-0124-FEDER-000028, co-funded by the Programa Operacional Regional do Norte (ON.2 - O Novo Norte), QREN, FEDER. Fundação Cearense de Apoio ao Desenvolvimento Científico e Tecnológico - FUNCAP, CE, Brazil (CI10080-00055.01.00/13).
Approaches in biotechnological applications of natural polymers
Natural polymers, such as gums and mucilages, are biocompatible, cheap, easily available and non-toxic materials of natural origin. These polymers are increasingly preferred over synthetic materials for industrial applications due to their intrinsic properties, and they are also considered alternative sources of raw materials since they offer sustainability, biodegradability and biosafety. By definition, gums and mucilages are polysaccharides or complex carbohydrates consisting of one or more monosaccharides or their derivatives linked in a bewildering variety of linkages and structures. Natural gums are polysaccharides naturally occurring in a variety of plant seeds and exudates, tree or shrub exudates, seaweed extracts, fungi, bacteria, and animal sources. Water-soluble gums, also known as hydrocolloids, are exudates and pathological products and therefore do not form part of the cell wall; mucilages, on the other hand, are part of the cell and are physiological products. It is important to highlight that gums represent the largest amount of polymer material derived from plants. Gums have broad applications in both food and non-food industries, being commonly used as thickening, binding, emulsifying, suspending and stabilizing agents, and as matrices for drug release in the pharmaceutical and cosmetic industries. In the food industry, their gelling properties and their ability to form edible films and coatings are extensively studied. The use of gums depends on the intrinsic properties they provide, often at costs below those of synthetic polymers. To upgrade their value, gums are being processed into various forms, including the most recent nanomaterials, for various biotechnological applications. Thus, the main natural polymers, including galactomannans, cellulose, chitin, agar, carrageenan, alginate, cashew gum, pectin and starch, together with current research on them, are reviewed in this article.
To the Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq) for fellowships (LCBBC and MGCC) and the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) (PBSA). This study was supported by the Portuguese Foundation for Science and Technology (FCT) under the scope of the strategic funding of the UID/BIO/04469/2013 unit, the Project RECI/BBB-EBI/0179/2012 (FCOMP-01-0124-FEDER-027462) and COMPETE 2020 (POCI-01-0145-FEDER-006684) (JAT).
Alignment of the ALICE Inner Tracking System with cosmic-ray tracks
ALICE (A Large Ion Collider Experiment) is the LHC (Large Hadron Collider) experiment devoted to investigating the strongly interacting matter created in nucleus-nucleus collisions at LHC energies. The ALICE ITS (Inner Tracking System) consists of six cylindrical layers of silicon detectors based on three different technologies; in the outward direction: two layers of pixel detectors, two layers of drift detectors, and two layers of strip detectors. The number of parameters to be determined in the spatial alignment of the 2198 sensor modules of the ITS is about 13,000. The target alignment precision is well below 10 micron in some cases (pixels). The sources of alignment information include survey measurements and the reconstructed tracks from cosmic rays and from proton-proton collisions. The main track-based alignment method uses the Millepede global approach. An iterative local method was developed and used as well. We present the results obtained for the ITS alignment using about 10^5 charged tracks from cosmic rays collected during summer 2008, with the ALICE solenoidal magnet switched off.
Comment: 37 pages, 15 figures, revised version, accepted by JINST. Peer reviewed.
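For orientation, the track-based alignment mentioned above can be summarized by the generic global least-squares problem solved by Millepede-type algorithms (a standard sketch, not a formula quoted from this paper): one minimizes

\chi^2(\mathbf{a}, \{\mathbf{p}_t\}) \;=\; \sum_{\mathrm{tracks}\ t}\ \sum_{\mathrm{hits}\ i \in t} \frac{r_{ti}^{\,2}(\mathbf{a}, \mathbf{p}_t)}{\sigma_{ti}^{2}},

simultaneously over the global alignment parameters \mathbf{a} (about 13,000 for the ITS) and the local track parameters \mathbf{p}_t of every track, exploiting the block structure of the normal equations; the iterative local method instead aligns one module at a time using the residuals r_{ti} and iterates until convergence.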
