The need for isotopic data on refractory elements in the solar wind
The Sun accounts for the bulk of the material in the solar system. Information on the isotopic composition of elements in the solar wind is therefore essential for an understanding of the contribution made by each nucleogenetic component that has been identified in meteorites. Recent work suggests that isotopic data on the solar wind may also help us to understand the physical process that is concentrating light elements at the solar surface. Refractory and volatile elements would behave alike under the conditions of solar fractionation. Prolonged exposure of foils at 1 AU from the Sun would be a relatively inexpensive way to collect the quantity of solar-wind-implanted refractory elements needed to test this hypothesis.
The Complexity of Rooted Phylogeny Problems
Several computational problems in phylogenetic reconstruction can be
formulated as restrictions of the following general problem: given a formula in
conjunctive normal form where the literals are rooted triples, is there a
rooted binary tree that satisfies the formula? If the formulas do not contain
disjunctions, the problem becomes the famous rooted triple consistency problem,
which can be solved in polynomial time by an algorithm of Aho, Sagiv,
Szymanski, and Ullman. If the clauses in the formulas are restricted to
disjunctions of negated triples, Ng, Steel, and Wormald showed that the problem
remains NP-complete. We systematically study the computational complexity of
the problem for all such restrictions of the clauses in the input formula. For
certain restricted disjunctions of triples we present an algorithm that has
sub-quadratic running time and is asymptotically as fast as the fastest known
algorithm for the rooted triple consistency problem. We also show that any
restriction of the general rooted phylogeny problem that does not fall into our
tractable class is NP-complete, using known results about the complexity of
Boolean constraint satisfaction problems. Finally, we present a pebble game
argument that shows that the rooted triple consistency problem (and also all
generalizations studied in this paper) cannot be solved by Datalog.
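The polynomial-time case mentioned above can be illustrated with a minimal Python sketch of the classic algorithm of Aho, Sagiv, Szymanski, and Ullman for the rooted triple consistency problem (this is the simple recursive version, not the sub-quadratic algorithm discussed in the paper; the triple `(a, b, c)` encodes the constraint ab|c, i.e. the lowest common ancestor of a and b is a proper descendant of that of a and c):

```python
def build(leaves, triples):
    """Return a rooted tree (nested tuples) over `leaves` satisfying
    all rooted triples ab|c given as tuples (a, b, c), or None if the
    set of triples is inconsistent."""
    if len(leaves) == 1:
        return next(iter(leaves))
    # Union-find: merge the two "close" leaves of every triple whose
    # three leaves all lie in the current leaf set.
    parent = {x: x for x in leaves}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for a, b, c in triples:
        if a in parent and b in parent and c in parent:
            parent[find(a)] = find(b)
    components = {}
    for x in leaves:
        components.setdefault(find(x), set()).add(x)
    if len(components) < 2:
        return None  # everything collapses into one block: no tree exists
    # Recurse on each component with the triples it fully contains;
    # the components become the subtrees below the root.
    children = []
    for comp in components.values():
        sub = [(a, b, c) for (a, b, c) in triples
               if a in comp and b in comp and c in comp]
        t = build(comp, sub)
        if t is None:
            return None
        children.append(t)
    return tuple(children)

# ab|c together with bc|d forces a caterpillar shape over {a, b, c, d}:
print(build({'a', 'b', 'c', 'd'}, [('a', 'b', 'c'), ('b', 'c', 'd')]))
```

The NP-hard variants in the paper arise when clauses may contain disjunctions of (possibly negated) triples, which this greedy component-splitting strategy cannot handle.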
Integrating IVHM and Asset Design
Integrated Vehicle Health Management (IVHM) describes a set of capabilities that enable effective and efficient maintenance and operation of the target vehicle. It covers collecting data, conducting analyses, and supporting the decision-making process for sustainment and operation. The design of IVHM systems endeavours to account for all causes of failure in a disciplined, systems-engineering manner. With industry striving to reduce through-life cost, IVHM is a powerful tool that gives forewarning of impending failure and hence control over the outcome. Benefits have been realised from this approach across a number of different sectors, but our ability to realise further benefit from this maturing technology is hindered by the fact that IVHM is still treated as an add-on to the design of the asset, rather than as a sub-system in its own right, fully integrated with the asset design. Elevating and integrating IVHM in this way will enable architectures to be chosen that accommodate health-ready sub-systems from the supply chain, and design trade-offs to be made, to name but two major benefits. This paper examines the barriers to integrating IVHM with the asset design, presents progress in overcoming them, and suggests potential solutions for those that remain. It addresses IVHM system design from a systems-engineering perspective and describes its integration with the asset design within an industrial design process.
Metastable Kinks in the Orbifold
We consider static configurations of bulk scalar fields in extra dimensional
models in which the fifth dimension is an orbifold. There may exist a
finite number of such configurations, with total number depending on the size
of the orbifold interval. We perform a detailed Sturm-Liouville stability
analysis that demonstrates that all but the lowest-lying configurations - those
with no nodes in the interval - are unstable. We also present a powerful
general criterion with which to determine which of these nodeless solutions are
stable. The detailed analysis underlying the results presented in this letter,
and applications to specific models, are presented in a comprehensive companion
paper.
Comment: 4 pages, 4 figures, reference added, typo corrected, submitted to PR
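The role of the Sturm-Liouville analysis can be sketched in generic notation (the potential V and coordinate y below are illustrative assumptions, not the companion paper's conventions): perturbing the static background and linearising yields an eigenvalue problem whose lowest eigenvalue decides stability.

```latex
% Linear perturbation about a static kink background \bar\phi(y):
\phi(t,y) \;=\; \bar\phi(y) + e^{i\omega t}\,\psi(y)
\quad\Longrightarrow\quad
\Big[-\partial_y^2 + V''\big(\bar\phi(y)\big)\Big]\,\psi \;=\; \omega^2\,\psi ,
```

with boundary conditions fixed by the orbifold parities. The configuration is unstable precisely when this Schrödinger-type operator admits a negative eigenvalue $\omega^2 < 0$, since the corresponding mode grows as $e^{|\omega| t}$.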
Heavy-neutrino decays at neutrino telescopes
It has been recently proposed that a sterile neutrino \nu_h of mass
m_h=40--80 MeV, mixing |U_{\mu h}|^2=10^{-3}--10^{-2}, lifetime \tau_h \lesssim
10^{-9} s, and a dominant decay mode (\nu_h \to \nu_\mu \gamma) could be the
origin of the experimental anomalies observed at LSND, KARMEN and MiniBooNE.
Such a particle would be abundant inside air showers, as it can be produced in
kaon decays (K -> \nu_h \mu, K_L -> \nu_h \pi \mu). We use the Z-moment method
to evaluate its atmospheric flux and the frequency of its decays inside
neutrino telescopes. We show that the \nu_h would imply around 10^4 contained
showers per year inside a 0.03 km^3 telescope like ANTARES or the DeepCore in
IceCube. These events would have a characteristic energy and zenith-angle
distribution (E_\nu = 0.1--10 TeV and \theta < 90^o), which results from a
balance between the reach of the heavy neutrino (that disfavors low energies)
and a sizeable production rate and decay probability. The standard background
from contained neutrino events (\nu_e N \to e X and neutral-current
interactions of high inelasticity) is 100 times smaller. Therefore, although it
may be challenging from an experimental point of view, a search at ANTARES and
IceCube could confirm this heavy-neutrino possibility.
Comment: 10 pages. Comments on constraints from muon capture and cosmology
added, minor corrections, references added. Version to appear as a Rapid
Communication in PR
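The "balance between the reach of the heavy neutrino and a sizeable decay probability" can be made concrete with a toy estimate (this is only the decay-probability factor, not the Z-moment flux calculation; all numbers below are illustrative assumptions, not values from the paper):

```python
import math

C = 3.0e8  # speed of light [m/s]

def decay_length(E_GeV, m_GeV, tau_s):
    """Lab-frame decay length in metres, ultra-relativistic limit:
    lambda = gamma * c * tau with gamma ~ E/m."""
    gamma = E_GeV / m_GeV
    return gamma * C * tau_s

def decay_prob_inside(E_GeV, m_GeV, tau_s, d_m, L_m):
    """P(neutrino survives the distance d to the detector, then
    decays within the instrumented length L)."""
    lam = decay_length(E_GeV, m_GeV, tau_s)
    return math.exp(-d_m / lam) * (1.0 - math.exp(-L_m / lam))

# E = 1 TeV, m = 50 MeV, tau = 1e-9 s, 15 km from production to the
# detector, 1 km of instrumented volume (all assumed numbers):
p = decay_prob_inside(1000.0, 0.05, 1e-9, 15e3, 1e3)
print(f"decay length ~ {decay_length(1000.0, 0.05, 1e-9)/1e3:.1f} km, "
      f"P(decay inside) ~ {p:.4f}")
```

At low energies the exponential suppression exp(-d/lambda) kills the signal (short reach), while at very high energies 1 - exp(-L/lambda) ~ L/lambda falls off, which is the trade-off behind the quoted 0.1-10 TeV window.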
Phantom Black Holes in Einstein-Maxwell-Dilaton Theory
We obtain the general static, spherically symmetric solution for the
Einstein-Maxwell-dilaton system in four dimensions with a phantom coupling for
the dilaton and/or the Maxwell field. This leads to new classes of black hole
solutions, with single or multiple horizons. Using the geodesic equations, we
analyse the corresponding Penrose diagrams revealing, in some cases, new causal
structures.
Comment: Latex file, 32 pages, 15 figures in eps format. Typo corrected in Eq. (3.18)
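As a schematic of what "phantom coupling" means here (the normalisation and sign conventions below are an assumption for illustration, not necessarily the paper's), one can write the Einstein-Maxwell-dilaton action with sign factors on the kinetic terms:

```latex
S \;=\; \int d^4x \,\sqrt{-g}\,
\Big[\, R \;-\; 2\,\eta_1\, g^{\mu\nu}\partial_\mu\varphi\,\partial_\nu\varphi
      \;-\; \eta_2\, e^{2\lambda\varphi}\, F_{\mu\nu}F^{\mu\nu} \,\Big],
\qquad \eta_1,\eta_2 \in \{+1,-1\},
```

where $\eta_1 = -1$ makes the dilaton a phantom (wrong-sign kinetic term) and $\eta_2 = -1$ does the same for the Maxwell field, while $\eta_1 = \eta_2 = +1$ recovers the usual dilatonic black holes.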
Independence in constraint logic programs
Studying independence of literals, variables, and substitutions has proven very useful in the context of logic programming (LP). Here we study independence in the broader context of constraint logic programming (CLP). We show that a naive extrapolation of the LP definitions of independence to CLP is unsatisfactory (in fact, wrong) for two reasons. First, because interaction between variables through constraints is more complex than in the case of logic programming. Second, because in order to ensure the efficiency of several optimizations not only must independence of the search space be considered, but also an orthogonal issue - "independence of constraint solving." We clarify these issues by proposing various types of search independence and constraint solver independence, and show how they can be combined to allow different independence-related optimizations, from parallelism to intelligent backtracking. Sufficient conditions for independence which can be evaluated "a priori" at run-time are also proposed. Our results suggest that independence, provided a suitable definition is chosen, is even more useful in CLP than in LP.
Rapidity Gap Events for Squark Pair Production at the LHC
The exchange of electroweak gauginos in the t- or u-channel allows squark
pair production at hadron colliders without color exchange between the squarks.
This can give rise to events where little or no energy is deposited in the
detector between the squark decay products. We discuss the potential for
detection of such rapidity gap events at the Large Hadron Collider (LHC). We
present an analysis with full event simulation using PYTHIA as well as
Herwig++, but without detector simulation. We analyze the transverse energy
deposited between the jets from squark decay, as well as the probability of
finding a third jet in between the two hardest jets. For the mSUGRA benchmark
point SPS1a we find statistically significant evidence for a color singlet
exchange contribution.
Comment: 4 pages, 2 figures. To be published in the proceedings of SUSY09,
Northeastern University, Boston, M
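The two gap observables described above (transverse energy between the two hardest jets, and a central-jet veto) can be sketched in a few lines of Python; the event format and veto threshold here are assumptions for illustration, not the analysis cuts used in the study:

```python
def gap_observables(jets, veto_pt=20.0):
    """Given one event's jets as (pt_GeV, rapidity) pairs, return
    (E_T deposited between the two hardest jets,
     whether a third jet above veto_pt lies in that gap)."""
    lead = sorted(jets, key=lambda j: j[0], reverse=True)[:2]
    y_lo, y_hi = sorted(j[1] for j in lead)
    between = [j for j in jets if j not in lead and y_lo < j[1] < y_hi]
    et_gap = sum(j[0] for j in between)
    has_third = any(j[0] > veto_pt for j in between)
    return et_gap, has_third

# A colour-singlet-like event: two hard jets at |y| ~ 2, quiet centre.
print(gap_observables([(300.0, -2.1), (280.0, 2.0), (5.0, 0.3)]))
# A colour-octet-like event with substantial central radiation.
print(gap_observables([(300.0, -2.1), (280.0, 2.0), (60.0, 0.1)]))
```

Colour-singlet gaugino exchange suppresses radiation in the gap, so the signal populates the low-E_T, no-third-jet region where the QCD background is small.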
Constraining the Accretion Geometry of the Intermediate Polar EX Hya Using NuSTAR, Swift, and Chandra Observations
In magnetically accreting white dwarfs, the height above the white dwarf surface at which the standing shock forms is intimately related to the accretion rate and the white dwarf mass. However, it is difficult to measure. We obtained new data with NuSTAR and Swift that, together with archival Chandra data, allow us to constrain the height of the shock in the intermediate polar EX Hya. We conclude that the shock has to form at a distance of at least about one white dwarf radius from the surface in order to explain the weak Fe Kα 6.4 keV line, the absence of a reflection hump in the high-energy continuum, and the energy dependence of the white dwarf spin pulsed fraction. Additionally, the NuSTAR data allowed us to measure the true, uncontaminated hard X-ray (12-40 keV) flux, whose measurement had been contaminated by the nearby galaxy cluster Abell 3528 in non-imaging X-ray instruments.
Fil: Luna, Gerardo Juan Manuel. Consejo Nacional de Investigaciones Científicas y Técnicas. Oficina de Coordinación Administrativa Ciudad Universitaria. Instituto de Astronomía y Física del Espacio - Universidad de Buenos Aires. Facultad de Ciencias Exactas y Naturales. Instituto de Astronomía y Física del Espacio; Argentina
Fil: Mukai, K. National Aeronautics and Space Administration; United States
Fil: Orio, M. Università di Padova; Italy
Fil: Zemko, P. Università di Padova; Italy
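Why the shock height matters can be illustrated with the standard strong-shock estimate (a generic textbook sketch, not the authors' analysis; the white dwarf mass and radius below are assumed round numbers): gas free-falls down to the shock at radius r = R_wd + h, so a taller shock means a smaller free-fall velocity and a cooler post-shock plasma.

```python
# kT_shock = (3/8) * mu * m_H * v_ff^2,  with  v_ff^2 ~ 2 G M / (R + h)
# (free fall from large distance; all numbers are generic assumptions).
G   = 6.674e-8      # gravitational constant [cgs]
M_H = 1.673e-24     # hydrogen mass [g]
KEV = 1.602e-9      # erg per keV
MU  = 0.615         # mean molecular weight, fully ionised solar mix

def shock_temperature_keV(M_g, R_cm, h_cm):
    """Post-shock temperature for a strong shock at height h [cm]
    above a white dwarf of mass M [g] and radius R [cm]."""
    v2 = 2.0 * G * M_g / (R_cm + h_cm)
    return (3.0 / 8.0) * MU * M_H * v2 / KEV

M = 0.79 * 1.989e33   # assumed ~0.79 solar-mass white dwarf [g]
R = 7.0e8             # assumed ~7000 km radius [cm]

for h_over_R in (0.0, 1.0):
    kT = shock_temperature_keV(M, R, h_over_R * R)
    print(f"shock at h = {h_over_R:.0f} R_wd -> kT_shock ~ {kT:.0f} keV")
```

Since kT scales as 1/(R + h), raising the shock from the surface to h ~ R halves the shock temperature; a tall shock also subtends a smaller solid angle at the surface, which is why it weakens both the reflection hump and the fluorescent Fe Kα line.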