On the Dark Matter Solutions to the Cosmic Ray Lepton Puzzle
Recent measurements of cosmic ray leptons by PAMELA, ATIC, HESS and Fermi
revealed interesting excesses. Many authors have suggested that particle Dark Matter (DM)
annihilations could be at the origin of these excesses. In this paper we critically
assess this interpretation by reviewing results that question its naturalness and
robustness. Natural values of the DM particle parameters yield too few leptons, so
models often require signal-enhancement effects, which we constrain here. Since DM
annihilations should also produce antiprotons, we use the PAMELA antiproton-to-proton
ratio measurements to constrain a possible exotic contribution. We also consider the
possibility of an enhancement due to a nearby DM clump; this scenario appears unlikely
when confronted with state-of-the-art cosmological N-body simulations. We conclude that
the bulk of the observed signals most likely has no link with DM and instead constitutes
a new, as yet unconsidered, source of background for searches in these channels.
Comment: 8 pages. Proceedings of the Invisible Universe International Conference 2009, Paris.
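As a point of reference for the naturalness argument above, the lepton source term from DM annihilation is conventionally written (this is the standard parametrization, not an equation quoted from the proceedings) as
\[
  Q(E,\vec{x}) \;=\; \frac{B}{2}\,\langle\sigma v\rangle
  \left(\frac{\rho(\vec{x})}{m_\chi}\right)^{2}\frac{dN_{e^\pm}}{dE},
\]
where $\rho$ is the DM density, $m_\chi$ the DM mass, $dN_{e^\pm}/dE$ the lepton spectrum per annihilation, and $B$ a possible boost factor. For a thermal relic, $\langle\sigma v\rangle \simeq 3\times10^{-26}\,\mathrm{cm^3\,s^{-1}}$, which is why fitting the measured excesses typically requires $B \gg 1$; it is this enhancement that the authors constrain.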
Seeking particle dark matter in the TeV sky
If dark matter is made of new particles, their annihilations in the early Universe
are required to reproduce the correct dark matter abundance. This process can also occur
today in dense regions of our Galaxy such as the Galactic center, dwarf galaxies and
other types of subhaloes. High-energy gamma-rays are expected to be produced in dark
matter particle collisions and could be detected by ground-based Cherenkov telescopes
such as HESS, MAGIC and VERITAS. The main experimental challenges in constraining
particle dark matter models are reviewed, making explicit the pros and cons inherent to
this technique, together with current results from running observatories. The main
results of dark matter searches towards selected targets with Cherenkov telescopes are
presented. Finally, we focus on a new way to search for Galactic subhaloes with such
telescopes, based on wide-field surveys, as well as on future prospects.
Comment: 12 pages, 10 figures. To appear in the proceedings of the eleventh international symposium Frontiers of Fundamental Physics.
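For context, the gamma-ray flux expected from DM annihilation in a given target factorizes into a particle-physics term and an astrophysical term (a standard expression, not quoted from these proceedings):
\[
  \frac{d\Phi_\gamma}{dE} \;=\;
  \frac{\langle\sigma v\rangle}{8\pi m_\chi^{2}}\,\frac{dN_\gamma}{dE}
  \times
  \int_{\Delta\Omega}\int_{\mathrm{l.o.s.}} \rho^{2}(l,\Omega)\,dl\,d\Omega ,
\]
valid for self-conjugate DM of mass $m_\chi$. The second factor, the so-called J-factor, scales as the squared DM density integrated along the line of sight, which is why the dense targets listed above (the Galactic center, dwarf galaxies, subhaloes) dominate the search strategy.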
Decoherence and quantum trajectories
Decoherence is the process by which quantum systems interact and become
correlated with their external environments; quantum trajectories are a
powerful technique by which decohering systems can be resolved into stochastic
evolutions, conditioned on different possible "measurements" of the
environment. By calling on recently developed tools from quantum information
theory, we can analyze simplified models of decoherence, explicitly quantifying
the flow of information and randomness between the system, the environment, and
potential observers.
Comment: 14 pages, Springer LNP LaTeX macros, 1 figure in encapsulated PostScript format. To appear in the proceedings of DICE 2002.
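Concretely, the decoherence described above is generated by a Lindblad master equation, and a quantum trajectory is one of its standard unravelings (stated here in generic form rather than taken from the paper):
\[
  \dot\rho \;=\; -\frac{i}{\hbar}\,[H,\rho]
  \;+\; \sum_k \Big( L_k \rho L_k^\dagger
  - \tfrac{1}{2}\{ L_k^\dagger L_k,\, \rho \} \Big).
\]
In the quantum-jump unraveling, a pure state evolves between jumps under the non-Hermitian Hamiltonian $H_{\mathrm{eff}} = H - \tfrac{i\hbar}{2}\sum_k L_k^\dagger L_k$; in each interval $dt$, jump $k$ occurs with probability $p_k = \langle\psi|L_k^\dagger L_k|\psi\rangle\,dt$ and sends $|\psi\rangle \to L_k|\psi\rangle/\|L_k|\psi\rangle\|$. Averaging $|\psi\rangle\langle\psi|$ over many such stochastic runs, each conditioned on one record of environmental "measurements", recovers the decohering $\rho$.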
Auxetic two-dimensional lattice with Poisson's Ratio arbitrarily close to -1
In this paper we propose a new lattice structure having macroscopic Poisson's
ratio arbitrarily close to the stability limit -1. We experimentally tested the
effective Poisson's ratio of the microstructured medium; the uniaxial test was
performed on a thermoplastic lattice produced with 3D printing technology. A
theoretical analysis of the effective properties has been performed, and the
macroscopic constitutive properties are given in fully analytical form as
functions of the constitutive properties of the lattice elements and of the
geometry of the microstructure. The analysis has been carried out for three
micro-geometries, leading to an isotropic behaviour in the cases of three-fold
and six-fold symmetry and to a cubic behaviour in the case of four-fold symmetry.
Comment: 26 pages, 12 figures (26 subfigures).
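The value -1 is a stability limit in the following standard sense (general two-dimensional elasticity, not a result of the paper): for a 2D isotropic medium with area bulk modulus $K$ and shear modulus $G$, the Poisson's ratio is
\[
  \nu \;=\; \frac{K - G}{K + G},
\]
so positive definiteness of the strain energy ($K > 0$, $G > 0$) confines $\nu$ to the open interval $(-1, 1)$. Approaching $\nu \to -1$ therefore requires $G \gg K$, i.e. a microstructure far stiffer in shear than in dilation, which is the regime a lattice with $\nu \approx -1$ must realize.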
The Architecture of MEG Simulation and Analysis Software
MEG (Mu to Electron Gamma) is an experiment dedicated to the search for the
decay $\mu^+ \rightarrow e^+ \gamma$, which is strongly suppressed in the Standard
Model but predicted at an accessible rate in several supersymmetric extensions of
it. MEG is a small-size experiment (with a limited number of physicists at
any time) and has a life span of about 10 years. The limited human resources
available, in particular in the core offline group, emphasized the importance
of reusing software and exploiting existing expertise. Great care has been
devoted to providing a simple system that hides implementation details from the
average programmer. This allowed many members of the collaboration to
contribute to the development of the experiment's software with limited
programming skills. The offline software is based on two frameworks: REM, in
FORTRAN 77, used for the event generation and detector simulation package GEM,
based on GEANT 3; and ROME, in C++, used in the readout simulation Bartender
and in the reconstruction and analysis program Analyzer. The event display is
based on GEANT 3 graphics libraries in the simulation and on ROOT graphics
libraries in the reconstruction. Data are stored in different formats at
various stages of the processing. The frameworks include utilities for
input/output, database handling and format conversion that are transparent to
the user.
Comment: Presented at the IEEE NSS, Knoxville, 2010. Revised according to the referee's remarks. Accepted by the European Physical Journal Plus.
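To illustrate the "simple system that hides implementation details" described above, here is a purely hypothetical C++ sketch of the kind of task/event-loop separation such frameworks provide; every class and method name below is invented for illustration, and none of it is the actual ROME or REM API:

    // Hypothetical sketch only -- not the actual ROME/REM API.
    // The framework owns the event loop, I/O and format conversion;
    // a collaborator with limited programming skills only fills in
    // the per-event physics code of a Task subclass.
    #include <memory>
    #include <vector>

    struct Event { /* decoded detector data for one trigger */ };

    class Task {                                  // user-facing interface
    public:
        virtual ~Task() = default;
        virtual void beginRun() {}                // e.g. book histograms
        virtual void processEvent(Event&) = 0;    // physics goes here
        virtual void endRun() {}                  // e.g. write summaries
    };

    class Framework {                             // hides I/O and bookkeeping
    public:
        void add(std::unique_ptr<Task> t) { tasks_.push_back(std::move(t)); }
        void run() {
            for (auto& t : tasks_) t->beginRun();
            Event e;
            while (nextEvent(e))                  // file formats, database
                for (auto& t : tasks_)            // access and conversions
                    t->processEvent(e);           // stay invisible here
            for (auto& t : tasks_) t->endRun();
        }
    private:
        bool nextEvent(Event&) { return false; }  // stub standing in for real I/O
        std::vector<std::unique_ptr<Task>> tasks_;
    };

    int main() { Framework().run(); }             // an empty but complete job

The point of the pattern is that code written against the Task interface never sees the storage formats, matching the transparency claim in the abstract.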
