
    Gamma ray tests of Minimal Dark Matter

    We reconsider the model of Minimal Dark Matter (a fermionic, hypercharge-less quintuplet of the EW interactions) and compute its gamma-ray signatures. We compare them with a number of gamma-ray probes: the galactic halo diffuse measurements, the Galactic Center line searches and recent dwarf galaxy observations. We find that the original minimal model, whose mass is fixed at 9.4 TeV by the relic abundance requirement, is constrained by the line searches from the Galactic Center: it is ruled out if the Milky Way possesses a cuspy profile such as NFW, but it is still allowed if it has a cored one. Observations of dwarf spheroidal galaxies are also relevant (in particular searches for lines), and ongoing astrophysical progress on these systems has the potential to eventually rule out the model. We also explore a wider mass range, which applies to the case in which the relic abundance requirement is relaxed. Most of our results can be safely extended to the larger class of multi-TeV WIMP DM annihilating into massive gauge bosons. Comment: 25 pages, 8 figures. v2: a few comments and references added, matches version published on JCA
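
    As background for why the halo profile matters here: for self-conjugate DM the annihilation flux from a region of the sky scales with the line-of-sight integral of the squared density, so a cuspy NFW profile (with density rising steeply towards the Galactic Center) gives a much larger line signal than a cored one. A schematic form of the standard flux formula is sketched below; the notation is generic and not taken from this paper.

```latex
% Schematic gamma-ray annihilation flux for self-conjugate DM (generic notation)
\frac{d\Phi_\gamma}{dE} \;=\; \frac{\langle \sigma v \rangle}{8\pi\, m_{\rm DM}^2}\,
  \frac{dN_\gamma}{dE}\,
  \underbrace{\int_{\Delta\Omega} d\Omega \int_{\rm l.o.s.} \rho^2(s)\, ds}_{J\text{-factor}}
```

    The cross section, mass and photon spectrum are fixed by the model; only the J-factor depends on whether the Milky Way profile is cuspy or cored, which is what drives the "ruled out vs. still allowed" distinction quoted above.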

    Be vicarious: the challenge for project management in the service economy

    Purpose. The paper aims to answer the following questions: what are the critical dynamic capabilities needed to survive in the rubber landscape of the service economy? Does a dynamic capabilities provider exist in the service economy? Methodology. The paper combines the literature review on the dynamic capability perspective and that on vicariance, applying them to Project Management professional services. Findings. Firstly, the paper identifies vicariance as an intriguing dynamic capability, crucial to survive in the rubber landscape of the service economy. Secondly, the paper sheds light on Project Management (PM) as a vicarious agent that provides vicariance. Practical implications. For each critical organizational dimension, the paper identifies the links between the service economy challenges and the vicariance typology required of the project manager to face those challenges. Originality/value. The approach of conceiving PM as a vicarious agent that provides vicariance is original and leads to new insights into the management of professional services. In fact, on the one hand, dynamic capabilities cannot easily be bought through a market transaction; on the other hand, they must be built. This building can be achieved internally, by the organization itself (i.e. hierarchy), or through a partnership (i.e. a hybrid form between hierarchy and market). PM professional services enrich organizations with additional information variety according to a hybrid (i.e. non-market) coordination model

    The PMBOK standard evolution: leading the rising complexity

    The aim of this work is to illuminate how the Standard for Project Management (Part II of the PMBOK® Guide) has evolved over the last 30 years as it has introjected the perspective of complexity. The many contexts (private firms, public institutions, etc.) in which Project Management is applied are becoming more and more complex (i.e. uncertain and characterized by unpredictable feedbacks among their own variables and their environments). This calls for an enrichment (and perhaps a new conceptualization) of the endowment of information variety provided by the Standard for Project Management, with respect to the specific requisite variety demanded at the local level (i.e. in the specific organizational contexts), in order to lead a project with efficiency, effectiveness and sustainability. The traditional Standard for Project Management can no longer be considered a “comfort zone” (i.e. a set of established and “familiar” frameworks, rules and tools aiming to ensure certain and predictable results). On the contrary, the Standard for Project Management should shift towards an open standard, one able to consistently co-evolve with increasingly complex contexts that ever more insistently call for new tools, creative solutions and original combinations of exploitative and explorative knowledge

    Homeopathic Dark Matter, or how diluted heavy substances produce high energy cosmic rays

    We point out that current and planned telescopes have the potential of probing annihilating Dark Matter (DM) with a mass of O(100) TeV and beyond. As a target for such searches, we propose models where DM annihilates into lighter mediators, themselves decaying into Standard Model (SM) particles. These models make it possible to reliably compute the energy spectra of the SM final states, and to naturally evade the unitarity bound on the DM mass. Indeed, long-lived mediators may cause an early matter-dominated phase in the evolution of the Universe and, upon decaying, dilute the density of preexisting relics, thus allowing for very large DM masses. We compute this dilution in detail and provide results in a ready-to-use form. Considering for concreteness a model of dark U(1) DM, we then study both dilution and the signals at various high-energy telescopes observing gamma rays, neutrinos and charged cosmic rays. This study enriches the physics case of these experiments, and opens a new observational window on heavy new physics sectors. Comment: 39 pages, 11 figures. v2: reference added, fixed technical issue causing 2 figures not to show properly. v3: BBN constraints amended, conclusions unchanged. Matches published version
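
    To make the dilution mechanism mentioned above concrete (in generic notation, not the paper's): a long-lived mediator that comes to dominate the energy density and then decays injects entropy into the bath, rescaling the comoving abundance of any pre-existing relic, as sketched schematically below.

```latex
% Entropy injection from a late-decaying mediator (schematic, generic notation)
Y_{\rm DM} \;\to\; \frac{Y_{\rm DM}}{D}, \qquad
D \;\equiv\; \frac{s(T_{\rm after})\, a_{\rm after}^3}{s(T_{\rm before})\, a_{\rm before}^3} \;\ge\; 1,
\qquad
T_{\rm after} \;\sim\; \left(\frac{90}{\pi^2 g_*}\right)^{1/4} \sqrt{\Gamma_{\rm med}\, \bar M_{\rm Pl}}
```

    A smaller mediator width Γ_med means a longer matter-dominated phase and a larger dilution D, which is how such models can accommodate DM masses above the usual unitarity bound while still matching the observed relic abundance.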

    Asymmetric dark matter: residual annihilations and self-interactions

    Dark matter (DM) coupled to light mediators has been invoked to resolve the putative discrepancies between collisionless cold DM and galactic structure observations. However, γ-ray searches and the CMB strongly constrain such scenarios. To ease the tension, we consider asymmetric DM. We show that, contrary to the common lore, detectable annihilations occur even for large asymmetries, and derive bounds from the CMB, γ-ray, neutrino and antiproton searches. We then identify the viable space for self-interacting DM. Direct detection does not exclude this scenario, but provides a way to test it. Comment: 20 pages, 4 figures
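
    A back-of-the-envelope way to see why detectable annihilations survive even for large asymmetries (the notation here is mine, not the paper's): write the late-time antiparticle-to-particle ratio as r∞ ≡ n̄/n ≤ 1. At fixed total DM mass density the annihilation rate per volume scales as n n̄, so relative to the fully symmetric case (r∞ = 1) the signal is rescaled as sketched below.

```latex
% Rescaling of the annihilation signal for asymmetric Dirac DM (schematic)
\frac{n\,\bar n}{\left(n\,\bar n\right)_{\rm sym}}
  \;=\; \frac{4\, r_\infty}{(1+r_\infty)^2},
\qquad n + \bar n = \frac{\rho_{\rm DM}}{m_{\rm DM}}\ \text{fixed},
\qquad r_\infty \equiv \frac{\bar n}{n}
```

    Even r∞ of order 10⁻² leaves a few percent of the symmetric signal, and the enhanced annihilation cross sections typical of light-mediator models can compensate for this suppression, which is why the CMB, γ-ray, neutrino and antiproton bounds remain relevant in this scenario.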

    Study of Cassini and New Horizons trajectories using JPL SPICE library

    Obtaining scientific data in interplanetary missions is based on a series of complex algorithms that serve to plan and develop the experiments. At different stages of a mission, because the conditions planned for each observation can end up not being the ones that actually occur, the best timing to carry out each observation must be recalculated. The objective of this project is therefore to understand and implement a subset of the algorithms used to obtain scientific data, using Matlab and SPICE, a library created by JPL that provides a set of functions helpful for designing observations. In this project, a series of SPICE-based algorithms have been developed to calculate occultations, transits and the illumination of celestial bodies. Specifically, the project studies the flybys made by the Cassini, Voyager and New Horizons probes of the different planets they passed. For the Cassini flyby of Jupiter, observations have been designed in two different ways: one considering whether the camera was pointing at the planet and the other without taking it into account. The results obtained from the algorithms have been compared with the real trajectories already flown, with satisfactory results. It has also been verified from navigated images that the algorithms work correctly. Another algorithm has also been created to calculate the time interval during which the transit of a celestial body in front of another can be seen from any planet. In particular, it has been tested with the transit of Mercury on November 11, 2019, achieving the expected result
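
    The thesis code itself is not reproduced here. As an illustration of the kind of geometry query SPICE supports, the sketch below uses SpiceyPy, the Python wrapper of the same JPL toolkit the project calls from Matlab. The meta-kernel name and the epoch are placeholders chosen for the example, not values taken from the project.

```python
# Minimal SpiceyPy sketch: ask whether one body occults (or transits) another
# as seen from a spacecraft at a given epoch. Requires the relevant SPICE
# kernels (LSK, PCK, SPKs); the meta-kernel path below is a placeholder.
import spiceypy as spice

# Load a meta-kernel listing the required kernels (placeholder file name).
spice.furnsh("cassini_jupiter_flyby.tm")

# Example epoch (UTC -> ephemeris time, TDB seconds past J2000).
et = spice.str2et("2000-12-30T10:00:00")

# Is Io hidden behind, or passing in front of, Jupiter as seen from Cassini?
# occult() returns an integer code: 0 = no event, negative values = the first
# target is occulted by the second, positive values = the first target is in
# front of (transiting) the second.
code = spice.occult("IO", "ELLIPSOID", "IAU_IO",
                    "JUPITER", "ELLIPSOID", "IAU_JUPITER",
                    "LT+S", "CASSINI", et)
print("occultation code at", spice.et2utc(et, "C", 0), ":", code)

# Unload all kernels when done.
spice.kclear()
```

    Scanning such a query over a grid of epochs (or using SPICE's geometry-finder routines) yields the visibility windows that the project recomputes when the planned observation conditions change.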