
    Sub-arcsecond radio and optical observations of the likely counterpart to the gamma-ray source 2FGL J2056.7+4939

    We have searched and reviewed all multi-wavelength data available for the region towards the gamma-ray source 2FGL J2056.7+4939 in order to constrain its possible counterpart at lower energies. As a result, only a point-like optical/infrared source with flat-spectrum radio emission is found to be consistent with all X-ray and gamma-ray error circles. Its structure is marginally resolved at radio wavelengths at the sub-arcsecond level. An extragalactic scenario appears to be the most likely interpretation for this object.
    Comment: 5 pages, 3 figures, 1 table

    Pointfree factorization of operation refinement

    The standard operation refinement ordering is a kind of “meet of opposites”: non-determinism reduction suggests “smaller” behaviour while increase of definition suggests “larger” behaviour. Groves’ factorization of this ordering into two simpler relations, one per refinement concern, makes it more mathematically tractable but is far from fully exploited in the literature. We present a pointfree theory for this factorization which is more agile and calculational than the standard set-theoretic approach. In particular, we show that factorization leads to a simple proof of structural refinement for arbitrary parametric types and exploit factor instantiation across different subclasses of (relational) operation. The prospect of generalizing the factorization to coalgebraic refinement is discussed.
    Fundação para a Ciência e a Tecnologia (FCT) - PURE Project (Program Understanding and Re-engineering: Calculi and Applications), contract POSI/ICHS/44304/2002
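
    For orientation, the ordering being factored can be written in pointfree relational notation. The lines below are a hedged reconstruction in algebra-of-programming style, writing \delta S for the domain coreflexive of a relation S; the paper's exact symbols and definitions may differ.

        S \sqsubseteq R \;\equiv\; \delta S \subseteq \delta R \;\wedge\; R \cdot \delta S \subseteq S

    That is, R is at least as defined as S and, on S's domain, R is no more nondeterministic. Groves' two concerns then split the ordering through an intermediate M which first extends S outside its domain (definition increase) and is then pruned to R (nondeterminism reduction):

        S \preceq_{def} M \;\equiv\; M \cdot \delta S = S
        M \preceq_{nd} R \;\equiv\; R \subseteq M \;\wedge\; \delta M \subseteq \delta R
        S \sqsubseteq R \;\equiv\; \langle \exists\, M \;::\; S \preceq_{def} M \;\wedge\; M \preceq_{nd} R \rangle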

    Calculating invariants as coreflexive bisimulations

    Invariants, bisimulations and assertions are the main ingredients of coalgebra theory applied to software systems. In this paper we reduce the first to a particular case of the second and show how both together pave the way to a theory of coalgebras which regards invariant predicates as types. An outcome of such a theory is a calculus of invariants’ proof obligation discharge, a fragment of which is presented in the paper. The approach has two main ingredients: one is that of adopting relations as “first class citizens” in a pointfree reasoning style; the other lies in a synergy found between a relational construct, Reynolds’ relation on functions involved in the abstraction theorem on parametric polymorphism, and the coalgebraic account of bisimulations and invariants. This leads to an elegant proof of the equivalence between two different definitions of bisimulation found in the coalgebra literature (due to B. Jacobs and Aczel & Mendler, respectively) and to their instantiation to the classical Park-Milner definition popular in process algebra.
    Partially supported by the Fundação para a Ciência e a Tecnologia, Portugal, under grant number SFRH/BD/27482/2006
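
    As a pointer to the synergy just mentioned, Reynolds' arrow combinator gives both notions in one line; this is a hedged sketch with our symbols (c and d coalgebras of a relator F, \Phi a coreflexive), not necessarily the paper's exact statement.

        f\,(R \to S)\,g \;\equiv\; f \cdot R \subseteq S \cdot g
        R \text{ is a bisimulation between } c \text{ and } d \;\equiv\; c\,(R \to F\,R)\,d
        \Phi \text{ is an invariant of } c \;\equiv\; c\,(\Phi \to F\,\Phi)\,c

    Unfolding the arrow, the bisimulation condition reads c \cdot R \subseteq (F\,R) \cdot d, and invariance is its instantiation to a coreflexive \Phi \subseteq id — exactly the reduction of invariants to (coreflexive) bisimulations announced above.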

    Observation of the Ankle and Evidence for a High-Energy Break in the Cosmic Ray Spectrum

    We have measured the cosmic ray spectrum at energies above 10^{17} eV using the two air fluorescence detectors of the High Resolution Fly's Eye experiment operating in monocular mode. We describe the detector, PMT and atmospheric calibrations, and the analysis techniques for the two detectors. We fit the spectrum to models describing galactic and extragalactic sources. Our measured spectrum gives an observation of a feature known as the “ankle” near 3×10^{18} eV, and strong evidence for a suppression near 6×10^{19} eV.
    Comment: 14 pages, 9 figures. To appear in Physics Letters B. Accepted version
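
    To illustrate the kind of spectral fit described, here is a minimal Python sketch of a doubly broken power law with breaks at the ankle and the suppression; the normalization and indices below are placeholders, and only the two break energies are taken from the abstract.

        import numpy as np

        def broken_power_law(E, J0, g1, g2, g3, E_ankle=3e18, E_supp=6e19):
            """Differential flux J(E) with a break at the ankle and a
            high-energy suppression; g1, g2, g3 are the spectral indices
            of the three segments (continuous at both break energies)."""
            below_ankle = J0 * (E / E_ankle) ** g1
            mid = J0 * (E / E_ankle) ** g2
            above_supp = J0 * (E_supp / E_ankle) ** g2 * (E / E_supp) ** g3
            return np.where(E < E_ankle, below_ankle,
                            np.where(E < E_supp, mid, above_supp))

        # Evaluate the model on a log-spaced grid above 10^{17} eV.
        E = np.logspace(17, 20.5, 50)
        flux = broken_power_law(E, J0=1e-33, g1=-3.3, g2=-2.8, g3=-5.0)

    In a real monocular analysis such a model would be compared with observed event counts only after folding in the detector exposure.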

    UHECR as Decay Products of Heavy Relics? The Lifetime Problem

    The essential features underlying the top-down scenarios for UHECR are discussed, namely, the stability (or lifetime) imposed on the heavy objects (particles) whatever they be: topological and non-topological solitons, X-particles, cosmic defects, microscopic black holes, fundamental strings. We provide a unified formula for the quantum decay rate of all these objects as well as for particle decays in the standard model. The key point in the top-down scenarios is the necessity to adjust the lifetime of the heavy object to the age of the universe. This ad-hoc requirement needs a very high dimensional operator to govern its decay and/or an extremely small coupling constant. The natural lifetimes of such heavy objects are, however, microscopic times associated with the GUT energy scale (\sim 10^{-28} s or shorter). It is at this energy scale (by the end of inflation) that they could have been abundantly formed in the early universe, and it seems natural that they decayed shortly after being formed.
    Comment: 11 pages, LaTeX, no figures, updated version
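
    The lifetime problem can be made concrete with a one-line dimensional estimate; the generic perturbative decay rate below is illustrative and is not the paper's unified formula.

        \Gamma \sim \frac{g^2 M_X}{8\pi}, \qquad
        \tau = \frac{\hbar}{\Gamma} \gtrsim t_{univ} \approx 4\times 10^{17}\,\mathrm{s}
        \;\Rightarrow\; g^2 \lesssim \frac{8\pi\,\hbar}{t_{univ}\,M_X} \sim 10^{-57}
        \quad (M_X \sim 10^{16}\,\mathrm{GeV})

    In words: for a GUT-scale relic to survive to the present epoch, its effective coupling squared must be tuned down by dozens of orders of magnitude, or its decay must proceed through a correspondingly high-dimensional operator.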

    Transforming data by calculation

    This paper addresses the foundations of data-model transformation. A catalog of data mappings is presented which includes abstraction and representation relations and associated constraints. These are justified in an algebraic style via the pointfree transform, a technique whereby predicates are lifted to binary relation terms (of the algebra of programming) in a two-level style encompassing both data and operations. This approach to data calculation, which also includes transformation of recursive data models into “flat” database schemes, is offered as an alternative to standard database design from abstract models. The calculus is also used to establish a link between the proposed transformational style and bidirectional lenses developed in the context of the classical view-update problem.
    Fundação para a Ciência e a Tecnologia (FCT)
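
    For orientation, the abstraction/representation pairs in such a catalog are typically connected by the generic data-refinement inequation sketched below; this is the standard shape in this line of work, hedged, rather than a quotation of the paper.

        A \leq B \;\equiv\; \langle \exists\, R, F \;::\; R \cdot F = id_A \rangle

    where F : A \to B is the representation, R : B \to A the abstraction, and R \cdot F = id_A the no-loss constraint: every A-value survives the round trip through its B-representation.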

    A Likelihood Method for Measuring the Ultrahigh Energy Cosmic Ray Composition

    Air fluorescence detectors traditionally determine the dominant chemical composition of the ultrahigh energy cosmic ray flux by comparing the averaged slant depth of the shower maximum, X_{max}, as a function of energy to the slant depths expected for various hypothesized primaries. In this paper, we present a method to make a direct measurement of the expected mean number of protons and iron by comparing the shapes of the expected X_{max} distributions to the distribution for data. The advantages of this method include the use of information from the full distribution and its ability to calculate a flux for various cosmic ray compositions. The same method can be expanded to marginalize uncertainties due to the choice of spectra, hadronic models and atmospheric parameters. We demonstrate the technique with independent simulated data samples from a parent sample of protons and iron. We accurately predict the number of protons and iron in the parent sample and show that the uncertainties are meaningful.
    Comment: 11 figures, 22 pages, accepted by Astroparticle Physics
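
    A minimal sketch of the kind of binned likelihood such a method maximizes is given below, assuming normalized template X_{max} distributions for protons and iron; the Poisson-per-bin model and all names are illustrative assumptions, not the authors' code.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import poisson

        def neg_log_likelihood(counts, n_data, p_tmpl, fe_tmpl):
            """Negative Poisson log-likelihood of the observed X_max
            histogram n_data, given expected proton/iron event counts
            and per-bin template probabilities (each summing to 1)."""
            n_p, n_fe = counts
            mu = n_p * p_tmpl + n_fe * fe_tmpl  # expected counts per bin
            return -np.sum(poisson.logpmf(n_data, mu))

        # Toy check: draw a fake histogram from a 600/400 proton/iron mix.
        rng = np.random.default_rng(0)
        x = np.arange(20)
        p_tmpl = np.exp(-0.5 * ((x - 12) / 3.0) ** 2)
        p_tmpl /= p_tmpl.sum()
        fe_tmpl = np.exp(-0.5 * ((x - 8) / 2.0) ** 2)
        fe_tmpl /= fe_tmpl.sum()
        n_data = rng.poisson(600 * p_tmpl + 400 * fe_tmpl)

        fit = minimize(neg_log_likelihood, x0=[500.0, 500.0],
                       args=(n_data, p_tmpl, fe_tmpl),
                       bounds=[(0, None), (0, None)])
        print(fit.x)  # recovers roughly [600, 400]

    Fitting event counts rather than fractions is what lets such a method quote a flux per composition, and nuisance parameters (spectra, hadronic models, atmosphere) can be added to the likelihood and marginalized.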

    The New Look pMSSM with Neutralino and Gravitino LSPs

    The pMSSM provides a broad perspective on SUSY phenomenology. In this paper we generate two new, very large sets of pMSSM models with sparticle masses extending up to 4 TeV, where the lightest supersymmetric particle (LSP) is either a neutralino or gravitino. The existence of a gravitino LSP necessitates a detailed study of its cosmological effects, and we find that Big Bang Nucleosynthesis places strong constraints on this scenario. Both sets are subjected to a global set of theoretical, observational and experimental constraints, resulting in a sample of \sim 225k viable models for each LSP type. The characteristics of these two model sets are briefly compared. We confront the neutralino LSP model set with searches for SUSY at the 7 TeV LHC using both the missing-ET (MET) and non-MET ATLAS analyses. In the MET case, we employ Monte Carlo estimates of the ratios of the SM backgrounds at 7 and 8 TeV to rescale the 7 TeV data-driven ATLAS backgrounds to 8 TeV. This allows us to determine the pMSSM parameter space coverage for this collision energy. We find that an integrated luminosity of \sim 5-20 fb^{-1} at 8 TeV would yield a substantial increase in this coverage compared to that at 7 TeV and can probe roughly half of the model set. If the pMSSM is not discovered during the 8 TeV run, then our model set will be essentially void of gluinos and lightest first- and second-generation squarks that are \lesssim 700-800 GeV, which is much less than the analogous mSUGRA bound. Finally, we demonstrate that non-MET SUSY searches continue to play an important role in exploring the pMSSM parameter space. These two pMSSM model sets can be used as the basis for investigations for years to come.
    Comment: 54 pages, 22 figures; typos fixed, references added
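
    At heart, the model-set generation described above is a constrained random scan; the sketch below shows only that shape, with purely hypothetical parameter ranges and a stand-in constraint function (a real scan chains a spectrum generator and experiment-level checks for each point).

        import random

        # Hypothetical flat ranges for a few of the pMSSM's 19 parameters
        # (illustrative; the paper's ranges, units and priors differ).
        RANGES = {"M_1": (-4000.0, 4000.0), "M_2": (-4000.0, 4000.0),
                  "mu": (-4000.0, 4000.0), "M_3": (400.0, 4000.0),
                  "tan_beta": (1.0, 60.0)}

        def draw_model(rng):
            return {k: rng.uniform(lo, hi) for k, (lo, hi) in RANGES.items()}

        def passes_constraints(model):
            """Stand-in for the global theoretical/observational/
            experimental filter (flavor, dark matter, collider limits)."""
            return abs(model["mu"]) > 100.0  # placeholder cut only

        rng = random.Random(42)
        viable = [m for m in (draw_model(rng) for _ in range(100_000))
                  if passes_constraints(m)]
        print(len(viable), "viable models retained")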