
    On The Origin Of The Gamma Rays From The Galactic Center

    The region surrounding the center of the Milky Way is both astrophysically rich and complex, and is predicted to contain very high densities of dark matter. Utilizing three years of data from the Fermi Gamma-Ray Space Telescope (and the recently available Pass 7 ultraclean event class), we study the morphology and spectrum of the gamma-ray emission from this region and find evidence of a spatially extended component which peaks at energies between 300 MeV and 10 GeV. We compare our results to those reported by other groups and find good agreement. The extended emission could potentially originate either from the annihilations of dark matter particles in the inner galaxy, or from high-energy protons, accelerated by the Milky Way's supermassive black hole, colliding with gas. If interpreted as dark matter annihilation products, the emission spectrum favors dark matter particles with a mass in the range of 7-12 GeV (if annihilating dominantly to leptons) or 25-45 GeV (if annihilating dominantly to hadronic final states). The intensity of the emission corresponds to a dark matter annihilation cross section consistent with that required to generate the observed cosmological abundance in the early universe (sigma v ~ 3 x 10^-26 cm^3/s). We also present conservative limits on the dark matter annihilation cross section which are at least as stringent as those derived from other observations. Comment: 13 pages, 11 figures
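
    Editor's note: as background for how such fits constrain the mass and cross section, the gamma-ray flux from annihilating (self-conjugate) dark matter along a line of sight is conventionally written with the standard textbook expression below; this is generic notation, not an equation reproduced from the article:

        \frac{d\Phi_\gamma}{dE} \;=\; \frac{\langle\sigma v\rangle}{8\pi\, m_{\rm DM}^{2}}\,
        \frac{dN_\gamma}{dE}\,\int_{\Delta\Omega}\! d\Omega \int_{\rm l.o.s.} \rho^{2}(r)\, dl .

    For a fixed halo profile, the measured intensity then determines <sigma v> once a mass and annihilation channel (which set dN_gamma/dE) are assumed; the value sigma v ~ 3 x 10^-26 cm^3/s quoted above is the standard thermal-relic benchmark against which the result is compared.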

    Nano-scale reservoir computing

    This work describes preliminary steps towards nano-scale reservoir computing using quantum dots. Our research has focused on the development of an accumulator-based sensing system that reacts to changes in the environment, as well as the development of a software simulation. The investigated systems generate nonlinear responses to inputs that make them suitable for a physical implementation of a neural network. This development will enable miniaturisation of the neurons to the molecular level, leading to a range of applications including the monitoring of changes in materials or structures. The system is based on the optical properties of quantum dots. The paper reports on experimental work on systems using Cadmium Selenide (CdSe) quantum dots and on the various methods used to render the systems sensitive to pH, redox potential or specific ion concentration. Once the quantum dot-based systems are rendered sensitive to these triggers, they can provide a distributed array that can monitor and transmit information on changes within the material. Comment: 8 pages, 9 figures, accepted for publication in Nano Communication Networks, http://www.journals.elsevier.com/nano-communication-networks/. An earlier version was presented at the 3rd IEEE International Workshop on Molecular and Nanoscale Communications (IEEE MoNaCom 2013).
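
    Editor's note: the hardware itself is not simulated here, but the reservoir-computing principle the paper builds on, a fixed nonlinear dynamical system whose responses are read out by a trained linear layer, can be sketched in a few lines of Python. Everything below (sizes, spectral radius, the toy target) is an illustrative assumption, not taken from the article:

        import numpy as np

        # Minimal echo-state-style reservoir: a fixed random nonlinear system
        # produces rich features; only the linear readout is trained.
        rng = np.random.default_rng(0)
        n_in, n_res = 1, 100
        W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
        W = rng.normal(0.0, 1.0, (n_res, n_res))
        W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # keep spectral radius below 1

        def run_reservoir(u):
            """Drive the reservoir with the input sequence u and collect its states."""
            x = np.zeros(n_res)
            states = []
            for u_t in u:
                x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
                states.append(x.copy())
            return np.array(states)

        # Toy task: reconstruct a delayed, nonlinearly transformed version of the input.
        u = rng.uniform(-1.0, 1.0, 500)
        y = np.sin(np.pi * np.roll(u, 3))
        X = run_reservoir(u)
        W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)   # ridge readout
        print("training MSE:", np.mean((X @ W_out - y) ** 2))

    In a physical reservoir, the random recurrent matrix would be replaced by the device dynamics (here, quantum-dot optical responses sensitive to pH, redox potential or ion concentration); only the readout would still be fitted in software.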

    A chemical sensor based on a photonic-crystal L3 nanocavity defined in a silicon-nitride membrane

    The application of a silicon-nitride-based L3 optical nanocavity as a chemical sensor is explored. It is shown that by adjusting the thickness of an ultra-thin Lumogen Red film deposited onto the nanocavity surface, the fundamental optical mode undergoes a progressive red-shift as the layer thickness increases, with the cavity able to detect the presence of a single molecular monolayer. The optical properties of a nanocavity whose surface is coated with a thin layer of a porphyrin-based polymer are also explored. On exposure of the cavity to an acidic vapour, it is shown that changes in the optical properties of the porphyrin film (thickness and refractive index) can be detected through a reversible shift in the cavity mode wavelength. Such effects are described using a finite-difference time-domain model
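
    Editor's note: the reported red-shift with increasing layer thickness is the behaviour expected from first-order electromagnetic cavity perturbation theory; as a rough guide (a standard result, not the paper's FDTD calculation), a small change in the dielectric environment shifts the resonance by

        \frac{\Delta\omega}{\omega} \;\approx\; -\,\frac{\int \Delta\varepsilon(\mathbf{r})\,\lvert\mathbf{E}(\mathbf{r})\rvert^{2}\, dV}{2\int \varepsilon(\mathbf{r})\,\lvert\mathbf{E}(\mathbf{r})\rvert^{2}\, dV},

    so adding higher-index material where the mode field is strong lowers the resonance frequency, i.e. red-shifts the mode wavelength, and a thicker film overlaps more of the evanescent tail of the mode, consistent with the progressive shift described above.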

    Steady-State Analysis of Load Balancing with Coxian-2 Distributed Service Times

    This paper studies load balancing for many-server systems (N servers). Each server has a buffer of size b-1, and can have at most one job in service and b-1 jobs in the buffer. The service time of a job follows the Coxian-2 distribution. We focus on the steady-state performance of load balancing policies in the heavy-traffic regime, such that the normalized load of the system is lambda = 1 - N^(-alpha) for 0 < alpha < 0.5. We identify a set of policies that achieve asymptotic zero waiting. The set includes several classical policies such as join-the-shortest-queue (JSQ), join-the-idle-queue (JIQ), idle-one-first (I1F), and power-of-d-choices (Po-d) with d = O(N^alpha log N). The proof of the main result is based on Stein's method and state space collapse. A key technical contribution of this paper is the iterative state space collapse approach, which leads to a simple generator approximation when applying Stein's method
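
    Editor's note: for a concrete picture of the setting, the sketch below simulates power-of-d-choices dispatching with Coxian-2 service times and d on the order of N^alpha log N, and reports the fraction of jobs that must wait. It is a toy illustration with infinite buffers and arbitrary parameters, not the paper's analysis (which is based on Stein's method, not simulation):

        import heapq, math, random

        def coxian2(mu1=2.0, mu2=1.0, p=0.5):
            # Phase 1 is exponential(mu1); with probability p a second
            # exponential(mu2) phase follows.  Mean = 1/mu1 + p/mu2 = 1 here.
            t = random.expovariate(mu1)
            if random.random() < p:
                t += random.expovariate(mu2)
            return t

        def simulate_pod(N=200, alpha=0.4, num_jobs=200_000, seed=1):
            """Po-d over N FIFO servers, Poisson arrivals at total rate N*(1 - N**-alpha)."""
            random.seed(seed)
            d = max(1, math.ceil(N**alpha * math.log(N)))
            lam = N * (1.0 - N**(-alpha))          # total arrival rate (mean service time = 1)
            queue = [0] * N                        # jobs at each server (in service + waiting)
            departures = []                        # heap of (departure time, server)
            t, waited = 0.0, 0
            for _ in range(num_jobs):
                t += random.expovariate(lam)       # next arrival
                while departures and departures[0][0] <= t:
                    t_dep, s = heapq.heappop(departures)
                    queue[s] -= 1
                    if queue[s] > 0:               # next waiting job starts service
                        heapq.heappush(departures, (t_dep + coxian2(), s))
                s = min(random.sample(range(N), d), key=queue.__getitem__)
                if queue[s] > 0:
                    waited += 1                    # the chosen server is busy: the job waits
                queue[s] += 1
                if queue[s] == 1:                  # server was idle: start service immediately
                    heapq.heappush(departures, (t + coxian2(), s))
            return waited / num_jobs

        print("fraction of jobs that wait:", simulate_pod())

    Under the scaling in the abstract, the waiting fraction should shrink as N grows; this is the zero-waiting behaviour the paper establishes analytically.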

    Efficient Photon Coupling from a Diamond Nitrogen Vacancy Centre by Integration with Silica Fibre

    A central goal in quantum information science is to efficiently interface photons with single optical modes for quantum networking and distributed quantum computing. Here, we introduce and experimentally demonstrate a compact and efficient method for the low-loss coupling of a solid-state qubit, the nitrogen vacancy (NV) centre in diamond, with a single-mode optical fibre. In this approach, single-mode tapered diamond waveguides containing exactly one high-quality NV memory are selected and integrated on tapered silica fibres. Numerical optimization of an adiabatic coupler indicates that near-unity-efficiency photon transfer is possible between the two modes. Experimentally, we find an overall collection efficiency of 18-40% and observe a raw single-photon count rate above 700 kHz. This integrated system enables robust, alignment-free, and efficient interfacing of single-mode optical fibres with single-photon emitters and quantum memories in solids

    Growth Response to Carbadox in Pigs with a High or Low Genetic Capacity for Lean Tissue Growth

    The impact of feeding carbadox from 12 to 75 pounds bodyweight on the rate, efficiency, and composition of growth in pigs with a high or low genetic capacity for lean tissue growth (LG) was evaluated. The high LG pigs gained bodyweight and muscle tissue faster and utilized feed more efficiently than low LG pigs. High LG pigs also had carcasses with more dissectible muscle and less dissectible fat. The pigs’ responses to carbadox feeding were dependent on LG genotype. Feeding carbadox from 13 to 75 pounds bodyweight resulted in improved body growth and efficiency of feed utilization, but the magnitude of the responses was greater in the high LG pigs. Dietary carbadox additions from 13 to 75 pounds also resulted in greater muscle growth rates and carcass muscle content at 75 and 250 pounds in the high but not the low LG genotypes. Based on these data, the value of dietary agents such as carbadox that control or destroy antigens in the body needs to be assessed on the basis of the agent's impact on carcass composition as well as on the rate and efficiency of body growth. Furthermore, the value of the agent will increase as pigs’ genetic capacity for lean tissue growth increases

    IndustReal: A Dataset for Procedure Step Recognition Handling Execution Errors in Egocentric Videos in an Industrial-Like Setting

    Although action recognition for procedural tasks has received notable attention, it has a fundamental flaw in that no measure of success for actions is provided. This limits the applicability of such systems, especially within the industrial domain, since the outcome of procedural actions is often significantly more important than their mere execution. To address this limitation, we define the novel task of procedure step recognition (PSR), focusing on recognizing the correct completion and order of procedural steps. Alongside the new task, we also present the multi-modal IndustReal dataset. Unlike currently available datasets, IndustReal contains procedural errors (such as omissions) as well as execution errors. A significant part of these errors is exclusively present in the validation and test sets, making IndustReal suitable for evaluating the robustness of algorithms to new, unseen mistakes. Additionally, to encourage reproducibility and allow for scalable approaches trained on synthetic data, the 3D models of all parts are publicly available. Annotations and benchmark performance are provided for action recognition and assembly state detection, as well as for the new PSR task. IndustReal, along with the code and model weights, is available at: https://github.com/TimSchoonbeek/IndustReal. Comment: Accepted for WACV 2024. 15 pages, 9 figures, including supplementary material
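
    Editor's note: to make the PSR task concrete, the fragment below shows one naive way to score a predicted sequence of completed steps: a step counts as correct only if it is reported as completed and all of its prerequisite steps were reported before it. This is purely an illustration of the task definition, not IndustReal's evaluation code or metric (see the repository linked above for those); all step names are hypothetical.

        def psr_correct_steps(predicted, prerequisites):
            """predicted: ordered list of step ids reported as completed.
            prerequisites: dict mapping a step id to the set of step ids that must precede it."""
            done, correct = set(), []
            for step in predicted:
                if prerequisites.get(step, set()) <= done:   # all prerequisites already done
                    correct.append(step)
                done.add(step)
            return correct

        # Toy assembly with two ordering constraints.
        preds = ["mount_axle", "attach_wheel", "tighten_bolt"]
        prereqs = {"attach_wheel": {"mount_axle"}, "tighten_bolt": {"attach_wheel"}}
        print(psr_correct_steps(preds, prereqs))   # all three steps are in an admissible order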

    A Buffer Stocks Model for Stabilizing the Price of Staple Food Considering the Expectation of the Non-Speculative Wholesaler

    This paper is a study of price stabilization in the staple food distribution system. All stakeholders face market risks due to the several possible causes of price volatility. Many price stabilization models have been developed using approaches such as floor-ceiling prices, buffer funds, export or import taxes, and subsidies. In previous research, these models were extended to increase the purchase price for producers and decrease the selling price for consumers. As a result, such policies can cause losses for the non-speculative wholesaler, reflected in a declining selling quantity and rising stocks. The objective of this model is not only to meet the expectations of both producers and consumers, but also to protect the non-speculative wholesaler from the undesirable results of the stabilization policy. A nonlinear programming model is formulated to determine the instruments of the intervention program. The results show that wholesaler behavior affects the intervention costs. Index Terms: Buffer stocks, price stabilization, nonlinear programming, wholesaler behavior
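
    Editor's note: as a minimal sketch of the kind of nonlinear program involved (the linear inverse-demand curve, cost structure, and all numbers below are illustrative assumptions, not the paper's model), an agency chooses per-period purchases into and releases from a buffer stock so that the market price stays inside a floor-ceiling band at minimum intervention cost:

        import numpy as np
        from scipy.optimize import minimize

        T = 4
        supply = np.array([100.0, 120.0, 80.0, 110.0])   # exogenous harvest per period
        demand_a, demand_b = 200.0, 1.0                  # inverse demand: price = a - b * quantity sold
        floor_p, ceil_p = 85.0, 105.0                    # target price band
        store_cost, s0 = 0.5, 10.0                       # per-unit storage cost, initial stock

        def unpack(x):
            return x[:T], x[T:]                          # purchases q, releases r

        def price(x):
            q, r = unpack(x)
            return demand_a - demand_b * (supply - q + r)

        def cost(x):
            q, r = unpack(x)
            p = price(x)
            stock = s0 + np.cumsum(q - r)
            # purchase expenditure minus sales revenue plus storage cost
            return np.sum(p * q) - np.sum(p * r) + store_cost * np.sum(stock)

        cons = [
            {"type": "ineq", "fun": lambda x: price(x) - floor_p},                         # price >= floor
            {"type": "ineq", "fun": lambda x: ceil_p - price(x)},                          # price <= ceiling
            {"type": "ineq", "fun": lambda x: s0 + np.cumsum(unpack(x)[0] - unpack(x)[1])} # stock >= 0
        ]
        res = minimize(cost, np.zeros(2 * T), bounds=[(0.0, None)] * (2 * T),
                       constraints=cons, method="SLSQP")
        print(res.x.round(2), round(float(cost(res.x)), 2))

    The paper's model additionally encodes the expectations of producers, consumers and the non-speculative wholesaler; the sketch only conveys the structure: a nonlinear objective, price-band and stock constraints, and intervention quantities as decision variables.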