
    Impact of Interatomic Electronic Decay Processes on Xe 4d Hole Decay in the Xenon Fluorides

    A hole in a 4d orbital of atomic xenon relaxes through Auger decay after a lifetime of 3 fs. Adding electronegative fluorine ligands to form xenon fluoride molecules results in withdrawal of valence-electron density from Xe. Thus, within the one-center picture of Auger decay, a lowered Xe 4d Auger width would be expected, in contradiction, however, with experiment. Employing extensive ab initio calculations within the framework of many-body Green's functions, we determine all available decay channels in XeFn and characterize these channels by means of a two-hole population analysis. We derive a relation between two-hole population numbers and partial Auger widths. On this basis, interatomic electronic decay processes are demonstrated to be so strong in the xenon fluorides that they overcompensate the reduction in intra-atomic Auger width and lead to the experimentally observed trend. The nature of the relevant processes is discussed. These processes presumably underlie Auger decay in a variety of systems. Comment: 11 pages, 5 figures, 3 tables, RevTeX4, extensively revised; the discussion of single ionization of XeFn was published separately: J. Chem. Phys. 119, 7763-7771 (2003), preprint arXiv: physics/030612

    Recommendation Subgraphs for Web Discovery

    Recommendations are central to the utility of many websites, including YouTube and Quora as well as popular e-commerce stores. Such sites typically contain a set of recommendations on every product page that enables visitors to easily navigate the website. Choosing an appropriate set of recommendations at each page is one of the key tasks of the backend engines deployed at several e-commerce sites. Specifically, at BloomReach, an engine consisting of several independent components analyzes and optimizes its clients' websites. This paper focuses on the structure optimizer component, which improves the website navigation experience and enables the discovery of novel content. We begin by formalizing the concept of recommendations used for discovery. We formulate this as a natural graph optimization problem which, in its simplest case, reduces to a bipartite matching problem. In practice, solving these matching problems exactly requires superlinear time and is not scalable. Moreover, simple algorithms are preferable in practice because they are significantly easier to maintain in production. This motivated us to analyze three methods for solving the problem in increasing order of sophistication: a sampling algorithm, a greedy algorithm, and a more involved partitioning-based algorithm. We first theoretically analyze the performance of these three methods on random graph models, characterizing when each method will yield a solution of sufficient quality and the parameter ranges where more sophistication is needed. We complement this with an empirical analysis of the algorithms on simulated and real-world production data. Our results confirm that it is not always necessary to implement complicated algorithms in the real world, and that very good practical results can be obtained by using heuristics backed by concrete theoretical guarantees.
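    The greedy baseline mentioned in the abstract can be sketched as follows; the function name, the per-page limit d, and the target capacity c are illustrative assumptions, not the authors' implementation.

```python
def greedy_recommendation_subgraph(candidates, d, c):
    """Greedy heuristic: give each source page up to d recommendations,
    never using any target page more than c times overall.
    Hypothetical sketch of a greedy baseline, not the authors' code."""
    load = {}      # how many times each target has been recommended
    chosen = {}
    for src, targets in candidates.items():
        picks = []
        for t in targets:
            if len(picks) == d:
                break
            if load.get(t, 0) < c:   # respect the target capacity
                picks.append(t)
                load[t] = load.get(t, 0) + 1
        chosen[src] = picks
    return chosen
```

    A single pass like this runs in time linear in the number of candidate edges, which is the practical motivation the abstract gives for preferring simple heuristics over exact matching.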

    Resolving on 100 pc scales the UV-continuum in Lyman-α emitters between redshift 2 to 3 with gravitational lensing

    We present a study of seventeen LAEs at redshift 2 < z < 3, gravitationally lensed by massive early-type galaxies (ETGs) at a mean redshift of approximately 0.5. Using a fully Bayesian grid-based technique, we model the gravitational lens mass distributions with elliptical power-law profiles and reconstruct the UV-continuum surface brightness distributions of the background sources using pixellated source models. We find that the deflectors are close to, but not consistent with, isothermal models in almost all cases, at the 2σ level. We take advantage of the lensing magnification (typically μ ≃ 20) to characterise the physical and morphological properties of these LAE galaxies. From reconstructing the ultra-violet continuum emission, we find that the star-formation rates range from 0.3 to 8.5 M⊙ yr⁻¹ and that the galaxies are typically composed of several compact and diffuse components, separated by 0.4 to 4 kpc. Moreover, they have peak star-formation-rate intensities that range from 2.1 to 54.1 M⊙ yr⁻¹ kpc⁻². These galaxies tend to be extended, with major axes ranging from 0.2 to 1.8 kpc (median 561 pc) and a median ellipticity of 0.49. This morphology is consistent with disk-like structures of star formation for more than half of the sample. However, for at least two sources, we also find off-axis components that may be associated with mergers. Resolved kinematical information will be needed to confirm the disk-like nature and possible merger scenario for the LAEs in the sample. Comment: 19 pages, 7 figures, accepted for publication in MNRAS

    Dynamical Behavior of a stochastic SIRS epidemic model

    In this paper we study the Kermack-McKendrick model under telegraph noise. The telegraph noise switches at random between two SIRS models. We give conditions for the persistence of the disease and the stability of a disease-free equilibrium. We show that the asymptotic behavior depends strongly on the value of a threshold λ, which is calculated from the intensities of switching between environmental states, the total size of the population, and the parameters of both SIRS systems. According to the value of λ, the system globally tends towards either an endemic case or a disease-free case. The aim of this work is also to describe completely the omega-limit set of all positive solutions to the model. Moreover, the attraction of the omega-limit set and the stationary distribution of solutions are pointed out. Comment: 16 pages
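    A minimal simulation of an SIRS system whose parameters switch under telegraph noise can be sketched as follows; the Euler scheme and all parameter values are illustrative assumptions, not taken from the paper.

```python
import random

def simulate_sirs_switching(params, switch_rates, N=1000.0, I0=10.0,
                            T=100.0, dt=0.01, seed=0):
    """Euler integration of an SIRS model whose parameters (beta, gamma,
    delta) jump between two environmental states at random exponential
    times, i.e. telegraph (Markov) noise.  Illustrative sketch only."""
    rng = random.Random(seed)
    S, I, R = N - I0, I0, 0.0
    state = 0
    next_switch = rng.expovariate(switch_rates[state])
    t = 0.0
    while t < T:
        if t >= next_switch:                      # environment flips
            state = 1 - state
            next_switch = t + rng.expovariate(switch_rates[state])
        beta, gamma, delta = params[state]
        dS = -beta * S * I / N + delta * R        # loss of immunity: R -> S
        dI = beta * S * I / N - gamma * I         # infection and recovery
        dR = gamma * I - delta * R
        S, I, R = S + dS * dt, I + dI * dt, R + dR * dt
        t += dt
    return S, I, R
```

    When both regimes are subcritical (beta < gamma), the infection dies out whatever the switching; the regimes the paper characterizes via the threshold λ are those where the two environments pull in different directions.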

    A system for online beam emittance measurements and proton beam characterization

    A system for online measurement of the transverse beam emittance was developed. It is named 4PrOBεaM (4-Profiler Online Beam Emittance Measurement) and was conceived to measure the emittance in a fast and efficient way using the multiple beam profiler method. The core of the system is constituted by four consecutive UniBEaM profilers, which are based on silica fibers passing across the beam. The 4PrOBεaM system was deployed for characterization studies of the 18 MeV proton beam produced by the IBA Cyclone 18 MeV cyclotron at Bern University Hospital (Inselspital). The machine serves daily radioisotope production and multi-disciplinary research, which is carried out with a specifically conceived Beam Transport Line (BTL). The transverse RMS beam emittance of the cyclotron was measured as a function of several machine parameters, such as the magnetic field, RF peak voltage, and azimuthal angle of the stripper. The beam emittance was also measured using the method based on the quadrupole strength variation. The results obtained with both techniques were compared and a good agreement was found. In order to characterize the longitudinal dynamics, the proton energy distribution was measured. For this purpose, a method was developed based on aluminum absorbers of different thicknesses, a UniBEaM detector, and a Faraday cup. The results were an input for a simulation of the BTL developed in the MAD-X software. This tool allows machine parameters to be tuned online and the beam characteristics to be optimized for specific applications. Comment: published in Journal of Instrumentation

    Studying the nuclear mass composition of Ultra-High Energy Cosmic Rays with the Pierre Auger Observatory

    The Fluorescence Detector of the Pierre Auger Observatory measures the atmospheric depth, X_max, at which the longitudinal profile of high-energy air showers reaches its maximum. This quantity is sensitive to the nuclear mass composition of the cosmic rays. Due to its hybrid design, the Pierre Auger Observatory also provides independent experimental observables obtained from the Surface Detector for the study of the nuclear mass composition. We present X_max distributions and an update of the average and RMS values in different energy bins, and compare them to the predictions for different nuclear masses of the primary particles and hadronic interaction models. We also present the results of the composition-sensitive parameters derived from the ground-level component. Comment: Proceedings of the 12th International Conference on Topics in Astroparticle and Underground Physics, TAUP 2011, Munich, Germany

    Light Nuclei solving Auger puzzles. The Cen-A imprint

    Ultra High Energy Cosmic Ray (UHECR) maps at 60 EeV, recently published by the AUGER group, show anisotropy signatures spread across the sky. The result has been interpreted as a manifestation of AGN sources ejecting protons at GZK energies, mostly from the Super-galactic Plane. This interpretation is surprising given the absence of the much nearer Virgo cluster. Moreover, the early GZK cutoff in the spectra may be better reconciled with light nuclei than with protons. In addition, a large group (nearly a dozen) of events clusters suspiciously along Cen-A. Finally, a proton UHECR composition is in sharp disagreement with an earlier AUGER claim of heavy-nuclei dominance at 40 EeV. We therefore interpret the signals as mostly UHECR light nuclei (He, Be, B, C, O), very possibly dominated by the lightest ones (He, Be), ejected from the nearest AGN, Cen-A, and smeared by galactic magnetic fields, whose random vertical bending overlaps with the super-galactic arm. The apparent AUGER misinterpretation arises from a rare coincidence between the Super Galactic Plane (arm) and the smeared (randomized) signals from Cen-A, bent orthogonally to the Galactic fields. Our derivation verifies the consistency of the random smearing angles for He, Be and for B, C, O, in reasonable agreement with the main group of AUGER events around Cen-A. Only a few other rare events are spread elsewhere. The events most collimated around Cen-A are the lightest; the most spread are the heavier. Consequently, Cen-A is the best candidate source of UHE tau neutrinos observable by HEAT and AMIGA, the enhancements of the AUGER array, at tens to hundreds of PeV energy. This model may soon be tested by new events clustering around Cen-A and by composition imprint studies. Comment: 4 pages, 5 figures

    Global Production Increased by Spatial Heterogeneity in a Population Dynamics Model

    Spatial and temporal heterogeneity are often described as important factors with a strong impact on biodiversity. The effect of heterogeneity is in most cases analyzed through the response of biotic interactions such as competition or predation. It may also modify intrinsic population properties such as growth rate. Most of the studies are theoretical, since it is often difficult to manipulate spatial heterogeneity in practice. Despite the large number of studies dealing with this topic, it is still difficult to understand how heterogeneity affects population dynamics. On the basis of a very simple model, this paper aims to explicitly provide a simple mechanism explaining why spatial heterogeneity may be a favorable factor for production. We consider a two-patch model with logistic growth on each patch. We provide a general condition on the migration rates and the local subpopulation growth rates under which the total carrying capacity is higher than the sum of the local carrying capacities, which is not intuitive. As we illustrate, this result is robust under stochastic perturbations.
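    The two-patch mechanism can be illustrated numerically; the parameter values below are assumptions chosen so that the patch with the larger carrying capacity also has the stronger per-capita growth, one regime in which the effect appears, and are not the paper's condition itself.

```python
def two_patch_total(r, K, m, dt=0.001, steps=200_000):
    """Euler integration of two logistic patches coupled by symmetric
    migration at rate m; returns the total population after a long run,
    i.e. (numerically) the equilibrium total."""
    x1, x2 = 0.5, 0.5
    for _ in range(steps):
        dx1 = r[0] * x1 * (1 - x1 / K[0]) + m * (x2 - x1)
        dx2 = r[1] * x2 * (1 - x2 / K[1]) + m * (x1 - x2)
        x1 += dx1 * dt
        x2 += dx2 * dt
    return x1 + x2

# Fast migration nearly equalizes the two densities, so the total tends to
# 2*(r1 + r2)/(r1/K1 + r2/K2), which here exceeds K1 + K2 = 4.
total = two_patch_total(r=(3.0, 0.1), K=(3.0, 1.0), m=10.0)
```

    The total settles near 5.6, above the sum of the local carrying capacities, illustrating the counter-intuitive result stated in the abstract.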

    Results of a self-triggered prototype system for radio-detection of extensive air showers at the Pierre Auger Observatory

    We describe the experimental setup and the results of RAuger, a small radio-antenna array consisting of three fully autonomous and self-triggered radio-detection stations, installed close to the center of the Surface Detector (SD) of the Pierre Auger Observatory in Argentina. The setup was designed to detect the electric-field strength of air showers initiated by ultra-high-energy cosmic rays, without using an auxiliary trigger from another detection system. Installed in December 2006, RAuger was terminated in May 2010 after 65 registered coincidences with the SD. The sky map in local angular coordinates (i.e., zenith and azimuth angles) of these events reveals a strong azimuthal asymmetry, which is in agreement with a mechanism dominated by a geomagnetic emission process. The correlation between the electric field and the energy of the primary cosmic ray is presented for the first time, in an energy range covering two orders of magnitude between 0.1 EeV and 10 EeV. It is demonstrated that this setup is more sensitive to inclined showers than the SD. In addition to these results, which underline the potential of the radio-detection technique, important information about the general behavior of self-triggering radio-detection systems has been obtained. In particular, we discuss radio self-triggering under varying local electric-field conditions. Comment: accepted for publication in JINST

    A Three-Point Cosmic Ray Anisotropy Method

    The two-point angular correlation function is a traditional method used to search for deviations from expectations of isotropy. In this paper we develop and explore a statistically descriptive three-point method, with the intended application being the search for deviations from isotropy in the highest-energy cosmic rays. We compare the sensitivity of a two-point method and a "shape-strength" method for a variety of Monte Carlo simulated anisotropic signals. Studies are done with anisotropic source signals diluted by an isotropic background. Type I and II errors for rejecting the hypothesis of isotropic cosmic-ray arrival directions are evaluated for four different event sample sizes: 27, 40, 60, and 80 events, consistent with near-term data expectations from the Pierre Auger Observatory. In all cases the ability to reject the isotropic hypothesis improves with sample size and with the fraction of anisotropic signal. While ~40-event data sets should be sufficient for reliable identification of anisotropy in cases of rather extreme (highly anisotropic) data, much larger data sets are suggested for reliable identification of more subtle anisotropies. The shape-strength method consistently performs better than the two-point method and can be easily adapted to an arbitrary experimental exposure on the celestial sphere. Comment: Fixed PDF error
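    For contrast with the three-point method, the two-point baseline can be sketched as follows; the function and its comparison against the isotropic expectation are an illustrative assumption, not the authors' code.

```python
import math

def two_point_excess(directions, theta_max):
    """Count pairs of arrival directions (unit 3-vectors) separated by
    less than theta_max radians, alongside the isotropic expectation:
    each pair falls within theta_max with probability
    (1 - cos(theta_max)) / 2."""
    n = len(directions)
    pairs = 0
    for i in range(n):
        for j in range(i + 1, n):
            dot = sum(a * b for a, b in zip(directions[i], directions[j]))
            angle = math.acos(max(-1.0, min(1.0, dot)))  # clamp rounding
            if angle < theta_max:
                pairs += 1
    expected = n * (n - 1) / 2 * (1 - math.cos(theta_max)) / 2
    return pairs, expected
```

    A pair count well above the isotropic expectation across a range of theta_max values signals clustering; the paper's shape-strength statistic extends this idea to triplets of events.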