
    Drivers of deforestation in the basin of the Usumacinta River: Inference on process from pattern analysis using generalised additive models.

    Quantifying patterns of deforestation and linking these patterns to potentially influencing variables is a key component of modelling and projecting land use change. Statistical methods based on null hypothesis testing are only partially successful for interpreting deforestation in the context of the processes that have led to its formation. Simplifications of cause-consequence relationships that are difficult to support empirically may influence environment and development policies because they suggest simple solutions to complex problems. Deforestation is a complex process driven by multiple proximate and underlying factors operating at a range of scales. In this study we use a multivariate statistical analysis to provide a contextual explanation for deforestation in the Usumacinta River Basin based on partial pattern matching. Our approach avoided testing trivial null hypotheses of lack of association and instead investigated the strength and form of the response to drivers. As not all factors involved in deforestation are easily mapped as GIS layers, analytical challenges arise from the lack of a one-to-one correspondence between mappable attributes and drivers. We avoided testing simple statistical hypotheses, such as the detectability of a significant linear relationship between deforestation and proximity to roads or water. Instead, we developed a series of informative generalised additive models based on combinations of layers that corresponded to hypotheses about processes. The analysis emphasised the importance of the variables representing accessibility. We provide evidence that land tenure is a critical factor in shaping the decision to deforest and that direct beam insolation has an effect associated with fire frequency and intensity. The effect of winter insolation has important applied implications for land management. The methodology was useful for interpreting the relative importance of sets of variables representing drivers of deforestation, and the approach was informative enough to allow a comprehensive understanding of the causes of deforestation to be constructed.
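    A minimal sketch of the kind of model described above: a binomial generalised additive model for deforestation per grid cell, with smooth terms for continuous drivers and a factor term for land tenure. The abstract does not specify a toolchain, so the pyGAM package, the column choices, and the synthetic data below are all assumptions made for illustration, not the authors' dataset or method.

        # Hypothetical GAM for deforestation (0/1 per cell); all drivers and data are placeholders.
        import numpy as np
        from pygam import LogisticGAM, s, f

        rng = np.random.default_rng(42)
        n = 2000
        dist_roads = rng.exponential(5.0, n)      # km to nearest road (hypothetical)
        winter_insol = rng.normal(3.5, 0.6, n)    # winter direct-beam insolation (hypothetical units)
        tenure = rng.integers(0, 3, n)            # 0=communal, 1=private, 2=protected (hypothetical coding)

        # Synthetic response, only to make the sketch runnable
        logit = -0.3 * dist_roads + 0.8 * (winter_insol - 3.5) + np.where(tenure == 2, -1.5, 0.0)
        deforested = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

        X = np.column_stack([dist_roads, winter_insol, tenure])
        gam = LogisticGAM(s(0) + s(1) + f(2)).fit(X, deforested)
        gam.summary()  # inspect the fitted smooth and factor terms

    Comparing deviance explained or AIC across several such candidate models, each built from a set of layers that encodes a process hypothesis, is one way to weigh groups of drivers against each other in the spirit of the partial pattern matching described above.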

    Domain Wall Spacetimes: Instability of Cosmological Event and Cauchy Horizons

    The stability of the cosmological event and Cauchy horizons of spacetimes associated with plane-symmetric domain walls is studied. It is found that neither horizon is stable against perturbations of null fluids and massless scalar fields; they are turned into curvature singularities. These singularities are light-like and strong in the sense that both the tidal forces and the distortions acting on test particles become unbounded as these singularities are approached. Comment: Latex, 3 figures not included in the text but available upon request

    Unbounded violation of tripartite Bell inequalities

    We prove that there are tripartite quantum states (constructed from random unitaries) that can lead to arbitrarily large violations of Bell inequalities for dichotomic observables. As a consequence, these states can withstand an arbitrary amount of white noise before they admit a description within a local hidden variable model. This is in sharp contrast with the bipartite case, where all violations are bounded by Grothendieck's constant. We discuss the possibility of determining the Hilbert space dimension from the obtained violation and comment on implications for communication complexity theory. Moreover, we show that the violation obtained from generalized GHZ states is always bounded, so that, in contrast to many other contexts, GHZ states do not in this case lead to extremal quantum correlations. The results are based on tools from the theories of operator spaces and tensor norms, which we exploit to prove the existence of bounded but not completely bounded trilinear forms from commutative C*-algebras. Comment: Substantial changes in the presentation to make the paper more accessible for a non-specialized reader
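    To make the contrast with the bipartite case concrete, the bound alluded to above can be stated schematically (the notation here is illustrative and not the paper's exact normalisation): for a bipartite correlation Bell inequality \mathcal{B} with dichotomic (\pm 1) observables,

        \frac{\sup_{\rm quantum} |\langle \mathcal{B} \rangle|}{\sup_{\rm LHV} |\langle \mathcal{B} \rangle|} \;\le\; K_G < 1.783,

    where K_G is Grothendieck's constant, whereas for the tripartite states constructed in the paper the analogous quantum-to-classical ratio grows without bound as the local Hilbert-space dimension increases.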

    Towards Machine Wald

    The past century has seen a steady increase in the need of estimating and predicting complex systems and making (possibly critical) decisions with limited information. Although computers have made possible the numerical evaluation of sophisticated statistical models, these models are still designed \emph{by humans} because there is currently no known recipe or algorithm for dividing the design of a statistical model into a sequence of arithmetic operations. Indeed, enabling computers to \emph{think} as \emph{humans} do when faced with uncertainty is challenging in several major ways: (1) Finding optimal statistical models remains to be formulated as a well-posed problem when information on the system of interest is incomplete and comes in the form of a complex combination of sample data, partial knowledge of constitutive relations and a limited description of the distribution of input random variables. (2) The space of admissible scenarios along with the space of relevant information, assumptions, and/or beliefs, tend to be infinite dimensional, whereas calculus on a computer is necessarily discrete and finite. To this end, this paper explores the foundations of a rigorous framework for the scientific computation of optimal statistical estimators/models and reviews their connections with Decision Theory, Machine Learning, Bayesian Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty Quantification and Information Based Complexity. Comment: 37 pages
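    In the spirit of Wald's decision theory that the title alludes to, the notion of an optimal statistical estimator under incomplete information can be written schematically as a worst-case (minimax) problem; the notation below is an illustrative sketch rather than the paper's exact formalism:

        \theta^{\dagger} \in \arg\min_{\theta \in \Theta} \; \sup_{\mu \in \mathcal{A}} \; \mathbb{E}_{X \sim \mu}\big[ L\big(\theta(X), \Phi(\mu)\big) \big],

    where \mathcal{A} is the (typically infinite-dimensional) set of scenarios consistent with the available sample data, constitutive knowledge and assumptions, \Phi(\mu) is the quantity to be estimated, and L is a loss function. The computational challenge described above is then to reduce such infinite-dimensional optimisation problems to finite, discrete calculations that a computer can carry out.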

    The exposure of the hybrid detector of the Pierre Auger Observatory

    The Pierre Auger Observatory is a detector for ultra-high energy cosmic rays. It consists of a surface array to measure secondary particles at ground level and a fluorescence detector to measure the development of air showers in the atmosphere above the array. The "hybrid" detection mode combines the information from the two subsystems. We describe the determination of the hybrid exposure for events observed by the fluorescence telescopes in coincidence with at least one water-Cherenkov detector of the surface array. A detailed knowledge of the time dependence of the detection operations is crucial for an accurate evaluation of the exposure. We discuss the relevance of monitoring data collected during operations, such as the status of the fluorescence detector, background light and atmospheric conditions, which are used in both simulation and reconstruction. Comment: Paper accepted by Astroparticle Physics
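    Schematically, the hybrid exposure discussed above is an integral of the detection efficiency over generation area, solid angle and the time periods during which the relevant detector components were operational; the expression below is a generic form of such an exposure, with \varepsilon standing for the full trigger-and-reconstruction efficiency obtained from simulation, rather than the paper's exact formula:

        \mathcal{E}(E) = \int_{T} \int_{\Omega} \int_{S_{\rm gen}} \varepsilon(E, t, \theta, \phi, x, y)\, \cos\theta \;\, {\rm d}S \, {\rm d}\Omega \, {\rm d}t,

    which makes explicit why accurate bookkeeping of the detector status as a function of time t enters the calculation directly.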

    Search for composite and exotic fermions at LEP 2

    A search for unstable heavy fermions with the DELPHI detector at LEP is reported. Sequential and non-canonical leptons, as well as excited leptons and quarks, are considered. The data analysed correspond to an integrated luminosity of about 48 pb^{-1} at an e^+e^- centre-of-mass energy of 183 GeV and about 20 pb^{-1} equally shared between the centre-of-mass energies of 172 GeV and 161 GeV. The search for pair-produced new leptons establishes 95% confidence level mass limits in the region between 70 GeV/c^2 and 90 GeV/c^2, depending on the channel. The search for singly produced excited leptons and quarks establishes upper limits on the ratio of the coupling of the excited fermion

    Search for charginos in e+e- interactions at sqrt(s) = 189 GeV

    An update of the searches for charginos and gravitinos is presented, based on a data sample corresponding to the 158 pb^{-1} recorded by the DELPHI detector in 1998, at a centre-of-mass energy of 189 GeV. No evidence for a signal was found. The lower mass limits are 4-5 GeV/c^2 higher than those obtained at a centre-of-mass energy of 183 GeV. The (\mu,M_2) MSSM domain excluded by combining the chargino searches with neutralino searches at the Z resonance implies a limit on the mass of the lightest neutralino which, for a heavy sneutrino, is constrained to be above 31.0 GeV/c^2 for tan(beta) \geq 1. Comment: 22 pages, 8 figures

    Centrality dependence of charged particle production at large transverse momentum in Pb-Pb collisions at \sqrt{s_{\rm NN}} = 2.76 TeV

    The inclusive transverse momentum (p_{\rm T}) distributions of primary charged particles are measured in the pseudo-rapidity range |\eta| < 0.8 as a function of event centrality in Pb-Pb collisions at \sqrt{s_{\rm NN}} = 2.76 TeV with ALICE at the LHC. The data are presented in the range 0.15 < p_{\rm T} < 50 GeV/c for nine centrality intervals from 70-80% to 0-5%. The Pb-Pb spectra are presented in terms of the nuclear modification factor R_{\rm AA} using a pp reference spectrum measured at the same collision energy. We observe that the suppression of high-p_{\rm T} particles strongly depends on event centrality. In central collisions (0-5%) the yield is most suppressed, with R_{\rm AA} \approx 0.13 at p_{\rm T} = 6-7 GeV/c. Above p_{\rm T} = 7 GeV/c, there is a significant rise in the nuclear modification factor, which reaches R_{\rm AA} \approx 0.4 for p_{\rm T} > 30 GeV/c. In peripheral collisions (70-80%), the suppression is weaker, with R_{\rm AA} \approx 0.7 almost independently of p_{\rm T}. The measured nuclear modification factors are compared to other measurements and model calculations. Comment: 17 pages, 4 captioned figures, 2 tables, authors from page 12, published version, figures at http://aliceinfo.cern.ch/ArtSubmission/node/284
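    For reference, the nuclear modification factor used above is conventionally defined as the per-event charged-particle yield in Pb-Pb collisions divided by the pp cross section scaled by the average nuclear overlap function; this is the standard definition, not text copied from the paper:

        R_{\rm AA}(p_{\rm T}) = \frac{ (1/N_{\rm evt})\, {\rm d}^{2}N^{\rm AA}_{\rm ch} / {\rm d}p_{\rm T}\,{\rm d}\eta }{ \langle T_{\rm AA} \rangle \, {\rm d}^{2}\sigma^{\rm pp}_{\rm ch} / {\rm d}p_{\rm T}\,{\rm d}\eta },

    so that R_{\rm AA} = 1 corresponds to binary-collision scaling and values below unity indicate suppression of high-p_{\rm T} production.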

    Evidence for a mixed mass composition at the `ankle' in the cosmic-ray spectrum

    We report a first measurement for ultra-high energy cosmic rays of the correlation between the depth of shower maximum and the signal in the water-Cherenkov stations of air showers registered simultaneously by the fluorescence and the surface detectors of the Pierre Auger Observatory. Such a correlation measurement is a unique feature of a hybrid air-shower observatory with sensitivity to both the electromagnetic and muonic components. It allows an accurate determination of the spread of primary masses in the cosmic-ray flux. Until now, constraints on the spread of primary masses have been dominated by systematic uncertainties. The present correlation measurement is not affected by systematics in the measurement of the depth of shower maximum or the signal in the water-Cherenkov stations. The analysis relies on general characteristics of air showers and is thus also robust with respect to uncertainties in hadronic event generators. The observed correlation in the energy range around the `ankle' at \lg(E/{\rm eV}) = 18.5-19.0 differs significantly from expectations for pure primary cosmic-ray compositions. A light composition made up of protons and helium only is equally inconsistent with the observations. The data are explained well by a mixed composition including nuclei with mass A > 4. Scenarios such as the proton-dip model, with almost pure compositions, are thus disfavoured as the sole explanation of the ultra-high-energy cosmic-ray flux at Earth. Comment: Published version. Added journal reference and DOI. Added Report Number
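    As a toy illustration of the kind of observable described above, the sketch below computes a rank correlation between two synthetic air-shower quantities after scaling out their common energy dependence. The Auger analysis uses its own energy scaling and correlation estimator; every number, scaling and variable name here is a placeholder, not the measurement itself.

        # Toy rank correlation between synthetic shower-maximum and ground-signal values.
        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(1)
        n = 1500
        lgE = rng.uniform(18.5, 19.0, n)                              # lg(E/eV), hypothetical energies

        # Synthetic observables: an energy trend plus event-by-event fluctuations
        xmax = 750.0 + 60.0 * (lgE - 18.5) + rng.normal(0.0, 60.0, n)         # g/cm^2 (placeholder)
        signal = 30.0 * 10.0 ** (lgE - 18.5) * (1.0 + rng.normal(0.0, 0.2, n))  # ground signal (placeholder)

        # Remove the energy dependence so the residual correlation reflects
        # shower-to-shower fluctuations rather than the common energy trend
        xmax_scaled = xmax - 60.0 * (lgE - 18.5)
        signal_scaled = signal / 10.0 ** (lgE - 18.5)

        r, p = spearmanr(xmax_scaled, signal_scaled)
        print(f"rank correlation r = {r:+.3f} (p-value = {p:.2g})")

    In the real measurement, pure compositions and mixed compositions predict different values of such a correlation coefficient, and the observed value favours the mixed-composition case.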