
    Pervasive gaps in Amazonian ecological research

    Biodiversity loss is one of the main challenges of our time,1,2 and attempts to address it require a clear understanding of how ecological communities respond to environmental change across time and space.3,4 While the increasing availability of global databases on ecological communities has advanced our knowledge of biodiversity sensitivity to environmental changes,5–7 vast areas of the tropics remain understudied.8–11 In the American tropics, Amazonia stands out as the world's most diverse rainforest and the primary source of Neotropical biodiversity,12 but it remains among the least known forests in America and is often underrepresented in biodiversity databases.13–15 To worsen this situation, human-induced modifications16,17 may eliminate pieces of the Amazon's biodiversity puzzle before we can use them to understand how ecological communities are responding. To increase generalization and applicability of biodiversity knowledge,18,19 it is thus crucial to reduce biases in ecological research, particularly in regions projected to face the most pronounced environmental changes. We integrate ecological community metadata of 7,694 sampling sites for multiple organism groups in a machine learning model framework to map the research probability across the Brazilian Amazonia, while identifying the region's vulnerability to environmental change. 15%–18% of the most neglected areas in ecological research are expected to experience severe climate or land use changes by 2050. This means that unless we take immediate action, we will not be able to establish their current status, much less monitor how it is changing and what is being lost.
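The core modelling idea (predict the probability that a location has been sampled from site covariates) can be illustrated with a minimal, self-contained sketch. Everything below is hypothetical: the covariates (`dist_to_city`, `river_access`), the logistic model, and the synthetic sampling rule stand in for the paper's actual metadata and machine learning framework.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for sampling-site metadata: each grid cell gets two
# accessibility covariates (names and values are illustrative only).
n = 500
dist_to_city = rng.uniform(0, 1, n)   # normalized distance to nearest city
river_access = rng.uniform(0, 1, n)   # normalized river accessibility
X = np.column_stack([np.ones(n), dist_to_city, river_access])

# Assume sampling is likelier near cities and rivers, and draw labels.
logit = 2.0 - 4.0 * dist_to_city + 2.0 * river_access
y = (rng.uniform(0, 1, n) < 1 / (1 + np.exp(-logit))).astype(float)

# Fit a logistic regression by plain gradient descent.
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - y) / n

# Predicted "research probability" for a remote cell vs. an accessible one.
p_remote = 1 / (1 + np.exp(-(w @ [1.0, 0.9, 0.1])))
p_near = 1 / (1 + np.exp(-(w @ [1.0, 0.1, 0.9])))
# Remote cells come out with lower research probability, i.e. as gaps.
```

The fitted surface, evaluated on every cell of a grid, is what a map of research probability amounts to; the gaps are the cells where it is lowest.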


    Risk Portfolio Optimization Using the Markowitz MVO Model, Linked to Human Limitations in Predicting the Future from the Perspective of the Al-Qur'an

    Risk portfolio management in modern finance has become increasingly technical, requiring sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk, given the human inability to predict the future precisely that is written of in Al-Qur'an surah Luqman verse 34, they have to manage it to yield an optimal portfolio. The objective here is to minimize the variance among all portfolios or, alternatively, to maximize the expected return among all portfolios that have at least a certain expected return. This study therefore focuses on optimizing the risk portfolio with the so-called Markowitz MVO (Mean-Variance Optimization). The theoretical frameworks used in the analysis are the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Finding a minimum-variance portfolio leads to a convex quadratic program: minimize the objective function xᵀΣx subject to the constraints μᵀx ≥ r and Ax = b. The outcome of this research is the optimal risk portfolio for a set of investments, computed smoothly using MATLAB R2007b together with its graphical analysis.
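In the simplest case of the quadratic program (budget constraint 1ᵀx = 1 only, no expected-return floor), the global minimum-variance portfolio has a closed form, x = Σ⁻¹1 / (1ᵀΣ⁻¹1). A minimal numpy sketch follows; the covariance numbers are made up for illustration, and the study itself solves the full constrained problem with MATLAB.

```python
import numpy as np

# Hypothetical covariance matrix of returns for three assets
# (illustrative numbers, not taken from the study).
Sigma = np.array([
    [0.10, 0.02, 0.04],
    [0.02, 0.08, 0.01],
    [0.04, 0.01, 0.09],
])

# Global minimum-variance portfolio: minimize x' Sigma x  s.t.  1' x = 1.
# The KKT conditions give x = Sigma^{-1} 1 / (1' Sigma^{-1} 1).
ones = np.ones(3)
w = np.linalg.solve(Sigma, ones)
x = w / (ones @ w)

variance = x @ Sigma @ x  # minimized portfolio variance
```

Because each single-asset portfolio is feasible, the resulting variance can never exceed the smallest diagonal entry of Σ; adding the μᵀx ≥ r constraint turns this into the full MVO quadratic program the abstract describes.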

    An embedding technique to determine ττ backgrounds in proton-proton collision data

    An embedding technique is presented to estimate standard model ττ backgrounds from data with minimal simulation input. In the data, the muons are removed from reconstructed μμ events and replaced with simulated τ leptons with the same kinematic properties. In this way, a set of hybrid events is obtained that does not rely on simulation except for the decay of the τ leptons. The challenges in describing the underlying event or the production of associated jets in the simulation are avoided. The technique described in this paper was developed for CMS. Its validation and the inherent uncertainties are also discussed. The demonstration of the performance of the technique is based on a sample of proton-proton collisions collected by CMS in 2017 at √s = 13 TeV, corresponding to an integrated luminosity of 41.5 fb⁻¹.
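The kinematic replacement at the heart of the technique can be sketched in a toy form: keep a reconstructed muon's (pT, η, φ) but rebuild its four-vector with the τ mass. The helper name and the event values below are hypothetical; the real procedure operates on fully reconstructed CMS events and then simulates the τ decays.

```python
import numpy as np

M_TAU = 1.77686  # GeV, tau lepton mass

def replace_lepton(pt, eta, phi, new_mass):
    """Rebuild a lepton four-vector (E, px, py, pz), keeping the measured
    (pt, eta, phi) but substituting a new mass, as in the embedding idea."""
    px, py = pt * np.cos(phi), pt * np.sin(phi)
    pz = pt * np.sinh(eta)
    e = np.sqrt(new_mass**2 + px**2 + py**2 + pz**2)
    return np.array([e, px, py, pz])

# Hypothetical reconstructed mu-mu event, as (pt, eta, phi) per muon:
mu1, mu2 = (45.0, 0.3, 0.1), (38.0, -1.2, 2.9)
taus = [replace_lepton(*mu, M_TAU) for mu in (mu1, mu2)]
# Each hybrid tau carries the muon's momentum direction and pT,
# but satisfies E^2 - |p|^2 = m_tau^2.
```

Only the subsequent τ decay then needs simulation; the rest of the event is taken unchanged from data.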

    Measurement of tt̄ normalised multi-differential cross sections in pp collisions at √s = 13 TeV, and simultaneous determination of the strong coupling strength, top quark pole mass, and parton distribution functions


    Bose-Einstein correlations of charged hadrons in proton-proton collisions at √s = 13 TeV

    Bose-Einstein correlations of charged hadrons are measured over a broad multiplicity range, from a few particles up to about 250 reconstructed charged hadrons, in proton-proton collisions at √s = 13 TeV. The results are based on data collected using the CMS detector at the LHC during runs with a special low-pileup configuration. Three analysis techniques with different degrees of dependence on simulations are used to remove the non-Bose-Einstein background from the correlation functions. All three methods give consistent results. The measured lengths of homogeneity are studied as functions of particle multiplicity as well as average pair transverse momentum and mass. The results are compared with data from both CMS and ATLAS at √s = 7 TeV, as well as with theoretical predictions.
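One standard way to build a correlation function of this kind is event mixing: compare the spectrum of pair separations within events ("same") to pairs built across different events ("mixed"), and take the ratio C2 = same/mixed. The toy below uses 1D momenta and a per-event shift to fake a low-q enhancement; it illustrates only the ratio construction, not the detector-level analysis.

```python
import numpy as np

rng = np.random.default_rng(1)

def pair_q(p):
    """|momentum difference| for all particle pairs (1D toy momenta)."""
    p = np.asarray(p)
    i, j = np.triu_indices(len(p), k=1)
    return np.abs(p[i] - p[j])

# Toy events: a common per-event shift clusters same-event pairs at small q,
# mimicking an enhancement relative to the mixed-event reference.
events = [rng.normal(rng.normal(0.0, 1.0), 0.3, 20) for _ in range(200)]

same = np.concatenate([pair_q(e) for e in events])
# Mixed reference: pair particles drawn across different events.
mixed = pair_q(rng.choice(np.concatenate(events), size=400, replace=False))

bins = np.linspace(0.0, 2.0, 21)
n_same, _ = np.histogram(same, bins=bins, density=True)
n_mixed, _ = np.histogram(mixed, bins=bins, density=True)
C2 = np.divide(n_same, n_mixed, out=np.zeros_like(n_same), where=n_mixed > 0)
# C2 rises above 1 in the lowest-q bins, where same-event pairs are enhanced.
```

The shape of C2 at small q is what carries the lengths of homogeneity in the real measurement; the mixed-event reference is one of several ways to model the non-Bose-Einstein baseline.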

    Search for dark matter in events with a leptoquark and missing transverse momentum in proton-proton collisions at 13 TeV

    A search is presented for dark matter in proton-proton collisions at a center-of-mass energy of √s = 13 TeV using events with at least one high transverse momentum (pT) muon, at least one high-pT jet, and large missing transverse momentum. The data were collected with the CMS detector at the CERN LHC in 2016 and 2017, and correspond to an integrated luminosity of 77.4 fb⁻¹. In the examined scenario, a pair of scalar leptoquarks is assumed to be produced. One leptoquark decays to a muon and a jet while the other decays to dark matter and low-pT standard model particles. The signature for signal events would be significant missing transverse momentum from the dark matter in conjunction with a peak at the leptoquark mass in the invariant mass distribution of the highest-pT muon and jet. The data are observed to be consistent with the background predicted by the standard model. For the first benchmark scenario considered, dark matter masses up to 500 GeV are excluded for leptoquark masses mLQ ≈ 1400 GeV, and up to 300 GeV for mLQ ≈ 1500 GeV. For the second benchmark scenario, dark matter masses up to 600 GeV are excluded for mLQ ≈ 1400 GeV. © 2019 The Author(s). Published by Elsevier B.V.
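The peak-search variable, the invariant mass of the muon-jet pair, is computed from each object's (pT, η, φ, m). A minimal sketch of that calculation follows; the function names and the back-to-back example values are illustrative, not the analysis code.

```python
import numpy as np

def four_vec(pt, eta, phi, m):
    """Build (E, px, py, pz) from transverse momentum, pseudorapidity,
    azimuth, and mass."""
    px, py = pt * np.cos(phi), pt * np.sin(phi)
    pz = pt * np.sinh(eta)
    e = np.sqrt(m**2 + px**2 + py**2 + pz**2)
    return np.array([e, px, py, pz])

def inv_mass(v1, v2):
    """Invariant mass of a two-object system."""
    p = v1 + v2
    return np.sqrt(max(p[0]**2 - p[1:] @ p[1:], 0.0))

# Back-to-back massless objects with pT = 700 GeV each at eta = 0:
# the pair mass is 2 * 700 = 1400 GeV, the m_LQ scale probed above.
m_pair = inv_mass(four_vec(700.0, 0.0, 0.0, 0.0),
                  four_vec(700.0, 0.0, np.pi, 0.0))
```

A signal would show up as an excess of events clustering near mLQ in the distribution of this quantity, on top of a smooth standard model background.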

    Search for an Lμ − Lτ gauge boson using Z → 4μ events in proton-proton collisions at √s = 13 TeV

    A search for a narrow Z′ gauge boson with a mass between 5 and 70 GeV resulting from an Lμ − Lτ U(1) local gauge symmetry is reported. Theories that predict such a particle have been proposed as an explanation of various experimental discrepancies, including the lack of a dark matter signal in direct-detection experiments, tension in the measurement of the anomalous magnetic moment of the muon, and reports of possible lepton flavor universality violation in B meson decays. A data sample of proton-proton collisions at a center-of-mass energy of 13 TeV is used, corresponding to an integrated luminosity of 77.3 fb⁻¹ recorded in 2016 and 2017 by the CMS detector at the LHC. Events containing four muons with an invariant mass near the standard model Z boson mass are analyzed, and the selection is further optimized to be sensitive to events that may contain Z → Z′μμ → 4μ decays. The event yields are consistent with the standard model predictions. Upper limits of 10⁻⁸–10⁻⁷ at 95% confidence level are set on the product of branching fractions B(Z → Z′μμ)B(Z′ → μμ), depending on the Z′ mass, which excludes a Z′ boson coupling strength to muons above 0.004–0.3. These are the first dedicated limits on Lμ − Lτ models at the LHC and result in a significant increase in the excluded model parameter space. The results of this search may also be used to constrain the coupling strength of any light Z′ gauge boson to muons. © 2019 The Author(s). Published by Elsevier B.V.

    Measurement of electroweak WZ boson production and search for new physics in WZ + two jets events in pp collisions at √s=13TeV

    A measurement of WZ electroweak (EW) vector boson scattering is presented. The measurement is performed in the leptonic decay modes WZ→ℓνℓ′ℓ′, where ℓ,ℓ′=e,μ. The analysis is based on a data sample of proton-proton collisions at √s = 13 TeV at the LHC collected with the CMS detector and corresponding to an integrated luminosity of 35.9 fb⁻¹. The WZ plus two jet production cross section is measured in fiducial regions with enhanced contributions from EW production and found to be consistent with standard model predictions. The EW WZ production in association with two jets is measured with an observed (expected) significance of 2.2 (2.5) standard deviations. Constraints on charged Higgs boson production and on anomalous quartic gauge couplings in terms of dimension-eight effective field theory operators are also presented.