    Captive reptile mortality rates in the home and implications for the wildlife trade

    The trade in wildlife and keeping of exotic pets is subject to varying levels of national and international regulation and is a topic often attracting controversy. Reptiles are popular exotic pets and comprise a substantial component of the live animal trade. High mortality of traded animals raises welfare concerns, and also has implications for conservation if collection from the wild is required to meet demand. Mortality of reptiles can occur at any stage of the trade chain from collector to consumer. However, there is limited information on mortality rates of reptiles across trade chains, particularly amongst final consumers in the home. We investigated mortality rates of reptiles amongst consumers using a specialised technique for asking sensitive questions, the additive Randomised Response Technique (aRRT), as well as direct questioning (DQ). Overall, 3.6% of snakes, chelonians and lizards died within one year of acquisition. Boas and pythons had the lowest reported mortality rates, at 1.9%, and chameleons had the highest, at 28.2%. More than 97% of snakes, 87% of lizards and 69% of chelonians acquired by respondents over five years were reported to be captive bred, and results suggest that mortality rates may be lowest for captive-bred individuals. Estimates of mortality from aRRT and DQ did not differ significantly, which is in line with our finding that respondents did not find questions about reptile mortality to be sensitive. This research suggests that captive reptile mortality in the home is rather low, and identifies those taxa where further effort could be made to reduce mortality rates.
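
    The aRRT mechanism itself is simple enough to sketch in a few lines. Below is a minimal, hypothetical simulation (not the study's implementation), assuming each respondent adds a random integer drawn from a publicly known distribution to their true count, so the researcher can recover the population mean without ever seeing any individual's true answer:

        import numpy as np

        rng = np.random.default_rng(42)

        # Hidden truth: number of reptile deaths per respondent (toy data).
        true_deaths = rng.poisson(0.2, size=1000)

        # Each respondent adds noise from a distribution whose mean is publicly
        # known (uniform integers 0..5, mean 2.5) and reports only the sum.
        noise = rng.integers(0, 6, size=1000)
        reported = true_deaths + noise  # only this sum is ever disclosed

        # The researcher subtracts the known noise mean to estimate the true mean.
        estimated_mean = reported.mean() - 2.5
        print(f"true mean: {true_deaths.mean():.3f}, aRRT estimate: {estimated_mean:.3f}")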

    How to ask sensitive questions in conservation: A review of specialized questioning techniques

    Tools for social research are critical for developing an understanding of conservation problems and assessing the feasibility of conservation actions. Social surveys are an essential tool, frequently applied in conservation both to assess people’s behaviour and to understand its drivers. However, little attention has been given to the weaknesses and strengths of different survey tools. When topics of conservation concern are illegal or otherwise sensitive, data collected using direct questions are likely to be affected by non-response and social desirability biases, reducing their validity. These sources of bias associated with using direct questions on sensitive topics have long been recognised in the social sciences but have been poorly considered in conservation and natural resource management. We reviewed specialized questioning techniques developed in a number of disciplines specifically for investigating sensitive topics. These methods ensure respondent anonymity, increase willingness to answer, and, critically, make it impossible to directly link incriminating data to an individual. We describe each method and report their main characteristics, such as data requirements, possible data outputs, and availability of evidence that they can be adapted for use in illiterate communities, and summarize their main advantages and disadvantages. Recommendations for their application in conservation are given. We suggest that the conservation toolbox should be expanded by incorporating specialized questioning techniques, developed specifically to increase response accuracy. By considering the limitations of each survey technique, we will ultimately contribute to more effective evaluations of conservation interventions and more robust policy decisions.
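
    One of the best-known techniques of this kind is Warner's randomised response design, in which a private randomising device decides, with known probability p, whether the respondent answers the sensitive statement or its negation; prevalence is then recovered algebraically. A minimal sketch under that assumption (illustrative only, not taken from the review):

        import numpy as np

        def warner_estimate(yes_answers: np.ndarray, p: float) -> float:
            """Estimate prevalence pi of a sensitive trait (Warner's model).

            P(yes) = p*pi + (1 - p)*(1 - pi), so pi = (P(yes) + p - 1) / (2p - 1).
            """
            lam = yes_answers.mean()  # observed proportion of "yes" answers
            return (lam + p - 1.0) / (2.0 * p - 1.0)

        rng = np.random.default_rng(0)
        pi_true, p = 0.15, 0.7
        trait = rng.random(5000) < pi_true              # hidden sensitive trait
        asked_sensitive = rng.random(5000) < p          # private coin flip
        yes = np.where(asked_sensitive, trait, ~trait)  # answer the question drawn
        print(f"true: {pi_true}, estimate: {warner_estimate(yes, p):.3f}")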

    Measurement of differential cross sections for top quark pair production using the lepton plus jets final state in proton-proton collisions at 13 TeV

    Particle-flow reconstruction and global event description with the CMS detector

    The CMS apparatus was identified, a few years before the start of LHC operation at CERN, as featuring properties well suited to particle-flow (PF) reconstruction: a highly segmented tracker, a fine-grained electromagnetic calorimeter, a hermetic hadron calorimeter, a strong magnetic field, and an excellent muon spectrometer. A fully-fledged PF reconstruction algorithm tuned to the CMS detector was therefore developed and has been used consistently in physics analyses, for the first time at a hadron collider. For each collision, the comprehensive list of final-state particles identified and reconstructed by the algorithm provides a global event description that leads to unprecedented CMS performance for jet and hadronic tau decay reconstruction, missing transverse momentum determination, and electron and muon identification. This approach also allows particles from pileup interactions to be identified and enables efficient pileup mitigation methods. The data collected by CMS at a centre-of-mass energy of 8 TeV show excellent agreement with the simulation and confirm the superior PF performance at least up to an average of 20 pileup interactions.

    Identification of heavy-flavour jets with the CMS detector in pp collisions at 13 TeV

    Many measurements and searches for physics beyond the standard model at the LHC rely on the efficient identification of heavy-flavour jets, i.e. jets originating from bottom or charm quarks. In this paper, the discriminating variables and the algorithms used for heavy-flavour jet identification during the first years of operation of the CMS experiment in proton-proton collisions at a centre-of-mass energy of 13 TeV are presented. Heavy-flavour jet identification algorithms have been improved compared to those used previously at centre-of-mass energies of 7 and 8 TeV. For jets with transverse momenta in the range expected in simulated tt̄ events, these new developments result in an efficiency of 68% for the correct identification of a b jet for a probability of 1% of misidentifying a light-flavour jet. The improvement in relative efficiency at this misidentification probability is about 15% compared to previous CMS algorithms. In addition, for the first time, algorithms have been developed to identify jets containing two b hadrons in Lorentz-boosted event topologies, as well as to tag c jets. The large data sample recorded in 2016 at a centre-of-mass energy of 13 TeV has also allowed the development of new methods to measure the efficiency and misidentification probability of heavy-flavour jet identification algorithms. The heavy-flavour jet identification efficiency is measured with a precision of a few per cent at moderate jet transverse momenta (between 30 and 300 GeV) and about 5% at the highest jet transverse momenta (between 500 and 1000 GeV).
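
    The quoted working point (68% b-jet efficiency at a 1% light-jet misidentification probability) reflects the standard way such taggers are characterised: pick the discriminator threshold that fixes the mistag rate, then read off the b-jet efficiency at that threshold. A generic sketch with toy discriminator shapes, not the CMS algorithms themselves:

        import numpy as np

        rng = np.random.default_rng(1)
        # Toy discriminator outputs: b jets peak near 1, light jets near 0.
        disc_b = rng.beta(5, 2, size=100_000)
        disc_light = rng.beta(2, 5, size=100_000)

        # Threshold chosen so that exactly 1% of light jets are misidentified.
        threshold = np.quantile(disc_light, 0.99)
        b_eff = (disc_b > threshold).mean()
        print(f"threshold = {threshold:.3f}, b-jet efficiency = {b_eff:.1%} at 1% mistag")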

    Search for heavy resonances decaying to a top quark and a bottom quark in the lepton+jets final state in proton–proton collisions at 13 TeV

    Evidence for the Higgs boson decay to a bottom quark–antiquark pair

    Pseudorapidity and transverse momentum dependence of flow harmonics in pPb and PbPb collisions

    Differential cross section measurements for the production of a W boson in association with jets in proton–proton collisions at √s = 7 TeV

    Measurements are reported of differential cross sections for the production of a W boson, which decays into a muon and a neutrino, in association with jets, as a function of several variables, including the transverse momenta (pT) and pseudorapidities of the four leading jets, the scalar sum of jet transverse momenta (HT), and the difference in azimuthal angle between the directions of each jet and the muon. The data sample of pp collisions at a centre-of-mass energy of 7 TeV was collected with the CMS detector at the LHC and corresponds to an integrated luminosity of 5.0 fb⁻¹. The measured cross sections are compared to predictions from the Monte Carlo generators MadGraph + pythia and sherpa, and to next-to-leading-order calculations from BlackHat + sherpa. The differential cross sections are found to be in agreement with the predictions, apart from the pT distributions of the leading jets at high pT values, the distributions of HT at high HT and low jet multiplicity, and the distribution of the difference in azimuthal angle between the leading jet and the muon at low values.
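
    Schematically, each differential cross section follows the standard recipe of dividing the corrected event yield in a bin by the integrated luminosity and the bin width; efficiency corrections and unfolding are omitted here, and the yields below are invented for illustration:

        import numpy as np

        lumi = 5.0  # integrated luminosity in fb^-1, as quoted above
        pt_edges = np.array([30.0, 50.0, 80.0, 120.0, 200.0])   # toy pT bins (GeV)
        yields = np.array([52000.0, 31000.0, 12000.0, 2400.0])  # toy corrected yields

        dsigma_dpt = yields / (lumi * np.diff(pt_edges))  # fb / GeV per bin
        for lo, hi, xs in zip(pt_edges[:-1], pt_edges[1:], dsigma_dpt):
            print(f"{lo:5.0f}-{hi:5.0f} GeV: dsigma/dpT = {xs:8.1f} fb/GeV")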

    Risk Portfolio Optimization Using the Markowitz MVO Model in Relation to Human Limitations in Predicting the Future, from the Perspective of the Al-Qur`an

    Risk portfolio management in modern finance has become increasingly technical, requiring the use of sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk, owing to the human inability to predict the future precisely that is written of in Al-Qur`an surah Luqman verse 34, they have to manage it to yield an optimal portfolio. The objective here is to minimize the variance among all portfolios, or alternatively, to maximize the expected return among all portfolios that have at least a certain expected return. Furthermore, this study focuses on optimizing the risk portfolio using the so-called Markowitz MVO (Mean-Variance Optimization) model. The theoretical frameworks for the analysis are the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Moreover, finding a minimum-variance portfolio produces a convex quadratic program: minimize the objective function xᵀΣx subject to the constraints μᵀx ≥ r and Ax = b. The outcome of this research is the optimal risk portfolio solution for several investments, obtained using MATLAB R2007b software together with its graphical analysis.
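
    The quadratic program above can be reproduced with generic tools; here is a minimal sketch in Python with SciPy (rather than the MATLAB R2007b used in the study), with toy expected returns and covariances standing in for real data:

        import numpy as np
        from scipy.optimize import minimize

        # Toy inputs: expected returns (mu) and covariance matrix (Sigma).
        mu = np.array([0.08, 0.12, 0.10])
        Sigma = np.array([[0.10, 0.02, 0.04],
                          [0.02, 0.08, 0.02],
                          [0.04, 0.02, 0.09]])
        r_min = 0.10  # required minimum expected portfolio return

        def variance(x):
            return x @ Sigma @ x  # objective: portfolio variance x' Sigma x

        constraints = [
            {"type": "eq",   "fun": lambda x: x.sum() - 1.0},   # budget: Ax = b
            {"type": "ineq", "fun": lambda x: mu @ x - r_min},  # mu' x >= r
        ]
        bounds = [(0.0, 1.0)] * 3  # no short selling

        res = minimize(variance, x0=np.ones(3) / 3, bounds=bounds,
                       constraints=constraints)
        print("optimal weights:", res.x.round(3))
        print(f"minimum variance: {res.fun:.4f}")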