
    Ad hoc Smoothing Parameter Performance in Kernel Estimates of GPS-derived Home Ranges

    Accuracy of home-range estimates in animals is influenced by a variety of factors, such as method of analysis and number of locations, but animal space use is less often considered and is frequently over-generalized through simulations. Our objective was to assess the effect of an ad hoc (h_ad hoc) smoothing parameter in kernel analysis for two species predicted to have different patterns of utilization distributions, across a range of sample sizes. We evaluated variation in home-range estimates with location data collected from GPS collars on two species: mule deer Odocoileus hemionus and coyotes Canis latrans. We calculated home ranges using 95% and 50% kernel contours with reference (h_ref) and h_ad hoc smoothing parameters. To evaluate the influence of sample size, we calculated home ranges using both smoothing parameters for random subsamples of 5, 10, 25, and 50% of GPS locations and compared area estimates to estimates from 100% of GPS locations. For mule deer, we also conducted visual relocations using conventional radiotelemetry, which resulted in fewer locations than GPS collars. Area was overestimated at smaller sample sizes, but an interesting pattern was noted, with higher relative bias at 60–100 locations than at sample sizes < 50 locations. Relative bias was most likely due to increased smoothing of outer data points. Subsampling allowed us to examine relative bias across a range of sample sizes for the two smoothing parameters. The minimum number of points needed to obtain a consistent home-range estimate varied by smoothing method, species, study duration, and volume contour (95% or 50%). While h_ad hoc performed consistently better over most sample sizes, there may not be a universal recommendation for all studies and species. Behavioral traits resulting in concentrated or disparate space use complicate comparisons among and between species. We suggest researchers examine their point distribution, justify their choice of smoothing parameter, and report their choices for home-range analysis based on their study objectives.
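As a rough illustration of the reference smoothing parameter and the subsampling design discussed above, here is a minimal sketch. It assumes Worton's normal-reference formula h_ref = σ · n^(−1/6) with σ = sqrt((var_x + var_y)/2), which the abstract does not spell out, and the simulated GPS fixes and subsample fractions are hypothetical:

```python
# Hedged sketch: reference bandwidth h_ref for a 2-D kernel home-range
# estimate, recomputed on random subsamples (5-100% of fixes) in the
# spirit of the study's design. Data are simulated, not from the paper.
import numpy as np

def h_ref(locations: np.ndarray) -> float:
    """Normal-reference bandwidth for an n x 2 array of (x, y) locations."""
    n = locations.shape[0]
    sigma = np.sqrt(0.5 * (locations[:, 0].var(ddof=1)
                           + locations[:, 1].var(ddof=1)))
    return sigma * n ** (-1.0 / 6.0)

rng = np.random.default_rng(42)
full = rng.normal(0.0, 250.0, size=(1000, 2))  # hypothetical GPS fixes (m)

for frac in (0.05, 0.10, 0.25, 0.50, 1.00):
    idx = rng.choice(len(full), size=int(frac * len(full)), replace=False)
    print(f"{frac:.0%} of fixes: h_ref = {h_ref(full[idx]):.1f} m")
```

Because h_ref shrinks only as n^(−1/6), small subsamples keep a relatively large bandwidth, which is consistent with the over-smoothing (and area overestimation) the abstract reports at low sample sizes.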

    Evaluating the End-User Experience of Private Browsing Mode

    Nowadays, all major web browsers have a private browsing mode. However, the mode's benefits and limitations are not well understood. Through survey studies, prior work has found that most users are either unaware of private browsing or do not use it. Further, those who do use private browsing generally have misconceptions about what protection it provides. However, prior work has not investigated why users misunderstand the benefits and limitations of private browsing. In this work, we do so by designing and conducting a three-part study: (1) an analytical approach combining cognitive walkthrough and heuristic evaluation to inspect the user interface of private mode in different browsers; (2) a qualitative, interview-based study to explore users' mental models of private browsing and its security goals; (3) a participatory design study to investigate why existing browser disclosures, the in-browser explanations of private browsing mode, do not communicate the security goals of private browsing to users. Participants critiqued the browser disclosures of three web browsers: Brave, Firefox, and Google Chrome, and then designed new ones. We find that the user interface of private mode in different web browsers violates several well-established design guidelines and heuristics. Further, most participants had incorrect mental models of private browsing, influencing their understanding and usage of private mode. Additionally, we find that existing browser disclosures are not only vague but also misleading. None of the three studied browser disclosures communicates or explains the primary security goal of private browsing. Drawing from the results of our user study, we extract a set of design recommendations that we encourage browser designers to validate, in order to design more effective and informative browser disclosures related to private mode.

    Chapter 15. COMET2.0-Decision Support System for Agricultural Greenhouse Gas Accounting

    Improved agricultural practices have significant potential to mitigate greenhouse gas (GHG) emissions. A key issue for implementing mitigation options is quantifying emissions practically and cost-effectively. Web-based systems using process-based models provide a promising approach. COMET2.0 is a further development of the web-based COMET-VR system, with an expanded set of crop management systems, inclusion of orchards and vineyards, new agroforestry options, and a nitrous oxide (N2O) emissions estimator, using the Century and DAYCENT dynamic ecosystem models. Compared to empirical emission-factor models, COMET2.0 accounted for more of the between-site variation in soil C changes following no-till adoption at Corn Belt and Great Plains experiment sites. Predicted N2O emission rates, as a function of application rate, timing (spring vs. fall), and use of nitrification inhibitors, were consistent with observations in the literature. Carbon dynamics for orchard and agroforestry systems compared well with field measurements, but limited availability of data poses a challenge for a fuller validation of these systems. Advantages of a practice-based approach using dynamic process-based models include integration of interacting processes and local conditions for more accurate and complete GHG accounting. Web-based systems, designed for non-experts, allow land managers and others to evaluate trade-offs and select mitigation options for their particular conditions. Experimental networks such as GRACEnet will play an important role in improving decision support tools for implementation of agricultural GHG mitigation. Peer reviewed.

    Regulation of skeletal muscle oxidative capacity and insulin signaling by the Mitochondrial Rhomboid Protease PARL

    Type 2 diabetes mellitus (T2DM) and aging are characterized by insulin resistance and impaired mitochondrial energetics. In lower organisms, remodeling by the protease pcp1 (a PARL ortholog) maintains the function and lifecycle of mitochondria. We examined whether variation in PARL protein content is associated with mitochondrial abnormalities and insulin resistance. PARL mRNA and mitochondrial mass were both reduced in elderly subjects and in subjects with T2DM. Muscle knockdown of PARL in mice resulted in malformed mitochondrial cristae, lower mitochondrial content, decreased PGC1α protein levels, and impaired insulin signaling. Suppression of PARL protein in healthy myotubes lowered mitochondrial mass and insulin-stimulated glycogen synthesis and increased reactive oxygen species production. We propose that lower PARL expression may contribute to the mitochondrial abnormalities seen in aging and T2DM.

    Measurement of the top quark forward-backward production asymmetry and the anomalous chromoelectric and chromomagnetic moments in pp collisions at √s = 13 TeV

    The parton-level top quark (t) forward-backward asymmetry and the anomalous chromoelectric (d̂_t) and chromomagnetic (μ̂_t) moments have been measured using LHC pp collisions at a center-of-mass energy of 13 TeV, collected with the CMS detector in a data sample corresponding to an integrated luminosity of 35.9 fb⁻¹. The linearized variable A_FB^(1) is used to approximate the asymmetry. Candidate tt̄ events decaying to a muon or electron and jets in final states with low and high Lorentz boosts are selected and reconstructed using a fit of the kinematic distributions of the decay products to those expected for tt̄ final states. The values found for the parameters are A_FB^(1) = 0.048 +0.095/−0.087 (stat) +0.020/−0.029 (syst) and μ̂_t = −0.024 +0.013/−0.009 (stat) +0.016/−0.011 (syst), and a limit is placed on the magnitude of |d̂_t| < 0.03 at 95% confidence level.

    MUSiC: a model-unspecific search for new physics in proton-proton collisions at √s = 13 TeV

    Results of the Model Unspecific Search in CMS (MUSiC), using proton-proton collision data recorded at the LHC at a centre-of-mass energy of 13 TeV, corresponding to an integrated luminosity of 35.9 fb⁻¹, are presented. The MUSiC analysis searches for anomalies that could be signatures of physics beyond the standard model. The analysis is based on the comparison of observed data with the standard model prediction, as determined from simulation, in several hundred final states and multiple kinematic distributions. Events containing at least one electron or muon are classified based on their final state topology, and an automated search algorithm surveys the observed data for deviations from the prediction. The sensitivity of the search is validated using multiple methods. No significant deviations from the predictions have been observed. For a wide range of final state topologies, agreement is found between the data and the standard model simulation. This analysis complements dedicated search analyses by significantly expanding the range of final states covered using a model independent approach with the largest data set to date to probe phase space regions beyond the reach of previous general searches. Peer reviewed.
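The automated scan described above, classifying events and flagging the most deviating class, can be caricatured in a few lines. This is a hedged toy sketch and not CMS's MUSiC code: the event classes, counts, and the plain Poisson tail probability (with no systematic uncertainties or look-elsewhere correction) are illustrative assumptions only:

```python
# Toy model-unspecific scan: for each event class, compare the observed
# count to the SM expectation via a Poisson excess probability, then
# report the most deviating class. All numbers below are made up.
import math

def poisson_sf(obs: int, mu: float) -> float:
    """P(N >= obs) for N ~ Poisson(mu), computed in log space."""
    cdf = sum(math.exp(-mu + k * math.log(mu) - math.lgamma(k + 1))
              for k in range(obs))
    return max(0.0, 1.0 - cdf)

classes = {               # event class -> (observed, SM expectation)
    "1mu + 2jets": (1540, 1500.0),
    "1e + MET":    (95,     80.0),
    "2mu + 1jet":  (210,   200.0),
}

p_values = {name: poisson_sf(obs, mu) for name, (obs, mu) in classes.items()}
most_deviant = min(p_values, key=p_values.get)
print(most_deviant, p_values[most_deviant])
```

A real scan of this kind must also account for systematic uncertainties on the prediction and for the trials factor incurred by testing hundreds of classes, which is why single-class p-values alone are not declared discoveries.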

    Search for new particles in events with energetic jets and large missing transverse momentum in proton-proton collisions at √s = 13 TeV

    A search is presented for new particles produced at the LHC in proton-proton collisions at √s = 13 TeV, using events with energetic jets and large missing transverse momentum. The analysis is based on a data sample corresponding to an integrated luminosity of 101 fb⁻¹, collected in 2017-2018 with the CMS detector. Machine learning techniques are used to define separate categories for events with narrow jets from initial-state radiation and events with large-radius jets consistent with a hadronic decay of a W or Z boson. A statistical combination is made with an earlier search based on a data sample of 36 fb⁻¹, collected in 2016. No significant excess of events is observed with respect to the standard model background expectation determined from control samples in data. The results are interpreted in terms of limits on the branching fraction of an invisible decay of the Higgs boson, as well as constraints on simplified models of dark matter, on first-generation scalar leptoquarks decaying to quarks and neutrinos, and on models with large extra dimensions. Several of the new limits, specifically for spin-1 dark matter mediators, pseudoscalar mediators, colored mediators, and leptoquarks, are the most restrictive to date. Peer reviewed.

    Measurement of prompt open-charm production cross sections in proton-proton collisions at √s = 13 TeV

    The production cross sections for prompt open-charm mesons in proton-proton collisions at a center-of-mass energy of 13 TeV are reported. The measurement is performed using a data sample collected by the CMS experiment corresponding to an integrated luminosity of 29 nb⁻¹. The differential production cross sections of the D*±, D±, and D⁰ (D̄⁰) mesons are presented in the transverse momentum and pseudorapidity ranges 4 < p_T < 100 GeV and |η| < 2.1, respectively. The results are compared to several theoretical calculations and to previous measurements. Peer reviewed.

    Combined searches for the production of supersymmetric top quark partners in proton-proton collisions at √s = 13 TeV

    A combination of searches for top squark pair production using proton-proton collision data at a center-of-mass energy of 13 TeV at the CERN LHC, corresponding to an integrated luminosity of 137 fb⁻¹ collected by the CMS experiment, is presented. Signatures with at least 2 jets and large missing transverse momentum are categorized into events with 0, 1, or 2 leptons. New results for regions of parameter space where the kinematical properties of top squark pair production and top quark pair production are very similar are presented. Depending on the model, the combined result excludes a top squark mass up to 1325 GeV for a massless neutralino, and a neutralino mass up to 700 GeV for a top squark mass of 1150 GeV. Top squarks with masses from 145 to 295 GeV, for neutralino masses from 0 to 100 GeV, with a mass difference between the top squark and the neutralino in a window of 30 GeV around the mass of the top quark, are excluded for the first time with CMS data. The results of these searches are also interpreted in an alternative signal model of dark matter production via a spin-0 mediator in association with a top quark pair. Upper limits are set on the cross section for mediator particle masses of up to 420 GeV.

    Search for Physics beyond the Standard Model in Events with Overlapping Photons and Jets

    Results are reported from a search for new particles that decay into a photon and two gluons, in events with jets. Novel jet substructure techniques are developed that allow photons to be identified in an environment densely populated with hadrons. The analyzed proton-proton collision data were collected by the CMS experiment at the LHC, in 2016 at √s = 13 TeV, and correspond to an integrated luminosity of 35.9 fb⁻¹. The spectra of total transverse hadronic energy of candidate events are examined for deviations from the standard model predictions. No statistically significant excess is observed over the expected background. The first cross section limits on new physics processes resulting in such events are set. The results are interpreted as upper limits on the rate of gluino pair production, utilizing a simplified stealth supersymmetry model. The excluded gluino masses extend up to 1.7 TeV for a neutralino mass of 200 GeV, and exceed previous mass constraints set by analyses targeting events with isolated photons. Peer reviewed.