
    Models and Analysis for Degradation Data

    Degradation is a weakness that can eventually cause failure (e.g., the wear of a car's tires). When degradation can be measured, such measurements often provide more information than failure-time data for the purpose of assessing and improving product reliability. This is an expository article that presents techniques developed by Meeker and Escobar (1998). We believe this topic is worth making known, since it lies at the research frontier of reliability theory (Lawless 2000). We compare the classical approximate degradation analysis with explicit degradation models. The latter use physical degradation models, developed by engineers and physical scientists, into which random effects are introduced, so that they are treated as mixed models. Maximum likelihood estimates are obtained with the mixed-model techniques implemented in S-PLUS, following Pinheiro and Bates (2000), and bootstrap confidence intervals are used.
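    As a rough sketch of the kind of analysis described above (the article itself works in S-PLUS with the mixed-model machinery of Pinheiro and Bates; the linear degradation path, parameter values, and library choice below are illustrative assumptions, not taken from the paper), a degradation model with a random slope per unit can be fitted and bootstrapped in Python as follows:

        # Illustrative mixed-effects degradation fit plus case-resampling bootstrap.
        # All names and numbers are invented; the paper uses S-PLUS / nlme.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)

        # Simulate linear degradation paths with a unit-specific rate.
        units, times = 20, np.arange(1, 11, dtype=float)
        rows = []
        for u in range(units):
            slope = 0.5 + rng.normal(scale=0.1)           # hypothetical unit-to-unit variation
            for t in times:
                rows.append((u, t, slope * t + rng.normal(scale=0.2)))
        df = pd.DataFrame(rows, columns=["unit", "time", "deg"])

        # Linear mixed model: fixed intercept/slope, random slope per unit.
        fit = smf.mixedlm("deg ~ time", df, groups=df["unit"], re_formula="~time").fit()
        print("mean degradation rate:", fit.fe_params["time"])

        # Case-resampling bootstrap over units for a 95% interval on the mean rate.
        boot = []
        for _ in range(200):
            resampled = [df[df.unit == u].assign(unit=i)
                         for i, u in enumerate(rng.choice(units, size=units))]
            bdf = pd.concat(resampled, ignore_index=True)
            bfit = smf.mixedlm("deg ~ time", bdf, groups=bdf["unit"], re_formula="~time").fit()
            boot.append(bfit.fe_params["time"])
        print("95% bootstrap CI:", np.percentile(boot, [2.5, 97.5]))

    The percentile interval from the bootstrap loop plays the role of the bootstrap confidence intervals mentioned in the abstract.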

    Ecological Implications of Extreme Events: Footprints of the 2010 Earthquake along the Chilean Coast

    Deciphering ecological effects of major catastrophic events such as earthquakes, tsunamis, volcanic eruptions, storms, and fires requires rapid interdisciplinary efforts often hampered by a lack of pre-event data. Using results of intertidal surveys conducted shortly before and immediately after Chile's 2010 Mw 8.8 earthquake along the entire rupture zone (ca. 34–38°S), we provide the first quantification of earthquake and tsunami effects on sandy beach ecosystems. Our study incorporated anthropogenic coastal development as a key design factor. Ecological responses of beach ecosystems were strongly affected by the magnitude of land-level change. Subsidence along the northern rupture segment, combined with tsunami-associated disturbance, drowned beaches. In contrast, along the co-seismically uplifted southern rupture, beaches widened and flattened, increasing habitat availability. Post-event changes in abundance and distribution of mobile intertidal invertebrates were not uniform, varying with land-level change, tsunami height and coastal development. On beaches where subsidence occurred, intertidal zones and their associated species disappeared. On some beaches, uplift of rocky sub-tidal substrate eliminated low intertidal sand beach habitat for ecologically important species. On others, unexpected interactions of uplift with man-made coastal armouring included restoration of upper and mid-intertidal habitat seaward of armouring, followed by rapid colonization by mobile crustaceans typical of these zones that had formerly been excluded by constraints imposed by the armouring structures. Responses of coastal ecosystems to major earthquakes appear to vary strongly with land-level change, the mobility of the biota and shore type. Our results show that interactions of extreme events with human-altered shorelines can produce surprising ecological outcomes, and suggest these complex responses to landscape alteration can leave lasting footprints in coastal ecosystems.

    Measurement of B_c(2S)⁺ and B_c*(2S)⁺ cross section ratios in proton-proton collisions at √s = 13 TeV


    Search for long-lived particles decaying to jets with displaced vertices in proton-proton collisions at √s = 13 TeV

    A search is presented for long-lived particles produced in pairs in proton-proton collisions at the LHC operating at a center-of-mass energy of 13 TeV. The data were collected with the CMS detector during the period from 2015 through 2018, and correspond to a total integrated luminosity of 140 fb⁻¹. This search targets pairs of long-lived particles with mean proper decay lengths between 0.1 and 100 mm, each of which decays into at least two quarks that hadronize to jets, resulting in a final state with two displaced vertices. No significant excess of events with two displaced vertices is observed. In the context of R-parity violating supersymmetry models, the pair production of long-lived neutralinos, gluinos, and top squarks is excluded at 95% confidence level for cross sections larger than 0.08 fb, masses between 800 and 3000 GeV, and mean proper decay lengths between 1 and 25 mm.
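    As a generic back-of-the-envelope illustration of how a cross-section exclusion of this kind is constructed (the actual CMS analysis uses a full CLs limit-setting procedure; the efficiency and event counts below are assumptions, not taken from the paper), a counting-experiment upper limit on the signal yield is converted to a cross section by dividing by the signal efficiency and the integrated luminosity:

        # Hypothetical counting-experiment sketch, not the CMS CLs procedure.
        # With zero observed events and negligible background, the exact Poisson
        # 95% CL upper limit on the signal yield is -ln(0.05) ~ 3.0 events.
        import math

        n_obs = 0                        # observed events (assumed)
        n_up = -math.log(0.05)           # 95% CL upper limit on the yield for n_obs = 0
        lumi_fb = 140.0                  # integrated luminosity in fb^-1 (from the abstract)
        efficiency = 0.25                # assumed signal acceptance times efficiency

        sigma_up_fb = n_up / (efficiency * lumi_fb)
        print(f"95% CL upper limit: {sigma_up_fb:.3f} fb")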

    Search for top squark production in fully hadronic final states in proton-proton collisions at √s = 13 TeV

    A search for production of the supersymmetric partners of the top quark, top squarks, is presented. The search is based on proton-proton collision events containing multiple jets, no leptons, and large transverse momentum imbalance. The data were collected with the CMS detector at the CERN LHC at a center-of-mass energy of 13 TeV, and correspond to an integrated luminosity of 137 fb⁻¹. The targeted signal production scenarios are direct and gluino-mediated top squark production, including scenarios in which the top squark and neutralino masses are nearly degenerate. The search utilizes novel algorithms based on deep neural networks that identify hadronically decaying top quarks and W bosons, which are expected in many of the targeted signal models. No statistically significant excess of events is observed relative to the expectation from the standard model, and limits on the top squark production cross section are obtained in the context of simplified supersymmetric models for various production and decay modes. Exclusion limits as high as 1310 GeV are established at the 95% confidence level on the mass of the top squark for direct top squark production models, and as high as 2260 GeV on the mass of the gluino for gluino-mediated top squark production models. These results represent a significant improvement over the results of previous searches for supersymmetry by CMS in the same final state.
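    As a schematic illustration of what a neural-network top/W tagger does (classify a jet as signal-like or background-like from substructure observables), the toy sketch below trains a small classifier on two invented jet features; nothing here corresponds to the actual CMS deep neural networks:

        # Toy jet tagger: separate "boosted top"-like jets from light-quark/gluon
        # jets using two invented substructure features (mass, subjettiness-like ratio).
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(3)
        n = 5000
        top_jets = np.column_stack([rng.normal(175.0, 20.0, n), rng.normal(0.4, 0.1, n)])
        qcd_jets = np.column_stack([rng.exponential(40.0, n), rng.normal(0.8, 0.1, n)])
        X = np.vstack([top_jets, qcd_jets])
        y = np.concatenate([np.ones(n), np.zeros(n)])

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        tagger = make_pipeline(StandardScaler(),
                               MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=500))
        tagger.fit(X_tr, y_tr)
        print("tagging accuracy on held-out jets:", tagger.score(X_te, y_te))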

    Development and validation of HERWIG 7 tunes from CMS underlying-event measurements

    This paper presents new sets of parameters (“tunes”) for the underlying-event model of the HERWIG 7 event generator. These parameters control the description of multiple-parton interactions (MPI) and colour reconnection in HERWIG 7, and are obtained from a fit to minimum-bias data collected by the CMS experiment at √s = 0.9, 7, and 13 TeV. The tunes are based on the NNPDF 3.1 next-to-next-to-leading-order parton distribution function (PDF) set for the parton shower, and either a leading-order or next-to-next-to-leading-order PDF set for the simulation of MPI and the beam remnants. Predictions utilizing the tunes are produced for event shape observables in electron-positron collisions, and for minimum-bias, inclusive jet, top quark pair, and Z and W boson events in proton-proton collisions, and are compared with data. Each of the new tunes describes the data at a reasonable level, and the tunes using a leading-order PDF for the simulation of MPI provide the best description of the data.
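    As a toy illustration of what “tuning” means here (the real tunes fit HERWIG 7 MPI and colour-reconnection parameters to CMS minimum-bias spectra; the model, observable, and parameter below are entirely made up), a generator parameter can be adjusted by minimizing a chi-square between predicted and measured bins:

        # Toy "tune": choose one model parameter so a predicted observable
        # matches pseudo-data in several bins.  Everything here is hypothetical.
        import numpy as np
        from scipy.optimize import minimize_scalar

        bins = np.array([1.0, 2.0, 3.0, 4.0])              # bins of some observable
        measured = np.array([5.1, 3.9, 3.2, 2.6])           # pseudo-data
        errors = np.array([0.3, 0.25, 0.2, 0.2])            # pseudo-data uncertainties

        def predicted(p):
            # stand-in for running the event generator with parameter p
            return 6.0 * np.exp(-p * bins) + 2.0

        def chi2(p):
            return float(np.sum(((predicted(p) - measured) / errors) ** 2))

        best = minimize_scalar(chi2, bounds=(0.01, 2.0), method="bounded")
        print("tuned parameter:", best.x, "chi2:", best.fun)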

    Measurement of the Wγ Production Cross Section in Proton-Proton Collisions at √s = 13 TeV and Constraints on Effective Field Theory Coefficients

    A fiducial cross section for Wγ production in proton-proton collisions is measured at a center-of-mass energy of 13 TeV in 137 fb⁻¹ of data collected using the CMS detector at the LHC. The W → eν and W → μν decay modes are used in a maximum-likelihood fit to the lepton-photon invariant mass distribution to extract the combined cross section. The measured cross section is compared with theoretical expectations at next-to-leading order in quantum chromodynamics. In addition, 95% confidence level intervals are reported for anomalous triple-gauge couplings within the framework of effective field theory.
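    As a generic sketch of the kind of maximum-likelihood fit mentioned above (a binned fit of a signal peak plus a smooth background to an invariant-mass spectrum; the shapes, ranges, and yields are invented, not those of the CMS analysis):

        # Toy binned maximum-likelihood fit: Gaussian "signal" over an
        # exponential "background" in a mass spectrum.  Illustration only.
        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm, expon

        rng = np.random.default_rng(1)
        edges = np.linspace(50.0, 150.0, 51)                # hypothetical mass bins (GeV)
        centers = 0.5 * (edges[:-1] + edges[1:])

        # Pseudo-data: 200 signal events at 90 GeV plus 2000 background events.
        data = np.concatenate([rng.normal(90.0, 5.0, 200),
                               50.0 + rng.exponential(40.0, 2000)])
        counts, _ = np.histogram(data, bins=edges)

        def expected(params):
            n_sig, n_bkg = params
            width = edges[1] - edges[0]
            sig = n_sig * norm.pdf(centers, 90.0, 5.0) * width
            bkg = n_bkg * expon.pdf(centers, loc=50.0, scale=40.0) * width
            return sig + bkg

        def nll(params):
            mu = np.clip(expected(params), 1e-9, None)
            return float(np.sum(mu - counts * np.log(mu)))   # Poisson NLL up to constants

        fit = minimize(nll, x0=[100.0, 1500.0], method="Nelder-Mead")
        print("fitted signal and background yields:", fit.x)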

    Search for dark photons in Higgs boson production via vector boson fusion in proton-proton collisions at √s = 13 TeV

    A search is presented for a Higgs boson that is produced via vector boson fusion and that decays to an undetected particle and an isolated photon. The search is performed by the CMS collaboration at the LHC, using a data set corresponding to an integrated luminosity of 130 fb⁻¹, recorded at a center-of-mass energy of 13 TeV in 2016–2018. No significant excess of events above the expectation from the standard model background is found. The results are interpreted in the context of a theoretical model in which the undetected particle is a massless dark photon. An upper limit is set on the product of the cross section for production via vector boson fusion and the branching fraction for such a Higgs boson decay, as a function of the Higgs boson mass. For a Higgs boson mass of 125 GeV, assuming the standard model production rates, the observed (expected) 95% confidence level upper limit on the branching fraction is 3.5 (2.8)%. This is the first search for such decays in the vector boson fusion channel. Combination with a previous search for Higgs bosons produced in association with a Z boson results in an observed (expected) upper limit on the branching fraction of 2.9 (2.1)% at 95% confidence level.

    Reconstruction of signal amplitudes in the CMS electromagnetic calorimeter in the presence of overlapping proton-proton interactions

    A template fitting technique for reconstructing the amplitude of signals produced by the lead tungstate crystals of the CMS electromagnetic calorimeter is described. This novel approach is designed to suppress the contribution to the signal of the increased number of out-of-time interactions per beam crossing following the reduction of the accelerator bunch spacing from 50 to 25 ns at the start of Run 2 of the LHC. Execution of the algorithm is sufficiently fast for it to be employed in the CMS high-level trigger. It is also used in the offline event reconstruction. Results obtained from simulations and from Run 2 collision data (2015–2018) demonstrate a substantial improvement in the energy resolution of the calorimeter over a range of energies extending from a few GeV to several tens of GeV.
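    As a rough illustration of the template-fitting idea described above (a waveform of digitized samples is modelled as a sum of time-shifted pulse templates, one per bunch crossing, so the in-time amplitude can be separated from out-of-time pileup; the pulse shape, sample spacing, and amplitudes below are invented, not the CMS ECAL ones):

        # Toy multi-template amplitude fit with non-negative least squares.
        import numpy as np
        from scipy.optimize import nnls

        def pulse(t, t0, rise=1.0, decay=3.0):
            # Simple skewed pulse shape starting at t0 (arbitrary units).
            x = np.clip(t - t0, 0.0, None)
            return (x / rise) * np.exp(-x / decay)

        samples_t = np.arange(10, dtype=float)              # 10 digitized samples
        shifts = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])      # bunch crossings (0 = in time)

        # Template matrix: one column per possible bunch crossing.
        T = np.column_stack([pulse(samples_t, 3.0 + s) for s in shifts])

        # Simulated waveform: in-time amplitude 5 plus one out-of-time pileup pulse.
        rng = np.random.default_rng(2)
        waveform = (5.0 * pulse(samples_t, 3.0) + 2.0 * pulse(samples_t, 1.0)
                    + rng.normal(0.0, 0.05, samples_t.size))

        amplitudes, _ = nnls(T, waveform)
        print("fitted in-time amplitude:", amplitudes[shifts == 0.0][0])

    Constraining the per-crossing amplitudes to be non-negative is one simple way to keep the decomposition physical; the fitted in-time amplitude is then read off directly.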

    Observation of the Production of Three Massive Gauge Bosons at √s = 13 TeV

    The first observation is reported of the combined production of three massive gauge bosons (VVV with V = W, Z) in proton-proton collisions at a center-of-mass energy of 13 TeV. The analysis is based on a data sample recorded by the CMS experiment at the CERN LHC corresponding to an integrated luminosity of 137 fb⁻¹. The searches for individual WWW, WWZ, WZZ, and ZZZ production are performed in final states with three, four, five, and six leptons (electrons or muons), or with two same-sign leptons plus one or two jets. The observed (expected) significance of the combined VVV production signal is 5.7 (5.9) standard deviations and the corresponding measured cross section relative to the standard model prediction is 1.02 +0.26/−0.23. The significances of the individual WWW and WWZ production are 3.3 and 3.4 standard deviations, respectively. Measured production cross sections for the individual triboson processes are also reported.
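    As a back-of-the-envelope illustration of what an observed significance in standard deviations represents (the CMS result comes from a full profile-likelihood fit over many channels; the event counts below are invented), the Asimov approximation for a single counting experiment is:

        # Asimov significance for one counting experiment,
        # Z = sqrt(2 * ((s + b) * ln(1 + s/b) - s)).
        # The signal and background yields are hypothetical, not the CMS measurement.
        import math

        def asimov_significance(s, b):
            return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))

        signal, background = 30.0, 20.0                     # assumed yields
        print(f"Z = {asimov_significance(signal, background):.1f} sigma")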