
    Constraining compressed supersymmetry using leptonic signatures

    We study the impact of the multi-lepton searches at the LHC on supersymmetric models with compressed mass spectra. For such models the acceptances of the usual search strategies are significantly reduced due to the requirements of large effective mass and missing E_T. Lepton searches, on the other hand, have much lower thresholds for the missing E_T and the p_T of the final-state objects. Therefore, if a model with a compressed mass spectrum allows for multi-lepton final states, one can derive constraints using multi-lepton searches. For a class of simplified models we study the exclusion limits using ATLAS multi-lepton search analyses for final states containing 2-4 electrons or muons with a total integrated luminosity of 1-2/fb at \sqrt{s}=7 TeV. We also modify those analyses by imposing additional cuts so that their sensitivity to compressed supersymmetric models increases. Using the original and modified analyses, we show that the exclusion limits can be competitive with jet plus missing E_T searches, excluding gluino masses up to 1 TeV. We also analyse the efficiencies for several classes of events coming from different intermediate-state particles. This allows us to assess exclusion limits in a similar class of models with different cross sections and branching ratios without requiring a Monte Carlo simulation. Comment: 18 pages, 5 figures

    Supersymmetry in the shadow of photini

    Additional neutral gauge fermions -- "photini" -- arise in string compactifications as superpartners of U(1) gauge fields. Unlike their vector counterparts, the photini can acquire weak-scale masses from soft SUSY breaking and lead to observable signatures at the LHC through mass mixing with the bino. In this work we investigate the collider consequences of adding photini to the neutralino sector of the MSSM. Relatively large mixing of one or more photini with the bino can lead to prompt decays of the lightest ordinary supersymmetric particle; these extra cascades transfer most of the energy of SUSY decay chains into Standard Model particles, diminishing the power of missing energy as an experimental handle for signal discrimination. We demonstrate that the missing energy in SUSY events with photini is reduced dramatically for supersymmetric spectra with MSSM neutralinos near the weak scale, and study the effects on limits set by the leading hadronic SUSY searches at ATLAS and CMS. We find that in the presence of even one light photino the limits on squark masses from hadronic searches can be reduced by 400 GeV, with a comparable (though more modest) reduction of gluino mass limits. We also consider potential discovery channels such as dilepton and multilepton searches, which remain sensitive to SUSY spectra with photini and can provide an unexpected route to the discovery of supersymmetry. Although presented in the context of photini, our results apply in general to theories in which additional light neutral fermions mix with MSSM gauginos. Comment: 23 pages, 8 figures, references added

    Reduction of seafood processing wastewater using technologies enhanced by swim–bed technology

    The continuing growth of the seafood processing industry requires considerably more industrial processing activity and water consumption. It is estimated that those industries generate approximately 10–40 m³ of wastewater for each tonne of raw material processed. Due to limits and regulations on the use of natural resources, a suitable and systematic wastewater treatment plant is essential to meet rigorous discharge standards. Because food waste is biodegradable, biological treatment, and to some extent swim-bed technology incorporating a novel acryl-fibre (biofilm) material, may be used effectively to meet the effluent discharge criteria. This chapter aims to develop an understanding of the current problems and the characteristics of seafood processing wastewater with respect to treatment efficiency and treatment methods.
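
    The per-tonne wastewater figure quoted above translates directly into plant-level loads. A minimal sketch, where the daily throughput is a hypothetical example rather than a value from the chapter:

```python
# Quick arithmetic from the range quoted above: roughly 10-40 m^3 of
# wastewater is generated per tonne of raw material processed.
# `tonnes_per_day` is a hypothetical plant capacity, not a figure
# from the chapter.

tonnes_per_day = 50        # hypothetical daily processing capacity
low, high = 10, 40         # m^3 of wastewater per tonne (range above)

daily_low = low * tonnes_per_day
daily_high = high * tonnes_per_day
print(f"daily wastewater: {daily_low}-{daily_high} m^3")
```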

    THE COST STRUCTURE OF MICROFINANCE INSTITUTIONS IN EASTERN EUROPE AND CENTRAL ASIA

    Microfinance institutions (MFIs) are important, particularly in developing countries, because they expand the frontier of financial intermediation by providing loans to those traditionally excluded from formal financial markets. This paper presents the first systematic statistical examination of the performance of MFIs operating in Eastern Europe and Central Asia. A cost function is estimated for MFIs in the region for 1999-2004. First, the presence of subsidies is found to be associated with higher MFI costs. When output is measured as the number of loans made, we find that MFIs become more efficient over time and that MFIs involved in the provision of group loans and loans to women have lower costs. However, when output is measured as the volume of loans rather than their number, this last finding is reversed. This may be because such loans are smaller in size; thus, for a given volume, more loans must be made.

    Assessment of digital image correlation measurement errors: methodology and results

    Optical full-field measurement methods such as Digital Image Correlation (DIC) are increasingly used in the field of experimental mechanics, but they still suffer from a lack of information about their metrological performance. To assess the performance of DIC techniques and give some practical rules for users, a collaborative study has been carried out by the Workgroup “Metrology” of the French CNRS research network 2519 “MCIMS (Mesures de Champs et Identification en Mécanique des Solides / Full-field measurement and identification in solid mechanics, http://www.ifma.fr/lami/gdr2519)”. A methodology is proposed to assess the metrological performance of the image processing algorithms that constitute the main component of these techniques, knowledge of which is required for a global assessment of the whole measurement system. The study is based on displacement error assessment from synthetic speckle images. Series of synthetic reference and deformed images with random patterns have been generated, assuming a sinusoidal displacement field with various frequencies and amplitudes. Displacements are evaluated by several DIC packages based on various formulations and used in the French community. The evaluated displacements are compared with the exact imposed values, and the errors are statistically analyzed. Results show general trends that are rather independent of the implementations but strongly correlated with the assumptions of the underlying algorithms. Various error regimes are identified, for which the dependence of the uncertainty on the parameters of the algorithms, such as subset size, gray-level interpolation or shape functions, is discussed.
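
    The error-assessment methodology described above can be sketched as follows. The DIC evaluation itself is a hypothetical stand-in (exact displacement plus random matching noise, with an assumed noise level), since the real packages operate on the speckle images:

```python
import numpy as np

# Sketch of the assessment procedure: impose a known sinusoidal
# displacement field, compare the displacements returned by a DIC
# evaluation against the exact values, and analyse the errors
# statistically. `dic_measure` is a hypothetical stand-in for a real
# DIC package, simulated as the exact field plus random noise.

rng = np.random.default_rng(0)

amplitude = 0.5           # displacement amplitude in pixels (assumed)
period = 64.0             # spatial period of the sine wave in pixels
x = np.arange(512.0)      # subset-centre positions along one axis

u_exact = amplitude * np.sin(2.0 * np.pi * x / period)

def dic_measure(u_true, noise_std=0.01):
    """Hypothetical DIC evaluation: exact displacement + random error."""
    return u_true + rng.normal(0.0, noise_std, size=u_true.shape)

error = dic_measure(u_exact) - u_exact
bias = error.mean()          # systematic error
sigma = error.std(ddof=1)    # random error (measurement uncertainty)
print(f"bias = {bias:.4f} px, sigma = {sigma:.4f} px")
```

    In the study, this comparison is repeated over the frequencies and amplitudes of the imposed field and over the parameters of each package (subset size, interpolation, shape functions) to map out the error regimes.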

    Management of the thrombotic risk associated with COVID-19: guidance for the hemostasis laboratory

    Coronavirus disease 2019 (COVID-19) is associated with an extreme inflammatory response, disordered hemostasis and a high thrombotic risk. A high incidence of thromboembolic events has been reported despite thromboprophylaxis, raising the question of more effective anticoagulation. First-line hemostasis tests such as activated partial thromboplastin time, prothrombin time, fibrinogen and D-dimers are proposed for assessing thrombotic risk and monitoring hemostasis, but are subject to many drawbacks affecting their reliability and clinical relevance. Specialized hemostasis-related tests (soluble fibrin complexes, tests assessing fibrinolytic capacity, viscoelastic tests, thrombin generation) may be of interest for assessing the thrombotic risk associated with COVID-19. Another challenge for the hemostasis laboratory is the monitoring of heparin treatment, especially unfractionated heparin, in the setting of an extreme inflammatory response. This review evaluates the role of hemostasis tests in the management of COVID-19 and discusses their main limitations.

    Limited Lifespan of Fragile Regions in Mammalian Evolution

    An important question in genome evolution is whether there exist fragile regions (rearrangement hotspots) where chromosomal rearrangements occur over and over again. Although nearly all recent studies support the existence of fragile regions in mammalian genomes, the most comprehensive phylogenomic study of mammals (Ma et al. (2006) Genome Research 16, 1557-1565) raised some doubts about their existence. We demonstrate that fragile regions are subject to a "birth and death" process, implying that fragility has a limited evolutionary lifespan. This finding implies that fragile regions migrate to different locations in different mammals, explaining why only a few chromosomal breakpoints are shared between different lineages. The birth and death of fragile regions reinforces the hypothesis that rearrangements are promoted by matching segmental duplications and suggests putative locations of the currently active fragile regions in the human genome.

    A Stealth Supersymmetry Sampler

    The LHC has strongly constrained models of supersymmetry with traditional missing energy signatures. We present a variety of models that realize the concept of Stealth Supersymmetry, i.e. models with R-parity in which one or more nearly-supersymmetric particles (a "stealth sector") lead to collider signatures with only a small amount of missing energy. The simplest realization involves low-scale supersymmetry breaking, with an R-odd particle decaying to its superpartner and a soft gravitino. We clarify the stealth mechanism and its differences from compressed supersymmetry and explain the requirements for stealth models with high-scale supersymmetry breaking, in which the soft invisible particle is not a gravitino. We also discuss new and distinctive classes of stealth models that couple through a baryon portal or Z' gauge interactions. Finally, we present updated limits on stealth supersymmetry in light of current LHC searches. Comment: 45 pages, 16 figures

    Constraints on the pMSSM from searches for squarks and gluinos by ATLAS

    We study the impact of the jets and missing transverse momentum SUSY analyses of the ATLAS experiment on the phenomenological MSSM (pMSSM). We investigate sets of SUSY models with flat and logarithmic priors in the SUSY mass scale, with mass ranges up to 1 and 3 TeV, respectively. These models were found previously in the study 'Supersymmetry without Prejudice'. Removing models with long-lived SUSY particles, we show that the ATLAS results exclude 99% of 20,000 randomly generated pMSSM model points with a flat prior and 87% with a logarithmic prior. For models with squarks and gluinos below 600 GeV, all model points of the pMSSM grid are excluded. We identify SUSY spectra where the current ATLAS search strategy is less sensitive and propose extensions to the inclusive jets search channel.
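
    The exclusion fractions above are simple counting statistics over the sampled model sets. A toy illustration, where the single mass cut is a hypothetical stand-in for the full ATLAS analysis chain:

```python
import numpy as np

# Toy illustration of the exclusion-fraction statistic: draw
# pMSSM-like model points with a flat prior (masses up to 1 TeV) and
# a logarithmic prior (up to 3 TeV) on a single SUSY mass scale, then
# count the fraction falling below a hypothetical exclusion reach.
# The single 600 GeV cut is a stand-in, not the real ATLAS likelihood,
# so the resulting fractions differ from the paper's 99% and 87%.

rng = np.random.default_rng(1)
n = 20_000

flat_masses = rng.uniform(100.0, 1000.0, n)
log_masses = np.exp(rng.uniform(np.log(100.0), np.log(3000.0), n))

reach = 600.0  # hypothetical exclusion reach in GeV
frac_flat = float(np.mean(flat_masses < reach))
frac_log = float(np.mean(log_masses < reach))
print(f"excluded fraction: flat prior {frac_flat:.1%}, log prior {frac_log:.1%}")
```

    The comparison also shows why the prior matters: a logarithmic prior puts more weight on light spectra within its range, so the excluded fraction depends on the sampling choice as much as on the search reach.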

    A search for resonant production of $t\bar{t}$ pairs in $4.8\ \rm{fb}^{-1}$ of integrated luminosity of $p\bar{p}$ collisions at $\sqrt{s}=1.96\ \rm{TeV}$

    We search for resonant production of ttbar pairs in 4.8 fb^{-1} of integrated luminosity of ppbar collision data at sqrt{s}=1.96 TeV in the lepton+jets decay channel, where one top quark decays leptonically and the other hadronically. A matrix element reconstruction technique is used: for each event, a probability density function (pdf) of the ttbar candidate invariant mass is sampled. These pdfs are used to construct a likelihood function from which the cross section for resonant ttbar production is estimated, given a hypothetical resonance mass and width. The data show no evidence of resonant production of ttbar pairs. A benchmark model of a leptophobic Z' → ttbar is excluded for m_{Z'} < 900 GeV at 95% confidence level. Comment: accepted for publication in Physical Review D Sep 21, 201
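
    The likelihood step described above can be sketched with a per-event mixture model. The Gaussian resonance and exponential background below are illustrative stand-ins for the matrix-element pdfs of the analysis, and all parameter values are assumed:

```python
import numpy as np

# Minimal sketch of the likelihood construction: each event i
# contributes a probability density for its ttbar candidate mass m_i,
# and the likelihood of a signal fraction f is
#   L(f) = prod_i [ f * s(m_i) + (1 - f) * b(m_i) ],
# where s and b are the signal (resonance) and background mass shapes.
# Shapes and parameters here are illustrative, not those of the paper.

rng = np.random.default_rng(2)
M_MIN = 350.0  # rough ttbar kinematic threshold in GeV (assumed)

def s(m, m0=900.0, width=40.0):
    """Toy resonance shape: Gaussian of mass m0 and width `width`."""
    return np.exp(-0.5 * ((m - m0) / width) ** 2) / (width * np.sqrt(2 * np.pi))

def b(m, slope=300.0):
    """Toy falling background: exponential above the threshold."""
    return np.exp(-(m - M_MIN) / slope) / slope

# Background-only pseudo-data: candidate masses in GeV.
masses = M_MIN + rng.exponential(300.0, size=200)

def nll(f):
    """Negative log-likelihood of signal fraction f."""
    return -np.sum(np.log(f * s(masses) + (1 - f) * b(masses)))

f_grid = np.linspace(0.0, 0.5, 501)
f_hat = f_grid[np.argmin([nll(f) for f in f_grid])]
print(f"best-fit signal fraction: {f_hat:.3f}")
```

    Since the pseudo-data here are background-only, the fitted signal fraction stays near zero; scanning the hypothesized resonance mass and width, and converting f into a cross section, would mirror the limit-setting procedure of the analysis.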