
    Stemming the Global Trade in Falsified and Substandard Medicines

    Drug safety and quality are essential assumptions of clinical medicine, but there is growing concern that these assumptions are not always correct. Poor manufacturing and deliberate fraud occasionally compromise the drug supply in the United States, and the problem is far more common and serious in low- and middle-income countries with weak drug regulatory systems. An Institute of Medicine consensus committee report identified the causes of and possible solutions to the problem of falsified and substandard drugs around the world. The vocabulary people use to discuss the problem is itself a concern. The word counterfeit is often used loosely to describe any drug that is not what it seems, but some NGOs and emerging manufacturing nations object to this term. These groups see hostility to generic pharmaceuticals in a discussion of counterfeit medicines. Precisely speaking, a counterfeit drug infringes on a registered trademark, and trademark infringement is not necessarily a problem of public health consequence. Instead of talking broadly about counterfeit drugs, the WHO and other stakeholders should consider two main categories of drug quality problems. Falsified medicines misrepresent the product’s identity or source or both. Substandard drugs fail to meet the national specifications given in an accepted pharmacopeia or the manufacturer’s dossier. In practice, there is often considerable overlap between the two categories. There is considerable uncertainty about the size of the falsified and substandard drug market. Improved pharmacovigilance, especially in developing countries, would give a better picture of the scope of the problem. In the United States, tighter regulatory controls on the wholesale market and a mandatory drug tracking system would improve drug safety. In developing countries, development finance organizations should invest in small- and medium-sized pharmaceutical manufacturers, and governments should use tools such as franchising, accreditation, low-interest loans, and task shifting to encourage private sector investment in drug retail. Finally, the WHO should work with stakeholders such as the UNODC and the WCO to develop an international code of practice on falsified and substandard drugs.

    Asymmetric Dark Matter and Effective Operators

    In order to annihilate in the early Universe to levels well below the measured dark matter density, asymmetric dark matter must possess large couplings to the Standard Model. In this paper, we consider effective operators which allow asymmetric dark matter to annihilate into quarks. In addition to a bound from requiring sufficient annihilation, the energy scale of such operators can be constrained by limits from direct detection and monojet searches at colliders. We show that the allowed parameter space for these operators is highly constrained, leading to non-trivial requirements that any model of asymmetric dark matter must satisfy. Comment: 6 pages, 1 figure. V2 replacement: Citations added. Shading error in Fig. 1 (L_FV panel) corrected. Direct detection bounds on m_chi < 5 GeV added; minor alterations in text to reflect these changes.
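
    As a purely illustrative aside (not taken from the paper itself), a contact operator of the general type described couples a dark matter bilinear to a quark current; the operator label, normalisation, and cross-section scaling below are assumptions made for this sketch:

        \[
        \mathcal{O}_V \;=\; \frac{1}{\Lambda^{2}}\,\bigl(\bar\chi \gamma^\mu \chi\bigr)\bigl(\bar q \gamma_\mu q\bigr),
        \qquad
        \sigma v \;\sim\; \frac{m_\chi^{2}}{\pi\,\Lambda^{4}}.
        \]

    In a picture like this, demanding a large enough annihilation cross section places an upper bound on the suppression scale \Lambda, while direct detection and monojet limits push \Lambda up from below, which is the kind of tension the abstract describes.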

    HepData and JetWeb: HEP data archiving and model validation

    The CEDAR collaboration is extending and combining the JetWeb and HepData systems to provide a single service for tuning and validating models of high-energy physics processes. The centrepiece of this activity is the fitting by JetWeb of observables computed from Monte Carlo event generator events against their experimentally determined distributions, as stored in HepData. Caching the results of the JetWeb simulation and comparison stages provides a single cumulative database of event generator tunings, fitted against a wide range of experimental quantities. An important feature of this integration is a family of XML data formats, called HepML. Comment: 4 pages, 0 figures. To be published in proceedings of CHEP06.
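
    To make the comparison step concrete, here is a minimal sketch (not the actual JetWeb code, and not the HepML schema) of a chi-squared comparison of a generator-level histogram against a reference measurement; the observable, bin contents, and errors are invented for illustration:

        # Minimal sketch: chi-squared comparison of a Monte Carlo prediction against
        # a measured distribution, in the spirit of what JetWeb automates.
        # All numbers below are hypothetical.
        import numpy as np

        def chi2_per_bin(mc_counts, data_counts, data_errors):
            """Chi-squared contribution of each bin, ignoring MC statistical errors."""
            mc = np.asarray(mc_counts, dtype=float)
            data = np.asarray(data_counts, dtype=float)
            err = np.asarray(data_errors, dtype=float)
            return (mc - data) ** 2 / err ** 2

        # Hypothetical binned observable from a generator run and from data.
        mc_counts = [120.0, 85.0, 40.0, 12.0]
        data_counts = [115.0, 90.0, 38.0, 15.0]
        data_errors = [11.0, 9.0, 6.0, 4.0]

        chi2 = chi2_per_bin(mc_counts, data_counts, data_errors).sum()
        print(f"chi2 / n_bins = {chi2:.2f} / {len(data_counts)}")

    Caching many such comparisons, keyed by generator version and tuning parameters, is essentially what the cumulative database of tunings described above accumulates.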

    HepForge: A lightweight development environment for HEP software

    Setting up the infrastructure to manage a software project can become a task as significant as writing the software itself. A variety of useful open source tools are available, such as Web-based viewers for version control systems, "wikis" for collaborative discussions, and bug-tracking systems, but their use in high-energy physics, outside large collaborations, is insubstantial. Understandably, physicists would rather do physics than configure project management tools. We introduce the CEDAR HepForge system, which provides a lightweight development environment for HEP software. Services available as part of HepForge include the above-mentioned tools as well as mailing lists, shell accounts, archiving of releases and low-maintenance Web space. HepForge also exists to promote best-practice software development methods and to provide a central repository for re-usable HEP software and phenomenology codes. Comment: 3 pages, 0 figures. To be published in proceedings of CHEP06. Refers to the HepForge facility at http://hepforge.cedar.ac.uk

    Stops and MET: the shape of things to come

    LHC experiments have placed strong bounds on the production of supersymmetric colored particles (squarks and gluinos), under the assumption that all flavors of squarks are nearly degenerate. However, the current experimental constraints on stop squarks are much weaker, due to the smaller production cross section and difficult backgrounds. While light stops are motivated by naturalness arguments, it has been suggested that such particles become nearly impossible to detect near the limit where their mass is degenerate with the sum of the masses of their decay products. We show that this is not the case, and that searches based on missing transverse energy (MET) have significant reach for stop masses above 175 GeV, even in the degenerate limit. We consider direct pair production of stops, decaying to invisible LSPs and tops with either hadronic or semi-leptonic final states. Modest intrinsic differences in MET are magnified by boosted kinematics and by shape analyses of MET or suitably-chosen observables related to MET. For these observables we show that the distributions of the relevant backgrounds and signals are well-described by simple analytic functions in the kinematic regime where the signal is enhanced. Shape analyses of MET-related distributions will allow the LHC experiments to place significantly improved bounds on stop squarks, even in scenarios where the stop-LSP mass difference is equal to the top mass. Assuming 20/fb of luminosity at 8 TeV, we conservatively estimate that experiments can exclude or discover degenerate stops with masses as large as ~ 360 GeV and 560 GeV for massless LSPs. Comment: Version submitted to journal with improved analysis and small fixes, 27 pages, 11 figures, 2 tables.
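
    As an illustration of what such a shape analysis might look like in practice (this is not the authors' analysis code; the falling-exponential form, binning, and pseudo-data are assumptions made here), a binned MET spectrum can be fit to a simple analytic function:

        # Minimal sketch: fit a binned MET spectrum with a simple falling-exponential
        # shape, in the spirit of the shape analyses described above.
        # Bin centres, counts, and the functional form are hypothetical.
        import numpy as np
        from scipy.optimize import curve_fit

        def met_shape(met, norm, slope):
            """Falling exponential in MET (GeV)."""
            return norm * np.exp(-met / slope)

        met_centres = np.array([110.0, 130.0, 150.0, 170.0, 190.0, 210.0])  # GeV
        counts = np.array([5200.0, 3100.0, 1900.0, 1150.0, 700.0, 430.0])
        errors = np.sqrt(counts)

        popt, _ = curve_fit(met_shape, met_centres, counts, sigma=errors,
                            p0=(90000.0, 40.0), absolute_sigma=True)
        norm, slope = popt
        print(f"fitted normalisation = {norm:.0f}, fitted slope = {slope:.1f} GeV")

    In a real analysis of this kind, the background shape would typically be constrained in control regions and the signal would appear as a harder MET component on top of it; the numbers above are only meant to show the mechanics of a shape fit.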
