
    Randomized controlled trial of a web-based computer-tailored smoking cessation program as a supplement to nicotine patch therapy

    Aim: To assess the efficacy of World Wide Web-based tailored behavioral smoking cessation materials among nicotine patch users. Design: Two-group randomized controlled trial. Setting: World Wide Web in England and the Republic of Ireland. Participants: A total of 3971 subjects who purchased a particular brand of nicotine patch and logged on to use a free web-based behavioral support program. Intervention: Web-based tailored behavioral smoking cessation materials or web-based non-tailored materials. Measurements: Twenty-eight-day continuous abstinence rates were assessed by internet-based survey at 6-week follow-up, and 10-week continuous abstinence rates at 12-week follow-up. Findings: Using three approaches to the analysis of 6- and 12-week outcomes, participants in the tailored condition reported clinically and statistically significantly higher continuous abstinence rates than participants in the non-tailored condition. In our primary analyses, using as a denominator all subjects who logged on to the treatment site at least once, continuous abstinence rates at 6 weeks were 29.0% in the tailored condition versus 23.9% in the non-tailored condition (OR = 1.30; P = 0.0006); at 12 weeks, continuous abstinence rates were 22.8% versus 18.1%, respectively (OR = 1.34; P = 0.0006). Moreover, satisfaction with the program was significantly higher in the tailored than in the non-tailored condition. Conclusions: The results of this study demonstrate a benefit of web-based tailored behavioral support materials used in conjunction with nicotine replacement therapy. A web-based program that collects relevant information from users and tailors the intervention to their specific needs had significant advantages over a web-based non-tailored cessation program. Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/72486/1/j.1360-0443.2005.01093.x.pd
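    As a quick sanity check, the reported odds ratios can be reproduced from the abstinence percentages alone. This is an unadjusted back-of-the-envelope calculation (the trial's own ORs may include covariate adjustment):

```python
def odds_ratio(p_treatment: float, p_control: float) -> float:
    """Unadjusted odds ratio for two event proportions."""
    odds_treatment = p_treatment / (1 - p_treatment)
    odds_control = p_control / (1 - p_control)
    return odds_treatment / odds_control

# 28-day continuous abstinence at 6-week follow-up: 29.0% vs 23.9%
print(round(odds_ratio(0.290, 0.239), 2))  # 1.3

# 10-week continuous abstinence at 12-week follow-up: 22.8% vs 18.1%
print(round(odds_ratio(0.228, 0.181), 2))  # 1.34
```

    Both values match the ORs reported in the abstract, so the primary analysis appears to use the raw intention-to-treat proportions.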

    Mixed Wino Dark Matter: Consequences for Direct, Indirect and Collider Detection

    In supersymmetric models with gravity-mediated SUSY breaking and gaugino mass unification, the predicted relic abundance of neutralinos usually exceeds the strict limits imposed by the WMAP collaboration. One way to obtain the correct relic abundance is to abandon gaugino mass universality and allow a mixed wino-bino lightest SUSY particle (LSP). The enhanced annihilation and scattering cross sections of mixed wino dark matter (MWDM), compared to bino dark matter, lead to enhanced rates for direct dark matter detection, as well as for indirect detection at neutrino telescopes and for detection of dark matter annihilation products in the galactic halo. For collider experiments, MWDM leads to a reduced but significant mass gap between the lightest neutralinos, so that chi_2^0 two-body decay modes are usually closed. This means that dilepton mass edges, the starting point for cascade decay reconstruction at the CERN LHC, should be accessible over almost all of parameter space. Measurement of the m(chi_2^0) - m(chi_1^0) mass gap at the LHC, plus various sparticle masses and cross sections as a function of beam polarization at the International Linear Collider (ILC), would pinpoint MWDM as the dominant component of dark matter in the universe. Comment: 29 pages including 19 EPS figures.

    Exploring the BWCA (Bino-Wino Co-Annihilation) Scenario for Neutralino Dark Matter

    In supersymmetric models with non-universal gaugino masses, it is possible to have opposite-sign SU(2) and U(1) gaugino mass terms. In these models, the gaugino eigenstates experience little mixing, so that the lightest SUSY particle remains either pure bino or pure wino. The neutralino relic density can only be brought into accord with the WMAP-measured value when bino-wino co-annihilation (BWCA) acts to enhance the dark matter annihilation rate. We map out parameter-space regions and mass spectra which are characteristic of the BWCA scenario. Direct and indirect dark matter detection rates are shown to be typically very low. At collider experiments, the BWCA scenario is typified by a small mass gap m(Z_2) - m(Z_1) ~ 20-80 GeV, so that tree-level two-body decays of Z_2 are not allowed. However, in this case the second-lightest neutralino has an enhanced loop-decay branching fraction to photons. While the photonic neutralino decay signature looks difficult to extract at the Fermilab Tevatron, it should lead to distinctive events at the CERN LHC and at a linear e+e- collider. Comment: 44 pages, 21 figures.

    Collider and Dark Matter Phenomenology of Models with Mirage Unification

    We examine supersymmetric models with mixed modulus-anomaly mediated SUSY breaking (MM-AMSB) soft terms, which get comparable contributions to SUSY breaking from moduli mediation and anomaly mediation. The apparent (mirage) unification of soft SUSY breaking terms at Q = mu_mir, not associated with any physical threshold, is the hallmark of this scenario. The MM-AMSB structure of soft terms arises in models of string compactification with fluxes, where the addition of an anti-brane leads to an uplifting potential and a de Sitter universe, as first constructed by Kachru et al. The phenomenology mainly depends on the relative strength of the moduli- and anomaly-mediated SUSY breaking contributions, and on the Higgs and matter field modular weights, which are determined by the location of these fields in the extra dimensions. We delineate the allowed parameter space for a low and a high value of tan(beta), for a wide range of modular weight choices. We calculate the neutralino relic density and display the WMAP-allowed regions. We show the reach of the CERN LHC and of the International Linear Collider. We discuss aspects of MM-AMSB models for Tevatron, LHC and ILC searches, the muon g-2, and the b -> s gamma branching fraction. We also calculate direct and indirect dark matter detection rates, and show that almost all WMAP-allowed models should be accessible to a ton-scale noble gas detector. Finally, we comment on the potential of colliders to measure the mirage unification scale and modular weights in the difficult case where mu_mir >> M_GUT. Comment: 34 pages plus 42 EPS figures; a version with high-resolution figures is at http://www.hep.fsu.edu/~bae

    Mixed Higgsino Dark Matter from a Reduced SU(3) Gaugino Mass: Consequences for Dark Matter and Collider Searches

    In gravity-mediated SUSY breaking models with non-universal gaugino masses, lowering the SU(3) gaugino mass |M_3| leads to a reduction in the squark and gluino masses. Lower third-generation squark masses, in turn, diminish the effect of the large top quark Yukawa coupling in the running of the Higgs mass parameter m_{H_u}^2, leading to a reduction in the magnitude of the superpotential mu parameter (relative to M_1 and M_2). A low |mu| parameter gives rise to mixed higgsino dark matter (MHDM), which can annihilate efficiently in the early universe to give a dark matter relic density in accord with WMAP measurements. We explore the phenomenology of the low-|M_3| scenario, and find for the case of MHDM increased rates for direct and indirect detection of neutralino dark matter relative to the mSUGRA model. The sparticle mass spectrum is characterized by relatively light gluinos, frequently with m(gluino) << m(squark). If scalar masses are large, then gluinos can be very light, with gluino -> Z_i + g loop decays dominating the gluino branching fraction. Top squarks can be much lighter than sbottom and first/second-generation squarks. The presence of low-mass higgsino-like charginos and neutralinos is expected at the CERN LHC. The small m(Z_2) - m(Z_1) mass gap should give rise to a visible opposite-sign/same-flavor dilepton mass edge. At a TeV-scale linear e+e- collider, in the region of MHDM the entire spectrum of charginos and neutralinos is amongst the lightest sparticles and is most likely to be produced at observable rates, allowing for a complete reconstruction of the gaugino-higgsino sector. Comment: 35 pages, including 26 EPS figures.

    Congenital myasthenic syndrome caused by a frameshift insertion mutation in GFPT1

    Objective: Description of a new variant of the glutamine-fructose-6-phosphate transaminase 1 (GFPT1) gene causing congenital myasthenic syndrome (CMS) in 3 children from 2 unrelated families. Methods: Muscle biopsies, EMG, and whole-exome sequencing were performed. Results: All 3 patients presented with congenital hypotonia, muscle weakness, respiratory insufficiency, head lag, areflexia, and gastrointestinal dysfunction. Genetic analysis identified a homozygous frameshift insertion in the GFPT1 gene (NM_001244710.1: c.686dupC; p.Arg230Ter) that was shared by all 3 patients. In one of the patients, inheritance of the variant was through uniparental disomy (UPD) of maternal origin. Repetitive nerve stimulation and single-fiber EMG were consistent with the clinical diagnosis of CMS with a postjunctional defect. Ultrastructural evaluation of the muscle biopsy from one of the patients showed extremely attenuated postsynaptic folds at neuromuscular junctions and extensive autophagic vacuolar pathology. Conclusions: These results expand the spectrum of known loss-of-function GFPT1 mutations in CMS12 and, in one family, demonstrate a novel mode of inheritance due to UPD.

    On the influence of the cosmological constant on gravitational lensing in small systems

    The cosmological constant Lambda affects gravitational lensing phenomena. The contribution of Lambda to the observable angular positions of multiple images and to their amplification and time delay is here computed through a study, in the weak deflection limit, of the equations of motion in the Schwarzschild-de Sitter metric. Due to Lambda, the unresolved images are slightly demagnified, the radius of the Einstein ring decreases, and the time delay increases. The effect is, however, negligible for nearby lenses. In the case of a null cosmological constant, we provide some updated results on lensing by a Schwarzschild black hole. Comment: 8 pages, 1 figure; v2: extended discussion of the lens equation, references added, results unchanged, in press on PR
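    For reference, the Einstein ring radius whose Lambda-induced decrease the abstract describes takes, in the standard Lambda = 0 Schwarzschild case, the familiar weak-deflection form (quoted here only to fix notation; it is not a result of the paper):

```latex
\theta_E = \sqrt{\frac{4GM}{c^2}\,\frac{D_{LS}}{D_L\,D_S}}
```

    where M is the lens mass and D_L, D_S, D_LS are the distances to the lens, to the source, and between lens and source. The paper's claim is that a positive Lambda slightly reduces this radius, with the correction negligible for nearby lenses.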

    A User's Guide to the Encyclopedia of DNA Elements (ENCODE)

    The mission of the Encyclopedia of DNA Elements (ENCODE) Project is to enable the scientific and medical communities to interpret the human genome sequence and apply it to understand human biology and improve health. The ENCODE Consortium is integrating multiple technologies and approaches in a collective effort to discover and define the functional elements encoded in the human genome, including genes, transcripts, and transcriptional regulatory regions, together with their attendant chromatin states and DNA methylation patterns. In the process, standards to ensure high-quality data have been implemented, and novel algorithms have been developed to facilitate analysis. Data and derived results are made available through a freely accessible database. Here we provide an overview of the project and the resources it is generating and illustrate the application of ENCODE data to interpret the human genome. Funding: National Human Genome Research Institute (U.S.); National Institutes of Health (U.S.)

    Evidence for Transcript Networks Composed of Chimeric RNAs in Human Cells

    The classic organization of a gene structure has followed the Jacob and Monod bacterial gene model proposed more than 50 years ago. Since then, empirical determinations of the complexity of the transcriptomes found in organisms from yeast to human have blurred the definition and physical boundaries of genes. Using multiple analysis approaches, we have characterized individual gene boundaries mapping on human chromosomes 21 and 22. Analyses of the locations of the 5′ and 3′ transcriptional termini of 492 protein-coding genes revealed that for 85% of these genes the boundaries extend beyond the currently annotated termini, most often connecting with exons of transcripts from other well-annotated genes. The biological and evolutionary importance of these chimeric transcripts is underscored by (1) the non-random interconnections of the genes involved, (2) the greater phylogenetic depth of the genes involved in many chimeric interactions, (3) the coordination of the expression of connected genes, and (4) the close in vivo and three-dimensional proximity of the genomic regions being transcribed and contributing to parts of the chimeric RNAs. The non-random nature of the connections among the genes involved suggests that chimeric transcripts should not be studied in isolation, but together, as an RNA network.

    Disease-Causing 7.4 kb Cis-Regulatory Deletion Disrupting Conserved Non-Coding Sequences and Their Interaction with the FOXL2 Promotor: Implications for Mutation Screening

    To date, the contribution of disrupted, potentially cis-regulatory conserved non-coding sequences (CNCs) to human disease is most likely underestimated, as no systematic screens for putative deleterious variations in CNCs have been conducted. As a model for monogenic disease, we studied the involvement of genetic changes of CNCs in the cis-regulatory domain of FOXL2 in blepharophimosis syndrome (BPES). Fifty-seven molecularly unsolved BPES patients underwent high-resolution copy-number screening and targeted sequencing of CNCs. Apart from three larger distant deletions, a de novo deletion as small as 7.4 kb was found 283 kb 5′ to FOXL2. The deletion appeared to be triggered by an H-DNA-induced double-stranded break (DSB). In addition, it disrupts a novel long non-coding RNA (ncRNA), PISRT1, and 8 CNCs. The regulatory potential of the deleted CNCs was substantiated by in vitro luciferase assays. Interestingly, Chromosome Conformation Capture (3C) of a 625 kb region surrounding FOXL2 in expressing cellular systems revealed physical interactions of three upstream fragments with the FOXL2 core promoter. Importantly, one of these contains the 7.4 kb deleted fragment. Overall, this study revealed the smallest distant deletion causing monogenic disease and impacts upon the concept of mutation screening in human disease, and in developmental disorders in particular.