
    Higher Rank Wilson Loops in N = 2* Super-Yang-Mills Theory

    The N=2* Super-Yang-Mills theory (SYM*) undergoes an infinite sequence of large-N quantum phase transitions. We compute expectation values of Wilson loops in k-symmetric and antisymmetric representations of the SU(N) gauge group in this theory and show that the same phenomenon that causes the phase transitions at finite coupling leads to a non-analytic dependence of the Wilson loops on k/N when the coupling is strictly infinite, thus making higher-representation Wilson loops ideal holographic probes of the non-trivial phase structure of SYM*. Comment: 33 pages, 6 figures. v2: a new reference added.
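
    For context, the objects computed are Wilson loops evaluated in a representation R of SU(N) (here the rank-k symmetric or antisymmetric representation). A standard definition of such a loop, written here only as a sketch for a contour C since the abstract does not spell it out, is
    \[
    W_R(C) \;=\; \frac{1}{\dim R}\,\Big\langle \operatorname{Tr}_R\, \mathcal{P}\exp \oint_C d\tau\,\big( i A_\mu \dot{x}^\mu + \Phi\,|\dot{x}| \big) \Big\rangle ,
    \]
    where \mathcal{P} denotes path ordering and \Phi is a scalar of the vector multiplet; the non-analyticity described above concerns the dependence of this quantity on the ratio k/N at large N.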

    Notes on a non-thermal fluctuation-dissipation relation in quantum Brownian motion

    We review how unitarity and stationarity in the Schwinger-Keldysh formalism naturally lead to a (quantum) generalized fluctuation-dissipation relation (gFDR) that works beyond thermal equilibrium. Non-Gaussian loop corrections are also presented. Additionally, we illustrate the application of this gFDR in various scenarios related to quantum Brownian motion and the generalized Langevin equation. Comment: 24 pages, 9 figures.
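
    As a point of reference (this is the standard thermal relation, not the paper's generalized one, which the abstract does not state), the fluctuation-dissipation relation that the gFDR extends beyond equilibrium can be written in Callen-Welton/Kubo form as
    \[
    S(\omega) \;=\; \hbar\,\coth\!\Big(\frac{\beta\hbar\omega}{2}\Big)\,\operatorname{Im}\chi(\omega),
    \]
    where S(\omega) is the symmetrized fluctuation spectrum of the Brownian coordinate, \chi(\omega) its linear response function, and \beta = 1/k_B T; the gFDR discussed in the paper is said to hold without assuming a thermal state, relying instead on unitarity and stationarity.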

    N=2* Super-Yang-Mills Theory at Strong Coupling

    The planar N=2* Super-Yang-Mills (SYM) theory is solved at large 't Hooft coupling using localization on S^4. The solution permits a detailed investigation of the resonance phenomena responsible for quantum phase transitions in infinite volume, and leads to quantitative predictions for the semiclassical string dual of the N=2* theory. Comment: 34 pages, 9 figures; v2: the name of one author changed.

    Test for rare variants by environment interactions in sequencing association studies

    Peer Reviewed
    https://deepblue.lib.umich.edu/bitstream/2027.42/142484/1/biom12368_am.pdf
    https://deepblue.lib.umich.edu/bitstream/2027.42/142484/2/biom12368.pdf
    https://deepblue.lib.umich.edu/bitstream/2027.42/142484/3/biom12368-sup-0001-SuppData.pd

    Acne and risk of mental disorders: A two-sample Mendelian randomization study based on large genome-wide association data

    Background: Despite a growing body of evidence that acne impacts mental disorders, causality has not been established, owing to the possible presence of recall bias and confounders in observational studies.
    Methods: We performed a two-sample Mendelian randomization (MR) analysis to evaluate the effect of acne on the risk of six common mental disorders: depression, anxiety, schizophrenia, obsessive-compulsive disorder (OCD), bipolar disorder, and post-traumatic stress disorder (PTSD). We acquired genetic instruments for acne from the largest genome-wide association study (GWAS) of acne (N = 615,396) and collected summary statistics from the largest available GWAS for depression (N = 500,199), anxiety (N = 17,310), schizophrenia (N = 130,644), OCD (N = 9,725), bipolar disorder (N = 413,466), and PTSD (N = 174,659). We then performed the two-sample MR analysis using four methods: the inverse-variance weighted method, MR-Egger, the weighted median, and MR pleiotropy residual sum and outlier. Sensitivity analyses for heterogeneity and pleiotropy were also performed.
    Results: There was no evidence of a causal effect of acne on the risk of depression [odds ratio (OR): 1.002, p = 0.874], anxiety (OR: 0.961, p = 0.49), OCD (OR: 0.979, p = 0.741), bipolar disorder (OR: 0.972, p = 0.261), or PTSD (OR: 1.054, p = 0.069). A mild protective effect of acne against schizophrenia was observed (OR: 0.944, p = 0.033).
    Conclusion: The increased prevalence of mental disorders observed in patients with acne in clinical practice is attributable to modifiable factors rather than being a direct outcome of acne. Therefore, strategies that eliminate such potential factors and minimize the occurrence of adverse mental events in acne should be implemented.
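
    The inverse-variance weighted (IVW) estimator named in the Methods can be sketched as follows; the per-SNP summary statistics below are hypothetical placeholders (the abstract reports only the resulting odds ratios), and the snippet assumes NumPy.

    import numpy as np

    # Hypothetical per-SNP summary statistics for the genetic instruments:
    # beta_x - SNP effect on the exposure (acne) from the exposure GWAS
    # beta_y - SNP effect on the outcome (e.g., depression), on the log-odds scale
    # se_y   - standard error of the outcome effect
    beta_x = np.array([0.12, 0.08, 0.15, 0.10])
    beta_y = np.array([0.010, -0.004, 0.006, 0.002])
    se_y   = np.array([0.012, 0.010, 0.015, 0.011])

    # Fixed-effect IVW estimate: weighted regression of beta_y on beta_x
    # through the origin, with weights 1/se_y^2.
    w = 1.0 / se_y**2
    beta_ivw = np.sum(w * beta_x * beta_y) / np.sum(w * beta_x**2)
    se_ivw = np.sqrt(1.0 / np.sum(w * beta_x**2))

    # Report as an odds ratio per unit increase in genetic liability to acne.
    lo, hi = beta_ivw - 1.96 * se_ivw, beta_ivw + 1.96 * se_ivw
    print(f"OR = {np.exp(beta_ivw):.3f}, 95% CI = [{np.exp(lo):.3f}, {np.exp(hi):.3f}]")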

    SiRA: Sparse Mixture of Low Rank Adaptation

    Parameter-efficient tuning has been a prominent approach for adapting large language models to downstream tasks. Most previous works consider adding dense trainable parameters, where all parameters are used to adapt a given task. Using LoRA as an example, we find empirically that this is less effective: introducing more trainable parameters does not help. Motivated by this, we investigate the importance of leveraging "sparse" computation and propose SiRA: sparse mixture of low-rank adaptation. SiRA leverages the Sparse Mixture of Experts (SMoE) to boost the performance of LoRA. Specifically, it enforces top-k expert routing with a capacity limit restricting the maximum number of tokens each expert can process. We also propose a novel and simple expert dropout on top of the gating network to reduce overfitting. Through extensive experiments, we verify that SiRA performs better than LoRA and other mixture-of-experts approaches across different single-task and multitask settings.
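
    A minimal sketch of the core mechanism (top-k routing over LoRA experts with a per-expert capacity limit and expert dropout), assuming a PyTorch setting; the module and parameter names below are illustrative and not taken from the paper.

    import torch
    import torch.nn as nn

    class SparseLoRAMixture(nn.Module):
        """Frozen linear layer augmented with a sparse mixture of LoRA experts.

        Each expert is a rank-r LoRA pair (A, B); a gating network routes every
        token to its top-k experts, subject to a per-expert capacity limit.
        """
        def __init__(self, base: nn.Linear, num_experts=4, rank=8, top_k=2,
                     capacity_factor=1.25, expert_dropout=0.1):
            super().__init__()
            self.base = base
            for p in self.base.parameters():
                p.requires_grad = False              # base weights stay frozen
            d_in, d_out = base.in_features, base.out_features
            self.A = nn.Parameter(torch.randn(num_experts, d_in, rank) * 0.01)
            self.B = nn.Parameter(torch.zeros(num_experts, rank, d_out))
            self.gate = nn.Linear(d_in, num_experts)
            self.top_k = top_k
            self.capacity_factor = capacity_factor
            self.expert_dropout = expert_dropout

        def forward(self, x):                        # x: (num_tokens, d_in)
            logits = self.gate(x)                    # (num_tokens, num_experts)
            if self.training and self.expert_dropout > 0:
                # Expert dropout: randomly mask expert logits to reduce overfitting.
                drop = torch.rand_like(logits) < self.expert_dropout
                logits = logits.masked_fill(drop, -1e9)
            weights, idx = logits.softmax(-1).topk(self.top_k, dim=-1)

            num_tokens, num_experts = x.shape[0], self.A.shape[0]
            capacity = int(self.capacity_factor * num_tokens * self.top_k / num_experts)
            out = self.base(x)
            for e in range(num_experts):
                tok, slot = (idx == e).nonzero(as_tuple=True)  # tokens routed to expert e
                tok, slot = tok[:capacity], slot[:capacity]    # drop tokens over capacity
                if tok.numel() == 0:
                    continue
                delta = (x[tok] @ self.A[e]) @ self.B[e]       # LoRA update for these tokens
                out = out.index_add(0, tok, weights[tok, slot].unsqueeze(-1) * delta)
            return out

    # Example: wrap a frozen 512->512 projection and run 16 tokens through it.
    layer = SparseLoRAMixture(nn.Linear(512, 512))
    y = layer(torch.randn(16, 512))                  # (16, 512)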