71 research outputs found

    γ*γ → η, η′ transition form factors

    Using a continuum approach to the hadron bound-state problem, we calculate γ*γ → η, η′ transition form factors on the entire domain of spacelike momenta, for comparison with existing experiments and in anticipation of new precision data from next-generation e⁺e⁻ colliders. One novel feature is a model for the contribution to the Bethe-Salpeter kernel deriving from the non-Abelian anomaly, an element which is crucial for any computation of η, η′ properties. The study also delivers predictions for the amplitudes that describe the light- and strange-quark distributions within the η, η′. Our results compare favourably with available data. Important in this regard at large Q² is a sound understanding of QCD evolution, which has a visible impact on the η′ in particular. Our analysis also provides some insights into the properties of η, η′ mesons and associated observable manifestations of the non-Abelian anomaly. Comment: 16 pages, 7 figures, 3 tables
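    For orientation only, and not drawn from the paper itself: the quantity being computed is commonly defined through the γ*(Q²)γ → P amplitude, and hard-scattering analyses supply a Brodsky-Lepage-type asymptotic limit towards which QCD evolution drives the form factor. In the sketch below, c_P and f_P^eff are placeholder symbols for the charge factor and the η-η′ mixing-dependent combination of decay constants; their precise form depends on the mixing scheme and normalisation conventions adopted.

        % Schematic definitions only; normalisation and mixing conventions vary
        % between analyses, and c_P, f_P^{eff} are placeholders, not the paper's results.
        \begin{align}
          T^{\mu\nu}(k_1,k_2) &= e^2\,\epsilon^{\mu\nu\alpha\beta}\,
              k_{1\alpha}\, k_{2\beta}\, F_{\gamma^*\gamma P}(Q^2),
              \qquad P=\eta,\eta', \\
          \lim_{Q^2\to\infty} Q^2\, F_{\gamma^*\gamma P}(Q^2)
              &\simeq 2\, c_P\, f_P^{\rm eff} .
        \end{align}

    Because the flavour-singlet decay constant runs with scale, the η′ channel approaches its asymptotic limit only under QCD evolution, which is consistent with the abstract's remark that evolution has a visible impact on the η′.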

    Mitigating the Performance Sacrifice in DP-Satisfied Federated Settings through Graph Contrastive Learning

    Currently, graph learning models are indispensable tools that help researchers explore graph-structured data. In academia, the typical way to train a capable graph learning model is to optimize it on a single device using sufficient training data. Due to privacy concerns, however, this is often infeasible in real-world scenarios. Federated learning provides a practical means of addressing this limitation by introducing various privacy-preserving mechanisms, such as differential privacy (DP) on graph edges. However, although DP in federated graph learning can protect the sensitive information represented in graphs, it usually causes the performance of graph learning models to degrade. In this paper, we investigate how DP can be implemented on graph edges and observe a corresponding performance decrease in our experiments. We further note that DP on graph edges introduces noise that perturbs graph proximity, which is one of the graph augmentations used in graph contrastive learning. Inspired by this observation, we propose leveraging graph contrastive learning to alleviate the performance drop caused by DP. Extensive experiments conducted with four representative graph models on five widely used benchmark datasets show that contrastive learning indeed alleviates the models' DP-induced performance drops. Comment: Accepted by Information Sciences
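    As a rough illustration of the idea described in the abstract (edge-level DP noise doubling as a graph augmentation for contrastive learning), the sketch below applies the classic randomized-response mechanism to an adjacency matrix and pairs the perturbed graph with a plain edge-dropout view. Function names, the epsilon handling, and the dropout rate are illustrative assumptions, not the authors' implementation.

        # Hypothetical sketch: randomized-response edge perturbation (an edge-level
        # DP mechanism) treated as one contrastive "view"; not the paper's code.
        import numpy as np

        def randomized_response_edges(adj: np.ndarray, epsilon: float, rng=None) -> np.ndarray:
            """Flip each potential edge independently with probability 1/(1+e^epsilon)."""
            rng = np.random.default_rng() if rng is None else rng
            p_keep = np.exp(epsilon) / (1.0 + np.exp(epsilon))  # prob. of reporting truthfully
            n = adj.shape[0]
            noisy = adj.copy()
            iu = np.triu_indices(n, k=1)                        # work on the upper triangle
            flip = rng.random(len(iu[0])) > p_keep              # entries to invert
            noisy[iu] = np.where(flip, 1 - adj[iu], adj[iu])
            noisy[(iu[1], iu[0])] = noisy[iu]                   # keep the matrix symmetric
            return noisy

        def contrastive_views(adj: np.ndarray, epsilon: float, drop_rate: float = 0.2):
            """Return two augmented views: the DP-perturbed graph and a random edge-drop graph."""
            rng = np.random.default_rng(0)
            view_dp = randomized_response_edges(adj, epsilon, rng)
            mask = rng.random(adj.shape) > drop_rate
            view_drop = adj * (mask & mask.T)                   # symmetric edge dropout
            return view_dp, view_drop

        if __name__ == "__main__":
            A = np.array([[0, 1, 1, 0],
                          [1, 0, 0, 1],
                          [1, 0, 0, 1],
                          [0, 1, 1, 0]])
            v1, v2 = contrastive_views(A, epsilon=1.0)
            print(v1, v2, sep="\n\n")

    In an actual federated pipeline the perturbation would be applied client-side before any graph information is shared, and the two views would feed a contrastive objective such as InfoNCE; those pieces are omitted here.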

    ARF-BP1/Mule Is a Critical Mediator of the ARF Tumor Suppressor

    Summary: Although the importance of the ARF tumor suppressor in p53 regulation is well established, numerous studies indicate that ARF also suppresses cell growth in a p53/Mdm2-independent manner. To understand the mechanism of ARF-mediated tumor suppression, we identified a ubiquitin ligase, ARF-BP1, as a key factor associated with ARF in vivo. ARF-BP1 harbors a signature HECT motif, and its ubiquitin ligase activity is inhibited by ARF. Notably, inactivation of ARF-BP1, but not Mdm2, suppresses the growth of p53-null cells in a manner reminiscent of ARF induction. Surprisingly, in p53 wild-type cells, ARF-BP1 directly binds and ubiquitinates p53, and inactivation of endogenous ARF-BP1 is crucial for ARF-mediated p53 stabilization. Thus, our study modifies the current view of ARF-mediated p53 activation and reveals that ARF-BP1 is a critical mediator of both the p53-independent and p53-dependent tumor suppressor functions of ARF. As such, ARF-BP1 may serve as a potential target for therapeutic intervention in tumors regardless of p53 status.