45 research outputs found

    Excess of Tau events at SND@LHC, FASERν and FASERν2

    During Run III of the LHC, the forward experiments FASERν and SND@LHC will be able to detect the charged current (CC) interactions of high-energy neutrinos of all three flavors produced at the ATLAS interaction point (IP). This opportunity may unravel mysteries of the third-generation leptons. We build three models that can lead to a tau excess at these detectors through the following lepton flavor violating (LFV) beyond Standard Model (SM) processes: (1) $\pi^+ \to \mu^+ \nu_\tau$; (2) $\pi^+ \to \mu^+ \bar{\nu}_\tau$; and (3) $\nu_e + \mathrm{nucleus} \to \tau + X$. We comment on the possibility of solving the $(g-2)_\mu$ anomaly and the τ decay anomalies within these models. We study the potential of the forward experiments to discover the τ excess or to constrain these models in case of no excess. We then compare the reach of the forward experiments with that of previous as well as next-generation experiments such as DUNE. We also discuss how the upgrade of FASERν can distinguish between these models by studying the energy spectrum of the tau.
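The event-rate logic behind such a search can be illustrated with a back-of-the-envelope estimate: an LFV branching ratio for pion decay feeds a ν_τ flux, which is then weighted by the interaction probability and detector efficiency. All numbers and the factorized form below are illustrative placeholders, not values or formulas from the paper.

```python
# Hypothetical estimate of the tau excess at a forward detector, assuming an
# LFV branching ratio BR(pi+ -> mu+ nu_tau). Every number is a placeholder.

def expected_tau_events(n_pion_decays, br_lfv, p_cc_interaction, efficiency):
    """Expected tau CC events from an LFV pion-decay source of nu_tau."""
    return n_pion_decays * br_lfv * p_cc_interaction * efficiency

n_events = expected_tau_events(
    n_pion_decays=1e15,      # pions decaying toward the detector (placeholder)
    br_lfv=1e-5,             # assumed LFV branching ratio (placeholder)
    p_cc_interaction=1e-7,   # nu_tau CC interaction probability (placeholder)
    efficiency=0.5,          # tau identification efficiency (placeholder)
)
print(n_events)  # 500.0
```

Even with these toy inputs, the sketch shows why the observable scales linearly with the assumed LFV branching ratio, so a null result translates directly into an upper bound on it.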

    Estimating the severity of dental and oral problems via sentiment classification over clinical reports

    Analyzing authors' sentiments in text as a technique for identifying text polarity can be practical and useful in various fields, including medicine and dentistry. Currently, due to factors such as patients' limited knowledge of their condition, difficulty accessing specialist doctors, or fear of illness, particularly under pandemic conditions, there can be a delay between receiving a radiology report and consulting a doctor. In some cases this delay poses significant risks to the patient, making timely decision-making crucial. An automatic system that informs patients of a deterioration in their condition by analyzing the text of radiology reports could therefore greatly aid timely decision-making. In this study, a dataset of 1,134 cone-beam computed tomography (CBCT) image reports was collected from Shiraz University of Medical Sciences. Each case was examined, and an expert assigned a severity level for the patient's condition to each document. After preprocessing all the text data, a deep learning model combining Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM) architectures, known as CNN-LSTM, was developed to detect the severity level of the patient's problem via sentiment analysis of the radiologist's report. The model's performance was evaluated on two datasets, with two and four classes respectively, in both imbalanced and balanced scenarios. Finally, to demonstrate the effectiveness of our model, we compared its performance with that of other classification models. The results, along with one-way ANOVA and Tukey's test, indicated that the proposed CNN-LSTM model performed best in terms of precision, recall, and F-measure. This suggests it can serve as a reliable model for estimating the severity of oral and dental diseases, thereby assisting patients.
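The preprocessing step the abstract mentions typically means turning each free-text report into a fixed-length integer sequence before it reaches the CNN-LSTM. The sketch below shows that stage only, with made-up example reports; it is a generic illustration, not the study's actual pipeline or vocabulary.

```python
# Minimal sketch of text preprocessing for a CNN-LSTM classifier: tokenize
# each report, build a vocabulary, and pad/truncate to a fixed length.
# The example reports and max_len are illustrative, not from the dataset.

def build_vocab(texts):
    vocab = {"<pad>": 0, "<unk>": 1}
    for text in texts:
        for token in text.lower().split():
            vocab.setdefault(token, len(vocab))
    return vocab

def encode(text, vocab, max_len):
    ids = [vocab.get(tok, vocab["<unk>"]) for tok in text.lower().split()]
    return (ids + [vocab["<pad>"]] * max_len)[:max_len]  # pad or truncate

reports = ["severe bone loss around molar", "no abnormality detected"]
vocab = build_vocab(reports)
encoded = [encode(r, vocab, max_len=6) for r in reports]
print(encoded[0])  # [2, 3, 4, 5, 6, 0]
print(encoded[1])  # [7, 8, 9, 0, 0, 0]
```

The resulting padded sequences are what an embedding layer, followed by convolutional and LSTM layers, would consume in such an architecture.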

    Imprint of massive neutrinos on Persistent Homology of large-scale structure

    Exploiting the persistent homology technique, together with a complementary representation that lets us construct a synergistic pipeline for the different topological features quantified by Betti curves, to reduce the degeneracy between cosmological parameters, we investigate the footprint of the summed massive-neutrino mass ($M_\nu$) in different density fields simulated by the publicly available Quijote suite. The evolution of topological features under a super-level filtration of the three-dimensional density fields reveals remarkable indicators for constraining $M_\nu$ and $\sigma_8$. The abundance of 2-holes is more sensitive to the presence of $M_\nu$; moreover, the persistence of topological features, rather than their birth thresholds, plays a crucial role in cosmological inference and in reducing the degeneracy associated with the $M_\nu$ simulations, whether the total matter density ($m$) field or the part including only cold dark matter + baryons ($cb$) is utilized. Incorporating Betti-1 and Betti-2 for the $cb$ part of the $M^+_\nu$ simulation, marginalized over the thresholds, implies a 5% variation compared to the massless-neutrino simulation. The constraints on $M_\nu$ from $\beta_k$, and from its joint analysis with the birth threshold and persistence of topological features, for the total mass density field smoothed with $R = 5$ Mpc $h^{-1}$ at zero redshift, reach 0.0172 eV and 0.0152 eV at the $1\sigma$ confidence level, respectively. Comment: 12 pages, 8 figures, and one table; comments are welcome.
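The super-level filtration idea can be made concrete for the simplest topological feature, Betti-0 (connected components): for each density threshold, count the connected components of the cells lying above it, and the resulting counts trace out a Betti curve. The toy 2D field below is purely illustrative; the paper works with 3D Quijote fields and also tracks Betti-1 (loops) and Betti-2 (2-holes).

```python
# Illustrative Betti-0 curve under a super-level filtration: count
# 4-connected components of grid cells whose density exceeds a threshold.
# Toy 2D field only; not Quijote data, and higher Betti numbers need a
# proper persistent homology library rather than this flood fill.

def betti0_superlevel(field, threshold):
    """Count 4-connected components of {cells with value > threshold}."""
    rows, cols = len(field), len(field[0])
    seen = set()
    components = 0
    for i in range(rows):
        for j in range(cols):
            if field[i][j] > threshold and (i, j) not in seen:
                components += 1
                stack = [(i, j)]  # flood-fill this component
                while stack:
                    r, c = stack.pop()
                    if (r, c) in seen:
                        continue
                    seen.add((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nr, nc = r + dr, c + dc
                        if 0 <= nr < rows and 0 <= nc < cols \
                                and field[nr][nc] > threshold:
                            stack.append((nr, nc))
    return components

toy_field = [[0.1, 0.9, 0.1],
             [0.1, 0.1, 0.1],
             [0.8, 0.1, 0.7]]
curve = [betti0_superlevel(toy_field, t) for t in (0.0, 0.5)]
print(curve)  # [1, 3]: one component at a low cut, three isolated peaks above 0.5
```

Sweeping the threshold from high to low and recording when components appear and merge is exactly what a persistence computation formalizes as birth and death thresholds.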

    AI-based Radio and Computing Resource Allocation and Path Planning in NOMA NTNs: AoI Minimization under CSI Uncertainty

    In this paper, we develop a hierarchical aerial computing framework composed of a high-altitude platform (HAP) and unmanned aerial vehicles (UAVs) to compute the fully offloaded tasks of terrestrial mobile users, which are connected through uplink non-orthogonal multiple access (UL-NOMA). To better assess the freshness of information in computation-intensive applications, the age of information (AoI) criterion is considered. In particular, the problem is formulated to minimize the average AoI of users with elastic tasks by adjusting the UAVs' trajectories and the resource allocation on both the UAVs and the HAP, subject to channel state information (CSI) uncertainty and multiple resource constraints of the UAVs and HAP. To solve this non-convex optimization problem, two methods, multi-agent deep deterministic policy gradient (MADDPG) and federated reinforcement learning (FRL), are proposed to design the UAVs' trajectories and obtain the channel, power, and CPU allocations. It is shown that task scheduling significantly reduces the average AoI, and this improvement is more pronounced for larger task sizes. Power allocation, by contrast, has only a marginal effect on the average AoI compared to using full transmission power for all users. Compared with traditional transmission schemes, the simulation results show that our scheduling scheme yields a substantial improvement in average AoI.
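The AoI metric being minimized has a simple discrete-time reading: the age of the freshest delivered result grows by one each step and resets when a new result arrives. The toy simulation below illustrates only that definition, with hypothetical delivery schedules; it does not model the paper's NOMA channel, trajectories, or learning methods.

```python
# Toy illustration of age of information (AoI): age grows each time step
# and resets to zero on each delivery. Delivery times are hypothetical.

def average_aoi(horizon, delivery_times):
    """Mean AoI over `horizon` steps; age resets to 0 on each delivery."""
    deliveries = set(delivery_times)
    age, total = 0, 0
    for t in range(horizon):
        age = 0 if t in deliveries else age + 1
        total += age
    return total / horizon

# Evenly spaced deliveries keep the average age low ...
print(average_aoi(12, {3, 6, 9}))   # 1.25
# ... while bursty scheduling with a long silent stretch raises it,
# even though both schedules deliver three updates.
print(average_aoi(12, {1, 2, 3}))   # ~3.08
```

This is why the scheduling policy, rather than raw throughput, dominates the average AoI, consistent with the abstract's observation that task scheduling matters more than power allocation.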

    Dynamic Fairness-Aware Spectrum Auction for Enhanced Licensed Shared Access in 6G Networks

    This article introduces a new approach to the spectrum scarcity challenge in 6G networks by implementing the enhanced licensed shared access (ELSA) framework. Our proposed auction mechanism aims to ensure fairness in spectrum allocation to mobile network operators (MNOs) through a novel weighted auction called the fair Vickrey-Clarke-Groves (FVCG) mechanism. Through comparison with traditional methods, the study demonstrates that the proposed auction method improves fairness significantly. We suggest using spectrum sensing and integrating UAV-based networks to enhance the efficiency of the LSA system. This research employs two methods to solve the problem: we first propose a novel greedy algorithm, the market-share-based weighted greedy algorithm (MSWGA), to achieve better fairness than traditional auction methods; as the second approach, we exploit deep reinforcement learning (DRL) algorithms to optimize the auction policy and demonstrate its superiority over the other methods. Simulation results show that the deep deterministic policy gradient (DDPG) method outperforms the soft actor-critic (SAC), MSWGA, and greedy methods. Moreover, a significant improvement in the fairness index is observed compared to the traditional greedy auction methods, as high as about 27% and 35% when deploying the MSWGA and DDPG methods, respectively. Comment: 13 pages, 11 figures.
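A standard way to quantify the fairness improvements reported above is Jain's fairness index, which maps an allocation vector to a score between 1/n (one bidder takes everything) and 1.0 (perfectly equal shares). The MNO spectrum shares below are illustrative; whether the paper uses this exact index, and its FVCG weighting and DRL policies, are not modeled here.

```python
# Sketch of Jain's fairness index for spectrum shares across n MNOs.
# Example allocations are made up; this is not the paper's simulation.

def jains_index(allocations):
    """Jain's index: 1/n for a totally unfair split, 1.0 for an equal one."""
    n = len(allocations)
    total = sum(allocations)
    return total * total / (n * sum(x * x for x in allocations))

print(jains_index([25, 25, 25, 25]))  # 1.0  -> perfectly fair split
print(jains_index([100, 0, 0, 0]))    # 0.25 -> one MNO takes everything
```

An auction mechanism that weights bids by market share, as MSWGA does, would aim to push this index toward 1.0 relative to an unweighted greedy allocation.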