Excess of Tau events at SND@LHC, FASER and FASER2
During Run III of the LHC, the forward experiments FASER and SND@LHC will be able to detect the Charged Current (CC) interactions of high-energy neutrinos of all three flavors produced at the ATLAS Interaction Point (IP).
This opportunity may unravel mysteries of the third generation leptons. We
build three models that can lead to a tau excess at these detectors through the
following Lepton Flavor Violating (LFV) beyond Standard Model (SM) processes:
(1) …; (2) …; and (3) …. We comment on the possibility of solving the … anomaly and the … decay anomalies within these models. We
study the potential of the forward experiments to discover the excess or
to constrain these models in case of no excess. We then compare the reach of
the forward experiments with that of the previous as well as next generation
experiments such as DUNE. We also discuss how the upgrade of FASER can
distinguish between these models by studying the energy spectrum of the tau events.
Estimating the severity of dental and oral problems via sentiment classification over clinical reports
Analyzing an author's sentiment in a text to identify its polarity is a practical and useful technique in various fields, including medicine and dentistry. Currently, due to factors such as patients' limited knowledge about
their condition, difficulties in accessing specialist doctors, or fear of
illness, particularly in pandemic conditions, there might be a delay between
receiving a radiology report and consulting a doctor. In some cases, this delay
can pose significant risks to the patient, making timely decision-making
crucial. Having an automatic system that can inform patients about the
deterioration of their condition by analyzing the text of radiology reports
could greatly impact timely decision-making. In this study, a dataset
comprising 1,134 cone-beam computed tomography (CBCT) radiology reports was
collected from the Shiraz University of Medical Sciences. Each case was
examined, and an expert labeled a severity level for the patient's condition on
each document. After preprocessing all the text data, a deep learning model
based on Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM)
network architecture, known as CNN-LSTM, was developed to detect the severity
level of the patient's problem based on sentiment analysis in the radiologist's
report. The model's performance was evaluated on two datasets, each with two
and four classes, in both imbalanced and balanced scenarios. Finally, to
demonstrate the effectiveness of our model, we compared its performance with
that of other classification models. The results, along with one-way ANOVA and
Tukey's test, indicated that our proposed model (CNN-LSTM) performed the best
according to the precision, recall, and F-measure criteria. This suggests that it can be a reliable model for estimating the severity of oral and dental diseases, thereby assisting patients in timely decision-making.
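As a rough illustration of the kind of architecture described above, the sketch below feeds a 1-D convolution into an LSTM for multi-class severity classification. It assumes a Keras environment, and the vocabulary size, sequence length, and layer widths are illustrative placeholders rather than the paper's actual hyperparameters.

```python
# A minimal CNN-LSTM text classifier sketch, assuming a Keras environment.
# VOCAB_SIZE, NUM_CLASSES, and layer widths are assumed values.
import numpy as np
from tensorflow.keras import layers, models

VOCAB_SIZE = 20_000   # assumed tokenizer vocabulary size
NUM_CLASSES = 4       # the four-class severity setting

model = models.Sequential([
    layers.Embedding(VOCAB_SIZE, 128),                    # token embeddings
    layers.Conv1D(64, kernel_size=5, activation="relu"),  # local n-gram features
    layers.MaxPooling1D(pool_size=2),
    layers.LSTM(64),                                      # long-range context
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Usage: x is an integer-encoded, padded batch of reports; y holds labels 0..3.
x = np.random.randint(1, VOCAB_SIZE, size=(8, 400))
y = np.random.randint(0, NUM_CLASSES, size=(8,))
model.fit(x, y, epochs=1, verbose=0)
```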
Imprint of massive neutrinos on Persistent Homology of large-scale structure
Exploiting the persistent homology technique and an associated complementary representation, which enable us to construct a synergistic pipeline in which different topological features quantified by Betti curves reduce the degeneracy between cosmological parameters, we investigate the footprint of summed massive neutrinos in different density fields simulated by the publicly available Quijote suite. The evolution of topological features under a super-level filtration on three-dimensional density fields reveals remarkable indicators for constraining the summed neutrino mass and other cosmological parameters. The abundance of 2-holes is more sensitive to the presence of massive neutrinos, and the persistence of topological features plays a more crucial role than their birth thresholds in cosmological inference and in reducing the degeneracy associated with the simulations, whether the total matter density field or the part including only cold dark matter+baryons is utilized. Incorporating the Betti-1 and Betti-2 curves for the … part of the simulation, marginalized over the thresholds, implies a … variation compared to the massless-neutrino simulation. The constraint on the summed neutrino mass from …, and from its joint analysis with the birth thresholds and persistence of topological features, for the total mass density field smoothed on a … Mpc/h scale at zero redshift, reaches … eV and … eV, respectively, at the … confidence interval.
Comment: 12 pages, 8 figures, and one table; comments are welcome.
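For readers unfamiliar with the pipeline, the sketch below shows one way to compute super-level persistent homology and Betti curves of a 3-D density field with GUDHI's cubical complexes; the log-normal random field, grid size, and threshold grid are stand-ins for the Quijote data, and negating the field converts GUDHI's sub-level filtration into the super-level one used here.

```python
# Super-level persistent homology of a 3-D density field via GUDHI cubical
# complexes. The random field is a placeholder for a Quijote snapshot.
import numpy as np
import gudhi

rho = np.random.lognormal(size=(32, 32, 32))        # placeholder density field
cc = gudhi.CubicalComplex(top_dimensional_cells=-rho)
diag = cc.persistence()                             # [(dim, (birth, death)), ...]

def betti_curve(diag, dim, thresholds):
    """Count dim-dimensional features alive at each (negated) threshold."""
    bars = [(b, d) for q, (b, d) in diag if q == dim]
    return [sum(b <= t < d for b, d in bars) for t in thresholds]

ts = np.linspace((-rho).min(), (-rho).max(), 100)
betti1 = betti_curve(diag, 1, ts)   # loops (1-holes)
betti2 = betti_curve(diag, 2, ts)   # voids (2-holes)
```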
AI-based Radio and Computing Resource Allocation and Path Planning in NOMA NTNs: AoI Minimization under CSI Uncertainty
In this paper, we develop a hierarchical aerial computing framework composed of a high-altitude platform (HAP) and unmanned aerial vehicles (UAVs) to compute the fully offloaded tasks of terrestrial mobile users, which are connected through uplink non-orthogonal multiple access (UL-NOMA). To better assess the freshness of information in computation-intensive applications, the criterion of age of information (AoI) is adopted. In particular, the problem is formulated to minimize the average AoI of users with elastic tasks by adjusting the UAVs' trajectories and the resource allocation on both the UAVs and the HAP, subject to channel state information (CSI) uncertainty and multiple resource constraints on the UAVs and HAP. To solve this non-convex optimization problem, two methods, multi-agent deep deterministic policy gradient (MADDPG) and federated reinforcement learning (FRL), are proposed to design the UAVs' trajectories and obtain the channel, power, and CPU allocations. It is shown that task scheduling significantly reduces the average AoI, and this improvement is more pronounced for larger task sizes. It is also shown that power allocation has only a marginal effect on the average AoI compared to using full transmission power for all users. Compared with traditional transmission schemes, the simulation results show that our scheduling scheme yields a substantial improvement in the average AoI.
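To make the objective concrete, the toy loop below tracks per-user age of information: each user's age grows every slot and resets to the service delay when an offloaded task finishes. The completion rate and delay range are invented for illustration; in the actual system they would follow from the UAV/HAP scheduling decisions and CSI realizations.

```python
# Toy bookkeeping for the age-of-information (AoI) objective.
# Completion rate and delay distribution are assumed values.
import numpy as np

rng = np.random.default_rng(0)
T, n_users = 1_000, 8
age = np.zeros(n_users)
aoi_trace = []

for _ in range(T):
    age += 1.0                               # ages grow one slot per step
    done = rng.random(n_users) < 0.05        # assumed task-completion events
    delay = rng.uniform(1.0, 5.0, n_users)   # assumed offload+compute delay
    age[done] = delay[done]                  # a fresh update resets the age
    aoi_trace.append(age.mean())

print(f"time-averaged AoI: {np.mean(aoi_trace):.2f} slots")
```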
Dynamic Fairness-Aware Spectrum Auction for Enhanced Licensed Shared Access in 6G Networks
This article introduces a new approach to address the spectrum scarcity
challenge in 6G networks by implementing the enhanced licensed shared access
(ELSA) framework. Our proposed auction mechanism aims to ensure fairness in
spectrum allocation to mobile network operators (MNOs) through a novel weighted
auction called the fair Vickery-Clarke-Groves (FVCG) mechanism. Through
comparison with traditional methods, the study demonstrates that the proposed
auction method improves fairness significantly. We suggest using spectrum sensing and integrating UAV-based networks to enhance the efficiency of the LSA system. This research employs two methods to solve the problem. First, we propose a novel greedy algorithm, the market-share-based weighted greedy algorithm (MSWGA), to achieve better fairness than traditional auction methods; second, we exploit deep reinforcement learning (DRL) algorithms to optimize the auction policy and demonstrate its superiority over the other methods. Simulation results show that the deep
deterministic policy gradient (DDPG) method outperforms the soft actor-critic (SAC), MSWGA, and greedy methods. Moreover, a significant improvement in the fairness index is observed compared to traditional greedy auction methods; this improvement is as high as about 27% and 35% when deploying the MSWGA and DDPG methods, respectively.
Comment: 13 pages, 11 figures
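A minimal sketch of a market-share-weighted greedy allocation in the spirit of MSWGA appears below; the weighting rule (bid divided by current market share) and the toy operators, blocks, and bids are assumptions for illustration, not the paper's exact mechanism.

```python
# Market-share-weighted greedy allocation in the spirit of MSWGA.
# The weighting rule and all data here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Bid:
    mno: str      # bidding mobile network operator
    block: int    # spectrum block index
    price: float  # offered price

market_share = {"MNO-A": 0.5, "MNO-B": 0.3, "MNO-C": 0.2}   # assumed shares
bids = [Bid("MNO-A", 0, 10.0), Bid("MNO-B", 0, 8.0),
        Bid("MNO-C", 1, 5.0), Bid("MNO-A", 1, 9.0)]

# Rank by fairness-weighted price: a lower market share boosts the bid.
bids.sort(key=lambda b: b.price / market_share[b.mno], reverse=True)

allocated, winners = set(), {}
for b in bids:
    if b.block not in allocated:      # each block is sold at most once
        allocated.add(b.block)
        winners[b.block] = (b.mno, b.price)

print(winners)   # {0: ('MNO-B', 8.0), 1: ('MNO-C', 5.0)}
```

Dividing each bid by the bidder's market share is one simple way to let smaller operators win blocks they would lose in a pure highest-bid greedy auction, which is the fairness effect the abstract reports.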