Radiative Neutrino Mass with Dark matter: From Relic Density to LHC Signatures
In this work we present a comprehensive analysis of the phenomenology of a
specific dark matter (DM) model in which the neutrino mass is induced at two
loops by interactions with a DM particle that can be either a complex scalar
or a Dirac fermion. We examine in detail both the DM properties (relic density
and direct detection) and the LHC signatures, and briefly discuss indirect
detection of the gamma-ray excess from the Galactic Center. On the DM side,
both semi-annihilation and co-annihilation processes play a crucial role in
alleviating the tension between the parameter space preferred by the relic
density and that allowed by direct detection. On the collider side, decay
channels opened by the new particles lead to distinct signals at the LHC.
Currently the trilepton signal is expected to give the most stringent bound
for both the scalar and fermion DM candidates, and the signatures of fermion
DM closely resemble those of electroweakinos in simplified supersymmetric
models.
Comment: 40 pages, 24 figures
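The role of semi-annihilation in the relic density mentioned above can be made concrete with the standard Boltzmann equation for a semi-annihilating species (for a process of the type $\chi\chi \to \chi\phi$); this is the textbook form, not necessarily the exact notation of the paper:

```latex
\frac{\mathrm{d}n}{\mathrm{d}t} + 3Hn
  = -\langle\sigma v\rangle_{\mathrm{ann}}\left(n^{2} - n_{\mathrm{eq}}^{2}\right)
    - \frac{1}{2}\,\langle\sigma v\rangle_{\mathrm{semi}}\, n\left(n - n_{\mathrm{eq}}\right)
```

Because the semi-annihilation term is linear rather than quadratic in the departure from equilibrium, it can deplete the relic abundance without requiring the large couplings that would be in tension with direct-detection limits.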
The contribution of Alu exons to the human proteome.
Background: Alu elements are major contributors to lineage-specific new exons in primate and human genomes. Recent studies indicate that some Alu exons have high transcript inclusion levels or tissue-specific splicing profiles, and may play important regulatory roles in modulating mRNA degradation or translational efficiency. However, the contribution of Alu exons to the human proteome remains unclear and controversial. The prevailing view is that exons derived from young repetitive elements, such as Alu elements, are restricted to regulatory functions and have not had adequate evolutionary time to be incorporated into stable, functional proteins.
Results: We adopt a proteotranscriptomics approach to systematically assess the contribution of Alu exons to the human proteome. Using RNA sequencing, ribosome profiling, and proteomics data from human tissues and cell lines, we provide evidence for the translational activities of Alu exons and the presence of Alu exon-derived peptides in human proteins. These Alu exon peptides represent species-specific protein differences between primates and other mammals, and in certain instances between humans and closely related primates. In the case of the RNA editing enzyme ADARB1, which contains an Alu exon peptide in its catalytic domain, RNA sequencing analyses of A-to-I editing demonstrate that both the Alu exon skipping and inclusion isoforms encode active enzymes. The Alu exon-derived peptide may fine-tune the overall editing activity and, in limited cases, the site selectivity of ADARB1 protein products.
Conclusions: Our data indicate that Alu elements have contributed to the acquisition of novel protein sequences during primate and human evolution.
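The "transcript inclusion levels" discussed above are conventionally quantified as percent spliced in (PSI) from junction-spanning RNA-seq reads. The following is a minimal sketch of that standard formula, an assumption for illustration rather than the authors' exact pipeline:

```python
# Hypothetical sketch: percent spliced in (PSI) for a cassette exon,
# computed from RNA-seq junction read counts. This simple ratio is a
# common definition, not necessarily the paper's exact method.

def psi(inclusion_reads: int, skipping_reads: int) -> float:
    """PSI = inclusion / (inclusion + skipping); a value in [0, 1]."""
    total = inclusion_reads + skipping_reads
    if total == 0:
        raise ValueError("no junction reads covering the exon")
    return inclusion_reads / total

# An Alu exon supported by 30 inclusion-junction reads and 70
# skipping-junction reads is included in 30% of transcripts.
print(psi(30, 70))  # 0.3
```

A high-PSI Alu exon is a candidate for contributing peptides to the proteome, which is what the ribosome profiling and proteomics data are then used to confirm.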
Requirements-driven self-repairing against environmental failures
Self-repairing approaches have been proposed to alleviate the runtime requirements-satisfaction problem by switching to appropriate alternative solutions according to monitored feedback. However, little formal work has been done on analyzing the relations between specific environmental failures and the corresponding repairing decisions, making it a challenge to derive a set of alternative solutions that can withstand possible environmental failures at runtime. To address this challenge, we propose a requirements-driven self-repairing approach against environmental failures, which combines development-time and runtime techniques. At the development phase, in a stepwise manner, we formally analyze the issue of self-repairing against environmental failures with the support of model checking, and then design a sufficient and necessary set of alternative solutions to withstand possible environmental failures. The runtime part is a self-repairing mechanism that monitors the operating environment for unsatisfiable situations and makes self-repairing decisions among the alternative solutions in response to the detected environmental failures.
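The runtime mechanism described above (monitor, detect an environmental failure, switch among pre-derived alternative solutions) can be sketched as follows; all names (`Solution`, `tolerates`, the failure labels) are illustrative assumptions, not the paper's notation:

```python
# Minimal sketch of a runtime self-repairing decision: keep the current
# solution if it withstands the detected environmental failure, else
# switch to the first alternative that does. Illustrative only.
from dataclasses import dataclass, field

@dataclass
class Solution:
    name: str
    tolerated_failures: set = field(default_factory=set)

    def tolerates(self, failure: str) -> bool:
        return failure in self.tolerated_failures

def repair(current: Solution, alternatives: list, failure: str) -> Solution:
    """Return the repairing decision for a detected environmental failure."""
    if current.tolerates(failure):
        return current  # current solution still satisfies requirements
    for alt in alternatives:
        if alt.tolerates(failure):
            return alt  # switch to an alternative that withstands it
    raise RuntimeError(f"no alternative withstands failure: {failure}")

plan_a = Solution("plan_a", {"sensor_noise"})
plan_b = Solution("plan_b", {"sensor_noise", "network_loss"})
print(repair(plan_a, [plan_b], "network_loss").name)  # plan_b
```

The development-time contribution of the paper is precisely that the alternative set passed to such a mechanism is proved sufficient and necessary via model checking, so the `RuntimeError` branch should be unreachable for the anticipated failures.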
RIDCP: Revitalizing Real Image Dehazing via High-Quality Codebook Priors
Existing dehazing approaches struggle to process real-world hazy images owing
to the lack of paired real data and robust priors. In this work, we present a
new paradigm for real image dehazing from the perspectives of synthesizing more
realistic hazy data and introducing more robust priors into the network.
Specifically, (1) instead of adopting the de facto physical scattering model,
we rethink the degradation of real hazy images and propose a phenomenological
pipeline considering diverse degradation types. (2) We propose a Real Image
Dehazing network via high-quality Codebook Priors (RIDCP). First, a VQGAN is
pre-trained on a large-scale high-quality dataset to obtain a discrete
codebook encapsulating high-quality priors (HQPs). After the negative effects
brought by haze are replaced with HQPs, the decoder, equipped with a novel
normalized feature alignment module, can effectively utilize the high-quality
features and produce clean results. However, although our degradation pipeline
drastically mitigates the domain gap between synthetic and real data, the gap
cannot be eliminated entirely, which makes HQP matching in the wild
challenging. Thus, we re-calculate the feature-to-HQP matching distance via a
controllable matching operation, which facilitates finding better
counterparts. We provide a recommended way to control the matching based on an
explainable solution, and users can also flexibly adjust the enhancement
degree to their preference. Extensive experiments verify the effectiveness of
our data synthesis pipeline and the superior performance of RIDCP in real
image dehazing.
Comment: Accepted by CVPR 202
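The controllable matching operation described above, nearest-neighbor lookup against a discrete codebook with an adjustable re-weighting of the distance, can be sketched as below. The re-weighting rule (subtracting a scaled per-code prior from the squared distance) is an illustrative assumption, not the paper's exact formula:

```python
# Hedged sketch of controllable codebook matching: match encoder
# features to discrete code entries (HQPs) by squared L2 distance,
# re-weighted by a user-controlled factor alpha. alpha = 0 recovers
# plain nearest-neighbor vector quantization. The prior and the
# re-weighting form are assumptions for illustration.
import numpy as np

def match_codebook(features, codebook, code_weight, alpha=0.0):
    """features: (n, d); codebook: (k, d); code_weight: (k,) per-code
    prior; returns the index of the matched code entry per feature."""
    # squared L2 distance from every feature to every code entry
    d2 = ((features[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    # controllable re-weighting: bias the match toward high-prior codes
    adjusted = d2 - alpha * code_weight[None, :]
    return adjusted.argmin(axis=1)

rng = np.random.default_rng(0)
feats = rng.normal(size=(4, 8))    # 4 encoder features, dim 8
codes = rng.normal(size=(16, 8))   # codebook of 16 entries
prior = rng.random(16)             # assumed per-code quality prior
idx = match_codebook(feats, codes, prior, alpha=0.5)
print(idx.shape)  # (4,)
```

Raising `alpha` steers matches toward preferred code entries, which mirrors how a user could adjust the enhancement degree at inference time without retraining.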