Quasi-Topological Ricci Polynomial Gravities
Quasi-topological terms in gravity can be viewed as those that give no
contribution to the equations of motion for a special subclass of metric
ansätze. They therefore play no rôle in constructing these solutions, but
can affect the general perturbations. We consider Einstein gravity extended
with Ricci tensor polynomial invariants, which admits Einstein metrics with
appropriate effective cosmological constants as its vacuum solutions. We
construct three types of quasi-topological gravities. The first type is for the
most general static metrics with spherical, toroidal or hyperbolic isometries.
The second type is for the special static metrics where $g_{tt} g_{rr}$ is
constant. The third type is the linearized quasi-topological gravities on the
Einstein metrics. We construct and classify results that are either dependent
on or independent of dimensions, up to the tenth order. We then consider a
subset of these three types and obtain Lovelock-like quasi-topological
gravities that are independent of the dimensions. The linearized gravities on
Einstein metrics in all dimensions are simply Einstein and hence ghost free.
The theories become quasi-topological on static metrics in one specific
dimension, but non-trivial in others. We also focus on the quasi-topological
Ricci cubic invariant in four dimensions as a specific example to study its
effect on holography, including shear viscosity, thermoelectric DC
conductivities and butterfly velocity. In particular, we find that the
holographic diffusivity bounds can be violated by the quasi-topological terms,
which can induce an extra massive mode that yields a butterfly velocity
unbounded from above.
Comment: Latex, 56 pages, discussion on shear viscosity revised
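As an illustrative sketch (generic coefficients and invariants, not the paper's specific construction), an extension of Einstein gravity by Ricci tensor polynomial invariants takes the schematic form

```latex
\mathcal{L} = \sqrt{-g}\,\Big( R - 2\Lambda_0 + \sum_{k\ge 2}\sum_{i} c_{k,i}\, P^{(k)}_{i}[R_{\mu\nu}] \Big),
```

where each $P^{(k)}_{i}[R_{\mu\nu}]$ is an independent degree-$k$ polynomial invariant built from the Ricci tensor, e.g. at cubic order $R^3$, $R\,R_{\mu\nu}R^{\mu\nu}$, and $R_{\mu}{}^{\nu}R_{\nu}{}^{\rho}R_{\rho}{}^{\mu}$. Quasi-topological choices of the $c_{k,i}$ are those for which these terms drop out of the equations of motion on the relevant metric ansatz.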
The Progress, Challenges, and Perspectives of Directed Greybox Fuzzing
Most greybox fuzzing tools are coverage-guided, as code coverage is strongly
correlated with bug coverage. However, since most covered code does not contain
bugs, blindly extending code coverage is inefficient, especially for corner
cases. Unlike coverage-guided greybox fuzzers, which extend code coverage in an
undirected manner, a directed greybox fuzzer spends most of its time budget
on reaching specific targets (e.g., bug-prone zones) without wasting
resources stressing unrelated parts. Thus, directed greybox fuzzing (DGF) is
particularly suitable for scenarios such as patch testing, bug reproduction,
and specialist bug hunting. This paper studies DGF from a broader view, which
takes into account not only the location-directed type that targets specific
code parts, but also the behaviour-directed type that aims to expose abnormal
program behaviours. Herein, the first in-depth study of DGF is made based on
the investigation of 32 state-of-the-art fuzzers (78% were published after
2019) that are closely related to DGF. A thorough assessment of the collected
tools is conducted so as to systematise recent progress in this field. Finally,
the paper summarises the challenges and provides perspectives for future research.
Comment: 16 pages, 4 figures
The flavor-changing rare top decays in topcolor-assisted technicolor theory
In the framework of topcolor-assisted technicolor (TC2) theory, we calculate
the contributions of the scalars (the neutral top-pion $\pi_t^0$ and the
top-Higgs $h_t^0$) to the flavor-changing rare top decays $t \to cV$
($V = \gamma$, $g$, or $Z$). Our results show that these scalars can enhance
the standard-model branching ratios by several orders of magnitude for most of
the parameter space. A resonance peak of the branching ratio emerges when the
top-Higgs mass lies within a narrow range, in which the branching ratio
reaches its maximum.
Comment: Latex file, 11 pages, 2 eps figures
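For context, and as a standard convention rather than anything stated in this abstract, rare top-decay branching ratios are usually normalised to the dominant mode $t \to bW$:

```latex
\mathrm{Br}(t \to cV) \;\simeq\; \frac{\Gamma(t \to cV)}{\Gamma(t \to bW)},
```

so an enhancement "by several orders of magnitude" refers to this ratio relative to its standard-model value.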
EDEN: A Plug-in Equivariant Distance Encoding to Beyond the 1-WL Test
The message-passing scheme is the core of graph representation learning.
While most existing message-passing graph neural networks (MPNNs) are
permutation-invariant in graph-level representation learning and
permutation-equivariant in node- and edge-level representation learning, their
expressive power is commonly limited by the 1-Weisfeiler-Lehman (1-WL) graph
isomorphism test. Recently proposed expressive graph neural networks (GNNs)
with specially designed complex message-passing mechanisms are not practical.
To bridge the gap, we propose a plug-in Equivariant Distance ENcoding (EDEN)
for MPNNs. EDEN is derived from a series of interpretable transformations on
the graph's distance matrix. We theoretically prove that EDEN is
permutation-equivariant for all level graph representation learning, and we
empirically illustrate that EDEN's expressive power can reach up to the 3-WL
test. Extensive experiments on real-world datasets show that combining EDEN
with conventional GNNs surpasses recent advanced GNNs.
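To illustrate the idea behind distance encodings (a minimal stand-in, not EDEN's actual series of transformations: the one-hot clipped-distance channels below are a hypothetical choice), the sketch checks that any encoding built purely from the graph's distance matrix is permutation-equivariant: relabeling the nodes permutes the encoding the same way.

```python
import numpy as np
from collections import deque

def distance_matrix(adj):
    """All-pairs shortest-path distances via BFS on an unweighted graph."""
    n = len(adj)
    D = np.full((n, n), np.inf)
    for s in range(n):
        D[s, s] = 0
        q = deque([s])
        while q:
            u = q.popleft()
            for v in range(n):
                if adj[u, v] and D[s, v] == np.inf:
                    D[s, v] = D[s, u] + 1
                    q.append(v)
    return D

def distance_encoding(adj, max_dist=3):
    """Hypothetical distance encoding: one-hot channels of clipped distances.
    EDEN derives its encoding from a series of transformations of the distance
    matrix; any such function inherits the equivariance checked below."""
    D = np.minimum(distance_matrix(adj), max_dist)
    return np.stack([(D == k).astype(float) for k in range(max_dist + 1)], axis=-1)

# Equivariance check on a path graph: encoding(P A P^T) == P encoding(A) P^T
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
P = np.eye(4)[[2, 0, 3, 1]]  # a permutation matrix
lhs = distance_encoding(P @ A @ P.T)
rhs = np.einsum('ij,jkc,lk->ilc', P, distance_encoding(A), P)
assert np.allclose(lhs, rhs)
```

Because the distance matrix transforms as $D \mapsto P D P^{\top}$ under node relabeling, every entrywise transformation of it is automatically equivariant, which is the property the abstract claims for all levels of representation learning.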