arXiv.org e-Print Archive
Spread-out percolation on transitive graphs of polynomial growth
Let $G$ be a vertex-transitive graph of superlinear polynomial growth. Given $r \ge 1$, let $G_r$ be the graph on the same vertex set as $G$, with two vertices joined by an edge if and only if they are at graph distance at most $r$ apart in $G$. We show that the critical probability for Bernoulli bond percolation on $G_r$ satisfies $p_c(G_r) \sim 1/\deg(G_r)$ as $r \to \infty$. This extends work of Penrose and Bollob\'as-Janson-Riordan, who considered the case $G = \mathbb{Z}^d$.
Our result provides an important ingredient in parallel work of Georgakopoulos in which he introduces a new notion of dimension in groups. It also verifies a special case of a conjecture of Easo and Hutchcroft.
Comment: 35 pages
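As a reading aid, the setup can be written out explicitly; the notation $G$, $r$, $G_r$ below and the identification of the degree with the ball volume are standard conventions chosen for illustration, not quotations from the paper.
% Spread-out graph and the spread-out asymptotic (illustrative notation)
\[
  V(G_r) = V(G), \qquad u \sim_{G_r} v \iff 0 < d_G(u,v) \le r ,
\]
so that $G_r$ is regular of degree $\deg(G_r) = |B_G(o,r)| - 1$ for any fixed root $o$, and superlinear polynomial growth means $|B_G(o,r)| \asymp r^d$ for some integer $d \ge 2$. In this notation the stated asymptotic is equivalent to
\[
  p_c(G_r)\,\deg(G_r) \longrightarrow 1 \qquad \text{as } r \to \infty .
\]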
Schertz style class invariants for higher degree CM fields
Special values of Siegel modular functions generate class fields of CM fields. They also yield abelian varieties with a known endomorphism ring. Smaller alternative values of modular functions that lie in the same class fields (class invariants) thus help to speed up the computation of those mathematical objects.
We show that modular functions for the subgroup $\Gamma^0(N)$ yield class invariants under some splitting conditions on $N$, generalising results due to Schertz from classical modular functions to Siegel modular functions. We show how to obtain all Galois conjugates of a class invariant by evaluating the same modular function in CM period matrices derived from an \emph{$N$-system}. Such a system consists of quadratic polynomials with coefficients in the real-quadratic subfield satisfying certain congruence conditions modulo $N$. We also examine conditions under which the minimal polynomial of a class invariant is real.
Examples show that we may obtain class invariants that are much smaller than in previous constructions.
Graphs for torus actions on oriented manifolds with isolated fixed points and classification in dimension 6
Let a torus $T$ act on a compact oriented manifold $M$ with isolated fixed points, with an additional mild assumption that its isotropy submanifolds are orientable. We associate to $M$ a signed labeled multigraph encoding the fixed point data (weights and signs at fixed points and isotropy submanifolds) of the manifold. We study operations on $M$ and its multigraph, such as (self) connected sum and blow-up. When the circle group acts on a 6-dimensional $M$, we classify such a multigraph by proving that we can convert it into the empty graph by successively applying two types of operations. In particular, this classifies the fixed point data of any such manifold. We prove this by showing that for any such manifold, we can successively take equivariant connected sums at fixed points with itself, with certain model 6-manifolds, and with 6-dimensional analogues of the Hirzebruch surfaces (and these with opposite orientations) to reach a fixed point free action on a compact oriented 6-manifold. We also classify such a multigraph for a torus action on a 4-dimensional $M$.
Comment: Added the assumption on the orientability of isotropy submanifolds. This paper supersedes arXiv:2108.07560; the main results are new, while including all results of the previous one.
Noetherianity of twisted Zhu algebra and bimodules
In this paper we show that for a large natural class of vertex operator algebras (VOAs) and their modules, the Zhu algebras and bimodules (and their $g$-twisted analogs) are Noetherian. These carry important information about the representation theory of the VOA and its fusion rules, and the Noetherian property gives the potential for (non-commutative) algebro-geometric methods to be employed in their study.
Is Universal Broadband Service Impossible?
Broadband Internet service is widely expected to be the fundamental universal
service for the 21st century. But more than a decade of national and
international struggles to close the digital divide between broadband haves and
have-nots suggests that reaching global universality will be a very difficult
task. This paper argues that the strong guarantees made by the current
broadband paradigm - low latency and constant availability - are unnecessary
obstacles to its adoption as an affordable and universal digital service. We
show that there is nonetheless a plausible strategy for deploying a Basic
Broadband service that does not require such guarantees and is able to offer,
at reasonable cost, almost all the critical and valuable services and
applications currently delivered over low latency broadband, synchronous
telepresence excepted.
Comment: Appeared in IEEE 19th International Conference on Mobile Ad Hoc and Smart Systems 2022
Building Ocean Climate Emulators
The current explosion in machine learning for climate has led to skilled,
computationally cheap emulators for the atmosphere. However, research on ocean emulators remains nascent despite the large potential for accelerating
coupled climate simulations and improving ocean forecasts on all timescales.
There are several fundamental questions to address that can facilitate the
creation of ocean emulators. Here we focus on two questions: 1) the role of the
atmosphere in improving the extended skill of the emulator and 2) the
representation of variables with distinct timescales (e.g., velocity and
temperature) in the design of any emulator. In tackling these questions, we
show stable prediction of surface fields for over 8 years, training and testing
on data from a high-resolution coupled climate model, using results from four
regions of the globe. Our work lays out a set of physically motivated
guidelines for building ocean climate emulators.
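As a purely illustrative sketch of the two design questions above (atmospheric forcing as an input, and a shared tendency update for variables with distinct timescales), consider the following autoregressive surface emulator; the architecture, channel counts and names are assumptions made for this sketch, not the configuration used in the paper.

import torch
import torch.nn as nn

class OceanEmulator(nn.Module):
    """Predicts the next ocean surface state from the current state plus atmospheric forcing."""

    def __init__(self, n_ocean: int = 3, n_atmos: int = 2, hidden: int = 64):
        super().__init__()
        # Inputs: ocean channels (e.g. SST, SSH, surface currents) concatenated with
        # atmospheric forcing channels (e.g. wind stress); output: an ocean tendency.
        self.net = nn.Sequential(
            nn.Conv2d(n_ocean + n_atmos, hidden, kernel_size=3, padding=1),
            nn.GELU(),
            nn.Conv2d(hidden, hidden, kernel_size=3, padding=1),
            nn.GELU(),
            nn.Conv2d(hidden, n_ocean, kernel_size=3, padding=1),
        )

    def forward(self, ocean: torch.Tensor, atmos: torch.Tensor) -> torch.Tensor:
        # Residual (tendency) update: fast variables (velocity) and slow variables
        # (temperature) share one small increment per step.
        return ocean + self.net(torch.cat([ocean, atmos], dim=1))

def rollout(model: OceanEmulator, ocean0: torch.Tensor, atmos_series) -> torch.Tensor:
    """Autoregressive rollout with prescribed atmospheric forcing at every step."""
    states, ocean = [], ocean0
    for atmos in atmos_series:
        ocean = model(ocean, atmos)
        states.append(ocean)
    return torch.stack(states)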
Graph-Skeleton: ~1% Nodes are Sufficient to Represent Billion-Scale Graph
Due to the ubiquity of graph data on the web, web graph mining has become a hot research topic. Nonetheless, the prevalence of large-scale web graphs in
real applications poses significant challenges to storage, computational
capacity and graph model design. Despite numerous studies to enhance the
scalability of graph models, a noticeable gap remains between academic research
and practical web graph mining applications. One major cause is that in most industrial scenarios, only a small fraction of the nodes in a web graph actually need to be analyzed; we term these target nodes and the rest background nodes. In this paper, we argue that properly fetching and
condensing the background nodes from massive web graph data might be a more
economical shortcut to tackle the obstacles fundamentally. To this end, we make
the first attempt to study the problem of massive background node compression for target node classification. Through extensive experiments, we reveal two critical roles played by the background nodes in target node classification: enhancing structural connectivity between target nodes, and providing feature correlation with target nodes. Following this, we propose a novel Graph-Skeleton model,
which properly fetches the background nodes, and further condenses the semantic
and topological information of background nodes within similar
target-background local structures. Extensive experiments on various web graph
datasets demonstrate the effectiveness and efficiency of the proposed method.
In particular, for the MAG240M dataset with 0.24 billion nodes, our generated skeleton graph achieves highly comparable performance while containing only 1.8% of the nodes of the original graph.
Comment: 21 pages, 11 figures, in Proceedings of the ACM Web Conference 2024 (WWW '24)
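As a toy illustration of the fetch-and-condense idea described above (not the Graph-Skeleton algorithm itself; the hop limit and the merge rule are assumptions made for this sketch), one can keep only the background nodes near the targets and merge those that bridge the same set of targets:

import networkx as nx

def fetch_background(G: nx.Graph, targets: set, max_hops: int = 2) -> set:
    """Keep only background nodes within max_hops of some target node."""
    keep = set(targets)
    frontier = set(targets)
    for _ in range(max_hops):
        frontier = {nbr for u in frontier for nbr in G.neighbors(u)} - keep
        keep |= frontier
    return keep - set(targets)

def condense(G: nx.Graph, targets: set, background: set) -> nx.Graph:
    """Merge background nodes that connect the same set of target neighbours."""
    skeleton = G.subgraph(targets).copy()
    groups = {}
    for b in background:
        tgt_nbrs = frozenset(n for n in G.neighbors(b) if n in targets)
        if tgt_nbrs:  # ignore background nodes touching no target
            groups.setdefault(tgt_nbrs, []).append(b)
    for tgt_nbrs, members in groups.items():
        super_node = f"bg_{min(members)}"  # one condensed node per group
        skeleton.add_node(super_node, size=len(members))
        for t in tgt_nbrs:
            skeleton.add_edge(super_node, t)
    return skeleton

A real system would also aggregate node features within each condensed group; this sketch records only the group size.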
Pseudo-K\"ahler structure on the -Hitchin component and Goldman symplectic form
The aim of this paper is to show the existence and give an explicit
description of a pseudo-Riemannian metric and a symplectic form on the $\mathrm{SL}(3,\mathbb{R})$-Hitchin component, both compatible with Labourie and Loftin's complex structure. In particular, they give rise to a mapping class group invariant pseudo-K\"ahler structure on a neighborhood of the Fuchsian locus, which restricts to a multiple of the Weil-Petersson metric on Teichm\"uller space. By comparing our symplectic form with Goldman's symplectic form, we prove that the pair cannot define a K\"ahler structure on the Hitchin component.
Comment: Title and introduction changed. Added a result regarding the Goldman symplectic form
Role of chemical potential at kinetic freeze-out using Tsallis non-extensive statistics in proton-proton collisions at the Large Hadron Collider
The charged-particle transverse momentum spectra ($p_{\rm T}$-spectra) measured by the ALICE collaboration for $pp$ collisions at $\sqrt{s}$ = 7 and 13 TeV have been studied using a thermodynamically consistent form of Tsallis non-extensive statistics. The Tsallis distribution function is fitted to the $p_{\rm T}$-spectra and the results are analyzed as a function of final state charged-particle multiplicity for various light flavor and strange particles. At the LHC energies, particles and antiparticles are produced in equal numbers. However, the equality of particle and antiparticle yields at the kinetic freeze-out may imply that their chemical potentials are equal in magnitude but opposite in sign, and not necessarily zero. We use an alternative procedure that makes use of parameter redundancy, by introducing a finite chemical potential at the kinetic freeze-out stage. This article emphasizes the importance of the chemical potential of the system produced in $pp$ collisions at the LHC energies using the Tsallis distribution function, which brings the system to a single freeze-out scenario.
Comment: Same as the published version in EPJ
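For orientation, the thermodynamically consistent Tsallis single-particle spectrum at midrapidity with a chemical potential $\mu$ is commonly written in the standard Cleymans-Worku form below; it is reproduced here as the textbook expression rather than quoted from the article ($g$ is the degeneracy, $V$ the volume, $T$ the Tsallis temperature and $q$ the non-extensivity parameter).
\[
  \left.\frac{d^2N}{dp_{\rm T}\,dy}\right|_{y=0}
  = \frac{g\,V\,p_{\rm T}\,m_{\rm T}}{(2\pi)^2}
    \left[1 + (q-1)\,\frac{m_{\rm T}-\mu}{T}\right]^{-\frac{q}{q-1}},
  \qquad m_{\rm T}=\sqrt{p_{\rm T}^{2}+m^{2}} .
\]
One way to see the parameter redundancy mentioned in the abstract is the identity $1+(q-1)\frac{m_{\rm T}-\mu}{T} = \bigl(1-\frac{(q-1)\mu}{T}\bigr)\bigl[1+(q-1)\frac{m_{\rm T}}{T-(q-1)\mu}\bigr]$: a fit with non-zero $\mu$ and temperature $T$ has the same spectral shape as a zero-potential fit with temperature $T-(q-1)\mu$ and a rescaled volume.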
Misspecification-robust Sequential Neural Likelihood for Simulation-based Inference
Simulation-based inference techniques are indispensable for parameter
estimation of mechanistic and simulable models with intractable likelihoods.
While traditional statistical approaches like approximate Bayesian computation
and Bayesian synthetic likelihood have been studied under well-specified and
misspecified settings, they often suffer from inefficiencies due to wasted
model simulations. Neural approaches, such as sequential neural likelihood
(SNL), avoid this wastage by utilising all model simulations to train a neural
surrogate for the likelihood function. However, the performance of SNL under
model misspecification is unreliable and can result in overconfident posteriors
centred around an inaccurate parameter estimate. In this paper, we propose a
novel SNL method which, through the incorporation of additional adjustment
parameters, is robust to model misspecification and capable of identifying
features of the data that the model is not able to recover. We demonstrate the
efficacy of our approach through several illustrative examples, where our
method gives more accurate point estimates and uncertainty quantification than
SNL.
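To make the role of the adjustment parameters concrete, here is a minimal sketch of one way such parameters can enter the target posterior; the shift-style adjustment, the Laplace prior and every name below are assumptions made for illustration, not the authors' implementation.

import numpy as np

def log_posterior(theta, gamma, s_obs, surrogate_loglik, log_prior_theta, lam=1.0):
    """Joint log-posterior over model parameters theta and adjustment parameters gamma.

    surrogate_loglik(s, theta): neural surrogate for the log-likelihood of the
    summaries s, e.g. a flow trained on simulated (theta, s) pairs as in SNL.
    """
    # Shift the observed summaries by gamma so the surrogate is evaluated in the
    # region where it was trained, even when the model cannot match the data.
    s_adjusted = s_obs - gamma
    # A sparsity-inducing Laplace prior keeps each adjustment near zero unless a
    # summary is genuinely incompatible, which then flags the misspecified feature.
    log_prior_gamma = -lam * np.sum(np.abs(gamma))
    return surrogate_loglik(s_adjusted, theta) + log_prior_theta(theta) + log_prior_gamma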