Bootstrapping Real-world Deployment of Future Internet Architectures
The past decade has seen many proposals for future Internet architectures.
Most of these proposals require substantial changes to the current networking
infrastructure and end-user devices, resulting in a failure to move from theory
to real-world deployment. This paper describes one possible strategy for
bootstrapping the initial deployment of future Internet architectures by
focusing on providing high availability as an incentive for early adopters.
Through large-scale simulation and real-world implementation, we show that with
only a small number of adopting ISPs, customers can obtain high availability
guarantees. We discuss the design, implementation, and evaluation of an
availability device that allows customers to bridge into the future Internet
architecture without modifications to their existing infrastructure.
Generating Probability Distributions using Multivalued Stochastic Relay Circuits
The problem of random number generation dates back to von Neumann's work in
1951. Since then, many algorithms have been developed for generating unbiased
bits from complex correlated sources as well as for generating arbitrary
distributions from unbiased bits. An equally interesting, but less studied
aspect is the structural component of random number generation as opposed to
the algorithmic aspect. That is, given a network structure imposed by nature or
physical devices, how can we build networks that generate arbitrary probability
distributions in an optimal way? In this paper, we study the generation of
arbitrary probability distributions in multivalued relay circuits, a
generalization in which relays can take on any of N states and the logical
'and' and 'or' are replaced with 'min' and 'max', respectively. Previous work
was done on two-state relays. We generalize these results, describing a duality
property and networks that generate arbitrary rational probability
distributions. We prove that these networks are robust to errors and design a
universal probability generator that takes input bits and outputs arbitrary
binary probability distributions.
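As a point of reference for the algorithmic side the abstract mentions, here is a minimal sketch (not from the paper; function names are illustrative) of the two classical building blocks: von Neumann's 1951 trick for extracting an unbiased bit from an i.i.d. biased source, and the standard method for realising a target probability p from fair bits by comparing them against the binary expansion of p.

    import random

    def von_neumann_unbiased(biased_bit):
        # Von Neumann's trick: sample bits in pairs; emit 0 for the pair (0, 1),
        # 1 for the pair (1, 0), and discard (0, 0) and (1, 1). The output is
        # unbiased whenever the source bits are independent and identically biased.
        while True:
            a, b = biased_bit(), biased_bit()
            if a != b:
                return a

    def bernoulli_from_fair_bits(p, fair_bit, precision=64):
        # Emit 1 with probability p by comparing a stream of fair bits against
        # the binary expansion of p; the first disagreement decides the outcome.
        for _ in range(precision):
            p *= 2
            target = 1 if p >= 1 else 0
            p -= target
            b = fair_bit()
            if b != target:
                return 1 if b < target else 0
        return 0  # expansions agreed to the chosen precision; treat as a tie

    # Example: a source biased towards 1, and a fair source from `random`.
    biased = lambda: 1 if random.random() < 0.8 else 0
    fair = lambda: random.getrandbits(1)
    print(von_neumann_unbiased(biased), bernoulli_from_fair_bits(0.3, fair))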
Finite reflection groups and graph norms
Given a graph H on vertex set \{1,2,\dots,n\} and a function f:[0,1]^2\to\mathbb{R}, define \begin{align*} \|f\|_{H}:=\left\vert\int
\prod_{ij\in E(H)}f(x_i,x_j)\,d\mu^{|V(H)|}\right\vert^{1/|E(H)|}, \end{align*}
where \mu is the Lebesgue measure on [0,1]. We say that H is norming if
\|\cdot\|_{H} is a semi-norm. A similar notion is defined by
\|f\|_{r(H)}:=\||f|\|_{H}, and H is said to be weakly norming if
\|\cdot\|_{r(H)} is a norm. Classical results show that weakly norming graphs
are necessarily bipartite. In the other direction, Hatami showed that even
cycles, complete bipartite graphs, and hypercubes are all weakly norming. We
demonstrate that any graph whose edges percolate in an appropriate way under
the action of a certain natural family of automorphisms is weakly norming. This
result includes all previously known examples of weakly norming graphs, but
also allows us to identify a much broader class arising from finite reflection
groups. We include several applications of our results. In particular, we
define and compare a number of generalisations of Gowers' octahedral norms and
we prove some new instances of Sidorenko's conjecture.
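For concreteness, a standard instance of this definition (not specific to the paper): taking H to be the 4-cycle C_4 gives \begin{align*} \|f\|_{C_4}=\left\vert\int f(x_1,x_2)\,f(x_2,x_3)\,f(x_3,x_4)\,f(x_4,x_1)\,d\mu^{4}\right\vert^{1/4}, \end{align*} which, for a symmetric kernel f, is the Schatten 4-norm of the associated integral operator; this is one classical way to see that C_4 is weakly norming.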
Regression Discontinuity Inference with Specification Error
A regression discontinuity (RD) research design is appropriate for program evaluation problems in which treatment status (or the probability of treatment) depends on whether an observed covariate exceeds a fixed threshold. In many applications the treatment-determining covariate is discrete. This makes it impossible to compare outcomes for observations "just above" and "just below" the treatment threshold, and requires the researcher to choose a functional form for the relationship between the treatment variable and the outcomes of interest. We propose a simple econometric procedure to account for uncertainty in the choice of functional form for RD designs with discrete support. In particular, we model deviations of the true regression function from a given approximating function -- the specification errors -- as random. Conventional standard errors ignore the group structure induced by specification errors and tend to overstate the precision of the estimated program impacts. The proposed inference procedure, which allows for specification error, also has a natural interpretation within a Bayesian framework.
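As an illustration of the kind of adjustment this implies in practice (a minimal sketch under assumed variable names, not the authors' code): with a discrete running variable, one common implementation fits a global approximating function and clusters the standard errors on the distinct values of the running variable, so that shared specification error within each cell is reflected in the reported precision.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated data: discrete running variable x, treatment assigned at x >= 0.
    rng = np.random.default_rng(0)
    x = rng.integers(-10, 11, size=2000)
    treat = (x >= 0).astype(int)
    y = 1.0 + 0.5 * treat + 0.1 * x + rng.normal(size=x.size)
    df = pd.DataFrame({"y": y, "x": x, "treat": treat})

    # Linear approximating function on both sides of the cutoff;
    # cluster-robust covariance grouped by the value of x.
    fit = smf.ols("y ~ treat + x + treat:x", data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["x"]}
    )
    print(fit.summary())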
Supersymmetry for Gauged Double Field Theory and Generalised Scherk-Schwarz Reductions
Previous constructions of supersymmetry for double field theory have relied
on the so-called strong constraint. In this paper, the strong constraint is
relaxed and the theory is shown to possess supersymmetry once the generalised
Scherk-Schwarz reduction is imposed. The equivalence between the generalised
Scherk-Schwarz reduced theory and the gauged double field theory is then
examined in detail for the supersymmetric theory. As a byproduct we write the
generalised Killing spinor equations for the supersymmetric double field
theory.