Timed Consistent Network Updates
Network updates such as policy and routing changes occur frequently in
Software Defined Networks (SDN). Updates should be performed consistently,
preventing temporary disruptions, and should require as little overhead as
possible. Scalability is increasingly becoming an essential requirement in SDN.
In this paper we propose to use time-triggered network updates to achieve
consistent updates. Our proposed solution requires lower overhead than existing
update approaches, without compromising the consistency during the update. We
demonstrate that accurate time enables far more scalable consistent updates in
SDN than previously available. In addition, it provides the SDN programmer with
fine-grained control over the tradeoff between consistency and scalability.

Comment: This technical report is an extended version of the paper "Timed Consistent Network Updates", which was accepted to the ACM SIGCOMM Symposium on SDN Research (SOSR) '15, Santa Clara, CA, US, June 201
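The timing idea the abstract describes can be sketched in a few lines. This is a hypothetical illustration, not the paper's implementation: a controller picks a common future wall-clock time and every switch installs the new rules at that time, so that (given sufficiently accurate clock synchronization) the network transitions near-simultaneously instead of going through a multi-phase update. The `Switch` class and all names here are invented for illustration.

```python
import threading
import time

# Hypothetical sketch of a time-triggered update: all switches are told to
# apply the new configuration at the same scheduled wall-clock instant.

class Switch:
    def __init__(self, name):
        self.name = name
        self.rules = "old"
        self.applied_at = None

    def schedule_update(self, new_rules, apply_time):
        def apply():
            # Wait until the scheduled time, then install the new rules.
            time.sleep(max(0.0, apply_time - time.time()))
            self.rules = new_rules
            self.applied_at = time.time()
        t = threading.Thread(target=apply)
        t.start()
        return t

def timed_update(switches, new_rules, lead_time=0.2):
    # Choose a common future time far enough ahead to reach every switch.
    apply_time = time.time() + lead_time
    threads = [sw.schedule_update(new_rules, apply_time) for sw in switches]
    for t in threads:
        t.join()
    return apply_time

switches = [Switch(f"s{i}") for i in range(3)]
timed_update(switches, "new")
assert all(sw.rules == "new" for sw in switches)
```

A two-phase consistent update would instead keep both rule sets installed during the transition; the timed variant trades that extra rule capacity for a dependence on clock accuracy, which is the tradeoff the abstract highlights.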
Gibbs fragmentation trees
We study fragmentation trees of Gibbs type. In the binary case, we identify
the most general Gibbs-type fragmentation tree with Aldous' beta-splitting
model, which has an extended parameter range $\beta > -2$ with respect to the
beta$(\beta+1, \beta+1)$ probability distributions on which it is based.
In the multifurcating case, we show that Gibbs fragmentation trees are
associated with the two-parameter Poisson--Dirichlet models for exchangeable
random partitions of $\mathbb{N}$, with an extended parameter range
$0 \le \alpha \le 1$, $\theta \ge -2\alpha$, and $\alpha < 0$, $\theta = -m\alpha$, $m \in \mathbb{N}$.

Comment: Published in at http://dx.doi.org/10.3150/08-BEJ134 the Bernoulli
(http://isi.cbs.nl/bernoulli/) by the International Statistical
Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm)
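Aldous' beta-splitting model, which the binary case of the abstract identifies with the most general Gibbs-type fragmentation tree, is simple to sample from. The following sketch is illustrative (names and structure are my own, not the paper's code): a block of $n$ leaves splits into parts of sizes $(i, n-i)$ with probability proportional to $\binom{n}{i} B(i+1+\beta,\, n-i+1+\beta)$, where $B$ is the Beta function, and the rule is applied recursively.

```python
import math
import random

# Illustrative sampler for Aldous' beta-splitting tree. The extended
# parameter range beta > -2 keeps every Beta-function argument positive.

def beta_fn(a, b):
    # B(a, b) = Gamma(a) * Gamma(b) / Gamma(a + b)
    return math.gamma(a) * math.gamma(b) / math.gamma(a + b)

def split_probs(n, beta):
    # Unnormalized weight of splitting n leaves into (i, n - i).
    w = [math.comb(n, i) * beta_fn(i + 1 + beta, n - i + 1 + beta)
         for i in range(1, n)]
    total = sum(w)
    return [x / total for x in w]

def beta_splitting_tree(n, beta, rng=random):
    # Returns a nested tuple; a single leaf is represented by 1.
    if n == 1:
        return 1
    i = rng.choices(range(1, n), weights=split_probs(n, beta))[0]
    return (beta_splitting_tree(i, beta, rng),
            beta_splitting_tree(n - i, beta, rng))

def leaves(t):
    return 1 if t == 1 else leaves(t[0]) + leaves(t[1])

tree = beta_splitting_tree(10, beta=-1.5)
assert leaves(tree) == 10
```

Note that $\beta = -1.5$ lies outside the classical range but inside the extended range $\beta > -2$ mentioned in the abstract.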
Chainspace: A Sharded Smart Contracts Platform
Chainspace is a decentralized infrastructure, known as a distributed ledger,
that supports user defined smart contracts and executes user-supplied
transactions on their objects. The correct execution of smart contract
transactions is verifiable by all. The system is scalable, by sharding state
and the execution of transactions, and using S-BAC, a distributed commit
protocol, to guarantee consistency. Chainspace is secure against subsets of
nodes trying to compromise its integrity or availability properties through
Byzantine Fault Tolerance (BFT), and provides extremely high auditability and
non-repudiation through `blockchain' techniques. Even when BFT fails, auditing
mechanisms are in place to trace malicious participants. We present the design,
rationale, and details of Chainspace; we argue through evaluating an
implementation of the system about its scaling and other features; we
illustrate a number of privacy-friendly smart contracts for smart metering,
polling and banking, and measure their performance.
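The core of a cross-shard commit like S-BAC is an atomic-commit pattern: each shard holding input objects votes to accept or abort, and the transaction commits only if every shard accepts. The sketch below is a hypothetical simplification, not the actual S-BAC protocol; it elides intra-shard BFT agreement entirely, and the `Shard` class, its methods, and the output-placement rule are all invented for illustration.

```python
# Hypothetical sketch of the cross-shard atomic-commit idea: shards vote on
# their local input objects; a transaction commits only if all shards accept.

class Shard:
    def __init__(self, objects):
        self.active = set(objects)   # active objects this shard manages
        self.inactive = set()        # consumed objects, kept for audit
        self.locked = set()

    def manages(self, obj):
        return obj in self.active or obj in self.inactive

    def prepare(self, inputs):
        # Vote accept only if every managed input is active and unlocked.
        mine = [o for o in inputs if self.manages(o)]
        if any(o not in self.active or o in self.locked for o in mine):
            return False
        self.locked.update(mine)
        return True

    def commit(self, inputs):
        mine = {o for o in inputs if o in self.active}
        self.active -= mine          # consume inputs (kept inactive for audit)
        self.inactive |= mine
        self.locked -= set(inputs)

    def abort(self, inputs):
        self.locked -= set(inputs)

def sbac_commit(shards, inputs, outputs, out_shard):
    votes = [s.prepare(inputs) for s in shards]
    if all(votes):
        for s in shards:
            s.commit(inputs)
        out_shard.active |= set(outputs)  # hypothetical output placement
        return True
    for s in shards:
        s.abort(inputs)
    return False

s1, s2 = Shard({"a"}), Shard({"b"})
assert sbac_commit([s1, s2], inputs={"a", "b"}, outputs={"c"}, out_shard=s1)
assert not sbac_commit([s1, s2], inputs={"a"}, outputs={"d"}, out_shard=s2)
```

The second transaction aborts because object "a" was already consumed, illustrating how per-shard votes prevent double-spending across shards.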
Regenerative Composition Structures
A new class of random composition structures (the ordered analog of Kingman's
partition structures) is defined by a regenerative description of component
sizes. Each regenerative composition structure is represented by a process of
random sampling of points from an exponential distribution on the positive
halfline, and separating the points into clusters by an independent
regenerative random set. Examples are composition structures derived from
residual allocation models, including one associated with the Ewens sampling
formula, and composition structures derived from the zero set of a Brownian
motion or Bessel process. We provide characterisation results and formulas
relating the distribution of the regenerative composition to the L{\'e}vy
parameters of a subordinator whose range is the corresponding regenerative set.
In particular, the only reversible regenerative composition structures are
those associated with the interval partition of $[0,1]$ generated by excursions
of a standard Bessel bridge of dimension $2 - 2\alpha$ for some $\alpha \in [0,1]$.
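One concrete instance mentioned in the abstract is a composition derived from a residual allocation model. The sketch below is an assumption on my part, not the paper's general construction: interval lengths are generated by stick-breaking with $W_i \sim \mathrm{Beta}(1, \theta)$, $n$ uniform sample points are grouped by the interval they land in, and the ordered nonzero cluster sizes form the composition. The $\theta$-parametrized case is the one related to the Ewens sampling formula.

```python
import random

# Illustrative sketch: a random composition of n from a residual allocation
# (stick-breaking) model. Cluster sizes are recorded in interval order.

def residual_allocation_composition(n, theta, rng=random):
    points = sorted(rng.random() for _ in range(n))
    composition = []
    left = 0.0          # left end of the current interval
    remaining = 1.0     # length of the still-unbroken stick
    i = 0
    while i < n:
        w = rng.betavariate(1.0, theta)    # residual fraction taken
        right = left + w * remaining
        size = 0
        while i < n and points[i] < right:  # points in this interval
            size += 1
            i += 1
        if size > 0:
            composition.append(size)
        remaining *= (1.0 - w)
        left = right
    return composition

comp = residual_allocation_composition(20, theta=1.0)
assert sum(comp) == 20 and all(c > 0 for c in comp)
```

Sampling points and grouping them by an independent random interval partition mirrors the representation in the abstract, with the stick-breaking intervals standing in for the gaps of a regenerative random set.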
Worst-case estimation and asymptotic theory for models with unobservables
This paper proposes a worst-case approach for estimating econometric models that contain unobservable variables. Worst-case estimators are robust against the adverse effects of unobservables. In contrast to the classical literature, worst-case estimation makes no assumptions about the statistical nature of the unobservables. The method is robust with respect to the unknown probability distribution of the unobservables and should be seen as a complement to standard methods: cautious modelers should compare different estimations to determine robust models. The limit theory is obtained, a Monte Carlo study of finite-sample properties is conducted, and an economic application is included.
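The worst-case idea can be made concrete with a toy minimax fit. This is a hypothetical illustration, not the paper's estimator: for $y_t = \theta x_t + u_t$ with the unobservable only known to satisfy $|u_t| \le \delta$, we minimize the worst case of the squared loss over all admissible unobservables, which for this loss reduces to $\sum_t (|y_t - \theta x_t| + \delta)^2$. The model, data, and grid search are all invented for illustration.

```python
# Hypothetical toy worst-case estimator for y_t = theta * x_t + u_t with
# |u_t| <= delta and no distributional assumption on u_t:
#   theta_hat = argmin_theta sum_t max_{|u|<=delta} (y_t - theta*x_t - u)^2
#             = argmin_theta sum_t (|y_t - theta*x_t| + delta)^2

def worst_case_loss(theta, xs, ys, delta):
    return sum((abs(y - theta * x) + delta) ** 2 for x, y in zip(xs, ys))

def worst_case_estimate(xs, ys, delta, grid):
    # Simple grid search over candidate parameter values.
    return min(grid, key=lambda th: worst_case_loss(th, xs, ys, delta))

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]               # roughly y = 2x plus bounded noise
grid = [i / 100 for i in range(0, 401)]  # candidate thetas in [0, 4]
theta_hat = worst_case_estimate(xs, ys, delta=0.5, grid=grid)
assert abs(theta_hat - 2.0) < 0.2
```

Because the bound $\delta$ replaces any distributional assumption on $u_t$, the fit guards against the adverse case rather than an average case, which is the sense of robustness described in the abstract.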