Low-Complexity Codes for Random and Clustered High-Order Failures in Storage Arrays
RC (Random/Clustered) codes are a new efficient array-code family for recovering from 4-erasures. RC codes correct most 4-erasures, and essentially all 4-erasures that are clustered. Clustered erasures are introduced as a new erasure model for storage arrays. This model draws its motivation from correlated device failures caused by physical proximity of devices, or by age proximity of endurance-limited solid-state drives. The reliability of storage arrays that employ RC codes is analyzed and compared to that of known codes. The new RC code is significantly more efficient, in all practical implementation factors, than the best known 4-erasure-correcting MDS code. These factors include: small-write update complexity, full-device update complexity, decoding complexity, and the number of supported devices in the array.
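The distinction between the two erasure models can be illustrated with a small simulation (a hypothetical sketch for intuition, not taken from the paper; the window size and array size are illustrative assumptions): a random 4-erasure picks any 4 devices in the array, while a clustered 4-erasure confines all 4 failures to a run of adjacent positions, modeling correlated failures of physically or age-proximate devices.

```python
import random

def random_erasure(m, k=4):
    """Pick k failed devices uniformly from an array of m devices."""
    return sorted(random.sample(range(m), k))

def clustered_erasure(m, k=4, window=5):
    """Pick k failed devices that all fall within `window` adjacent
    positions (window=5 is an illustrative assumption), modeling
    correlated failures of proximate devices."""
    start = random.randrange(m - window + 1)
    return sorted(random.sample(range(start, start + window), k))

random.seed(0)
print(random_erasure(16))     # 4 failed devices anywhere in the array
print(clustered_erasure(16))  # 4 failed devices within 5 adjacent slots
```

A code tuned to the clustered model only needs to handle the much smaller set of failure patterns produced by the second sampler, which is what buys the efficiency over a fully general 4-erasure MDS code.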
The Classical Complexity of Boson Sampling
We study the classical complexity of the exact Boson Sampling problem where
the objective is to produce provably correct random samples from a particular
quantum mechanical distribution. The computational framework was proposed by
Aaronson and Arkhipov in 2011 as an attainable demonstration of `quantum
supremacy', that is a practical quantum computing experiment able to produce
output at a speed beyond the reach of classical (that is, non-quantum) computer
hardware. Since its introduction Boson Sampling has been the subject of intense
international research in the world of quantum computing. On the face of it,
the problem is challenging for classical computation. Aaronson and Arkhipov
show that exact Boson Sampling is not efficiently solvable by a classical
computer unless $P^{\#P} = BPP^{NP}$ and the polynomial hierarchy collapses to
the third level.
The fastest known exact classical algorithm for the standard Boson Sampling
problem takes $O(\binom{m+n-1}{n} n 2^n)$ time to produce samples for a
system with input size $n$ and $m$ output modes, making it infeasible for
anything but the smallest values of $n$ and $m$. We give an algorithm that is
much faster, running in $O(n 2^n + \mathrm{poly}(m, n))$ time and $O(m)$
additional space. The algorithm is simple to implement and has low constant
factor overheads. As a consequence our classical algorithm is able to solve the
exact Boson Sampling problem for system sizes far beyond current photonic
quantum computing experimentation, thereby significantly reducing the
likelihood of achieving near-term quantum supremacy in the context of Boson
Sampling.
Comment: 15 pages. To appear in SODA '18
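The hardness of exact Boson Sampling comes from the fact that the output probabilities are given by matrix permanents, and the $n 2^n$ factor in running times of this kind is characteristic of evaluating a permanent with Ryser's formula. A minimal sketch of that primitive (illustrative only; this is not the sampling algorithm from the paper, and this naive subset loop costs $O(n^2 2^n)$ rather than the $O(n 2^n)$ achieved with Gray-code ordering):

```python
from itertools import combinations

def ryser_permanent(a):
    """Permanent of an n x n matrix via Ryser's formula:
    perm(A) = (-1)^n * sum over column subsets S of
              (-1)^{|S|} * prod_i sum_{j in S} a[i][j].
    Naive subset enumeration: O(n^2 2^n) arithmetic operations;
    Gray-code ordering of the subsets reduces this to O(n 2^n)."""
    n = len(a)
    total = 0
    for r in range(n + 1):
        for cols in combinations(range(n), r):
            prod = 1
            for row in a:
                prod *= sum(row[j] for j in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total

print(ryser_permanent([[1, 2], [3, 4]]))  # 10 (= 1*4 + 2*3)
```

Unlike the determinant, the permanent has no known polynomial-time algorithm; computing it exactly is #P-hard, which is the source of the classical-hardness arguments above.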
The Homeostasis Protocol: Avoiding Transaction Coordination Through Program Analysis
Datastores today rely on distribution and replication to achieve improved
performance and fault-tolerance. But correctness of many applications depends
on strong consistency properties - something that can impose substantial
overheads, since it requires coordinating the behavior of multiple nodes. This
paper describes a new approach to achieving strong consistency in distributed
systems while minimizing communication between nodes. The key insight is to
allow the state of the system to be inconsistent during execution, as long as
this inconsistency is bounded and does not affect transaction correctness. In
contrast to previous work, our approach uses program analysis to extract
semantic information about permissible levels of inconsistency and is fully
automated. We then employ a novel homeostasis protocol to allow sites to
operate independently, without communicating, as long as any inconsistency is
governed by appropriate treaties between the nodes. We discuss mechanisms for
optimizing treaties based on workload characteristics to minimize
communication, as well as a prototype implementation and experiments that
demonstrate the benefits of our approach on common transactional benchmarks.
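The flavor of treaty-based coordination avoidance can be illustrated with a toy example (a hypothetical sketch, not the paper's protocol; the stock-allocation scheme and class names are invented for illustration): a global invariant such as "total stock >= 0" is decomposed into per-node treaties "local budget >= 0", so each site can serve transactions against its own allocation with no messages until its treaty would be violated.

```python
class Node:
    """One site holding a slice of a shared stock counter. The global
    invariant (total stock >= 0) is implied by a per-node treaty:
    each site spends only from its own allocation."""
    def __init__(self, budget):
        self.budget = budget

    def try_sell(self, qty):
        # Local check only: `budget - qty >= 0` at this site implies
        # the global invariant, so no coordination round-trip occurs.
        if qty <= self.budget:
            self.budget -= qty
            return True
        # Treaty would be violated: fall back to coordination
        # (e.g., renegotiate the allocation with the other sites).
        return False

# 100 units of stock split across three sites.
nodes = [Node(40), Node(40), Node(20)]
assert nodes[0].try_sell(30)      # succeeds with no communication
assert not nodes[2].try_sell(25)  # local treaty check fails
total = sum(n.budget for n in nodes)  # 70: global invariant preserved
```

The point of the program-analysis step in the paper is to derive such permissible-inconsistency bounds automatically from the application code, rather than having a developer hand-craft them as in this sketch.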