Exploring heterogeneity of unreliable machines for p2p backup
P2P architecture is a viable option for enterprise backup. In contrast to
dedicated backup servers, nowadays the standard solution, making backups directly
on an organization's workstations should be cheaper (as existing hardware is
used), more efficient (as there is no single bottleneck server) and more
reliable (as the machines are geographically dispersed).
We present the architecture of a p2p backup system that uses pairwise
replication contracts between a data owner and a replicator. In contrast to
standard p2p storage systems that use a DHT directly, the contracts allow our
system to optimize replica placement according to a specific optimization
strategy, and thus to take advantage of the heterogeneity of the machines and the
network. Such optimization is particularly appealing in the context of backup:
replicas can be geographically dispersed, the load sent over the network can be
minimized, or the optimization goal can be to minimize the backup/restore time.
However, managing the contracts, keeping them consistent, and adjusting them in
response to a dynamically changing environment is challenging.
We built a scientific prototype and ran the experiments on 150 workstations
in the university's computer laboratories and, separately, on 50 PlanetLab
nodes. We found that the main factor affecting the quality of the system is
the availability of the machines. Yet, our main conclusion is that it is
possible to build an efficient and reliable backup system on highly unreliable
machines (our computers had just 13% average availability).
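The benefit of optimizing replica placement over low-availability machines can be illustrated with a small sketch. The objective below (minimizing the probability that all replicas are offline at once, assuming independent failures) and all machine names and numbers are illustrative, not the paper's exact strategy:

```python
import itertools

def replica_unavailability(availabilities):
    """Probability that all chosen replicas are offline at once,
    assuming independent machine failures."""
    p = 1.0
    for a in availabilities:
        p *= (1.0 - a)
    return p

def choose_replicators(machines, k):
    """Brute-force the k-subset of machines minimizing joint
    unavailability.  machines: dict name -> availability in [0, 1]."""
    best = min(itertools.combinations(machines, k),
               key=lambda subset: replica_unavailability(
                   machines[m] for m in subset))
    return set(best)

# Hypothetical availability measurements for four workstations.
machines = {"lab-01": 0.13, "lab-02": 0.25, "lab-03": 0.10, "pl-04": 0.40}
print(choose_replicators(machines, 2))  # the two most available machines
```

Even at 13% availability per machine, adding replicas drives joint unavailability down multiplicatively (e.g. three such replicas are all offline with probability 0.87³ ≈ 0.66, five with ≈ 0.50), which is why placement strategy matters on unreliable hardware.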
Cooperation and Competition when Bidding for Complex Projects: Centralized and Decentralized Perspectives
To successfully complete a complex project, be it a construction of an
airport or of a backbone IT system, agents (companies or individuals) must form
a team having required competences and resources. A team can be formed either
by the project issuer based on individual agents' offers (centralized
formation); or by the agents themselves (decentralized formation) bidding for a
project as a consortium---in that case many feasible teams compete for the
contract. We investigate rational strategies of the agents (what salary should
they ask? with whom should they team up?). We propose concepts to characterize
the stability of the winning teams and study their computational complexity.
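The centralized formation setting can be sketched as a covering problem: the issuer picks, from individual offers, a cheapest team that jointly covers the required competences. The brute-force solver and all agent data below are illustrative, not the paper's model:

```python
from itertools import combinations

# Hypothetical offers: agent -> (competences, asked salary).
agents = {
    "A": ({"civil", "electrical"}, 5),
    "B": ({"civil"}, 2),
    "C": ({"electrical", "it"}, 4),
    "D": ({"it"}, 2),
}
required = {"civil", "electrical", "it"}

def cheapest_team(agents, required):
    """Return (team, cost) minimizing total salary over all teams
    whose combined competences cover the requirements."""
    best = None
    for r in range(1, len(agents) + 1):
        for team in combinations(agents, r):
            skills = set().union(*(agents[a][0] for a in team))
            if required <= skills:
                cost = sum(agents[a][1] for a in team)
                if best is None or cost < best[1]:
                    best = (set(team), cost)
    return best

print(cheapest_team(agents, required))  # team {'B', 'C'} at cost 6
```

In the decentralized variant, each such feasible team would instead bid as a consortium, and the interesting questions become which consortium wins and whether its internal salary split is stable.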
Improved Metric Distortion for Deterministic Social Choice Rules
In this paper, we study the metric distortion of deterministic social choice
rules that choose a winning candidate from a set of candidates based on voter
preferences. Voters and candidates are located in an underlying metric space. A
voter has cost equal to her distance to the winning candidate. Ordinal social
choice rules only have access to the ordinal preferences of the voters that are
assumed to be consistent with the metric distances. Our goal is to design an
ordinal social choice rule with minimum distortion, which is the worst-case
ratio, over all consistent metrics, between the social cost of the rule and
that of the optimal omniscient rule with knowledge of the underlying metric
space.
The distortion of the best deterministic social choice rule was known to be
between and . It had been conjectured that any rule that only looks at
the weighted tournament graph on the candidates cannot have distortion better
than . In our paper, we disprove it by presenting a weighted tournament rule
with distortion of . We design this rule by generalizing the classic
notion of uncovered sets, and further show that this class of rules cannot have
distortion better than . We then propose a new voting rule, via an
alternative generalization of uncovered sets. We show that if a candidate
satisfying the criterion of this voting rule exists, then choosing such a
candidate yields a distortion bound of , matching the lower bound. We
present a combinatorial conjecture that implies distortion of , and verify
it for small numbers of candidates and voters by computer experiments. Using
our framework, we also show that selecting any candidate guarantees distortion
of at most when the weighted tournament graph is cyclically symmetric.
Comment: EC 201
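The distortion definition above can be made concrete on a single metric instance (a worst case would range over all metrics consistent with the ordinal votes; this sketch fixes one). Voters and candidates are points on the real line, a voter's cost is her distance to the winner, and the social cost is the sum over voters; all positions are made up:

```python
def social_cost(voters, candidate):
    """Sum over voters of the distance to the given candidate."""
    return sum(abs(v - candidate) for v in voters)

voters = [0.0, 0.0, 1.0]           # positions of three voters
candidates = {"A": 0.0, "B": 1.0}  # positions of two candidates

costs = {name: social_cost(voters, pos) for name, pos in candidates.items()}
opt = min(costs.values())

# Ratio of each candidate's social cost to the optimum on this instance:
for name, c in costs.items():
    print(name, c / opt)  # A -> 1.0 (optimal), B -> 2.0
```

An ordinal rule only sees that two voters rank A first and one ranks B first, so its distortion is the largest such ratio it can be forced into across every metric consistent with those rankings.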
Proportional Approval Voting, Harmonic k-median, and Negative Association
We study a generic framework that provides a unified view on two important classes of problems: (i) extensions of the k-median problem where clients are interested in having multiple facilities in their vicinity (e.g., because, with some small probability, the closest facility might be malfunctioning and so unavailable for use), and (ii) finding winners according to some appealing multiwinner election rules, i.e., election systems aimed at choosing representative bodies, such as parliaments, based on the preferences of a population of voters over individual candidates. Each problem in our framework is associated with a vector of weights: we show that the approximability of the problem depends on structural properties of these vectors. We specifically focus on the harmonic sequence of weights, since it results in particularly appealing properties of the considered problem. In particular, the objective function interpreted in a multiwinner election setup corresponds to the well-known Proportional Approval Voting (PAV) rule.
Our main result is that, due to the specific (harmonic) structure of weights, the problem admits a constant-factor approximation. This is surprising since the problem can be interpreted as a variant of the k-median problem where we do not assume that the connection costs satisfy the triangle inequality. To the best of our knowledge, this is the first constant-factor approximation algorithm for a variant of k-median that does not require this assumption. The algorithm we propose is based on dependent rounding [Srinivasan, FOCS'01] applied to the solution of a natural LP-relaxation of the problem. The rounding process is well known to produce distributions over integral solutions satisfying Negative Correlation (NC), which is usually sufficient for the analysis of approximation guarantees offered by rounding procedures. In our analysis, however, we need to use the fact that the carefully implemented rounding process satisfies a stronger property, called Negative Association (NA), which allows us to apply standard concentration bounds for conditional random variables.
Collective Schedules: Scheduling Meets Computational Social Choice
When scheduling public works or events in a shared facility one needs to accommodate the preferences of a population. We formalize this problem by introducing the notion of a collective schedule. We show how to extend fundamental tools from social choice theory (positional scoring rules, the Kemeny rule and the Condorcet principle) to collective scheduling. We study the computational complexity of finding collective schedules. We also experimentally demonstrate that optimal collective schedules can be found for instances of realistic size.
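A Kemeny-style aggregation of schedules can be sketched by brute force for unit-length jobs: pick the order minimizing the total number of pairwise disagreements with the voters' preferred schedules. (The paper also handles jobs of different lengths; this illustration, including the job names, ignores that and is not the paper's exact rule.)

```python
from itertools import permutations

def disagreements(schedule, preference):
    """Count job pairs ordered differently in the two sequences."""
    pos_s = {job: i for i, job in enumerate(schedule)}
    pos_p = {job: i for i, job in enumerate(preference)}
    jobs = list(schedule)
    return sum(1
               for i in range(len(jobs))
               for j in range(i + 1, len(jobs))
               if (pos_s[jobs[i]] < pos_s[jobs[j]]) !=
                  (pos_p[jobs[i]] < pos_p[jobs[j]]))

def kemeny_schedule(preferences):
    """Brute-force the schedule minimizing total disagreement."""
    jobs = preferences[0]
    return min(permutations(jobs),
               key=lambda s: sum(disagreements(s, p) for p in preferences))

prefs = [("roadwork", "park", "bridge"),
         ("roadwork", "bridge", "park"),
         ("park", "roadwork", "bridge")]
print(kemeny_schedule(prefs))  # ('roadwork', 'park', 'bridge')
```

The brute force is exponential in the number of jobs, consistent with the abstract's point that the complexity of finding collective schedules is itself a question, while realistic instances remain solvable in practice.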
High-performance FPGA architecture for data streams processing on example of IPsec gateway
In the modern digital world, there is strong demand for efficient data-stream processing methods. One application area is cybersecurity: IPsec is a suite of protocols that adds security to communications at the IP level. This paper presents the principles of a high-performance FPGA architecture for data-stream processing, using an IPsec gateway implementation as an example. The efficiency of the proposed solution allows it to be used in networks with data rates of several Gbit/s.
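Multi-Gbit/s rates follow from generic FPGA pipelining arithmetic rather than raw clock speed: a fully pipelined datapath that accepts one word per clock cycle sustains throughput = datapath width × clock rate. The width and clock figures below are assumptions for illustration, not numbers from the paper:

```python
bus_width_bits = 128   # assumed datapath width
clock_hz = 200e6       # assumed 200 MHz fabric clock

# One 128-bit word accepted every cycle:
throughput_gbps = bus_width_bits * clock_hz / 1e9
print(throughput_gbps)  # 25.6
```

This is why widening the datapath and keeping the pipeline full matters more on an FPGA than chasing clock frequency.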