New results on a generalized coupon collector problem using Markov chains
In this paper we study a generalized coupon collector problem, which consists
of determining the distribution and the moments of the time needed to collect a
given number of distinct coupons that are drawn from a set of coupons with an
arbitrary probability distribution. We suppose that a special coupon called the
null coupon can be drawn but never belongs to any collection. In this context,
we obtain expressions of the distribution and the moments of this time. We also
prove that the almost-uniform distribution, for which all the non-null coupons
have the same drawing probability, is the distribution which minimizes the
expected time to get a fixed subset of distinct coupons. This optimization
result is extended to the complementary distribution of that time when the full
collection is considered, thereby proving a well-known conjecture.
Finally, we propose a new conjecture which expresses the fact that the
almost-uniform distribution should minimize the complementary distribution of
the time needed to get any fixed number of distinct coupons.Comment: 14 page
D.4.1 – Application scenarios and Design of infrastructure
This report has been written by all members of the consortium. SocioPlug is a research project, funded by the French National Agency for Research, which aims to investigate a Social Cloud over Plug Networks, enabling symmetric access to data and preserving privacy. In this project, we will perform both theoretical and practical evaluation of the solutions proposed in all the work packages. Task 4 (Infrastructure and Experimentation) is structured around use cases in which the partners have developed prior expertise, such as distributed collaborative systems, the social web, and Smart Building. These well-known distributed systems will be revisited to fit the constraints of a federation of plugs, using the results of the other tasks. In this deliverable, we present some details of the infrastructure of the federation of plug computers that will be developed in this project. We plan to provide a demonstrator and to deploy on it applications corresponding to the use cases presented in this deliverable.
GCP: Gossip-based Code Propagation for Large-scale Mobile Wireless Sensor Networks
Wireless sensor networks (WSNs) have recently received increasing interest.
They are now expected to be deployed for long periods of time, thus requiring
software updates. Updating the software code automatically on a huge number of
sensors is a tremendous task, as ''by hand'' updates are obviously not
feasible, especially when all participating sensors are embedded on mobile
entities. In this paper, we investigate an approach to automatically update
software in mobile sensor-based applications when no localization mechanism is
available. We leverage the peer-to-peer cooperation paradigm to achieve a good
trade-off between reliability and scalability of code propagation. More
specifically, we present the design and evaluation of GCP (Gossip-based Code
Propagation), a distributed software-update algorithm for mobile wireless
sensor networks. GCP relies on two different mechanisms (piggy-backing and
forwarding control) to significantly improve load balance without
sacrificing propagation speed. We compare GCP against traditional
dissemination approaches. Simulation results based on both synthetic and
realistic workloads show that GCP achieves a good convergence speed while
balancing the load evenly between sensors.
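The propagation side of such an approach can be sketched with a toy push-gossip simulation (an illustrative model only; GCP's piggy-backing and forwarding-control mechanisms are not reproduced here, and the parameters are arbitrary):

```python
import random

def gossip_rounds(n=200, fanout=2, seed=3):
    """Toy push-gossip dissemination: each round, every node that
    already holds the new code version forwards it to `fanout`
    uniformly random peers. Returns the number of rounds until
    all n nodes are updated."""
    rng = random.Random(seed)
    updated = {0}            # node 0 starts with the new version
    rounds = 0
    while len(updated) < n:
        rounds += 1
        for _ in list(updated):
            for _ in range(fanout):
                updated.add(rng.randrange(n))
    return rounds

print(gossip_rounds())
```

Even this stripped-down model exhibits the logarithmic-round convergence that makes gossip attractive for large-scale dissemination; GCP's extra mechanisms address the load-balance side that this sketch ignores.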
Optimization results for a generalized coupon collector problem
In this paper we study a generalized coupon collector problem, which consists
of analyzing the time needed to collect a given number of distinct coupons that
are drawn from a set of coupons with an arbitrary probability distribution. We
suppose that a special coupon called the null coupon can be drawn but never
belongs to any collection. In this context, we prove that the almost-uniform
distribution, for which all the non-null coupons have the same drawing
probability, is the distribution which stochastically minimizes the time needed
to collect a fixed number of distinct coupons. Moreover, we show that in a
given closed subset of probability distributions, the distribution in which all
entries but one are equal to the smallest possible value is the one which
stochastically maximizes the time needed to collect a fixed number of distinct
coupons. A computer science application shows the utility of these results. Comment: arXiv admin note: text overlap with arXiv:1402.524
Nothing can compare with a population, besides agents
15 pages. Leveraging the resemblances between two areas explored so far independently makes it possible to provide a theoretical framework for distributed systems where global behaviors emerge from a set of local interactions. The contribution of this paper arises from the observation that population protocols and multi-agent systems (MAS) bear many resemblances. In particular, some subclasses of MAS seem to have the same computational power as population protocols. Population protocols provide theoretical foundations for networks of tiny mobile devices. On the other hand, stemming from long-standing research in distributed artificial intelligence, MAS form an interesting model of society and have a broad spectrum of application fields, from simple reactive systems to the social sciences. Linking the two models should offer several extremely interesting outcomes.
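As a concrete reminder of the population-protocol model the abstract refers to, here is a textbook leader-election protocol simulated in Python (a standard example, not taken from this paper): every agent starts as a leader, and when two leaders interact, one demotes itself, so a unique leader emerges from purely local pairwise interactions.

```python
import random

def leader_election(n=100, seed=5):
    """Classic population protocol: agents have state 'leader' or
    'follower'; the transition (leader, leader) -> (leader, follower)
    is applied on random pairwise interactions until one leader
    remains. Returns the number of interactions used."""
    rng = random.Random(seed)
    leader = [True] * n
    interactions = 0
    while sum(leader) > 1:
        i, j = rng.sample(range(n), 2)   # scheduler picks a random pair
        interactions += 1
        if leader[i] and leader[j]:
            leader[j] = False            # one of the two demotes itself
    return interactions

print(leader_election())
```

The global property (exactly one leader) is never computed by any single agent; it emerges from the interaction rule, which is exactly the phenomenon the paper relates to multi-agent systems.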
Sketch *-metric: Comparing Data Streams via Sketching
12 pages, double column. In this paper, we consider the problem of estimating the distance between any two large data streams under small-space constraints. This problem is of utmost importance in data-intensive monitoring applications where input streams are generated rapidly. These streams need to be processed on the fly and accurately to quickly detect any deviation from nominal behavior. We present a new metric, the Sketch *-metric, which makes it possible to define a distance between updatable summaries (or sketches) of large data streams. An important feature of the Sketch *-metric is that, given a measure on the entire initial data streams, the Sketch *-metric preserves the axioms of that measure on the sketches (such as non-negativity, identity, symmetry, and the triangle inequality, but also specific properties of the f-divergence). Extensive experiments conducted on both synthetic traces and real data allow us to validate the robustness and accuracy of the Sketch *-metric.
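The idea of measuring distance directly on sketches can be illustrated with a deliberately simplified toy (a count-min-style summary compared row-wise with an L1 distance; this is an assumption-laden sketch of the general idea, not the paper's Sketch *-metric construction):

```python
def sketch(stream, width=64, depth=4):
    """Toy count-min-style summary: depth rows of width counters;
    row r buckets item x into cell hash((r, x)) % width. Use integer
    items so hashing is deterministic across runs."""
    rows = [[0] * width for _ in range(depth)]
    for x in stream:
        for r in range(depth):
            rows[r][hash((r, x)) % width] += 1
    return rows

def sketch_dist(sa, sb):
    """Max over rows of the per-row L1 distance between counter
    vectors. Each row's L1 is a pseudometric on streams, and the max
    of pseudometrics is again one: non-negative, symmetric, and
    triangle-inequality-preserving."""
    return max(sum(abs(u - v) for u, v in zip(ra, rb))
               for ra, rb in zip(sa, sb))

a = sketch([1, 2, 3] * 20)
b = sketch([1, 2, 3] * 20)   # identical stream -> identical sketch
c = sketch([7, 8, 9] * 40)   # different items and different length
print(sketch_dist(a, b), sketch_dist(a, c))  # 0 for identical streams
```

The point mirrored from the abstract is that axioms of the underlying measure (here, the pseudometric axioms of L1) carry over to the distance computed on the small summaries.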
AnKLe: détection automatique d'attaques par divergence d'information
4 pages. International audience. In this article, we consider the context of very large distributed systems, in which each node must be able to quickly analyze a large amount of information arriving in the form of a stream. Since this stream may have been tampered with by an adversary, a fundamental problem consists in detecting and quantifying the malicious actions performed on the stream. To this end, we propose AnKLe (Attack-tolerant eNhanced Kullback-Leibler divergence Estimator), a local algorithm for estimating the Kullback-Leibler divergence between the observed stream and the expected one. AnKLe combines sampling techniques with methods from information theory. It is efficient in both space and time complexity, and requires only a single pass over the stream. Experimental results show that the estimator provided by AnKLe is relevant for several types of attacks, for which the other existing methods are significantly less accurate.
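To make the estimation task concrete, here is a naive plug-in Kullback-Leibler estimator over two fully stored streams (a two-pass baseline for intuition only; AnKLe itself is a single-pass, space-efficient estimator, which this sketch does not attempt to reproduce):

```python
import math
from collections import Counter

def empirical_kl(observed, expected, alpha=1e-6):
    """Plug-in estimate of D(p_observed || p_expected) from two item
    streams, with additive smoothing (alpha) so that log ratios stay
    finite on items seen in only one stream."""
    support = set(observed) | set(expected)
    co, ce = Counter(observed), Counter(expected)
    n = len(observed) + alpha * len(support)
    m = len(expected) + alpha * len(support)
    kl = 0.0
    for x in support:
        p = (co[x] + alpha) / n
        q = (ce[x] + alpha) / m
        kl += p * math.log(p / q)
    return kl

nominal = [1] * 50 + [2] * 50    # the expected stream
attacked = [1] * 90 + [2] * 10   # observed stream skewed by an adversary
print(empirical_kl(nominal, nominal))   # 0.0 for identical streams
print(empirical_kl(attacked, nominal))  # clearly positive under the skew
```

A divergence near zero indicates the observed stream matches the expected distribution, while a large value flags a potential attack; the challenge AnKLe addresses is obtaining such an estimate without storing the streams.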