
    Avoiding Flow Size Overestimation in the Count-Min Sketch with Bloom Filter Constructions

    The Count-Min sketch is the most popular data structure for flow size estimation, a basic measurement task required in many networks. Typically the number of potential flows is large, making it infeasible to maintain a dedicated counter per flow in memory with a high access rate. The Count-Min sketch is probabilistic and relies on mapping each flow to multiple counters through hashing. This introduces potential estimation error: the size of a flow is overestimated when all of its counters are shared with other flows that have observed traffic. Although the estimation error can be probabilistically bounded, many applications benefit from accurate flow size estimation and from a guarantee that overestimation is completely avoided. We describe a design of the Count-Min sketch with accurate estimations whenever the number of flows with observed traffic follows a known bound, regardless of the identity of these particular flows. We make use of Bloom filter constructions that avoid false positives and point out the limitations of existing Bloom filter designs for accurate size estimation. We propose new Bloom filter constructions that scale to support a larger number of flows and explain how these yield the unique guarantee of accurate flow size estimation in the well-known Count-Min sketch.

    Ori Rottenstreich was partially supported by the German-Israeli Foundation for Scientific Research and Development (GIF), by the Gordon Fund for System Engineering, as well as by the Technion Hiroshi Fujiwara Cyber Security Research Center and the Israel National Cyber Directorate. Pedro Reviriego would like to acknowledge the support of the ACHILLES project PID2019-104207RB-I00 and the Go2Edge network RED2018-102585-T funded by the Spanish Ministry of Science and Innovation, and of the Madrid Community research project TAPIR-CM grant no. P2018/TCS-4496.
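    To make the mechanism above concrete, the following minimal Python sketch illustrates the standard Count-Min update and query operations and the overestimation the paper aims to avoid. It is a generic illustration under assumed names and parameters (CountMinSketch, width, depth, the use of hashlib), not the construction proposed in the paper.

```python
import hashlib


class CountMinSketch:
    """Minimal Count-Min sketch: depth rows of width counters, one hash per row."""

    def __init__(self, width=1024, depth=4):
        self.width, self.depth = width, depth
        self.counters = [[0] * width for _ in range(depth)]

    def _index(self, flow_id, row):
        # One hash per row, reduced modulo the row width.
        digest = hashlib.sha256(f"{row}:{flow_id}".encode()).hexdigest()
        return int(digest, 16) % self.width

    def update(self, flow_id, count=1):
        for row in range(self.depth):
            self.counters[row][self._index(flow_id, row)] += count

    def query(self, flow_id):
        # The estimate is the minimum over the flow's counters; it never
        # underestimates, and it overestimates exactly when every one of
        # these counters is also incremented by other observed flows.
        return min(self.counters[row][self._index(flow_id, row)]
                   for row in range(self.depth))


sketch = CountMinSketch()
sketch.update("flow-A", 5)
sketch.update("flow-B", 3)
print(sketch.query("flow-A"))  # >= 5; equals 5 unless all of flow-A's counters collide
```

    As the abstract explains, the proposed Bloom filter constructions are aimed precisely at ruling out the case in which all of a flow's counters are shared, given a known bound on the number of flows with observed traffic.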

    Attosecond time-resolved photoelectron holography

    Ultrafast strong-field physics provides insight into quantum phenomena that evolve on an attosecond time scale, the most fundamental of which is quantum tunneling. The tunneling process initiates a range of strong field phenomena such as high harmonic generation (HHG), laser-induced electron diffraction, double ionization and photoelectron holography, all evolving during a fraction of the optical cycle. Here we apply attosecond photoelectron holography as a method to resolve the temporal properties of the tunneling process. Adding a weak second harmonic (SH) field to a strong fundamental laser field enables us to reconstruct the ionization times of photoelectrons that play a role in the formation of a photoelectron hologram with attosecond precision. We decouple the contributions of the two arms of the hologram and resolve the subtle differences in their ionization times, separated by only a few tens of attoseconds.

    The role of clinical decision support systems in preventing stroke in primary care: a systematic review.

    Computerized clinical decision support systems (CDSS) are increasingly being used to facilitate the role of clinicians in complex decision-making processes. This systematic review evaluates the evidence on CDSS developed and tested to support decision-making for stroke prevention in primary healthcare, as well as the barriers to their practical implementation in primary care settings. A systematic search of Web of Science, Medline Ovid, Embase Ovid, and Cinahl was conducted. A total of five studies, experimental and observational, were synthesised in this review. The review found that CDSS facilitate decision-making about stroke prevention options in primary healthcare settings. However, barriers were identified in designing, implementing, and using the CDSS.

    Requirements and validation of a prototype learning health system for clinical diagnosis

    Introduction: Diagnostic error is a major threat to patient safety in the context of family practice. The patient safety implications are severe for both patient and clinician. Traditional approaches to diagnostic decision support have lacked broad acceptance for a number of well-documented reasons: poor integration with electronic health records and clinician workflow, static evidence that lacks transparency and trust, and the use of proprietary technical standards that hinder wider interoperability. The learning health system (LHS) provides a suitable infrastructure for the development of a new breed of learning decision support tools. These tools exploit the potential of the growing volumes of aggregated electronic health record sources.

    Methods: We describe the experiences of the TRANSFoRm project in developing a diagnostic decision support infrastructure consistent with the wider goals of the LHS. We describe an architecture that is model driven, service oriented, constructed using open standards, and supports evidence derived from electronic sources of patient data. We describe the architecture and implementation of two critical aspects of a successful LHS: the model representation and translation of clinical evidence into effective practice, and the generation of curated clinical evidence that can be used to populate those models, thus closing the LHS loop.

    Results/Conclusions: Six core design requirements for implementing a diagnostic LHS are identified and successfully implemented as part of this research work. A number of significant technical and policy challenges are identified for the LHS community to consider, and these are discussed in the context of evaluating this work: medico-legal responsibility for generated diagnostic evidence, developing trust in the LHS (particularly important from the perspective of decision support), and constraints imposed by clinical terminologies on evidence generation.

    Superselectors: Efficient Constructions and Applications

    We introduce a new combinatorial structure: the superselector. We show that superselectors subsume several important combinatorial structures used in recent years to solve problems in group testing, compressed sensing, multi-channel conflict resolution and data security. We prove close upper and lower bounds on the size of superselectors and provide efficient algorithms for their construction. Although our bounds are very general, when they are instantiated on the combinatorial structures that are particular cases of superselectors (e.g., (p,k,n)-selectors, (d,ℓ)-list-disjunct matrices, MUT_k(r)-families, FUT(k,a)-families, etc.), they match the best known bounds in terms of the size of the structures (the relevant parameter in the applications). For appropriate values of the parameters, our results also provide the first efficient deterministic algorithms for the construction of such structures.
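    One of the structures subsumed by superselectors is the disjunct matrix used in non-adaptive group testing. The following Python sketch illustrates that use case only, not the constructions in this paper: the random test design and the names random_test_matrix and comp_decode are assumptions made for the example, whereas the paper provides deterministic constructions with matching size bounds.

```python
import random


def random_test_matrix(num_tests, num_items, p=0.25, seed=0):
    """Random binary design: entry (t, i) == 1 means item i is placed in test t.
    A random matrix is only a stand-in here; explicit deterministic
    constructions are what the paper is about."""
    rng = random.Random(seed)
    return [[1 if rng.random() < p else 0 for _ in range(num_items)]
            for _ in range(num_tests)]


def run_tests(matrix, defectives):
    """A test outcome is positive iff the test contains at least one defective item."""
    return [any(row[i] for i in defectives) for row in matrix]


def comp_decode(matrix, outcomes):
    """COMP decoding: any item that appears in a negative test cannot be
    defective; everything else is reported as a candidate defective.
    With a d-disjunct design and at most d defectives, this is exact."""
    candidates = set(range(len(matrix[0])))
    for row, positive in zip(matrix, outcomes):
        if not positive:
            candidates -= {i for i, bit in enumerate(row) if bit}
    return sorted(candidates)


matrix = random_test_matrix(num_tests=60, num_items=100)
defectives = {3, 17, 58}
print(comp_decode(matrix, run_tests(matrix, defectives)))  # typically [3, 17, 58]
```

    With a d-disjunct design and at most d defective items, COMP decoding recovers the defectives exactly; a random matrix only achieves this with high probability, which is one reason efficient deterministic constructions of such structures matter.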