Prochlo: Strong Privacy for Analytics in the Crowd
The large-scale monitoring of computer users' software activities has become
commonplace, e.g., for application telemetry, error reporting, or demographic
profiling. This paper describes a principled systems architecture---Encode,
Shuffle, Analyze (ESA)---for performing such monitoring with high utility while
also protecting user privacy. The ESA design, and its Prochlo implementation,
are informed by our practical experiences with an existing, large deployment of
privacy-preserving software monitoring.
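A minimal, illustrative sketch of the Encode, Shuffle, Analyze pipeline named above, assuming a simple per-user report format and a fixed crowd threshold in the shuffler; the function names, record format, and threshold value are assumptions for illustration and not the Prochlo implementation:

```python
import random
from collections import Counter

def encode(value, user_id):
    # Encoder: strip identifying metadata and keep only the value of interest.
    # (Real encoders may also fragment or locally randomize records.)
    return value

def shuffle(records, threshold=3):
    # Shuffler: batch records, discard ordering/origin, and drop any value
    # reported by fewer than `threshold` users (a crowd/anonymity guard).
    random.shuffle(records)
    counts = Counter(records)
    return [r for r in records if counts[r] >= threshold]

def analyze(records):
    # Analyzer: compute aggregate statistics over the anonymized batch.
    return Counter(records)

# Illustrative run: per-user reports flow through the three stages.
reports = [encode(v, uid) for uid, v in
           [(1, "crash_A"), (2, "crash_A"), (3, "crash_A"), (4, "crash_B")]]
print(analyze(shuffle(reports)))   # crash_B is suppressed by the threshold
```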
Generic system architecture for context-aware, distributed recommendation
In the existing literature on recommender systems, it is difficult to find an architecture for large-scale implementation. Often, the architectures proposed in papers are specific to an algorithm implementation or a domain. Thus, there is no clear architectural starting point for a new recommender system. This paper presents an architecture blueprint for a context-aware recommender system that provides scalability, availability, and security for its users. The architecture also contributes the dynamic ability to switch between single-device (offline), client-server (online), and fully distributed implementations. From this blueprint, a new recommender system could be built with minimal design and implementation effort regardless of the application.
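A rough sketch of how switching between single-device (offline), client-server (online), and distributed modes might sit behind a single interface; the class names, context keys, and the `client.post` call are hypothetical and not taken from the paper:

```python
from abc import ABC, abstractmethod

class RecommenderBackend(ABC):
    """Common interface so application code is agnostic to the deployment mode."""
    @abstractmethod
    def recommend(self, user_id: str, context: dict) -> list[str]:
        ...

class LocalBackend(RecommenderBackend):
    # Single-device (offline) mode: recommendations come from an on-device store.
    def __init__(self, table):
        self.table = table
    def recommend(self, user_id, context):
        return self.table.get((user_id, context.get("time_of_day")), [])

class RemoteBackend(RecommenderBackend):
    # Client-server (online) mode: delegate to a recommendation service.
    def __init__(self, client):
        self.client = client
    def recommend(self, user_id, context):
        return self.client.post("/recommend", {"user": user_id, "context": context})

class Recommender:
    # Facade that lets the application switch modes at runtime,
    # e.g. fall back to the local backend when the network is unavailable.
    def __init__(self, backend: RecommenderBackend):
        self.backend = backend
    def switch(self, backend: RecommenderBackend):
        self.backend = backend
    def recommend(self, user_id, context):
        return self.backend.recommend(user_id, context)

# Illustrative use with the offline backend.
offline = LocalBackend({("alice", "evening"): ["movie_42", "movie_7"]})
rec = Recommender(offline)
print(rec.recommend("alice", {"time_of_day": "evening"}))
```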
Some Theoretical Results of Hypercube for Parallel Architecture
This paper surveys some theoretical results on the hypercube relevant to the design of VLSI architectures. Parallel computers, including the hypercube multiprocessor, will become a leading technology supporting efficient computation for large, uncertain systems.
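For context, two basic hypercube properties that underlie such results can be stated concretely: nodes whose binary labels differ in exactly one bit are adjacent, and dimension-order routing reaches any destination in at most d hops. The sketch below illustrates both; it is a generic illustration, not code from the paper:

```python
def neighbors(node: int, d: int) -> list[int]:
    # In a d-dimensional hypercube, two nodes are adjacent iff their binary
    # labels differ in exactly one bit, so each node has exactly d neighbors.
    return [node ^ (1 << i) for i in range(d)]

def route(src: int, dst: int) -> list[int]:
    # Dimension-order (e-cube) routing: fix the differing bits one at a time.
    # The path length equals the Hamming distance, at most d hops (the diameter).
    path, cur = [src], src
    diff = src ^ dst
    bit = 0
    while diff:
        if diff & 1:
            cur ^= (1 << bit)
            path.append(cur)
        diff >>= 1
        bit += 1
    return path

print(neighbors(0b000, 3))      # [1, 2, 4]
print(route(0b000, 0b101))      # [0, 1, 5]
```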
Photonic architecture for scalable quantum information processing in NV-diamond
Physics and information are intimately connected, and the ultimate
information processing devices will be those that harness the principles of
quantum mechanics. Many physical systems have been identified as candidates for
quantum information processing, but none of them are immune from errors. The
challenge remains to find a path from the experiments of today to a reliable
and scalable quantum computer. Here, we develop an architecture based on a
simple module comprising an optical cavity containing a single
negatively-charged nitrogen vacancy centre in diamond. Modules are connected by
photons propagating in a fiber-optical network and collectively used to
generate a topological cluster state, a robust substrate for quantum
information processing. In principle, all processes in the architecture can be
deterministic, but current limitations lead to processes that are probabilistic
but heralded. We find that the architecture enables large-scale quantum
information processing with existing technology.
Increasing the power efficiency of Bloom filters for network string matching
Although software-based techniques are widely accepted in computer security systems, there is growing interest in exploiting hardware in order to keep pace with increases in network bandwidth. Recently, hardware-based virus-protection systems have started to emerge. These hardware systems work by identifying malicious content and removing it from the network streams. In principle, they rely on string matching: they compare virus signatures, bit by bit, against the bit strings in the network traffic. Bloom filters are ideal data structures for string matching. Nonetheless, they consume considerable power when many of them are used in parallel to match different virus signatures. In this paper, we propose a new type of Bloom filter architecture that exploits the well-known pipelining technique.
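A minimal sketch of the idea, assuming that pipelining splits the hash functions into two stages so the later stage is evaluated only when the earlier one hits; the stage split, hash choice, and parameters are assumptions for illustration and do not reproduce the paper's hardware design:

```python
import hashlib

class PipelinedBloomFilter:
    """Toy Bloom filter split into two hash stages. The second stage is only
    evaluated when the first stage hits, which is the intuition behind
    pipelining for power savings in parallel signature matching."""

    def __init__(self, m=1024, k1=2, k2=2):
        self.m = m                      # number of bits in the filter
        self.k1, self.k2 = k1, k2       # hash functions in stage 1 / stage 2
        self.bits = [False] * m

    def _hashes(self, item, k, salt):
        # Derive k indices from a cryptographic hash; salts separate the stages.
        for i in range(k):
            h = hashlib.sha256(f"{salt}:{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def add(self, signature: str):
        for idx in self._hashes(signature, self.k1, "s1"):
            self.bits[idx] = True
        for idx in self._hashes(signature, self.k2, "s2"):
            self.bits[idx] = True

    def might_contain(self, data: str) -> bool:
        # Stage 1: cheap check; most benign traffic is rejected here.
        if not all(self.bits[i] for i in self._hashes(data, self.k1, "s1")):
            return False
        # Stage 2: only runs on stage-1 hits, so it is rarely exercised.
        return all(self.bits[i] for i in self._hashes(data, self.k2, "s2"))

bf = PipelinedBloomFilter()
bf.add("virus-signature-123")
print(bf.might_contain("virus-signature-123"))  # True
print(bf.might_contain("benign payload"))       # almost certainly False
```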
The AI Bus architecture for distributed knowledge-based systems
The AI Bus architecture is a layered, distributed, object-oriented framework developed to support the requirements of advanced technology programs for an order-of-magnitude improvement in software costs. The consequent need for highly autonomous computer systems, adaptable to new technology advances over a long lifespan, led to the design of an open architecture and toolbox for building large-scale, robust, production-quality systems. The AI Bus accommodates a mix of knowledge-based and conventional components running in heterogeneous, distributed real-world and testbed environments. This paper describes the concepts and design of the AI Bus architecture and its current implementation status as a Unix C++ library of reusable objects. Each high-level, semiautonomous agent process consists of a number of knowledge sources together with inter-agent communication mechanisms based on shared blackboards and message-passing acquaintances. Standard interfaces and protocols are followed for combining and validating subsystems. Dynamic probes, or demons, provide an event-driven means of giving active objects shared access to resources and to each other without violating their security.
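A toy sketch of the blackboard-plus-agents pattern described above, assuming synchronous, in-process dispatch; the class names and posted keys are hypothetical, and the actual AI Bus is a distributed Unix C++ library rather than Python:

```python
class Blackboard:
    # Shared store that knowledge sources read from and post results to.
    def __init__(self):
        self.entries = {}
        self.agents = []
    def register(self, agent):
        self.agents.append(agent)
        agent.board = self
    def post(self, key, value):
        self.entries[key] = value
        # Event-driven dispatch: every registered agent gets a chance to react,
        # loosely in the spirit of the "demons"/active objects described above.
        for agent in self.agents:
            agent.notify(key, value)

class Agent:
    # Semi-autonomous agent wrapping a single rule-like knowledge source.
    def __init__(self, name, condition, action):
        self.name, self.condition, self.action = name, condition, action
        self.board = None
    def notify(self, key, value):
        if self.condition(key, value):
            out_key, out_value = self.action(key, value)
            self.board.post(out_key, out_value)

board = Blackboard()
board.register(Agent("diagnoser",
                     condition=lambda k, v: k == "sensor_alarm",
                     action=lambda k, v: ("fault", f"fault near {v}")))
board.register(Agent("planner",
                     condition=lambda k, v: k == "fault",
                     action=lambda k, v: ("plan", f"reroute around {v}")))
board.post("sensor_alarm", "unit_3")
print(board.entries)
```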