Upper Bounds for the Critical Car Densities in Traffic Flow Problems
In most models of traffic flow, the car density $\rho$ is the only free
parameter in determining the average car velocity $\langle v\rangle$. The
critical car density $\rho_c$, which is defined to be the car density separating
the jamming phase (with $\langle v\rangle = 0$) and the moving phase (with
$\langle v\rangle > 0$), is an important physical quantity to investigate. By
means of a simple statistical argument, we show that $\rho_c < 1$ for the
Biham-Middleton-Levine model of traffic flow in two or higher spatial
dimensions. In particular, we obtain explicit upper bounds on $\rho_c$ in two
dimensions and in general $D$ dimensions. Comment: REVTEX 3.0, 5 pages with 1
figure appended at the back. Minor revision, to be published in the September
issue of J. Phys. Soc. Japan
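
For orientation, a minimal simulation sketch (not from the paper) of the Biham-Middleton-Levine model is given below; it estimates $\langle v\rangle$ as a function of $\rho$ on a small torus. The lattice size, step count, and the even/odd update convention are illustrative assumptions only.

    import random

    def bml_velocity(L=64, rho=0.3, steps=400, seed=0):
        """Crude estimate of the average car velocity <v> in the BML model.

        Cars live on an L x L torus at total density rho, half east-moving (1)
        and half north-moving (2).  East cars update on even time steps, north
        cars on odd ones; a car advances only if its target cell was empty at
        the start of the step.  Returns the fraction of active cars that moved.
        """
        rng = random.Random(seed)
        grid = [[0] * L for _ in range(L)]
        cells = [(x, y) for x in range(L) for y in range(L)]
        rng.shuffle(cells)
        n_cars = int(rho * L * L)
        for i, (x, y) in enumerate(cells[:n_cars]):
            grid[y][x] = 1 if i % 2 == 0 else 2

        moved_total = 0
        for t in range(steps):
            kind = 1 if t % 2 == 0 else 2              # 1 = east, 2 = north
            dx, dy = (1, 0) if kind == 1 else (0, 1)
            moves = [(x, y) for y in range(L) for x in range(L)
                     if grid[y][x] == kind
                     and grid[(y + dy) % L][(x + dx) % L] == 0]
            for x, y in moves:                         # synchronous update:
                grid[y][x] = 0                         # vacate old cells first,
            for x, y in moves:
                grid[(y + dy) % L][(x + dx) % L] = kind  # then occupy targets
            moved_total += len(moves)
        return moved_total / (steps * max(n_cars // 2, 1))

    if __name__ == "__main__":
        for rho in (0.2, 0.3, 0.4, 0.5):
            print(f"rho = {rho}:  <v> ~ {bml_velocity(rho=rho):.3f}")
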
Bidirectional PageRank Estimation: From Average-Case to Worst-Case
We present a new algorithm for estimating the Personalized PageRank (PPR)
between a source and target node on undirected graphs, with sublinear
running-time guarantees over the worst-case choice of source and target nodes.
Our work builds on a recent line of work on bidirectional estimators for PPR,
which obtained sublinear running-time guarantees but in an average-case sense,
for a uniformly random choice of target node. Crucially, we show how the
reversibility of random walks on undirected networks can be exploited to
convert average-case to worst-case guarantees. While past bidirectional methods
combine forward random walks with reverse local pushes, our algorithm combines
forward local pushes with reverse random walks. We also discuss how to modify
our methods to estimate random-walk probabilities for any length distribution,
thereby obtaining fast algorithms for estimating general graph diffusions,
including the heat kernel, on undirected networks. Comment: Workshop on Algorithms and Models for the Web-Graph (WAW) 2015
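
The following Python sketch illustrates the flavour of estimator the abstract describes: an ACL-style forward push from the source combined with $\alpha$-terminated random walks from the target, linked through the undirected reversibility identity $\pi_v(t)\,d_v = \pi_t(v)\,d_t$. It is a sketch under assumed parameter choices (push threshold, walk count), not the paper's exact algorithm or analysis.

    import random
    from collections import defaultdict

    def forward_push(adj, s, alpha=0.2, eps=1e-4):
        """ACL-style forward push from the source s.

        Maintains the invariant  pi_s(t) = p[t] + sum_v r[v] * pi_v(t),
        pushing residual mass out of any node whose residual exceeds
        eps times its degree.
        """
        p = defaultdict(float)
        r = {s: 1.0}
        frontier = [s]
        while frontier:
            v = frontier.pop()
            if r.get(v, 0.0) <= eps * len(adj[v]):
                continue                      # already pushed since queued
            rv = r.pop(v)
            p[v] += alpha * rv
            share = (1.0 - alpha) * rv / len(adj[v])
            for u in adj[v]:
                r[u] = r.get(u, 0.0) + share
                if r[u] > eps * len(adj[u]):
                    frontier.append(u)
        return p, r

    def walk_endpoint(adj, t, alpha, rng):
        """Endpoint of a random walk from t that stops w.p. alpha per step."""
        v = t
        while rng.random() > alpha:
            v = rng.choice(adj[v])
        return v

    def bidirectional_ppr(adj, s, t, alpha=0.2, eps=1e-4, walks=2000, seed=0):
        """Estimate pi_s(t) on an undirected graph by combining a forward push
        from s with random walks from t, using pi_v(t) = (d_t / d_v) pi_t(v)."""
        rng = random.Random(seed)
        p, r = forward_push(adj, s, alpha, eps)
        total = 0.0
        for _ in range(walks):
            v = walk_endpoint(adj, t, alpha, rng)
            total += r.get(v, 0.0) / len(adj[v])
        return p.get(t, 0.0) + len(adj[t]) * total / walks

    if __name__ == "__main__":
        # Tiny undirected example graph as adjacency lists.
        adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2]}
        print(bidirectional_ppr(adj, s=0, t=3))
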
C-slow retimed parallel histogram architectures for consumer imaging devices
A parallel pipelined array of cells suitable for real-time computation of histograms is proposed. The cell architecture builds on previous work obtained via C-slow retiming techniques and can be clocked at a frequency 65 percent higher than previous arrays. The new arrays can be exploited for higher throughput, particularly when dual-data-rate sampling techniques are used to operate on single streams of data from image sensors. In this way, the new cell operates on a p-bit data bus, which is more convenient for interfacing to camera sensors or to microprocessors in consumer digital cameras.
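
As a purely behavioural illustration (a plain Python model of a generic cell array, not the paper's C-slow retimed design, and ignoring clocking and dual-data-rate details), a histogram can be computed by streaming pixel values through a chain of cells, one bin counter per cell:

    class HistogramCell:
        """One cell of a linear array: owns a single bin counter and forwards
        the sample it received on the previous tick to the next cell."""
        def __init__(self, bin_value):
            self.bin_value = bin_value
            self.count = 0
            self.reg = None                    # pipeline register

        def tick(self, sample_in):
            sample_out = self.reg              # forwarded to the next cell
            self.reg = sample_in
            if sample_in is not None and sample_in == self.bin_value:
                self.count += 1
            return sample_out

    def streamed_histogram(samples, n_bins=256):
        """Push the sample stream through the cell array one tick at a time,
        then flush with empty slots so the last sample reaches the last cell."""
        cells = [HistogramCell(b) for b in range(n_bins)]
        for sample in list(samples) + [None] * len(cells):
            for cell in cells:
                sample = cell.tick(sample)
        return [cell.count for cell in cells]

    if __name__ == "__main__":
        hist = streamed_histogram([10, 10, 255, 3, 10])
        print(hist[10], hist[255], hist[3])    # -> 3 1 1
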
Predictable arguments of knowledge
We initiate a formal investigation of the power of predictability for argument-of-knowledge systems for NP. Specifically, we consider private-coin argument systems in which the answer of the prover can be predicted, given the private randomness of the verifier; we call such protocols Predictable Arguments of Knowledge (PAoK).
Our study encompasses a full characterization of PAoK, showing that such arguments can be made extremely laconic, with the prover sending a single bit, and that without loss of generality they have only one round (i.e., two messages) of communication.
We further explore PAoK satisfying additional properties (including zero-knowledge and the possibility of re-using the same challenge across multiple executions with the prover), present several constructions of PAoK relying on different cryptographic tools, and discuss applications to cryptography.
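
To make the notion concrete, here is a toy one-round protocol with the predictable-argument interface for knowledge of a discrete logarithm: the verifier's private coin lets it compute the unique accepting answer before the prover responds. It is only an illustration of predictability, not one of the paper's constructions, and it ignores knowledge extraction and serious parameter choices.

    import secrets

    # Toy predictable one-round protocol for knowledge of x with Y = g^x (mod P).
    # The verifier's private coin r determines the only accepting answer Y^r.
    # Illustration only: the group parameters below are not a safe choice.
    P = 2**127 - 1          # toy prime modulus
    G = 3                   # toy generator

    def verifier_challenge(Y):
        r = secrets.randbelow(P - 2) + 1
        challenge = pow(G, r, P)          # sent to the prover
        predicted = pow(Y, r, P)          # kept private: the accepting answer
        return challenge, predicted

    def prover_answer(challenge, x):
        return pow(challenge, x, P)       # (g^r)^x = Y^r

    if __name__ == "__main__":
        x = secrets.randbelow(P - 2) + 1  # prover's witness
        Y = pow(G, x, P)                  # public statement
        challenge, predicted = verifier_challenge(Y)
        answer = prover_answer(challenge, x)
        print("verifier accepts:", answer == predicted)
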
More is Less: Perfectly Secure Oblivious Algorithms in the Multi-Server Setting
The problem of Oblivious RAM (ORAM) has traditionally been studied in a
single-server setting, but more recently the multi-server setting has also been
considered. Yet it is still unclear whether the multi-server setting has any
inherent advantages, e.g., whether the multi-server setting can be used to
achieve stronger security goals or provably better efficiency than is possible
in the single-server case.
In this work, we construct a perfectly secure 3-server ORAM scheme that
outperforms the best known single-server scheme by a logarithmic factor. In the
process, we also show, for the first time, that there exist specific algorithms
for which multiple servers can overcome known lower bounds in the single-server
setting. Comment: 36 pages. Accepted at Asiacrypt 2018
Reconstruction of plasma density profiles by measuring spectra of radiation emitted from oscillating plasma dipoles
We suggest a new method for characterising non-uniform density distributions of plasma by measuring the spectra of radiation emitted from a localised plasma dipole oscillator excited by colliding electromagnetic pulses. The density distribution can be determined by scanning the collision point in space. Two-dimensional particle-in-cell simulations demonstrate the reconstruction of linear and nonlinear density profiles corresponding to laser-produced plasma. The method can be applied to a wide range of plasmas, including fusion and low-temperature plasmas. It overcomes many of the disadvantages of existing methods such as interferometry and spectroscopy, which only yield densities averaged along the path of the probe pulses.
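
The quantity underlying the method is the local plasma frequency, $\omega_p = \sqrt{n_e e^2/(\varepsilon_0 m_e)}$: a localised dipole oscillates at the $\omega_p$ of the plasma at the collision point, so the peak of the emitted spectrum maps to the local electron density. A small conversion sketch follows; the example frequency is made up for illustration.

    import math

    # Physical constants (SI units)
    EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m
    M_E  = 9.1093837015e-31   # electron mass, kg
    Q_E  = 1.602176634e-19    # elementary charge, C

    def density_from_peak_frequency(f_peak_hz):
        """Electron density (m^-3) whose plasma frequency equals f_peak.

        omega_p = sqrt(n_e e^2 / (eps0 m_e)), so the spectral peak of the
        dipole emission maps directly to the density at the collision point.
        """
        omega_p = 2 * math.pi * f_peak_hz
        return EPS0 * M_E * omega_p**2 / Q_E**2

    if __name__ == "__main__":
        # e.g. a measured peak at 30 THz corresponds to ~1.1e19 cm^-3
        n_m3 = density_from_peak_frequency(30e12)
        print(f"{n_m3:.3e} m^-3  =  {n_m3 * 1e-6:.3e} cm^-3")
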
Laser ablation loading of a radiofrequency ion trap
The production of ions via laser ablation for the loading of radiofrequency
(RF) ion traps is investigated using a nitrogen laser with a maximum pulse
energy of 0.17 mJ and a peak intensity of about 250 MW/cm^2. A time-of-flight
mass spectrometer is used to measure the ion yield and the distribution of the
charge states. Singly charged ions of elements that are presently considered
for use in optical clocks or quantum logic applications could be produced
from metallic samples at a rate of the order of 10^5 ions per pulse.
A linear Paul trap was loaded with Th+ ions produced by laser ablation. An
overall ion production and trapping efficiency of 10^-7 to 10^-6 was attained.
For ions injected individually, a dependence of the capture probability on the
phase of the RF field has been predicted. In the experiment this was not
observed, presumably because of collective effects within the ablation plume. Comment: submitted to Appl. Phys. B, special issue on ion trapping
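
As a back-of-the-envelope reading of the quoted numbers (interpreting the overall efficiency as trapped ions per produced ion, which the abstract does not state explicitly), the yield of ~10^5 ions per pulse combined with an efficiency of 10^-7 to 10^-6 would correspond to roughly 0.01 to 0.1 trapped ions per pulse:

    # Rough arithmetic only: this reads the quoted 1e-7 to 1e-6 "overall ion
    # production and trapping efficiency" as trapped ions per produced ion,
    # an interpretation the abstract does not spell out.
    ions_per_pulse = 1e5                       # quoted ablation yield per pulse
    for efficiency in (1e-7, 1e-6):
        trapped = ions_per_pulse * efficiency  # expected trapped ions per pulse
        print(f"efficiency {efficiency:.0e}: ~{trapped:.2f} trapped ions/pulse "
              f"(about {1 / trapped:.0f} pulses per trapped ion)")
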
Lorentz invariance violation in top-down scenarios of ultrahigh energy cosmic ray creation
The violation of Lorentz invariance (LI) has been invoked in a number of ways
to explain issues dealing with ultrahigh energy cosmic ray (UHECR) production
and propagation. These treatments, however, have mostly been limited to
examples in the proton-neutron and photon-electron systems. In this paper
we show how a broader violation of Lorentz invariance would allow for a series
of previously forbidden decays to occur, and how that could lead to UHECR
primaries being heavy baryonic states or Higgs bosons. Comment: Replaced with heavily revised (see new Abstract) version accepted by Phys. Rev. D. 6 pages