Computational Security Subject to Source Constraints, Guesswork and Inscrutability
Guesswork forms the mathematical framework for
quantifying computational security subject to brute-force determination
by query. In this paper, we consider guesswork
subject to a per-symbol Shannon entropy budget. We introduce
inscrutability rate to quantify the asymptotic difficulty of guessing
U out of V secret strings drawn from the string-source and
prove that the inscrutability rate of any string-source supported
on a finite alphabet X, if it exists, lies between the per-symbol
Shannon entropy constraint and log |X|. We show that for a
stationary string-source, the inscrutability rate of guessing any
fraction (1 - ϵ) of the V strings for any fixed ϵ > 0, as V
grows, approaches the per-symbol Shannon entropy constraint
(which is equal to the Shannon entropy rate for the stationary
string-source). This corresponds to the minimum inscrutability
rate among all string-sources with the same per-symbol Shannon
entropy. We further prove that the inscrutability rate of any
finite-order Markov string-source with hidden statistics remains
the same as the unhidden case, i.e., the asymptotic value of hiding
the statistics per each symbol is vanishing. On the other hand, we
show that there exists a string-source that achieves the upper limit
on the inscrutability rate, i.e., log |X|, under the same Shannon
entropy budget
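The guesswork framework referenced above can be illustrated with a small sketch (an assumption on my part, not code from the paper): for a single symbol, the guesswork of an optimal brute-force attacker who queries values in decreasing order of probability is E[G(X)] = Σ_i i · p_(i), where p_(1) ≥ p_(2) ≥ … are the sorted probabilities.

```python
def guesswork(probs):
    """Expected number of queries for an optimal brute-force attacker
    who guesses the most likely values first."""
    ordered = sorted(probs, reverse=True)
    return sum(i * p for i, p in enumerate(ordered, start=1))

# Uniform source over 4 symbols: every ordering is optimal,
# E[G] = (1 + 2 + 3 + 4) / 4 = 2.5
print(guesswork([0.25, 0.25, 0.25, 0.25]))  # 2.5

# A skewed source over the same alphabet is easier to guess on
# average, even though the alphabet size |X| is unchanged.
print(guesswork([0.7, 0.1, 0.1, 0.1]))
```

This single-symbol quantity is what the paper's inscrutability rate generalizes to the asymptotic, multi-string setting.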
Space--Time Tradeoffs for Subset Sum: An Improved Worst Case Algorithm
The technique of Schroeppel and Shamir (SICOMP, 1981) has long been the most
efficient way to trade space against time for the SUBSET SUM problem. In the
random-instance setting, however, improved tradeoffs exist. In particular, the
recently discovered dissection method of Dinur et al. (CRYPTO 2012) yields a
significantly improved space--time tradeoff curve for instances with strong
randomness properties. Our main result is that these strong randomness
assumptions can be removed, obtaining the same space--time tradeoffs in the
worst case. We also show that for small space usage the dissection algorithm
can be almost fully parallelized. Our strategy for dealing with arbitrary
instances is to instead inject the randomness into the dissection process
itself by working over a carefully selected but random composite modulus, and
to introduce explicit space--time controls into the algorithm by means of a
"bailout mechanism".
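For context (a sketch of the classical baseline these tradeoffs improve on, not the paper's algorithm), the standard meet-in-the-middle approach splits the instance in two, enumerates subset sums of each half, and matches them, giving O(2^(n/2)) time and space instead of O(2^n):

```python
def subset_sums(nums):
    """Set of all achievable subset sums of nums."""
    sums = {0}
    for x in nums:
        sums |= {s + x for s in sums}
    return sums

def subset_sum_mitm(nums, target):
    """Meet-in-the-middle SUBSET SUM: split the input, enumerate subset
    sums of each half, and check whether some left sum s has a matching
    right sum (target - s)."""
    half = len(nums) // 2
    right_sums = subset_sums(nums[half:])
    return any(target - s in right_sums for s in subset_sums(nums[:half]))

print(subset_sum_mitm([3, 34, 4, 12, 5, 2], 9))   # True (4 + 5)
print(subset_sum_mitm([3, 34, 4, 12, 5, 2], 30))  # False
```

Schroeppel–Shamir and the dissection method refine this idea to trade space against time more finely; working modulo a random composite, as the abstract describes, is what lets the dissection tradeoffs survive on worst-case instances.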
Order-of-magnitude physics of neutron stars
We use basic physics and simple mathematics accessible to advanced
undergraduate students to estimate the main properties of neutron stars. We set
the stage and introduce relevant concepts by discussing the properties of
"everyday" matter on Earth, degenerate Fermi gases, white dwarfs, and scaling
relations of stellar properties with polytropic equations of state. Then, we
discuss various physical ingredients relevant for neutron stars and how they
can be combined in order to obtain a couple of different simple estimates of
their maximum mass, beyond which they would collapse, turning into black holes.
Finally, we use the basic structural parameters of neutron stars to briefly
discuss their rotational and electromagnetic properties.
Comment: 13 pages, 3 figures, accepted for publication in European Physical
Journal
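One such order-of-magnitude estimate can be sketched numerically (my own back-of-the-envelope version, not necessarily the paper's derivation): balancing the degeneracy pressure of a nonrelativistic neutron gas against gravity, i.e., minimizing E(R) = E_kin + E_grav, gives R ≈ (9π/4)^(2/3) ħ² / (G m_n³ N^(1/3)) with N = M/m_n neutrons:

```python
import math

hbar  = 1.0546e-34   # reduced Planck constant, J s
G     = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
m_n   = 1.675e-27    # neutron mass, kg
M_sun = 1.989e30     # solar mass, kg

def ns_radius(mass_kg):
    """Radius from balancing neutron degeneracy pressure against gravity
    (nonrelativistic, uniform-density estimate)."""
    N = mass_kg / m_n  # number of neutrons
    return (9 * math.pi / 4) ** (2 / 3) * hbar**2 / (G * m_n**3 * N ** (1 / 3))

R = ns_radius(1.4 * M_sun)
print(f"R ~ {R / 1e3:.0f} km")  # order 10 km, consistent with observed neutron stars
```

Note that R decreases with M^(1/3), which is why the relativistic regime, and with it a maximum mass, eventually takes over.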
TEXTURAL ANALYSIS AND STATISTICAL INVESTIGATION OF PATTERNS IN SYNTHETIC APERTURE SONAR IMAGES
Textural analysis and statistical investigation of patterns in synthetic aperture sonar (SAS) images is useful for oceanographic purposes such as biological habitat mapping or bottom-type identification for offshore construction. Seafloor classification also has many tactical benefits for the U.S. Navy in terms of mine identification and undersea warfare. Common methods of texture analysis rely on statistical moments of image intensity or, more generally, on the probability density function of the scene. One of the most common techniques uses Haralick's Grey Level Co-occurrence Matrix (GLCM) to calculate image features used in the applications listed above. Although these features are widely used, seafloor classification and segmentation with them remain difficult, in part because they are typically calculated at a single scale. Improvements based on the understanding that patterns are multiscale were compared with this single-scale baseline, with the goal of improving seafloor classification. SAS data for this work was provided by the Norwegian Defence Research Establishment and was labeled into six distinct seafloor classes, with 757 total examples. We analyze the feature importance determined by neighborhood component analysis as a function of scale and direction to determine which spatial scales and azimuthal directions are most informative for classification performance.
Office of Naval Research, Arlington, VA, 22217
Lieutenant, United States Navy
Approved for public release. Distribution is unlimited.
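The GLCM technique at the heart of this abstract can be sketched in a few lines (a hedged toy illustration, not the thesis code): build a co-occurrence matrix of grey-level pairs at one pixel offset, then derive classic Haralick statistics such as contrast and energy from it. The offset (dy, dx) plays the role of the spatial scale and azimuthal direction discussed above:

```python
import numpy as np

def glcm(img, levels, dy, dx):
    """Normalized grey-level co-occurrence matrix for pixel pairs at
    offset (dy, dx), assuming integer grey levels in [0, levels)."""
    P = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            P[img[y, x], img[y + dy, x + dx]] += 1
    return P / P.sum()

def contrast(P):
    """Haralick contrast: sum of (i - j)^2 * P[i, j]."""
    i, j = np.indices(P.shape)
    return float(((i - j) ** 2 * P).sum())

def energy(P):
    """Haralick energy (angular second moment): sum of P[i, j]^2."""
    return float((P ** 2).sum())

# Toy 4-level "image" of horizontal stripes: smooth along rows,
# strongly varying across rows, so texture depends on direction.
stripes = np.array([[0, 0, 0, 0], [3, 3, 3, 3]] * 2)
P_h = glcm(stripes, 4, 0, 1)   # horizontal neighbours
P_v = glcm(stripes, 4, 1, 0)   # vertical neighbours
print(contrast(P_h), contrast(P_v))  # low contrast vs high contrast
```

Sweeping the offset over multiple distances and directions, as the multiscale improvements described above do, yields a family of such feature values per image patch rather than a single-scale snapshot.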