Selecting Closest Vectors Through Randomization
We consider the problem of finding the closest vectors to a given vector in a large set of vectors, and propose a randomized solution. The method has applications in Automatic Target Recognition (ATR), Web Information Retrieval, and Data Mining.
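The abstract does not describe the algorithm itself; as a generic illustration of randomized closest-vector search (an assumption for exposition, not the paper's method), the sketch below filters candidates by their distance along a few random projection directions and then computes exact distances only for the survivors. All function and parameter names here are hypothetical.

```python
import numpy as np

def closest_vectors_randomized(database, query, k=5, n_projections=8, keep_frac=0.1, seed=0):
    """Approximate k-closest vectors via random projections (illustrative sketch only).

    database: (N, d) array of vectors; query: (d,) vector.
    """
    rng = np.random.default_rng(seed)
    d = database.shape[1]
    # Project the database and the query onto a few random directions.
    directions = rng.normal(size=(d, n_projections))
    db_proj = database @ directions            # shape (N, n_projections)
    q_proj = query @ directions                # shape (n_projections,)
    # Keep only candidates whose projections land near the query's projection.
    proj_dist = np.linalg.norm(db_proj - q_proj, axis=1)
    n_keep = max(k, int(keep_frac * len(database)))
    candidates = np.argpartition(proj_dist, n_keep - 1)[:n_keep]
    # Exact distances on the surviving candidates only.
    exact = np.linalg.norm(database[candidates] - query, axis=1)
    return candidates[np.argsort(exact)[:k]]

# Example: the 5 approximate nearest neighbors among 10,000 random 64-d vectors.
db = np.random.default_rng(1).normal(size=(10_000, 64))
q = np.random.default_rng(2).normal(size=64)
print(closest_vectors_randomized(db, q, k=5))
```

The randomization reduces the number of exact distance computations at the cost of a small probability of missing a true nearest neighbor.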
An HB-like protocol secure against man-in-the-middle attacks
We construct a simple authentication protocol, secure against man-in-the-middle attacks, whose security is based solely on the hardness of the Learning Parity with Noise (LPN) problem. Our protocol is suitable for RFID devices, whose limited circuit size and power constraints rule out the use of more heavyweight operations such as modular exponentiation. The protocol is extremely simple: both parties compute a noisy bilinear function of their inputs. The proof, however, is quite technical, and we believe that some of our technical tools may be of independent interest.
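For background, a single round of the basic HB protocol (the textbook LPN-based scheme this work builds on, shown here as a generic sketch, not the paper's man-in-the-middle-secure protocol) has the verifier send a random challenge and the prover reply with a noisy inner product under the shared secret. The parameter values below are assumptions for illustration.

```python
import secrets

N_BITS = 128                  # length of the shared secret (illustrative choice)
NOISE_NUM, NOISE_DEN = 1, 8   # prover flips its answer with probability 1/8

def rand_bits(n):
    return [secrets.randbelow(2) for _ in range(n)]

def inner_product_mod2(a, b):
    return sum(x & y for x, y in zip(a, b)) & 1

def hb_round(secret):
    """One round of basic HB: challenge, then noisy inner product response."""
    challenge = rand_bits(N_BITS)                                # verifier -> prover
    noise = 1 if secrets.randbelow(NOISE_DEN) < NOISE_NUM else 0
    response = inner_product_mod2(secret, challenge) ^ noise     # prover -> verifier
    # The verifier compares against the noiseless inner product; over many rounds
    # it accepts if the fraction of mismatches stays close to the noise rate.
    return response == inner_product_mod2(secret, challenge)

secret = rand_bits(N_BITS)
rounds = 200
matches = sum(hb_round(secret) for _ in range(rounds))
print(f"{matches}/{rounds} rounds matched (expected about {rounds * 7 // 8})")
```

Note that basic HB is secure only against passive eavesdroppers; the paper's protocol, in which both parties evaluate a noisy bilinear function of their inputs, is designed precisely to close the gap to man-in-the-middle security.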
The Science Performance of JWST as Characterized in Commissioning
This paper characterizes the actual science performance of the James Webb Space Telescope (JWST), as determined from the six-month commissioning period. We summarize the performance of the spacecraft, telescope, science instruments, and ground system, with an emphasis on differences from pre-launch expectations. Commissioning has made clear that JWST is fully capable of achieving the discoveries for which it was built. Moreover, almost across the board, the science performance of JWST is better than expected; in most cases, JWST will go deeper faster than expected. The telescope and instrument suite have demonstrated the sensitivity, stability, image quality, and spectral range that are necessary to transform our understanding of the cosmos through observations spanning from near-earth asteroids to the most distant galaxies.
Comment: 5th version as accepted to PASP; 31 pages, 18 figures; https://iopscience.iop.org/article/10.1088/1538-3873/acb29
The James Webb Space Telescope Mission
Twenty-six years ago a small committee report, building on earlier studies, expounded a compelling and poetic vision for the future of astronomy, calling for an infrared-optimized space telescope with an aperture of at least 4 m. With the support of their governments in the US, Europe, and Canada, 20,000 people realized that vision as the James Webb Space Telescope. A generation of astronomers will celebrate their accomplishments for the life of the mission, potentially as long as 20 years, and beyond. This report and the scientific discoveries that follow are extended thank-you notes to the 20,000 team members. The telescope is working perfectly, with much better image quality than expected. In this and accompanying papers, we give a brief history, describe the observatory, outline its objectives and current observing program, and discuss the inventions and people who made it possible. We cite detailed reports on the design and the measured performance on orbit.
Comment: Accepted by PASP for the special issue on The James Webb Space Telescope Overview, 29 pages, 4 figures
On the randomness requirements for privacy
Most cryptographic primitives require randomness (for example, to generate secret keys). Usually, one assumes that perfect randomness is available, but, conceivably, such primitives might be built under weaker, more realistic assumptions. This is known to be achievable for many authentication applications, where entropy alone is typically sufficient. In contrast, all known techniques for achieving privacy seem to fundamentally require (nearly) perfect randomness. We ask whether this is just a coincidence, or whether privacy inherently requires true randomness. We completely resolve this question for information-theoretic private-key encryption, where parties wish to encrypt a b-bit value using a shared secret key sampled from some imperfect source of randomness S. Our main result shows that if such an n-bit source S allows for a secure encryption of b bits, where b > log n, then one can deterministically extract nearly b almost perfect random bits from S. Further, the restriction that b > log n is nearly tight: there exist sources S allowing one to perfectly encrypt (log n − log log n) bits, but not to deterministically extract even a single slightly unbiased bit. Hence, to a large extent, true randomness is inherent for encryption: either the key length must be exponential in the message length b, or one can deterministically extract nearly b almost unbiased random bits from the key. In particular, the one-time pad scheme is essentially "universal". Our technique also extends to related primitives which are sufficiently binding and hiding, including computationally secure commitments and public-key encryption.
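For context, the "essentially universal" one-time pad mentioned in the abstract is the textbook scheme sketched below: a b-bit message is XORed with b uniformly random key bits, which gives perfect secrecy. This is standard background, not code from the paper; the paper's contribution is showing that any imperfect key source good enough for such encryption must already be close to this uniform ideal, in the sense that nearly b almost-unbiased bits can be deterministically extracted from it.

```python
import secrets

def otp_encrypt(key: bytes, message: bytes) -> bytes:
    """One-time pad: XOR the message with a uniformly random key of equal length."""
    assert len(key) == len(message), "the pad must be as long as the message"
    return bytes(k ^ m for k, m in zip(key, message))

def otp_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return otp_encrypt(key, ciphertext)

message = b"attack at dawn"
key = secrets.token_bytes(len(message))   # perfect randomness: uniform, independent bits
ciphertext = otp_encrypt(key, message)
assert otp_decrypt(key, ciphertext) == message
print(ciphertext.hex())
```

Perfect secrecy here relies on the key bits being uniform and used only once; the paper asks how far that uniformity assumption can be relaxed.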
Does privacy require true randomness?
Most cryptographic primitives require randomness (for example, to generate their secret keys). Usually, one assumes that perfect randomness is available, but, conceivably, such primitives might be built under weaker, more realistic assumptions. This is known to be true for many authentication applications, where entropy alone is typically sufficient. In contrast, all known techniques for achieving privacy seem to fundamentally require (nearly) perfect randomness. We ask whether this is just a coincidence, or whether privacy inherently requires true randomness. We completely resolve this question for the case of (information-theoretic) private-key encryption, where parties wish to encrypt a b-bit value using a shared secret key sampled from some imperfect source of randomness S. Our main result shows that if such an n-bit source S allows for a secure encryption of b bits, where b > log n, then one can deterministically extract nearly b almost perfect random bits from S. Further, the restriction that b > log n is nearly tight: there exist sources S allowing one to perfectly encrypt (log n − log log n) bits, but not to deterministically extract even a single slightly unbiased bit. Hence, to a large extent, true randomness is inherent for encryption: either the key length must be exponential in the message length b, or one can deterministically extract nearly b almost unbiased random bits from the key. In particular, the one-time pad scheme is essentially "universal". Our technique also extends to related computational primitives which are perfectly binding, such as perfectly binding commitment and computationally secure private- or public-key encryption, showing the necessity to efficiently extract almost b pseudorandom bits.
Corn Yield Loss Estimates Due to Diseases in the United States and Ontario, Canada from 2012 to 2015
Annual decreases in corn yield caused by diseases were estimated by surveying members of the Corn Disease Working Group in 22 corn-producing states in the United States and in Ontario, Canada, from 2012 through 2015. Estimated loss from each disease varied greatly by state and year. In general, foliar diseases such as northern corn leaf blight, gray leaf spot, and Goss's wilt commonly caused the largest estimated yield loss in the northern United States and Ontario during non-drought years. Fusarium stalk rot and plant-parasitic nematodes caused the most estimated loss in the southernmost United States. The estimated mean economic loss due to yield loss by corn diseases in the United States and Ontario from 2012 to 2015 was $76.51 USD per acre. The cost of disease-mitigating strategies is another potential source of profit loss. Results from this survey will provide scientists, breeders, government, and educators with data to help inform and prioritize research, policy, and educational efforts in corn pathology and disease management.