127 research outputs found

    Evaluation of LANDSAT-4 TM and MSS ground geometry performance without ground control

    Techniques and software developed to characterize the Washington, D.C. scene were improved and are being systematically applied to an Imperial Valley, CA scene. Digital elevation files are being acquired. One hundred seventy-two tiepoints were located in the Imperial Valley scene and digitized from USGS maps to determine their lat-long coordinates. A least squares fit is currently being performed between line-sample image data and the lat-long positions of the tiepoints. Thematic mapper scanner sweeps were determined for the Imperial Valley P-data. VICAR jobs are currently under way to analyze sample-direction offsets between sweeps in the data, as well as band-to-band registration offsets. Tiepoint location is about to begin in the Harrisburg, PA scene.
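The least-squares fit described above can be sketched as follows. This is an illustrative stand-in, not the VICAR software used in the study: the tiepoint coordinates, the affine model, and all numeric values here are synthetic assumptions, chosen only to show how line-sample image coordinates can be fit to lat-long positions.

```python
import numpy as np

# Hypothetical sketch: fit an affine transform from image (line, sample)
# coordinates to map (lat, lon) coordinates using tiepoints, posed as a
# least-squares problem. All data below are synthetic.

rng = np.random.default_rng(0)
line_sample = rng.uniform(0, 6000, size=(172, 2))  # 172 tiepoints

# Synthetic "true" affine transform (illustrative values only)
A_true = np.array([[1e-4, 2e-6], [3e-6, -1.2e-4]])
b_true = np.array([32.8, -115.5])  # rough Imperial Valley lat/lon origin
latlon = line_sample @ A_true.T + b_true

# Least-squares fit: augment with a ones column for the offset term
X = np.hstack([line_sample, np.ones((172, 1))])
coef, *_ = np.linalg.lstsq(X, latlon, rcond=None)

residual = np.abs(X @ coef - latlon).max()
print(f"max residual (degrees): {residual:.2e}")
```

With noise-free synthetic tiepoints the fit recovers the transform essentially exactly; with real digitized map coordinates the residuals would instead quantify the geometric accuracy of the scene.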

    Fast Locality-Sensitive Hashing Frameworks for Approximate Near Neighbor Search

    The Indyk-Motwani Locality-Sensitive Hashing (LSH) framework (STOC 1998) is a general technique for constructing a data structure to answer approximate near neighbor queries by using a distribution $\mathcal{H}$ over locality-sensitive hash functions that partition space. For a collection of $n$ points, after preprocessing, the query time is dominated by $O(n^{\rho} \log n)$ evaluations of hash functions from $\mathcal{H}$ and $O(n^{\rho})$ hash table lookups and distance computations, where $\rho \in (0,1)$ is determined by the locality-sensitivity properties of $\mathcal{H}$. It follows from a recent result by Dahlgaard et al. (FOCS 2017) that the number of locality-sensitive hash functions can be reduced to $O(\log^2 n)$, leaving the query time to be dominated by $O(n^{\rho})$ distance computations and $O(n^{\rho} \log n)$ additional word-RAM operations. We state this result as a general framework and provide a simpler analysis showing that the number of lookups and distance computations closely matches the Indyk-Motwani framework, making it a viable replacement in practice. Using ideas from another locality-sensitive hashing framework by Andoni and Indyk (SODA 2006) we are able to reduce the number of additional word-RAM operations to $O(n^{\rho})$. Comment: 15 pages, 3 figures
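A minimal sketch of the Indyk-Motwani framework for Hamming space may make the abstract concrete. Bit sampling is used as the locality-sensitive family; the parameters (K bits per concatenated hash, L tables) are illustrative assumptions, not the tuned values behind the $n^{\rho}$ bounds discussed above.

```python
import random
from collections import defaultdict

# Sketch of the Indyk-Motwani LSH framework for Hamming space, using
# bit sampling as the locality-sensitive hash family. K and L are
# illustrative, not tuned to the n^rho query-time bounds.

random.seed(42)
DIM, K, L = 64, 8, 10

def random_point():
    return tuple(random.randint(0, 1) for _ in range(DIM))

def make_hash():
    # One hash g = concatenation of K sampled bit positions
    idx = random.sample(range(DIM), K)
    return lambda p: tuple(p[i] for i in idx)

points = [random_point() for _ in range(500)]
hashes = [make_hash() for _ in range(L)]
tables = [defaultdict(list) for _ in range(L)]
for p in points:
    for g, t in zip(hashes, tables):
        t[g(p)].append(p)

def query(q, radius):
    # Gather candidates from the L buckets, then filter by distance;
    # these lookups and distance computations dominate the query time.
    candidates = set()
    for g, t in zip(hashes, tables):
        candidates.update(t[g(q)])
    dist = lambda a, b: sum(x != y for x, y in zip(a, b))
    return [p for p in candidates if dist(q, p) <= radius]

print(len(query(points[0], 4)))  # near neighbors of the first point
```

The frameworks compared in the paper differ mainly in how many distinct hash functions must be evaluated per query; this sketch naively evaluates all L of them.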

    SCExAO/MEC and CHARIS Discovery of a Low Mass, 6 AU-Separation Companion to HIP 109427 using Stochastic Speckle Discrimination and High-Contrast Spectroscopy

    We report the direct imaging discovery of a low-mass companion to the nearby accelerating A star HIP 109427 with the Subaru Coronagraphic Extreme Adaptive Optics (SCExAO) instrument coupled with the MKID Exoplanet Camera (MEC) and CHARIS integral field spectrograph. CHARIS data reduced with reference star PSF subtraction yield 1.1-2.4 $\mu$m spectra. MEC reveals the companion in the $Y$ and $J$ bands at a comparable signal-to-noise ratio using stochastic speckle discrimination, with no PSF subtraction techniques. Combined with complementary follow-up $L_{\rm p}$ photometry from Keck/NIRC2, the SCExAO data favor a spectral type, effective temperature, and luminosity of M4-M5.5, 3000-3200 K, and $\log_{10}(L/L_{\odot}) = -2.28^{+0.04}_{-0.04}$, respectively. Relative astrometry of HIP 109427 B from SCExAO/CHARIS and Keck/NIRC2, and complementary Gaia-Hipparcos absolute astrometry of the primary, favor a semimajor axis of $6.55^{+3.0}_{-0.48}$ au, an eccentricity of $0.54^{+0.28}_{-0.15}$, an inclination of $66.7^{+8.5}_{-14}$ degrees, and a dynamical mass of $0.280^{+0.18}_{-0.059}\,M_{\odot}$. This work shows the potential for extreme AO systems to utilize speckle statistics in addition to widely-used post-processing methods to directly image faint companions to nearby stars near the telescope diffraction limit. Comment: 13 pages, 7 figures, 3 tables

    Market analysis for cultured proteins in low- and lower-middle income countries.

    The global burden of malnutrition is unacceptably high [10]. Worldwide, an estimated 22% of children under the age of five were stunted and 8% were wasted in 2018 [11]. Low-quality diets lacking in essential vitamins, minerals, proteins, and other nutrients are a key contributor to this burden [12]. Animal-source foods, such as meat, poultry, fish, eggs, and dairy, are important components of a diverse diet and provide high-quality proteins and other essential nutrients that promote optimal growth and development [13-17]. As populations and incomes grow, the global demand for animal-source foods is projected to increase substantially, particularly in many low- and lower-middle income countries (LMICs) [18,19]. However, cost is currently a significant barrier to animal-source food consumption. In addition, meeting this growing demand for animal-source foods will require rapid increases in livestock production, which has significant environmental impacts, requiring considerable land, water, chemical, and energy inputs [10,17,18]. Global food production is responsible for roughly one-quarter of all greenhouse gas emissions, most of which (up to 80%) are related to livestock [20,21]. Livestock production also contributes to water pollution, deforestation, land degradation, overfishing, and antimicrobial resistance [20,22,23]. Given these challenges, this report aims to assess the market for potentially more sustainable alternative proteins and their potential for use in LMIC settings. The report focuses on proteins derived from fermentation-based cellular agriculture, called cultured proteins, given their potential near-term time to market and their potential impact in LMIC populations. Most cultured protein manufacturers are developing proteins that are present in animal-source milk and eggs.

    HIVToolbox, an Integrated Web Application for Investigating HIV

    Many bioinformatic databases and applications focus on a limited domain of knowledge, federating links to information in other databases. This segregated data structure likely limits our ability to investigate and understand complex biological systems. To facilitate research, therefore, we have built HIVToolbox, which integrates much of the knowledge about HIV proteins and allows virologists and structural biologists to access sequence, structure, and functional relationships in an intuitive web application. The HIV-1 integrase protein was used as a case study to show the utility of this application. We show how data integration facilitates identification of new questions and hypotheses much more rapidly and conveniently than current approaches using isolated repositories. Several new hypotheses for integrase were created as an example, and we experimentally confirmed a predicted CK2 phosphorylation site. Weblink: http://hivtoolbox.bio-toolkit.com

    Phase transition in Random Circuit Sampling

    Quantum computers hold the promise of executing tasks beyond the capability of classical computers. Noise competes with coherent evolution and destroys long-range correlations, making it an outstanding challenge to fully leverage the computational power of near-term quantum processors. We report Random Circuit Sampling (RCS) experiments where we identify distinct phases driven by the interplay between quantum dynamics and noise. Using cross-entropy benchmarking, we observe phase boundaries which can define the computational complexity of noisy quantum evolution. We conclude by presenting an RCS experiment with 70 qubits at 24 cycles. We estimate the computational cost against improved classical methods and demonstrate that our experiment is beyond the capabilities of existing classical supercomputers.
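The cross-entropy benchmarking mentioned above can be illustrated with the linear XEB fidelity estimator, $F = D \langle p_{\rm ideal}(x) \rangle - 1$, averaged over measured bitstrings $x$ for a Hilbert-space dimension $D$. The sketch below stands in for a real quantum circuit with a synthetic Porter-Thomas output distribution; qubit counts and shot numbers are illustrative assumptions, not the experiment's values.

```python
import numpy as np

# Sketch of linear cross-entropy benchmarking (XEB). A synthetic
# Porter-Thomas distribution stands in for a random circuit's ideal
# output probabilities; all values are illustrative.

rng = np.random.default_rng(1)
n_qubits = 12
dim = 2 ** n_qubits

# Porter-Thomas-like ideal probabilities for one "circuit"
p_ideal = rng.exponential(size=dim)
p_ideal /= p_ideal.sum()

def linear_xeb(samples):
    # F = D * <p_ideal(x)> - 1 over measured bitstrings x
    return dim * p_ideal[samples].mean() - 1

shots = 200_000
ideal_samples = rng.choice(dim, size=shots, p=p_ideal)  # noiseless sampler
noisy_samples = rng.integers(0, dim, size=shots)        # fully depolarized

print(f"ideal sampler   F ~ {linear_xeb(ideal_samples):.3f}")
print(f"uniform sampler F ~ {linear_xeb(noisy_samples):.3f}")
```

A noiseless sampler yields $F \approx 1$ and a fully depolarized one $F \approx 0$; in the experiments, the decay of $F$ with circuit depth and noise is what traces out the phase boundaries.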

    Overcoming leakage in scalable quantum error correction

    Leakage of quantum information out of computational states into higher energy states represents a major challenge in the pursuit of quantum error correction (QEC). In a QEC circuit, leakage builds over time and spreads through multi-qubit interactions. This leads to correlated errors that degrade the exponential suppression of logical error with scale, challenging the feasibility of QEC as a path towards fault-tolerant quantum computation. Here, we demonstrate the execution of a distance-3 surface code and distance-21 bit-flip code on a Sycamore quantum processor where leakage is removed from all qubits in each cycle. This shortens the lifetime of leakage and curtails its ability to spread and induce correlated errors. We report a ten-fold reduction in steady-state leakage population on the data qubits encoding the logical state and an average leakage population of less than $1 \times 10^{-3}$ throughout the entire device. The leakage removal process itself efficiently returns leakage population back to the computational basis, and adding it to a code circuit prevents leakage from inducing correlated error across cycles, restoring a fundamental assumption of QEC. With this demonstration that leakage can be contained, we resolve a key challenge for practical QEC at scale. Comment: Main text: 7 pages, 5 figures