47 research outputs found

    2022 LDC U.S. Latino GDP Report: Quantifying the new mainstream economy

    Get PDF
    The 2022 LDC U.S. Latino GDP Report seeks to provide a factual view of the large and rapidly growing economic contribution of Latinos living in the United States. We estimate the U.S. Latino GDP based on a detailed, bottom-up construction which leverages publicly available data from major U.S. agencies. The most recent year for which the core data is available is 2020. Thus, this year's report provides a snapshot of the total economic contribution of U.S. Latinos in that year.

    High-resolution ab initio three-dimensional X-ray diffraction microscopy

    Full text link
    Coherent X-ray diffraction microscopy is a method of imaging non-periodic isolated objects at resolutions only limited, in principle, by the largest scattering angles recorded. We demonstrate X-ray diffraction imaging with high resolution in all three dimensions, as determined by a quantitative analysis of the reconstructed volume images. These images are retrieved from the 3D diffraction data using no a priori knowledge about the shape or composition of the object, which has never before been demonstrated on a non-periodic object. We also construct 2D images of thick objects with infinite depth of focus (without loss of transverse spatial resolution). These methods can be used to image biological and materials science samples at high resolution using X-ray undulator radiation, and they establish the techniques to be used in atomic-resolution ultrafast imaging at X-ray free-electron laser sources. Comment: 22 pages, 11 figures.
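    The paper's reconstruction is ab initio and three-dimensional; purely as an illustration of the general class of iterative phase-retrieval algorithms used in coherent diffraction imaging, the sketch below runs a basic 2D error-reduction loop (alternating a Fourier-magnitude constraint with support and positivity constraints) on a synthetic object. The object, support mask, and all parameters are invented for the example, and the authors' actual algorithm works without this kind of prior shape knowledge.

```python
import numpy as np

def error_reduction(diff_amplitude, support, n_iter=200, seed=0):
    """Toy 2D error-reduction phase retrieval.

    diff_amplitude : measured Fourier-magnitude array (sqrt of diffraction intensity)
    support        : boolean array, True where the object is allowed to be nonzero
    """
    rng = np.random.default_rng(seed)
    # Start from random phases attached to the measured magnitudes.
    obj = np.fft.ifft2(diff_amplitude * np.exp(2j * np.pi * rng.random(diff_amplitude.shape)))
    for _ in range(n_iter):
        # Fourier-space constraint: keep the phase, replace magnitude with the data.
        F = np.fft.fft2(obj)
        F = diff_amplitude * np.exp(1j * np.angle(F))
        obj = np.fft.ifft2(F)
        # Real-space constraints: finite support and non-negativity.
        obj = np.where(support & (obj.real > 0), obj.real, 0.0)
    return obj

# Toy usage: recover a small rectangular object from its noise-free diffraction magnitudes.
true_obj = np.zeros((64, 64))
true_obj[28:36, 24:40] = 1.0
amplitude = np.abs(np.fft.fft2(true_obj))
support = np.zeros_like(true_obj, dtype=bool)
support[20:44, 16:48] = True            # loose support region around the object
recon = error_reduction(amplitude, support)
```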

    Nonnegative Matrix Factorization for Efficient Hyperspectral Image Projection

    Get PDF
    Hyperspectral imaging for remote sensing has prompted development of hyperspectral image projectors that can be used to characterize hyperspectral imaging cameras and techniques in the lab. One such emerging astronomical hyperspectral imaging technique is wide-field double-Fourier interferometry. NASA's current, state-of-the-art, Wide-field Imaging Interferometry Testbed (WIIT) uses a Calibrated Hyperspectral Image Projector (CHIP) to generate test scenes and provide a more complete understanding of wide-field double-Fourier interferometry. Given enough time, the CHIP is capable of projecting scenes with astronomically realistic spatial and spectral complexity. However, this would require a very lengthy data collection process. For accurate but time-efficient projection of complicated hyperspectral images with the CHIP, the field must be decomposed both spectrally and spatially in a way that provides a favorable trade-off between accurately projecting the hyperspectral image and the time required for data collection. We apply nonnegative matrix factorization (NMF) to decompose hyperspectral astronomical datacubes into eigenspectra and eigenimages that allow time-efficient projection with the CHIP. Included is a brief analysis of NMF parameters that affect accuracy, including the number of eigenspectra and eigenimages used to approximate the hyperspectral image to be projected. For the chosen field, the normalized mean squared synthesis error is under 0.01 with just 8 eigenspectra. NMF of hyperspectral astronomical fields better utilizes the CHIP's capabilities, providing time-efficient and accurate representations of astronomical scenes to be imaged with the WIIT.
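    As a minimal sketch of this kind of decomposition, the code below factors a synthetic nonnegative datacube into eigenspectra and eigenimages using scikit-learn's NMF. The cube shape, the choice of 8 components (echoing the figure quoted in the abstract), and the error metric are assumptions for illustration, not the paper's actual pipeline or data.

```python
import numpy as np
from sklearn.decomposition import NMF

# Hypothetical datacube: (ny, nx, n_wavelengths); values must be nonnegative for NMF.
ny, nx, nwav = 64, 64, 100
cube = np.random.rand(ny, nx, nwav)

# Flatten the spatial dimensions so rows are pixels and columns are wavelengths.
X = cube.reshape(ny * nx, nwav)

# Factor X ~ W @ H: H holds the eigenspectra, W the corresponding eigenimages.
k = 8  # number of eigenspectra/eigenimages retained
model = NMF(n_components=k, init="nndsvda", max_iter=500)
W = model.fit_transform(X)          # (ny*nx, k) -> k eigenimages (flattened)
H = model.components_               # (k, nwav)  -> k eigenspectra
eigenimages = W.reshape(ny, nx, k)

# Normalized mean squared synthesis error of the rank-k approximation.
err = np.sum((X - W @ H) ** 2) / np.sum(X ** 2)
print(f"normalized synthesis error with {k} components: {err:.4f}")
```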

    Wavefront-Error Performance Characterization for the James Webb Space Telescope (JWST) Integrated Science Instrument Module (ISIM) Science Instruments

    Get PDF
    The science instruments (SIs) comprising the James Webb Space Telescope (JWST) Integrated Science Instrument Module (ISIM) were tested in three cryogenic-vacuum test campaigns in the NASA Goddard Space Flight Center (GSFC)'s Space Environment Simulator (SES) test chamber. In this paper, we describe the results of optical wavefront-error performance characterization of the SIs. The wavefront error is determined using image-based wavefront sensing, and the primary data used by this process are focus sweeps, a series of images recorded by the instrument under test in its as-used configuration, in which the focal plane is systematically changed from one image to the next. High-precision determination of the wavefront error also requires several sources of secondary data, including 1) spectrum, apodization, and wavefront-error characterization of the optical ground-support equipment (OGSE) illumination module, called the OTE Simulator (OSIM), 2) F-number and pupil-distortion measurements made using a pseudo-nonredundant mask (PNRM), and 3) pupil geometry predictions as a function of SI and field point, which are complicated because of a tricontagon-shaped outer perimeter and small holes that appear in the exit pupil due to the way that different light sources are injected into the optical path by the OGSE. One set of wavefront-error tests, for the coronagraphic channel of the Near-Infrared Camera (NIRCam) Longwave instruments, was performed using data from transverse translation diversity sweeps instead of focus sweeps, in which a sub-aperture is translated and/or rotated across the exit pupil of the system. Several optical-performance requirements that were verified during this ISIM-level testing are levied on the uncertainties of various wavefront-error-related quantities rather than on the wavefront errors themselves. This paper also describes the methodology, based on Monte Carlo simulations of the wavefront-sensing analysis of focus-sweep data, used to establish the uncertainties of the wavefront-error maps.
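    A toy version of the Monte Carlo uncertainty methodology mentioned at the end of the abstract might look like the following: repeatedly "retrieve" a wavefront from noisy measurements and take the per-pixel scatter of the retrieved maps as the uncertainty. Here the retrieval step is stood in for by a least-squares fit of a small polynomial basis over a circular pupil; the real analysis runs image-based wavefront sensing on focus-sweep images, and every basis term, coefficient, and noise level below is an assumed placeholder.

```python
import numpy as np

rng = np.random.default_rng(1)

# Circular pupil on an n x n grid.
n = 64
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
pupil = (x**2 + y**2) <= 1.0

# Low-order polynomial basis over the pupil (a stand-in for Zernike modes).
basis = np.stack([np.ones_like(x), x, y, x * y, x**2 - y**2, x**2 + y**2], axis=-1)
A = basis[pupil]                                            # (n_pix, n_modes)

true_coeffs = np.array([0.0, 30.0, -20.0, 10.0, 15.0, -25.0])  # nm, assumed values
true_wfe = A @ true_coeffs

n_trials, noise_rms = 500, 5.0                              # nm of measurement noise
maps = np.empty((n_trials, A.shape[0]))
for t in range(n_trials):
    meas = true_wfe + rng.normal(0.0, noise_rms, size=true_wfe.shape)
    coeffs, *_ = np.linalg.lstsq(A, meas, rcond=None)       # toy "retrieval"
    maps[t] = A @ coeffs

# Per-pixel standard deviation of the retrieved wavefront-error maps.
sigma_map = np.zeros((n, n))
sigma_map[pupil] = maps.std(axis=0)
```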

    High angular resolution imaging with stellar intensity interferometry using air Cherenkov telescope arrays

    Full text link
    Optical stellar intensity interferometry with air Cherenkov telescope arrays, composed of nearly 100 telescopes, will provide means to measure fundamental stellar parameters and also open the possibility of model-independent imaging. In addition to sensitivity issues, a main limitation of image recovery in intensity interferometry is the loss of phase of the complex degree of coherence during the measurement process. Nevertheless, several model-independent phase reconstruction techniques have been developed. Here we implement a Cauchy-Riemann based algorithm to recover images from simulated data. For bright stars (m_v~6) and exposure times of a few hours, we find that scale features such as diameters, oblateness, and overall shapes are reconstructed with uncertainties of a few percent. More complex images are also well reconstructed, with high degrees of correlation with the pristine image. Results are further improved by using a forward algorithm. Comment: Accepted for publication in MNRAS; 13 pages, 22 figures.
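    The basic observable behind this technique is the squared degree of coherence |gamma|^2, estimated from the normalized cross-covariance of the intensity fluctuations recorded by a pair of telescopes. The toy simulation below bakes that relation into synthetic intensity streams and checks that the estimator recovers the assumed value; it does not attempt the Cauchy-Riemann phase reconstruction itself, and all numbers are illustrative rather than drawn from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

n_samples = 2_000_000
gamma_sq_true = 0.3                      # assumed squared coherence for this baseline

# Fluctuation shared by both telescopes, scaled so the normalized
# cross-covariance of the fractional fluctuations equals |gamma|^2.
common = rng.normal(size=n_samples)
a = np.sqrt(gamma_sq_true)
i1 = 1.0 + a * common + rng.normal(scale=0.5, size=n_samples)
i2 = 1.0 + a * common + rng.normal(scale=0.5, size=n_samples)

# Normalized cross-covariance of the intensity fluctuations.
d1, d2 = i1 - i1.mean(), i2 - i2.mean()
gamma_sq_est = np.mean(d1 * d2) / (i1.mean() * i2.mean())
print(f"target {gamma_sq_true:.3f}, estimated {gamma_sq_est:.3f}")
```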

    Wavefront-Error Performance Characterization for the James Webb Space Telescope (JWST) Integrated Science Instrument Module (ISIM) Science Instruments

    Get PDF
    The science instruments (SIs) comprising the James Webb Space Telescope (JWST) Integrated Science Instrument Module (ISIM) were tested in three cryogenic-vacuum test campaigns in the NASA Goddard Space Flight Center (GSFC)'s Space Environment Simulator (SES). In this paper, we describe the results of optical wavefront-error performance characterization of the SIs. The wavefront error is determined using image-based wavefront sensing (also known as phase retrieval), and the primary data used by this process are focus sweeps, a series of images recorded by the instrument under test in its as-used configuration, in which the focal plane is systematically changed from one image to the next. High-precision determination of the wavefront error also requires several sources of secondary data, including 1) spectrum, apodization, and wavefront-error characterization of the optical ground-support equipment (OGSE) illumination module, called the OTE Simulator (OSIM), 2) plate scale measurements made using a Pseudo-Nonredundant Mask (PNRM), and 3) pupil geometry predictions as a function of SI and field point, which are complicated because of a tricontagon-shaped outer perimeter and small holes that appear in the exit pupil due to the way that different light sources are injected into the optical path by the OGSE. One set of wavefront-error tests, for the coronagraphic channel of the Near-Infrared Camera (NIRCam) Longwave instruments, was performed using data from transverse translation diversity sweeps instead of focus sweeps, in which a sub-aperture is translated and/or rotated across the exit pupil of the system. Several optical-performance requirements that were verified during this ISIM-level testing are levied on the uncertainties of various wavefront-error-related quantities rather than on the wavefront errors themselves. This paper also describes the methodology, based on Monte Carlo simulations of the wavefront-sensing analysis of focus-sweep data, used to establish the uncertainties of the wavefront-error maps.

    GROCS Collection for Noteworks 2007-2008

    Full text link
    Collection of artifacts from the Noteworks GROCS project in 2008. We propose to design and implement a computer application that enables users to create sound experiences and musical compositions in a completely new way. In particular, our software will enable users to design dynamic temporal networks in which the nodes correspond to sound clips, and directed edges represent time and other relationships between nodes. Furthermore, we will embed functionality in the application so that different instances of our software can interact with other musicians’ networks, creating a truly interactive, collaborative music experience. We will also release our software to any interested parties so they can extend it as they see fit (and set up their own musical networks at home). GROCS: GRant Opportunities [collaborative spaces], a Digital Media Commons program to fund student research on the use of rich media in collaborative learning.
    http://deepblue.lib.umich.edu/bitstream/2027.42/62445/11/grocs_proposal_noteworks.pdf
    http://deepblue.lib.umich.edu/bitstream/2027.42/62445/10/setup.AVI
    http://deepblue.lib.umich.edu/bitstream/2027.42/62445/9/NW Design Review 2.1.08.mp3
    http://deepblue.lib.umich.edu/bitstream/2027.42/62445/8/noteworks_screenshot.png
    http://deepblue.lib.umich.edu/bitstream/2027.42/62445/7/noteworks_screencast.avi
    http://deepblue.lib.umich.edu/bitstream/2027.42/62445/6/noteworks_melancholy.mp4
    http://deepblue.lib.umich.edu/bitstream/2027.42/62445/5/noteworks_logo.png
    http://deepblue.lib.umich.edu/bitstream/2027.42/62445/4/noteworks.zip
    http://deepblue.lib.umich.edu/bitstream/2027.42/62445/3/Noteworks Demo April 5.mov
    http://deepblue.lib.umich.edu/bitstream/2027.42/62445/2/michigan.AVI
    http://deepblue.lib.umich.edu/bitstream/2027.42/62445/1/bedtime.av
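    As a loose illustration of the temporal-network idea in the proposal (sound-clip nodes connected by directed, time-offset edges), here is a small sketch; all class names, fields, and file names are invented for the example and do not come from the Noteworks code.

```python
from dataclasses import dataclass, field

@dataclass
class ClipNode:
    name: str
    clip_path: str                          # audio file played when this node triggers
    edges: list["Edge"] = field(default_factory=list)

@dataclass
class Edge:
    target: ClipNode
    delay_s: float                          # seconds between triggering source and target

def schedule(start: ClipNode, t0: float = 0.0, events=None):
    """Walk an acyclic network and return (time, clip_path) playback events."""
    if events is None:
        events = []
    events.append((t0, start.clip_path))
    for edge in start.edges:
        schedule(edge.target, t0 + edge.delay_s, events)
    return events

# Tiny example network: an intro clip that triggers two layered loops.
intro = ClipNode("intro", "intro.wav")
loop_a = ClipNode("loop_a", "loop_a.wav")
loop_b = ClipNode("loop_b", "loop_b.wav")
intro.edges += [Edge(loop_a, 2.0), Edge(loop_b, 4.0)]
print(schedule(intro))
```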

    GROCS Collection for the GroupLoops Team 2009

    Full text link
    Collection of artifacts from the GroupLoops team project in the GROCS 2008-2009 cycle. Learning how to play an instrument and compose music takes a long time and a lot of effort. Electronic devices lower the barrier to music creation by removing procedural technique from this learning curve. However, music composition software shares a different barrier with instruments: It’s hard to learn how to make a musical work that sounds good. Our research will explore how the iPhone and iPod Touch could teach music theory while participants collaborate on a group track. The experience might be similar to a jam session, but with digital instruments that fit in your pocket and don’t require years of practice to play well. We’ll use interaction design methods like contextual inquiry, prototyping, and user testing to research and develop an interface that’s easy to use, instructional via constraints and recommendations, and fun. Our goal is to develop a proof-of-concept for co-located, synchronous, and collaborative music composition software for the iPhone and iPod Touch that would educate and inspire creativity in music theory novices. GROCS: GRant Opportunities [collaborative spaces], a Digital Media Commons program to fund student research on the use of rich media in collaborative learning.
    http://deepblue.lib.umich.edu/bitstream/2027.42/62184/8/grouploops.pdf
    http://deepblue.lib.umich.edu/bitstream/2027.42/62184/7/Photos.zip
    http://deepblue.lib.umich.edu/bitstream/2027.42/62184/6/Grouploops.mov
    http://deepblue.lib.umich.edu/bitstream/2027.42/62184/5/Grouploops-Sites.zip
    http://deepblue.lib.umich.edu/bitstream/2027.42/62184/4/grouploops wireframe 1.pdf
    http://deepblue.lib.umich.edu/bitstream/2027.42/62184/3/GroupLoops Demo.m4v
    http://deepblue.lib.umich.edu/bitstream/2027.42/62184/2/GroupLoops - Drum Stick - Prototype - Wireframes.pdf
    http://deepblue.lib.umich.edu/bitstream/2027.42/62184/1/GroupLoops - Design Review 4:2.pd