67 research outputs found

    Precision Weak Gravitational Lensing Using Velocity Fields: Fisher Matrix Analysis

    Weak gravitational lensing measurements based on photometry are limited by shape noise, the variance in the unknown unlensed orientations of the source galaxies. If the source is a disk galaxy with a well-ordered velocity field, however, velocity field data can support simultaneous inference of the shear, inclination, and position angle, virtually eliminating shape noise. We use the Fisher Information Matrix formalism to forecast the precision of this method in the idealized case of a perfectly ordered velocity field defined on an infinitesimally thin disk. For nearly face-on targets one shear component, $\gamma_\times$, can be constrained to $0.003\,\frac{90}{I_0}\,\frac{25}{n_{\rm pix}}$, where $I_0$ is the S/N of the central intensity pixel and $n_{\rm pix}$ is the number of pixels across a diameter enclosing 80% of the light. This precision degrades with inclination angle, by a factor of three by $i=50^\circ$. Uncertainty on the other shear component, $\gamma_+$, is about 1.5 (7) times larger than the $\gamma_\times$ uncertainty for targets at $i=10^\circ$ ($50^\circ$). For arbitrary galaxy position angle on the sky, these forecasts apply not to $\gamma_+$ and $\gamma_\times$ as defined on the sky, but to two eigenvectors in $(\gamma_+, \gamma_\times, \mu)$ space, where $\mu$ is the magnification. We also forecast the potential of less expensive partial observations of the velocity field, such as slit spectroscopy. We conclude by outlining some ways in which real galaxies depart from our idealized model and thus create random or systematic uncertainties not captured here. In particular, our forecast $\gamma_\times$ precision is currently limited only by the data quality rather than scatter in galaxy properties, because the relevant type of scatter has yet to be measured.
    Comment: Accepted to ApJ, 17 pages, 14 figures. Diff from v1: added Sec 3.1 on degeneracies and Appendix with simulations confirming Fisher result
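    The quoted precision scaling can be evaluated directly. A minimal sketch (the function name and argument defaults are ours, not the paper's; only the scaling relation itself comes from the abstract):

```python
def sigma_gamma_cross(I0, n_pix):
    """Forecast 1-sigma precision on the shear component gamma_x
    for a nearly face-on disk, per the abstract's scaling:
    0.003 * (90 / I0) * (25 / n_pix).

    I0    : S/N of the central intensity pixel
    n_pix : pixels across a diameter enclosing 80% of the light
    """
    return 0.003 * (90.0 / I0) * (25.0 / n_pix)

# At the fiducial values the forecast is 0.003 per galaxy;
# halving the central S/N doubles the uncertainty:
print(sigma_gamma_cross(I0=90, n_pix=25))  # 0.003
print(sigma_gamma_cross(I0=45, n_pix=25))  # 0.006
```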

    Development of a Chemistry-Based, Predictive Method for Determining the Amount of Non-Pertechnetate Technetium in the Hanford Tanks: FY 2012 Progress Report

    This report describes investigations directed toward understanding the extent to which highly alkaline-soluble, non-pertechnetate technetium (n-Tc) is present in the Hanford tank supernatants. The goals of this report are to: a) present a review of the available literature relevant to the speciation of technetium in the Hanford tank supernatants, b) attempt to establish a chemically logical correlation between available Hanford tank measurements and the presence of supernatant-soluble n-Tc, c) use existing measurement data to estimate the amount of n-Tc in the Hanford tank supernatants, and d) report on any likely, process-friendly methods to eventually sequester soluble n-Tc from Hanford tank supernatants.

    LSST: from Science Drivers to Reference Design and Anticipated Data Products

    (Abridged) We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). A vast array of science will be enabled by a single wide-deep-fast sky survey, and LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. LSST will be a wide-field ground-based system sited at Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg$^2$ field of view, and a 3.2 Gigapixel camera. The standard observing sequence will consist of pairs of 15-second exposures in a given field, with two such visits in each pointing in a given night. With these repeats, the LSST system is capable of imaging about 10,000 square degrees of sky in a single filter in three nights. The typical $5\sigma$ point-source depth in a single visit in $r$ will be $\sim 24.5$ (AB). The project is in the construction phase and will begin regular survey operations by 2022. The survey area will be contained within 30,000 deg$^2$ with $\delta<+34.5^\circ$, and will be imaged multiple times in six bands, $ugrizy$, covering the wavelength range 320--1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will uniformly observe an 18,000 deg$^2$ region about 800 times (summed over all six bands) during the anticipated 10 years of operations, and yield a coadded map to $r\sim27.5$. The remaining 10% of the observing time will be allocated to projects such as a Very Deep and Fast time domain survey.
    The goal is to make LSST data products, including a relational database of about 32 trillion observations of 40 billion objects, available to the public and scientists around the world.
    Comment: 57 pages, 32 color figures, version with high-resolution figures available from https://www.lsst.org/overvie
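    The quoted coverage rate can be sanity-checked from the abstract's own numbers. A back-of-envelope sketch (the function name is ours; slew, readout, and filter-change overheads are deliberately ignored, so this is an upper bound on nothing more than the raw field count):

```python
def pointings_per_night(area_deg2=10_000, fov_deg2=9.6, nights=3):
    """Distinct pointings needed to tile `area_deg2` with a
    `fov_deg2` field of view, and the resulting pointings/night
    over `nights` nights. Defaults are the abstract's figures."""
    fields = area_deg2 / fov_deg2
    return fields, fields / nights

fields, per_night = pointings_per_night()
print(round(fields), round(per_night))  # ~1042 fields, ~347 per night
```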

    Making Democratic-Governance Work: The Consequences for Prosperity


    Using GIS to Identify Possible Improvements for Eastern Kentucky Roadways

    GIS Analysis of Road Connectivity of Eastern Kentucky Cities
    Matthew Wittman, Robin Zhang
    Department of Geosciences, Murray State University
    Keywords: GIS, Eastern Kentucky, Roadway Efficiency
    Eastern Kentucky is one of the poorest regions in the United States. This stems, in part, from the topography of the region; sinuous Appalachian ridges trending northeast-southwest serve as a barrier between east and west, and result in overly complicated road systems that are unattractive as trade routes. This analysis examines this issue by applying GIS techniques to spatial and temporal characteristics of the region's roadways. For each county, the most efficient possible route from the largest settlement in that county to Lexington, KY will be calculated. Road efficiency (defined as the Euclidean distance between the endpoints of a roadway divided by the driving distance of that roadway) will then be calculated for each route to determine which cities in the region are most isolated from Lexington.
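    The road-efficiency metric defined in the abstract is a simple ratio. A minimal sketch (function name and example coordinates are ours; in practice the endpoints would come from a projected GIS layer and the driving distance from a network analysis):

```python
import math

def road_efficiency(start, end, driving_distance_km):
    """Road efficiency as defined in the abstract: straight-line
    (Euclidean) distance between the route endpoints divided by
    the actual driving distance. Values near 1 indicate a direct
    road; low values indicate a sinuous, inefficient route.

    start, end : (x, y) coordinates in km (projected, e.g. UTM)
    """
    euclidean = math.dist(start, end)
    return euclidean / driving_distance_km

# A 100 km straight-line separation covered by 160 km of road:
print(road_efficiency((0.0, 0.0), (60.0, 80.0), 160.0))  # 0.625
```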

    Mapping Of Impervious Surface Distribution Over Time In Southern Illinois Using Satellite Imagery

    Matthew Wittman; Mentor: Dr. Haluk Cetin
    Department of Geosciences, Murray State University
    Keywords: Impervious surfaces, supervised land classification, land-cover change, maximum likelihood, Landsat TM
    Impervious surfaces are surfaces that prevent water from infiltrating the underlying soils. These surfaces exert an array of effects on the surrounding area, such as slowing the geochemical cycles occurring in the soil, increasing overland water flow, and raising the temperature of the immediate area. This study examines the change in impervious surface area over time in a portion of southern Illinois. Landsat 4-5 Thematic Mapper scenes (Row 33, Path 23) were obtained for July of 1986 and July of 2007. These scenes were analyzed using a supervised land-cover classification technique. First, both images were cropped to include only areas not covered by clouds in either image. Next, pixels were assigned to one of four classes (water, vegetation, barren land/soils, and impervious surfaces) using a maximum likelihood approach. Lastly, these classified maps were analyzed using a post-classification comparison technique to map changes in impervious surface distribution between the two dates. The patterns and trends observed were used to make predictions about the effects impervious surfaces have had on the surrounding environment.
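    The maximum-likelihood step named in the abstract assigns each pixel to the class whose training statistics make its spectrum most probable. A minimal Gaussian maximum-likelihood sketch (function name, regularization term, and all data are ours, for illustration only; real workflows would use remote-sensing software on full Landsat scenes):

```python
import numpy as np

def max_likelihood_classify(pixels, training):
    """Gaussian maximum-likelihood classification.

    pixels   : (N, bands) array of pixel spectra to label
    training : dict mapping class name -> (M, bands) training samples
    Returns a list of one class label per pixel.
    """
    names, scores = list(training), []
    for name in names:
        samples = training[name]
        mu = samples.mean(axis=0)
        # Small ridge keeps the covariance invertible for tiny samples.
        cov = np.cov(samples, rowvar=False) + 1e-6 * np.eye(samples.shape[1])
        inv = np.linalg.inv(cov)
        _, logdet = np.linalg.slogdet(cov)
        d = pixels - mu
        # Log-likelihood up to a shared constant:
        # -0.5 * (log|Sigma| + d^T Sigma^{-1} d), per pixel.
        scores.append(-0.5 * (logdet + np.einsum("ij,jk,ik->i", d, inv, d)))
    return [names[i] for i in np.argmax(scores, axis=0)]
```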