
    Real Time 3-D Graphics Processing Hardware Design using Field-Programmable Gate Arrays.

    Three-dimensional graphics processing requires many complex algebraic and matrix-based operations to be performed in real time. In the early stages of graphics processing, such tasks were delegated to the Central Processing Unit (CPU). Over time, as more complex graphics rendering was demanded, CPU solutions became inadequate, and custom hardware solutions that take advantage of pipelining and massive parallelism became preferable to CPU software-based solutions. This has led to the many custom hardware solutions available today. Since real-time graphics processing requires extremely high performance, hardware solutions using Application Specific Integrated Circuits (ASICs) are the industry standard. While ASICs are a more than adequate solution for implementing high-performance custom hardware, the design, implementation, and testing of ASIC-based designs are becoming cost prohibitive due to the massive up-front verification effort required as well as the cost of fixing design defects. Field Programmable Gate Arrays (FPGAs) provide an alternative to the ASIC design flow. More importantly, in recent years FPGA technology has improved to the point where ASIC and FPGA performance is comparable. In addition, FPGAs address many of the issues of the ASIC design flow: the ability to reconfigure FPGAs reduces the up-front verification effort and allows design defects to be fixed easily. This thesis demonstrates that a 3-D graphics processor implementation on an FPGA is feasible by implementing both a two-dimensional and a three-dimensional graphics processor prototype. Using a Xilinx Virtex 5 ML506 FPGA development kit, a fully functional wireframe graphics rendering engine is implemented in VHDL with Xilinx's development tools. A VHDL testbench was designed to verify that the graphics engine is functionally correct. This is followed by synthesizing the design onto real hardware and developing test applications to verify the functionality and performance of the design. This thesis provides the groundwork for pushing forward the use of FPGA technology in graphics processing applications.
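    The per-vertex matrix arithmetic such a wireframe engine must parallelize can be pictured in a few lines. The sketch below is a minimal illustration in Python, not code from the thesis; the camera parameters and vertex values are made up. It applies a perspective projection and the perspective divide to a 3-D vertex, the core operation a hardware pipeline replicates across many vertices at once.

    ```python
    import numpy as np

    def perspective(fov_y_deg, aspect, near, far):
        """Standard OpenGL-style perspective projection matrix."""
        f = 1.0 / np.tan(np.radians(fov_y_deg) / 2.0)
        return np.array([
            [f / aspect, 0.0,  0.0,                          0.0],
            [0.0,        f,    0.0,                          0.0],
            [0.0,        0.0,  (far + near) / (near - far),  2 * far * near / (near - far)],
            [0.0,        0.0, -1.0,                          0.0],
        ])

    def project_vertex(mvp, v):
        """Transform a 3-D vertex to normalized device coordinates."""
        clip = mvp @ np.append(v, 1.0)   # homogeneous clip coordinates
        return clip[:3] / clip[3]        # perspective divide

    mvp = perspective(60.0, 4 / 3, 0.1, 100.0)   # illustrative camera parameters
    print(project_vertex(mvp, np.array([1.0, 1.0, -5.0])))
    ```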

    HZETRN Radiation Transport Validation Using Balloon-Based Experimental Data

    The deterministic radiation transport code HZETRN (High charge (Z) and Energy TRaNsport) was developed by NASA to study the effects of cosmic radiation on astronauts and instrumentation shielded by various materials. This work presents an analysis of computed differential flux from HZETRN compared with measurement data from three balloon-based experiments over a range of atmospheric depths, particle types, and energies. Model uncertainties were quantified using an interval-based validation metric that takes into account measurement uncertainty both in the flux and in the energy at which it was measured. Average uncertainty metrics were computed for the entire dataset as well as for subsets of the measurements (by experiment, particle type, energy, etc.) to reveal any specific trends of systematic over- or under-prediction by HZETRN. The distribution of individual model uncertainties was also investigated to study the range and dispersion of errors beyond single scalar and interval metrics. The differential fluxes from HZETRN were generally well correlated with balloon-based measurements; the median relative model difference across the entire dataset was determined to be 30%. The distribution of model uncertainties, however, revealed that the range of errors was relatively broad, with approximately 30% of the uncertainties exceeding 40%. The distribution also indicated that HZETRN systematically under-predicts the measurement dataset as a whole, with approximately 80% of the relative uncertainties having negative values. Instances of systematic bias for subsets of the data were also observed, including a significant underestimation of alpha particles and protons for energies below 2.5 GeV/u. Muons were found to be systematically over-predicted at atmospheric depths greater than 50 g/cm² but under-predicted at shallower depths. Furthermore, a systematic under-prediction of alpha particles and protons was observed below the geomagnetic cutoff, suggesting that improvements to the light-ion production cross sections in HZETRN should be investigated.
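    A minimal sketch of the kind of interval-based metric described here may help fix ideas. This is an illustration, not the validation code used in the study: the metric form shown accounts only for flux uncertainty (the paper's metric also handles energy uncertainty), and all variable names and values are assumptions. The signed relative difference is zero whenever the model prediction falls inside the measurement's uncertainty interval, and negative when the model under-predicts.

    ```python
    import numpy as np

    def interval_relative_difference(model, measured, meas_err):
        """Signed relative model-measurement difference, zeroed when the
        model falls inside the measurement uncertainty interval
        (illustrative form only)."""
        lo, hi = measured - meas_err, measured + meas_err
        # Distance from the model value to the nearest interval edge.
        dist = np.where(model < lo, lo - model,
                np.where(model > hi, model - hi, 0.0))
        # Negative values indicate model under-prediction.
        return np.sign(model - measured) * dist / measured

    model    = np.array([0.9, 2.1, 4.0])   # made-up differential fluxes
    measured = np.array([1.0, 2.0, 5.0])
    err      = np.array([0.15, 0.2, 0.3])
    print(interval_relative_difference(model, measured, err))  # [0. 0. -0.14]
    ```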

    Low Temperature Effects on the Mechanical, Fracture, and Dynamic Behavior of Carbon and E-glass Epoxy Laminates

    An experimental investigation of the effects of low temperatures on the mechanical, fracture, impact, and dynamic properties of carbon- and E-glass-epoxy composite materials has been conducted. The objective of the study is to quantify the influence of temperatures from 20 °C down to −2 °C on the in-plane (tensile/compressive) and shear material properties, static and dynamic Mode-I fracture characteristics, impact/residual strength, and the storage and loss moduli of the materials considered. The low end of the temperature range considered in the study is associated with Arctic seawater as well as conditions found at extreme ocean depths (2 °C–4 °C). In the investigation, both carbon/epoxy and E-glass/epoxy laminates are evaluated, as these materials are of keen interest to the marine and undersea vehicle community. The mechanical characterization of the laminates consists of controlled tension, compression, and short-beam shear testing. The Mode-I fracture performance is quantified under both quasi-static and highly dynamic loading rates, with additional flexure-after-impact strength characterization conducted through the use of a drop tower facility. Finally, dynamic mechanical analysis (DMA) testing has been completed on each material to measure the storage and loss moduli of the carbon-fiber- and E-glass-fiber-reinforced composites. The findings of the study show that nearly all characteristics of the mechanical performance of the laminates are both material and temperature dependent.
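    For context on the DMA quantities measured here: the storage modulus E′ and loss modulus E″ are the in-phase (elastic) and out-of-phase (viscous) components of the complex modulus, related through the phase lag δ between stress and strain. A small illustrative computation follows; the numerical values are invented, not results from the study.

    ```python
    import numpy as np

    # DMA decomposition of the complex modulus (illustrative values only).
    E_complex = 25.0e9   # |E*| in Pa, magnitude of the complex modulus
    delta_deg = 2.0      # phase lag between stress and strain, degrees

    delta = np.radians(delta_deg)
    E_storage = E_complex * np.cos(delta)   # E' : elastic (in-phase) part
    E_loss    = E_complex * np.sin(delta)   # E'': viscous (out-of-phase) part
    tan_delta = E_loss / E_storage          # damping factor, tan(delta)

    print(f"E' = {E_storage/1e9:.2f} GPa, E'' = {E_loss/1e9:.3f} GPa, "
          f"tan(delta) = {tan_delta:.4f}")
    ```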

    The Strayed Reveller, No. 7

    The seventh issue of The Strayed Reveller.

    The Coral Bleaching Automated Stress System (CBASS): A low‐cost, portable system for standardized empirical assessments of coral thermal limits

    Ocean warming is increasingly affecting marine ecosystems across the globe. Reef-building corals are particularly affected by warming, with mass bleaching events increasing in frequency and leading to widespread coral mortality. Yet, some corals can resist or recover from bleaching better than others. Such variability in thermal resilience could be critical to reef persistence; however, the scientific community lacks standardized diagnostic approaches to rapidly and comparatively assess coral thermal vulnerability prior to bleaching events. We present the Coral Bleaching Automated Stress System (CBASS) as a low-cost, open-source, field-portable experimental system for rapid empirical assessment of coral thermal thresholds using standardized temperature stress profiles and diagnostics. The CBASS consists of four or eight flow-through experimental aquaria with independent water masses, lighting, and individual automated temperature controls capable of delivering custom modulating thermal profiles. The CBASS is used to conduct daily thermal stress exposures that typically include 3-h temperature ramps to multiple target temperatures, a 3-h hold period at the target temperatures, and a 1-h ramp back down to ambient temperature, followed by an overnight recovery period. This mimics shallow water temperature profiles observed in coral reefs and prompts a rapid acute heat stress response that can serve as a diagnostic tool to identify putative thermotolerant corals for in-depth assessments of adaptation mechanisms, targeted conservation, and possible use in restoration efforts. The CBASS is deployable within hours and can assay up to 40 coral fragments/aquaria/day, enabling high-throughput, rapid determination of thermal thresholds for individual genotypes, populations, species, and sites using a standardized experimental framework.
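    The daily stress profile described above is simple to express programmatically. Below is an illustrative sketch, not the actual CBASS controller firmware; the ambient and target temperatures and the setpoint interval are assumptions chosen to match the ramp/hold/ramp-down/recovery schedule described in the abstract.

    ```python
    import numpy as np

    def cbass_profile(ambient_c, target_c, step_min=5):
        """Setpoint trace for one CBASS-style assay day: 3-h ramp up,
        3-h hold at target, 1-h ramp down, then overnight recovery at
        ambient (illustrative sketch only)."""
        ramp_up   = np.linspace(ambient_c, target_c, int(3 * 60 / step_min))
        hold      = np.full(int(3 * 60 / step_min), target_c)
        ramp_down = np.linspace(target_c, ambient_c, int(1 * 60 / step_min))
        recovery  = np.full(int(17 * 60 / step_min), ambient_c)  # rest of 24 h
        return np.concatenate([ramp_up, hold, ramp_down, recovery])

    profile = cbass_profile(ambient_c=27.0, target_c=33.0)  # made-up temperatures
    print(f"{profile.size} setpoints, max {profile.max():.1f} degC")
    ```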

    Early Results from the Advanced Radiation Protection Thick GCR Shielding Project

    The Advanced Radiation Protection Thick Galactic Cosmic Ray (GCR) Shielding Project leverages experimental and modeling approaches to validate a predicted minimum in the radiation exposure versus shielding depth curve. Preliminary results of space radiation models indicate that a minimum in the dose equivalent versus aluminum shielding thickness may exist in the 20-30 g/cm² region. For greater shield thicknesses, dose equivalent increases due to secondary neutron and light particle production. This result goes against the long-held belief in the space radiation shielding community that increasing shielding thickness will decrease risk to crew health. A comprehensive modeling effort was undertaken to verify the preliminary modeling results using multiple Monte Carlo and deterministic space radiation transport codes. These results verified the preliminary findings of a minimum and helped drive the design of the experimental component of the project. In first-of-their-kind experiments performed at the NASA Space Radiation Laboratory, neutrons and light ions were measured between large thicknesses of aluminum shielding. Both an upstream and a downstream shield were incorporated into the experiment to represent the radiation environment inside a spacecraft. These measurements are used to validate the Monte Carlo codes and derive uncertainty distributions for exposure estimates behind thick shielding similar to that provided by spacecraft on a Mars mission. Preliminary results for all aspects of the project will be presented.
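    The predicted minimum can be pictured with a toy calculation. The sketch below uses entirely invented dose-versus-depth numbers, not the project's model results; it simply fits a smooth curve through a hypothetical dose-equivalent table and locates the depth at which the curve bottoms out.

    ```python
    import numpy as np

    # Hypothetical dose equivalent (mSv/yr) vs. aluminum depth (g/cm^2);
    # values are invented to mimic the qualitative shape described above.
    depth = np.array([0, 5, 10, 20, 30, 40, 60, 80, 100])
    dose  = np.array([600, 520, 480, 450, 448, 455, 480, 510, 545])

    # Quadratic fit near the dip to interpolate between table points.
    window = (depth >= 10) & (depth <= 60)
    a, b, c = np.polyfit(depth[window], dose[window], 2)
    d_min = -b / (2 * a)                  # vertex of the fitted parabola
    print(f"estimated minimum near {d_min:.1f} g/cm^2")
    ```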

    Brilliance of a fire: innocence, experience and the theory of childhood

    This essay offers an extensive rehabilitation and reappraisal of the concept of childhood innocence as a means of testing the boundaries of some prevailing constructions of childhood. It excavates in detail some of the lost histories of innocence in order to show that these are more diverse and more complex than established and pejorative assessments of them conventionally suggest. Recovering, in particular, the forgotten pedigree of the Romantic account of the innocence of childhood underlines its depth and furnishes an enriched understanding of its critical role in the coming of mass education - both as a catalyst of social change and as an alternative measure of the child-centredness of the institutions of public education. Now largely and residually confined to the inheritance of nursery education, the concept of childhood innocence, and the wider Romantic project of which it is an element, can help question the assumptions underpinning modern, competence-centred philosophies of childhood.

    LSST Science Book, Version 2.0

    A survey that can cover the sky in optical bands over wide fields to faint magnitudes with a fast cadence will enable many of the exciting science opportunities of the next decade. The Large Synoptic Survey Telescope (LSST) will have an effective aperture of 6.7 meters and an imaging camera with a field of view of 9.6 deg^2, and will be devoted to a ten-year imaging survey over 20,000 deg^2 south of +15 deg. Each pointing will be imaged 2000 times with fifteen-second exposures in six broad bands from 0.35 to 1.1 microns, to a total point-source depth of r~27.5. The LSST Science Book describes the basic parameters of the LSST hardware, software, and observing plans. The book discusses educational and outreach opportunities, then goes on to describe a broad range of science that LSST will revolutionize: mapping the inner and outer Solar System, stellar populations in the Milky Way and nearby galaxies, the structure of the Milky Way disk and halo and other objects in the Local Volume, transient and variable objects at both low and high redshift, and the properties of normal and active galaxies at low and high redshift. It then turns to far-field cosmological topics, exploring the properties of supernovae to z~1, strong and weak lensing, the large-scale distribution of galaxies and baryon oscillations, and how these different probes may be combined to constrain cosmological models and the physics of dark energy.
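    The quoted coadded depth follows from standard signal-to-noise scaling: for background-limited photometry, stacking N equal exposures deepens the limiting magnitude by 2.5 log10(√N) = 1.25 log10 N. A worked check follows; the single-visit depth of r ≈ 24.7 and the assumption that roughly 180 of the ~2000 total visits land in the r band are illustrative inputs, not figures from the abstract.

    ```python
    import math

    def coadd_depth(single_visit_mag, n_visits):
        """Background-limited coadd: limiting magnitude deepens by
        2.5*log10(sqrt(N)) when N equal exposures are stacked."""
        return single_visit_mag + 1.25 * math.log10(n_visits)

    # Assumed values for illustration: r-band single-visit depth ~24.7
    # and ~180 r-band visits out of the ~2000 total.
    print(f"r coadd ~ {coadd_depth(24.7, 180):.1f}")  # ~27.5, matching the text
    ```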