
    AOIPS 3 user's guide. Volume 2: Program descriptions

    The Atmospheric and Oceanographic Information Processing System (AOIPS) 3 is the version of the AOIPS software as of April 1989. The AOIPS software was developed jointly by the Goddard Space Flight Center and General Sciences Corporation. A detailed description of every AOIPS program is presented. It is intended to serve as a reference for such items as program functionality, program operational instructions, and input/output variable descriptions. Program descriptions are derived from the on-line help information. Each program description is divided into two sections. The functional description section describes the purpose of the program and contains any pertinent operational information. The program description section lists the program variables as they appear on-line and describes them in detail.

    The engineering design integration (EDIN) system

    A digital computer program complex for the evaluation of aerospace vehicle preliminary designs is described. The system consists of a Univac 1100 series computer and peripherals using the Exec 8 operating system, a set of demand access terminals of the alphanumeric and graphics types, and a library of independent computer programs. Modification of the partial run streams, data base maintenance and construction, and control of program sequencing are provided by a data manipulation program called the DLG processor. The executive control of library program execution is performed by the Univac Exec 8 operating system through a user-established run stream. A combination of demand and batch operations is employed in the evaluation of preliminary designs. Applications accomplished with the EDIN system are described.

    Aerodynamic preliminary analysis system 2. Part 2: User's manual

    An aerodynamic analysis system based on potential theory at subsonic and/or supersonic speeds and impact-type finite element solutions at hypersonic conditions is described. Three-dimensional configurations with multiple nonplanar surfaces of arbitrary planform and bodies of noncircular contour may be analyzed. Static, rotary, and control longitudinal and lateral-directional characteristics may be generated. The analysis was implemented on a time-sharing system in conjunction with an input tablet digitizer and an interactive graphics input/output display and editing terminal to maximize its responsiveness to preliminary analysis. Computation times on an IBM 3081 are typically less than one minute of CPU per Mach number at subsonic, supersonic, or hypersonic speeds. This is a user's manual for the computer program.

    High Resolution Mesoscale Weather Data Improvement to Spatial Effects for Doserate Contour Plot Predictions

    Reanalysis weather data is obtained for dates surrounding historical nuclear tests and processed through Regional Atmospheric Modeling System (RAMS) software to produce a high-resolution weather forecast. Output from RAMS is visualized to check for validity and input into Hazard Prediction and Assessment Capability (HPAC) software, and the modeled predictions are compared to historical observation data. Simulations are conducted using constant high-resolution weather and varying terrain resolution. The HPAC prediction is numerically compared to historical observation data. This research found that early-time, low-altitude wind data was neglected by HPAC's incorporation of the Defense Land Fallout Interpretive Code (DELFIC) Cloud Rise Module, resulting in HPAC predictions being inaccurate for early fallout deposition.

    The Helioseismic and Magnetic Imager (HMI) Vector Magnetic Field Pipeline: Overview and Performance

    The Helioseismic and Magnetic Imager (HMI) began near-continuous full-disk solar measurements on 1 May 2010 from the Solar Dynamics Observatory (SDO). An automated processing pipeline keeps pace with observations to produce observable quantities, including the photospheric vector magnetic field, from sequences of filtergrams. The primary 720s observables were released in mid 2010, including Stokes polarization parameters measured at six wavelengths as well as intensity, Doppler velocity, and the line-of-sight magnetic field. More advanced products, including the full vector magnetic field, are now available. Automatically identified HMI Active Region Patches (HARPs) track the location and shape of magnetic regions throughout their lifetime. The vector field is computed using the Very Fast Inversion of the Stokes Vector (VFISV) code optimized for the HMI pipeline; the remaining 180 degree azimuth ambiguity is resolved with the Minimum Energy (ME0) code. The Milne-Eddington inversion is performed on all full-disk HMI observations. The disambiguation, until recently run only on HARP regions, is now implemented for the full disk. Vector and scalar quantities in the patches are used to derive active region indices potentially useful for forecasting; the data maps and indices are collected in the SHARP data series, hmi.sharp_720s. Patches are provided in both CCD and heliographic coordinates. HMI provides continuous coverage of the vector field, but has modest spatial, spectral, and temporal resolution. Coupled with limitations of the analysis and interpretation techniques, effects of the orbital velocity, and instrument performance, the resulting measurements have a certain dynamic range and sensitivity and are subject to systematic errors and uncertainties that are characterized in this report. Comment: 42 pages, 19 figures, accepted to Solar Physics.

    SEAPAK user's guide, version 2.0. Volume 2: Descriptions of programs

    SEAPAK is a user-interactive satellite data analysis package that was developed for the processing and interpretation of Nimbus-7/Coastal Zone Color Scanner (CZCS) and NOAA Advanced Very High Resolution Radiometer (AVHRR) data. Significant revisions were made since version 1.0, and the ancillary environmental data analysis module was greatly expanded. The package continues to be user friendly and user interactive. Also, because the scientific goals of the ocean color research being conducted have shifted to large space and time scales, batch processing capabilities for both satellite and ancillary environmental data analyses were enhanced, thus allowing for large quantities of data to be ingested and analyzed.

    Sensory processing and world modeling for an active ranging device

    In this project, we studied world modeling and sensory processing for laser range data. World model data representation and operation were defined. Sensory processing algorithms for point processing and linear feature detection were designed and implemented. The interface between world modeling and sensory processing in the Servo and Primitive levels was investigated and implemented. In the Primitive level, linear feature detectors for edges were also implemented, analyzed, and compared. Existing world model representations are surveyed. Also presented is the design and implementation of the Y-frame model, a hierarchical world model. The interfaces between the world model module and the sensory processing module are discussed, as well as the linear feature detectors that were designed and implemented.

    Terrain and Spatial Effects on Hazard Prediction and Assessment Capability (HPAC) Software Dose-Rate Contour Plot Predictions as Compared to a Sample of Local Fallout Data from Test Detonations in the Continental United States, 1945-1962

    Hazard Prediction and Assessment Capability (HPAC) software is validated by comparing modeled predictions to historical test data. Reanalysis weather data is acquired and reformatted for use in HPAC. Simulations are made using various amounts of weather data by varying the spatial domain, and are also varied by level of terrain resolution. The predicted output of the software is numerically compared to historical test data. This research found that HPAC prediction accuracy is improved by using terrain resolutions beyond the flat-earth assumption. Furthermore, this research establishes that domain size variation produces no significant advantage in the accuracy of the prediction.

    Distributed Finite Element Analysis Using a Transputer Network

    The principal objective of this research effort was to demonstrate the extraordinarily cost effective acceleration of finite element structural analysis problems using a transputer-based parallel processing network. This objective was accomplished in the form of a commercially viable parallel processing workstation. The workstation is a desktop size, low-maintenance computing unit capable of supercomputer performance yet costs two orders of magnitude less. To achieve the principal research objective, a transputer based structural analysis workstation termed XPFEM was implemented with linear static structural analysis capabilities resembling commercially available NASTRAN. Finite element model files, generated using the on-line preprocessing module or external preprocessing packages, are downloaded to a network of 32 transputers for accelerated solution. The system currently executes at about one third Cray X-MP24 speed but additional acceleration appears likely. For the NASA selected demonstration problem of a Space Shuttle main engine turbine blade model with about 1500 nodes and 4500 independent degrees of freedom, the Cray X-MP24 required 23.9 seconds to obtain a solution while the transputer network, operated from an IBM PC-AT compatible host computer, required 71.7 seconds. Consequently, the $80,000 transputer network demonstrated a cost-performance ratio about 60 times better than the $15,000,000 Cray X-MP24 system.
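    The "about 60 times" figure follows directly from the costs and solution times the abstract reports. A minimal sketch of the arithmetic, using only the numbers given above (cost-performance here means cost multiplied by solution time, since performance is inversely proportional to time):

    ```python
    # Reproduce the abstract's cost-performance comparison.
    # All figures are taken from the abstract itself.
    cray_cost = 15_000_000      # Cray X-MP24 system cost, dollars
    transputer_cost = 80_000    # 32-transputer network cost, dollars

    cray_time = 23.9            # seconds to solve the turbine-blade model
    transputer_time = 71.7      # seconds on the transputer network

    # Lower cost * time is better; the ratio compares the two systems.
    ratio = (cray_cost * cray_time) / (transputer_cost * transputer_time)
    print(round(ratio, 1))  # 62.5, i.e. "about 60 times better"
    ```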

    The contour tree image encoding technique and file format

    The process of contourization is presented, which converts a raster image into a discrete set of plateaux or contours. These contours can be grouped into a hierarchical structure, defining total spatial inclusion, called a contour tree. A contour coder has been developed which fully describes these contours in a compact and efficient manner and is the basis for an image compression method. Simplification of the contour tree has been undertaken by merging contour tree nodes, thus lowering the contour tree's entropy. This can be exploited by the contour coder to increase the image compression ratio. By applying general and simple rules derived from physiological experiments on the human vision system, lossy image compression can be achieved which minimises noticeable artifacts in the simplified image. The contour merging technique offers a complementary lossy compression system to the QDCT (Quantised Discrete Cosine Transform). The artifacts introduced by the two methods are very different; QDCT produces a general blurring and adds extra highlights in the form of overshoots, whereas contour merging sharpens edges, reduces highlights and introduces a degree of false contouring. A format based on the contourization technique which caters for most image types is defined, called the contour tree image format. Image operations directly on this compressed format have been studied, which for certain manipulations can offer significant operational speed increases over using a standard raster image format. A couple of examples of operations specific to the contour tree format are presented showing some of the features of the new format. (Science and Engineering Research Council)
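    The decomposition step the abstract calls contourization can be illustrated with a small sketch: splitting a raster image into connected plateaux of constant pixel value. This is only the plateau-extraction stage, under the assumption of 4-connectivity; the paper's contour coder and the hierarchical contour-tree construction are not reproduced here, and all names below are illustrative rather than from the thesis.

    ```python
    # Sketch: extract constant-value plateaux from a 2D raster via flood fill.
    # 4-connectivity is an assumption; the original work may differ.
    from collections import deque

    def plateaux(image):
        """Label each 4-connected region of equal pixel value.

        Returns (labels, count), where labels mirrors the image shape."""
        rows, cols = len(image), len(image[0])
        labels = [[None] * cols for _ in range(rows)]
        count = 0
        for r in range(rows):
            for c in range(cols):
                if labels[r][c] is not None:
                    continue
                # Flood-fill one plateau of constant value.
                value, queue = image[r][c], deque([(r, c)])
                labels[r][c] = count
                while queue:
                    y, x = queue.popleft()
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if (0 <= ny < rows and 0 <= nx < cols
                                and labels[ny][nx] is None
                                and image[ny][nx] == value):
                            labels[ny][nx] = count
                            queue.append((ny, nx))
                count += 1
        return labels, count

    img = [[0, 0, 1],
           [0, 1, 1],
           [2, 2, 2]]
    labels, n = plateaux(img)
    print(n)  # 3 plateaux: the 0-region, the 1-region, and the 2-region
    ```

    Grouping these plateaux by spatial inclusion would then yield the contour tree the abstract describes.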