
    A MARKOV CHAIN ANALYSIS OF STRUCTURAL CHANGES IN THE TEXAS HIGH PLAINS COTTON GINNING INDUSTRY

    A Markov chain analysis of changes in the number and size of cotton gin firms in West Texas was conducted assuming both stationary and non-stationary transition probabilities. Projections of industry structure were made to 1999 under stationary probability assumptions and under six sets of assumed conditions for labor costs, energy costs, and technological change in the non-stationary transition model. Results indicate a continued decline in the number of firms, but labor, energy, and technology conditions alter the configuration of the structural changes.
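    The stationary-probability projection described above can be sketched in a few lines: a size distribution is repeatedly multiplied by a fixed transition matrix. The matrix values, size classes, and time span below are illustrative assumptions, not the study's estimates.

```python
import numpy as np

# Illustrative transition matrix over three hypothetical firm-size
# classes (small, medium, large). Row i gives the probability that a
# firm in class i is found in each class one year later. These numbers
# are assumptions for the sketch, not the study's estimated values.
P = np.array([
    [0.80, 0.15, 0.05],
    [0.10, 0.75, 0.15],
    [0.02, 0.08, 0.90],
])

def project(dist, P, years):
    """Project a size distribution forward under stationary probabilities."""
    for _ in range(years):
        dist = dist @ P
    return dist

initial = np.array([0.5, 0.3, 0.2])  # hypothetical starting shares
final = project(initial, P, 15)      # e.g., a projection to 1999
```

    A non-stationary variant would simply supply a different matrix P for each year, reflecting assumed labor, energy, and technology conditions.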

    Integrating planning and reactive control

    Our research is developing persistent agents that can achieve complex tasks in dynamic and uncertain environments. We refer to such agents as taskable, reactive agents. An agent of this type requires a number of capabilities. The ability to execute complex tasks necessitates the use of strategic plans for accomplishing tasks; hence, the agent must be able to synthesize new plans at run time. The dynamic nature of the environment requires that the agent be able to deal with unpredictable changes in its world. As such, agents must be able to react to unanticipated events by taking appropriate actions in a timely manner, while continuing activities that support current goals. The unpredictability of the world could lead to failure of plans generated for individual tasks. Agents must have the ability to recover from failures by adapting their activities to the new situation, or replanning if the world changes sufficiently. Finally, the agent should be able to perform in the face of uncertainty. The Cypress system, described here, provides a framework for creating taskable, reactive agents. Several features distinguish our approach: (1) the generation and execution of complex plans with parallel actions; (2) the integration of goal-driven and event-driven activities during execution; (3) the use of evidential reasoning for dealing with uncertainty; and (4) the use of replanning to handle run-time execution problems. Our model for a taskable, reactive agent has two main intelligent components, an executor and a planner. The two components share a library of possible actions that the system can take. The library encompasses a full range of action representations, including plans, planning operators, and executable procedures such as predefined standard operating procedures (SOPs). These three classes of actions span multiple levels of abstraction.
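    The planner/executor split sharing an action library can be sketched minimally as below. All class and method names are illustrative inventions for this sketch, not the Cypress API, and the "planner" here is a toy lookup table standing in for run-time plan synthesis.

```python
# Minimal sketch of an executor and planner sharing an action library,
# in the spirit of the architecture described above. Names are
# hypothetical; real run-time planning and evidential reasoning are
# out of scope for this sketch.

class ActionLibrary:
    def __init__(self):
        self.procedures = {}  # name -> callable (executable procedures / SOPs)

    def register(self, name, proc):
        self.procedures[name] = proc

class Planner:
    """Stands in for run-time plan synthesis with a canned decomposition."""
    def plan(self, goal):
        decompositions = {"deliver": ["pick_up", "move", "drop_off"]}
        return decompositions.get(goal, [])

class Executor:
    """Runs plan steps in order, recording outcomes."""
    def __init__(self, library, planner):
        self.library = library
        self.planner = planner
        self.log = []

    def run(self, goal):
        for step in self.planner.plan(goal):
            ok = self.library.procedures[step]()
            self.log.append((step, ok))
            if not ok:
                break  # a full system would replan or recover here
        return self.log

lib = ActionLibrary()
for name in ("pick_up", "move", "drop_off"):
    lib.register(name, lambda: True)  # trivially succeeding procedures
trace = Executor(lib, Planner()).run("deliver")
```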

    Buckling Testing and Analysis of Honeycomb Sandwich Panel Arc Segments of a Full-Scale Fairing Barrel Part 4: Six-ply Out-of-Autoclave Facesheets

    Four honeycomb sandwich panel types, representing 1/16th arc segments of a 10-m diameter barrel section of the Heavy Lift Launch Vehicle (HLLV), were manufactured and tested under the NASA Composites for Exploration program and the NASA Constellation Ares V program. Two configurations were chosen for the panels: 6-ply facesheets with 1.125 in. honeycomb core and 8-ply facesheets with 1.000 in. honeycomb core. Additionally, two separate carbon fiber/epoxy material systems were chosen for the facesheets: in-autoclave IM7/977-3 and out-of-autoclave T40-800b/5320-1. Smaller 3 ft. by 5 ft. panels were cut from the 1/16th barrel sections. These panels were tested under compressive loading at the NASA Langley Research Center (LaRC). Furthermore, linear eigenvalue and geometrically nonlinear finite element analyses were performed to predict the compressive response of each 3 ft. by 5 ft. panel. This manuscript summarizes the experimental and analytical modeling efforts pertaining to the panels composed of 6-ply, T40-800b/5320-1 facesheets (referred to as Panels D). To improve the robustness of the geometrically nonlinear finite element model, measured surface imperfections were included in the geometry of the model. Both the linear and nonlinear models yield good qualitative and quantitative predictions. Additionally, it was correctly predicted that the panel would fail in buckling prior to failing in strength. Furthermore, three-dimensional (3D) effects on the compressive response of the panel were studied.

    Scaling issues in ensemble implementations of the Deutsch-Jozsa algorithm

    We discuss the ensemble version of the Deutsch-Jozsa (DJ) algorithm, which attempts to provide a "scalable" implementation on an expectation-value NMR quantum computer. We show that this ensemble implementation of the DJ algorithm is at best as efficient as the classical random algorithm. As soon as any attempt is made to classify all possible functions with certainty, the implementation requires an exponentially large number of molecules. The discrepancies arise out of the interpretation of mixed-state density matrices. (Comment: minor changes, reference added, replaced with published version.)
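    The classical random algorithm that the ensemble implementation is compared against is easy to sketch: sample the function at random inputs, declare it balanced on the first pair of differing outputs, and otherwise guess constant. After k agreeing samples of a balanced function, the error probability is at most 2**-(k-1). The function names below are illustrative.

```python
import random

def classify(f, n_bits, samples=10):
    """Classical randomized test for the Deutsch-Jozsa promise problem.

    Declares 'balanced' on the first differing output; after `samples`
    agreeing evaluations, guesses 'constant' (wrong with probability at
    most 2**-(samples - 1) for a balanced f)."""
    first = f(random.randrange(2 ** n_bits))
    for _ in range(samples - 1):
        if f(random.randrange(2 ** n_bits)) != first:
            return "balanced"
    return "constant"

constant_f = lambda x: 0       # constant on all inputs
balanced_f = lambda x: x & 1   # low bit: 0 on half the inputs, 1 on the other half
```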

    Advanced aeroservoelastic stabilization techniques for hypersonic flight vehicles

    Advanced high performance vehicles, including statically unstable Single-Stage-To-Orbit (SSTO) hypersonic flight vehicles, require higher-bandwidth flight control systems to compensate for the instability, resulting in interactions between the flight control system, the engine/propulsion dynamics, and the low frequency structural modes. Military specifications, such as MIL-F-9490D and MIL-F-87242, tend to limit treatment of structural modes to conventional gain stabilization techniques. The conventional gain stabilization techniques, however, introduce low frequency effective time delays which can be troublesome from a flying qualities standpoint. These time delays can be alleviated by appropriate blending of gain and phase stabilization techniques (referred to as Hybrid Phase Stabilization or HPS) for the low frequency structural modes. The potential of using HPS for compensating structural mode interaction was previously explored. It was shown that effective time delay was significantly reduced with the use of HPS; however, the HPS design was seen to have greater residual response than a conventional gain-stabilized design. Additional work performed to advance and refine the HPS design procedure, to further develop residual response metrics as a basis for alternative structural stability specifications, and to develop strategies for validating HPS design and specification concepts in manned simulation is presented. Stabilization design sensitivity to structural uncertainties and aircraft-centered requirements are also assessed.

    Lidar-Based Navigation Algorithm for Safe Lunar Landing

    The purpose of Hazard Relative Navigation (HRN) is to provide measurements to the Navigation Filter so that it can limit errors on the position estimate after hazards have been detected. The hazards are detected by processing a hazard digital elevation map (HDEM). The HRN process takes lidar images as the spacecraft descends to the surface and matches these to the HDEM to compute relative position measurements. Since the HDEM has the hazards embedded in it, the position measurements are relative to the hazards, hence the name Hazard Relative Navigation.
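    The core matching step can be illustrated with a toy template search: slide a small lidar elevation patch over the HDEM and take the offset minimizing the sum of squared differences. The real HRN pipeline is far more involved; grid sizes, the error metric, and the function name here are assumptions for the sketch.

```python
import numpy as np

def match_patch(hdem, patch):
    """Return the (row, col) offset where the patch best aligns with the DEM,
    by brute-force sum-of-squared-differences search. Illustrative only."""
    H, W = hdem.shape
    h, w = patch.shape
    best, best_rc = np.inf, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            err = np.sum((hdem[r:r + h, c:c + w] - patch) ** 2)
            if err < best:
                best, best_rc = err, (r, c)
    return best_rc

rng = np.random.default_rng(0)
hdem = rng.random((12, 12))     # synthetic hazard DEM
patch = hdem[3:6, 4:7].copy()   # "lidar" patch known to come from offset (3, 4)
offset = match_patch(hdem, patch)
```

    The recovered offset, expressed in map coordinates, is what becomes a hazard-relative position measurement for the filter.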

    A general few-projection method for tomographic reconstruction of samples consisting of several distinct materials

    We present a method for tomographic reconstruction of objects containing several distinct materials, which is capable of accurately reconstructing a sample from vastly fewer angular projections than required by conventional algorithms. The algorithm is more general than many previous discrete tomography methods, as: (i) a priori knowledge of the exact number of materials is not required; (ii) the linear attenuation coefficient of each constituent material may assume a small range of a priori unknown values. We present reconstructions from an experimental x-ray computed tomography scan of cortical bone acquired at the SPring-8 synchrotron.

    The potential impact of Ocean Thermal Energy Conversion (OTEC) on fisheries

    The commercial development of ocean thermal energy conversion (OTEC) operations will involve some environmental perturbations for which there is no precedent. The pumping of very large volumes of warm surface water and cold deep water and its subsequent discharge will result in the impingement, entrainment, and redistribution of biota. Additional stresses to biota will be caused by biocide usage and temperature depressions. However, the artificial upwelling of nutrients associated with the pumping of cold deep water, and the artificial reef created by an OTEC plant, may have positive effects on the local environment. Although more detailed information is needed to assess the net effect of an OTEC operation on fisheries, certain assumptions and calculations are made supporting the conclusion that the potential risk to fisheries is not significant enough to deter the early development of OTEC. It will be necessary to monitor a commercial-scale plant in order to remove many of the remaining uncertainties.

    Robust Machine Learning Applied to Astronomical Datasets I: Star-Galaxy Classification of the SDSS DR3 Using Decision Trees

    We provide classifications for all 143 million non-repeat photometric objects in the Third Data Release of the Sloan Digital Sky Survey (SDSS) using decision trees trained on 477,068 objects with SDSS spectroscopic data. We demonstrate that these star/galaxy classifications are expected to be reliable for approximately 22 million objects with r < ~20. The general machine learning environment Data-to-Knowledge and supercomputing resources enabled extensive investigation of the decision tree parameter space. This work presents the first public release of objects classified in this way for an entire SDSS data release. The objects are classified as either galaxy, star, or nsng (neither star nor galaxy), with an associated probability for each class. To demonstrate how to effectively make use of these classifications, we perform several important tests. First, we detail selection criteria within the probability space defined by the three classes to extract samples of stars and galaxies to a given completeness and efficiency. Second, we investigate the efficacy of the classifications and the effect of extrapolating from the spectroscopic regime by performing blind tests on objects in the SDSS, 2dF Galaxy Redshift and 2dF QSO Redshift (2QZ) surveys. Given the photometric limits of our spectroscopic training data, we effectively begin to extrapolate past our star-galaxy training set at r ~ 18. By comparing the number counts of our training sample with the classified sources, however, we find that our efficiencies appear to remain robust to r ~ 20. As a result, we expect our classifications to be accurate for 900,000 galaxies and 6.7 million stars, and remain robust via extrapolation for a total of 8.0 million galaxies and 13.9 million stars. [Abridged] (Comment: 27 pages, 12 figures, to be published in ApJ; uses emulateapj.cls.)
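    Selecting a sample within the three-class probability space reduces to a thresholding rule: keep objects whose probability for the desired class exceeds a cut, with the cut trading completeness against efficiency (purity). The threshold, field names, and toy catalog below are illustrative, not the paper's actual criteria.

```python
# Sketch of sample selection in (galaxy, star, nsng) probability space.
# The 0.8 threshold and the catalog entries are hypothetical examples.

def select(objects, cls, threshold=0.8):
    """Keep objects whose probability for `cls` exceeds the threshold.

    Raising the threshold increases efficiency (purity) at the cost of
    completeness; lowering it does the reverse."""
    return [obj for obj in objects if obj[cls] > threshold]

catalog = [
    {"id": 1, "galaxy": 0.95, "star": 0.03, "nsng": 0.02},
    {"id": 2, "galaxy": 0.40, "star": 0.55, "nsng": 0.05},
    {"id": 3, "galaxy": 0.10, "star": 0.85, "nsng": 0.05},
]
galaxies = select(catalog, "galaxy")  # -> object 1 only
stars = select(catalog, "star")       # -> object 3 only
```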