
    Enhancing aeropropulsion research with high-speed interactive computing

    NASA-Lewis has committed to a long-range goal of creating a numerical test cell for aeropropulsion research and development. Efforts are underway to develop a first-generation Numerical Propulsion System Simulation (NPSS). The NPSS will provide a unique capability to numerically simulate advanced propulsion systems from nose to tail. Two essential ingredients of the NPSS are: (1) experimentally validated Computational Fluid Dynamics (CFD) codes; and (2) high-performance computing systems (hardware and software) that will permit those codes to be used efficiently. To this end, NASA-Lewis is using high-speed, interactive computing as a means of achieving Integrated CFD and Experiments (ICE). The development of a prototype ICE system for multistage compressor flow physics research is described.

    Statistical Reliability Estimation of Microprocessor-Based Systems

    What is the probability that the execution state of a given microprocessor running a given application is correct, in a certain working environment with a given soft-error rate? Trying to answer this question using fault injection can be very expensive and time consuming. This paper proposes the baseline for a new methodology, based on microprocessor error probability profiling, that aims at estimating fault injection results without the need for a typical fault-injection setup. The proposed methodology is based on two main ideas: a one-time fault-injection analysis of the microprocessor architecture to characterize the probability of successful execution of each of its instructions in the presence of a soft error, and a static and very fast analysis of the control and data flow of the target software application to compute its probability of success. The presented work goes beyond the dependability evaluation problem; it also has the potential to become the backbone for new tools able to help engineers choose the best hardware and software architecture to structurally maximize the probability of correct execution of the target software.
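    The following is a minimal sketch of the profiling idea described in the abstract: per-instruction success probabilities, as would be obtained from a one-time fault-injection characterization, are combined over a static instruction profile of the target application. The opcode names, probability values, and the independence assumption are illustrative choices, not data or formulas from the paper.

    # Hypothetical per-opcode probability of correct execution in the
    # presence of a single soft error (values are made up for illustration).
    INSTRUCTION_SUCCESS_PROB = {
        "add": 0.97,
        "load": 0.91,
        "store": 0.93,
        "branch": 0.88,
    }

    def program_success_probability(instruction_counts):
        """Estimate the probability that a run executes correctly, treating
        instruction outcomes as independent events (a simplifying assumption)."""
        prob = 1.0
        for opcode, count in instruction_counts.items():
            prob *= INSTRUCTION_SUCCESS_PROB[opcode] ** count
        return prob

    # Toy static profile of a target application's control and data flow
    counts = {"add": 120, "load": 80, "store": 40, "branch": 30}
    print(f"Estimated probability of correct execution: {program_success_probability(counts):.4f}")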

    Multistep Measurement of Plantar Pressure Alterations Using Metatarsal Pads

    Metatarsal pads are frequently prescribed for nonoperative management of metatarsalgia due to various etiologies. When appropriately placed, they are effective in reducing pressures under the metatarsal heads on the plantar surface of the foot. Despite the positive clinical reports that have been cited, there are no quantitative studies documenting the load redistribution effects of these pads during multiple-step usage within the shoe environment. The objective of this study was to assess changes in plantar pressure metrics resulting from pad use. Ten normal adult male subjects were tested during a series of 400-step trials. Pressures were recorded from eight discrete plantar locations at the hindfoot, midfoot, and forefoot regions of the insole. Significant increases in peak pressures, contact durations, and pressure-time integrals were noted at the metatarsal shaft region with pad use (P ≤ .05). Statistically significant changes in metric values were not seen at the other plantar locations, although metatarsal pad use resulted in mild decreases in mean peak pressures at the first and second metatarsal heads and slight increases laterally. Contact durations decreased at all metatarsal head locations, while pressure-time integrals decreased at the first, second, third, and fourth metatarsal heads. A slight increase in pressure-time integrals was seen at the fifth metatarsal head. The redistribution of plantar pressures tended to relate not only to the dimensions of the metatarsal pads, but also to foot size, anatomic foot configuration, and pad location. Knowledge of these parameters, along with careful control of pad dimensions and placement, allows use of the metatarsal pad as an effective orthotic device for redistributing forefoot plantar pressures.
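    As an illustration of the metrics named in this abstract (peak pressure, contact duration, and pressure-time integral), the sketch below computes them from a sampled pressure signal at a single sensor site. The sampling rate, contact threshold, and toy trace are assumed values, not taken from the study.

    def plantar_metrics(pressure_kpa, sample_rate_hz=100.0, contact_threshold_kpa=10.0):
        """Return (peak pressure in kPa, contact duration in s,
        pressure-time integral in kPa*s) for one sensor site."""
        dt = 1.0 / sample_rate_hz
        contact_samples = [p for p in pressure_kpa if p > contact_threshold_kpa]
        peak = max(pressure_kpa) if pressure_kpa else 0.0
        contact_duration = len(contact_samples) * dt
        # Rectangular-rule integral of pressure over the contact phase
        pressure_time_integral = sum(contact_samples) * dt
        return peak, contact_duration, pressure_time_integral

    # Toy stance-phase trace (kPa) under a single metatarsal head sensor
    trace = [0, 5, 40, 120, 180, 150, 90, 30, 8, 0]
    print(plantar_metrics(trace))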

    Chaos in computer performance

    Modern computer microprocessors are composed of hundreds of millions of transistors that interact through intricate protocols. Their performance during program execution may be highly variable and present aperiodic oscillations. In this paper, we apply current nonlinear time series analysis techniques to the performance of modern microprocessors during the execution of prototypical programs. Our results provide strong evidence that the highly variable performance dynamics observed during the execution of several programs display low-dimensional deterministic chaos, with sensitivity to initial conditions comparable to textbook models. Taken together, these results show that the instantaneous performance of modern microprocessors constitutes a complex (or at least complicated) system and would benefit from analysis with modern tools of nonlinear and complexity science.
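    One standard step in this kind of nonlinear time series analysis is delay-coordinate embedding of the measured performance signal before estimating dimensionality or sensitivity to initial conditions. The sketch below shows such an embedding for an instructions-per-cycle-like trace; the delay, embedding dimension, and trace values are arbitrary illustrative choices, not the paper's settings.

    import numpy as np

    def delay_embed(series, dim=3, delay=2):
        """Build delay vectors [s(t), s(t+delay), ..., s(t+(dim-1)*delay)]."""
        series = np.asarray(series, dtype=float)
        n = len(series) - (dim - 1) * delay
        if n <= 0:
            raise ValueError("series too short for this dim/delay combination")
        return np.column_stack([series[i * delay : i * delay + n] for i in range(dim)])

    # Toy instructions-per-cycle trace sampled at fixed intervals
    ipc_trace = [1.2, 0.9, 1.4, 1.1, 0.8, 1.3, 1.0, 0.7, 1.5, 1.2, 0.9, 1.4]
    print(delay_embed(ipc_trace, dim=3, delay=2))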

    An overview of decision table literature 1982-1995.

    This report gives an overview of the literature on decision tables over the past 15 years. As much as possible, for each reference an author-supplied abstract, a number of keywords, and a classification are provided. In some cases our own comments are added. The purpose of these comments is to show where, how, and why decision tables are used. The literature is classified according to application area, theoretical versus practical character, year of publication, country of origin (not necessarily country of publication), and the language of the document. After a description of the scope of the review, classification results and the classification by topic are presented. The main body of the paper is the ordered list of publications with abstract, classification, and comments.

    Model-based vision for space applications

    This paper describes a method for tracking moving image features by combining spatial and temporal edge information with model-based feature information. The algorithm updates the two-dimensional position of object features by correlating predicted model features with current image data. The results of the correlation process are used to compute an updated model. The algorithm makes use of a high temporal sampling rate with respect to spatial changes of the image features and operates in a real-time multiprocessing environment. Preliminary results demonstrate successful tracking for image feature velocities between 1.1 and 4.5 pixels per image frame. This work has applications for docking, assembly, retrieval of floating objects, and a host of other space-related tasks.
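    The sketch below illustrates one possible form of the predict/correlate/update loop described in the abstract, for a single 2D feature: predict the feature position, correlate a template around the prediction, and update the model from the best match. The constant-velocity prediction, the correlation score, and the search radius are illustrative assumptions, not the paper's exact algorithm.

    import numpy as np

    def track_feature(frames, template, start_xy, search_radius=4):
        """Track one feature: frames is a list of 2D grayscale arrays,
        template a small 2D patch, start_xy the initial (x, y) position."""
        th, tw = template.shape
        pos = np.array(start_xy, dtype=float)
        vel = np.zeros(2)
        path = []
        for frame in frames:
            predicted = pos + vel                      # model-based prediction
            px, py = int(round(predicted[0])), int(round(predicted[1]))
            best, best_score = predicted, -np.inf
            for dy in range(-search_radius, search_radius + 1):
                for dx in range(-search_radius, search_radius + 1):
                    x, y = px + dx, py + dy
                    if x < 0 or y < 0:
                        continue
                    patch = frame[y:y + th, x:x + tw]
                    if patch.shape != template.shape:
                        continue
                    score = float(np.sum(patch * template))   # correlation score
                    if score > best_score:
                        best_score, best = score, np.array([x, y], dtype=float)
            vel = best - pos                           # update the motion model
            pos = best
            path.append((pos[0], pos[1]))
        return path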

    Multi-Agent Cooperation for Particle Accelerator Control

    We present practical investigations in a real industrial controls environment to justify theoretical DAI (Distributed Artificial Intelligence) results, and we discuss theoretical aspects of practical investigations for accelerator control and operation. A generalized hypothesis is introduced, based on a unified view of control, monitoring, diagnosis, maintenance, and repair tasks, leading to a general method of cooperation for expert systems by exchanging hypotheses. This has been tested for task-sharing and result-sharing cooperation scenarios. Generalized hypotheses also allow us to treat the repetitive diagnosis-recovery cycle as task-sharing cooperation. Problems with such a loop, and even recursive calls between the different agents, are discussed.
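    A small sketch of what hypothesis-exchange cooperation between agents could look like is given below: expert-system agents for diagnosis and repair post and consume shared hypotheses, covering both task-sharing and result-sharing exchanges. The Hypothesis fields, the blackboard-style exchange, and the agent names are illustrative assumptions rather than the architecture used at the accelerator.

    from dataclasses import dataclass, field

    @dataclass
    class Hypothesis:
        subject: str        # e.g. a device or subsystem name
        claim: str          # e.g. "output ripple too high"
        confidence: float
        proposed_by: str

    @dataclass
    class Blackboard:
        hypotheses: list = field(default_factory=list)

        def post(self, hyp):
            self.hypotheses.append(hyp)

        def relevant_to(self, subject):
            return [h for h in self.hypotheses if h.subject == subject]

    # A diagnosis agent posts a hypothesis; a repair agent consumes it and
    # posts the result of its recovery attempt (result-sharing cooperation).
    board = Blackboard()
    board.post(Hypothesis("magnet_ps_3", "output ripple too high", 0.7, "diagnosis_agent"))
    for hyp in board.relevant_to("magnet_ps_3"):
        board.post(Hypothesis(hyp.subject, "recovery attempted for: " + hyp.claim, 0.9, "repair_agent"))
    print(board.hypotheses)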