
    A generic applications subroutine library for the MPP

    A new methodology to increase the utility of the Massively Parallel Processor (MPP) was developed and is presented as an addition to the current methods of using the MPP. This methodology provides for the development of an MPP-side abstraction layer that is callable from any host-side high-level language. Routines in the abstraction layer have the option of using a powerful software tool for accessing the stager as virtual memory. An additional abstraction layer that allows remote access to the MPP via DECnet is discussed. This integrated approach to programming the MPP is a valuable tool for the implementation of interactive user-driven systems that require the computational capabilities of the MPP as well as a controlled user view. It is expected that this methodology will be used to integrate the MPP into many such systems, and thus promote greater use of the MPP by scientific researchers who are accustomed to user-friendly environments.
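    A minimal sketch of what such a host-side wrapper might look like, assuming a shared library named libmpp_abstraction.so that exposes routines with the invented names below; modern Python and ctypes stand in here for the host-side high-level language of the era, since the abstract does not list the actual interface.

```python
# Hypothetical host-side view of an MPP abstraction layer.  The library
# name and every routine name are illustrative only, not the paper's API.
import ctypes

# Load the (hypothetical) MPP-side abstraction layer from the host.
_mpp = ctypes.CDLL("libmpp_abstraction.so")

def mpp_array_add(a_id: int, b_id: int, out_id: int) -> None:
    """Ask the MPP to add two arrays held in stager memory.

    The integer handles stand in for the 'stager as virtual memory'
    facility: the host never touches the data, it only names it.
    """
    _mpp.mpp_array_add(a_id, b_id, out_id)

def mpp_connect_remote(node: str) -> None:
    """Open a DECnet-style remote session to the MPP host (illustrative)."""
    _mpp.mpp_connect(node.encode("ascii"))
```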

    Improving reconfigurable systems reliability by combining periodical test and redundancy techniques: a case study

    This paper reviews some traditional techniques from the fields of fault tolerance and digital-circuit testing and introduces them to the field of reconfigurable computer systems. The target area is on-board spacecraft electronics, as this class of application is a good candidate for the use of reconfigurable computing technology. Fault-tolerant strategies are used so that the system can adapt itself to the severe conditions found in space. In addition, the paper describes some problems, and possible solutions, for the use of reconfigurable components based on programmable logic in space applications.
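    As a toy illustration of combining redundancy with periodic testing, the sketch below votes over three redundant module outputs (classic triple modular redundancy) and periodically runs a known-answer self-test that triggers repair of a failed replica; the fault model and all names are invented for illustration and are not taken from the paper.

```python
def tmr_vote(outputs):
    """Majority vote over three redundant module outputs."""
    a, b, c = outputs
    return a if a == b or a == c else b

class Module:
    """A redundant compute module that may develop a stuck-at fault."""
    def __init__(self):
        self.faulty = False
    def compute(self, x):
        return 0 if self.faulty else x * x  # stuck-at-0 when faulty
    def self_test(self):
        return self.compute(3) == 9         # known-answer test

modules = [Module() for _ in range(3)]
modules[1].faulty = True                    # inject a fault in one replica

for cycle in range(1, 6):
    result = tmr_vote([m.compute(cycle) for m in modules])
    # Periodic test: every other cycle, check each replica and repair
    # any module that fails (stands in for FPGA reconfiguration).
    if cycle % 2 == 0:
        for m in modules:
            if not m.self_test():
                m.faulty = False
    print(f"cycle {cycle}: voted result = {result}")
```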

    Explaining Excess Stock Return Through Options Market Sentiment

    Option markets are a fascinating area of study, and recent research has indicated that information obtained from the options market can be used to explain price returns in the underlying stock market. Building on existing asset pricing models such as the Fama-French Three-Factor, Carhart Four-Factor, and Fama-French Five-Factor Models, this research tests whether the put-to-call ratio can be used as an additional factor in explaining excess returns. Ordinary least squares models are run on all Dow Jones 30 stocks using more than ten years of data, and the model results are compared. The results show that in a majority of cases, asset pricing models that include the ratio of put options to call options explain excess stock returns better than models that do not include information from the options market. These results provide supporting evidence that the options market contains valuable information about underlying stock price performance.
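    A hedged sketch of the kind of regression involved: ordinary least squares of one stock's excess return on the Carhart factors, with and without a put-to-call-ratio factor. The column names and data file are placeholders; the paper's exact factor construction is not reproduced here.

```python
import pandas as pd
import statsmodels.api as sm

# df is assumed to hold one stock's periodic data with these
# (hypothetical) columns: excess return, the market/size/value
# factors, momentum, and a put-to-call ratio for the stock's options.
df = pd.read_csv("stock_factors.csv")           # placeholder file name

factors_base = ["mkt_rf", "smb", "hml", "mom"]  # Carhart four-factor
factors_pcr = factors_base + ["put_call_ratio"] # augmented model

def fit(cols):
    X = sm.add_constant(df[cols])
    return sm.OLS(df["excess_ret"], X).fit()

base, augmented = fit(factors_base), fit(factors_pcr)

# Compare explanatory power with and without the options-market factor.
print(f"adj R^2 base:      {base.rsquared_adj:.4f}")
print(f"adj R^2 with P/C:  {augmented.rsquared_adj:.4f}")
print(augmented.params)
```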

    Animated computer graphics models of space and earth sciences data generated via the massively parallel processor

    The capability was developed to rapidly produce visual representations of large, complex, multi-dimensional space and earth sciences data sets by implementing computer graphics modeling techniques on the Massively Parallel Processor (MPP), employing techniques recently developed for typically non-scientific applications. Such capabilities provide a new and valuable tool for understanding complex scientific data, and a new application of parallel computing via the MPP. A prototype system with these capabilities was developed and integrated into the National Space Science Data Center's (NSSDC) Pilot Climate Data System (PCDS), a data-independent environment for computer graphics data display, to provide easy access to users. While developing these capabilities, several problems had to be solved independently of the actual use of the MPP, all of which are outlined.
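    Today the core idea, turning a gridded geophysical field into a shaded rendering that can serve as one frame of an animation, fits in a few lines. The sketch below is a modern stand-in for that pipeline, with a synthetic field in place of the PCDS data sets the original system read; it is not the paper's code.

```python
import numpy as np
import matplotlib.pyplot as plt

# Stand-in for a gridded earth-science field (e.g., a climate variable
# on a lat/lon grid); purely synthetic, for illustration only.
lat = np.linspace(-90, 90, 181)
lon = np.linspace(-180, 180, 361)
LON, LAT = np.meshgrid(lon, lat)
field = np.cos(np.radians(LAT)) * np.sin(np.radians(3 * LON))

fig, ax = plt.subplots(figsize=(8, 4))
im = ax.pcolormesh(LON, LAT, field, cmap="viridis", shading="auto")
fig.colorbar(im, ax=ax, label="field value (arbitrary units)")
ax.set_xlabel("longitude")
ax.set_ylabel("latitude")
plt.savefig("frame_000.png")  # one frame of an animation sequence
```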

    A Human Factors analysis of firefighter injury sustained during emergency response operations: Implications for error management and injury reduction in English Fire and Rescue Services.

    This research is concerned with the human factors that may contribute to firefighter injury and whether the Fire and Rescue Service (FRS) adequately acknowledges their influence when investigating, recording, analysing, or reporting accident causation. In particular, it examines the extent to which firefighters, as critical decision makers, experience the deficit outcome of their own risk-versus-benefit decisions when operating without the immediate oversight of a supervisor or commander. Studies of judgement and decision making that focus specifically on the role of the firefighter, as opposed to the incident commander, are exceptional.

    For the first time in the analysis of firefighter injury, a number of variables that represent the preconditions of accident causation, such as demographic, temporal, environmental and contextual characteristics, were analysed. An ‘error typing’ taxonomy that differentiates between decision errors, skill-based errors, perception errors and violations was used to examine the extent to which human factors are being considered by FRSs in the analysis of firefighter injury. The opportunity was also taken to examine the applicability of the Human Factors Analysis and Classification System (HFACS) (Wiegmann and Shappell 2003) to the emergency response domain of the FRS. This revealed the value of developing a valid and reliable sector-specific variant of HFACS (UKFire-HFACS). Finally, using the critical decision method, injured firefighters’ recollection of the contextual characteristics that influenced their judgements, decisions, and actions at the ‘moment-of-choice’ was explored. Combined, these three studies establish components of a Human Factors Analysis Framework (HFAF) for the FRS.

    It was established that when implementing the requirements of an incident commander’s tactical plan, firefighters are required to make critical decisions and at times sustain injury when operating without the immediate oversight of a supervisor or commander. Analysis demonstrated that the majority of injuries involve either a decision-based or skill-based error, which substantiates the existence and influence of skill fade at the ‘moment-of-choice’. It also brings FRS arrangements for the maintenance of competence into focus and makes them worthy of closer scientific scrutiny. It is also evident that the three-study approach of this research can be developed into a human factors analysis framework for the FRS. In turn, this can establish the means by which the deficit outcome of firefighter critical decision making can be better understood, enable targeted intervention and, over time, reduce reported operational injury.
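    A minimal sketch of the error-typing step described above: tallying injury records against the four HFACS unsafe-act categories the study uses. The records below are invented, since the study's data are not reproduced here.

```python
from collections import Counter

# Invented injury records, each tagged with one of the unsafe-act
# categories named in the abstract: decision error, skill-based error,
# perception error, or violation.
records = [
    "skill-based error", "decision error", "skill-based error",
    "violation", "decision error", "skill-based error",
    "perception error", "decision error",
]

counts = Counter(records)
total = len(records)
for error_type, n in counts.most_common():
    print(f"{error_type:20s} {n:3d}  ({100 * n / total:.0f}%)")
```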

    A hybrid memory kernel approach for condensed phase non-adiabatic dynamics

    The spin-boson model is a simplified Hamiltonian often used to study non-adiabatic dynamics in large condensed phase systems, even though it has not been solved in a fully analytic fashion. Herein, we present an exact analytic expression for the dynamics of the spin-boson model in the infinitely slow bath limit and generalize it to approximate dynamics for faster baths. We achieve the latter by developing a hybrid approach that combines the exact slow-bath result with the popular NIBA method to generate a memory kernel that is formally exact to second order in the diabatic coupling but also contains higher-order contributions approximated from the second-order term alone. This kernel has the same computational complexity as NIBA, but is found to yield dramatically superior dynamics in regimes where NIBA breaks down, such as systems with large diabatic coupling or energy bias. This indicates that the hybrid approach can cheaply incorporate higher-order effects into second-order methods, and could potentially be generalized to develop alternative kernel resummation schemes.
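    To illustrate how population dynamics follow once a memory kernel is in hand, the sketch below propagates a generalized master equation of the form dσ/dt = −∫₀ᵗ K(t−s) σ(s) ds with a schematic damped-oscillatory kernel. The kernel and parameters are invented for illustration; they are not the NIBA or hybrid kernel of the paper.

```python
import numpy as np

# Toy propagation of a generalized master equation,
#   d sigma / dt = - \int_0^t K(t - s) sigma(s) ds,
# with a schematic damped-oscillatory kernel (NOT the paper's kernel).
dt, n_steps = 0.01, 2000
t = np.arange(n_steps) * dt
delta, eps, gamma = 1.0, 0.5, 0.3        # illustrative parameters only
K = delta**2 * np.cos(eps * t) * np.exp(-gamma * t)

sigma = np.empty(n_steps)
sigma[0] = 1.0                           # population difference at t = 0
for n in range(1, n_steps):
    # Rectangle-rule quadrature of the memory integral over [0, t_n).
    mem = dt * np.dot(K[n:0:-1], sigma[:n])
    sigma[n] = sigma[n - 1] - dt * mem   # explicit Euler step

print(sigma[::400])                      # coarse view of the relaxation
```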

    GALA: an international multicentre randomised trial comparing general anaesthesia versus local anaesthesia for carotid surgery

    Background: Patients who have severe narrowing at or near the origin of the internal carotid artery as a result of atherosclerosis have a high risk of ischaemic stroke ipsilateral to the arterial lesion. Previous trials have shown that carotid endarterectomy improves long-term outcomes, particularly when performed soon after a prior transient ischaemic attack or mild ischaemic stroke. However, complications may occur during or soon after surgery, the most serious of which is stroke, which can be fatal. It has been suggested that performing the operation under local anaesthesia, rather than general anaesthesia, may be safer. Therefore, a prospective, randomised trial of local versus general anaesthesia for carotid endarterectomy was proposed to determine whether the type of anaesthesia influences peri-operative morbidity and mortality, quality of life, and longer-term outcome in terms of stroke-free survival.

    Methods/design: A two-arm, parallel group, multicentre randomised controlled trial with a recruitment target of 5000 patients. For entry into the study, in the opinion of the responsible clinician, the patient requiring an endarterectomy must be suitable for either local or general anaesthesia and have no clear indication for either type. All patients with symptomatic or asymptomatic internal carotid stenosis for whom open surgery is advised are eligible. There is no upper age limit. Exclusion criteria are: no informed consent; a definite preference for local or general anaesthetic by the clinician or patient; a patient unlikely to be able to co-operate with awake testing during local anaesthesia; a patient requiring simultaneous bilateral carotid endarterectomy; carotid endarterectomy combined with another operation such as coronary bypass surgery; and previous randomisation into the trial. Patients are randomised to local or general anaesthesia by the central trial office. The primary outcome is the proportion of patients alive, stroke-free (including retinal infarction) and without myocardial infarction 30 days post-surgery. Secondary outcomes include the proportion of patients alive and stroke-free at one year; health-related quality of life at 30 days; surgical adverse events, re-operation and re-admission rates; the relative cost of the two methods of anaesthesia; and length of stay and intensive and high-dependency bed occupancy.
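    As a small illustration of the composite primary endpoint, the sketch below computes, per trial arm, the proportion of patients who are alive, stroke-free and free of myocardial infarction at 30 days. The patient records are invented; the real trial's analysis is of course far more involved.

```python
# Invented per-patient records: (arm, alive, stroke_free, mi_free)
# at 30 days post-surgery; 'LA' = local, 'GA' = general anaesthesia.
patients = [
    ("LA", True, True, True), ("LA", True, False, True),
    ("LA", True, True, True), ("GA", True, True, True),
    ("GA", False, True, True), ("GA", True, True, True),
]

for arm in ("LA", "GA"):
    group = [p for p in patients if p[0] == arm]
    ok = sum(all(p[1:]) for p in group)   # composite primary outcome
    print(f"{arm}: {ok}/{len(group)} event-free at 30 days")
```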

    Holographic dark information energy: Predicted dark energy measurement

    Several models have been proposed to explain the dark energy that is causing the expansion of the universe to accelerate. Here the acceleration predicted by the Holographic Dark Information Energy (HDIE) model is compared to the acceleration that would be produced by a cosmological constant. While identical to a cosmological constant at low redshifts, z < 1, the HDIE model results in smaller Hubble parameter values at higher redshifts, z > 1, reaching a maximum difference of 2.6 ± 0.5% around z ≈ 1.7. The next generation of dark energy measurements, both those scheduled to be made in space (ESA's Euclid and NASA's WFIRST missions) and those to be made on the ground (BigBOSS, LSST and the Dark Energy Survey), should be capable of determining whether such a difference signature exists. The HDIE model is therefore falsifiable.
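    A hedged numeric sketch of the comparison: the standard flat-FRW Hubble parameter H(z)/H0 for ΛCDM versus a toy dark-energy model whose equation of state turns phantom above z = 1, so that its H(z) falls below ΛCDM at high redshift as the abstract describes. The toy w(z) and Ωm value are invented stand-ins; the actual HDIE form is not reproduced here.

```python
import numpy as np

# Flat FRW Hubble parameter H(z)/H0 for a dark-energy equation of
# state w(z); w = -1 reproduces the cosmological constant exactly.
om = 0.3  # matter density parameter (illustrative value)

def E(z, w_of_z, n=2000):
    """H(z)/H0 via the standard integral of 3(1 + w)/(1 + z')."""
    zz = np.linspace(0.0, z, n)
    integ = np.trapz(3 * (1 + w_of_z(zz)) / (1 + zz), zz)
    return np.sqrt(om * (1 + z) ** 3 + (1 - om) * np.exp(integ))

w_lcdm = lambda z: np.full_like(z, -1.0, dtype=float)
# Toy stand-in for HDIE: like Lambda at z < 1, phantom above z = 1,
# which makes the dark-energy density (and hence H) smaller at high z.
w_toy = lambda z: np.where(z < 1.0, -1.0, -1.2)

for z in (0.5, 1.0, 1.7, 2.5):
    d = E(z, w_toy) / E(z, w_lcdm) - 1
    print(f"z = {z:3.1f}:  fractional H difference = {100 * d:+.2f}%")
```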