
    Distributed Finite Element Analysis Using a Transputer Network

    The principal objective of this research effort was to demonstrate the extraordinarily cost-effective acceleration of finite element structural analysis problems using a transputer-based parallel processing network. This objective was accomplished in the form of a commercially viable parallel processing workstation. The workstation is a desktop-size, low-maintenance computing unit capable of supercomputer performance yet costs two orders of magnitude less. To achieve the principal research objective, a transputer-based structural analysis workstation termed XPFEM was implemented with linear static structural analysis capabilities resembling commercially available NASTRAN. Finite element model files, generated using the on-line preprocessing module or external preprocessing packages, are downloaded to a network of 32 transputers for accelerated solution. The system currently executes at about one third Cray X-MP24 speed, but additional acceleration appears likely. For the NASA-selected demonstration problem of a Space Shuttle main engine turbine blade model with about 1500 nodes and 4500 independent degrees of freedom, the Cray X-MP24 required 23.9 seconds to obtain a solution while the transputer network, operated from an IBM PC-AT compatible host computer, required 71.7 seconds. Consequently, the $80,000 transputer network demonstrated a cost-performance ratio about 60 times better than the $15,000,000 Cray X-MP24 system.
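
    As a rough check of the figure quoted above, reading "cost-performance" as solution throughput per dollar (a plausible interpretation, not one stated in the abstract) and using the stated run times and prices reproduces a ratio of roughly 60:

    $$\frac{t_{\mathrm{Cray}} \times \mathrm{price}_{\mathrm{Cray}}}{t_{\mathrm{transputer}} \times \mathrm{price}_{\mathrm{transputer}}} = \frac{23.9\,\mathrm{s} \times \$15{,}000{,}000}{71.7\,\mathrm{s} \times \$80{,}000} \approx 62$$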

    On line tracking of moving objects from moving platforms

    It is desired to position the end point of a conveyor belt, which carries material removed by a moving pavement trimmer, above the bed of a moving dump truck. The present report describes the analytical design and practical control of a tracking system for positioning the conveyor. Initial tests were conducted on a Unimation PUMA robot. The original pavement profiler has been modified to allow automatic computer control of both the soil removal and distribution systems. The distribution is performed by a two-degree-of-freedom moveable boom with a conveyor system. Two methods for non-contact target position detection were evaluated: machine vision and ultrasound. An ultrasound-based target system was selected and implemented on a PUMA robot. Control software for on-line target acquisition and tracking was developed and tested. A set of ultrasound sensors and a boom rotation sensor were installed on the pavement profiler. All sensors are currently operational.
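
    The tracking problem described above (nulling the measured offset between the conveyor end point and the truck bed using ultrasound readings and a boom rotation command) can be illustrated with a minimal sketch. The sensor model, gains, and function names below are assumptions for illustration, not the project's actual control software.

```python
# Hypothetical sketch of an on-line tracking loop: ultrasound readings give the
# lateral offset of the truck bed from the conveyor end point, and the boom
# rotation rate is commanded to null that offset (1-D simplification).
import random

KP = 0.8           # proportional gain on offset (assumed)
DEADBAND_M = 0.05  # ignore offsets below 5 cm (assumed)
MAX_RATE = 0.3     # boom rotation rate limit, rad/s (assumed)

def read_ultrasound_offset(true_offset: float) -> float:
    """Simulated ultrasound measurement: true offset plus sensor noise."""
    return true_offset + random.gauss(0.0, 0.01)

def boom_rate_command(offset: float) -> float:
    """Proportional command on the measured offset, deadbanded and clipped."""
    if abs(offset) < DEADBAND_M:
        return 0.0
    return max(-MAX_RATE, min(MAX_RATE, KP * offset))

def simulate_tracking(initial_offset: float = 0.5, dt: float = 0.05, steps: int = 100) -> float:
    """Drive the boom so the conveyor end point converges over the truck bed."""
    offset = initial_offset
    for _ in range(steps):
        rate = boom_rate_command(read_ultrasound_offset(offset))
        offset -= rate * dt  # boom motion reduces the offset
        # a real controller would also account for the truck's own motion here
    return offset

if __name__ == "__main__":
    print(f"final offset ~ {simulate_tracking():.3f} m")
```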

    Advanced Engineering Laboratory project summaries 1992

    The Advanced Engineering Laboratory of the Woods Hole Oceanographic Institution is a development laboratory within the Applied Ocean Physics and Engineering Department. Its function is the development of oceanographic instrumentation to test developing theories in oceanography and to enhance current research projects in other disciplines within the community. This report summarizes recent and ongoing projects performed by members of this laboratory.

    RRS Discovery Cruise DY017, 20 Oct - 05 Nov 2014. Outer Hebrides process cruise

    The continental shelf region immediately west of the UK and north of Ireland is thought to be a key region for the exchange of nutrients, carbon and water between the NW European continental shelf and the open North Atlantic Ocean, yet it remains comparatively under-sampled. Within the context of the NERC/DEFRA co-funded Shelf Sea Biogeochemistry programme, which aims to improve our understanding of the role of shelf seas in the global carbon cycle, this cruise undertook a regional-scale survey to determine the distribution and concentrations of dissolved inorganic carbon, inorganic nutrients, trace metals, and other ancillary data on the Malin and Hebridean Shelves. Of the seven planned transects, six were completed, with the seventh abandoned due to poor weather; nevertheless, a rich dataset of key biogeochemical parameters has been collected, which will enable work on the stoichiometry of dissolved nutrients and exchange with the open ocean to be undertaken.

    Stratus Ocean Reference Station (20˚S, 85˚W) : mooring recovery and deployment cruise, R/V Ronald H. Brown Cruise 06-07, October 9–October 27, 2006

    The Ocean Reference Station at 20°S, 85°W under the stratus clouds west of northern Chile is being maintained to provide ongoing, climate-quality records of surface meteorology, of air-sea fluxes of heat, freshwater, and momentum, and of upper ocean temperature, salinity, and velocity variability. The Stratus Ocean Reference Station (ORS Stratus) is supported by the National Oceanic and Atmospheric Administration's (NOAA) Climate Observation Program. It is recovered and redeployed annually, with cruises that have come between October and December. During the October 2006 cruise of NOAA's R/V Ronald H. Brown to the ORS Stratus site, the primary activities were recovery of the Stratus 6 WHOI surface mooring that had been deployed in October 2005, deployment of a new (Stratus 7) WHOI surface mooring at that site, in-situ calibration of the buoy meteorological sensors by comparison with instrumentation put on board by staff of the NOAA Earth System Research Laboratory (ESRL, formerly ETL), and observations of the stratus clouds and lower atmosphere by NOAA ESRL. A buoy for the Pacific tsunami warning system was also serviced in collaboration with the Hydrographic and Oceanographic Service of the Chilean Navy (SHOA). The old DART (Deep-Ocean Assessment and Reporting of Tsunami) buoy was recovered and a new one deployed, which carried IMET sensors and subsurface oceanographic instruments. Argo floats and drifters were also launched, and CTD casts were carried out during the cruise. The ORS Stratus buoys are equipped with two Improved Meteorological (IMET) systems, which provide surface wind speed and direction, air temperature, relative humidity, barometric pressure, incoming shortwave radiation, incoming longwave radiation, precipitation rate, and sea surface temperature. The IMET data are made available in near real time using satellite telemetry. The mooring line carries instruments to measure ocean salinity, temperature, and currents. The ESRL instrumentation used during the 2006 cruise included cloud radar, radiosonde balloons, and sensors for mean and turbulent surface meteorology. Stratus 7 also received a new addition to its set of sensors: a partial pressure of CO2 (pCO2) detector from the Pacific Marine Environmental Laboratory (PMEL). Aerosol measurements were also carried out onboard the RHB by personnel of the University of Hawaii. Finally, the cruise hosted a teacher participating in NOAA's Teacher at Sea Program. Funding was provided by the National Oceanic and Atmospheric Administration under Grant No. NA17RJ1223.
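
    The IMET variable set enumerated above maps naturally onto a simple record structure. The sketch below only illustrates that set; the field names and units are assumptions, not the actual IMET telemetry format.

```python
from dataclasses import dataclass

@dataclass
class ImetSample:
    """One near-real-time surface meteorology sample of the kind IMET reports.
    Field names and units are illustrative assumptions, not the IMET format."""
    time_utc: str
    wind_speed_m_s: float
    wind_direction_deg: float
    air_temperature_c: float
    relative_humidity_pct: float
    barometric_pressure_hpa: float
    shortwave_radiation_w_m2: float
    longwave_radiation_w_m2: float
    precipitation_rate_mm_hr: float
    sea_surface_temperature_c: float
```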

    Data security in European healthcare information systems

    This thesis considers the current requirements for data security in European healthcare systems and establishments. Information technology is being increasingly used in all areas of healthcare operation, from administration to direct care delivery, with a resulting dependence upon it by healthcare staff. Systems routinely store and communicate a wide variety of potentially sensitive data, much of which may also be critical to patient safety. There is consequently a significant requirement for protection in many cases. The thesis presents an assessment of healthcare security requirements at the European level, with a critical examination of how the issue has been addressed to date in operational systems. It is recognised that many systems were originally implemented without security needs being properly addressed, with the consequence that protection is often weak and inconsistent between establishments. The overall aim of the research has been to determine appropriate means by which security may be added or enhanced in these cases. The realisation of this objective has included the development of a common baseline standard for security in healthcare systems and environments. The underlying guidelines in this approach cover all of the principal protection issues, from physical and environmental measures to logical system access controls. Further to this, the work has encompassed the development of a new protection methodology by which establishments may determine their additional security requirements (by classifying aspects of their systems, environments and data). Both the guidelines and the methodology represent work submitted to the Commission of European Communities SEISMED (Secure Environment for Information Systems in MEDicine) project, with which the research programme was closely linked. The thesis also establishes that healthcare systems can present significant targets for both internal and external abuse, highlighting a requirement for improved logical controls. However, it is also shown that the issues of easy integration and convenience are of paramount importance if security is to be accepted and viable in practice. Unfortunately, many traditional methods do not offer these advantages, necessitating a different approach. To this end, the conceptual design for a new intrusion monitoring system was developed, combining the key aspects of authentication and auditing into an advanced framework for real-time user supervision. A principal feature of the approach is the use of behaviour profiles, against which user activities may be continuously compared to determine potential system intrusions and anomalous events. The effectiveness of real-time monitoring was evaluated in an experimental study of keystroke analysis - a behavioural biometric technique that allows an assessment of user identity from their typing style. This technique was found to have significant potential for discriminating between impostors and legitimate users and was subsequently incorporated into a fully functional security system, which demonstrated further aspects of the conceptual design and showed how transparent supervision could be realised in practice. The thesis also examines how the intrusion monitoring concept may be integrated into a wider security architecture, allowing more comprehensive protection within both the local healthcare establishment and between remote domains. Commission of European Communities SEISMED project.
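
    The behaviour-profile comparison described above can be sketched in a few lines: live keystroke timings (here, digraph latencies) are scored against a stored per-user profile, and sessions that deviate strongly are flagged. The features, scoring, and threshold below are assumptions for illustration, not the thesis's actual method.

```python
# Hypothetical sketch of behaviour-profile intrusion monitoring via keystroke
# analysis: observed digraph (key-pair) latencies are compared against a stored
# per-user profile of mean and standard-deviation latencies.
from statistics import mean
from typing import Dict, Tuple

Digraph = Tuple[str, str]
Profile = Dict[Digraph, Tuple[float, float]]  # digraph -> (mean ms, std dev ms)

def anomaly_score(profile: Profile, session: Dict[Digraph, float]) -> float:
    """Mean absolute z-score of observed digraph latencies against the profile."""
    scores = []
    for digraph, latency_ms in session.items():
        if digraph in profile:
            mu, sigma = profile[digraph]
            if sigma > 0:
                scores.append(abs(latency_ms - mu) / sigma)
    return mean(scores) if scores else 0.0

def is_anomalous(profile: Profile, session: Dict[Digraph, float],
                 threshold: float = 2.5) -> bool:
    """Flag the session as a potential intrusion if typing deviates strongly."""
    return anomaly_score(profile, session) > threshold

if __name__ == "__main__":
    profile = {("t", "h"): (95.0, 12.0), ("h", "e"): (80.0, 10.0)}
    legitimate = {("t", "h"): 100.0, ("h", "e"): 78.0}
    impostor = {("t", "h"): 180.0, ("h", "e"): 150.0}
    print(is_anomalous(profile, legitimate), is_anomalous(profile, impostor))
```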

    The NASA SBIR product catalog

    The purpose of this catalog is to assist small business firms in making the community aware of products emerging from their efforts in the Small Business Innovation Research (SBIR) program. It contains descriptions of some products that have advanced into Phase 3 and others that are identified as prospective products. Both lists of products in this catalog are based on information supplied by NASA SBIR contractors in responding to an invitation to be represented in this document. Generally, all products suggested by the small firms were included in order to meet the goals of information exchange for SBIR results. Of the 444 SBIR contractors NASA queried, 137 provided information on 219 products. The catalog presents the product information in the technology areas listed in the table of contents. Within each area, the products are listed in alphabetical order by product name and are given identifying numbers. Also included is an alphabetical listing of the companies that have products described; this listing cross-references the product list and provides information on the business activity of each firm. In addition, there are three indexes: one listing firms by state, one listing the products according to the NASA Center that managed the SBIR project, and one listing the products by the relevant Technical Topic in NASA's annual program solicitation under which each SBIR project was selected.

    Parallelisation of algorithms

    Most numerical software involves performing an extremely large volume of algebraic computations. This is both costly and time-consuming in terms of computer resources and, for large problems, supercomputer power is often required to obtain results in a reasonable amount of time. One method whereby both the cost and time can be reduced is to use the principle "Many hands make light work", or rather, allow several computers to operate simultaneously on the code, working towards a common goal, and hopefully obtaining the required results in a fraction of the time and cost normally used. This can be achieved by modifying the costly, time-consuming code, breaking it up into separate code segments which may be executed concurrently on different processors. This is termed parallelisation of code. This document describes communication between sequential processes, protocols, message routing and parallelisation of algorithms. In particular, it deals with these aspects with reference to the Transputer as developed by INMOS and includes two parallelisation examples, namely parallelisation of code to study airflow and of code to determine far-field patterns of antennas. This document also reports on the practical experiences of programming in parallel.
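
    The decomposition principle described above can be illustrated with a short sketch, using Python's multiprocessing module as a stand-in for a network of processors. This is only an analogy to the transputer-based approach the document deals with, not the document's own code.

```python
# A minimal sketch of parallelisation by decomposition: a large computation is
# split into independent segments handled concurrently by worker processes, and
# the host combines the partial results, much as work would be distributed
# across a processor network.
from multiprocessing import Pool

def process_segment(segment):
    """Stand-in for the costly numerical work done on one code/data segment."""
    return sum(x * x for x in segment)

def parallel_sum_of_squares(data, n_workers=4):
    # Split the data into roughly equal segments, one per worker.
    size = (len(data) + n_workers - 1) // n_workers
    segments = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(n_workers) as pool:
        partials = pool.map(process_segment, segments)  # concurrent execution
    return sum(partials)  # host combines the partial results

if __name__ == "__main__":
    print(parallel_sum_of_squares(list(range(1_000_000))))
```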