
    Converging Indicators for Assessing Individual Differences in Adaptation to Extreme Environments: Preliminary Report

    This paper describes the development and validation of a new methodology for assessing the deleterious effects of spaceflight on crew health and performance. It is well known that microgravity produces various physiological alterations, e.g., headward fluid shifts, which can impede physiological adaptation. Other factors that may affect crew operational efficiency include disruption of sleep-wake cycles, high workload, isolation, confinement, stress, and fatigue. From an operational perspective, it is difficult to predict which individuals will be most or least affected in this unique environment, given that most astronauts are first-time flyers. During future lunar and Mars missions, space crews will include men and women of multinational origins, different professional backgrounds, and varying states of physical condition. Therefore, new methods and technologies are needed to monitor and predict astronaut performance and health and to evaluate the effects of countermeasures on crews during long-duration missions. This paper reviews several studies conducted in both laboratory and operational environments with men and women ranging in age from 18 to 50 years. The studies included soldiers performing command and control functions during mobile operations in enclosed armored vehicles; subjects participating in laboratory tests of an anti-motion-sickness medication; subjects exposed to chronic hypergravity aboard a centrifuge; and subjects undergoing 36 hours of sleep deprivation. Physiological measurements, performance metrics, and subjective self-reports were collected in each study. The results demonstrate that multivariate converging indicators provide a significantly more reliable method for assessing environmental effects on performance and health than any single indicator.
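    The converging-indicator idea can be illustrated with a simple composite score. The sketch below is a hypothetical illustration only: the indicator names, z-score standardization, and equal weighting are assumptions made for the example, not the methodology validated in the paper.

        # Hypothetical sketch: fusing several standardized indicators into one composite.
        # Indicator names, sign conventions, and equal weighting are illustrative only.
        import numpy as np

        def composite_score(indicators):
            """indicators: dict mapping indicator name -> per-subject values."""
            standardized = []
            for name, values in indicators.items():
                v = np.asarray(values, dtype=float)
                standardized.append((v - v.mean()) / v.std(ddof=1))  # z-score each indicator
            return np.mean(standardized, axis=0)  # converge by averaging across indicators

        # Made-up data for five subjects (in practice, signs would first be aligned so
        # that higher always means "more affected")
        print(composite_score({
            "heart_rate_variability": [42, 55, 38, 60, 47],
            "reaction_time_ms":       [310, 280, 355, 265, 300],
            "fatigue_self_report":    [6, 3, 7, 2, 5],
        }))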

    Absolute Calibration and Characterization of the Multiband Imaging Photometer for Spitzer. II. 70 micron Imaging

    The absolute calibration and characterization of the Multiband Imaging Photometer for Spitzer (MIPS) 70 micron coarse- and fine-scale imaging modes are presented based on over 2.5 years of observations. Accurate photometry (especially for faint sources) requires two simple processing steps beyond the standard data reduction to remove long-term detector transients. Point spread function (PSF) fitting photometry is found to give more accurate flux densities than aperture photometry. Based on the PSF fitting photometry, the calibration factor shows no strong trend with flux density, background, spectral type, exposure time, or time since anneals. The coarse-scale calibration sample includes observations of stars with flux densities from 22 mJy to 17 Jy, on backgrounds from 4 to 26 MJy sr^-1, and with spectral types from B to M. The coarse-scale calibration is 702 +/- 35 MJy sr^-1 MIPS70^-1 (5% uncertainty) and is based on measurements of 66 stars. The instrumental units of the MIPS 70 micron coarse- and fine-scale imaging modes are called MIPS70 and MIPS70F, respectively. The photometric repeatability is calculated to be 4.5% from two stars measured during every MIPS campaign and includes variations on all time scales probed. The preliminary fine-scale calibration factor is 2894 +/- 294 MJy sr^-1 MIPS70F^-1 (10% uncertainty) based on 10 stars. The uncertainties in the coarse- and fine-scale calibration factors are dominated by the 4.5% photometric repeatability and the small sample size, respectively. The 5-sigma, 500 s sensitivity of the coarse-scale observations is 6-8 mJy. This work shows that the MIPS 70 micron array produces accurate, well-calibrated photometry and validates the MIPS 70 micron operating strategy, especially the use of frequent stimulator flashes to track the changing responsivities of the Ge:Ga detectors.
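    The calibration factor converts instrumental units into a physical surface brightness; the sketch below shows the arithmetic for turning a PSF-fitted source signal into a flux density. The 702 MJy sr^-1 MIPS70^-1 factor is taken from the abstract, but the pixel scale is an assumed illustrative value and the code is not the MIPS reduction pipeline.

        # Hedged sketch: applying the MIPS 70 micron coarse-scale calibration factor.
        # The calibration factor comes from the abstract; the pixel scale is an
        # illustrative assumption, not a value quoted in the paper.
        import math

        CAL_COARSE = 702.0          # MJy sr^-1 MIPS70^-1, from the abstract
        PIXEL_SCALE_ARCSEC = 9.85   # assumed coarse-scale pixel size (illustrative)

        ARCSEC_TO_RAD = math.pi / (180.0 * 3600.0)
        pixel_sr = (PIXEL_SCALE_ARCSEC * ARCSEC_TO_RAD) ** 2  # pixel solid angle, sr

        def source_flux_mjy(summed_signal_mips70):
            """Background-subtracted, PSF-fitted signal (MIPS70 units summed over
            the source) -> flux density in mJy."""
            surface_brightness = CAL_COARSE * summed_signal_mips70  # MJy sr^-1
            flux_jy = surface_brightness * 1e6 * pixel_sr           # 1 MJy = 1e6 Jy
            return flux_jy * 1e3                                    # mJy

        print(source_flux_mjy(10.0))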

    Using NEURON for Reaction-Diffusion Modeling of Extracellular Dynamics

    Development of credible, clinically relevant brain simulations has been slowed by a focus on electrophysiology in computational neuroscience, neglecting the multiscale whole-tissue modeling approach used for simulation in most other organ systems. We have now begun to extend the NEURON simulation platform in this direction by adding extracellular modeling. The extracellular medium of neural tissue is an active medium of neuromodulators, ions, inflammatory cells, oxygen, NO and other gases, with additional physiological, pharmacological and pathological agents. These extracellular agents influence, and are influenced by, cellular electrophysiology and cellular chemophysiology, the complex internal cellular milieu of second-messenger signaling and cascades. NEURON's extracellular reaction-diffusion is specified through an intuitive Python-based where/who/what command sequence, derived from that used for intracellular reaction-diffusion, and supports coarse-grained macroscopic extracellular models. This simulation specification separates the expression of the conceptual model and parameters from the underlying numerical methods. In the volume-averaging approach used, the macroscopic model of tissue is characterized by the free volume fraction (the proportion of space in which species are able to diffuse) and the tortuosity (the average increase in path length due to obstacles). These tissue characteristics can be defined within particular spatial regions, enabling the modeler to account for regional differences due either to intrinsic organization, particularly gray vs. white matter, or to pathology such as edema. We illustrate simulation development using spreading depression, a pathological phenomenon thought to play roles in migraine, epilepsy and stroke. Simulation results were verified against analytic results and against the extracellular portion of the simulation run under FiPy. The creation of this NEURON interface provides a pathway for interoperability that can be used to automatically export this class of models into complex intracellular/extracellular simulations and future cross-simulator standardization.
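    A minimal sketch of this specification style, using NEURON's Python rxd module, is shown below. The bounding box, voxel size, volume fraction, tortuosity, diffusion coefficient, and initial concentration are illustrative assumptions rather than values from the paper, and a real spreading-depression model would add cells and reaction terms.

        # Minimal sketch of a coarse-grained extracellular region in NEURON's rxd module.
        # All numbers here are illustrative assumptions, not values from the paper.
        from neuron import h, rxd
        h.load_file('stdrun.hoc')

        # "Where": a volume-averaged extracellular space with free volume fraction
        # alpha = 0.2 and tortuosity lambda = 1.6, on a 10 um voxel grid
        ecs = rxd.Extracellular(-100, -100, -100, 100, 100, 100,
                                dx=10, volume_fraction=0.2, tortuosity=1.6)

        # "Who": extracellular potassium diffusing through that space
        k = rxd.Species(ecs, name='k', charge=1, d=2.62, initial=3.5)

        # "What": reactions would be added here with rxd.Rate / rxd.Reaction

        h.finitialize(-65)
        h.continuerun(10)  # ms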

    Imaging Flash Lidar for Safe Landing on Solar System Bodies and Spacecraft Rendezvous and Docking

    NASA has been pursuing flash lidar technology for autonomous, safe landing on solar system bodies and for automated rendezvous and docking. During the final stages of landing, from about 1 kilometer to 500 meters above the ground, the flash lidar can generate 3-dimensional images of the terrain to identify hazardous features such as craters, rocks, and steep slopes. The onboard flight computer can then use the 3-D terrain map to guide the vehicle to a safe location. As an automated rendezvous and docking sensor, the flash lidar can provide relative range, velocity, and bearing from an approaching spacecraft to another spacecraft or a space station. NASA Langley Research Center has developed and demonstrated a flash lidar sensor system capable of generating 16,000-pixel range images with 7-centimeter precision at a 20 Hertz frame rate, from a maximum slant range of 1800 meters to the target area. This paper describes the lidar instrument and presents the results of recent flight tests onboard a rocket-propelled free-flyer vehicle (Morpheus) built by NASA Johnson Space Center. The flights were conducted at a simulated lunar terrain site, consisting of realistic hazard features and designated landing areas, built at NASA Kennedy Space Center specifically for this demonstration test. The paper also provides an overview of the plan for continued advancement of the flash lidar technology aimed at enhancing its performance to meet both landing and automated rendezvous and docking applications.
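    Each pixel of a flash lidar range image comes from a time-of-flight measurement; the sketch below shows the generic range conversion and the timing resolution implied by the quoted 7-centimeter precision. This is plain lidar arithmetic, not a description of the Langley instrument's electronics.

        # Hedged sketch: the time-of-flight relation behind a flash lidar range image.
        # The 7 cm precision and 1800 m slant range come from the abstract; the
        # conversion itself is generic lidar arithmetic.
        C = 299_792_458.0  # speed of light, m/s

        def range_from_time_of_flight(dt_seconds):
            """Round-trip pulse travel time -> one-way range in meters."""
            return C * dt_seconds / 2.0

        # Round-trip time for a target at the 1800 m maximum slant range
        dt_1800 = 2 * 1800.0 / C
        print(f"round-trip time at 1800 m: {dt_1800 * 1e6:.2f} microseconds")

        # Timing resolution implied by 7 cm range precision
        dt_precision = 2 * 0.07 / C
        print(f"timing resolution for 7 cm precision: {dt_precision * 1e9:.2f} ns")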

    The sweet spot in sustainability: a framework for corporate assessment in sugar manufacturing

    The assessment of corporate sustainability has become an increasingly important topic, both within academia and in industry. For manufacturing companies to meet their commitments to sustainable development, a standard and reliable measurement framework is required. There is, however, a lack of sector-specific and empirical research in many areas, including the sugar industry. This paper presents an empirically developed framework for the assessment of corporate sustainability within the Thai sugar industry. Multiple case studies were conducted, and a questionnaire survey was also employed to strengthen the generalisability of the findings. The developed framework is an accurate and reliable measurement instrument for corporate sustainability, and guidelines for assessing qualitative criteria are put forward. The proposed framework can be used for a company’s self-assessment and for guiding practitioners in performance improvement and policy decision-making.
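    A multi-criteria assessment framework of this kind is typically operationalised as a weighted aggregation of criterion scores. The sketch below is a generic, hypothetical illustration of that idea only; the criteria, weights, and scores are invented, since the paper's actual framework is not detailed in the abstract.

        # Generic, hypothetical sketch of weighted multi-criteria scoring; the criteria,
        # weights, and scores below are made up and are not the paper's framework.
        criteria = {
            # criterion: (weight, score on a 1-5 scale)
            "water_use_efficiency":  (0.30, 4),
            "fair_labour_practices": (0.25, 3),
            "community_engagement":  (0.20, 5),
            "waste_to_energy_ratio": (0.25, 2),
        }

        total_weight = sum(w for w, _ in criteria.values())
        composite = sum(w * s for w, s in criteria.values()) / total_weight
        print(f"composite sustainability score: {composite:.2f} / 5")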

    Global reorganization of deep-sea circulation and carbon storage after the last ice age

    Funding information: This work was supported by grants from the National Science Foundation (OCE-2015647 and OCE-2032340 to PAR; OCE-2032343 to MPH), NERC grant NE/N011716/1 to JWBR, and NERC grant NE/M004619/1 to AB. Using new and published marine fossil radiocarbon (14C/C) measurements, a tracer uniquely sensitive to circulation and air-sea gas exchange, we establish several benchmarks for Atlantic, Southern, and Pacific deep-sea circulation and ventilation since the last ice age. We find the most 14C-depleted water at glacial Pacific bottom depths, rather than at mid-depths as it is today, which is best explained by a slowdown in glacial deep-sea overturning in addition to a “flipped” glacial Pacific overturning configuration. These observations cannot be produced by changes in air-sea gas exchange alone, and they underscore the major role of changes in the overturning circulation in glacial deep-sea carbon storage in the vast Pacific abyss and the concomitant drawdown of atmospheric CO2.
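    The 14C/C tracer is conventionally expressed as a radiocarbon age, and deep-sea ventilation benchmarks are commonly framed as benthic-minus-atmosphere (B-Atm) age offsets. The sketch below shows only the standard conversion from a normalized 14C/C ratio to a conventional radiocarbon age; the example values are invented and are not data from this study.

        # Hedged illustration: converting a normalized 14C/C ratio (fraction modern)
        # into a conventional radiocarbon age and a simple ventilation-age offset.
        # The Libby mean life (8033 yr) is the standard convention; the numbers are
        # made up, not measurements from the paper.
        import math

        LIBBY_MEAN_LIFE = 8033.0  # years, conventional radiocarbon age scale

        def radiocarbon_age(fraction_modern):
            """Conventional 14C age from the normalized 14C/C ratio."""
            return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

        # A water mass whose 14C/C is 85% of the contemporary atmosphere's value
        # appears roughly 1300 years "old" relative to that atmosphere
        benthic_age = radiocarbon_age(0.85)
        atmosphere_age = radiocarbon_age(1.00)
        print(f"ventilation (B-Atm) age: {benthic_age - atmosphere_age:.0f} yr")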

    NetPyNE, a tool for data-driven multiscale modeling of brain circuits

    Biophysical modeling of neuronal networks helps to integrate and interpret rapidly growing and disparate experimental datasets at multiple scales. The NetPyNE tool (www.netpyne.org) provides both programmatic and graphical interfaces for developing data-driven multiscale network models in NEURON. NetPyNE clearly separates model parameters from implementation code. Users provide high-level specifications in a standardized declarative language, for example connectivity rules that create millions of cell-to-cell connections. NetPyNE then enables users to generate the NEURON network, run efficiently parallelized simulations, optimize and explore network parameters through automated batch runs, and use built-in functions for visualization and analysis: connectivity matrices, voltage traces, spike raster plots, local field potentials, and information-theoretic measures. NetPyNE also facilitates model sharing by exporting and importing standardized formats (NeuroML and SONATA). NetPyNE is already being used to teach computational neuroscience and by modelers to investigate brain regions and phenomena.
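    A minimal example of the declarative style is sketched below; the cell model, population sizes, synaptic parameters, and connectivity numbers are illustrative assumptions rather than a model from the paper, and the exact dictionary layout may vary slightly between NetPyNE versions.

        # Hedged sketch of NetPyNE's declarative specification style (www.netpyne.org).
        # All parameter values are illustrative; requires NEURON with NetPyNE installed.
        from netpyne import specs, sim

        netParams = specs.NetParams()

        # Cell type: a single-compartment Hodgkin-Huxley cell
        netParams.cellParams['PYR'] = {
            'secs': {'soma': {'geom': {'diam': 18.8, 'L': 18.8},
                              'mechs': {'hh': {}}}}}

        # Populations
        netParams.popParams['E'] = {'cellType': 'PYR', 'numCells': 40}
        netParams.popParams['I'] = {'cellType': 'PYR', 'numCells': 10}

        # Synaptic mechanism and a probabilistic connectivity rule (E -> I)
        netParams.synMechParams['exc'] = {'mod': 'Exp2Syn', 'tau1': 0.1, 'tau2': 5.0, 'e': 0}
        netParams.connParams['E->I'] = {
            'preConds': {'pop': 'E'}, 'postConds': {'pop': 'I'},
            'probability': 0.1, 'weight': 0.005, 'delay': 5, 'synMech': 'exc'}

        # Simulation configuration and a single create/simulate/analyze call
        simConfig = specs.SimConfig()
        simConfig.duration = 500  # ms
        simConfig.analysis['plotRaster'] = True

        sim.createSimulateAnalyze(netParams=netParams, simConfig=simConfig)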