    PyNN: A Common Interface for Neuronal Network Simulators

    Computational neuroscience has produced a diversity of software for simulations of networks of spiking neurons, with both negative and positive consequences. On the one hand, each simulator uses its own programming or configuration language, leading to considerable difficulty in porting models from one simulator to another. This impedes communication between investigators and makes it harder to reproduce and build on the work of others. On the other hand, simulation results can be cross-checked between different simulators, giving greater confidence in their correctness, and each simulator has different optimizations, so the most appropriate simulator can be chosen for a given modelling task. A common programming interface to multiple simulators would reduce or eliminate the problems of simulator diversity while retaining the benefits. PyNN is such an interface, making it possible to write a simulation script once, using the Python programming language, and run it without modification on any supported simulator (currently NEURON, NEST, PCSIM, Brian and the Heidelberg VLSI neuromorphic hardware). PyNN increases the productivity of neuronal network modelling by providing high-level abstraction, by promoting code sharing and reuse, and by providing a foundation for simulator-agnostic analysis, visualization and data-management tools. PyNN increases the reliability of modelling studies by making it much easier to check results on multiple simulators. PyNN is open-source software and is available from http://neuralensemble.org/PyNN
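
    To illustrate the simulator-agnostic style described above, here is a minimal sketch assuming PyNN's modern API; the cell parameters and connectivity are illustrative assumptions, not taken from the paper. The same script can in principle run on a different supported simulator by changing only the import.

        # Minimal PyNN sketch; parameters are illustrative assumptions.
        # Swapping "pyNN.nest" for "pyNN.neuron" (or another supported
        # backend) is the only change needed to rerun the same model.
        import pyNN.nest as sim

        sim.setup(timestep=0.1)  # integration step in ms

        # Two populations of integrate-and-fire neurons with
        # conductance-based exponential synapses.
        excitatory = sim.Population(80, sim.IF_cond_exp(tau_m=20.0))
        inhibitory = sim.Population(20, sim.IF_cond_exp(tau_m=20.0))

        # Sparse random connectivity; weight in uS, delay in ms.
        sim.Projection(excitatory, inhibitory,
                       sim.FixedProbabilityConnector(p_connect=0.1),
                       sim.StaticSynapse(weight=0.01, delay=1.0))

        excitatory.record("spikes")
        sim.run(1000.0)  # one second of biological time

        spiketrains = excitatory.get_data().segments[0].spiketrains
        sim.end()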

    NASA Sea Ice Validation Program for the Defense Meteorological Satellite Program Special Sensor Microwave Imager

    The history of the program is described along with the SSM/I sensor, including its calibration and geolocation correction procedures used by NASA, SSM/I data flow, and the NASA program to distribute polar gridded SSM/I radiances and sea ice concentrations (SIC) on CD-ROMs. Following a discussion of the NASA algorithm used to convert SSM/I radiances to SICs, results of 95 SSM/I-MSS Landsat IC comparisons for regions in both the Arctic and the Antarctic are presented. The Landsat comparisons show that the overall algorithm accuracy under winter conditions is 7% on average, with a 4% negative bias. Next, high-resolution active and passive microwave image mosaics from coordinated NASA and Navy aircraft underflights over regions of the Beaufort and Chukchi seas in March 1988 were used to show that the algorithm's multiyear IC accuracy is 11% on average, with a positive bias of 12%. Ice edge crossings of the Bering Sea by the NASA DC-8 aircraft were used to show that the SSM/I 15% ice concentration contour corresponds best to the location of the initial bands at the ice edge. Finally, a summary of results and recommendations for improving the SIC retrievals from spaceborne radiometers are provided.
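
    As a rough illustration of how such radiance-to-concentration algorithms work, the sketch below computes the polarization and spectral gradient ratios on which the NASA Team algorithm is based; the brightness-temperature inputs are made-up placeholders, and the hemisphere-specific tie-point coefficients that turn these ratios into concentrations are deliberately omitted.

        # Sketch of the NASA Team ratios used to retrieve sea ice
        # concentration from SSM/I brightness temperatures (in kelvin).
        # Input values below are placeholders, not real measurements.

        def polarization_ratio(tb19v: float, tb19h: float) -> float:
            """PR(19): polarization contrast at 19 GHz, high over open water."""
            return (tb19v - tb19h) / (tb19v + tb19h)

        def gradient_ratio(tb37v: float, tb19v: float) -> float:
            """GR(37/19): spectral gradient separating first-year from multiyear ice."""
            return (tb37v - tb19v) / (tb37v + tb19v)

        pr = polarization_ratio(tb19v=250.0, tb19h=230.0)
        gr = gradient_ratio(tb37v=245.0, tb19v=250.0)

        # First-year and multiyear concentrations are then ratios of
        # polynomials in PR and GR with tie-point-derived coefficients;
        # total sea ice concentration is their sum.
        print(f"PR = {pr:.3f}  GR = {gr:.3f}")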

    Assumptions behind grammatical approaches to code-switching: when the blueprint is a red herring

    Many of the so-called ‘grammars’ of code-switching (CS) are based on various underlying assumptions, e.g. that informal speech can be adequately or appropriately described in terms of ‘grammar’; that deep, rather than surface, structures are involved in code-switching; that one ‘language’ is the ‘base’ or ‘matrix’; and that constraints derived from existing data are universal and predictive. We question these assumptions on several grounds. First, ‘grammar’ is arguably distinct from the processes driving speech production. Second, the role of grammar is mediated by the variable, poly-idiolectal repertoires of bilingual speakers. Third, in many instances of CS the notion of a ‘base’ system is either irrelevant, or fails to explain the facts. Fourth, sociolinguistic factors frequently override ‘grammatical’ factors, as evidence from the same language pairs in different settings has shown. No principles proposed to date account for all the facts, and it seems unlikely that ‘grammar’, as conventionally conceived, can provide definitive answers. We conclude that rather than seeking universal, predictive grammatical rules, research on CS should focus on the variability of bilingual grammars.

    Mapping Planetary Volcanic Deposits: Identifying Vents and Distinguishing between Effects of Eruption Conditions and Local Lava Storage and Release on Flow Field Morphology

    Terrestrial geologic mapping techniques are regularly used for "photogeologic" mapping of other planets, but these approaches are complicated by the diverse type, areal coverage, and spatial resolution of available data sets. When available, spatially-limited in-situ human and/or robotic surface observations can sometimes introduce a level of detail that is difficult to integrate with regional or global interpretations. To assess best practices for utilizing observations acquired from orbit and on the surface, our team conducted a comparative study of geologic mapping and interpretation techniques. We compared maps generated for the same area in the San Francisco Volcanic Field (SFVF) in northern Arizona using 1) data collected for reconnaissance before and during the 2010 Desert Research And Technology Studies campaign, and 2) data from a traditional, terrestrial field geology study. The operations, related results, and direct mapping comparisons are discussed in companion LPSC abstracts [1-3]. Here we present new geologic interpretations for a volcanic cone and related lava flows as derived from all approaches involved in this study. Mapping results indicate a need for caution when interpreting past eruption conditions on other planetary surfaces from orbital data alone.

    Comparing and Reconciling Traditional Field and Photogeologic Mapping Techniques: Lessons from the San Francisco Volcanic Field, Arizona

    Cartographic products, and specifically geologic maps, provide critical assistance for establishing physical and temporal frameworks of planetary surfaces. The technical methods that result in the creation of geologic maps vary depending on how observations are made as well as the overall intent of the final products [1-3]. These methods tend to follow a common linear workflow, including the identification and delineation of spatially and temporally discrete materials (units), the documentation of their primary (emplacement) and secondary (erosional) characteristics, analysis of the relative and absolute age relationships between these materials, and the collation of observations and interpretations into an objective map product. The "objectivity" of a map is critical for cross-comparison with overlapping maps and topical studies, as well as for its relevance to scientific posterity. However, the "accuracy" and "correctness" of a geologic map are very much subject to debate. This is evidenced by comparison of existing geologic maps at various scales, particularly those compiled through field- and remote-based mapping efforts. Our study focuses on comparing the fidelity of (1) "Apollo-style" geologic investigations, where typically non-geologist crew members follow static traverse routes established through pre-mission planning, and (2) "traditional" field-based investigations, where geologists are given free rein to observe without preplanned routes. This abstract summarizes the regional geology wherein our study was conducted, presents the geologic map created from traditional field mapping techniques, and offers basic insights into how geologic maps created from different tactics can be reconciled in support of exploratory missions. Additional abstracts [4-6] from this study discuss various exploration and science results of these efforts.

    FIB and MIP: understanding nanoscale porosity in molecularly imprinted polymers via 3D FIB/SEM tomography

    We present combined focused ion beam/scanning electron beam (FIB/SEM) tomography as an innovative method for differentiating and visualizing the distribution and connectivity of pores within molecularly imprinted polymers (MIPs) and non-imprinted control polymers (NIPs). FIB/SEM tomography is used in cell biology for elucidating three-dimensional structures such as organelles, but has not yet been extensively applied for visualizing the heterogeneity, interconnectivity, and tortuosity of nanoscopic pore networks in polymers. To the best of our knowledge, the present study is the first application of this strategy for analyzing the nanoscale porosity of MIPs. MIPs imprinted for propranolol, and the corresponding NIPs, were investigated, establishing FIB/SEM tomography as a viable future strategy complementing conventional isotherm studies. For visualizing and understanding the properties of the pore networks in detail, polymer particles were stained with osmium tetroxide (OsO4) vapor and embedded in epoxy resin. Staining with OsO4 provides excellent contrast during high-resolution SEM imaging. After optimizing the threshold to discriminate between the stained polymer matrix and the pores filled with epoxy resin, a 3D model of the sampled volume may be established for deriving not only the pore volume and pore surface area, but also for visualizing the interconnectivity and tortuosity of the pores within the sampled polymer volume. The effects of different types of cross-linkers, and of hydrolysis, on the resulting polymer properties were also investigated in detail. Comparing MIP and NIP, it could be unambiguously shown that the interconnectivity of the visualized pores in the MIP is significantly higher than in the non-imprinted polymer, and that the pore volume and pore area are 34% and approximately 35% higher, respectively, within the MIP matrix. This confirms that the templating process not only induces selective binding sites, but indeed also affects the physical properties of such polymers down to the nanoscale, and that additional chemical modification, e.g. via hydrolysis, clearly affects the nature of the polymer.
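
    The threshold-and-reconstruct step described above maps naturally onto standard image-analysis tools. The following sketch (assuming a hypothetical segmented image stack in "fib_sem_stack.npy" and an illustrative grey-level threshold) shows how a binarized FIB/SEM volume yields a pore volume fraction and a count of connected pore networks using NumPy and SciPy.

        # Sketch of pore-network analysis on a FIB/SEM image stack.
        # File name and threshold are hypothetical placeholders; the study
        # itself optimized the threshold against the OsO4-stained contrast.
        import numpy as np
        from scipy import ndimage

        # 3D stack of grey-level slices; axis 0 is the milling direction.
        stack = np.load("fib_sem_stack.npy")

        # Voxels darker than the threshold are taken as epoxy-filled pores.
        pores = stack < 90

        # Pore volume fraction of the sampled volume.
        porosity = pores.mean()

        # Face-connected pore voxels form one network; fewer, larger
        # networks indicate higher interconnectivity.
        labels, n_networks = ndimage.label(pores)
        sizes = np.bincount(labels.ravel())[1:]  # voxels per network

        print(f"porosity = {porosity:.3f}, networks = {n_networks}, "
              f"largest = {sizes.max()} voxels")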

    SpineCreator: a Graphical User Interface for the Creation of Layered Neural Models.

    There is a growing requirement in computational neuroscience for tools that permit collaborative model building, model sharing, and the combination of existing models into a larger system (multi-scale model integration), and that are able to simulate models using a variety of simulation engines and hardware platforms. Layered XML model specification formats solve many of these problems; however, they are difficult to write and visualise without tools. Here we describe a new graphical software tool, SpineCreator, which facilitates the creation and visualisation of layered models of point spiking neurons or rate-coded neurons without the need for programming. We demonstrate the tool through the reproduction and visualisation of published models and show simulation results using code generation interfaced directly into SpineCreator. As a unique application for the graphical creation of neural networks, SpineCreator represents an important step forward for neuronal modelling.

    A Comprehensive Workflow for General-Purpose Neural Modeling with Highly Configurable Neuromorphic Hardware Systems

    In this paper we present a methodological framework that meets novel requirements emerging from upcoming types of accelerated and highly configurable neuromorphic hardware systems. We describe in detail a device with 45 million programmable and dynamic synapses that is currently under development, and we sketch the conceptual challenges that arise from taking this platform into operation. More specifically, we aim to establish this neuromorphic system as a flexible and neuroscientifically valuable modeling tool that can be used by non-hardware-experts. We consider various functional aspects to be crucial for this purpose, and we introduce a consistent workflow with detailed descriptions of all involved modules that implement the suggested steps: the integration of the hardware interface into the simulator-independent model description language PyNN; a fully automated translation between the PyNN domain and appropriate hardware configurations; an executable specification of the future neuromorphic system that can be seamlessly integrated into this biology-to-hardware mapping process as a test bench for all software layers and possible hardware design modifications; and an evaluation scheme that deploys models from a dedicated benchmark library, compares the results generated by virtual or prototype hardware devices with reference software simulations, and analyzes the differences. The integration of these components into one hardware-software workflow provides an ecosystem for ongoing preparative studies that support the hardware design process and represents the basis for the maturity of the model-to-hardware mapping software. The functionality and flexibility of the latter are demonstrated with a variety of experimental results.
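
    The evaluation scheme outlined above, in which the same PyNN model is executed both on the (virtual or prototype) hardware and in a reference software simulator before the outputs are compared, might look like the following sketch. The hardware backend module name is a placeholder, not the project's actual interface, and the model and discrepancy measure are illustrative assumptions.

        # One PyNN model, two backends, one crude discrepancy measure.
        # "pyNN.hardware_backend" is a hypothetical placeholder module.
        import numpy as np

        def run_benchmark(sim):
            """Build and run the same small model on the given backend."""
            sim.setup(timestep=0.1)
            noise = sim.Population(50, sim.SpikeSourcePoisson(rate=20.0))
            cells = sim.Population(50, sim.IF_cond_exp())
            sim.Projection(noise, cells, sim.OneToOneConnector(),
                           sim.StaticSynapse(weight=0.02, delay=1.0))
            cells.record("spikes")
            sim.run(500.0)
            counts = [len(st) for st in
                      cells.get_data().segments[0].spiketrains]
            sim.end()
            return np.array(counts)

        import pyNN.nest as reference_sim       # software reference
        import pyNN.hardware_backend as hw_sim  # placeholder hardware interface

        reference = run_benchmark(reference_sim)
        hardware = run_benchmark(hw_sim)

        # Mean absolute difference in per-neuron spike counts as a first,
        # deliberately simple measure of agreement.
        print("mean |delta spikes| =", np.abs(reference - hardware).mean())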

    The 50 Constellation Priority Sites

    The Constellation program (CxP) has developed a list of 50 sites of interest on the Moon which will be targeted by the LRO narrow angle camera. The list has also been provided to the M~ team to supplement their targeting list. This list does not represent a "site selection" process; rather, the goal was to find "representative" sites and terrains to understand the range of possible surface conditions for human lunar exploration, in order to aid engineering design and operational planning. The list compilers drew heavily on past site selection work (e.g. Geoscience and a Lunar Base Workshop, 1988; Site Selection Strategy for a Lunar Outpost, 1990; Exploration Systems Architecture Study (ESAS), 2005). Considerations included scientific, resource utilization, and operational merits, and a desire to span lunar terrain types. The targets have been organized into two "tiers" of 25 sites each to provide a relative priority ranking in the event of mutual interference. A LEAG SAT (special action team), chaired by Dr. Paul Lucey, was established to validate and recommend modifications to the list; it provided its final results to CxP in May. Dr. Wendell Mendell will organize an ongoing analysis of the data as they come down to ensure data quality and determine if and when a site has sufficient data to be retired from the list. The list was compiled using the best available data; however, it is understood that with the flood of new lunar data, minor modifications or adjustments may be required.