
    Overview of NASA/Marshall Space Flight Center's program on knowledge of atmospheric processes

    The Marshall Space Flight Center (MSFC) is charged with the responsibility of enhancing aviation safety by improving understanding of various atmospheric phenomena. A brief discussion is presented of the tasks and work being accomplished by MSFC. The tasks are defined as follows: (1) to determine and define the turbulence and steady wind environments induced by buildings, towers, hills, trees, etc.; (2) to identify, develop, and apply natural environment technology for the reconstruction and/or simulation of the natural environment for aircraft accident investigation and hazard identification; (3) to develop basic information about free atmosphere perturbations; and (4) to develop and apply fog modification mathematical models to assess candidate fog modification schemes, and to develop appropriate instrumentation to acquire basic data about fog. To accomplish these tasks, MSFC has developed a program involving field data acquisition, wind tunnel studies, theoretical studies, data analysis, and flight simulation studies.

    Accelerating Science: A Computing Research Agenda

    The emergence of "big data" offers unprecedented opportunities for not only accelerating scientific advances but also enabling new modes of discovery. Scientific progress in many disciplines is increasingly enabled by our ability to examine natural phenomena through the computational lens, i.e., using algorithmic or information processing abstractions of the underlying processes, and our ability to acquire, share, integrate, and analyze disparate types of data. However, there is a huge gap between our ability to acquire, store, and process data and our ability to make effective use of the data to advance discovery. Despite successful automation of routine aspects of data management and analytics, most elements of the scientific process currently require considerable human expertise and effort. Accelerating science to keep pace with the rate of data acquisition and data processing calls for the development of algorithmic or information processing abstractions, coupled with formal methods and tools for modeling and simulation of natural processes, as well as major innovations in cognitive tools for scientists, i.e., computational tools that leverage and extend the reach of human intellect and partner with humans on a broad range of tasks in scientific discovery (e.g., identifying, prioritizing, and formulating questions; designing, prioritizing, and executing experiments designed to answer a chosen question; drawing inferences and evaluating the results; and formulating new questions, in a closed-loop fashion). This calls for a concerted research agenda aimed at: development, analysis, integration, sharing, and simulation of algorithmic or information processing abstractions of natural processes, coupled with formal methods and tools for their analysis and simulation; and innovations in cognitive tools that augment and extend human intellect and partner with humans in all aspects of science.
    Comment: Computing Community Consortium (CCC) white paper, 17 pages

    Space odyssey: efficient exploration of scientific data.

    Advances in data acquisition---through more powerful supercomputers for simulation or sensors with better resolution---help scientists tremendously to understand natural phenomena. At the same time, however, these advances leave them with a plethora of data and the challenge of analysing it. Ingesting all the data into a database, or indexing it for efficient analysis, is unlikely to pay off because scientists rarely need to analyse all the data. Not knowing a priori what parts of the datasets need to be analysed makes the problem challenging, and tools and methods to analyse only subsets of this data are rather rare. In this paper we therefore present Space Odyssey, a novel approach enabling scientists to efficiently explore multiple spatial datasets of massive size. Without any prior information, Space Odyssey incrementally indexes the datasets and optimizes the access to datasets frequently queried together. As our experiments show, through incrementally indexing and changing the data layout on disk, Space Odyssey accelerates exploratory analysis of spatial data by substantially reducing query-to-insight time compared to the state of the art.
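    The query-driven incremental indexing idea described in the abstract can be illustrated with a toy sketch. This is not Space Odyssey's actual implementation; the class name, grid layout, and cracking-style strategy below are assumptions chosen to show the principle: raw data is organized into index structures only when, and where, a query first touches it, so repeated queries over the same region get progressively faster without paying an up-front full-indexing cost.

    ```python
    # Toy sketch of query-driven incremental spatial indexing (illustrative only;
    # IncrementalGridIndex is a hypothetical name, not from the Space Odyssey paper).
    from collections import defaultdict

    class IncrementalGridIndex:
        """Lazily moves raw points into grid cells as query regions touch them."""

        def __init__(self, points, cell_size=1.0):
            self.unindexed = list(points)    # raw data, not yet organized
            self.cells = defaultdict(list)   # grid index, built incrementally
            self.cell_size = cell_size

        def _cell(self, x, y):
            return (int(x // self.cell_size), int(y // self.cell_size))

        def query(self, xmin, ymin, xmax, ymax):
            # First pass: index only the raw points that fall inside the
            # queried region; everything else stays unorganized.
            still_raw = []
            for (x, y) in self.unindexed:
                if xmin <= x <= xmax and ymin <= y <= ymax:
                    self.cells[self._cell(x, y)].append((x, y))
                else:
                    still_raw.append((x, y))
            self.unindexed = still_raw
            # Second pass: answer the query from the indexed cells.
            hits = []
            for pts in self.cells.values():
                for (x, y) in pts:
                    if xmin <= x <= xmax and ymin <= y <= ymax:
                        hits.append((x, y))
            return hits

    idx = IncrementalGridIndex([(0.5, 0.5), (2.5, 2.5), (5.0, 5.0)])
    first = idx.query(0, 0, 3, 3)   # indexes two points, leaves one raw
    second = idx.query(0, 0, 3, 3)  # served from the partial index
    ```

    A production system would additionally reorganize the on-disk layout so that datasets queried together are co-located, which is where the abstract's reported query-to-insight gains come from.
    
    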

    A Simulation Model Articulation of the REA Ontology

    This paper demonstrates how the REA enterprise ontology can be used to construct simulation models for business processes, value chains, and collaboration spaces in supply chains. These models support various high-level and operational management simulation applications, e.g. the analysis of enterprise sustainability and day-to-day planning. First, the basic constructs of the REA ontology and the ExSpect modelling language for simulation are introduced. Second, collaboration space, value chain, and business process models and their conceptual dependencies are shown, using the ExSpect language. Third, an exhibit demonstrates the use of value chain models in predicting the financial performance of an enterprise.

    The Mechanics of Embodiment: A Dialogue on Embodiment and Computational Modeling

    Embodied theories are increasingly challenging traditional views of cognition by arguing that the conceptual representations constituting our knowledge are grounded in sensory and motor experiences, and processed at this sensorimotor level, rather than being represented and processed abstractly in an amodal conceptual system. Given the established empirical foundation, and the relatively underspecified theories to date, many researchers are extremely interested in embodied cognition but are clamouring for more mechanistic implementations. What is needed at this stage is a push toward explicit computational models that implement sensory-motor grounding as intrinsic to cognitive processes. In this article, six authors from varying backgrounds and approaches address issues concerning the construction of embodied computational models, and illustrate what they view as the critical current and next steps toward mechanistic theories of embodiment. The first part has the form of a dialogue between two fictional characters: Ernest, the 'experimenter', and Mary, the 'computational modeller'. The dialogue consists of an interactive sequence of questions, requests for clarification, challenges, and (tentative) answers, and touches on the most important aspects of grounded theories that should inform computational modelling and, conversely, the impact that computational modelling could have on embodied theories. The second part of the article discusses the most important open challenges for embodied computational modelling.

    Post-test simulation of a PLOFA transient test in the CIRCE-HERO facility

    CIRCE is a lead–bismuth eutectic alloy (LBE) pool facility designed to simulate the primary system of a heavy liquid metal (HLM) cooled pool-type fast reactor. The experimental facility was fitted with a new test section, called HERO (Heavy liquid mEtal pRessurized water cOoled tubes), which consists of a steam generator composed of seven double-wall bayonet tubes (DWBT) with an active length of six meters. The experimental campaign aims to investigate HERO behavior, which is representative of the tubes that will compose the ALFRED SG. In the framework of the Horizon 2020 SESAME project, a transient test was selected for the realization of a validation benchmark. The test consists of a protected loss of flow accident (PLOFA) simulating the shutdown of the primary pumps, the reactor scram, and the activation of the DHR system. A RELAP5-3D© nodalization scheme was developed in the pre-test phase at DIAEE of "Sapienza" University of Rome, providing useful information to the experimentalists. The model consisted of a one-dimensional scheme of the primary flow path and the SG secondary side, and a multi-dimensional component simulating the large LBE pool. The analysis of the experimental data, provided by ENEA, suggested improving the thermal–hydraulic model with a more detailed nodalization scheme of the secondary loop, in order to reproduce the asymmetries observed in the operation of the DWBTs. The paper summarizes the post-test activity performed in the framework of the H2020 SESAME project as a contribution to the benchmark activity, highlighting a global agreement between simulations and experiment for all the monitored physical quantities of the primary circuit. Attention is then focused on the operation of the secondary system, where uncertainties related to the boundary conditions affect the computational results.

    Computational and Robotic Models of Early Language Development: A Review

    We review computational and robotics models of early language learning and development. We first explain why and how these models are used to better understand how children learn language. We argue that they provide concrete theories of language learning as a complex dynamic system, complementing traditional methods in psychology and linguistics. We review different modeling formalisms, grounded in techniques from machine learning and artificial intelligence such as Bayesian and neural network approaches. We then discuss their role in understanding several key mechanisms of language development: cross-situational statistical learning, embodiment, situated social interaction, intrinsically motivated learning, and cultural evolution. We conclude by discussing future challenges for research, including modeling of large-scale empirical data about language acquisition in real-world environments.
    Keywords: early language learning, computational and robotic models, machine learning, development, embodiment, social interaction, intrinsic motivation, self-organization, dynamical systems, complexity.
    Comment: to appear in International Handbook on Language Development, ed. J. Horst and J. von Koss Torkildsen, Routledge
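    Of the mechanisms listed, cross-situational statistical learning is simple enough to sketch concretely. The toy model below is an assumption for illustration, not one of the handbook's models: a learner tracks how often each word co-occurs with each candidate referent across individually ambiguous scenes, and the correct word-referent pairings emerge because they co-occur more consistently than spurious ones.

    ```python
    # Minimal sketch of cross-situational statistical learning (toy illustration;
    # the scene format and count-based learner are assumptions, not from the review).
    from collections import defaultdict

    def cross_situational(scenes):
        """Each scene pairs a set of heard words with a set of visible referents.

        Returns a lexicon mapping each word to its most frequent co-occurring
        referent; consistent pairings win out over accidental ones.
        """
        counts = defaultdict(lambda: defaultdict(int))
        for words, referents in scenes:
            for w in words:
                for r in referents:
                    counts[w][r] += 1  # every co-occurrence is ambiguous evidence
        return {w: max(refs, key=refs.get) for w, refs in counts.items()}

    # Four ambiguous scenes; no single scene disambiguates every word.
    scenes = [
        ({"ball", "dog"}, {"BALL", "DOG"}),
        ({"ball", "cat"}, {"BALL", "CAT"}),
        ({"dog"}, {"DOG"}),
        ({"cat"}, {"CAT"}),
    ]
    lexicon = cross_situational(scenes)
    ```

    More realistic learners in this literature replace the raw counts with probabilistic inference (e.g. Bayesian models), but the core signal being exploited is the same cross-situational co-occurrence statistic.
    
    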