    21st Century Simulation: Exploiting High Performance Computing and Data Analysis

    This paper identifies, defines, and analyzes the limitations imposed on Modeling and Simulation by outmoded paradigms in computer utilization and data analysis. The authors then discuss two emerging capabilities to overcome these limitations: High Performance Parallel Computing and Advanced Data Analysis. First, parallel computing, in supercomputers and Linux clusters, has proven effective by giving users a decisive advantage in computing power, characterized as a ten-year lead over the use of single-processor computers. Second, advanced data analysis techniques are both necessitated and enabled by this leap in computing power. JFCOM's JESPP project is one of the few simulation initiatives to effectively embrace these concepts. The challenges facing the defense analyst today have grown to include the need to consider operations among non-combatant populations, to focus on impacts to civilian infrastructure, to differentiate combatants from non-combatants, and to understand non-linear, asymmetric warfare. These requirements stretch both current computational techniques and data analysis methodologies. In this paper, the authors advance documented examples and potential solutions, and discuss paths to successful implementation based on their experience. Reviewed technologies include parallel computing, cluster computing, grid computing, data logging, operations research, database advances, data mining, evolutionary computing, genetic algorithms, and Monte Carlo sensitivity analyses. The modeling and simulation community has significant potential to provide more opportunities for training and analysis. Simulations must include increasingly sophisticated environments, better emulations of foes, and more realistic civilian populations. Overcoming the implementation challenges will produce dramatically better insights for trainees and analysts. High Performance Parallel Computing and Advanced Data Analysis promise increased understanding of future vulnerabilities, helping to avoid unneeded mission failures and unacceptable personnel losses. The authors set forth road maps for rapid prototyping and adoption of advanced capabilities, and discuss the beneficial impact of embracing these technologies as well as the risk mitigation required to ensure success.
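    Of the technologies reviewed in this abstract, Monte Carlo sensitivity analysis is the most self-contained to illustrate. The sketch below is a generic toy example, not the JESPP project's actual code: the model, its parameters ("speed", "armor", "sensor_range"), and their coefficients are all invented. It samples the inputs randomly, runs the model, and ranks the inputs by how strongly they correlate with the output.

    ```python
    import random
    import statistics

    def model(speed, armor, sensor_range):
        # Hypothetical engagement model: returns a notional "survivability" score.
        # The coefficients are made up so the inputs have clearly different weight.
        return 2.0 * speed + 0.5 * armor + 0.1 * sensor_range

    def monte_carlo_sensitivity(n_samples=10_000, seed=42):
        rng = random.Random(seed)
        samples = {"speed": [], "armor": [], "sensor_range": []}
        outputs = []
        for _ in range(n_samples):
            s, a, r = (rng.uniform(0, 10) for _ in range(3))
            samples["speed"].append(s)
            samples["armor"].append(a)
            samples["sensor_range"].append(r)
            outputs.append(model(s, a, r))

        def corr(xs, ys):
            # Pearson correlation between one input and the model output.
            mx, my = statistics.fmean(xs), statistics.fmean(ys)
            cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            vx = sum((x - mx) ** 2 for x in xs)
            vy = sum((y - my) ** 2 for y in ys)
            return cov / (vx * vy) ** 0.5

        return {name: corr(vals, outputs) for name, vals in samples.items()}

    sensitivities = monte_carlo_sensitivity()
    ```

    With identical input ranges, the correlation ranking mirrors the coefficient ranking, which is the point of the technique: the analyst learns which inputs the output is most sensitive to without any analytic derivative of the model.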

    eXamine: a Cytoscape app for exploring annotated modules in networks

    Background. Biological networks are of growing importance for the interpretation of high-throughput "omics" data. Statistical and combinatorial methods make it possible to obtain mechanistic insights through the extraction of smaller subnetwork modules, and further enrichment analyses provide set-based annotations of these modules. Results. We present eXamine, a set-oriented visual analysis approach for annotated modules that displays set membership as contours on top of a node-link layout. Our approach extends Self-Organizing Maps to simultaneously lay out nodes, links, and set contours. Conclusions. We implemented eXamine as a freely available Cytoscape app. Using eXamine, we study a module that is activated by the virally encoded G-protein coupled receptor US28 and formulate a novel hypothesis about its functioning.
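    The Self-Organizing Map that eXamine extends can be sketched in its classic form. This is a minimal toy SOM on random 2-D points, not the app's actual layout algorithm (which additionally handles links and set contours): each grid cell holds a prototype vector, and training repeatedly pulls the best-matching cell and its grid neighbors toward each data point, so nearby cells come to represent nearby points.

    ```python
    import random

    def train_som(data, grid_w=4, grid_h=4, epochs=50, seed=0):
        """Minimal 2-D self-organizing map: one prototype vector per grid cell."""
        rng = random.Random(seed)
        # Initialize prototypes randomly in the unit square (where the data lives).
        protos = {(i, j): [rng.random(), rng.random()]
                  for i in range(grid_w) for j in range(grid_h)}
        for epoch in range(epochs):
            lr = 0.5 * (1 - epoch / epochs)                # learning rate decays
            radius = max(1.0, 2.0 * (1 - epoch / epochs))  # neighborhood shrinks
            for x in data:
                # Best-matching unit: cell whose prototype is closest to x.
                bmu = min(protos, key=lambda c: sum((p - v) ** 2
                                                    for p, v in zip(protos[c], x)))
                for cell, p in protos.items():
                    # Distance on the grid, not in data space.
                    grid_dist = abs(cell[0] - bmu[0]) + abs(cell[1] - bmu[1])
                    if grid_dist <= radius:
                        influence = lr / (1 + grid_dist)
                        for k in range(len(p)):
                            p[k] += influence * (x[k] - p[k])
        return protos

    rng = random.Random(1)
    points = [[rng.random(), rng.random()] for _ in range(200)]
    som = train_som(points)
    ```

    The grid-neighborhood update is what makes the result usable for layout: cells that are adjacent on the grid converge to adjacent regions of the data, giving a smooth 2-D arrangement.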

    Evaluation of the game development process of a location-based mobile game

    There is growing interest among government bodies and NGOs in using (serious) video games in awareness campaigns. Until now, however, little was known about how to set up such a campaign so that it effectively caters to the needs of different stakeholders, including the target audience. Designing, developing, and translating a game for educational purposes while balancing fun and learning is a complex process. This paper therefore presents a methodological framework, based on a user-centered software design methodology, for involving stakeholders in the design and development of a game-based awareness campaign, and assesses its effectiveness in a concrete use case: the development of the location-based mobile game City Jam. The goal was to develop a game-based road safety campaign that confronts adolescents with road traffic situations in order to positively influence road safety attitudes and behavior. Mobile technologies offer new opportunities to embed digital game-based learning in different contexts. Given the nature of the road safety campaign, a location-based game format was chosen, aiming to facilitate learning through an extended three-way interaction (human, game, and context). Different user-centered design methods were deployed throughout the phases of the game development process. In phase one (opportunity identification), a literature review investigated fields relevant to the game's goal. In phase two (game concept development), expert interviews and focus groups were conducted with relevant stakeholders. In phase three (game concept design), co-design sessions and a focus group resulted in a game design document. In phase four (game development and testing), the beta version of City Jam was developed and refined through iterative field testing, resulting in the final game. 
The results obtained throughout the game development process made it possible to evaluate several major aspects. First, stakeholder involvement across the phases of the design process resulted in a game tailored to the preferences and needs of the target group. Second, the translation of the game concept into practice, including the proposed educational game elements, was evaluated against usability and playability principles as well as social and technological aspects. Finally, the benefits and challenges of user-centered design methods are discussed, including how budget constraints and the differing desired outcomes of stakeholders both challenge and enrich the process.

    Thinking through the implications of neural reuse for the additive factors method

    One method for uncovering the subprocesses of mental processes is the “Additive Factors Method” (AFM). The AFM uses reaction time data from factorial experiments to infer the presence of separate processing stages. This paper investigates the conceptual status of the AFM, arguing that one of its underlying assumptions is problematic in light of recent developments in cognitive neuroscience. The discussion begins by laying out the basic logic of the AFM, followed by an analysis of the challenge presented by neural reuse. Implications are then analysed and avenues of response considered.
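    The AFM's basic logic can be shown numerically. Under the method's serial-stage assumption, if two experimental factors each affect a different stage, their effects on mean reaction time are additive, so the interaction contrast in a 2x2 factorial design is zero. The stage names and durations below are invented for illustration; they are not from the paper.

    ```python
    import itertools

    # Hypothetical serial-stage model: total RT = encoding + decision + response.
    # Factor A (e.g. stimulus quality) slows only the encoding stage; factor B
    # (e.g. response complexity) slows only the response stage.
    def mean_rt(a_level, b_level):
        encoding = 100 + 50 * a_level   # ms; affected only by factor A
        decision = 150                  # ms; affected by neither factor
        response = 80 + 30 * b_level    # ms; affected only by factor B
        return encoding + decision + response

    # 2x2 factorial design: mean RT at every combination of factor levels.
    rts = {(a, b): mean_rt(a, b) for a, b in itertools.product([0, 1], repeat=2)}

    # Interaction contrast: zero means the two factor effects are purely
    # additive, which the AFM takes as evidence for separate processing stages.
    interaction = (rts[(1, 1)] - rts[(1, 0)]) - (rts[(0, 1)] - rts[(0, 0)])
    ```

    The paper's worry about neural reuse targets exactly this inference: if the same neural circuits serve multiple putative stages, additivity in reaction times no longer licenses the conclusion that the stages are separate.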

    Towards Exascale Scientific Metadata Management

    Advances in technology and computing hardware are enabling scientists from all areas of science to produce massive amounts of data using large-scale simulations or observational facilities. In this era of data deluge, effective coordination between the data production and analysis phases hinges on the availability of metadata that describe the scientific datasets. Existing workflow engines capture only a limited form of metadata, providing provenance information about the identity and lineage of the data; much of the data produced by simulations, experiments, and analyses still needs to be annotated manually, in an ad hoc manner, by domain scientists. Systematic and transparent acquisition of rich metadata is a crucial prerequisite for sustaining and accelerating the pace of scientific innovation, yet a ubiquitous, domain-agnostic metadata management infrastructure that can meet the demands of extreme-scale science is conspicuous by its absence. To address this gap in scientific data management research and practice, we present our vision for an integrated approach that (1) automatically captures and manipulates information-rich metadata while the data is being produced or analyzed and (2) stores metadata within each dataset to permeate metadata-oblivious processes and to allow metadata to be queried through established and standardized data access interfaces. We motivate the need for the proposed integrated approach using applications from plasma physics, climate modeling, and neuroscience, and then discuss research challenges and possible solutions.
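    The second half of this vision, metadata stored within each dataset and queried through the same access interface as the data, can be sketched with a plain JSON container. This is a generic stand-in, not the authors' actual format; the field names ("producer", "timestep", "units") and the simulation name are invented for illustration.

    ```python
    import json
    import os
    import tempfile

    def write_dataset(path, values, metadata):
        """Store data together with its metadata in one self-describing file,
        so downstream tools never see the data without its annotations."""
        with open(path, "w") as f:
            json.dump({"metadata": metadata, "data": values}, f)

    def query_metadata(path, key):
        """Query a metadata field through the same file interface used for the
        data itself; returns None for fields the dataset does not carry."""
        with open(path) as f:
            return json.load(f)["metadata"].get(key)

    path = os.path.join(tempfile.gettempdir(), "run_0001.json")
    write_dataset(
        path,
        values=[0.1, 0.4, 0.9],
        metadata={
            "producer": "plasma_sim",  # hypothetical simulation code name
            "timestep": 1024,
            "units": "keV",
        },
    )
    ```

    Real extreme-scale formats (e.g. HDF5 attributes) serve the same role; the design point is the same: because the metadata travels inside the dataset, even metadata-oblivious copy and archive steps preserve it.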