    A Rewriting-Logic-Based Technique for Modeling Thermal Systems

    This paper presents a rewriting-logic-based modeling and analysis technique for physical systems, with focus on thermal systems. The contributions of this paper can be summarized as follows: (i) providing a framework for modeling and executing physical systems, where both the physical components and their physical interactions are treated as first-class citizens; (ii) showing how heat transfer problems in thermal systems can be modeled in Real-Time Maude; (iii) giving the implementation in Real-Time Maude of a basic numerical technique for executing continuous behaviors in object-oriented hybrid systems; and (iv) illustrating these techniques with a set of incremental case studies using realistic physical parameters, with examples of simulation and model checking analyses. Comment: In Proceedings RTRTS 2010, arXiv:1009.398
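    The paper implements its numerical technique in Real-Time Maude; as a rough illustration of the underlying idea, a fixed-step forward-Euler integration of a continuous thermal behavior (here Newton's law of cooling, a standard heat-transfer model not taken from the paper) can be sketched in Python:

    ```python
    # Hedged sketch: a basic fixed-step (forward Euler) numerical technique for
    # executing a continuous thermal behavior, dT/dt = -k * (T - T_env).
    # The function name and parameters are illustrative, not from the paper.

    def simulate_cooling(t_initial, t_env, k, dt, steps):
        """Advance the object temperature by `steps` Euler steps of size `dt`."""
        temps = [t_initial]
        t = t_initial
        for _ in range(steps):
            t += dt * (-k * (t - t_env))  # discretize the continuous dynamics
            temps.append(t)
        return temps

    # Example: a 90 degC object in a 20 degC room, k = 0.1 per minute, 1-minute steps.
    trajectory = simulate_cooling(90.0, 20.0, 0.1, 1.0, 30)
    ```

    In Real-Time Maude the analogous step would be a timed rewrite rule advancing the system state by one sampling interval; the Euler update above is only the numerical core of such a rule.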

    FLIAT, an object-relational GIS tool for flood impact assessment in Flanders, Belgium

    Floods can cause damage to transportation and energy infrastructure, disrupt the delivery of services, and take a toll on public health, sometimes even causing significant loss of life. Although scientists widely stress the compelling need for resilience against extreme events under a changing climate, tools for dealing with expected hazards lag behind. Assessments must consider not only the socio-economic, ecological, and cultural impact of floods, but also the potential disruption of a society, with a view to priority adaptation guidelines, measures, and policy recommendations. The main shortcoming of current impact assessment tools is their raster approach, which cannot effectively handle multiple metadata of vital infrastructures, crucial buildings, and vulnerable land use (among other challenges). We have developed a powerful cross-platform flood impact assessment tool (FLIAT) that uses a vector approach linked to a relational database using open-source programming languages, and that can perform parallel computation. As a result, FLIAT can manage multiple detailed datasets with no loss of geometrical information. This paper describes the development of FLIAT and the performance of this tool.
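    FLIAT's actual geometry engine and database schema are not described in the abstract; the toy ray-casting test below only illustrates why a vector approach preserves exact geometry (does this building footprint's corner lie inside the flood polygon?) instead of committing to a raster cell size:

    ```python
    # Hedged sketch of a vector overlay primitive: ray-casting point-in-polygon.
    # Names and the sample flood polygon are illustrative, not from FLIAT.

    def point_in_polygon(x, y, polygon):
        """Return True if (x, y) lies inside `polygon`, a list of (x, y) vertices."""
        inside = False
        n = len(polygon)
        for i in range(n):
            x1, y1 = polygon[i]
            x2, y2 = polygon[(i + 1) % n]
            # Count edges that straddle the horizontal ray cast rightwards from (x, y).
            if (y1 > y) != (y2 > y):
                x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
                if x < x_cross:
                    inside = not inside
        return inside

    # A 10 x 10 flood extent as an exact polygon, not a grid of cells.
    flood_zone = [(0, 0), (10, 0), (10, 10), (0, 10)]
    ```

    A production tool would delegate such predicates to a spatial database or geometry library; the point here is only that the vector representation answers the question exactly, at any coordinate precision.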

    Focusing Attention on the Health Aspects of Foods Changes Value Signals in vmPFC and Improves Dietary Choice

    Attention is thought to play a key role in the computation of stimulus values at the time of choice, which suggests that attention manipulations could be used to improve decision-making in domains where self-control lapses are pervasive. We used an fMRI food choice task with non-dieting human subjects to investigate whether exogenous cues that direct attention to the healthiness of foods could improve dietary choices. Behaviorally, we found that subjects made healthier choices in the presence of health cues. In parallel, stimulus value signals in ventromedial prefrontal cortex were more responsive to the healthiness of foods in the presence of health cues, and this effect was modulated by activity in regions of dorsolateral prefrontal cortex. These findings suggest that the neural mechanisms used in successful self-control can be activated by exogenous attention cues, and provide insights into the processes through which behavioral therapies and public policies could facilitate self-control.

    BioWorkbench: A High-Performance Framework for Managing and Analyzing Bioinformatics Experiments

    Advances in sequencing techniques have led to exponential growth in biological data, demanding the development of large-scale bioinformatics experiments. Because these experiments are computation- and data-intensive, they require high-performance computing (HPC) techniques and can benefit from specialized technologies such as Scientific Workflow Management Systems (SWfMS) and databases. In this work, we present BioWorkbench, a framework for managing and analyzing bioinformatics experiments. This framework automatically collects provenance data, including both performance data from workflow execution and data from the scientific domain of the workflow application. Provenance data can be analyzed through a web application that abstracts a set of queries to the provenance database, simplifying access to provenance information. We evaluate BioWorkbench using three case studies: SwiftPhylo, a phylogenetic tree assembly workflow; SwiftGECKO, a comparative genomics workflow; and RASflow, a RASopathy analysis workflow. We analyze each workflow from both computational and scientific domain perspectives, by using queries to a provenance and annotation database. Some of these queries are available as a pre-built feature of the BioWorkbench web application. Through the provenance data, we show that the framework is scalable and achieves high performance, reducing the case studies' execution time by up to 98%. We also show how the application of machine learning techniques can enrich the analysis process.
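    BioWorkbench's provenance schema is not given in the abstract; the sketch below uses an invented table (`task_prov`) purely to illustrate the kind of performance query a provenance database answers and that a web application could abstract for the user:

    ```python
    # Hedged sketch: a per-workflow runtime summary over provenance records.
    # Table and column names are hypothetical, not BioWorkbench's actual schema.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE task_prov (workflow TEXT, task TEXT, runtime_s REAL)")
    conn.executemany(
        "INSERT INTO task_prov VALUES (?, ?, ?)",
        [("SwiftPhylo", "align", 120.0),
         ("SwiftPhylo", "tree", 300.0),
         ("SwiftGECKO", "compare", 45.0)],
    )

    # Total runtime per workflow: the sort of pre-built query a provenance
    # web application might expose without the user writing SQL.
    rows = conn.execute(
        "SELECT workflow, SUM(runtime_s) FROM task_prov "
        "GROUP BY workflow ORDER BY workflow"
    ).fetchall()
    ```

    Domain annotations (e.g., which reference genome a task used) would live in further tables joined against the same execution records, which is what lets one analysis answer both computational and scientific questions.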

    Information Theory’s failure in neuroscience: on the limitations of cybernetics

    In Cybernetics (1961 Edition), Professor Norbert Wiener noted that “The role of information and the technique of measuring and transmitting information constitute a whole discipline for the engineer, for the neuroscientist, for the psychologist, and for the sociologist”. Sociology aside, the neuroscientists and the psychologists inferred “information transmitted” using the discrete summations from Shannon Information Theory. The present author has since scrutinized the psychologists’ approach in depth, and found it wrong. The neuroscientists’ approach is highly related, but remains unexamined. Neuroscientists quantified “the ability of [physiological sensory] receptors (or other signal-processing elements) to transmit information about stimulus parameters”. Such parameters could vary along a single continuum (e.g., intensity), or along multiple dimensions that altogether provide a Gestalt – such as a face. Here, unprecedented scrutiny is given to how 23 neuroscience papers computed “information transmitted” in terms of stimulus parameters and the evoked neuronal spikes. The computations relied upon Shannon’s “confusion matrix”, which quantifies the fidelity of a “general communication system”. Shannon’s matrix is square, with the same labels for columns and for rows. Nonetheless, neuroscientists labelled the columns by “stimulus category” and the rows by “spike-count category”. The resulting “information transmitted” is spurious, unless the evoked spike-counts are worked backwards to infer the hypothetical evoking stimuli. The latter task is probabilistic and, regardless, requires that the confusion matrix be square. Was it? For these 23 significant papers, the answer is no.
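    The quantity at issue is Shannon mutual information computed from a matrix of joint counts. A minimal sketch (the function name is mine, not from any of the 23 papers) makes the paper's structural point concrete: the fidelity interpretation presumes rows and columns index the same categories, so the matrix is square with matched labels.

    ```python
    # Hedged sketch: "information transmitted" as Shannon mutual information over a
    # square confusion matrix of joint stimulus/response counts, I(X;Y) =
    # sum p(x,y) * log2( p(x,y) / (p(x) * p(y)) ).
    from math import log2

    def information_transmitted(counts):
        """Mutual information (bits) from a matrix of joint counts."""
        total = sum(sum(row) for row in counts)
        row_sums = [sum(row) for row in counts]            # marginal over stimuli
        col_sums = [sum(col) for col in zip(*counts)]      # marginal over responses
        mi = 0.0
        for i, row in enumerate(counts):
            for j, n in enumerate(row):
                if n > 0:
                    p_xy = n / total
                    mi += p_xy * log2(p_xy * total * total / (row_sums[i] * col_sums[j]))
        return mi

    # Perfect transmission over two equiprobable categories -> 1 bit.
    perfect = [[50, 0], [0, 50]]
    bits = information_transmitted(perfect)
    ```

    The arithmetic runs happily on a non-square table of stimulus rows against spike-count columns as well, which is exactly the trap the paper identifies: the formula does not enforce the interpretation.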

    Recognition and reconstruction of coherent energy with application to deep seismic reflection data

    Reflections in deep seismic reflection data tend to be visible on only a limited number of traces in a common midpoint gather. To prevent stack degeneration, any noncoherent reflection energy has to be removed. In this paper, a standard classification technique in remote sensing is presented to enhance data quality. It consists of a recognition technique to detect and extract coherent energy in both common shot gathers and final stacks. This technique uses the statistics of a picked seismic phase to obtain the likelihood distribution of its presence. Multiplication of this likelihood distribution with the original data results in a “cleaned up” section. Application of the technique to data from a deep seismic reflection experiment enhanced the visibility of all reflectors considerably. Because the recognition technique cannot produce an estimate of “missing” data, it is extended with a reconstruction method. Two methods are proposed: application of semblance weighted local slant stacks after recognition, and direct recognition in the linear tau-p domain. In both cases, the power of the stacking process to increase the signal-to-noise ratio is combined with the direct selection of only specific seismic phases. The joint application of recognition and reconstruction resulted in data images which showed reflectors more clearly than application of a single technique.
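    The recognition step can be caricatured in a few lines: fit a distribution to the statistics of the picked phase, evaluate the likelihood of each data sample under it, and multiply the data by that likelihood. The Gaussian amplitude model and all names below are simplifying assumptions of this sketch, not the paper's actual classifier:

    ```python
    # Hedged sketch: likelihood-weighted "cleaning" of a seismic trace, assuming a
    # Gaussian amplitude model fitted to samples from the picked phase.
    from math import exp

    def likelihood_weights(trace, picked_samples):
        """Per-sample weights in (0, 1] from a Gaussian fit to the picked phase."""
        n = len(picked_samples)
        mean = sum(picked_samples) / n
        var = sum((a - mean) ** 2 for a in picked_samples) / n
        # Unnormalized Gaussian likelihood of each trace sample: 1 at the mean,
        # decaying for amplitudes unlike the picked phase.
        return [exp(-((a - mean) ** 2) / (2 * var)) for a in trace]

    def clean_trace(trace, picked_samples):
        """Multiply the data by the likelihood, suppressing non-coherent energy."""
        w = likelihood_weights(trace, picked_samples)
        return [a * wi for a, wi in zip(trace, w)]

    picked = [0.9, 1.0, 1.1]                 # amplitudes sampled from the pick
    cleaned = clean_trace([1.0, 5.0], picked)
    ```

    Samples resembling the picked phase pass through nearly unchanged, while outlier amplitudes are driven toward zero; this is the "cleaned up" section before any reconstruction of missing data is attempted.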