
    Beyond XSPEC: Towards Highly Configurable Analysis

    We present a quantitative comparison between software features of the de facto standard X-ray spectral analysis tool, XSPEC, and ISIS, the Interactive Spectral Interpretation System. Our emphasis is on customized analysis, with ISIS offered as a strong example of configurable software. While noting that XSPEC has been of immense value to astronomers, and that its scientific core is moderately extensible--most commonly via the inclusion of user-contributed "local models"--we identify a series of limitations with its use beyond conventional spectral modeling. We argue that from the viewpoint of the astronomical user, the XSPEC internal structure presents a Black Box Problem, with many of its important features hidden from the top-level interface, thus discouraging user customization. Drawing from examples in custom modeling, numerical analysis, parallel computation, visualization, data management, and automated code generation, we show how a numerically scriptable, modular, and extensible analysis platform such as ISIS facilitates many forms of advanced astrophysical inquiry. (Accepted by PASP, July 2008.)
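
    The kind of user-level customization argued for above is easiest to picture with a short example. The sketch below is hypothetical Python, not ISIS's actual S-Lang interface: it fits a user-defined spectral model to synthetic counts data with standard open-source tools, and the model form, parameter names, and data are all assumptions made only for illustration.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical user-defined spectral model: an absorbed power law.
        # energy in keV; norm, gamma, and nh are the free parameters.
        def absorbed_powerlaw(energy, norm, gamma, nh):
            absorption = np.exp(-nh * energy**-3)   # crude toy absorption term
            return norm * energy**(-gamma) * absorption

        # Synthetic "observed" spectrum standing in for real instrument data.
        energy = np.linspace(0.5, 10.0, 200)
        truth = absorbed_powerlaw(energy, 1.0, 1.7, 0.3)
        rng = np.random.default_rng(0)
        observed = rng.poisson(truth * 1e4) / 1e4

        # Fit the custom model; on a scriptable platform the same model object
        # can be reused for plotting, error analysis, or batch fits.
        popt, pcov = curve_fit(absorbed_powerlaw, energy, observed, p0=[1.0, 1.5, 0.1])
        print("best-fit (norm, gamma, nh):", popt)

    The point is not this particular model but that the whole analysis, from model definition to fitting, is ordinary user code that can be scripted, parallelized, or embedded in larger workflows.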

    Stimfit: Quantifying electrophysiological data with Python

    Intracellular electrophysiological recordings provide crucial insights into elementary neuronal signals such as action potentials and synaptic currents. Analyzing and interpreting these signals is essential for a quantitative understanding of neuronal information processing, and requires both fast data visualization and ready access to complex analysis routines. To achieve this goal, we have developed Stimfit, a free software package for cellular neurophysiology with a Python scripting interface and a built-in Python shell. The program supports most standard file formats for cellular neurophysiology and other biomedical signals through the Biosig library. To quantify and interpret the activity of single neurons and communication between neurons, the program includes algorithms to characterize the kinetics of presynaptic action potentials and postsynaptic currents, estimate latencies between pre- and postsynaptic events, and detect spontaneously occurring events. We validate and benchmark these algorithms, give estimation errors, and provide sample use cases, showing that Stimfit represents an efficient, accessible and extensible way to accurately analyze and interpret neuronal signals.
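
    As a rough illustration of the event detection and kinetics measurements mentioned above, here is a small, self-contained Python sketch. It does not use Stimfit's actual scripting API; the threshold value, sampling interval, and synthetic trace are assumptions chosen only to show the shape of such an analysis.

        import numpy as np

        def detect_events(trace, threshold):
            """Return sample indices where the trace crosses the threshold upward."""
            above = trace > threshold
            return np.flatnonzero(~above[:-1] & above[1:]) + 1

        def rise_time_20_80(trace, dt):
            """20-80% rise time of a single event, in the same units as dt."""
            baseline, peak = trace.min(), trace.max()
            lo = baseline + 0.2 * (peak - baseline)
            hi = baseline + 0.8 * (peak - baseline)
            return (np.argmax(trace >= hi) - np.argmax(trace >= lo)) * dt

        # Synthetic postsynaptic-current-like event on a noisy baseline.
        dt = 0.05                                   # ms per sample (assumed)
        t = np.arange(0.0, 50.0, dt)
        event = np.where(t > 10, (1 - np.exp(-(t - 10) / 0.5)) * np.exp(-(t - 10) / 5), 0.0)
        trace = event + 0.02 * np.random.default_rng(1).standard_normal(t.size)

        onsets = detect_events(trace, threshold=0.2)
        print("detected onsets (ms):", onsets * dt)
        print("20-80% rise time (ms):", rise_time_20_80(trace, dt))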

    Towards Molecular Simulations that are Transparent, Reproducible, Usable By Others, and Extensible (TRUE)

    Systems composed of soft matter (e.g., liquids, polymers, foams, gels, colloids, and most biological materials) are ubiquitous in science and engineering, but molecular simulations of such systems pose particular computational challenges, requiring time- and/or ensemble-averaged data to be collected over long simulation trajectories for property evaluation. Performing a molecular simulation of a soft matter system involves multiple steps, which have traditionally been performed by researchers in a "bespoke" fashion, resulting in many published soft matter simulations not being reproducible based on the information provided in the publications. To address the issue of reproducibility and to provide tools for computational screening, we have been developing the open-source Molecular Simulation and Design Framework (MoSDeF) software suite. In this paper, we propose a set of principles to create Transparent, Reproducible, Usable by others, and Extensible (TRUE) molecular simulations. MoSDeF facilitates the publication and dissemination of TRUE simulations by automating many of the critical steps in molecular simulation, thus enhancing their reproducibility. We provide several examples of TRUE molecular simulations: all of the steps involved in creating, running, and extracting properties from the simulations are distributed on open-source platforms (within MoSDeF and on GitHub), thus meeting the definition of TRUE simulations.
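
    To make the idea of a fully scripted, reproducible workflow concrete, here is a minimal, hypothetical Python sketch. It does not use MoSDeF's actual API; the function names, parameters, and output file are assumptions. The point is that every step, from input parameters to recorded provenance, lives in version-controllable code rather than in manual, one-off actions.

        import hashlib
        import json
        import platform
        from datetime import datetime, timezone

        def build_system(params):
            """Stand-in for system construction (e.g., packing molecules into a box)."""
            return {"n_molecules": params["n_molecules"], "box_nm": params["box_nm"]}

        def run_simulation(system, params):
            """Stand-in for the engine call; returns a placeholder property estimate."""
            return {"density_kg_m3": 997.0}

        params = {"n_molecules": 500, "box_nm": 2.5, "temperature_K": 300}
        system = build_system(params)
        results = run_simulation(system, params)

        # Record exactly what was run so the output can be traced and repeated.
        record = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "python_version": platform.python_version(),
            "parameters": params,
            "parameter_hash": hashlib.sha256(
                json.dumps(params, sort_keys=True).encode()).hexdigest(),
            "results": results,
        }
        with open("run_record.json", "w") as fh:
            json.dump(record, fh, indent=2)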

    scenery: Flexible Virtual Reality Visualization on the Java VM

    Life science today involves computational analysis of a large amount and variety of data, such as volumetric data acquired by state-of-the-art microscopes, or mesh data from analysis of such data or simulations. Visualization is often the first step in making sense of data, and a crucial part of building and debugging analysis pipelines. It is therefore important that visualizations can be quickly prototyped, as well as developed or embedded into full applications. In order to better judge spatiotemporal relationships, immersive hardware, such as Virtual or Augmented Reality (VR/AR) headsets and associated controllers, are becoming invaluable tools. In this work we introduce scenery, a flexible VR/AR visualization framework for the Java VM that can handle mesh and large volumetric data, containing multiple views, timepoints, and color channels. scenery is free and open-source software, works on all major platforms, and uses the Vulkan or OpenGL rendering APIs. We introduce scenery's main features and example applications, such as its use in VR for microscopy, in the biomedical image analysis software Fiji, or for visualizing agent-based simulations. (Version published at IEEE VIS.)

    A wrapper generation tool for the creation of scriptable scientific applications

    In recent years, there has been considerable interest in the use of scripting languages as a mechanism for controlling and developing scientific software. Scripting languages allow scientific applications to be encapsulated in an interpreted environment similar to that found in commercial scientific packages such as MATLAB, Mathematica, and IDL. This improves the usability of scientific software by providing a powerful mechanism for specifying and controlling complex problems, as well as giving users an interactive and exploratory problem-solving environment. Scripting languages also provide a framework for building and integrating software components that allows tools to be used in a more efficient manner. This streamlines the problem-solving process and enables scientists to be more productive.
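
    The wrapping idea is easy to demonstrate even without any generator tooling. Below is a minimal, hypothetical Python sketch using ctypes to expose a compiled C routine to an interpreted session; the library name libsolver.so and the function signature are assumptions, and a real wrapper generator would emit this kind of glue code automatically.

        import ctypes

        # Load a (hypothetical) compiled scientific library.
        lib = ctypes.CDLL("./libsolver.so")

        # Declare the C signature: double integrate(double a, double b, int steps)
        lib.integrate.argtypes = [ctypes.c_double, ctypes.c_double, ctypes.c_int]
        lib.integrate.restype = ctypes.c_double

        def integrate(a, b, steps=1000):
            """Thin scripting-level wrapper around the compiled routine."""
            return lib.integrate(a, b, steps)

        # From here the compiled code can be driven interactively or from scripts.
        print(integrate(0.0, 3.14159, steps=10000))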

    MATLAB-Like Scripting of Java Scientific Libraries in ScalaLab


    Creating Complex Network Services with eBPF: Experience and Lessons Learned

    The extended Berkeley Packet Filter (eBPF) is a recent technology available in the Linux kernel that enables flexible data processing. However, so far eBPF has mainly been used for monitoring tasks such as memory, CPU, page faults, traffic, and more, with only a few examples of traditional network services, e.g., services that modify the data in transit. In fact, the creation of complex network functions that go beyond simple proof-of-concept data plane applications has proven challenging due to several limitations of this technology, but at the same time very promising thanks to characteristics (e.g., dynamic recompilation of the source code) that are not available elsewhere. Based on our experience, this paper presents the most promising characteristics of this technology and the main limitations we encountered, and we envision some solutions that can mitigate the latter. We also summarize the most important lessons learned while exploiting eBPF to create complex network functions and, finally, we provide a quantitative characterization of the most significant aspects of this technology.
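
    As a small, concrete taste of the programming model discussed above, here is a hedged sketch using the BCC toolkit's Python bindings to attach a trivial XDP program that counts packets on an interface. The interface name eth0 is an assumption, root privileges and a reasonably recent kernel are required, and a production network function of the kind the paper describes would be far more involved.

        import ctypes as ct
        import time
        from bcc import BPF

        # Minimal XDP program: count every packet on the interface, then let it pass.
        prog = r"""
        #include <uapi/linux/bpf.h>

        BPF_ARRAY(pkt_count, u64, 1);

        int xdp_count(struct xdp_md *ctx) {
            int key = 0;
            u64 *value = pkt_count.lookup(&key);
            if (value)
                __sync_fetch_and_add(value, 1);
            return XDP_PASS;
        }
        """

        device = "eth0"                      # assumed interface name
        b = BPF(text=prog)
        fn = b.load_func("xdp_count", BPF.XDP)
        b.attach_xdp(device, fn, 0)

        try:
            for _ in range(5):
                time.sleep(1)
                print("packets seen:", b["pkt_count"][ct.c_int(0)].value)
        finally:
            b.remove_xdp(device, 0)          # always detach the program on exit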

    Doctor of Philosophy

    Domain-specific languages (DSLs) are increasingly popular, and there are a variety of ways to create a DSL. A DSL designer might write an interpreter from scratch, compile the DSL to another language, express DSL concepts using only the existing forms of an existing language, or implement DSL constructs using a language's extension capabilities, including macros. While extensible languages can offer the easiest opportunity for creating a DSL that takes advantage of the language's existing infrastructure, existing tools for debugging fail to adequately adapt the debugging experience to a given domain. This dissertation addresses the problem of debugging DSLs defined with macros and describes an event-oriented approach that works well with a macro-expansion view of language implementation. It pairs the mapping of DSL terms to host terms with an event mapping that converts primitive events back into domain-specific concepts. Domain-specific events can be further inspected or manipulated to construct domain-specific debuggers. This dissertation presents a core model of evaluation and events, and also presents a language design, analogous to pattern-based notations for macros but working in the other direction, for describing how events in a DSL's expansion are mapped to events at the DSL's level. The domain-specific events can enable useful, domain-specific debuggers, and the dissertation introduces a design for a debugging framework to help with debugger construction. To validate this design, a debugging framework, Ripple, is implemented, and this dissertation demonstrates that, with a modest amount of work, Ripple can support building domain-specific debuggers.
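
    The central idea of mapping primitive evaluation events back to domain-level concepts can be sketched independently of any macro system. The hypothetical Python snippet below (the event names and the tiny state-machine DSL are invented for illustration; the dissertation's actual approach is defined in terms of macro expansion) shows low-level call/return events being translated into domain-specific "transition" events that a debugger could surface.

        # Low-level events emitted by the host-language implementation of a tiny
        # state-machine DSL (invented for illustration).
        host_events = [
            ("call", "dispatch", {"state": "idle", "input": "start"}),
            ("return", "dispatch", {"state": "running"}),
            ("call", "dispatch", {"state": "running", "input": "stop"}),
            ("return", "dispatch", {"state": "idle"}),
        ]

        def to_domain_events(events):
            """Map primitive call/return pairs onto domain-level 'transition' events."""
            pending = None
            for kind, fn, data in events:
                if kind == "call" and fn == "dispatch":
                    pending = data                   # remember source state and input
                elif kind == "return" and fn == "dispatch" and pending is not None:
                    yield ("transition", pending["state"], pending["input"], data["state"])
                    pending = None

        # A domain-specific debugger would present these instead of raw host frames.
        for event in to_domain_events(host_events):
            print(event)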

    Pharo's Vision: Goals, Processes, and Development Effort

    This document presents the goals, processes, architectural vision, and current and future development efforts in Pharo core. It will serve as a working document to structure the effort for the next versions of Pharo core. By Pharo core, we mean the essential parts of the system, such as the compiler, basic libraries, and key infrastructure such as the canvas and events. We hope that this document will also bring good energy to achieve some of the goals described in the roadmap. This document should be read as a proposal and an explanation of some of our efforts. By no means is what we describe here carved in stone. We are really interested in constructive criticism, comments, and suggestions. In addition, people who want to participate in the definition of this document are welcome. Finally, we are convinced that the best core is nothing if it has no cool libraries, so we encourage people to build libraries and we will support them. We hope that this document will shed good light on what we want to achieve and share with the community. We also hope that it will make clear what system we want and that we can build it together.