
    A method for data base management and analysis for wind tunnel data

    To respond to the need for improved data base management and analysis capabilities for wind-tunnel data at the Langley 16-Foot Transonic Tunnel, current methods of managing wind-tunnel data were surveyed and a new method was developed. This paper describes the development of that data base management and analysis method. The design and implementation of the software system are discussed, and examples of its use are shown.

    GATE: a simulation toolkit for PET and SPECT

    Monte Carlo simulation is an essential tool in emission tomography that can assist in the design of new medical imaging devices, the optimization of acquisition protocols, and the development or assessment of image reconstruction algorithms and correction techniques. GATE, the Geant4 Application for Tomographic Emission, encapsulates the Geant4 libraries to achieve a modular, versatile, scripted simulation toolkit adapted to the field of nuclear medicine. In particular, GATE allows the description of time-dependent phenomena such as source or detector movement and source decay kinetics. This feature makes it possible to simulate time curves under realistic acquisition conditions and to test dynamic reconstruction algorithms. A public release of GATE, licensed under the GNU Lesser General Public License, can be downloaded from http://www-lphe.epfl.ch/GATE/.
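    GATE itself is driven by macro scripts rather than an API, but the source decay kinetics it simulates can be illustrated with a minimal, self-contained Monte Carlo sketch. The isotope, activity and time framing below are illustrative assumptions, not GATE code:

        import numpy as np

        # Minimal Monte Carlo sketch of source decay kinetics during an
        # acquisition, in the spirit of GATE's time-dependent simulations.
        # Illustrative only: this is NOT the GATE API (GATE is macro-driven).

        rng = np.random.default_rng(seed=42)

        half_life_s = 6586.2        # F-18 half-life in seconds (~109.8 min)
        decay_const = np.log(2) / half_life_s
        n_decays = 1_000_000        # decays sampled for the experiment
        acq_duration_s = 1800.0     # 30-minute acquisition window
        frame_s = 60.0              # one time frame per minute

        # Individual decay times are exponentially distributed with rate lambda.
        t = rng.exponential(1.0 / decay_const, size=n_decays)
        t = t[t < acq_duration_s]   # keep decays inside the acquisition window

        # Histogram the decays into time frames -> a time-activity curve.
        edges = np.arange(0.0, acq_duration_s + frame_s, frame_s)
        counts, _ = np.histogram(t, bins=edges)
        for start, c in zip(edges[:-1], counts):
            print(f"frame {start:6.0f}-{start + frame_s:6.0f} s: {c} counts")

    Histogramming the sampled decay times per frame reproduces the falling time-activity curve against which a dynamic reconstruction algorithm could be tested.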

    Pixel detector R&D for the Compact Linear Collider

    The physics aims of the proposed future CLIC high-energy linear e+e- collider pose challenging demands on the performance of the detector system. In particular, the vertex and tracking detectors have to combine precision measurements with robustness against the expected high rates of beam-induced backgrounds. A spatial resolution of a few microns and a material budget down to 0.2% of a radiation length per vertex-detector layer have to be achieved, together with a time-stamping accuracy of a few nanoseconds. These requirements are addressed with innovative technologies in an ambitious detector R&D programme, comprising hardware developments as well as detailed device and Monte Carlo simulations based on TCAD, Geant4 and Allpix-Squared. Various fine-pitch hybrid silicon pixel detector technologies are under investigation for the CLIC vertex detector. The CLICpix and CLICpix2 readout ASICs with 25 μm pixel pitch have been produced in a 65 nm commercial CMOS process and bump-bonded to planar active-edge sensors as well as capacitively coupled to High-Voltage (HV) CMOS sensors. Monolithic silicon tracking detectors are foreseen for the large-surface (≈ 140 m²) CLIC tracker. Fully monolithic prototypes are currently under development in High-Resistivity (HR) CMOS, HV-CMOS and Silicon-on-Insulator (SOI) technologies. The laboratory and beam tests of all recent prototypes profit from the development of the CaRIBou universal readout system. This talk presents an overview of the CLIC pixel-detector R&D programme, focusing on recent test-beam and simulation results.
    Comment: On behalf of the CLICdp collaboration; conference proceedings for PIXEL201
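    A quick sanity check on the numbers quoted above: for purely binary (hit/no-hit) readout, the spatial resolution of a detector with pitch p is the standard p/sqrt(12), so the 25 μm CLICpix pitch alone gives roughly 7 μm; charge sharing and position interpolation are what bring this toward the few-micron target. A small sketch, where the factor-of-two interpolation gain is an illustrative assumption, not a measured CLICpix result:

        import math

        # Back-of-the-envelope check on the CLIC vertex-detector numbers above.
        # pitch/sqrt(12) is the standard binary-readout resolution; the
        # charge-sharing improvement factor below is an assumption for
        # illustration only.

        pitch_um = 25.0  # CLICpix / CLICpix2 pixel pitch

        # Resolution with purely binary (hit / no-hit) readout:
        binary_res_um = pitch_um / math.sqrt(12.0)
        print(f"binary readout: {binary_res_um:.1f} um")   # ~7.2 um

        # With analogue charge sharing between neighbouring pixels, position
        # interpolation typically improves on this by a factor of a few
        # (assumed 2x here):
        assumed_sharing_gain = 2.0
        print(f"with interpolation (assumed): "
              f"{binary_res_um / assumed_sharing_gain:.1f} um")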

    Potential of X-ray computed tomography for 3D anatomical analysis and microdensitometrical assessment in wood research with focus on wood modification

    Studying the structure and chemistry of wood and wood-based materials is the backbone of all wood research, and many techniques are at hand to do so. A very valuable modality is X-ray computed tomography (CT), which can non-destructively probe the three-dimensional (3D) structure and composition. In this paper, we elaborate on the use of Nanowood, a flexible multi-resolution X-ray CT set-up developed at UGCT, the Ghent University Centre for X-ray Tomography. The technique has been used successfully in many different fields of wood science. It is illustrated how 3D structural and microdensitometrical data can be obtained using different scan set-ups and protocols. Its potential for the analysis of modified wood is exemplified, e.g. for the assessment of wood treated with hydrophobing agents, the localisation of modification agents, pathway analysis related to functional tissues, and dimensional changes due to thermal treatment. Furthermore, the monitoring of transient processes is a promising field of activity.
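    Microdensitometry from CT data typically rests on a linear calibration between reconstructed grey values and density, fixed by reference materials scanned alongside the sample. A minimal sketch of that idea; the grey values and reference densities below are invented numbers, not UGCT calibration data:

        import numpy as np

        # Minimal sketch of grey-value-to-density calibration for CT
        # microdensitometry. Two reference materials of known density,
        # scanned with the sample, fix a linear mapping; all numbers are
        # invented for illustration only.

        ref_grey = np.array([12000.0, 43000.0])  # mean grey values of references
        ref_density = np.array([0.0, 1.40])      # known densities in g/cm^3

        # Fit density = a * grey + b through the calibration points.
        a, b = np.polyfit(ref_grey, ref_density, deg=1)

        # Apply the calibration voxel-wise to a reconstructed volume
        # (a random array stands in for real scan data here).
        volume_grey = np.random.default_rng(0).uniform(12000, 43000,
                                                       size=(64, 64, 64))
        volume_density = a * volume_grey + b
        print(f"mean apparent density: {volume_density.mean():.2f} g/cm^3")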

    Alignment-free Genomic Analysis via a Big Data Spark Platform

    Motivation: Alignment-free distance and similarity functions (AF functions, for short) are a well-established alternative to pairwise and multiple sequence alignments for many genomic, metagenomic and epigenomic tasks. Due to data-intensive applications, the computation of AF functions is a Big Data problem, and the recent literature indicates that the development of fast and scalable algorithms for computing AF functions is a high-priority task. Somewhat surprisingly, despite the increasing popularity of Big Data technologies in Computational Biology, the development of a Big Data platform for those tasks has not been pursued, possibly due to its complexity. Results: We fill this important gap by introducing FADE, the first extensible, efficient and scalable Spark platform for alignment-free genomic analysis. It natively supports eighteen of the best-performing AF functions identified in a recent hallmark benchmarking study. FADE's development and potential impact comprise several novel aspects of interest, namely: (a) a considerable engineering effort on distributed algorithms, the most tangible result being much faster execution of reference methods such as MASH and FSWM; (b) a software design that makes FADE user-friendly and easily extensible by Spark non-specialists; (c) the ability to support both data- and compute-intensive tasks. In this regard, we provide a novel and much-needed analysis of how informative and robust AF functions are, in terms of the statistical significance of their output. Our findings naturally extend those of the highly regarded benchmarking study, since the functions that can reliably be used reduce to a handful of the eighteen included in FADE.
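    As a concrete example of the kind of function FADE computes, a typical AF function compares normalised k-mer frequency vectors of two sequences instead of aligning them. A minimal single-machine sketch (FADE runs such computations at scale on Spark; the sequences and the value of k below are toy choices, and this is not FADE's implementation):

        from collections import Counter
        import math

        # Minimal sketch of an alignment-free (AF) distance: Euclidean
        # distance between normalised k-mer frequency vectors. FADE computes
        # such functions at scale on Spark; this single-machine version only
        # illustrates the underlying idea.

        def kmer_freqs(seq: str, k: int) -> Counter:
            """Normalised k-mer frequency vector of a DNA sequence."""
            counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
            total = sum(counts.values())
            return Counter({kmer: c / total for kmer, c in counts.items()})

        def af_distance(s1: str, s2: str, k: int = 4) -> float:
            """Euclidean distance between the two k-mer frequency vectors."""
            f1, f2 = kmer_freqs(s1, k), kmer_freqs(s2, k)
            kmers = set(f1) | set(f2)   # union of observed k-mers
            return math.sqrt(sum((f1[km] - f2[km]) ** 2 for km in kmers))

        # Toy example (made-up sequences):
        print(af_distance("ACGTACGTACGGT", "ACGTTTGTACGAT", k=3))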

    A feasibility study: Forest Fire Advanced System Technology (FFAST)

    The National Aeronautics and Space Administration/Jet Propulsion Laboratory and the United States Department of Agriculture Forest Service completed a feasibility study that examined the potential uses of advanced technology in forest fire mapping and detection. The current and future (1990s) information needs in forest fire management were determined through interviews. Analysis shows that integrated information gathering and processing is needed. The emerging technologies that were surveyed and identified as possible candidates for use in an end-to-end system include "push broom" sensor arrays, automatic georeferencing, satellite communication links, near-real-time or real-time image processing, and data integration. Matching the user requirements and the technologies yielded a "strawman" system configuration. The feasibility study recommends and outlines the implementation of the next phase of this project: a two-year conceptual design phase to define a system that warrants continued development.

    The JPL telerobot operator control station. Part 1: Hardware

    The Operator Control Station of the Jet Propulsion Laboratory (JPL)/NASA Telerobot Demonstrator System provides the man-machine interface between the operator and the system. It provides all the hardware and software for accepting human input for direct and indirect (supervised) manipulation of the robot arms and tools during task execution. Hardware and software are also provided for the display and feedback of information and control data for the operator's consumption and interaction with the task being executed. The hardware design, the system architecture, and their integration and interface with the rest of the Telerobot Demonstrator System are discussed.

    CAncer bioMarker Prediction Pipeline (CAMPP) - A standardized framework for the analysis of quantitative biological data

    With the improvement of -omics and next-generation sequencing (NGS) methodologies, along with the lowered cost of generating these types of data, the analysis of high-throughput biological data has become standard for both forming and testing biomedical hypotheses. Our knowledge of how to normalize datasets to remove latent undesirable variances has grown extensively, making for standardized data that are easily compared between studies. Here we present the CAncer bioMarker Prediction Pipeline (CAMPP), an open-source R-based wrapper (https://github.com/ELELAB/CAncer-bioMarker-Prediction-Pipeline-CAMPP) intended to aid bioinformatics software users with data analyses. CAMPP is called from a terminal command line and is supported by a user-friendly manual. The pipeline may be run on a local computer and requires little or no knowledge of programming. To avoid issues relating to R-package updates, an renv.lock file is provided to ensure R-package stability. Data management includes missing-value imputation, data normalization, and distributional checks. CAMPP performs (I) k-means clustering, (II) differential expression/abundance analysis, (III) elastic-net regression, (IV) correlation and co-expression network analyses, (V) survival analysis, and (VI) protein-protein/miRNA-gene interaction networks. The pipeline returns tabular files and graphical representations of the results. We hope that CAMPP will assist in streamlining bioinformatic analysis of quantitative biological data, while ensuring an appropriate biostatistical framework.
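    CAMPP itself is an R wrapper driven from the command line; purely as an illustration of the kind of marker selection its step (III) performs, here is a minimal elastic-net sketch on synthetic data using scikit-learn. This is not CAMPP's code, language, or defaults:

        import numpy as np
        from sklearn.linear_model import ElasticNetCV
        from sklearn.preprocessing import StandardScaler

        # Minimal sketch of elastic-net regression for marker selection, as in
        # step (III) of the pipeline above, on synthetic data. CAMPP is an R
        # wrapper; this Python version only illustrates the analysis idea.

        rng = np.random.default_rng(1)
        n_samples, n_features = 80, 500   # typical "wide" omics shape
        X = rng.normal(size=(n_samples, n_features))
        true_coef = np.zeros(n_features)
        true_coef[:5] = [2.0, -1.5, 1.0, 0.8, -0.5]   # a few real markers
        y = X @ true_coef + rng.normal(scale=0.5, size=n_samples)

        X = StandardScaler().fit_transform(X)

        # l1_ratio mixes lasso (1.0) and ridge (0.0) penalties;
        # cross-validation selects the regularisation strength alpha.
        model = ElasticNetCV(l1_ratio=0.5, cv=5, random_state=0).fit(X, y)
        selected = np.flatnonzero(model.coef_)
        print(f"selected {selected.size} candidate markers: {selected[:10]}")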