
    Spectral Mapping Reconstruction of Extended Sources

    Three dimensional spectroscopy of extended sources is typically performed with dedicated integral field spectrographs. We describe a method of reconstructing full spectral cubes, with two spatial and one spectral dimension, from rastered spectral mapping observations employing a single slit in a traditional slit spectrograph. When the background and image characteristics are stable, as is often achieved in space, the use of traditional long slits for integral field spectroscopy can substantially reduce instrument complexity over dedicated integral field designs, without loss of mapping efficiency -- particularly compelling when a long slit mode for single unresolved source followup is separately required. We detail a custom flux-conserving cube reconstruction algorithm, discuss issues of extended source flux calibration, and describe CUBISM, a tool which implements these methods for spectral maps obtained with the Spitzer Space Telescope's Infrared Spectrograph.
    Comment: 11 pages, 8 figures, accepted by PAS
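
    The flux-conserving reconstruction deposits each slit sample's flux onto the output sky grid in proportion to geometric overlap. The Python sketch below illustrates that area-weighted resampling for a single spectral plane; it is not the CUBISM code itself, and it assumes axis-aligned rectangular sample footprints, whereas real slit pixels are rotated polygons requiring polygon clipping. The function name reconstruct_plane and the sample tuple layout are illustrative only.

        import numpy as np

        def reconstruct_plane(samples, nx, ny):
            """Area-weighted, flux-conserving resampling of one spectral plane.

            samples: iterable of (x0, x1, y0, y1, flux) giving each slit
            sample's footprint edges in output-grid pixel units.
            Returns an (ny, nx) image of area-weighted mean surface brightness.
            """
            flux = np.zeros((ny, nx))
            area = np.zeros((ny, nx))
            for x0, x1, y0, y1, f in samples:
                # Surface brightness of the sample: flux per unit footprint area.
                sb = f / ((x1 - x0) * (y1 - y0))
                for iy in range(max(0, int(np.floor(y0))), min(ny, int(np.ceil(y1)))):
                    for ix in range(max(0, int(np.floor(x0))), min(nx, int(np.ceil(x1)))):
                        # Fractional area of the footprint falling in spaxel (ix, iy).
                        ov = (min(x1, ix + 1) - max(x0, ix)) * (min(y1, iy + 1) - max(y0, iy))
                        if ov > 0:
                            flux[iy, ix] += sb * ov  # deposit flux in proportion to overlap
                            area[iy, ix] += ov       # accumulate covering area as the weight
            good = area > 0
            flux[good] /= area[good]  # normalise to mean surface brightness per spaxel
            return flux

    Because each sample's flux is split by overlap area before normalisation, partially overlapping samples contribute with the correct weights and no flux is created or lost at spaxel boundaries.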

    Target and (Astro-)WISE technologies - Data federations and its applications

    After its first implementation in 2003, the Astro-WISE technology has been rolled out in several European countries and is used for the production of the KiDS survey data. In the multi-disciplinary Target initiative this technology, nicknamed WISE technology, has been further applied to a large number of projects. Here, we highlight the data handling of other astronomical applications, such as VLT-MUSE and LOFAR, together with some non-astronomical applications such as the medical projects Lifelines and GLIMPS, the MONK handwritten text recognition system, and business applications by, amongst others, the Target Holding. We describe some of the most important lessons learned and describe the application of the data-centric WISE type of approach to the Science Ground Segment of the Euclid satellite.
    Comment: 9 pages, 5 figures, Proceedings IAU Symposium No 325 Astroinformatics 201

    Creating and controlling visual environments using BonVision.

    Real-time rendering of closed-loop visual environments is important for next-generation understanding of brain function and behaviour, but is often prohibitively difficult for non-experts to implement and is limited to few laboratories worldwide. We developed BonVision as easy-to-use open-source software for the display of virtual or augmented reality, as well as standard visual stimuli. BonVision has been tested on humans and mice, and is capable of supporting new experimental designs in other animal models of vision. As the architecture is based on the open-source Bonsai graphical programming language, BonVision benefits from native integration with experimental hardware. BonVision therefore enables easy implementation of closed-loop experiments, including real-time interaction with deep neural networks, and communication with behavioural and physiological measurement and manipulation devices.

    A Framework for Real-time Analysis in OLAP Systems

    OLAP systems are designed to quickly answer multi-dimensional queries against large data warehouse systems. Constructing data cubes and their associated indexes is time consuming and computationally expensive, and for this reason, data cubes are only refreshed periodically. Increasingly, organizations are demanding both historical and predictive analysis based on the most current data. This trend has also placed the requirement on OLAP systems to merge updates at a much faster rate than before. In this thesis, we propose a framework for OLAP systems that enables updates to be merged with data cubes in soft real-time. We apply a strategy of local partitioning of the data cube, and maintain a "hot" partition for each materialized view to merge update data. We augment this strategy by applying multi-core processing using the OpenMP library to accelerate data cube construction and query resolution. Experiments using a data cube with 10,000,000 tuples and an update set of 100,000 tuples show that our framework achieves a 99% performance improvement when updating the data cube, a 76% performance increase when constructing a new data cube, and a 72% performance increase when resolving a range query against a data cube with 1,000,000 tuples.
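
    A minimal Python sketch of the hot/cold partition idea follows; the class name PartitionedCube and its methods are hypothetical and do not reproduce the thesis framework, and the OpenMP-based parallelism mentioned in the abstract is not represented here. Incoming updates are merged into a small in-memory "hot" partition so queries reflect current data without a full cube rebuild, and a periodic compaction folds the hot partition into the bulk cube.

        from collections import defaultdict

        class PartitionedCube:
            """Illustrative hot/cold partitioning of a materialized view."""

            def __init__(self, cold):
                # cold: dict mapping dimension-key tuples to aggregated measures,
                # standing in for the large, read-optimised bulk of the cube.
                self.cold = cold
                self.hot = defaultdict(float)

            def merge_update(self, key, value):
                # Soft real-time path: only the small hot partition is touched.
                self.hot[key] += value

            def query(self, predicate):
                # A range query is resolved against both partitions, so it
                # sees updates that have not yet been compacted into the cube.
                total = sum(v for k, v in self.cold.items() if predicate(k))
                total += sum(v for k, v in self.hot.items() if predicate(k))
                return total

            def compact(self):
                # Periodic background merge of the hot partition into the cube.
                for k, v in self.hot.items():
                    self.cold[k] = self.cold.get(k, 0.0) + v
                self.hot.clear()

        # Example: a query issued after an update reflects the new rows
        # without rebuilding the cube.
        cube = PartitionedCube({(2023, "EU"): 120.0, (2024, "EU"): 80.0})
        cube.merge_update((2024, "EU"), 5.0)
        print(cube.query(lambda key: key[0] == 2024))  # 85.0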