Web-Based Visualization of Very Large Scientific Astronomy Imagery
Visualizing and navigating through large astronomy images from a remote
location with current astronomy display tools can be a frustrating experience
in terms of speed and ergonomics, especially on mobile devices. In this paper,
we present a high performance, versatile and robust client-server system for
remote visualization and analysis of extremely large scientific images.
Applications of this work include survey image quality control, interactive
data query and exploration, citizen science, as well as public outreach. The
proposed software is entirely open source and is designed to be generic and
applicable to a variety of datasets. It provides access to floating point data
at terabyte scales, with the ability to precisely adjust image settings in
real-time. The proposed clients are light-weight, platform-independent web
applications built on standard HTML5 web technologies and compatible with both
touch and mouse-based devices. We assess the performance of the system and show that a single server can comfortably handle more than a hundred simultaneous users accessing full-precision 32-bit astronomy data.
Comment: Published in Astronomy & Computing. IIPImage server available from http://iipimage.sourceforge.net . Visiomatic code and demos available from http://www.visiomatic.org
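A minimal sketch of the kind of real-time intensity rescaling described above, assuming a simple min/max window and a gamma transfer function; the function name and parameters are illustrative assumptions, not the actual interface of the published IIPImage/Visiomatic system.

```python
import numpy as np

def scale_tile(tile, vmin, vmax, gamma=2.2):
    """Map a 32-bit floating point tile to 8-bit display values.

    Illustrative only: the published client-server system performs this
    kind of rescaling so that users can adjust image settings in real
    time; the parameter names here are assumptions, not its API.
    """
    # Clip to the requested intensity window and normalise to [0, 1].
    t = np.clip((tile.astype(np.float32) - vmin) / (vmax - vmin), 0.0, 1.0)
    # Apply a simple gamma transfer function before quantising to 8 bits.
    return (255.0 * t ** (1.0 / gamma)).astype(np.uint8)

# Example: rescale a synthetic 256x256 float tile for display.
tile = np.random.normal(loc=100.0, scale=15.0, size=(256, 256)).astype(np.float32)
display = scale_tile(tile, vmin=50.0, vmax=200.0)
```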
Banking Resolution: Expansion of the Resolution Toolkit and the Changing Role of Deposit Insurers
In this Policy Brief, we provide quantitative evidence demonstrating that the resolution toolkit has expanded
considerably since the 2008 Global Financial Crisis (GFC). Purchase and assumption transactions, bridge bank
facilitation and bail-in mechanisms have all become more available for bank resolution purposes. The use of such
resolution tools is increasingly subject to least cost rules and to systemic failure considerations. These resolution tools may be available to different authorities, such as deposit insurers or resolution authorities, depending on the
jurisdiction in question. Two of the three statistical models applied point to a significant increase in resolution powers for deposit insurers.
Evaluation of the Data Quality from a Round-Robin Test of Hyperspectral Imaging Systems
In this study, the results from a round-robin test of hyperspectral imaging systems are presented and analyzed. Fourteen different pushbroom hyperspectral systems from eight different institutions were used to acquire spectral cubes from the visible, near infra-red and short-wave infra-red regions. Each system was used to acquire a common set of targets under its normal operating conditions, with the data calibrated and processed using the standard processing pipeline for each system. The test targets consisted of a spectral wavelength standard and of a custom-made pigment panel featuring Renaissance-era pigments frequently found in paintings from that period. The quality and accuracy of the resulting data were assessed with quantitative analyses of the spectral, spatial and colorimetric accuracy of the data. The results provide valuable insight into the accuracy, reproducibility and precision of hyperspectral imaging equipment when used under routine operating conditions. The distribution and type of error found within the data can provide useful information on the fundamental and practical limits of such equipment when used for applications such as spectral classification, change detection, colorimetry and others.
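As an illustration of the sort of quantitative accuracy analyses mentioned above, the sketch below computes a spectral root-mean-square error against a reference reflectance and a CIE 1976 colour difference; these are generic metrics rather than necessarily the exact ones used in the round-robin study, and the sample values are invented.

```python
import numpy as np

def spectral_rmse(measured, reference):
    """Root-mean-square error between measured and reference reflectance."""
    return float(np.sqrt(np.mean((np.asarray(measured) - np.asarray(reference)) ** 2)))

def delta_e_ab(lab1, lab2):
    """CIE 1976 colour difference between two CIELAB triplets."""
    return float(np.linalg.norm(np.asarray(lab1, float) - np.asarray(lab2, float)))

# Example with made-up reflectance samples (31 bands, 400-700 nm).
reference = np.linspace(0.2, 0.6, 31)
measured = reference + np.random.normal(scale=0.01, size=31)
print(spectral_rmse(measured, reference))
print(delta_e_ab((52.0, 10.1, -3.2), (51.5, 9.8, -2.9)))
```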
Multispectral and Hyperspectral Imaging of Art: Quality, Calibration and Visualization
Multispectral and hyperspectral imaging can be powerful tools for analyzing and documenting works of art due to their ability to simultaneously capture both accurate spectral and spatial information. The data can be used for a wide range of diagnostic and analytical purposes, including materials identification, pigment mapping, the detection of hidden features or areas of lost material, for colorimetric analysis or for precise quantitative documentation.
However, a number of technical challenges have prevented multispectral and hyperspectral imaging from realizing their full potential and from becoming more widely used, routine analytical tools. Both multispectral and hyperspectral imaging systems require careful and precise acquisition workflows in order to produce useful data. In addition, processing and calibration of the acquired data can be a challenge for many cultural heritage users. Hyperspectral imaging, in particular, can produce vast quantities of raw data that require complex processing and the ability to manage the large resulting volumes. Moreover, the final high-resolution, multidimensional data can be difficult to use or to visualize.
This thesis therefore seeks to address some of these issues: to analyze and quantify the potential problems, and then to propose tools, workflows and methodologies to resolve and mitigate them. The research presented here focuses on two main areas. The first concerns the quality of spectral data and how to measure, quantify and improve it. To do so, it is necessary first to establish exactly what spectral quality is and what methods can be used to quantify it. These methods are then applied to ascertain the levels of quality seen in data acquired under routine operating conditions, through an evaluation of data from an extensive round-robin test of hyperspectral imaging systems.
In order to improve the quality of spectral data, the various elements that contribute to and affect spectral quality within a system are then analyzed. Acquisition and calibration pipelines are defined for both multispectral and hyperspectral equipment, with practical guidelines and workflows that aim to help users produce the best quality data possible.
The second research area concerns the visualization of such data and examines ways to make large and complex image data accessible online. For this, an architecture, visualization techniques and a full software platform are presented for the efficient distribution and visualization of high resolution, multi-modal and multispectral or hyperspectral image data. This work is then extended to push the technology to its limits by applying the techniques to the field of astronomy, where image sizes are at their most extreme.
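As a rough illustration of how such a distribution architecture typically handles very large images, the sketch below sizes a power-of-two tile pyramid; the tile size and layout are assumptions made for illustration, and the actual platform described in the thesis may differ.

```python
import math

def pyramid_levels(width, height, tile=256):
    """Number of power-of-two resolution levels needed so that the
    coarsest level fits in a single tile (a common tiled-pyramid layout;
    the platform described in the thesis may differ in detail)."""
    return max(0, math.ceil(math.log2(max(width, height) / tile))) + 1

def tiles_at_level(width, height, level, tile=256):
    """Tile grid size (columns, rows) at a given level, 0 = full resolution."""
    scale = 2 ** level
    return (math.ceil(width / scale / tile), math.ceil(height / scale / tile))

levels = pyramid_levels(120_000, 80_000)           # e.g. a large survey mosaic
print(levels, tiles_at_level(120_000, 80_000, 0))  # full-resolution tile grid
```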
EROS: an internal open multilingual platform for image content retrieval dedicated to conservation-restoration exchange between cultural institutions
An open-source, high-performance and flexible database system dedicated to art conservation-restoration and research has been developed by Celartem Technology in association with the C2RMF to handle information on works of art. This includes detailed information on the works themselves as well as information relating to digital images of the works, restoration reports and image recognition meta-data. Other kinds of information can also be added as required. The system includes advanced multilingual searching and indexing capabilities as well as a high resolution image viewer and image content recognition modules. The data can be presented in XML or other formats. Separate databases can also be linked together to allow information to be exchanged and analysed not only within museum laboratories, but also between different institutions.
Hyperspectral imaging of art: Acquisition and calibration workflows
Hyperspectral imaging has become an increasingly used tool in the analysis of works of art. However, the quality of the acquired data and the processing of that data to produce accurate and reproducible spectral image cubes can be a challenge to many cultural heritage users. The calibration of data that is both spectrally and spatially accurate is an essential step in obtaining useful and relevant results from hyperspectral imaging. Data that is too noisy or inaccurate will produce sub-optimal results when used for pigment mapping, the detection of hidden features, change detection or quantitative spectral documentation. To help address this, we examine the specific acquisition and calibration workflows necessary for works of art. These workflows include the key parameters that must be addressed during acquisition and the essential steps and issues at each of the stages required during post-processing in order to fully calibrate hyperspectral data. In addition, we look in detail at the key issues that affect data quality and propose practical solutions that can make significant differences to overall hyperspectral image quality.
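The sketch below shows the standard dark and white reference (flat-field) correction that converts raw counts to reflectance, one step of the kind of calibration workflow discussed above; the array shapes and clipping threshold are assumptions, and the full workflow in the paper covers considerably more than this single step.

```python
import numpy as np

def to_reflectance(raw, dark, white, eps=1e-6):
    """Flat-field correction: convert raw counts to reflectance using
    per-pixel dark and white reference frames (a common first step;
    the paper's workflow also covers spectral and spatial calibration,
    noise handling, etc.)."""
    raw = raw.astype(np.float32)
    dark = dark.astype(np.float32)
    white = white.astype(np.float32)
    # Allow values slightly above 1.0 for bright or specular regions.
    return np.clip((raw - dark) / np.maximum(white - dark, eps), 0.0, 1.5)

# Example with synthetic frames of shape (lines, samples, bands).
shape = (4, 5, 30)
dark = np.full(shape, 100.0)
white = np.full(shape, 4000.0)
raw = dark + 0.45 * (white - dark)
print(to_reflectance(raw, dark, white).mean())   # ~0.45
```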
Calibration and Spectral Reconstruction for CRISATEL: an Art Painting Multispectral Acquisition System
The CRISATEL multispectral acquisition system is dedicated to the digital archiving of fine art paintings. It is composed of a dynamic lighting system and a high-resolution camera equipped with a CCD linear array, 13 interference filters and several built-in, electronically controlled mechanisms. A custom calibration procedure has been designed and implemented. It allows us to select the parameters to be used for raw image acquisition and to collect experimental data, which will be used in the post-processing stage to correct the obtained multispectral images. Various techniques have been tested and compared in order to reconstruct the spectral reflectance curve of the painting surface imaged in each pixel. Realistic colour rendering under any illuminant can then be obtained from this spectral reconstruction. The results obtained with the CRISATEL acquisition system and the associated multispectral image processing are shown on two art painting examples.
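As one generic way of performing the spectral reconstruction step described above, the sketch below estimates reflectance from calibrated channel responses using a regularised least-squares inverse of the system's spectral sensitivities; the matrix sizes and regularisation are assumptions made for illustration, and the paper itself compares several reconstruction techniques.

```python
import numpy as np

def reconstruct_reflectance(responses, system_matrix, lam=1e-3):
    """Estimate spectral reflectance from camera channel responses by
    regularised (Tikhonov) least squares, one generic approach.

    responses:      (..., n_channels) calibrated camera values
    system_matrix:  (n_channels, n_wavelengths) spectral sensitivities
                    of the filter/sensor/illumination chain
    """
    A = system_matrix
    # Regularised pseudo-inverse: (A^T A + lam I)^-1 A^T, shape (n_wl, n_ch).
    pinv = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T)
    return responses @ pinv.T

# Example: 13 channels, reflectance sampled at 36 wavelengths (400-750 nm).
rng = np.random.default_rng(0)
A = np.abs(rng.normal(size=(13, 36)))        # made-up sensitivities
true_r = np.linspace(0.1, 0.7, 36)
est_r = reconstruct_reflectance(A @ true_r, A)
```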
Quality evaluation in spectral imaging – Quality factors and metrics
This is an Open Access article; the publisher's PDF was originally published in the Journal of the International Colour Association: http://aic-colour-journal.org/index.php/JAIC/article/view/147
Spectral imaging has many advantages over conventional three-channel colour imaging and has numerous applications in many domains. Despite the many benefits, applications and techniques that have been proposed, little attention has been given to the evaluation of the quality of spectral images and of spectral imaging systems. There has been some research in the area of spectral image quality, mostly targeted at specific application domains. This paper seeks to provide a comprehensive review of existing research in the area of spectral image quality metrics. We classify existing spectral image quality metrics into categories based on how they were developed, their main features, and their intended applications. Spectral quality metrics, in general, aim to measure the quality of spectral images without specifically considering the imaging systems used to acquire them. Since many different types of spectral imaging systems could be used to acquire spectral images in a given application, it is also important to evaluate the performance and quality of these systems. However, to our knowledge, little attention has been given to this previously. As a first step towards this, we aim to identify the different factors that influence the quality of spectral imaging systems. In almost every stage of a spectral imaging workflow, there may be one or more factors that influence the quality of the final spectral image, and hence of the imaging system used to acquire it. Identification of these factors will, we believe, be essential in developing a framework for evaluating the quality of spectral imaging systems.
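To make the kind of metrics surveyed here concrete, the sketch below computes two widely used spectral similarity measures, the spectral angle and the goodness-of-fit coefficient; they are given only as representative examples, not as a reproduction of the paper's classification, and the sample spectra are invented.

```python
import numpy as np

def spectral_angle(s1, s2):
    """Spectral angle (radians) between two spectra; 0 means identical shape."""
    s1, s2 = np.asarray(s1, float), np.asarray(s2, float)
    cos = np.dot(s1, s2) / (np.linalg.norm(s1) * np.linalg.norm(s2))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def gfc(s1, s2):
    """Goodness-of-fit coefficient; 1.0 indicates a perfect spectral match."""
    s1, s2 = np.asarray(s1, float), np.asarray(s2, float)
    return float(abs(np.dot(s1, s2)) / (np.linalg.norm(s1) * np.linalg.norm(s2)))

# Example with two made-up five-band spectra.
measured = [0.21, 0.25, 0.33, 0.40, 0.38]
reference = [0.20, 0.26, 0.32, 0.41, 0.37]
print(spectral_angle(measured, reference), gfc(measured, reference))
```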