Data Compression on Zero Suppressed High Energy Physics Data

Abstract

Future High Energy Physics experiments will produce unprecedented data volumes (up to 1 GB/s [1]). In most cases it will be impossible to analyse these data in real time, so they will have to be stored on durable, mostly magnetic, linear media (e.g. tapes) for later analysis. This threatens to become a major cost factor in the running of these experiments. Here we present some ideas, developed together with the Institute of Computer Graphics, Department for Algorithms and Programming, on how this volume and the related cost can be reduced significantly. The algorithms presented are not general-purpose ones but are aimed specifically at physics experiment data. By taking advantage of knowledge of the data, they are highly superior to general algorithms (Huffman, LZW, arithmetic coding), both in compression ratio and, more importantly, in speed, so as to keep up with the output rate of modern tape drives. The standard algorithms above are, however, applied after the data have been transformed into a more 'compressible' data space. These algorithms are now available in hardware, notably from IBM (IBM ALDC1-xxS [4]), with (de)compression speeds of up to 40 MB/s, which allows this compression to be performed in real time. Data may be compressed just before being written to tape, but it proves advantageous to compress them earlier, before they reach the recording machine and before they are sent over the various network media of the distributed Data Acquisition System. This also reduces the number and/or performance requirements of the LDCs, the links, the GDCs and, finally, the number of tape stations.
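The abstract does not spell out the transform itself, but the two-stage idea (first move the data into a more 'compressible' space using knowledge of its structure, then apply a generic coder) can be sketched. The following Python snippet is a minimal illustration under stated assumptions, not the paper's actual method: it assumes zero-suppressed read-out arrives as sorted (channel, ADC) pairs and uses delta encoding of channel addresses as one plausible transform; the function name, the sample hits, and the fixed 16-bit packing are all hypothetical, and zlib merely stands in for the generic stage (Huffman/LZW/arithmetic coding, or hardware such as the IBM ALDC chip).

```python
import struct
import zlib

def pack_zero_suppressed(hits):
    """Illustrative pre-transform: delta-encode the sorted channel
    numbers so the byte stream contains many small, repetitive values,
    which a generic coder compresses better than raw channel addresses."""
    out = bytearray()
    prev = 0
    for channel, adc in sorted(hits):
        # 16-bit little-endian fields: channel delta, then ADC value
        out += struct.pack("<HH", channel - prev, adc)
        prev = channel
    return bytes(out)

# Hypothetical event: a sparse detector read-out after zero suppression.
hits = [(12, 301), (13, 295), (14, 310), (870, 55), (871, 60)]

raw = b"".join(struct.pack("<HH", c, a) for c, a in hits)
transformed = pack_zero_suppressed(hits)

# zlib stands in for the general-purpose second stage; on realistic
# event sizes the transformed stream compresses noticeably better.
print(len(zlib.compress(raw)), len(zlib.compress(transformed)))
```

Because the transform is cheap (a single pass over the hits), it fits the abstract's point that compression can be done early in the data acquisition chain, before the data cross the network links.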

This paper was published in the CERN Document Server.
