
    Data compression using correlations and stochastic processes in the ALICE Time Projection chamber

    In this paper, a lossless and a quasi-lossless algorithm for the online compression of the data generated by the Time Projection Chamber (TPC) detector of the ALICE experiment at CERN are described. The first algorithm is based on a lossless source code modelling technique, i.e. the original TPC signal information can be reconstructed without errors at the decompression stage. The source model exploits the temporal correlation that is present in the TPC data to reduce the entropy of the source. The second algorithm is based on a lossy source code modelling technique. To evaluate the consequences of the error introduced by the lossy compression, the results of the trajectory tracking algorithms that process the data offline are analyzed, in particular with respect to the noise introduced by the compression. The offline analysis has two steps: cluster finding and track finding. The results on how these algorithms are affected by the lossy compression are reported. In both compression techniques, entropy coding is applied to the set of events defined by the source model to reduce the bit rate to the corresponding source entropy. Using TPC simulated data, the lossless and the lossy compression achieve a data reduction to 49.2% of the original data rate and to a range of 35% down to 30%, respectively, depending on the desired precision. In this study we have focused on methods which are easy to implement in the front-end TPC electronics.
    Comment: 8 pages, 3 figures, talk from the 2003 Computing in High Energy and Nuclear Physics (CHEP03), La Jolla, CA, USA, March 2003, PSN THLT00
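    The abstract does not spell out the lossless source model; the sketch below is a minimal illustration, assuming a simple first-order (delta) predictor over consecutive time samples, of how exploiting temporal correlation lowers the entropy that a subsequent entropy coder has to match. The function names and the NumPy-based implementation are assumptions, not the paper's code.

```python
import numpy as np

def delta_residuals(samples):
    """Hypothetical first-order source model: encode each TPC time
    sample as its difference from the previous one, so that the
    temporal correlation of the signal concentrates the residuals
    around zero and lowers their entropy."""
    samples = np.asarray(samples, dtype=np.int64)
    return np.diff(samples, prepend=samples[:1])

def entropy_bits_per_symbol(symbols):
    """Empirical Shannon entropy (bits/symbol), i.e. the bit rate an
    ideal entropy coder could approach for the modelled symbols."""
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())
```

    On strongly correlated data, entropy_bits_per_symbol(delta_residuals(x)) typically falls well below the entropy of the raw samples, which is the effect a correlation-exploiting lossless scheme relies on.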

    Lossy compression of TPC data and trajectory tracking efficiency for the ALICE experiment

    In this paper a quasi-lossless algorithm for the on-line compression of the data generated by the Time Projection Chamber (TPC) detector of the ALICE experiment at CERN is described. The algorithm is based on a lossy source code modeling technique, i.e. it is based on a source model which is lossy if the samples of the TPC signal are considered one by one; conversely, the source model is lossless or quasi-lossless if some physical quantities that are of main interest for the experiment are considered. These quantities are the area and the location of the center of mass of each TPC signal pulse, representing the pulse charge and the time localization of the pulse. To evaluate the consequences of the error introduced by the lossy compression process, the results of the trajectory tracking algorithms that process the data off-line after the experiment are analyzed, in particular with respect to their sensitivity to the noise introduced by the compression. Two different versions of these off-line algorithms are described, performing cluster finding and particle tracking. The results on how these algorithms are affected by the lossy compression are reported. Entropy coding can be applied to the set of events defined by the source model to reduce the bit rate to the corresponding source entropy. Using TPC data simulated according to the expected ALICE TPC performance, the compression algorithm achieves a data reduction in the range of 34.2% down to 23.7% of the original data rate, depending on the desired precision on the pulse center of mass. The number of operations per input symbol required to implement the algorithm is relatively low, so that a real-time implementation of the compression process embedded in the TPC data acquisition chain using low-cost integrated electronics is a realistic option to effectively reduce the data storage cost of the ALICE experiment.
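    As a concrete illustration of the two quantities the lossy model is designed to preserve, the snippet below computes the area (pulse charge) and the center of mass (time localization) of a sampled pulse. It is a sketch of the definitions only, not the paper's implementation, and the function name is assumed.

```python
import numpy as np

def pulse_area_and_com(pulse, t0=0):
    """Return the two quantities the lossy model preserves:
    the pulse area (total charge) and the center of mass of the
    samples (time localization), in time-bin units from t0."""
    pulse = np.asarray(pulse, dtype=float)
    t = t0 + np.arange(pulse.size)
    area = pulse.sum()
    com = (t * pulse).sum() / area  # undefined for an all-zero pulse
    return area, com
```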

    Compression of TPC data in the ALICE experiment

    In this paper two algorithms for the compression of the data generated by the Time Projection Chamber (TPC) detector of the ALICE experiment at CERN are described. The first algorithm is based on a lossless source code modelling technique, i.e. the original TPC signal information can be reconstructed without errors at the decompression stage. The source model exploits the temporal correlation that is present in the TPC data to reduce the entropy of the source. The second algorithm is based on a source model which is lossy if the samples of the TPC signal are considered one by one. Conversely, the source model is lossless or quasi-lossless if some physical quantities that are of main interest for the experiment are considered. These quantities are the area and the location of the center of mass of each TPC signal pulse. In both cases, entropy coding is applied to the set of events defined by the two source models to reduce the bit rate to the corresponding source entropy. Using TPC data simulated according to the expected ALICE TPC performance, the lossless and the lossy compression algorithms achieve a data reduction to 49.2% and to a range of 34.2% down to 23.7% of the original data rate, respectively. The number of operations per input symbol required to implement the compression stage for both algorithms is relatively low, so that a real-time implementation embedded in the TPC data acquisition chain using low-cost integrated electronics is a realistic option to effectively reduce the data storage cost of the ALICE experiment.
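    None of the abstracts names the entropy coder used; as one plausible choice, the toy Huffman coder below shows how the events emitted by a source model can be driven down toward the source entropy. It is illustrative only; integer event symbols and the tie-breaking details are assumptions.

```python
import heapq
from collections import Counter

def huffman_code(symbols):
    """Toy Huffman coder over integer event symbols: returns a
    symbol -> bitstring map whose average length approaches the
    empirical entropy of the input (degenerate for a one-symbol
    alphabet, where the code is empty)."""
    heap = [[count, [sym, ""]] for sym, count in Counter(symbols).items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        for pair in lo[1:]:   # left branch: prefix codes with "0"
            pair[1] = "0" + pair[1]
        for pair in hi[1:]:   # right branch: prefix codes with "1"
            pair[1] = "1" + pair[1]
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return {sym: code for sym, code in heap[0][1:]}
```

    Applied to the events of either source model, the resulting average code length in bits per symbol approaches the empirical source entropy, which is the reduction both abstracts attribute to the entropy-coding stage.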

    Using webgis and cloud tools to promote cultural heritage dissemination: The historic up project

    On the occasion of the First World War centennial, the GeoSNav Lab (Geodesy and Satellite Navigation Laboratory), Department of Engineering and Architecture, University of Trieste, Italy, in collaboration with the Radici&Futuro Association, Trieste, Italy, carried out an educational project named "Historic Up" involving a group of students from the "F. Petrarca" High School of Trieste, Italy. The main goal of the project is to make a set of historical and cultural contents available to Middle and High School students in a simple and immediate way, through the production of a virtual and interactive tour following the event that started the First World War: the assassination of Franz Ferdinand and his wife Sofia in Sarajevo on 28th June 1914. A set of Google Apps was used, including Google Earth, Maps, Tour Builder, Street View, Gmail, Drive, and Docs. The authors instructed the students in the software and in team-working and supported them throughout the research. After verification, all the data were uploaded to Tour Builder to create a sequence of historical checkpoints. The checkpoints have texts, pictures, and videos that connect the tour users to 1914. Moreover, the GeoSNav Lab produced a KML (Keyhole Markup Language) file representing the itinerary of the funeral procession and compared historical maps to the current state. This freely available online tour starts with the arrival of the royals on 28th June 1914 and follows the couple along the events, from the assassination to the burial in Artstetten (Austria), including passages through Trieste (Italy), Ljubljana (Slovenia), Graz and Wien (Austria).
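    The project's KML file is not reproduced in the abstract; the sketch below, with hypothetical placeholder coordinates, only illustrates how an itinerary such as the funeral procession could be serialized as a KML LineString.

```python
# Minimal KML itinerary writer; the coordinates used below are
# hypothetical placeholders, not the project's actual data.
KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>{name}</name>
      <LineString>
        <coordinates>{coords}</coordinates>
      </LineString>
    </Placemark>
  </Document>
</kml>
"""

def itinerary_kml(name, waypoints):
    """Serialize (lon, lat) waypoints as a single KML LineString."""
    coords = " ".join(f"{lon},{lat},0" for lon, lat in waypoints)
    return KML_TEMPLATE.format(name=name, coords=coords)

# Placeholder leg: Sarajevo -> Trieste (approximate city centers).
print(itinerary_kml("Funeral procession", [(18.41, 43.86), (13.77, 45.65)]))
```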