
    Audiovisual preservation strategies, data models and value-chains

    No full text
    This is a report on preservation strategies, models and value-chains for digital file-based audiovisual content. The report includes: (a) current and emerging value-chains and business models for audiovisual preservation; (b) a comparison of preservation strategies for audiovisual content, including their strengths and weaknesses; and (c) a review of current preservation metadata models and the requirements for extending them to support audiovisual files.

    Data compression using correlations and stochastic processes in the ALICE Time Projection chamber

    Full text link
    In this paper, a lossless and a quasi-lossless algorithm for the online compression of the data generated by the Time Projection Chamber (TPC) detector of the ALICE experiment at CERN are described. The first algorithm is based on a lossless source-code modelling technique, i.e. the original TPC signal information can be reconstructed without errors at the decompression stage. The source model exploits the temporal correlation present in the TPC data to reduce the entropy of the source. The second algorithm is based on a lossy source-code modelling technique. To evaluate the consequences of the error introduced by the lossy compression, the results of the trajectory-tracking algorithms that process the data offline are analyzed, in particular with respect to the noise introduced by the compression. The offline analysis has two steps: cluster finding and track finding. The results on how these algorithms are affected by the lossy compression are reported. In both compression techniques, entropy coding is applied to the set of events defined by the source model to reduce the bit rate to the corresponding source entropy. Using simulated TPC data, the lossless compression achieves a data reduction to 49.2% of the original data rate, while the lossy compression reaches the range of 35% down to 30%, depending on the desired precision. This study focuses on methods that are easy to implement in the front-end TPC electronics.
    Comment: 8 pages, 3 figures, Talk from the 2003 Computing in High Energy and Nuclear Physics (CHEP03), La Jolla, Ca, USA, March 2003, PSN THLT00
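    The lossless scheme described above — remove temporal correlation with a source model, then entropy-code the residuals — can be illustrated with a minimal sketch. The delta model and the toy signal below are illustrative assumptions, not the paper's actual TPC source model:

    ```python
    import math
    from collections import Counter

    def delta_encode(samples):
        """Replace each sample with its difference from the previous one.
        Temporally correlated signals yield small deltas, lowering entropy."""
        prev, out = 0, []
        for s in samples:
            out.append(s - prev)
            prev = s
        return out

    def delta_decode(deltas):
        """Exact inverse of delta_encode: the round trip is lossless."""
        prev, out = 0, []
        for d in deltas:
            prev += d
            out.append(prev)
        return out

    def entropy_bits(symbols):
        """Shannon entropy in bits/symbol -- the rate an entropy coder approaches."""
        counts, n = Counter(symbols), len(symbols)
        return -sum(c / n * math.log2(c / n) for c in counts.values())

    # A slowly varying signal, standing in for a correlated detector readout
    signal = [100, 102, 103, 103, 104, 106, 107, 107, 108, 110]
    deltas = delta_encode(signal)
    assert delta_decode(deltas) == signal                 # reconstructed without error
    assert entropy_bits(deltas) < entropy_bits(signal)    # correlation removed
    ```

    The entropy of the delta stream is what the final entropy-coding stage would pay per sample; the stronger the temporal correlation, the larger the gap between the two entropy values.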

    Building self-optimized communication systems based on applicative cross-layer information

    Get PDF
    This article proposes the Implicit Packet Meta Header (IPMH) as a standard method to compute and represent common QoS properties of the Application Data Units (ADUs) of multimedia streams, using legacy and proprietary stream headers (e.g. Real-time Transport Protocol headers). The use of IPMH by mechanisms located at different layers of the communication architecture allows fine-grained, per-packet self-optimization of communication services with respect to the actual application requirements. A case study showing how IPMH is used by error-control mechanisms in the context of wireless networks is presented in order to demonstrate the feasibility and advantages of this approach.
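    The idea of deriving per-packet QoS properties from legacy headers can be sketched as follows. The field names and the RTP-to-IPMH mapping below are hypothetical illustrations, not the actual IPMH specification:

    ```python
    from dataclasses import dataclass

    @dataclass
    class ImplicitPacketMetaHeader:
        """Hypothetical per-ADU QoS descriptor; field names are illustrative."""
        priority: int        # relative importance of this ADU
        deadline_ms: int     # latest useful delivery time
        loss_tolerant: bool  # may a lower layer drop it under congestion?

    def from_rtp(payload_type, marker):
        """Toy mapping from legacy RTP header fields to QoS properties:
        marked packets (e.g. the last packet of a video frame) are treated
        as high priority and not droppable."""
        if marker:
            return ImplicitPacketMetaHeader(priority=0, deadline_ms=50,
                                            loss_tolerant=False)
        return ImplicitPacketMetaHeader(priority=1, deadline_ms=100,
                                        loss_tolerant=True)

    meta = from_rtp(payload_type=96, marker=True)
    assert meta.priority == 0 and not meta.loss_tolerant
    ```

    A lower-layer error-control mechanism could then decide, per packet, whether retransmission is worth the delay, rather than applying one policy to the whole stream.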

    Survey of the Use of Steganography over the Internet

    Get PDF
    This paper addresses the use of steganography over the Internet by terrorists. There were rumors in the newspapers that steganography was being used for covert communication between terrorists, without any scientific proof being presented. Niels Provos and Peter Honeyman conducted an extensive Internet search in which they analyzed over two million images and did not find a single hidden image. After this study the scientific community was divided: some believed the work of Niels Provos and Peter Honeyman was conclusive enough; others did not. This paper describes what steganography is and what it can be used for, surveys various steganography techniques, and presents the studies made regarding the use of steganography on the Internet.
    Keywords: Steganography, Secret Communication, Information Hiding, Cryptography
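    The kind of image hiding the surveyed studies searched for is typified by least-significant-bit (LSB) embedding. A minimal sketch, using a plain list of grayscale pixel values rather than a real image format:

    ```python
    def embed(pixels, message):
        """Hide message bytes in the least-significant bit of each pixel value.
        Classic LSB steganography: 8 cover pixels per message byte."""
        bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
        if len(bits) > len(pixels):
            raise ValueError("cover image too small for message")
        return [(p & ~1) | b for p, b in zip(pixels, bits)] + pixels[len(bits):]

    def extract(pixels, n_bytes):
        """Recover n_bytes by reading the LSB of each pixel in order."""
        out = bytearray()
        for i in range(n_bytes):
            byte = 0
            for p in pixels[i * 8:(i + 1) * 8]:
                byte = (byte << 1) | (p & 1)
            out.append(byte)
        return bytes(out)

    cover = list(range(64))              # stand-in for 64 grayscale pixel values
    stego = embed(cover, b"hi")
    assert extract(stego, 2) == b"hi"
    # each pixel changes by at most 1, which is visually imperceptible
    assert all(abs(a - b) <= 1 for a, b in zip(cover, stego))
    ```

    Detection studies like the one described above work statistically, because the embedded bits perturb the LSB distribution of natural images rather than any visible feature.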

    Map online system using internet-based image catalogue

    Get PDF
    Digital maps carry geodata, such as coordinates, that is essential to a particular topographic or thematic map; this geodata is especially meaningful in the military field. Because the maps carry this information, the image files are large: the bigger the file, the more storage is required and the longer the loading time. These conditions make the maps unsuitable for an image-catalogue approach in an Internet environment. With compression techniques, the image size can be reduced while the quality of the image is preserved without much change. This report focuses on an image-compression technique based on wavelet technology, which compares favourably with other current image-compression techniques. The compressed images are applied to a system called Map Online, which uses an Internet-based image-catalogue approach. The system allows users to buy maps online, download the maps they have bought, and search for maps using several meaningful keywords. The system is expected to be used by Jabatan Ukur dan Pemetaan Malaysia (JUPEM) to help realize the organization's vision.
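    The wavelet approach mentioned above compresses images by transforming pixel rows into averages and details, then discarding small detail coefficients. A minimal sketch of one Haar-wavelet level, assuming a single row of pixel intensities rather than the system's actual codec:

    ```python
    def haar_step(data):
        """One Haar wavelet level: pairwise averages (coarse) and differences (detail)."""
        avg = [(a + b) / 2 for a, b in zip(data[0::2], data[1::2])]
        det = [(a - b) / 2 for a, b in zip(data[0::2], data[1::2])]
        return avg, det

    def haar_inverse(avg, det):
        """Exact inverse of haar_step when no coefficients were discarded."""
        out = []
        for a, d in zip(avg, det):
            out += [a + d, a - d]
        return out

    def compress(data, threshold):
        """Lossy step: zero out small detail coefficients. Smooth map regions
        produce near-zero details, so most coefficients can be dropped."""
        avg, det = haar_step(data)
        det = [0 if abs(d) < threshold else d for d in det]
        return avg, det

    row = [10, 10, 10, 12, 50, 52, 10, 10]       # one row of pixel intensities
    assert haar_inverse(*haar_step(row)) == row  # lossless round trip
    restored = haar_inverse(*compress(row, threshold=2))
    # small within-pair variation is smoothed; the large 10->50 jump survives
    assert max(abs(a - b) for a, b in zip(row, restored)) <= 2
    ```

    Real systems repeat this transform over rows, columns, and multiple levels, then entropy-code the sparse coefficients, which is where the large size reduction comes from.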