    On randomness of compressed data using non-parametric randomness tests

    Four randomness tests were used to test the outputs (compressed files) of four lossless compression algorithms: JPEG-LS and JPEG-2000 are image-dedicated algorithms, while 7z and Bzip2 are general-purpose algorithms. The relationship between the results of the randomness tests and the compression ratio was investigated. This paper reports an important relationship between the statistical information behind these tests and the compression ratio, and shows that this statistical information is almost the same, at least for the four lossless algorithms under test: 50% of the compressed data consists of groupings of runs, 50% of adjacent-value comparisons have positive signs, 66% of the values in the files are turning points, and under the Cox-Stuart test 25% of the pairs give positive signs, which reflects the similarity of compressed data. Regarding the relationship between the compression ratio and this statistical information, the paper also shows that the greater these statistical values, the greater the compression ratio.
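    As a rough illustration of the four statistics the abstract cites, the sketch below computes them on a byte sequence. The paper's exact run, sign, and pairing conventions are not given here, so the textbook definitions used below are assumptions.

```python
import sys

def randomness_stats(data: bytes) -> dict:
    """Compute the four non-parametric statistics on a byte sequence.
    Textbook definitions; the paper may use slightly different conventions."""
    n = len(data)

    # Runs: number of maximal runs of equal adjacent bytes, as a fraction of n.
    runs = 1 + sum(1 for i in range(1, n) if data[i] != data[i - 1])

    # Sign test: fraction of adjacent pairs where the later value is larger.
    pos_signs = sum(1 for i in range(1, n) if data[i] > data[i - 1])

    # Turning points: positions that are a strict local maximum or minimum.
    # For random data roughly 2/3 of positions turn, matching the 66% figure.
    turns = sum(
        1
        for i in range(1, n - 1)
        if data[i - 1] < data[i] > data[i + 1]
        or data[i - 1] > data[i] < data[i + 1]
    )

    # Cox-Stuart: pair each value with the one half a sequence away and
    # count positive differences.
    half = n // 2
    cox_pos = sum(1 for i in range(half) if data[i + half] > data[i])

    return {
        "runs_fraction": runs / n,
        "positive_sign_fraction": pos_signs / (n - 1),
        "turning_point_fraction": turns / (n - 2),
        "cox_stuart_positive_fraction": cox_pos / half,
    }

if __name__ == "__main__":
    with open(sys.argv[1], "rb") as f:
        print(randomness_stats(f.read()))
```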

    Using pre-determined patterns to analyze the common behavior of compressed data and their compressibility appeal

    This paper studies the behavior of compressed and uncompressed data on predetermined binary patterns. These patterns were generated according to specific criteria to ensure that they represent binary files, and each pattern is structurally unique. The study shows that all compressed data behave almost identically when analyzed against the predetermined patterns: they all follow a curve similar to that of a skewed normal distribution. Uncompressed data, on the other hand, behave differently; each uncompressed file plots its own curve without a specific shape. The paper confirms this side effect of the patterns and the fact that they can be used to measure the compressibility appeal of compressed data.
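    The pattern set itself is not described in the abstract, so the sketch below uses fixed-width sliding bit windows as a hypothetical stand-in, to show how a frequency curve of this kind can be extracted from a file.

```python
from collections import Counter

def pattern_profile(data: bytes, k: int = 8) -> list:
    """Frequency profile of every k-bit pattern seen in a sliding bit window.
    Sorting the frequencies traces the curve that, per the paper, resembles
    a skewed normal distribution for compressed inputs."""
    bits = "".join(f"{b:08b}" for b in data)
    counts = Counter(bits[i:i + k] for i in range(len(bits) - k + 1))
    total = sum(counts.values())
    return sorted(c / total for c in counts.values())
```

Comparing the profile of a compressed file with that of an uncompressed one reproduces the qualitative contrast the paper describes: a regular curve for the former, a file-specific shape for the latter.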

    A new dynamic speech encryption algorithm based on Lorenz chaotic map over Internet Protocol

    This paper introduces a dynamic speech encryption algorithm based on the Lorenz chaotic map over Internet Protocol, intended to enhance real-time services by increasing the security level and reducing latency. The proposed algorithm is divided into two processes: a dynamic key generation process, which uses a 128-bit hash value to dynamically alter the initial secret keys, and an encryption and decryption process using the Lorenz system. The algorithm's performance is evaluated through simulations, implementations, and statistical analysis. In addition, the average time delay of the proposed algorithm is compared with that of existing algorithms such as AES. The results show that the proposed dynamic speech encryption algorithm is effectively secure against various cryptanalysis attacks and has useful cryptographic properties, such as confusion and diffusion, for better voice communication in Internet voice applications.
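    A minimal sketch of a Lorenz-based stream cipher of the kind described: a 128-bit hash of the session key seeds the chaotic initial state, and the trajectory is quantised into a keystream. The hash function, the mapping of hash bits onto the state, and the quantisation step are illustrative assumptions, not the paper's construction.

```python
import hashlib

def lorenz_keystream(seed: bytes, n_bytes: int,
                     sigma=10.0, rho=28.0, beta=8.0 / 3.0, dt=0.001) -> bytes:
    """Generate a keystream by iterating the Lorenz system with Euler steps,
    seeded from a 128-bit hash of the secret key material."""
    h = hashlib.blake2b(seed, digest_size=16).digest()  # 128-bit hash value
    # Spread the hash bits over the three initial conditions (assumed mapping).
    x = 1.0 + int.from_bytes(h[0:5], "big") / 2**40
    y = 1.0 + int.from_bytes(h[5:10], "big") / 2**40
    z = 1.0 + int.from_bytes(h[10:16], "big") / 2**48
    out = bytearray()
    for _ in range(n_bytes):
        # One Euler integration step of the Lorenz equations.
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        out.append(int(abs(x) * 1e6) % 256)  # quantise the trajectory to a byte
    return bytes(out)

def encrypt(speech: bytes, key: bytes) -> bytes:
    """XOR the speech samples with the chaotic keystream."""
    ks = lorenz_keystream(key, len(speech))
    return bytes(a ^ b for a, b in zip(speech, ks))
```

Decryption is the same XOR with the same keystream; the dynamic-key idea in the paper corresponds to re-deriving the seed hash per session or packet.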

    Forensic analysis of large capacity digital storage devices

    Digital forensic laboratories are failing to cope with the volume of digital evidence requiring analysis, and the ever-increasing capacity of digital storage devices only compounds the problem. In many law enforcement agencies a form of administrative triage takes place: perceived low-priority cases are simply dropped without reference to the data itself. Security agencies may also need days or weeks to analyse a device in order to detect and quantify encrypted data on it.

    The current methodology often involves agencies creating a hash database in which each known contraband file is hashed using a forensic hashing algorithm. Each file on a suspect device is similarly hashed and the hash compared against the contraband hash database. Accessing files via the file system in this way is a slow process. In addition, deleted files, or files on deleted or hidden partitions, are not found, since their existence is not recorded in the file system.

    This thesis investigates the introduction of a system of triage whereby digital storage devices of arbitrary capacity can be quickly scanned to identify contraband and encrypted content with a high probability of detection, a known and controllable margin of error, and a reasonable running time. Such a system could classify devices as worthy of further investigation or not, and thus limit the number of devices presented to digital forensic laboratories for examination.

    A system of triage is designed which bypasses the file system and uses the fundamental storage unit of digital storage devices, normally a 4 KiB block, rather than complete files. This allows fast sampling of the storage device, with the sample size chosen to give a controllable margin of error. In addition, the sample is drawn from the whole address space of the device, so deleted files and partitions are also sampled. Since only a sample is examined, this is much faster than the traditional digital forensic analysis process.

    To achieve this, methods are devised that allow, first, the identification of a 4 KiB block as belonging to a contraband file and, second, the classification of the block as encrypted or not. These methods minimise both memory and CPU loads so that the system may run on legacy equipment that may be in a suspect's possession. A potential problem with blocks that are common to many files is quantified and a mitigation strategy developed.

    The system is tested using publicly available corpora by seeding devices with contraband and measuring the detection rate during triage. Results are positive, achieving a 99% probability of detecting 4 MiB of contraband on a 1 TB device within the time normally assigned for the interview of the device owner. Initial testing on live devices in a law enforcement environment has shown that sufficient evidence can be collected from a 1 TB device in under four minutes to allow the equipment to be seized and the suspect to be charged.

    This research will lead to a significant reduction in the backlog of cases in digital forensic laboratories, since the system can be used for triage within the laboratory as well as at the scene of crime.
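    A toy version of the core sampling loop, assuming a raw device image and a SHA-256 block-hash database; the thesis's actual block classifiers, encryption detector, and sample-size calculation are more elaborate than this sketch.

```python
import hashlib
import os
import random

BLOCK = 4096  # the 4 KiB fundamental storage unit the triage design samples

def triage_sample(image_path: str, contraband_hashes: set, n_samples: int) -> int:
    """Hash n_samples random 4 KiB blocks drawn from the whole address space
    (so deleted files and hidden partitions are reachable) and count hits
    against a pre-built contraband block-hash database."""
    n_blocks = os.path.getsize(image_path) // BLOCK  # works on an image file
    hits = 0
    with open(image_path, "rb") as dev:
        for _ in range(n_samples):
            dev.seek(random.randrange(n_blocks) * BLOCK)
            if hashlib.sha256(dev.read(BLOCK)).hexdigest() in contraband_hashes:
                hits += 1
    return hits
```

Because only a fixed number of blocks is read, the running time is essentially independent of device capacity, which is what makes an interview-length time budget feasible even for very large devices.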