1,177 research outputs found

    A New Copyright Protection for Vector Map using FFT-based Watermarking

    This study proposes a new approach to copyright protection for vector maps using robust watermarking based on the FFT algorithm. A copyright marker is inserted into the vector map as the watermark. To give the watermark data-origin authentication capabilities, the RSA cryptographic algorithm is used when generating it. The quality of the results was measured against three characteristics of digital watermarking: (1) invisibility, using RMSE calculations; (2) fidelity, using the farthest-distance and NC calculations; and (3) robustness against geometric attacks. Experiments showed that the approach succeeded in embedding copyright information as a watermark in vector maps. The invisibility test showed good results, demonstrated by an RMSE close to zero. The fidelity of the watermarked map was also maintained. Watermark robustness against geometric attacks was maintained within limits where such attacks do not directly affect the watermark bit values
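The invisibility measure named in the abstract can be sketched numerically: RMSE over corresponding vertices of the original and watermarked map. The function and sample coordinates below are illustrative, not taken from the paper.

```python
import math

def rmse(original, watermarked):
    # Root-mean-square error between corresponding (x, y) vertices;
    # values near zero mean the watermark is effectively invisible.
    total = sum((x1 - x2) ** 2 + (y1 - y2) ** 2
                for (x1, y1), (x2, y2) in zip(original, watermarked))
    return math.sqrt(total / len(original))

# Hypothetical vertex lists: the tiny perturbations stand in for the
# coordinate changes an embedded watermark would introduce.
orig = [(0.0, 0.0), (10.0, 5.0), (20.0, 15.0)]
marked = [(0.0001, 0.0), (10.0, 5.0001), (20.0001, 15.0)]
print(rmse(orig, marked))  # close to zero, as the paper reports
```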

    Work design improvement at Miroad Rubber Industries Sdn. Bhd.

    Erul Food Industries, known as Salaiport Industry, is a family-owned company established in July 2017. Salaiport Industry recently moved to a new location at Pedas, Negeri Sembilan; previously it operated in-house at Pagoh, Johor. This small company's main business is producing frozen smoked beef, smoked quail, smoked catfish and smoked duck, with smoked beef as the main frozen product. The volume of frozen smoked meat produced by Salaiport Industry depends on customer demand. The company usually produces 40 kg to 60 kg a day and operates four to five days a week, producing approximately 80 kg to 120 kg per week. One complete production cycle usually takes two days: on the first day the company only receives the meat from the supplier and freezes it for use the following day

    Compression Of 2-Tone Manuscript For Multimedia Application [QA76.9.D33 B171 2008 f rb].

    Malaysia, like any other country, is rich in old and rare documents that depict its history and culture

    Optimum Implementation of Compound Compression of a Computer Screen for Real-Time Transmission in Low Network Bandwidth Environments

    Remote working has become increasingly prevalent in recent times. A large part of remote working involves sharing computer screens between servers and clients. The image content presented when sharing computer screens consists of both natural camera-captured image data and computer-generated graphics and text. The attributes of natural camera-captured image data differ greatly from those of computer-generated image data. An image containing a mixture of both is known as a compound image. The research presented in this thesis focuses on the challenge of constructing a compound compression strategy that applies the 'best fit' compression algorithm to the mixed content found in a compound image. The research also involves analysis and classification of the types of data a given compound image may contain. While researching optimal types of compression, consideration is given to the computational overhead of a given algorithm, because the research is being developed for real-time systems such as cloud computing services, where latency has a detrimental impact on end-user experience. Previous and current state-of-the-art video codecs have been researched, along with many of the most recent publications from academia, to design and implement a novel low-complexity compound compression algorithm suitable for real-time transmission. The compound compression algorithm utilises a mixture of lossless and lossy compression algorithms, with parameters that can be used to control its performance. An objective image quality assessment is needed to determine whether the proposed algorithm can produce an acceptable-quality image after processing. Both a traditional metric, Peak Signal-to-Noise Ratio, and a more modern metric, the Structural Similarity Index, will be used to define the quality of the decompressed image. Finally, the compression strategy will be tested on a set of generated compound images. Using open-source software, the same images will be compressed with previous and current state-of-the-art video codecs to compare three main metrics: compression ratio, computational complexity and objective image quality
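The two quality metrics named in the abstract can be sketched as follows. PSNR follows its standard definition; the SSIM shown here is a simplified single-window (global) variant rather than the usual sliding-window formulation, so treat it as an illustration of the formula, not the thesis's implementation.

```python
import numpy as np

def psnr(ref, test, peak=255.0):
    # Peak Signal-to-Noise Ratio in dB; infinite for identical images.
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def ssim_global(ref, test, peak=255.0):
    # Single-window SSIM: luminance/contrast/structure terms computed
    # once over the whole image (simplified from the windowed original).
    x, y = ref.astype(np.float64), test.astype(np.float64)
    c1, c2 = (0.01 * peak) ** 2, (0.03 * peak) ** 2
    mx, my = x.mean(), y.mean()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2) /
            ((mx ** 2 + my ** 2 + c1) * (x.var() + y.var() + c2)))

# Toy 8x8 grayscale ramp; flipping each pixel's low bit gives a
# barely-distorted copy, so both metrics should score it highly.
ref = np.arange(64, dtype=np.uint8).reshape(8, 8)
noisy = ref ^ 1
```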

    Hybrid Compressed Hash Based Homomorphic AB Encryption Algorithm for Security of data in the Cloud Environment

    Cloud computing is an emerging technology in the world of computing. It provides a convenient virtual environment for on-demand access to different types of services and computing resources, such as applications, networks and storage space, in an efficient way. The virtual environment is a massive compound structure whose functional components are made accessible in a compact, familiar way. This complexity in the virtual environment generates several issues related to data storage, data security, authorization and authentication in cloud computing. As data sizes grow, it becomes difficult for the cloud user to store large amounts of information on remote cloud servers due to high computational cost, insecurity, and hourly charges proportional to the volume of information. In this paper, we propose a compressed, hash-based encryption model for the virtual environment. The aim of this paper is to store huge amounts of data in the cloud environment in compressed and encrypted form in a secure way
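The compress-then-encrypt pipeline the abstract describes can be illustrated with standard-library pieces. The XOR keystream derived from chained SHA-256 below is purely illustrative (it is not a vetted cipher, and not the paper's hybrid hash-based homomorphic attribute-based scheme); the point is the ordering: compression must come before encryption, since ciphertext is effectively incompressible.

```python
import hashlib
import zlib

def keystream(key, n):
    # Derive n pseudo-random bytes from the key by chaining SHA-256.
    # Illustrative only -- not a real cipher.
    out, block = bytearray(), key
    while len(out) < n:
        block = hashlib.sha256(block).digest()
        out.extend(block)
    return bytes(out[:n])

def protect(data, key):
    # Compress first (store less), then encrypt (store unreadably).
    compressed = zlib.compress(data, level=9)
    ks = keystream(key, len(compressed))
    return bytes(a ^ b for a, b in zip(compressed, ks))

def recover(blob, key):
    ks = keystream(key, len(blob))
    return zlib.decompress(bytes(a ^ b for a, b in zip(blob, ks)))

payload = b"cloud record " * 100          # redundant data compresses well
blob = protect(payload, b"secret-key")    # smaller than payload, and opaque
```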

    Evaluation of GPU/CPU Co-Processing Models for JPEG 2000 Packetization

    With the bottom-line goal of increasing the throughput of a GPU-accelerated JPEG 2000 encoder, this paper evaluates whether the post-compression rate control and packetization routines should be carried out on the CPU or on the GPU. Three co-processing models that differ in how the workload is split among the CPU and GPU are introduced. Both routines are discussed and algorithms for executing them in parallel are presented. Experimental results for compressing a detail-rich UHD sequence to 4 bits/sample indicate speed-ups of 200x for the rate control and 100x for the packetization compared to the single-threaded implementation in the commercial Kakadu library. These two routines executed on the CPU take 4x as long as all remaining coding steps on the GPU and therefore present a bottleneck. Even if the CPU bottleneck could be avoided with multi-threading, it is still beneficial to execute all coding steps on the GPU as this minimizes the required device-to-host transfer and thereby speeds up the critical path from 17.2 fps to 19.5 fps for 4 bits/sample and to 22.4 fps for 0.16 bits/sample

    Investigation of the effects on embedded watermarks under image manipulations

    Abstract: In this paper, different types of image watermarking techniques, i.e. the embedding of data or copyright information into an image file, are investigated. Three image watermarking techniques are discussed, namely: least significant bit; least significant bit combined with the discrete cosine transform; and the discrete cosine transform combined with the discrete wavelet transform. These embedded watermarking techniques are evaluated on how robust each is under image manipulations. Simulations are run using the three image watermarking techniques to determine how well each embedded watermark resists manipulation.
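Of the three techniques, least significant bit embedding is the simplest to sketch. The helpers below hide watermark bits in pixel LSBs; they are a generic illustration, not the paper's code, and they also hint at why LSB marks are fragile: any manipulation that rewrites pixel values (lossy compression, filtering) destroys the bits.

```python
import numpy as np

def embed_lsb(pixels, bits):
    # Overwrite the least significant bit of the first len(bits) pixels.
    out = pixels.copy().ravel()
    for i, b in enumerate(bits):
        out[i] = (out[i] & 0xFE) | b
    return out.reshape(pixels.shape)

def extract_lsb(pixels, n):
    # Read the watermark back from the first n pixel LSBs.
    return [int(v & 1) for v in pixels.ravel()[:n]]

# Hypothetical 4x4 grayscale image and an 8-bit watermark.
image = np.arange(16, dtype=np.uint8).reshape(4, 4)
mark = [1, 0, 1, 1, 0, 0, 1, 0]
stamped = embed_lsb(image, mark)
```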