
    Work design improvement at Miroad Rubber Industries Sdn. Bhd.

    Erul Food Industries, known as Salaiport Industry, is a family-owned company established in July 2017. Salaiport Industry has since moved to new premises at Pedas, Negeri Sembilan; previously it operated in-house at Pagoh, Johor. This small company's main business is producing frozen smoked beef, smoked quail, smoked catfish and smoked duck, with smoked beef as the main frozen product. The quantity of frozen smoked meat produced by Salaiport Industry depends on customer demand. The company usually produces 40 kg to 60 kg a day and operates for four to five days, giving approximately 80 kg to 120 kg per week. One complete production cycle usually takes two days, as on the first day the company only receives the meat from the supplier and freezes it for use the following day.

    Investigation of the effects of image compression on the geometric quality of digital photogrammetric imagery

    We are living in a decade in which the use of digital images is becoming increasingly important. Photographs are now converted into digital form, and direct acquisition of digital images is becoming increasingly common as sensors and the associated electronics improve. Unlike images in analogue form, digital representation of images allows visual information to be easily manipulated in useful ways. One practical problem of digital image representation is that it requires a very large number of bits, so a fairly large volume of data accumulates in a digital production environment if images are stored uncompressed on disk. With the rapid advances in sensor technology and digital electronics, the number of bits grows even larger in softcopy photogrammetry, remote sensing and multimedia GIS. As a result, it is desirable to find efficient representations for digital images in order to reduce the memory required for storage, improve the data access rate from storage devices, and reduce the time required for transfer across communication channels. The component of digital image processing that deals with this problem is called image compression. Image compression is a necessity for the utilisation of large digital images in softcopy photogrammetry, remote sensing, and multimedia GIS. Numerous image compression standards exist today with the common goal of reducing the number of bits needed to store images and of facilitating the interchange of compressed image data between various devices and applications. The JPEG image compression standard is one alternative for carrying out the image compression task. This standard was formed under the auspices of ISO and CCITT for the purpose of developing an international standard for the compression and decompression of continuous-tone, still-frame, monochrome and colour images. The JPEG standard algorithm falls into three general categories: the baseline sequential process that provides a simple and efficient algorithm for most image coding applications, the extended DCT-based process that allows the baseline system to satisfy a broader range of applications, and an independent lossless process for applications demanding that type of compression. This thesis experimentally investigates the geometric degradations resulting from lossy JPEG compression on photogrammetric imagery at various quality factors. The effects and the suitability of lossy JPEG compression on industrial photogrammetric imagery are investigated, with examples drawn from the extraction of targets in close-range photogrammetric imagery. In the experiments, JPEG was used to compress and decompress a set of test images. The algorithm was tested on digital images containing various levels of entropy (a measure of the information content of an image) captured with different imaging capabilities. Residual data were obtained by taking the pixel-by-pixel difference between the original data and the reconstructed data, and the root mean square (rms) error of the residual was used as the measure to judge the quality of images produced by the JPEG (DCT-based) compression technique. Two techniques, TIFF (LZW) compression and JPEG (DCT-based) compression, were compared with respect to the compression ratios achieved; JPEG (DCT-based) yields better compression ratios and seems to be a good choice for image compression.
    Further in the investigation, it was found that, for grey-scale images, the best compression ratios were obtained when quality factors between 60 and 90 were used (i.e., at compression ratios of 1:10 to 1:20). At these quality factors the reconstructed data show virtually no degradation in visual or geometric quality for the application at hand. Recently, many fast and efficient image file formats have also been developed to store, organise and display images in an efficient way. Almost every image file format incorporates some kind of compression method to manage data within commonplace networks and storage devices. The major file formats currently used in softcopy photogrammetry, remote sensing and multimedia GIS were also investigated. It was found that the choice of a particular image file format for a given application generally involves several interdependent considerations, including quality, flexibility, computation, storage and transmission. The suitability of a file format for a given purpose is best determined by knowing its original purpose. Some of these formats are widely used (e.g., TIFF, JPEG) and serve as exchange formats, while others are adapted to the needs of particular applications or particular operating systems.
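
    The quality measure described above is straightforward to reproduce. The following is a minimal sketch, not the thesis's actual code, of how the pixel-by-pixel residual, its rms error and the achieved compression ratio might be computed for one grey-scale test image using Python with NumPy and Pillow; the file names and the quality factor of 75 are illustrative assumptions.

        # Sketch: rms error of the residual between an original image and its
        # JPEG-reconstructed version, plus the achieved compression ratio.
        # Assumes an 8-bit grey-scale image "original.tif"; names are illustrative.
        import os
        import numpy as np
        from PIL import Image

        original = np.asarray(Image.open("original.tif").convert("L"), dtype=np.float64)

        # Compress with a chosen JPEG quality factor, then decompress.
        Image.fromarray(original.astype(np.uint8)).save("reconstructed.jpg", quality=75)
        reconstructed = np.asarray(Image.open("reconstructed.jpg").convert("L"), dtype=np.float64)

        # Pixel-by-pixel residual and its root mean square error.
        residual = original - reconstructed
        rms_error = np.sqrt(np.mean(residual ** 2))

        # Compression ratio: uncompressed size (1 byte per pixel) vs. JPEG file size.
        uncompressed_bytes = original.size
        compressed_bytes = os.path.getsize("reconstructed.jpg")
        print(f"rms error: {rms_error:.2f} grey levels, "
              f"compression ratio 1:{uncompressed_bytes / compressed_bytes:.1f}")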

    An overview of JPEG 2000

    JPEG-2000 is an emerging standard for still image compression. This paper provides a brief history of the JPEG-2000 standardization process, an overview of the standard, and some description of the capabilities it provides. Part I of the JPEG-2000 standard specifies the minimum compliant decoder, while Part II describes optional, value-added extensions. Although the standard specifies only the decoder and bitstream syntax, in this paper we describe JPEG-2000 from the point of view of encoding. We take this approach because we believe it lends itself to a compact description that is more easily understood by most readers.

    Map online system using internet-based image catalogue

    Digital maps carry geodata, such as coordinates, that are important in topographic and thematic maps. These geodata are especially meaningful in the military field. Because the maps carry this information, the image files become very large: the bigger the size, the more storage is required to hold the image file, and loading times also grow longer. These conditions make such images unsuitable for an image-catalogue approach in an Internet environment. With compression techniques, the image size can be reduced while the image quality remains largely unchanged. This report focuses on an image compression technique based on wavelet technology, which performs better than many other image compression techniques available today. The compressed images are applied to a system called Map Online that uses an Internet-based image catalogue approach. This system allows users to buy maps online and to download the maps they have bought, as well as to search for maps; map searching is based on several meaningful keywords. The system is expected to be used by Jabatan Ukur dan Pemetaan Malaysia (JUPEM) in support of the organization's vision.
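
    As a rough illustration of the wavelet approach mentioned above (not the Map Online implementation itself), the following sketch uses the PyWavelets package to decompose a map image, discard small detail coefficients, and reconstruct it; the file name, wavelet choice and threshold are assumptions made only for the example.

        # Sketch of wavelet-based image compression: transform, threshold small
        # coefficients, reconstruct. Illustrative only; not the Map Online system.
        import numpy as np
        import pywt
        from PIL import Image

        image = np.asarray(Image.open("map_sheet.png").convert("L"), dtype=np.float64)

        # Multi-level 2-D wavelet decomposition.
        coeffs = pywt.wavedec2(image, wavelet="bior4.4", level=3)

        # Zero out detail coefficients below a threshold; only the significant ones
        # (plus their positions) would need to be stored or transmitted.
        threshold = 10.0
        thresholded = [coeffs[0]] + [
            tuple(pywt.threshold(d, threshold, mode="hard") for d in detail)
            for detail in coeffs[1:]
        ]

        reconstructed = pywt.waverec2(thresholded, wavelet="bior4.4")
        kept = sum(np.count_nonzero(d) for detail in thresholded[1:] for d in detail)
        total = sum(d.size for detail in coeffs[1:] for d in detail)
        print(f"non-zero detail coefficients kept: {kept}/{total}")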

    APPLICATION OF GENETIC ALGORITHM FOR IMAGE TRANSFER

    For image transfer, different embedding systems exist that work by creating a mosaic image from the source image and recovering it from the target image using some form of algorithm. In the current study, a method is proposed that uses a genetic algorithm for recovering the image from the source image. The algorithm utilized is a genetic algorithm, a search method, together with an additional technique for obtaining higher robustness and security. The proposed methodology works by dividing the source image into smaller parts, which are fitted into the target image using lossless compression. The mosaic image is recovered at the retrieving side by the permutation array, which is recovered and mapped using a pre-selected key.
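
    The permutation-array step described above can be illustrated with a minimal sketch: a pre-selected key seeds a pseudo-random permutation of image blocks, and the receiver regenerates the same permutation and inverts it to restore the block order. This covers only the key-driven permutation and recovery, not the paper's genetic-algorithm fitting; the function names and block size are illustrative.

        # Sketch: key-seeded permutation of image blocks and its inversion.
        import numpy as np

        def permutation_from_key(key: int, n_blocks: int) -> np.ndarray:
            rng = np.random.default_rng(key)      # same key -> same permutation
            return rng.permutation(n_blocks)

        def scatter(blocks: list, key: int) -> list:
            perm = permutation_from_key(key, len(blocks))
            return [blocks[i] for i in perm]

        def recover(scattered: list, key: int) -> list:
            perm = permutation_from_key(key, len(scattered))
            inverse = np.argsort(perm)            # invert the permutation array
            return [scattered[i] for i in inverse]

        # Usage with dummy 8x8 blocks standing in for image tiles.
        blocks = [np.full((8, 8), i) for i in range(16)]
        restored = recover(scatter(blocks, key=42), key=42)
        assert all(np.array_equal(a, b) for a, b in zip(blocks, restored))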

    Application of Stochastic Diffusion for Hiding High Fidelity Encrypted Images

    Cryptography coupled with information hiding has received increased attention in recent years and has become a major research theme because of the importance of protecting encrypted information in any Electronic Data Interchange system in a way that is both discrete and covert. One of the essential limitations of any cryptography system is that the encrypted data provides an indication of its importance, which arouses suspicion and makes it vulnerable to attack. Information hiding, or Steganography, provides a potential solution to this issue by making the data imperceptible, the security of the hidden information being threatened only if its existence is detected through Steganalysis. This paper focuses on a study of methods for hiding encrypted information, specifically, methods that encrypt data before embedding it in host data, where the 'data' is in the form of a full colour digital image. Such methods provide a greater level of data security, especially when the information is to be transmitted over the Internet, since a potential attacker needs to first detect, then extract and then decrypt the embedded data in order to recover the original information. After providing an extensive survey of the current methods available, we present a new method of encrypting and then hiding full colour images in three full colour host images without loss of fidelity following data extraction and decryption. The applications of this technique, which is based on a method called 'Stochastic Diffusion', are wide-ranging and include covert image information interchange, digital image authentication, video authentication, copyright protection and digital rights management of image data in general.
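
    For orientation, the following is a minimal sketch of the generic encrypt-then-embed workflow that the paper surveys, using a simple keyed XOR stream cipher and least-significant-bit embedding as stand-ins; it is not the paper's Stochastic Diffusion algorithm, and the key value and image sizes are arbitrary assumptions.

        # Sketch of encrypt-then-embed: XOR stream cipher + LSB embedding as
        # stand-ins, NOT the Stochastic Diffusion method itself.
        import numpy as np

        def xor_encrypt(plain: np.ndarray, key: int) -> np.ndarray:
            rng = np.random.default_rng(key)
            stream = rng.integers(0, 256, size=plain.shape, dtype=np.uint8)
            return plain ^ stream                  # applying it again decrypts

        def embed_lsb(host: np.ndarray, cipher_bits: np.ndarray) -> np.ndarray:
            stego = host.copy()
            flat = stego.reshape(-1)
            flat[: cipher_bits.size] = (flat[: cipher_bits.size] & 0xFE) | cipher_bits
            return stego

        def extract_lsb(stego: np.ndarray, n_bits: int) -> np.ndarray:
            return stego.reshape(-1)[:n_bits] & 1

        # Hide an encrypted secret image in a sufficiently large host image.
        secret = np.random.randint(0, 256, (32, 32), dtype=np.uint8)
        cipher = xor_encrypt(secret, key=1234)
        bits = np.unpackbits(cipher)               # 8 bits per cipher byte
        host = np.random.randint(0, 256, (128, 128), dtype=np.uint8)
        stego = embed_lsb(host, bits)
        extracted = np.packbits(extract_lsb(stego, bits.size)).reshape(secret.shape)
        assert np.array_equal(xor_encrypt(extracted, key=1234), secret)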

    A Comparative Study on Improvement of Image Compression Method using Hybrid DCT - DWT Techniques with Huffman Encoding for Wireless Sensor Network Application

    Nowadays, the demand for wireless networks is increasing rapidly from year to year. A wireless network covers a large area in which many nodes connect to each other to communicate through a device. A wireless network also tends to serve as a link for transmitting and receiving multimedia such as images, sound, video and documents. In order to receive the transmitted media correctly, most types of media must be compressed before transmission and decompressed after being received, or the receiving device must be able to read the media in compressed form. In this paper, a hybrid compression scheme combining the Discrete Cosine Transform (DCT) and the Discrete Wavelet Transform (DWT) with Huffman encoding is proposed for Wireless Sensor Network (WSN) applications. Data compression is very useful for removing redundant data and reducing image size. After a comprehensive observation, it is found that hybrid compression is suitable because the process combines multiple compression techniques, which suits Wireless Sensor Network applications focusing on the ZigBee platform.
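
    One possible hybrid pipeline of the kind described is sketched below; the two-level Haar DWT, the DCT on the approximation subband, the uniform quantisation step and the Huffman code-length estimate are all assumptions made for illustration, not the authors' exact scheme.

        # Sketch of a hybrid DWT + DCT pipeline with a Huffman code-length estimate.
        import heapq
        from collections import Counter
        import numpy as np
        import pywt
        from scipy.fftpack import dct

        image = np.random.randint(0, 256, (64, 64)).astype(np.float64)  # stand-in sensor image

        # Stage 1: 2-D DWT; keep the approximation subband, drop fine detail.
        approx, *details = pywt.wavedec2(image, wavelet="haar", level=2)

        # Stage 2: 2-D DCT of the approximation subband, then coarse quantisation.
        transformed = dct(dct(approx, axis=0, norm="ortho"), axis=1, norm="ortho")
        quantised = np.round(transformed / 16).astype(int)

        # Stage 3: Huffman coding of the quantised symbols (code lengths only,
        # to estimate the compressed size in bits).
        def huffman_code_lengths(symbols):
            freq = Counter(symbols)
            heap = [(count, i, {sym: 0}) for i, (sym, count) in enumerate(freq.items())]
            heapq.heapify(heap)
            next_id = len(heap)
            while len(heap) > 1:
                c1, _, t1 = heapq.heappop(heap)
                c2, _, t2 = heapq.heappop(heap)
                merged = {s: d + 1 for s, d in {**t1, **t2}.items()}
                heapq.heappush(heap, (c1 + c2, next_id, merged))
                next_id += 1
            return heap[0][2], freq

        lengths, freq = huffman_code_lengths(quantised.ravel().tolist())
        bits = sum(lengths[s] * freq[s] for s in freq)
        print(f"estimated compressed size: {bits} bits vs. {image.size * 8} bits raw")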

    Evaluation of international standards for ECG-recording and storage for use in tele-medical services

    This report is written to clarify which of the international standards for ECG recordings can be used in tele-medical services, where the recordings are transmitted over wireless telecommunication facilities and finally stored as information integrated into the patient's Electronic Health Record (EHR). Some principles for the recording, transmission and storage of digital vital-signs parameters are highlighted, and important aspects of wireless communication of recorded signals from biomedical sensors are described, in order to understand the significance of, and differences between, the storage formats to be used. Even though most of the relevant standards are not yet ratified (the last meeting in ISO TC 251 WH6 was held in October 2005), the international standards SCP-ECG, MFER, FDAXML and DICOM are defined and already widely adopted. In this report, these standards are briefly described and evaluated with respect to possible use in tele-medical services, and recommendations are given in order to obtain a reliable and secure communication solution. Requirements for integrating the ECG file formats into the EHR are briefly described, and some recommendations are given for standards to be used in future solutions.