    Hybrid Approaches to Block Cipher

    This chapter introduces two new approaches to block cipher design: the DNA hybridization encryption scheme (DHES) and the hybrid graphical encryption algorithm (HGEA). DNA cryptography deals with techniques for hiding messages in the form of a DNA sequence. DHES increases the effective key size of the Data Encryption Standard (DES): a DNA cryptography algorithm is used for encryption and decryption, and a one-time pad (OTP) scheme is used for key generation. The output of the DES algorithm is passed as input to the DNA hybridization scheme to provide added security. The second approach, HGEA, is based on graphical pattern recognition; a block cipher is obtained by performing multiple transformations, shifting, and logical operations. This algorithm is influenced by the hybrid cubes encryption algorithm (HiSea). Graphical interpretation and computation of a selected quadrant value are the unique features of HGEA. Moreover, the multiple-key generation scheme combined with the graphical interpretation method provides an increased level of security.
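    The DHES pipeline described above (DES ciphertext re-encoded as DNA bases and combined with an OTP key) can be sketched roughly as follows. This is a minimal illustration, not the chapter's exact hybridization scheme: the 2-bit-per-nucleotide mapping, the otp_combine step, and the helper names are assumptions, and the example uses the pycryptodome DES module in ECB mode for brevity.

```python
# Hypothetical sketch of a DHES-style pipeline: DES encryption, then a
# DNA encoding of the ciphertext, then combination with a one-time pad.
# The nucleotide mapping and otp_combine rule are illustrative assumptions,
# not the scheme defined in the chapter.
import os
from Crypto.Cipher import DES            # pycryptodome
from Crypto.Util.Padding import pad

NUCLEOTIDES = "ACGT"                     # 2 bits per base: 00->A, 01->C, 10->G, 11->T

def dna_encode(data: bytes) -> str:
    """Map each byte to four nucleotides (2 bits per base)."""
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            bases.append(NUCLEOTIDES[(byte >> shift) & 0b11])
    return "".join(bases)

def otp_combine(dna: str, otp: str) -> str:
    """Combine message bases with OTP bases by index addition mod 4."""
    return "".join(
        NUCLEOTIDES[(NUCLEOTIDES.index(m) + NUCLEOTIDES.index(k)) % 4]
        for m, k in zip(dna, otp)
    )

def dhes_encrypt(plaintext: bytes, des_key: bytes) -> tuple[str, str]:
    """DES encryption followed by DNA encoding and OTP combination."""
    ct = DES.new(des_key, DES.MODE_ECB).encrypt(pad(plaintext, DES.block_size))
    dna_ct = dna_encode(ct)
    # OTP key: one random base per ciphertext base.
    otp = "".join(NUCLEOTIDES[b % 4] for b in os.urandom(len(dna_ct)))
    return otp_combine(dna_ct, otp), otp

if __name__ == "__main__":
    ciphertext, otp_key = dhes_encrypt(b"hello world", b"8bytekey")
    print(ciphertext[:32], "...")
```

    Decryption would reverse each stage: subtract the OTP bases mod 4, decode the DNA string back to bytes, and apply DES decryption with the same key.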

    Integration of Expectation Maximization using Gaussian Mixture Models and Naïve Bayes for Intrusion Detection

    Intrusion detection is the process of examining information about system activities or data to detect malicious behaviour or unauthorized activity. Most intrusion detection systems (IDS) use the K-means clustering technique because of its linear complexity and fast computation. However, its naïve use of the mean data value as the cluster centre is a major drawback: two circular clusters with different radii can be centred at the same mean, and K-means cannot separate them because the cluster means are nearly identical. The algorithm also fails when the clusters are not spherical. To overcome these issues, a new hybrid model is proposed that integrates expectation maximization (EM) clustering using a Gaussian mixture model (GMM) with a naïve Bayes classifier. In this model, the GMM gives more flexibility than K-means in terms of cluster covariance. Because it performs probabilistic soft clustering, a single data point can belong to multiple clusters. In the GMM, a cluster is defined by two parameters, the mean and the standard deviation, so a cluster can take any elliptical shape. EM-GMM is used to cluster data, based on data activity, into the corresponding category.
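    A minimal sketch of the EM-GMM plus naïve Bayes idea described above is shown below, using scikit-learn's GaussianMixture (fitted by EM) and GaussianNB on synthetic data. The dataset, features, number of mixture components, and the way the cluster probabilities are fed to the classifier are assumptions for illustration, not the chapter's exact pipeline.

```python
# Hypothetical sketch: EM-based GMM soft clustering followed by a naïve
# Bayes classifier. Synthetic data stands in for network-activity records.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.mixture import GaussianMixture
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Synthetic two-class data (assumed stand-in for normal vs. malicious records).
X, y = make_blobs(n_samples=2000, centers=2, n_features=6,
                  cluster_std=2.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Step 1: EM clustering with a Gaussian mixture model. Full covariance
# lets each cluster take an elliptical shape, unlike K-means.
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
gmm.fit(X_train)

# Step 2: append the soft cluster-membership probabilities to the raw
# features and train the naïve Bayes classifier on the result.
train_features = np.hstack([X_train, gmm.predict_proba(X_train)])
test_features = np.hstack([X_test, gmm.predict_proba(X_test)])

nb = GaussianNB().fit(train_features, y_train)
print("test accuracy:", nb.score(test_features, y_test))
```

    The soft memberships from predict_proba are what distinguish this from a hard K-means assignment: each record contributes a probability for every cluster rather than a single label.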