LFSR Next Bit Prediction through Deep Learning

Abstract

Pseudorandom bit sequences are generated by deterministic algorithms to simulate truly random sequences. Many cryptographic algorithms use pseudorandom sequences, and the randomness of these sequences greatly impacts the robustness of those algorithms. The Linear Feedback Shift Register (LFSR), an important cryptographic primitive, and its combinations have long been used in stream ciphers to generate pseudorandom bit sequences. The sequences generated by an LFSR can be predicted using the classical Berlekamp-Massey algorithm, which solves an LFSR from 2n output bits, where n is the degree of the LFSR. Many techniques based on ML classifiers have succeeded in predicting the next bit of LFSR-generated sequences. However, the main limitation of existing approaches is that they require a large number of bits (relative to the degree of the LFSR) to solve the LFSR. In this paper, we propose a novel Pattern Duplication technique that exponentially reduces the number of input bits required to train the ML model. The Pattern Duplication technique generates new samples from the available data using two properties of the XOR function used in LFSRs. We use Deep Neural Networks (DNNs) as the next-bit predictor for LFSR-generated sequences, combined with the Pattern Duplication technique; as a result, only a very small number of input patterns is needed to train the DNN. Moreover, in some cases the DNN model predicted LFSRs from fewer than 2n bits, outperforming the Berlekamp-Massey algorithm. However, the technique was not successful for LFSRs whose primitive polynomials have a larger number of tap points.
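
For context, the sketch below shows a minimal Fibonacci LFSR in Python: the next state bit is the XOR of the tapped bits, and the oldest bit is emitted as output. This is illustrative only, not the paper's implementation or its Pattern Duplication technique; the degree-4 primitive polynomial x^4 + x^3 + 1, the tap positions, and the seed are example choices.

    # Minimal Fibonacci LFSR sketch (illustrative; parameters are example choices).
    def lfsr_bits(state, taps, n_bits):
        """Generate n_bits from a Fibonacci LFSR.

        state : list of 0/1 bits, most recent first (len(state) = degree n)
        taps  : indices into state whose XOR forms the feedback bit
        """
        out = []
        for _ in range(n_bits):
            out.append(state[-1])          # emit the oldest bit
            fb = 0
            for t in taps:
                fb ^= state[t]             # feedback = XOR of tapped bits
            state = [fb] + state[:-1]      # shift in the feedback bit
        return out

    # Degree n = 4 with taps for x^4 + x^3 + 1 (primitive), so the period is
    # 2^n - 1 = 15; Berlekamp-Massey would recover this LFSR from 2n = 8 bits.
    print(lfsr_bits([1, 0, 0, 1], taps=[0, 3], n_bits=15))

Because the feedback is a fixed XOR (linear over GF(2)) of the state bits, each output bit is a linear function of the seed, which is what both the Berlekamp-Massey algorithm and XOR-based sample-augmentation schemes exploit.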
