Using Long Short-Term Memory Networks to Make and Train Neural Network Based Pseudo Random Number Generator
Thesis - Open Access
Master of Science (MS)
Department / School
Electrical Engineering and Computer Science
Neural networks have been used in many decision-making models and employed in computer vision and natural language processing. Several works have also used neural networks to build pseudo-random number generators [2, 4, 5, 7, 8]; however, despite strong performance on the National Institute of Standards and Technology (NIST) statistical test suite for randomness, these works do not discuss how the complexity of the neural network affects the statistical results. This work introduces: 1) a series of new Long Short-Term Memory (LSTM) based and Fully Connected Neural Network (FCNN; baseline and variations) Pseudo-Random Number Generators (PRNGs), and 2) an LSTM-based predictor. The thesis also performs adversarial training to determine two things: 1) how the use of sequence models such as LSTMs under adversarial training affects performance on the NIST tests, and 2) how the complexity of the baseline fully connected network-based generator and the LSTM-based generator affects NIST results. Experiments were run on four generator–predictor pairs: i) Fully Connected Neural Network Generator (FC NN Gen) – Convolutional Neural Network Predictor (CNN Pred), ii) FC NN Gen – LSTM Pred, iii) LSTM-based Gen – CNN Pred, and iv) LSTM-based Gen – LSTM Pred, where FC NN Gen and CNN Pred were taken as the baseline from prior work, while LSTM-based Gen and LSTM Pred are proposed here. In the experiments, the LSTM predictor overall gave more consistent, and often better, results on the NIST test suite than the CNN predictor. The LSTM-based generator showed a higher average NIST pass rate when paired with the LSTM predictor, though with a fluctuating trend, whereas the same generator trained adversarially against the CNN predictor showed an increasing trend in the average NIST pass rate.
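As context for the evaluation described above, the following is a minimal sketch of the simplest test in the NIST SP 800-22 suite, the monobit (frequency) test, which checks whether a bit sequence contains roughly equal numbers of ones and zeros. This is not the thesis code; the 0.01 pass threshold follows the significance level conventionally used in the NIST suite.

```python
import math

def monobit_p_value(bits):
    """Return the NIST monobit-test p-value for a sequence of 0/1 bits."""
    n = len(bits)
    s = sum(2 * b - 1 for b in bits)        # map 0 -> -1, 1 -> +1 and sum
    s_obs = abs(s) / math.sqrt(n)           # normalised partial sum
    return math.erfc(s_obs / math.sqrt(2))  # two-sided tail probability

def monobit_passes(bits, alpha=0.01):
    """A sequence passes when its p-value meets the significance level."""
    return monobit_p_value(bits) >= alpha
```

For example, a perfectly balanced sequence such as `[0, 1] * 500` yields a p-value of 1.0 and passes, while a constant sequence of ones fails; the full NIST suite applies fourteen further tests beyond this one.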
The baseline generator and its variations, by contrast, displayed only a fluctuating trend, but achieved better results under adversarial training with the LSTM-based predictor than with the CNN predictor.
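The adversarial generator-versus-predictor scheme discussed above can be sketched as follows. This is a conceptual toy, not the thesis implementation: the generator here is just a parameter vector thresholded through a sigmoid, the predictor is a logistic regression over the previous `WINDOW` bits, and the generator's "adversarial" update is a crude random-perturbation hill climb that keeps changes which raise the predictor's loss. All sizes and names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, WINDOW = 256, 8          # sequence length and predictor context size (assumed)

def generate(params):
    """Map generator parameters to a 0/1 bit sequence (toy generator)."""
    return (1.0 / (1.0 + np.exp(-params)) > 0.5).astype(float)

def predictor_loss(bits, w, b):
    """Mean binary cross-entropy of the predictor over the sequence."""
    X = np.stack([bits[i:i + WINDOW] for i in range(len(bits) - WINDOW)])
    y = bits[WINDOW:]
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p)), X, y, p

params = rng.normal(size=N)          # generator parameters
w, b = np.zeros(WINDOW), 0.0         # predictor parameters

for step in range(200):
    bits = generate(params)
    # Predictor step: gradient descent on its own cross-entropy loss.
    loss, X, y, p = predictor_loss(bits, w, b)
    w -= 0.5 * (X.T @ (p - y) / len(y))
    b -= 0.5 * np.mean(p - y)
    # Generator step: keep a random perturbation only if it *raises* the
    # predictor's loss (a stand-in for the adversarial gradient step the
    # thesis performs with full neural networks).
    trial = params + rng.normal(scale=0.1, size=N)
    if predictor_loss(generate(trial), w, b)[0] > loss:
        params = trial

final_loss = predictor_loss(generate(params), w, b)[0]
```

In the thesis setting, both players are neural networks (FCNN or LSTM generators against CNN or LSTM predictors) trained by backpropagation, and the generator's quality is then measured with the NIST suite rather than the predictor's loss alone.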
Library of Congress Subject Headings
Neural networks (Computer science)
Random number generators.
Short-term memory -- Mathematical models.
South Dakota State University
Harshvardhan, Aditya, "Using Long Short-Term Memory Networks to Make and Train Neural Network Based Pseudo Random Number Generator" (2022). Electronic Theses and Dissertations. 347.
Computer Engineering Commons, Computer Sciences Commons, Electrical and Computer Engineering Commons