A character-level RNN (Recurrent Neural Network) LSTM (Long Short-Term Memory) text predictor intended for password-generation research.
Check out the corresponding Medium article:
Password Cracker - Generating Passwords with Recurrent Neural Networks (LSTMs)🔑🔓
This project was developed purely for educational purposes. Don't use it for anything malicious.
- Take a large dataset of leaked passwords.
- Train an RNN LSTM model on it.
- Generate new passwords.
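The first step of the pipeline above can be sketched as encoding the password corpus into character indices and slicing it into fixed-length (input, target) sequences for next-character prediction. This is an illustrative sketch, not the project's actual code; all function and variable names here are made up:

```python
# Sketch: turn a raw password corpus into character-level training pairs.
# Names (build_vocab, make_sequences) are illustrative assumptions.

def build_vocab(text):
    """Map each distinct character to an integer index."""
    chars = sorted(set(text))
    return {c: i for i, c in enumerate(chars)}

def make_sequences(text, vocab, seq_length):
    """Slice the encoded corpus into (input, target) pairs where the
    target is the input shifted one character to the right."""
    encoded = [vocab[c] for c in text]
    inputs, targets = [], []
    for start in range(len(encoded) - seq_length):
        inputs.append(encoded[start:start + seq_length])
        targets.append(encoded[start + 1:start + seq_length + 1])
    return inputs, targets

corpus = "password1\n123456\n"  # toy stand-in for the leaked-password file
vocab = build_vocab(corpus)
X, y = make_sequences(corpus, vocab, seq_length=5)
```

The newline character stays in the vocabulary on purpose: the model learns it as the "end of password" symbol, so sampling can emit one password per line.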
Top 85 million WPA (Wi-Fi) Passwords
Randomized and split into:
- Training set (10%) ~100 MB (9 M passwords)
- Testing set (90%) ~850 MB (76 M passwords)
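The randomized 10/90 split can be sketched with a seeded shuffle (a minimal illustration, not the project's actual preprocessing script; `split_passwords` is a hypothetical name):

```python
import random

def split_passwords(passwords, train_fraction=0.10, seed=42):
    """Shuffle the password list and carve off a small training set;
    the remainder becomes the (much larger) test set."""
    shuffled = passwords[:]  # copy so the caller's list is untouched
    random.Random(seed).shuffle(shuffled)
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]

passwords = [f"pass{i}" for i in range(1000)]  # toy stand-in for 85 M entries
train, test = split_passwords(passwords)
```

Keeping the two sets disjoint is what makes the later hit-ratio evaluation meaningful: a generated password found in the test set cannot simply have been memorized from training data.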
Text Predictor RNN LSTM, for more details check this article.
Batch size: 32
Sequence length: 25
Learning rate: 0.002
Decay rate: 0.97
Hidden layer size: 1024
Number of stacked LSTM cells (layers): 3
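Given the batch size of 32 and sequence length of 25, the encoded character stream gets chopped into (32, 25) input/target batches with targets shifted one step ahead. A stdlib-only sketch of that batching (the actual project builds its graph in TensorFlow; `make_batches` is an assumed name):

```python
def make_batches(encoded, batch_size=32, seq_length=25):
    """Chop an encoded character stream into (batch_size x seq_length)
    input/target batches; each target is its input shifted one step."""
    chars_per_batch = batch_size * seq_length
    n_batches = (len(encoded) - 1) // chars_per_batch  # -1 leaves room for the shift
    batches = []
    for b in range(n_batches):
        base = b * chars_per_batch
        x = [encoded[base + r * seq_length: base + (r + 1) * seq_length]
             for r in range(batch_size)]
        y = [encoded[base + r * seq_length + 1: base + (r + 1) * seq_length + 1]
             for r in range(batch_size)]
        batches.append((x, y))
    return batches

stream = list(range(32 * 25 * 2 + 1))  # toy encoded stream: exactly 2 batches
batches = make_batches(stream)
```

The learning rate (0.002) and decay rate (0.97) would then drive the optimizer schedule, multiplying the learning rate by the decay factor as training progresses.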
`hit_ratio = sampled_passwords_in_test_set / all_sampled_passwords`
After 100,000 training iterations, the RNN LSTM model generated 896 passwords, 119 of which appeared in the test set.
This means that ~13% of the generated passwords were real ones.
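The hit ratio above boils down to a set-membership check over the generated samples. A sketch (hypothetical `hit_ratio` helper, not the project's evaluation code):

```python
def hit_ratio(sampled_passwords, test_set):
    """Fraction of generated passwords that also appear in the test set."""
    test = set(test_set)  # O(1) membership lookups
    hits = sum(1 for p in sampled_passwords if p in test)
    return hits / len(sampled_passwords)

# Toy usage: 2 of 4 generated passwords appear in the test set.
ratio = hit_ratio(["a", "b", "c", "d"], ["b", "d", "x"])
```

With the reported numbers, 119 hits out of 896 samples gives 119 / 896 ≈ 0.133, i.e. roughly 13%.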
Examples of AI-generated passwords that were actually used by people:
richardmars
sierrasoftball
8aug1863
FalconGroovy
verstockt
hakensen
mccaitlin
playboyslayer
republicmaster
eddie123
Denversharon
marchand
humaniseront5
7december1789
15071600
Spatted2
jaredhomebrew
choco2007
doctorPacker
bac7er!o1o9!s7s
elliot1993
d3r!v@7!on
trickset
jonathancruise
mcjordan23
Family82
susanAwesome
Greg (Grzegorz) Surma