
A minimal PyTorch implementation of BERT (Bidirectional Encoder Representations from Transformers)


barneyhill/minBERT

 
 


minBERT

A minimal implementation (model.py is 331 lines) of Bidirectional Encoder Representations from Transformers (BERT) forked from Andrej Karpathy's minGPT.
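The main change relative to minGPT's causal-LM objective is BERT's masked-language-modeling target. As a rough sketch of the standard recipe (roughly 15% of tokens become prediction targets; of those, 80% are replaced with a mask token, 10% with a random token, and 10% are left unchanged) — the function and token names below are illustrative, not this repo's actual API:

```python
import random

def mask_tokens(tokens, vocab, p=0.15, seed=0):
    """BERT-style masking over a token list.

    Returns (inputs, labels): positions chosen as prediction targets keep
    their original token in `labels`; all other label entries are None, so
    the loss is computed only on masked positions.
    """
    rng = random.Random(seed)
    inputs, labels = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < p:
            labels[i] = tok              # model must recover the original token
            r = rng.random()
            if r < 0.8:
                inputs[i] = "[MASK]"     # 80%: replace with mask token
            elif r < 0.9:
                inputs[i] = rng.choice(vocab)  # 10%: replace with random token
            # else 10%: leave the token unchanged
    return inputs, labels

tokens = "to be or not to be".split()
inputs, labels = mask_tokens(tokens, sorted(set(tokens)), p=0.15, seed=0)
```

Because the encoder attends bidirectionally, masking (rather than next-token shifting) is what prevents each position from trivially seeing its own target.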

Results

Sample output after training on the Shakespeare corpus for a couple of hours on an M2 MacBook:

O God, O God! O God! God! and you do you! boy! hast's servant thee! hoo! and you sink, and see 'twixt out father sin! God, sir. But your sheps; but you did bot see it and your holy poison can you show the town, which? your minds is itself and be out one this: my lord, but this am that I am no brother, such an outs on me! I do allow him. But trouble me. And makEs the tyranny: more are thou the stoody! When is a world, bold you, give it in thyself, but not he's time. Witch, thou tell'st'st a sleepes on tediabl

