atzuur/luma
luma

Warning

This project is still a work in progress.

GPT-2 [1] implementation with manually computed gradients, inspired by karpathy/llm.c and karpathy/nanoGPT. Also features a BPE tokenizer [2]. The plan is to eventually rewrite this in C++ with hand-optimized CPU kernels, targeting small language models on hardware-restricted devices.
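To illustrate what "manually computed gradients" means here, below is a minimal NumPy sketch (not luma's actual code; all names are illustrative) of a hand-derived backward pass for a single linear layer under a mean-squared-error loss, checked against a finite-difference estimate:

```python
import numpy as np

# Illustrative sketch: gradients of loss = mean((xW + b - t)^2),
# derived by hand rather than via autograd.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))   # batch of inputs
W = rng.normal(size=(3, 2))   # weights
b = rng.normal(size=(2,))     # bias
t = rng.normal(size=(4, 2))   # targets

def loss(W, b):
    y = x @ W + b
    return ((y - t) ** 2).mean()

# Manual backward pass via the chain rule
y = x @ W + b
dy = 2.0 * (y - t) / y.size   # d(loss)/dy for a mean reduction
dW = x.T @ dy                 # backprop through the matmul
db = dy.sum(axis=0)           # bias was broadcast, so sum over the batch

# Finite-difference check of one weight gradient
eps = 1e-6
Wp = W.copy()
Wp[0, 0] += eps
numeric = (loss(Wp, b) - loss(W, b)) / eps
print(abs(numeric - dW[0, 0]) < 1e-4)
```

The same pattern (write the forward pass, derive each gradient symbolically, verify numerically) scales up to attention and layer-norm blocks.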

Usage

Requires PyTorch >= 2.6.0. Run python src/gpt.py to train on the Shakespeare dataset and print a sample of generated text.
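Alongside the model, the repo ships a BPE tokenizer. A minimal sketch of the core training step (this is the textbook algorithm, not luma's actual implementation; names are illustrative): count adjacent symbol pairs, then merge the most frequent pair everywhere.

```python
from collections import Counter

def most_frequent_pair(tokens):
    # Count every adjacent pair of symbols and return the most common one
    return Counter(zip(tokens, tokens[1:])).most_common(1)[0][0]

def merge(tokens, pair):
    # Replace every non-overlapping occurrence of `pair` with its concatenation
    out, i = [], 0
    while i < len(tokens):
        if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == pair:
            out.append(tokens[i] + tokens[i + 1])
            i += 2
        else:
            out.append(tokens[i])
            i += 1
    return out

tokens = list("aaabdaaabac")
pair = most_frequent_pair(tokens)   # ('a', 'a')
tokens = merge(tokens, pair)        # ['aa', 'a', 'b', 'd', 'aa', 'a', 'b', 'a', 'c']
```

Repeating this loop until a target vocabulary size is reached yields the merge table used at encoding time.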

References
