REPOSITORIES
AI agents automatically running research on single-GPU nanochat training
The best ChatGPT that $100 can buy.
LLM101n: Let's build a Storyteller
LLM training in simple, raw C/CUDA
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
Neural Networks: Zero to Hero
Inference Llama 2 in one file of pure C
An LLM Council works together to answer your hardest questions
A tiny scalar-valued autograd engine and a neural net library on top of it with PyTorch-like API
Multi-layer Recurrent Neural Networks (LSTM, GRU, RNN) for character-level language models in Torch
Minimal, clean code for the Byte Pair Encoding (BPE) algorithm commonly used in LLM tokenization.
Deep Learning in JavaScript. Train Convolutional Neural Networks (or ordinary ones) in your browser.
The simplest, fastest repository for training/finetuning medium-sized GPTs.
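The "tiny scalar-valued autograd engine" listed above can be sketched in a few dozen lines. This is a minimal illustration of the idea (reverse-mode autodiff over scalars via operator overloading), not the repository's actual code; the `Value` class and its method names here are assumptions for the sake of the example.

```python
# Minimal scalar autograd sketch: each Value remembers how it was made,
# and backward() applies the chain rule in reverse topological order.
class Value:
    def __init__(self, data, children=()):
        self.data = data          # scalar payload
        self.grad = 0.0           # accumulated d(output)/d(self)
        self._children = children # inputs that produced this node
        self._grad_fn = None      # closure that pushes grad to children

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad   # d(a+b)/da = 1
            other.grad += out.grad  # d(a+b)/db = 1
        out._grad_fn = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad  # d(a*b)/db = a
        out._grad_fn = _backward
        return out

    def backward(self):
        # build a topological order of the graph, then chain rule backwards
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for c in v._children:
                    build(c)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            if v._grad_fn:
                v._grad_fn()

a, b = Value(2.0), Value(3.0)
c = a * b + a   # c = 8; dc/da = b + 1 = 4, dc/db = a = 2
c.backward()
print(a.grad, b.grad)  # 4.0 2.0
```

A PyTorch-like API then grows from this core by adding more overloaded ops (each with its own local-derivative closure) and a small neural-net layer library on top.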
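The Byte Pair Encoding algorithm mentioned in the tokenization entry above is small enough to sketch directly. This follows the classic BPE scheme (count adjacent token pairs, merge the most frequent pair into a new token id, repeat); the function names are illustrative, not the repository's API.

```python
# One round of classic BPE training: count adjacent pairs, merge the winner.
from collections import Counter

def get_pair_counts(ids):
    """Count occurrences of each adjacent (left, right) pair in the sequence."""
    return Counter(zip(ids, ids[1:]))

def merge(ids, pair, new_id):
    """Replace every non-overlapping occurrence of `pair` with `new_id`."""
    out, i = [], 0
    while i < len(ids):
        if i < len(ids) - 1 and (ids[i], ids[i + 1]) == pair:
            out.append(new_id)
            i += 2
        else:
            out.append(ids[i])
            i += 1
    return out

ids = [1, 2, 3, 1, 2]                             # toy token sequence
pair = get_pair_counts(ids).most_common(1)[0][0]  # (1, 2) occurs twice
ids = merge(ids, pair, 256)                       # -> [256, 3, 256]
print(ids)
```

Training a full tokenizer just repeats this loop, assigning a fresh id per merge until the target vocabulary size is reached, and records the merges so encoding can replay them in order.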