Show HN: Z80-μLM, a 'Conversational AI' That Fits in 40KB
Link:
Discussion:
GitHub - HarryR/z80ai: Z80-μLM is a 2-bit quantized language model small enough to run on an 8-bit Z80 processor. Train conversational models in Python, export them as CP/M .COM binaries, and chat with your vintage computer.
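The repository description doesn't show the quantization itself, but the core idea of 2-bit quantization can be sketched in Python. The function names, the symmetric 4-level codebook, and the byte-packing layout below are illustrative assumptions, not the actual Z80-μLM implementation.

    # Minimal sketch of 2-bit weight quantization and packing.
    # Assumes a symmetric 4-level codebook; names and scheme are
    # illustrative, not the repository's actual code.
    import numpy as np

    def quantize_2bit(weights: np.ndarray):
        """Map float weights to codes 0..3 plus a per-tensor scale."""
        scale = np.abs(weights).max() / 1.5 if weights.size else 1.0
        # Reconstruction levels: (code - 1.5) * scale, i.e. -1.5, -0.5, +0.5, +1.5 times scale
        codes = np.clip(np.round(weights / scale + 1.5), 0, 3).astype(np.uint8)
        return codes, scale

    def pack_codes(codes: np.ndarray) -> bytes:
        """Pack four 2-bit codes per byte for a compact binary blob."""
        flat = codes.ravel()
        pad = (-len(flat)) % 4
        flat = np.concatenate([flat, np.zeros(pad, dtype=np.uint8)])
        groups = flat.reshape(-1, 4)
        packed = (groups[:, 0]
                  | (groups[:, 1] << 2)
                  | (groups[:, 2] << 4)
                  | (groups[:, 3] << 6))
        return packed.astype(np.uint8).tobytes()

    if __name__ == "__main__":
        w = np.random.randn(256, 64).astype(np.float32)
        codes, scale = quantize_2bit(w)
        blob = pack_codes(codes)
        print(f"{w.nbytes} float bytes -> {len(blob)} packed bytes (scale={scale:.4f})")

Packing four weights per byte is what makes the 40KB budget plausible: a 16K-parameter layer shrinks from 64KB of float32 to 4KB of packed codes, leaving room for the decoder loop in the CP/M .COM binary.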