gpt-oss
Open-weight model family (20b and 120b variants).
The gpt-oss-20b model delivers similar results to OpenAI o3‑mini on common benchmarks and can run on edge devices with just 16 GB of memory, making it ideal for on-device use cases, local inference, or rapid iteration without costly infrastructure.
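For local inference, one option is to serve the 20b weights with a local runtime and talk to it through an OpenAI-compatible endpoint. A minimal sketch, assuming an Ollama-style server on localhost:11434 that exposes the model as gpt-oss:20b (the base URL and model tag are assumptions, not from this note):

```python
from openai import OpenAI

# Assumes gpt-oss-20b is already being served locally through an
# OpenAI-compatible endpoint (e.g. a local server at http://localhost:11434/v1).
# The key is unused by a local server but required by the SDK.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="unused")

resp = client.chat.completions.create(
    model="gpt-oss:20b",  # model tag as exposed by your local server; adjust as needed
    messages=[{"role": "user", "content": "Summarize what an open-weight model is."}],
)
print(resp.choices[0].message.content)
```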
It can work with codex-cli, in theory providing an experience somewhat similar to Claude Code. In practice, I find the experience pretty painful.
Tags: llm
Created: 2026-01-05T16:13:10Z · Updated: 2026-01-05T16:13:10Z