01.AI · Released May 13, 2024

Yi-1.5 34B Chat

The Yi series is 01.AI's open-weight family. Yi-1.5 continues pre-training the original Yi base on an additional 500 billion tokens, followed by fine-tuning on 3 million instruction examples. The 34B Chat checkpoint remains one of the most popular self-hosted models in its size class.

Why it's worth knowing

34B is an unusually practical size: large enough to be genuinely capable, small enough to run on a single 48 GB workstation card at 8-bit or on a 24 GB consumer card at Q4 (at FP16 the weights alone come to roughly 64 GiB, more than a single 48 GB card holds). Among models in that bracket, Yi-1.5 has unusually strong bilingual (Chinese/English) performance and a permissive Apache 2.0 license.
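For a rough sense of the footprint, here is a back-of-envelope calculation. It counts weight memory only; KV cache, activations, and runtime overhead add several more GB, and the 34.4B parameter count is approximate.

```python
# Weight-only VRAM estimate for Yi-1.5-34B at common precisions.
# Ignores KV cache, activations, and framework overhead, which
# typically add several GB on top of these figures.

PARAMS = 34.4e9  # approximate parameter count

for name, bits in [("FP16", 16), ("INT8", 8), ("Q4", 4)]:
    gib = PARAMS * bits / 8 / 2**30
    print(f"{name}: ~{gib:.0f} GiB")

# FP16: ~64 GiB -> needs an 80 GB card or multiple GPUs
# INT8: ~32 GiB -> fits a 48 GB workstation card
# Q4:   ~16 GiB -> fits a 24 GB consumer card
```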

What it's good at

Long-context retrieval (the Yi family shipped some of the earliest open 200K-context variants, and long-document behavior remains a strength), Chinese-language tasks, and serving as a fine-tuning base. The Yi-Coder variant derived from the same family is also respected in its niche. A self-hosting sketch follows below.
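For the Q4-on-24-GB path mentioned above, here is a minimal sketch using Hugging Face transformers. The repo id 01-ai/Yi-1.5-34B-Chat is the official one; the 4-bit settings and the prompt are illustrative, not tuned recommendations.

```python
# Minimal sketch: Yi-1.5-34B-Chat in 4-bit on a single 24 GB GPU.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "01-ai/Yi-1.5-34B-Chat"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # place layers across available devices
    quantization_config=BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_compute_dtype=torch.bfloat16,  # compute dtype is an assumption
    ),
)

# Build a chat prompt with the model's own template.
messages = [{"role": "user", "content": "Give a two-sentence summary of the Yi model family."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```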

What to watch for

It's been superseded by Qwen2.5 on most public benchmarks. Choose it when you specifically want the 34B size, a fully Apache-licensed model, or its long-context behavior.

License

Apache 2.0 — fully permissive.