A Chinese AI startup, Z.ai (formerly Zhipu), has introduced its new open-source large language model, GLM‑4.5, which it says costs less to run than DeepSeek's R1 model.
At the World Artificial Intelligence Conference in Shanghai, Z.ai CEO Zhang Peng announced that GLM‑4.5 is priced at $0.11 per million input tokens and $0.28 per million output tokens, notably lower than DeepSeek’s reported per‑token fees.
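At those rates, per-request costs are simple to estimate. The sketch below is illustrative only, using the announced prices and a hypothetical workload (the token counts are assumptions, not figures from Z.ai):

```python
# Illustrative cost estimate at Z.ai's announced GLM-4.5 rates:
# $0.11 per million input tokens, $0.28 per million output tokens.
INPUT_RATE = 0.11   # USD per 1M input tokens
OUTPUT_RATE = 0.28  # USD per 1M output tokens

def glm45_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost for a workload at the announced rates."""
    return (input_tokens / 1e6) * INPUT_RATE + (output_tokens / 1e6) * OUTPUT_RATE

# Hypothetical workload: 10M input tokens, 2M output tokens
print(f"${glm45_cost(10_000_000, 2_000_000):.2f}")  # → $1.66
```

Even a workload of tens of millions of tokens stays in the low single digits of dollars, which is the scale of savings driving the price war described below.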
This pricing follows a broader trend in China’s AI sector, where aggressive cost competition is reshaping expectations after DeepSeek shocked the industry with its $5–6 million claimed training cost, open‑source launch, and comparatively low operational expenses.
Why the pricing matters
DeepSeek‑R1 garnered attention by achieving performance comparable to GPT‑4‑class models at an estimated fraction of the cost, with some reports citing DeepSeek’s training cost at just $6 million, significantly lower than OpenAI’s GPT‑4 estimates of $50–100 million.
GLM‑4.5 undercuts even these low operating costs, putting pressure on DeepSeek’s lead in budget AI.
China now accounts for more than 1,500 of the 3,755 large language models released worldwide by mid-2025, roughly 40% of the global total.
This boom reflects fierce local competition and state-backed subsidies pushing innovation and driving down prices across the sector.
What sets GLM‑4.5 apart
- Agentic AI design: GLM‑4.5 divides complex tasks into smaller subtasks to improve efficiency and precision.
- Lightweight architecture: runs on just eight Nvidia H20 chips, half the hardware footprint reported for DeepSeek models.
- Open‑source access: GLM‑4.5 is publicly available, enabling developers worldwide to use and customize it without licensing fees.
Zhang Peng emphasized that Z.ai currently has sufficient in‑house computing capacity, removing the need for additional expensive hardware.