Zhipu AI Releases Massive 754B Parameter GLM-5 Model
- Zhipu AI launches GLM-5, a 754-billion parameter model released under the permissive MIT license.
- The model size reaches 1.51TB on Hugging Face, doubling the scale of its predecessor.
- Industry leaders shift terminology from 'Vibe Coding' to 'Agentic Engineering' for AI-assisted software development.
Zhipu AI has unveiled GLM-5, a staggering 754-billion parameter large language model that marks a significant leap in open-source AI capability. Distributed under the permissive MIT license, this release effectively doubles the scale of the previous GLM-4.7 version, weighing in at a massive 1.51TB on Hugging Face. Such immense scale suggests a commitment to pushing the boundaries of what open-weight models can achieve in a global field often dominated by proprietary, closed systems.
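The 1.51TB figure is consistent with the parameter count if the weights are stored in a 16-bit format (2 bytes per parameter), which is common for released checkpoints. A quick back-of-envelope check:

```python
# Sanity check: 754B parameters at 16-bit precision (2 bytes each).
# The 2-bytes-per-parameter assumption is ours, not stated in the release.
params = 754e9
bytes_per_param = 2  # BF16/FP16

size_tb = params * bytes_per_param / 1e12
print(f"{size_tb:.2f} TB")  # ≈ 1.51 TB, matching the reported download size
```

The close match suggests the published repository size reflects 16-bit weights with negligible overhead from tokenizer and config files.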
Beyond the raw hardware requirements, the release highlights a shifting paradigm in how professional developers interact with artificial intelligence. Tech strategist Simon Willison observes that the industry is rapidly moving away from 'Vibe Coding'—a term for informal, intuition-led development—toward 'Agentic Engineering.' This more structured approach treats AI not just as a text generator but as an active, autonomous participant in the software engineering lifecycle.
Initial tests of GLM-5 reveal impressive creative capabilities, such as generating complex SVG code for detailed visual prompts, even if some mechanical details like bicycle frames remain a challenge. By providing such a high-capacity foundation model to the public, Zhipu AI is positioning itself as a central player in the race for sophisticated, accessible tools. The move underscores the growing trend of using massive scale to achieve more reliable, agent-like behavior in autonomous coding tasks.