GLM-5 Debuts 754B-Parameter Open-Source Model On Hugging Face
GLM-5, an MIT-licensed 754 billion-parameter model, has appeared on Hugging Face with a 1.51TB checkpoint, roughly double GLM-4.7's 368B parameters and 717GB size. Observers note Z.ai's push to rebrand professional LLM-assisted development as "Agentic Engineering," a framing echoed by Andrej Karpathy and Addy Osmani. Early testing via OpenRouter produced strong SVG generation but imperfect structural outputs.
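For readers who want to reproduce the early testing mentioned above, a minimal sketch of calling the model through OpenRouter's OpenAI-compatible chat completions endpoint is shown below. The model slug `z-ai/glm-5` is an assumption for illustration; check OpenRouter's model catalog for the actual identifier before use.

```python
import json

# OpenRouter exposes an OpenAI-compatible chat completions endpoint.
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(prompt: str, model: str = "z-ai/glm-5") -> dict:
    """Build the JSON payload for a single-turn chat completion.

    The model slug is a placeholder assumption, not a confirmed ID.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# An SVG-generation prompt of the kind used in the early testing.
payload = build_request("Generate an SVG of a pelican riding a bicycle.")
body = json.dumps(payload)
# Send with e.g.:
#   requests.post(OPENROUTER_URL,
#                 headers={"Authorization": f"Bearer {API_KEY}"},
#                 data=body)
```

The payload shape follows the standard OpenAI chat format, so any OpenAI-compatible client library can be substituted for a raw HTTP call.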
Scoring Rationale
High novelty and industry-wide relevance justify a strong score, but brief, single-source coverage limits verification and depth.
Sources
- GLM-5: From Vibe Coding to Agentic Engineering (simonwillison.net)


