China's GLM-5 Just Claimed the Open-Source Crown. Nobody Noticed.

While you were doom-scrolling about OpenAI drama, Zhipu quietly released the world's strongest open-source model.

7 min read

The Silent Revolution

Here's a fun fact: while Western tech Twitter was having another meltdown about whatever Sam Altman said this week, China just dropped the mic on the entire AI scene. Meet GLM-5, Zhipu AI's latest creation—a 745-billion-parameter beast that's now officially the strongest open-source language model on the planet.

And here's the kicker: barely anyone noticed.

Released on February 11th under an MIT license (yes, proper open-source, not OpenAI's "we'll-call-it-open-because-we-like-the-name" bullshit), GLM-5 sailed straight to the top of Artificial Analysis's benchmarks, leaving Moonshot AI's previous champion in the dust.

The Numbers Don't Lie

Zhipu's stock (trading as Knowledge Atlas Technology in Hong Kong) absolutely exploded—up 30% to HK$405 in a single day. UCloud Tech, which provides computing infrastructure for Zhipu, hit its daily trading limit with a 20% surge. Chinese AI stocks just had their biggest rally in months.

But here's what's really mental: GLM-5 was trained entirely on Chinese Huawei chips. Not a single NVIDIA H100 in sight. While everyone else is scrambling for GPU allocations like it's 2020 toilet paper, Zhipu just proved you can build world-class AI with whatever hardware you've got.

The model boasts record-low hallucination rates, credited to a reinforcement-learning framework called "slime" (don't ask me to explain that one; the write-up's in Chinese anyway). It's available through Z.ai's platform and costs about $0.80-$1.00 per million input tokens. That's cheaper than your morning coffee habit and more useful than most of your Slack conversations.
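For a sense of what that rate means in practice, here's a back-of-envelope sketch using the article's quoted $0.80-$1.00 per million input tokens. The helper function is purely illustrative, not part of any official Z.ai SDK:

```python
def estimate_input_cost(num_tokens: int, rate_per_million: float = 1.00) -> float:
    """Return the USD input cost at a given per-million-token rate.

    The default rate of $1.00/M tokens is the top of the range quoted
    above; swap in 0.80 for the low end.
    """
    return num_tokens / 1_000_000 * rate_per_million

# A hefty 50,000-token prompt at the top of the quoted range:
cost = estimate_input_cost(50_000, rate_per_million=1.00)
print(f"${cost:.2f}")  # prints "$0.05"
```

Five cents for a prompt the length of a short novel chapter. The coffee comparison isn't hyperbole.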

DeepSeek Joins the Party

As if that wasn't enough, DeepSeek decided this was the perfect time to remind everyone they exist too. They just upgraded their flagship model with a context window that jumped from 128,000 tokens to over 1 million, roughly an eightfold increase. Their knowledge cutoff now extends to May 2025, which means it knows things that happened after ChatGPT stopped paying attention.

DeepSeek V4 is dropping February 17th (perfect Lunar New Year timing), and word is it's going to be absolutely savage for coding tasks. They've integrated something called "Engram conditional memory technology" that can efficiently handle contexts exceeding one million tokens. That's enough to swallow your entire codebase and still ask for more.
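That "swallow your entire codebase" claim is easy to sanity-check with the common rule of thumb of roughly 4 characters per token. The heuristic and the repo sizes below are illustrative assumptions, not measurements of any real tokenizer:

```python
CONTEXT_WINDOW = 1_000_000  # tokens, per the upgraded window above

def approx_tokens(num_chars: int) -> int:
    """Rough token count using the ~4 characters-per-token heuristic."""
    return num_chars // 4

# A mid-sized repo: ~2,000 source files averaging ~1,500 characters each.
codebase_chars = 2_000 * 1_500          # 3,000,000 characters
tokens = approx_tokens(codebase_chars)
print(tokens, tokens <= CONTEXT_WINDOW)  # prints "750000 True"
```

By that rough math, a 2,000-file project fits in one prompt with a quarter of the window left over for the model's answer.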

The Independence Play

Here's what the West is missing: this isn't just about better models. This is about technological independence. While Silicon Valley companies are burning through venture capital and fighting over NVIDIA chips like kids over the last biscuit, Chinese companies are quietly building their own ecosystem.

Huawei chips. Chinese training infrastructure. MIT-licensed models that anyone can use, modify, and deploy without asking permission from some Valley oligarch.

The Chinese AI race isn't just accelerating—it's accelerating independently. They're not waiting for export licenses or hoping Biden doesn't ban their favourite chips next week. They're just building.

Wake Up Call

GLM-5's quiet dominance should be a bloody wake-up call. While Western AI labs are playing theatrical politics and posting cryptic tweets about "the coming storm," China is actually shipping products that work.

No drama. No manifestos about artificial general intelligence. No reality distortion fields or messianic founder worship. Just: here's a model that's better than yours, it's open source, and oh, by the way, we built it with chips you tried to ban us from using.

The revolution is happening. It's just not happening where you're looking.

And that 30% stock surge? That's just the market finally waking up to what should have been obvious months ago: the future of AI isn't going to be decided in San Francisco conference rooms. It's being built in Shenzhen labs, one benchmark at a time.

Sweet dreams, Silicon Valley.
