There's been a lot of buzz about Huawei's new AI system, the CloudMatrix 384, and much of it frames the machine as a genuine challenger to NVIDIA. Chinese tech companies are reportedly impressed, and Huawei is clearly all-in on the AI hardware game.
The pitch is self-reliance: Huawei has built an AI server out of in-house technology that's meant to rival NVIDIA's best. According to the Financial Times, about ten major companies have already started using it, though the customer list is being kept quiet. That alone makes it a big deal.
Now for the specs. The CloudMatrix 384 is powered by Huawei's Ascend 910C chips, 384 of them networked into a single system, far more accelerators than NVIDIA packs into its GB200 NVL72, which links 72 GPUs. The idea is to make up for weaker individual chips by throwing many more of them at the problem.
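If you want a feel for how the "more chips" math works out, here's a quick back-of-envelope sketch in Python. The per-chip numbers are my assumptions based on publicly reported estimates (roughly 0.78 PFLOPS of dense BF16 per Ascend 910C and roughly 2.5 PFLOPS per Blackwell GPU), not official spec sheets, so treat the totals as ballpark only.

```python
# Back-of-envelope comparison of accelerator count and aggregate dense
# BF16 compute. Per-chip throughput figures are rough public estimates
# (assumptions), not official specifications.

SYSTEMS = {
    "Huawei CloudMatrix 384": {
        "chips": 384,             # Ascend 910C accelerators in one system
        "pflops_per_chip": 0.78,  # assumed dense BF16 PFLOPS per chip
    },
    "NVIDIA GB200 NVL72": {
        "chips": 72,              # Blackwell GPUs in one NVL72 rack
        "pflops_per_chip": 2.5,   # assumed dense BF16 PFLOPS per GPU
    },
}


def aggregate_pflops(spec: dict) -> float:
    """Total dense BF16 compute if every chip ran at full throughput."""
    return spec["chips"] * spec["pflops_per_chip"]


if __name__ == "__main__":
    for name, spec in SYSTEMS.items():
        print(f"{name}: {spec['chips']} chips, "
              f"~{aggregate_pflops(spec):.0f} PFLOPS dense BF16")
```

On those assumed figures, the Huawei system trades per-chip performance for sheer quantity and still comes out ahead on raw aggregate throughput.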
The trade-off is power: the system reportedly draws far more electricity than NVIDIA's comparable rack. Huawei seems fine with that, since the priority is what it can build itself, not being the cheapest or most efficient option.
And the price tag is steep: a single system reportedly costs around $8 million. But that's the point. This isn't about undercutting NVIDIA on cost; it's a "look what we built, entirely on our own" statement, and a bold one at that.
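To put the power and price points side by side, here's another rough sketch. Every figure in it is an assumption pulled from public reporting (roughly 560 kW and $8 million for the CloudMatrix 384 versus roughly 145 kW and $3 million for a GB200 NVL72 rack), so the per-PFLOP numbers are illustrative, not official.

```python
# Rough efficiency sketch: power and cost per unit of dense BF16 compute.
# All figures below are publicly reported estimates (assumptions),
# not official numbers from Huawei or NVIDIA.

SYSTEMS = {
    "Huawei CloudMatrix 384": {
        "pflops": 300.0,         # ~384 chips x 0.78 PFLOPS each (assumed)
        "power_kw": 560.0,       # approximate reported system draw
        "price_usd": 8_000_000,  # approximate reported price
    },
    "NVIDIA GB200 NVL72": {
        "pflops": 180.0,         # ~72 GPUs x 2.5 PFLOPS each (assumed)
        "power_kw": 145.0,       # approximate reported rack draw
        "price_usd": 3_000_000,  # approximate reported price
    },
}

if __name__ == "__main__":
    for name, s in SYSTEMS.items():
        watts_per_tflop = (s["power_kw"] * 1_000) / (s["pflops"] * 1_000)
        usd_per_pflop = s["price_usd"] / s["pflops"]
        print(f"{name}: {watts_per_tflop:.2f} W per TFLOP, "
              f"${usd_per_pflop:,.0f} per PFLOP")
```

On these assumed figures, the Huawei system lands at roughly twice the power and around 1.6 times the price per unit of compute, which fits the "expensive power hog, but entirely home-grown" framing.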
So, yeah. Interesting times.