

March 28, 2024

Nvidia’s Blackwell: big news at a big cost

The new architecture’s step-change in power could usher in sweeping gen AI advancements—but mainly at the highest echelons of the technology field.


In the news

Last week, Nvidia rolled out a new computing platform. Named Blackwell, the platform is highlighted by the B200, a new 208-billion-transistor chip said to blow away Nvidia’s already best-in-class offerings.

Some might dismiss the product announcement as hype, given that the information on the B200’s specs and potential comes from the vendor itself, whose rapid growth in reputation and valuation ($2 trillion and counting) has tracked the surge of interest in artificial intelligence.

However, Blackwell is one product rollout that’s worth exploring: if its performance comes anywhere near Nvidia’s claims, the platform could usher in the next wave of generative AI.

Those claims are impressive. Nvidia says that when training gen AI models, Blackwell delivers 2.5 times the performance of Hopper, its current-generation architecture. The efficiency gains are even more striking: Nvidia claims the B200 can cut power consumption by as much as a factor of 25—an important consideration in the massive data centers that will use it.

The anticipated impact of Blackwell was underscored by Nvidia’s confidence that it will “help unlock breakthroughs” in the relentless progression of generative AI. The architecture, Nvidia says, will enable trillion-parameter models, dwarfing today’s models, which top out in the billions of parameters.

But can a product announcement, no matter how impressive, possibly be as important as many are now saying? Or is Nvidia sitting in first class on the generative AI hype train?

The Cognizant take

While Blackwell may indeed push the state of the AI art, it’s up for debate whether anybody but the top technology companies will benefit. “These chips are going to cost a bloody fortune,” notes Duncan Roberts, a Senior Manager at Cognizant Research. (Nvidia says B200s will run $30,000 to $40,000 a pop—we’d say that qualifies as a bloody fortune.)

At that price, Roberts says, buyers will primarily be “the same old folks with the same old power.” It’s an issue, he adds, because “true innovation more often than not demands a wide ecosystem.”

A promising area for innovation lies in the challenge of doing more with less. “Smaller LLMs on cheaper hardware—now, that’s interesting,” Roberts says. “Efficiency and innovation will come as people get less-expensive hardware running the sorts of models we see on big hardware now.” Google has gone this route recently, introducing a pair of relatively small, open-source AI models intended for such modest tasks as chatbots and summaries.
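
To make “smaller LLMs on cheaper hardware” concrete, here is a minimal sketch of what running a compact open model looks like in practice, assuming the Hugging Face transformers library and a small open-weight model such as Google’s Gemma 2B (the model ID, precision setting, and prompt below are illustrative choices, not details from the announcement):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative small open-weight model (~2B parameters); access requires
# accepting Google's terms on the Hugging Face hub.
model_id = "google/gemma-2b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half-precision weights keep memory use low
    device_map="auto",           # needs the accelerate package; falls back to CPU
)

# The kind of modest task the article mentions: a short summary.
prompt = "Summarize in one sentence: Nvidia announced its Blackwell platform and the B200 chip."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

At roughly two billion parameters in half precision, a model of this size fits on a single consumer-grade GPU, which is exactly the class of hardware Roberts is pointing to.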

To be sure, Blackwell can and will find plenty of takers and plenty of use cases, including important ones. “For big-horsepower needs like drug discovery, it’s cool,” Roberts says. “But you’d also like to see somebody use these chips to do other things that are new and exciting.”



Tech to Watch Blog
Cognizant’s weekly blog

Understand the transformative impact of emerging technologies on the world around us as they address our most significant global challenges.

editorialboard@cognizant.com


