NVIDIA B100 "Blackwell" AI GPU Technical Details Leak Out


Sunday, March 17th 2024


by T0@st

Jensen Huang's opening GTC 2024 keynote is scheduled for tomorrow afternoon (13:00 Pacific Time), and many industry experts believe that the NVIDIA boss will take the stage and formally introduce his company's B100 "Blackwell" GPU architecture. An enlightened few, including Dell COO Jeff Clarke, have been treated to preview (AI and HPC) units, yet no pre-introduction leaks have surfaced. Team Green is likely enforcing strict confidentiality conditions on a fortunate selection of trusted evaluators drawn from its pool of ecosystem partners and customers.

Today, a brave soul has broken that silence: tech tipster AGF/XpeaGPU, who fears repercussions from the leather-jacketed one, revealed a handful of technical details a day prior to Team Green's highly anticipated unveiling: "I don't want to spoil NVIDIA B100 launch tomorrow, but this thing is a monster. 2 dies on (TSMC) CoWoS-L, 8x8-Hi HBM3E stacks for 192 GB of memory." They also crystal-balled an inevitable follow-up card: "one year later, B200 goes with 12-Hi stacks and will offer a beefy 288 GB. And the performance! It's... oh no Jensen is there... me run away!" Reuters has also joined in on the fun, with some predictions and insider information: "NVIDIA is unlikely to give specific pricing, but the B100 is likely to cost more than its predecessor, which sells for upwards of $20,000." Enterprise products are expected to arrive first, possibly later this year, followed by gaming variants months later.
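For context, the leaked capacities are consistent with standard 24 Gbit (3 GB) HBM3E DRAM dies; the per-die density below is our assumption for this back-of-the-envelope check, not something stated in the leak:

```python
# Sanity-check of the leaked memory figures, assuming standard
# 24 Gbit (3 GB) HBM3E DRAM dies per layer (our assumption).
GB_PER_DIE = 3  # 24 Gbit HBM3E die = 3 GB

def hbm_capacity_gb(stacks: int, layers_per_stack: int) -> int:
    """Total HBM capacity for a given stack count and stack height."""
    return stacks * layers_per_stack * GB_PER_DIE

print(hbm_capacity_gb(8, 8))   # B100 rumor: 8 stacks of 8-Hi -> 192
print(hbm_capacity_gb(8, 12))  # B200 rumor: 8 stacks of 12-Hi -> 288
```

Both results line up with the tipster's 192 GB and 288 GB figures.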

Sources: AGF Tweet, VideoCardz, Reuters, Wccftech

#1
Antique4106

I wonder what AMD's response to this will be.. Well, time to get ready for whatever the MI350X brings to the table, because this is definitely going to require something new for AMD to compete with it. The current MI300X is a bit better than the H200, if I remember correctly.

#2
Solid State Soul ( SSS )

If i sip a drink every time i read the word Ai, i would die by the end of the day

#3
Space Lynx

Astronaut

Solid State Soul ( SSS )If i sip a drink every time i read the word Ai, i would die by the end of the day

I still think Ai is indeed overrated, and a huge market bubble that is going to collapse. CoPilot with gpt-4 was just giving me nonsense yesterday, then it showed me some ads that I will never in a million years click on.

Dumb as f*ck hype train. I think it has potential, but it's definitely not there yet. I think Nvidia stock is a huge huge bubble and its going to pop within 2 years when the hype train realizes there is no money being made from it. It will take time for the pop to happen though, as humans are subject to hype train nonsense.

#4
Onasi

@Space Lynx
I’ve got this feeling of deja vu. Like we’ve been in this situation before… hmm… oh wait, it was in the two previous crypto bubbles! After the first one NV even had to explain to their shareholders why the growth just suddenly stopped. And the only reason they have come out of the second one just fine is that we’ve essentially transitioned from crypto to AI now.
To be fair, compute is compute and there is always demand for more. So not like NV will just tank. But I fully expect the current feeding frenzy to subside. Especially when, inevitably, specialized hardware will replace GPGPUs for the task, as happened with mining.

#5
beedoo
Space LynxI still think Ai is indeed overrated, and a huge market bubble that is going to collapse. CoPilot with gpt-4 was just giving me nonsense yesterday, then it showed me some ads that I will never in a million years click on.

Dumb as f*ck hype train. I think it has potential, but it's definitely not there yet. I think Nvidia stock is a huge huge bubble and its going to pop within 2 years when the hype train realizes there is no money being made from it. It will take time for the pop to happen though, as humans are subject to hype train nonsense.

To be fair, AI should be great for specific use-cases. I'd be hopeful that AI can be used to improve CPU/GPU designs to squeeze more performance and reduce power requirements - assuming the current model is something that will continue for a while.

As a software engineer, I'm not convinced by AI yet - whilst many colleagues of mine want in with absolutely no idea what they need it for. The world has a number of problems to solve, so why not get AI working on them? Chances are it will come up with something, although it's likely 'it' still has much to learn.

#6
Space Lynx

Astronaut

I just haven't seen any AI that has impressed me yet, that's all. Like AI images where a finger is in a cup of coffee, but the rest of the image is mostly ok, it just can't do precision as much as it might try. Until it impresses me I am still saying its a bubble, /shrug

#8
Space Lynx

Astronaut

those who bought a f*ck ton of H200's and still haven't seen any profit generation:


#9
Denver
Antique4106I wonder what AMD's response to this will be.. Well, time to get ready for whatever the MI350X brings to the table, because this is definitely going to require something new for AMD to compete with it. The current MI300X is a bit better than the H200, if I remember correctly.

There's an upcoming update expected to include enhancements in memory, bandwidth, or similar improvements. However, it's debatable whether it's necessary considering the H100's price tag of US$30k. The B100, on the other hand, is anticipated to cost around US$40-50k, which could be equivalent to purchasing three MI300X units priced at $15k each. Demand far exceeds production capacity regardless.

#10
ir_cow
Space LynxI still think Ai is indeed overrated, and a huge market bubble that is going to collapse. CoPilot with gpt-4 was just giving me nonsense yesterday, then it showed me some ads that I will never in a million years click on.

We are in the phase now like the early Internet. Everyone is trying to raise money for their company now this time under the AI banner. But AI when applied correctly is far from overrated. I'm not a programer, but ChatGPT can and has written code for me. Just amazing and free.

#11
wolf

Performance Enthusiast

fears repercussions from the leather-jacketed one

For a news post from a staff member, I find this content in the article itself to be in very poor taste.

I expect it from a considerable few in the user base, but not the staff, you guys can do better than that.

#12
bonehead123
beedooThe world has a number of problems to solve

Yep, and 99.999% of them are of the bipedal, upright variety....

But oh well, Skynet will solve that part of the equation rather quickly :D

#13
Space Lynx

Astronaut

ir_cowWe are in the phase now like the early Internet. Everyone is trying to raise money for their company now this time under the AI banner. But AI when applied correctly is far from overrated. I'm not a programer, but ChatGPT can and has written code for me. Just amazing and free.

I think the problem is consistency though? I have heard from several people it gets coding wrong half the time if not more. So, you still have to check the code in a detailed manner, defeating the purpose entirely.

bonehead123Yep, and 99.999% of them are of the bipedal, upright variety....

But oh well, Skynet will solve that part of the equation rather quickly :D

this doesn't scare me, what scares me is all the nuclear talk lately, when we went a solid 30-40 years thinking such a scenario was impossible. Strange times indeed.

#14
Denver
ir_cowWe are in the phase now like the early Internet. Everyone is trying to raise money for their company now this time under the AI banner. But AI when applied correctly is far from overrated. I'm not a programer, but ChatGPT can and has written code for me. Just amazing and free.

That's what usually happens, laypeople in a specific area being impressed by "AI" doing something below mediocre that they don't have the knowledge to notice, try throwing in a couple of questions from an area you've mastered and your eyes will open at the ton of grotesque errors. :')

#15
Space Lynx

Astronaut

DenverThat's what usually happens, laypeople in a specific area being impressed by "AI" doing something below mediocre that they don't have the knowledge to notice, try throwing in a couple of questions from an area you've mastered and your eyes will open at the ton of grotesque errors. :')

ya I catch CoPilot with gpt-4 getting sh*t wrong all the time, and I call it out, and then its like oh sorry you are correct actually. lol

#16
A&P211

beedooTo be fair, AI should be great for specific use-cases. I'd be hopeful that AI can be used to improve CPU/GPU designs to squeeze more performance and reduce power requirements - assuming the current model is something that will continue for a while.

As a software engineer, I'm not convinced by AI yet - whilst many colleagues of mine want in with absolutely no idea what they need it for. The world has a number of problems to solve, so why not get AI working on them? Chances are it will come up with something, although it's likely 'it' still has much to learn.

I've heard rumors of AI doing basic math problems at the 2nd-3rd grade level.

Is AI general intelligence close, if you know?

#17
FoulOnWhite

We need AI because humans have none

Yikes seems like a monster Nvid product again though.

#18
ir_cow
Space LynxI think the problem is consistency though? I have heard from several people it gets coding wrong half the time if not more. So, you still have to check the code in a detailed manner, defeating the purpose entirely.

Sure. It's all based on models that need to be trained. Having these big models that cover everything will inevitably produce errors. But for code it either works or doesn't. It might not be optimized either, but it works and, best part, it is free. I'd happily pay for a trained model for coding in X language.

The money is going to be in specialties, not this general AI going on right now. But you need to learn to walk before running :)

#19
Keullo-e

S.T.A.R.S.

I can't even imagine how detailed waifu pics this can generate. :)

#20
ZoneDymo
wolfFor a news post from a staff member, I find this content in the article itself to be in very poor taste.

I expect it from a considerable few in the user base, but not the staff, you guys can do better than that.

You need to work on reading comprehension, it seems; it's a joke playing off the joke in the last line of the Twitter post... come on, man.

#21
Onasi
A&P211Is AI general intelligence close, if you know?

About as close as nuclear fusion energy. Many decades away, if ever. What we have now is "AI" really in marketing terms only.

#22
the54thvoid

Intoxicated Moderator

wolfFor a news post from a staff member, I find this content in the article itself to be in very poor taste.

I expect it from a considerable few in the user base, but not the staff, you guys can do better than that.

I, for one, am very reassured that the leather-jacketed one works for Nvidia. It makes them cooler.


Ayyyyyyy!

Let us not forget American history and the church of the Fonz - this is the ONLY leather-jacketed one. Amen.

#23
Jism
ir_cowWe are in the phase now like the early Internet. Everyone is trying to raise money for their company now this time under the AI banner. But AI when applied correctly is far from overrated. I'm not a programer, but ChatGPT can and has written code for me. Just amazing and free.

Have you even read the small letters to always, check / test / verify the output code?

I would not want to give something to anyone using ChatGPT without any knowledge of the actual code.

#24
ir_cow
JismHave you even read the small letters to always, check / test / verify the output code?

I would not want to give something to anyone using ChatGPT without any knowledge of the actual code.

Who says it isn't for personal use :). I'm not launching a rocket.

I would pay for an AI program that will cook with me in real time, giving advice or explaining how to cook meals I have never thought of or made before.

General models are all we have right now, but soon specialty AI will be the final stop. It's there to assist you, not replace you.

The trick right now is to tell ChatGPT in small chunks what you want to code. It does this very well. You can't say "write me a game engine," but you can ask for some logic to be written in C++.

#25
Jism

Well, I can think of a few areas where AI can dominate or be extremely talented.

It's just the very early years - wait until you can have AI develop your next CPU/GPU without any human intervention, even. A zillion possibilities, really.
