Nvidia CEO Jensen on competition from AMD

1 : Anonymous2021/11/18 09:16 ID: qwlxla
Nvidia CEO Jensen on competition from AMD
2 : Anonymous2021/11/18 12:54 ID: hl45ee7

Wtf is he actually saying?!

ID: hl4c34x

Basically, his main points are:

* Every year some people say there is an "Nvidia killer". Nvidia takes it as intense, serious competition.
* But hardware is not enough if you are behind in software and ecosystem.

ID: hl5nu09

Hardware is not enough? AMD won the Exascale Supercomputer contracts, and the national labs are handling the software with AMD.

ID: hl621rj

I mean the 2nd one is partially true

ID: hl6nkml

He's right.

ID: hl55r69

As little as possible in as many words as possible.

Edit: goodness gracious!

-Square-3716 PM'd me quite a roast regarding my lack of formal education in this space. I'm "a dumbfuck reddit user that holds no education or important position," according to them. I was merely making light of the fact that companies generally try to give up as little information as possible on these earnings calls, since they know their competition eagerly listens in for any clues.

ID: hl6fake

Aw, that's cute.

-Square-3716 is so afraid of downvotes and people seeing how horrible he is that he privately messaged you.

ID: hl6oto8

Hello - sorry to hear that they have privately messaged you. I would recommend reporting them to Reddit and muting them. They have been permanently banned here, though the account is only a few days old so it is likely an alt account.

ID: hl46oof

Exactly the point.

ID: hl4efn8

He is saying that they are still the market leader, since people are out to kill them, not vice versa.

ID: hl4xtpn

When you're #1 in front of the line, all anyone else can see is your back and the target on it.

ID: hl5dfql

I think he said Nvidia doesn't have an answer to the MI200, but it doesn't matter.

On the hardware front, I would think that if the A100's successor were a win versus the MI200, he would have used different wording, even if he wanted to keep their response secret.

On the software front, he's right, it doesn't matter. AMD has a lot of work to do to crack this nut. Hardware wins are an important step, but that's just showing up to play.

ID: hl88sp5

He's right though.

He's just repeating the meme at this point: every time AMD announces a new GPU, the world calls it an "Nvidia killer", but Nvidia just set new records with over $7 billion in quarterly revenue and continues to grow.

ID: hl77aiz

The interviewer (11/16/2021, from OP's link below) is asking Jensen about the competition. To follow the answer, you have to know what the competition is bringing to the table.

Here is what the competition is bringing to the table.

                     Intel                 AMD            NVIDIA
    Product          Ponte Vecchio         MI250X         A100 80GB
    Transistors      100 B                 58.2 B         54.2 B
    FP64 Vector      unknown               47.9 TFLOPS    9.5 TFLOPS
    FP64 Matrix      unknown               95.7 TFLOPS    19.5 TFLOPS
    VRAM Capacity    128 GB                128 GB         80 GB
    Manufacturing    Intel 7, TSMC N7/N5   TSMC N6        TSMC N7

From Anandtech's article on Sapphire Rapids and Ponte Vecchio, 11/15/2021.

Jensen says datacenter GPU accelerators only need two things: FP64 FLOPS performance and memory capacity. And cost, of course.

Comparing the competition in the table above, we can see that Nvidia may be behind in a couple of those metrics (the MI250X's 47.9 FP64 vector TFLOPS is roughly 5x the A100's 9.5), so competition is fierce. Nvidia also has to share TSMC capacity with AMD, Apple, Qualcomm, Intel, and others on TSMC's N7, N7+, N7P, and N5; it is not all available to Nvidia, or they would need to pay a higher premium.

While we don't know what FP64 performance Ponte Vecchio will bring to the table, we can already see that it will be manufactured on advanced nodes and is expected to have nearly double the transistor count. So we can expect high FP64 FLOPS from it when it releases in 2022.

So that answers the first part of Jensen's response. The second part of what Jensen talked about is Moore's law, which predicts that the number of transistors in a dense integrated circuit roughly doubles about every two years.
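
To make that doubling concrete, here is a tiny back-of-the-envelope sketch in Python (my own illustration, not from the interview; the starting figure is the A100 count from the table above, and the two-year cadence is the classic rule of thumb, not any vendor's roadmap):

    # Moore's-law projection: transistor count doubling roughly every two years.
    def project_transistors(start_count, years, doubling_period=2.0):
        return start_count * 2 ** (years / doubling_period)

    a100 = 54.2e9  # A100 transistor count, from the Anandtech table above
    for years in (2, 4, 10):
        print(f"+{years} yr: {project_transistors(a100, years) / 1e9:.0f} B transistors")
    # +2 yr: 108 B, +4 yr: 217 B, +10 yr: 1734 B -- growth like this is
    # exactly what a hard physical floor on feature size eventually stops.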

But with this constant march of progress, we will eventually hit a real physical limit on how many transistors we can pack into a dense integrated circuit. A silicon atom is only about 0.1 nm across, and I don't think we can create transistors smaller than an atom, let alone transistors the size of one.

Intel has announced 20A by 2025, with "Intel 20A" replacing what their naming scheme would previously have called 2 nm (20 angstroms = 2 nm). What comes after 2 nm, and beyond? I think this is what Jensen is hinting at: once there is a physical limit on transistor density, the only thing left to differentiate hardware will be software. And currently Nvidia's CUDA is the dominant software in the datacenter GPU space.

edit: a lot of spelling and rephrasing.

3 : Anonymous2021/11/18 13:36 ID: hl4a5mg

Jensen is of course right. Kind of...

They have the software stack and the ecosystem.

But the interviewer is right too. They got to where they are because their hardware was without competition. That has changed now.

Believing the software and ecosystem side cannot change is foolish. Mind you, AMD or Intel will need to work hard to crack that nut. But it's not impossible.

ID: hl4emzt

For HPC the nut is already cracked... and the systems sold. MI200 already takes the cake... it isn't something they need to "work on" in that context; they already did it... and saying otherwise is insulting to AMD's efforts in that sector over the last few years.

Yes, ROCm sucks if you want to run it on your workstation... but they aren't selling workstations, they are selling servers and supercomputers... smaller unit sales of those chips will take an uptick as that ecosystem improves (and they have notably integrated ROCm into the standard driver stack now, so progress is being made even in that direction).

ID: hl5nn74

AMD software is lacking but has gotten better over the years. Still, if someone told me I could have a 6800XT for £599 or the RTX 3080 for £649, I'd take the 3080 for its features, even if it is lacking in VRAM compared to the 6800XT.

ID: hl5oqed

Where I am, getting a 3080 Ti was almost the same price as getting a 6800XT... plus the last 6800XT I got had terrible coil whine and couldn't overclock/undervolt at all.

I downgraded to a 3060 Ti, sold the 6800XT at a £200 loss on what I paid for it, then sold the 3060 Ti to a miner at a £450 profit over its purchase cost. I made back the £200 loss mining on the side before I sold it, so I ended up spending £890 for the 3080 Ti, minus the profits I made.

Really depends on the market.

4 : Anonymous2021/11/18 14:10 ID: hl4e9yh

"If you just give someone an accelerator, what are they going to accelerate?"

His point is that making the hardware itself is relatively simple, but making the software stack that enables the hardware to do things is what's actually hard. It's sort of a backhanded way of saying AMD did the easy part, but still can't compete in the software space compared to Nvidia.

ID: hl4u2fx

Which is somewhat true. Huang realised very early in the game (in the mid 90s) that software is the key. That's why he centralised drivers and bet big on wide adoption of DirectX.

NV isn't just a hardware company, they are a software and services company too.

ID: hl5oj7p

I mean . . . Nvidia was almost killed off as a hardware company in their early days because their hardware didn't support the software being used (something about color support for 8bit vs 16bit or something to that effect). So they went around to companies and begged/convinced them to modify their software in order to support their hardware by eliminating certain colors from their software. Even from the very beginning I'd imagine Huang had a heavy appreciation that software was paramount.

ID: hl540ng

I think I read somewhere that he said at some point that Nvidia is a software company, not a hardware company.

ID: hl5znq7

Actually NV sees themselves more as a SW company now.

ID: hl57bo2

If that were the be-all and end-all, then Nvidia would have won the two exascale-class systems, not AMD. But in reality AMD won, even with "lacking software support", which makes you think: is AMD's software really lacking?

ID: hl5hvjn

That's what Huang meant with:

Actually, I think this is the absolute easiest space

in the interview.

Software for supercomputers is always heavily modified, or written from scratch, to run on that specific hardware. These machines are one-off builds with different architectures and topologies; each organisation that operates one has its own team of low-level programmers to write optimised code for it.

Because of that, the drivers/libraries/frameworks provided by GPU makers are hardly relevant in supercomputing. Either they're not used at all - in favour of lower-level code - or the software stack will be written and optimised around their quirks.

By comparison, normal consumer and workstation apps are written to run generically on a wide range of hardware, across multiple generations, with as little special-case effort as possible. The focus is on minimising developer time and customer support/troubleshooting, not getting the absolute most out of the hardware.

That's where Nvidia have excelled with the CUDA ecosystem and things like PhysX and DLSS. By contrast AMD have been through a whole string of different compute solutions (OpenCL 1, 2, HSA, HIP, ...), none of which has ever been a reliable option for all recent AMD hardware or maintained for more than a few years.
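
For a feel of what that vendor-neutral, write-once style looks like, here is a minimal OpenCL vector-add using the pyopencl bindings (a sketch, assuming an OpenCL runtime and driver are installed; this is exactly the kind of stack AMD has repeatedly started over on):

    # Minimal OpenCL vector add: runs on any device with an OpenCL driver.
    import numpy as np
    import pyopencl as cl

    ctx = cl.create_some_context()       # picks any available OpenCL device
    queue = cl.CommandQueue(ctx)

    a = np.random.rand(1024).astype(np.float32)
    b = np.random.rand(1024).astype(np.float32)

    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    prg = cl.Program(ctx, """
    __kernel void add(__global const float *a,
                      __global const float *b,
                      __global float *out) {
        int gid = get_global_id(0);
        out[gid] = a[gid] + b[gid];
    }
    """).build()

    prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)

    result = np.empty_like(a)
    cl.enqueue_copy(queue, result, out_buf)
    assert np.allclose(result, a + b)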

ID: hl608uw

Generational supercomputer contracts have traditionally been spread among vendors; they are a way to subsidize US tech companies and hedge technological bets using public funds.

Other exascale projects are going to use NVIDIA and Intel accelerators.

A few years ago, Intel got some big DOE/NSF contract wins using Xeon Phis. And we all saw where that went.

5 : Anonymous2021/11/18 13:33 ID: hl49tr8

I have no idea what the hell he's trying to say, lol. Seems like a real runaround answer to me. He could have just led with "yeah, it's a great product, but it doesn't have the software" instead. This is just really nonsensical.

ID: hl4ea6e

Except that would be wrong, and insulting to HPC customers... because at this point it is Nvidia that is behind, because they don't provide an open source solution, which has been a huge selling point for AMD... HPC users have been clamoring for this for at least a decade, and even went so far as to write their own HPC drivers for Nvidia...

ID: hl4zuw2

You mean HPC customers such as Google, which does not provide official support for AMD in its machine learning API? Or such as Meta (Facebook), which also doesn't provide non-beta support for AMD GPUs in its machine learning API? Or maybe HPC customers such as Amazon, which doesn't have any instances with high-end AMD GPUs for heavy workloads?

Reading your comments, you don't have any idea about the HPC market. Open-source code is a nice feature to have, but first you need software that works, has consistent performance, and has a big ecosystem around it. For example, if you want to do machine learning, an AMD GPU is as useful as a brick: ROCm barely works for machine learning, and when it does work it is terribly slow. Nobody is interested in a GPU that requires beta software that may or may not work, and may or may not be fast if it works, even if that software is open source. People at Google, Amazon, Meta... have better things to do than write a CUDA clone that works for AMD; they can just buy Nvidia GPUs.
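
To make "go CUDA or go home" concrete: the standard device-selection idiom in PyTorch is spelled in terms of CUDA, and even AMD's ROCm builds of PyTorch expose the GPU through the same torch.cuda namespace. A minimal sketch:

    # Typical PyTorch hardware selection: the fast path is spelled "cuda"
    # throughout the ecosystem. (ROCm builds of PyTorch reuse this same
    # torch.cuda API, which itself shows how CUDA-centric the tooling is.)
    import torch

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    model = torch.nn.Linear(128, 10).to(device)  # move weights to the GPU if present
    x = torch.randn(32, 128, device=device)      # allocate inputs on the same device
    print(model(x).shape, "running on", device)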

6 : Anonymous2021/11/18 21:40 ID: hl69edp

This is the first time I've seen Jensen acknowledge AMD's competition in this way.

7 : Anonymous2021/11/18 09:16 ID: hl3nqc8
8 : Anonymous2021/11/18 13:49 ID: hl4bmiu

He basically thinks that we will eventually reach a limit on how small transistors can get and how much performance we can get out of silicon, and then the rest will be only software. If that happens, the only way to get more performance is to put more GPUs in a system, not better or faster ones; and once that happens, the hardware performance differences will be small. It's all going to be software.

He is right in a way: every new GPU or CPU is basically an improvement on the last one, with maybe a few new instructions and smaller transistors, accompanied by other new tech like memory and interfaces. The software side of things doesn't get enough time for polish because it's not needed; why optimise drivers and software for a piece of hardware for five years if something else is already coming next year?

He is right, but also I think he is a bit naive if he thinks that we are close to that point in time.

ID: hl4egtd

Very good explanation. But sadly node shrinks are coming to an end. No new tech has been invented yet that will replace current tech.

So, software and ai seems to be the only way for better performance in coming times.

ID: hl4gstk

I am not so sure about that, though. Even without node shrinks, there are still a lot of improvements possible in hardware design. Just recently AMD proved that they can improve their Zen cores on the same node; look at Zen 2 vs Zen 3, and that took only one or two years. Now imagine if all the focus were just on that.

Transistor counts and their switching speeds have been pushing development to new heights; every two years we have been getting a huge performance boost, except where a company systematically dropped the ball (Intel). However, the microcode, architecture, and underlying systems have basically just been improved bit by bit, not in a major way. We still use the x86 instruction set, for god's sake, and how old is it? More than 40 years?

Now imagine a world where, say, 1 nm is standard, everyone has it, yields are almost 100%, and we can't go lower. This is where actual hardware design comes into play, along with software and driver support. I would guess this is one reason why AMD is also interested in Xilinx and FPGAs, and they have been dabbling in quantum computing as well. I believe Jensen is a bit naive here, or maybe he is looking for excuses, because for the first time in a long while he has real competition.

ID: hl5imhe

It looks like there will be pretty significant process advances over the next 5 years. Add on top of that a further push into multi chiplet processors, and there should be very large increases in hardware performance over the short and mid term. I don't have any problem seeing a potential ~10x improvement from where we are now.

I mean, it did look like process tech was hitting a wall a few years ago. But not currently; there is a lot on the roadmap that looks quite achievable in the near and mid term.

Long term? Yeah, something is going to have to replace silicon. I think it's a good bet that something will be found. It would be extremely hard for me to believe that humanity is anywhere close to the limit of what is possible. Breakthroughs can happen tomorrow, or 100 years from now; they are not predictable.

ID: hl6i34s

Look at TSMC's roadmap... AMD is just hitting 5 nm, but it's been out for years.

4 nm is on track for next year, and 3 nm for Q1 2023. That's a full node ahead of where AMD is heading.

2 nm (another full node) should arrive in 2025.

Node shrinks are still here for at least another three years.

9 : Anonymous2021/11/18 10:20 ID: hl3s6k7

He has a point. ROCm is still not mature and has little adoption.

ID: hl4f5ib

It's not mature for workstations... it IS mature for HPC, which is where those cards sell.

10 : Anonymous2021/11/18 12:48 ID: hl44syv

The "but it's two chips" excuse is lame.

Look at number of transistors.

The MI200 is within 10% of the A100's transistor count.

It's just smarter design (and a different performance focus too, heavier on FP64).

ID: hl4aevx

The "but it's two chips" excuse is lame.

That was said by the interviewer; why would it be an excuse?

ID: hl55rde

The "but it's two chips" excuse is lame.

There was no 'excuse' being dealt out here. It was just descriptive.

As usual, y'all are in endless persecution mode.

ID: hl4u79t

Aren't the wattage numbers what's more important? 560 W versus 400 W, so not really double, but still an increase. Given that it's also a new generation 1.5-2 years later, it doesn't look too impressive. But yes, king in FP64 as of now, indeed. FP16 is barely a 20% increase for a 40% power increase, meh.

Still, a very good job by AMD; they are getting much closer, at least in hardware.

11 : Anonymous2021/11/18 15:26 ID: hl4onrf

Software is definitely Nvidia's strong point. CUDA nowadays has total dominance in many parts of the market. This is not an exaggeration: it's go CUDA or go home.

Nobody has managed to break this dominance so far, and several companies have tried, for many years.

ID: hl5pwhc

Blender ditched OpenCL in the latest version, and the only things left are CUDA and CPU rendering.

AMD did offer a ROCm-based solution, but that thing only runs on RDNA2.

In research, no one willingly uses AMD GPUs, because CUDA was and still is the easiest, best-documented API there is.

12 : Anonymous2021/11/18 13:50 ID: hl4bspk

OK yeah, so the scary part of competition is not knowing what your competitors are up to and how far they will jump with their next product. Nvidia doesn't want to get stuck in a rut like Intel did, and they know competitors can catch up, as AMD has proved with CPUs.

As for the AMD dig, that was kind of funny coming from the reporter. The two chips are on the same package, so it doesn't really matter that there are two of them; it really helps increase the transistor density you can fit in a single server.

13 : Anonymous2021/11/18 13:34 ID: hl49v8n

MINOR INTENSE

14 : Anonymous2021/11/18 14:49 ID: hl4jerh

Can confirm as the former owner of a 6900xt, the "Nvidia killers" never quite live up to the hype.

15 : Anonymous2021/11/18 11:09 ID: hl3vv0x

It sounds like the ramblings of a crazy person.

16 : Anonymous2021/11/18 19:35 ID: hl5qjvw

What did Intel and Nvidia do when they achieved a comfortable leadership in hardware tech? They switched focus to software.

People are saying that software is the hard part. That's laughable to me. The talent pool for chip hardware R&D/engineering is far smaller than the software dev pool.

What history has shown (with Intel, Nvidia, and AMD) is that software for your specific hardware won't matter if your hardware can't compete. That's why it's not surprising to see AMD starting to catch up on the software front now, in both the CPU space and the GPU space. And it's pretty clear to me that ML is next on AMD's hit list, given how much they bring it up of their own volition when talking about the future.

Nvidia had the luxury of at least three GPU generations of hardware leadership that allowed them to shift focus to software. But lately AMD seems to be doing excellently on all fronts, and I suspect we will see them catch up on the software front.

Hardware is the bait and software is the hook that won't let go; it's what keeps you wanting to stay on their hardware.

18 : Anonymous2021/11/18 15:37 ID: hl4qcnq

First off, Jensen speaks incredibly poorly... second, even if he were being crystal clear, I doubt AMD diehards would get the hint.

19 : Anonymous2021/11/18 11:13 ID: hl3w6n3

Old clip of Jensen talking about Moore's law.

20 : Anonymous2021/11/18 12:06 ID: hl40n0m

I like the part where he says GPUs should be like Gucci bags.

21 : Anonymous2021/11/18 17:40 ID: hl58x0h

At least the price is already there.

22 : Anonymous2021/11/18 10:39 ID: hl3tkfl

Lol, two GPUs. Denial is strong in that one.

23 : Anonymous2021/11/18 13:40 ID: hl4an72

Do you realize Jensen didn't speak about the two GPUs at all? It was the guy who asked the question.

24 : Anonymous2021/11/18 14:21 ID: hl4fniw

But why was it even brought up? It isn't a valid criticism of CDNA2 or RDNA3... neither of which will see much negative impact from the multi-chip solution.

25 : Anonymous2021/11/18 17:08 ID: hl543wh

Do you realize I didn't say a word about the deer leader?

26 : Anonymous2021/11/18 13:10 ID: hl473bw

Why would an interviewer be in denial?

27 : Anonymous2021/11/18 17:11 ID: hl54gxi

I don't know. Probably for reasons similar to why the pricks at Intel were spewing bullshit about "glued-together" Zen. It was nice of INTC to give them a pink slip, though.

28 : Anonymous2021/11/18 14:20 ID: hl4fhv9

Why else... the interviewer is a shill.

Two GPUs is not a disadvantage for HPC; it's a density advantage... they don't care as much about performance per GPU as they do about performance per rack.

29 : Anonymous2021/11/18 21:03 ID: hl63zre

He's right

30 : Anonymous2021/11/18 22:46 ID: hl6j2h9

Here is a petition to NVIDIA to continue providing driver support for the GTX 600 and 700 series.

31 : Anonymous2021/11/18 09:53 ID: hl3qbvf

Still not buying Nvidia cards, go figure... Jensen.

32 : Anonymous2021/11/18 10:39 ID: hl3tlre

Ya see what I mean? This dude shows up here to post only Nvidia stuff, then says "idk what you mean, I'm here for competition because it helps us!" and lies about the multiple accounts he runs on Reddit to troll.

33 : Anonymous2021/11/18 10:41 ID: hl3tqa1

Both my PCs are AMD.

Did you even read it?

The Nvidia CEO said AMD is providing serious competition.

Also, this business is not for weak-hearted people, as anything can happen.

34 : Anonymous2021/11/18 13:48 ID: hl4bjcs

Why are you posting this like it's some sort of surprise?

AMD, and formerly ATI, has always been a huge competitor in the consumer GPU market and has topped Nvidia multiple times.

Shit, I remember when I got my HD 5870 for the eyefinity, and Nvidia's top card at the time was the 9800 GTX, I think. Anyway, the 5870 destroyed it.

I remember articles talking about how Nvidia rushed out the X2, basically two 9800s in SLI on a single card, to compete with the 5870 and 5890.

35 : Anonymous2021/11/18 11:10 ID: hl3vyhm

And you denied having multiple Reddit accounts, but there was proof showing you lied. What's up with that? Weak-hearted people can't admit that they are a long-time troll with many accounts?

Did you even read it?

Did you see the proof? Why'd ya lie, if it's just business?

36 : Anonymous2021/11/18 14:48 ID: hl4j8th

What

My brain is off, it's late. I'll read it tomorrow, cheers.

37 : Anonymous2021/11/19 01:50 ID: hl77i2x

The industry always asks who will have the "Nvidia killer". The real question someone should ask is, "Which alternative GPU will give Nvidia good competition?" In my opinion, the Navi 21 XTXH made them rethink some of their strategies, and Intel's GPUs will present real marketing difficulties for Nvidia's dominance.

There is no killer. But there are market dominance disruptors.

38 : Anonymous2021/11/19 17:54 ID: hla4xt6

I got real lucky finding an AMD GPU at an honest price, but it shocks me how high prices for AMD products are in some countries. Why is that? Is it because AMD CPUs are being snatched up by the new CPU-based crypto farmers?

39 : Anonymous2021/11/19 20:08 ID: hlaqltk

*your operations

40 : Anonymous2021/11/18 14:39 ID: hl4i0qb

This reads a lot better if you read it in Trump's voice. Really gives you the subtext of the words.

41 : Anonymous2021/11/18 16:05 ID: hl4uk3i

Jensen is so far from Trump you couldn't be more wrong. He built the most valuable semiconductor company on the planet.

42 : Anonymous2021/11/19 11:03 ID: hl8p3ca

I wasn't saying he was like Trump, just that the wording sounded like it: very choppy, almost like it jumped from thought to thought.

43 : Anonymous2021/11/18 18:03 ID: hl5cj4b

What would make an AMD card an Nvidia killer is if their drivers matched or beat the competing cards' drivers at launch and throughout the product's lifespan.

I love my 6900 XT, but it's had its fair share of driver issues.

44 : Anonymous2021/11/18 22:55 ID: hl6kdan

Sorry to all AMD fans, but the supercomputer market is Nvidia's, and they won it many years ago. Why did Nvidia win it? Because they took it seriously.

The same happened with RTX and DLSS: Nvidia took them seriously. For proof that AMD doesn't take things seriously, just look at the FSR fiasco: Nvidia shipped driver-level Nvidia Image Scaling that works in all games, while FSR requires work from the game publisher.

Nvidia is much better for the end user.

45 : Anonymous2021/11/18 22:51 ID: hl6jrem

My 6900XT joins the chat: ayo bro, I kill your top-tier GPU with 80 watts less.

46 : Anonymous2021/11/19 02:11 ID: hl7a90o

AMD is no slouch when it comes to software; Jensen should get a bit of a reality check. When Microsoft was unwilling to optimise DirectX for draw-call-heavy rendering, AMD simply made their own frickin' API (Mantle), and Microsoft was then forced to update DirectX. When it comes to the user interface, Nvidia's drivers are still stuck in 2003, while AMD pushed for impressive control over the hardware. And that was before the Zen server money started rolling in, when AMD's GPU software team was really small!

47 : Anonymous2021/11/19 08:51 ID: hl8fluy

Yeah, AMD software looks good, but it’s actually pretty unstable and bad TuT

Source: https://www.reddit.com/r/Amd/comments/qwlxla/nvidia_ceo_jensen_on_competition_from_amd/
