zeyus t1_j338cuu wrote

Isn't their continued support one of the selling points for AM5? They supported the previous generation for ages, and they plan to again.

−2

geeky_username t1_j33cic6 wrote

Software.

Having AI compute hardware is rather pointless without the supporting software.

Nvidia has an entire CUDA ecosystem for developers to use.

6

zeyus t1_j33nspg wrote

Absolutely agree. It's been a while since I've had AMD hardware, but I'd consider it again (especially a CPU)... I just haven't been aware of specific issues with software either. I mean, Intel, AMD, and Nvidia have all had bug fixes and patching for drivers and firmware. Is there something I've missed about AMD and software?

BTW, I haven't had enough disposable income to upgrade, so I've been stuck on a 4590K for about six years. I hate my motherboard software (that's Asus bloatware) and had so much trouble getting the NVMe and RAID to work... but once I did, it's been OK. The 1070 I have is getting a bit too small for working with ML/AI, but what can you do... it still runs most newish games too.

2

geeky_username t1_j358k76 wrote

>Is there something I've missed about AMD and software?

They have this https://gpuopen.com/

Which seems great in theory, but some of that hasn't been touched in a long time.

Radeon Rays: May 2021

They'll release something, do a bunch of initial work on it, and then it fades.

1

zeyus t1_j363mxu wrote

Well, that is a genuine shame; Nvidia really needs some competition in this space. I'm sure plenty of researchers and enthusiasts would happily use different hardware (as long as porting was easy). I've written some CUDA C++ and it's not bad. Manufacturer-specific code always feels a bit gross, but the GPU agent-based modeling framework I was using was strictly CUDA.

3

ZaZaMood t1_j39c7ot wrote

Nvidia needs some competition, for real. I can't even consider buying AMD because the entire data science community is pinned to CUDA.

2