> With over 90% of the PC market running on NVIDIA tech, they’re the clear winner of the GPU race. The losers are every single one of us.
I have been rocking an AMD GPU ever since the drivers were upstreamed into the Linux kernel. No regrets.
I have also realized that there is a lot out there in the world besides video games, and getting all in a huff about it isn’t worth my time or energy. But consumers gotta consoooooom, and then cry in outrage when they are exploited instead of just walking away and doing something else.
Same with Magic: The Gathering: the game went to shit and so many people got outraged and in a big huff, but they still spend thousands on the hobby. I just stopped playing MTG.
I want to love AMD, but they're just... mediocre. Worse for gaming, and much worse for ML. They're better-integrated into Linux, but given that the entire AI industry runs on:
1. Nvidia cards
2. Hooked up to Linux boxes
It turns out that Nvidia tends to work pretty well on Linux too, despite the binary blob drivers.
Other than gaming and ML, I'm not sure what the value of spending much on a GPU is... AMD is just in a tough spot.
> I have also realized that there is a lot out there in the world besides video games, and getting all in a huff about it isn’t worth my time or energy.
I'd really love to try AMD as a daily driver. For me CUDA is the showstopper. There's really nothing comparable in the AMD camp.
ROCm is, to some degree and in some areas, a pretty decent alternative. Developing with it is oftentimes a horrible experience, but once something works, it works fine.
> I have also realized that there is a lot out there in the world besides video games, and getting all in a huff about it isn’t worth my time or energy.
I'm with you - in principle. Capital-G "Gamers" who see themselves as the real discriminated group have fully earned the ridicule.
But I think where the criticism is valid is how NVIDIA's behavior is part of the wider enshittification trend in tech. Lock-in and overpricing in entertainment software might be acceptable, but it gets problematic when we see the exact same trends in actually critical tech like phones and cars.
I am not a gamer and don't know why AMD GPUs aren't good enough. It's weird, since both Xbox and PlayStation use AMD GPUs.
I guess there are games that you can only play on PC with Nvidia graphics. That raises the question of why someone would create a game and ignore the large console market.
AMD cards are fine from a raw performance perspective, but Nvidia has built themselves a moat of software/hardware features like ray-tracing, video encoding, CUDA, DLSS, etc where AMD's equivalents have simply not been as good.
With their current generation of cards AMD has caught up on all of those things except CUDA, and Intel is in a similar spot now that they've had time to improve their drivers, so it's pretty easy now to buy a non-Nvidia card without feeling like you're giving anything up.
I have no experience using it, so I might be wrong, but AMD has ROCm, which includes something called HIP that should be comparable to CUDA. I think it also has a way to automatically translate CUDA calls into HIP, so it should work without needing to modify your code.
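From what I've read, the workflow is supposed to be roughly this (a sketch I haven't actually run myself; the file names are made up, and the tools ship with ROCm):

    # Hypothetical example: port a CUDA source file to HIP and build it with ROCm.
    # hipify-perl rewrites CUDA API calls (cudaMalloc -> hipMalloc, kernel launches, etc.)
    hipify-perl vector_add.cu > vector_add.hip.cpp
    # hipcc compiles the resulting HIP code for an AMD GPU
    hipcc vector_add.hip.cpp -o vector_add
    ./vector_add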
What I experienced is that AI is a nightmare on AMD in Linux. There is a myriad of custom things that one needs to do, and even that just breaks after a while. Happened so much on my current setup (6600 XT) that I don't bother with local AI anymore, because the time investment is just not worth it.
It's not that I can't live like this, I still have the same card, but if I were looking to do anything AI locally with a new card, for sure it wouldn't be an AMD one.
I don't have much experience with ROCm for large trainings, but NVIDIA is still shit with driver + CUDA version + other things. The only simplification comes from Ubuntu and other distros that already do the heavy lifting by installing all required components without much configuration.
Nvidia is the high end, AMD is the mid segment and Intel is the low end. In reality I am playing HellDivers at 4K with 50-60 FPS on a 6800 XT.
Traditionally the NVIDIA drivers have been more stable on Windows than the AMD drivers. I chose an AMD card because I wanted a hassle-free experience on Linux (well, as much as you can get).
Software. AMD has traditionally been really bad at their drivers. (They also missed the AI train and are trying to catch up).
I use Linux and have learned not to touch AMD GPUs (and to a lesser extent CPUs due to chipset quality/support) a long time ago. Even if they are better now, (I feel) Intel integrated (if no special GPU perf needed) or NVidia are less risky choices.
This is wrong. For 14 years the recommendation on Linux has been:
* Always buy AMD.
* Never buy Nvidia.
* Intel is also okay.
Because the AMD drivers are good and open source, and AMD cares about bug reports. The Nvidia drivers can and will create issues because they're closed source, and Nvidia avoided supporting Wayland for years. Now Nvidia has published source code but refuses to merge it into Linux and Mesa. Facepalm.
While Nvidia comes up with proprietary stuff, AMD brought us Vulkan and FreeSync, supported Wayland well early on with implicit sync (like Intel), and has used the regular video-acceleration APIs for a long time.
Nvidia's bad drivers still don't handle simple actions like a VT switch or suspend/resume. If a developer doesn't know about that extension, the users suffer for years.
Okay. But that is probably only a short term solution?
It has been Nvidia's short-term solution since 2016!
I've been using a 4090 on my Linux workstation for a few years now. It's mostly fine, with the occasional bad driver version randomly messing things up. I'm using Linux Mint. Mint uses X11, which, while silly, means suspend/resume works fine.
NVIDIA's drivers also recently completely changed how they work. Hopefully that'll result in a lot of these long-term issues getting fixed. As I understand it, the change is this: the NVIDIA drivers contain a huge amount of proprietary, closed-source code. This code used to be shipped as a closed-source binary blob which needed to run on your CPU. And that caused all sorts of problems - because it's Linux and you can't recompile their binary blob. They have since moved the secret, proprietary parts into a firmware image instead, which runs on a coprocessor within the GPU itself. This then allowed them to - at last - open-source (most? all?) of their remaining Linux driver code. And that means we can patch and change and recompile that part of the driver. And that should mean the Wayland & kernel teams can start fixing these issues.
In theory, users shouldn't notice any changes at all. But I suspect all the nvidia driver problems people have been running into lately have been fallout from this change.
> I use Linux and have learned not to touch AMD GPUs (and to a lesser extent CPUs due to chipset quality/support) a long time ago. Even if they are better now, (I feel) Intel integrated (if no special GPU perf needed) or NVidia are less risky choices.
Err, what? While you're right about Intel integrated GPUs being a safe choice, AMD has long since been the GPU of choice for Linux -- it just works. Whereas Nvidia on Linux has been flaky for as long as I can remember.
Had major problems with xinerama, suspend/resume, vsync, probably a bunch of other stuff.
That said, I've been avoiding AMD in general for so long the ecosystem might have really improved in the meantime, as there was no incentive for me to try and switch.
Recently I've been dabbling in AI where AMD GPUs (well, sw ecosystem, really) are lagging behind. Just wasn't worth the hassle.
NVidia hw, once I set it up (which may be a bit involved), has been pretty stable for me.
I run llama.cpp using Vulkan on AMD GPUs; no need to install any drivers (or management software for that matter), and no need to taint the kernel, meaning if I have an issue it's easy to get support. For example, the other day when a Mesa update had an issue, I had a fix in less than 36 hours (without any support contract or fees), and `apt-mark hold` did a perfect job until the fix landed. Performance for me is within a couple of percentage points, and with under-volting I get better joules per token.
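For anyone wanting to try the same, the setup is roughly this (a sketch: package names are Debian's, the cmake flag may differ between llama.cpp versions, and the model path is a placeholder):

    # Vulkan comes from Mesa's driver stack; no vendor driver needed
    sudo apt install mesa-vulkan-drivers vulkan-tools build-essential cmake
    # Build llama.cpp with the Vulkan backend
    cmake -B build -DGGML_VULKAN=ON && cmake --build build --config Release
    # Offload all layers to the GPU; the model file is just an example
    ./build/bin/llama-cli -m ./models/some-model.gguf -ngl 99 -p "Hello"
    # Pin the Mesa package while waiting for a fix to land
    sudo apt-mark hold mesa-vulkan-drivers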
>Err, what? While you're right about Intel integrated GPUs being a safe choice, AMD has long since been the GPU of choice for Linux -- it just works. Whereas Nvidia on Linux has been flaky for as long as I can remember.
Not OP, but I had the same experience with AMD in the past. I bought a new laptop, and within 6 months AMD decided my card was obsolete and no longer provided drivers, forcing me to be stuck with an older kernel/X11. So I switched to NVIDIA, and after 2 PC changes I still use NVIDIA, since the official drivers work great. I really hope AMD is putting in the effort this time to keep older generations of cards working on the latest kernels/X11; maybe my next card will be AMD.
But this is an explanation of why some of us older Linux users have bad memories of AMD: we had good reason to switch over to NVIDIA and no good reason to switch back to AMD.
They seem to be close?
The RX 9070 is the 2nd most efficient graphics card this generation according to TechPowerUp, and it also does well when limited to 60 Hz, implying its joules-per-frame figure isn't bad either.
I run a 9070 (non-XT), and in combination with under-volting it is very efficient in both joules per frame and joules per token. And in terms of purchase price it was a steal compared to a similar class of NVidia card.
You are certainly right that this group has little spending self-control. There is just about no limit to how abusive companies like Hasbro, Nvidia and Nintendo can be and still rake in record sales.
They will complain endlessly about the price of a RTX 5090 and still rush out to buy it. I know people that own these high end cards as a flex, but their lives are too busy to actually play games.
I'm not saying that these companies aren't charging "fair" prices (whatever that means), but for many hardcore gamers their spending per hour is tiny compared to other forms of entertainment. They may buy a $100 game and play it for over 100 hours. Maybe add another $1/hour for the console. Compared to someone who frequents the cinema, goes to the pub, or has many other common hobbies, it can be hard to say that gamers are getting screwed.
Now, it is hard to draw a straight comparison. Gamers may spend a lot more time playing, so $/h isn't a perfect metric. And some will frequently buy new games or, worse, things like microtransactions, which quickly skyrocket the cost. But overall it doesn't seem like the most expensive hobby, especially if you are trying to spend less.
> I have also realized that there is a lot out there in the world besides video games
My favorite part about being a reformed gaming addict is the fact that my MacBook now covers ~100% of my computer use cases. The desktop is nice for Visual Studio but that's about it.
I'm still running a 5700XT in my desktop. I have absolutely zero desire to upgrade.
Same here - actually, my PC broke in early 2024 and I still haven't fixed it. I quickly found out that without gaming, I no longer have any use for my PC, so now I just do everything on my MacBook.
Same here. I got mine five years ago when I needed to upgrade my workstation to do work-from-home, and it's been entirely adequate since then. I switched the CPU from an AMD 3900 to a 5900, but that's the only upgrade. The differences from one generation to the next are pretty marginal.
> I have also realized that there is a lot out there in the world besides video games
...and even if you're all in on video games, there's a massive amount of really brilliant indie games on Steam that run just fine on a 1070 or 2070 (I still have my 2070 and haven't found a compelling reason to upgrade yet).
> I have also realized that there is a lot out there in the world besides video games, and getting all in a huff about it isn’t worth my time or energy.
I think more and more people will realize games are a waste of time for them and go on to find other hobbies. As a game developer, it kinda worries me. As a gamer, I can't wait for gaming to be a niche thing again, haha.
The games industry is now bigger than the movie industry. I think you're very wrong about this, as games are engaging in a way other consumption-based media simply cannot replicate.
I played video games since I was a teenager. Loved them, was obsessed with them. Then sometime around 40 I just gave up. Not because of life pressure or lack of time but because I just started to find them really boring and unfulfilling. Now I’d much rather watch movies or read. I don’t know if the games changed or I changed.
I get that, I go through periods of falling in and out of them too after having grown up with them. But there is a huge fraction of my age group (and a little older) that have consistently had games as their main "consumption" hobby throughout.
And then there's the age group younger than me, for whom games are not only a hobby but also a "social place to be", I doubt they'll be dropping gaming entirely easily.
Fortunately for your business model, there's a constant stream of new people to replace the ones who are aging out. But you have to make sure your product is appealing to them, not just to the same people who bought it last decade.
Also playing PC video games doesn't even require a Nvidia GPU. It does sorta require Windows. I don't want to use that, so guess I lost the ability to waste tons of time playing boring games, oh no.
Out of the 11 games I've bought through Steam this year, I've had to refund one (1) because it wouldn't run under Proton, two (2) had minor graphical glitches that didn't meaningfully affect my enjoyment of them, and two (2) had native Linux builds. Proton has gotten good enough that I've switched from spending time researching if I can play a game to just assuming that I can. Presumably ymmv depending on your taste in games of course, but I'm not interested in competitive multiplayer games with invasive anticheat which appears to be the biggest remaining pain point.
My experience with running non-game windows-only programs has been similar over the past ~5 years. It really is finally the Year of the Linux Desktop, only few people seem to have noticed.
It depends on the games you play and what you are doing. It is a mixed bag IME. If you are installing a game that is several years old it will work wonderfully. Most guides assume you have Arch Linux or are using one of the "gaming" distros like Bazzite. I use Debian (I am running Testing/Trixie RC on my main PC).
I play a lot of HellDivers 2. Despite what a lot of Linux YouTubers say, it doesn't work very well on Linux.
The recommendation I got from people was to change distro, but I do other stuff on Linux. The game slows down right when you need it to be running smoothly, no matter what resolution/settings you set.
Anything with anti-cheat probably won't work very well if at all.
I also wanted to play the old Command and Conquer games. Getting the fan-made patchers (not the games themselves) that fix a bunch of bugs EA/Westwood never fixed and add mod support to run properly was more difficult than I cared to bother with.
This will use gamemode to run it, give it priority, put the system in performance power mode, and will fix any pulse audio static you may be having. You can do this for any game you launch with steam, any shortcut, etc.
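In the game's Steam launch options it's a one-liner along these lines (my reconstruction; the PulseAudio latency value is just an example of what tends to fix the static):

    # Steam launch options: wrap the game in gamemode and raise the Pulse buffer to stop crackling
    PULSE_LATENCY_MSEC=60 gamemoderun %command%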
It's missing probably 15 fps on this card between Windows and Linux, and since it's above 100 fps I really don't even notice.
It does seem to run a bit better under GNOME with variable refresh rate than under KDE.
I will be honest, I just gave up. I couldn't get consistent performance on HellDivers 2. Many of the things you have mentioned I've tried and found they don't make much of a difference or made things worse.
I did get it running nicely for about a day, and then an update was pushed and it ran like rubbish again. The game runs smoothly when initially running the map, and then there's a massive dip in frames for several seconds. This is usually when one of the bugs is jumping at you.
This game may work better on Fedora/Bazzite or <some other distro> but I find Debian to be super reliable and don't want to switch distro. I also don't like Fedora generally as I've found it unreliable in the past. I had a look at Bazzite and I honestly just wasn't interested. This is due to it having a bunch of technologies that I have no interest in using.
There are other issues that are tangential but related.
e.g.
I normally play on Super HellDive with other players in a Discord VC. Discord/PipeWire seems to reset my sound for no particular reason, and my Plantronics headset mic (good headset, not some gamer nonsense) will not be found. This requires a restart of pipewire/wireplumber and Discord (in that order). This happens often enough that I have a shell script alias called "fix_discord".
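The helper is roughly this (a sketch; exact service and process names may differ per setup):

    # sketch of the fix_discord helper: bounce the audio stack, then restart Discord
    fix_discord() {
        systemctl --user restart pipewire pipewire-pulse wireplumber  # audio stack first
        pkill -x Discord && sleep 2                                   # stop Discord if running
        nohup discord >/dev/null 2>&1 &                               # relaunch it detached
    }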
I have weird audio problems on HDMI (AMD card) thanks to a regression in the kernel (Kernel 6.1 with Debian worked fine).
I could mess about with this for ages and maybe get it working or just reboot into Windows which takes me all of a minute.
It is just easier to use Windows for Gaming. Then use Linux for work stuff.
I don't want to use Fedora. Besides finding it unreliable, I switched to Debian because I was fed up with all the Windows-isms/corporate stuff enabled by default in the distro I was trying to get away from.
It's the same reason I don't want to use Bazzite: it misses the point of using a Linux/Unix system altogether.
I also learned a long time ago Distro Hopping doesn't actually fix your issues. You just end up either with the same issues or different ones. If I switched from Debian to Fedora, I suspect I would have many of the same issues.
e.g. if an issue is in the Linux kernel itself, such as HDMI audio on AMD cards having random noise, I fail to see how changing from one distro to another would help. Fedora might have a custom patch to fix it, but I could also take that patch and build my own kernel image (which I've done in the past, btw).
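On Debian that's roughly the following (a from-memory sketch; the patch file name is a placeholder):

    # fetch kernel source, apply the fix, build installable .deb packages
    apt-get source linux && cd linux-*
    patch -p1 < ~/amd-hdmi-audio-fix.patch          # hypothetical patch
    cp /boot/config-"$(uname -r)" .config           # start from the running kernel's config
    make olddefconfig
    make -j"$(nproc)" bindeb-pkg                    # produces ../linux-image-*.deb
    sudo dpkg -i ../linux-image-*.deb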
The reality is that most people doing development on the various projects/packages that make up the Linux desktop don't have the setup I have, hence some of the peculiarities I am running into. If I had a more standard setup, I wouldn't have an issue.
Moreover, I would be using FreeBSD/OpenBSD or some other more traditional Unix system and ditch Linux if I didn't require some Linux specific applications. I am considering moving to something like Artix / Devuan in the future if I did decide to switch.
My hesitation is around high-end settings: can Proton run 240 Hz at 1440p on high settings? I'm switching anyway soon and might just have a separate machine for gaming, but I'd rather it be Linux. SteamOS looks promising if they release it for PC.
The only games in my library at all that don't work on linux are indie games from the early 2000s, and I'm comfortable blaming the games themselves in this case.
I also don't play any games that require a rootkit, so..
good move, that's why i treat my windows install as a dumb game box, they can steal whatever data they want from that, i don't care. i do my real work on linux, as far away from windows as i can possibly get.
Same way I treat my windows machine, but also the reason I won't be swapping it to linux any time soon. I use different operating systems for different purposes for a reason. It's great for compartmentalization.
When I am in front of windows, I know I can permit myself to relax, breathe easy and let off some steam. When I am not, I know I am there to learn/earn a living/produce something etc. Most probably do not need this, but my brain does, or I would never switch off.
What works for me is having different Activities/Workspaces in KDE - they have different wallpapers, pinned programs in the taskbar, the programs themselves launch only in a specific Activity. I hear others also use completely different user accounts.
Proton/Steam/Linux works damn near flawlessly for /most/ games. I've gone through an Nvidia 2060, a 4060, and now an AMD 6700 XT. No issues even for release titles at launch.
Yeah, but it's not worth it. Apparently the "gold" list on ProtonDB is games that allegedly work with tweaks. So like, drop in this random DLL and it might fix the game. I'm not gonna spend time on that.
Last one I ever tried was https://www.protondb.com/app/813780 with comments like "works perfectly, except multiplayer is completely broken", and the workaround has changed 3 times so far; also it lags no matter what. Gave up after stealing 4 different DLLs from Windows. It doesn't even have anticheat; it's just because of some obscure math library.
> Yeah, but it's not worth it. Apparently the "gold" list on ProtonDB is games that allegedly work with tweaks. So like, drop in this random DLL and it might fix the game. I'm not gonna spend time on that.
I literally never had to do that. Most tweaking I needed to do was switching proton versions here and there (which is trivial to do).
I've been running opensuse+steam and I never had to tweak a dll to get a game running. Albeit that I don't exactly chase the latest AAA, the new releases that I have tried have worked well.
Age of Empires 2 used to work well, without needing any babying, so I'm not sure why it didn't for you. I will see about spinning it up.
Seems a bit calculated and agreed upon across the industry. What else can make sense of Microsoft's acquisitions and ruining of billion-dollar IPs? It's a manufactured collapse of the gaming industry. They want to centralize control of the market and make it a service based (rent seeking) sector.
I'm not saying they all got together and decided this, but their wonks are probably all saying the same thing. The market is shrinking, and whether it's by design or incompetence, this creates a new opportunity to acquire it wholesale for pennies on the dollar, build a wall around it, and charge for entry. It's a natural result of games requiring NVidia developers for driver tuning, and of bitcoin/AI buying out capacity to keep competitors out.
The wildcard I can't fit into this puzzle is Valve. They have a huge opportunity here but they also might be convinced that they have already saturated the market and will read the writing on the wall.
I think the reason you see things like Blizzard killing off Overwatch 1 is because the Lindy effect applies in gaming as well. Some things are so sticky and preferred that you have to commit atrocities to remove them from use.
From a supply/demand perspective, if all of your customers are still getting high on the 5 (or 20) year old supply, launching a new title in the same space isn't going to work. There are not an infinite # of gamers and the global dopamine budget is limited.
Launching a game like TF2 or Starcraft 2 in 2025 would be viewed as a business catastrophe by the metrics most AAA studios are currently operating under. Monthly ARPU for gamers years after purchasing the Orange Box was approximately $0.00. Giving gamers access to that strong of a drug would ruin the demand for other products.
The video game industry has been through cycles like this before. One of them (the 1983 crash) was so bad it killed most American companies and caused the momentum to shift to Japan for a generation. Another one I can recall is the "death" of the RTS (real-time strategy) genre around 2010. They have all followed a fairly similar pattern and in none of them that I know of have things played out as the companies involved thought or hoped they would.
I worked in the video game industry from the 90s through to today. I think you are over generalizing or missing the original point. It's true that there have been boom and busts. But there are also structural changes. Do you remember CD-ROMs? Steam and the iPhone were structural changes.
What Microsoft is trying to do with Gamepass is a structural change. It may not work out the way that they plan but the truth is that sometimes these things do change the nature of the games you play.
But the thing is that Steam didn't cause the death of physical media. I absolutely do remember PC gaming before Steam, and between the era when it was awesome (StarCraft, Age of Empires, Unreal Tournament, Tribes, etc.) and the modern Steam-powered renaissance, there was an absolutely dismal era of disappointment and decline. Store shelves were getting filled with trash like "40 games on one CD!" and each new console generation gave retailers an excuse to shrink shelf space for PC games. Yet during this time, all of Valve's games were still available on discs!
I think Microsoft's strategy is going to come to the same result as Embracer Group. They've bought up lots of studios and they control a whole platform (by which I mean Xbox, not PC) but this doesn't give them that much power. Gaming does evolve and it often evolves to work around attempts like this, rather than in favor of them.
I am not saying that about Steam. In fact Steam pretty much saved triple A PC gaming. Your timeline is quite accurate!
>> Microsoft's strategy is going to come to the same result as Embracer Group.
I hope you are right.
If I were trying to make a larger point, I guess it would be that big tech companies (Apple, MSFT, Amazon) don't want content creators to be too important in the ecosystem and tend to support initiatives that emphasize the platform.
Not in the game industry but as a consumer this is very true. One example: ubiquitous access to transactions and payment systems gave a huge rise to loot boxes.
Also mobile games that got priced at $0.99 meant that only the unicorn level games could actually make decent money so In-App Purchases were born.
But also I suspect it is just a problem where as consumers we spend a certain amount of money on certain kinds of entertainment and if as a content producer you can catch enough people’s attention you can get a slice of that pie. We saw this with streaming services where an average household spent about $100/month on cable so Netflix, Hulu, et al all decided to price themselves such that they could be a portion of that pie (and would have loved to be the whole pie but ironically studios not willing to license everything to everyone is what prevented that).
If it's manufactured, it implies intent: someone at Microsoft is doing it on purpose and, presumably, thinks it'll benefit them. I'm not sure how this can be seen as a win for them. They invested a massive amount of money into buying all those game studios. They also admitted Xbox hardware is basically dead. So the only way they can get any return on that investment is through third-party hardware: either PlayStation or PC. If I were to choose, it would be PC for MS. They already have Game Pass, and Windows is the gaming OS. By giving business to Sony they would undermine those.
I don't think nVidia wants a gaming collapse either. They might not prioritize it now, but they definitely know that it will persist in some form. They bet on AI (and crypto before it) because those are lucrative opportunities, but there's no guarantee they will last. So they squeeze as much as they can out of them while they can. They definitely want gaming as a backup. It might be less profitable and more finicky since it's a consumer market, but it's much more stable in the long run.
> It's a manufactured collapse of the gaming industry. They want to centralize control of the market and make it a service based (rent seeking) sector.
It also won’t work, and Microsoft has developed no way to compete on actual value. As much as I hate the acquisitions they’ve made, even if Microsoft as a whole were to croak tomorrow I think the game industry would be fine.
New stars would arise; suggesting the games industry would collapse and go away is like saying the music industry collapsing would stop people from making music.
Yes, games can be expensive to make, but they don't have to be, and millions will still want new games to play. It is actually a pretty low bar of entry to bring an indie game to market (relative to other ventures). A triple-A studio collapse would probably be an amazing thing for gamers: lots of new and unique indie titles. Just not great for the profits of big companies, a problem I am not concerned with.
As much as they've got large resources, I'm not sure what projects they could reasonably throw a mountain of money at and expect to change things (and presumably benefit from in the future), rather than just being a force of chaos in the industry. Valve's efforts all seem to orbit around the store; that's their main business, and everything else seems like a loss-leader to get you buying through it, even if it comes across as a pet project of a group of employees.
The striking one for me is their Linux efforts: at least as far as I'm aware, they don't do a lot that isn't tied to the Steam Deck (or similar devices) or to running games available on Steam through Linux. Even the Deck APU is derived from the semi-custom work AMD did for the consoles; they're benefiting from a later second harvest that MS/Sony invested in (hundreds of millions?) many years earlier. I suppose a lot of it comes down to what Valve needs to support their customers (developers/publishers); they don't see the point in pioneering and establishing some new branch of tech with developers.
I've always played a few games for many hours as opposed to many games for one playthrough. Subscription just does not make sense for me, and I suspect that's a big part of the market. Add to this the fact that you have no control over it and then top it off with potential ads and I will quit gaming before switching to subs only. Luckily there is still GoG and Steam doesn't seem like it will change but who knows.
This post is crazy nonsense: bad games companies have always existed, and the solution is easy: don't buy their trash. I buy mostly smaller indie games these days just fine.
nvidia isn't purposely killing anything; they are just following the pivot into the AI nonsense. They have no choice: if they are in a unique position to make 10x by a pivot, they will, even if it might be a dumpster fire of a house of cards. It's immoral to just abandon the industry that created you, but companies have always been immoral.
Valve has an opportunity to what? Take over the video card hardware market? No. AMD and Intel are already competitors in that market and can't get a foothold (until hopefully now, when consumers will have no choice but to shift to them).
> This in turn sparked rumors about NVIDIA purposefully keeping stock low to make it look like the cards are in high demand to drive prices. And sure enough, on secondary markets, the cards go way above MSRP
Nvidia doesn't earn more money when cards are sold above MSRP, but they get almost all the hate for it. Why would they set themselves up for that?
Scalpers are a retail wide problem. Acting like Nvidia has the insight or ability to prevent them is just silly. People may not believe this, but retailers hate it as well and spend millions of dollars trying to combat it. They would have sold the product either way, but scalping results in the retailer's customers being mad and becoming some other company's customers, which are both major negatives.
Scalpers are only a retail-wide problem if (a) factories could produce more, but they calculated demand wrong, or (b) factories can't produce more, they calculated demand wrong, and under-priced MSRP relative to what the market is actually willing to pay, thus letting scalpers capture more of the profits.
Either way, scalping is not a problem that persists for multiple years unless it's intentional corporate strategy. Either factories ramp up production capacity to ensure there is enough supply for launch, or MSRP rises much faster than inflation. Getting demand planning wrong year after year after year smells like incompetence leaving money on the table.
The argument that scalping is better for NVDA is coming from the fact that consumer GPUs no longer make a meaningful difference to the bottom line. Factory capacity is better reserved for even more profitable data center GPUs. The consumer GPU market exists not to increase NVDA profits directly, but as a marketing / "halo" effect that promotes decision makers sticking with NVDA data center chips. That results in a completely different strategy where out-of-stock is a feature, not a bug, and where product reputation is more important than actual product performance, hence the coercion on review media.
Scalping and MSRP-baiting have been around for far too many years for nVidia to claim innocence. The death of EVGA's GPU line also revealed that nVidia holds most of the cards in the relationship with its "partners". Sure, Micro Center and Amazon can only do so much, and nVidia isn't a retailer, but they know what's going on and their behavior shows that they actually like this situation.
Yeah wait, what happened with EVGA? (guess I can search it up, of course) I was browsing gaming PC hardware recently and noticed none of the GPUs were from EVGA .. I used to buy their cards because they had such a good warranty policy (in my experience)... :\
In 2022 claiming a lack of respect from Nvidia, low margins, and Nvidia's control over partners as just a few of the reasons, EVGA ended its partnership with Nvidia and ceased manufacturing Nvidia GPUs.
> I used to buy their cards because they had such a good warranty policy (in my experience)... :\
It's so wild to hear this, as in my country they were not considered anything special over any other third-party retailer; we have strong consumer protection laws, which means it's all much of a muchness.
Think of it this way: the only reason the 40 series and above are priced like they are is that Nvidia saw how much people were willing to pay during the 30-series scalper days.
This overrepresentation by the rich is training other customers to think Nvidia GPUs are worth that much, so when prices increase again people won't feel offended.
> Scalpers are a retail wide problem. Acting like Nvidia has the insight or ability to prevent them is just silly.
Oh trust me, they can combat it. The easiest way, which is what Nintendo often does for the launch of its consoles, is to produce an enormous number of units before launch. The steady supply to retailers absolutely destroys folks' ability to scalp. Yes, a few units will be scalped, but most scalpers will be underwater if there is constant resupply. I know this because I used to scalp consoles during my teens and early twenties, and Nintendo's consoles were the least profitable and most problematic because they really try to supply the market. The same with iPhones: yeah, you might have to wait a month after launch to find one if you don't pre-order, but you can get one.
It's widely reported that most retailers had maybe tens of cards per store, or a few hundred nationally, for the 5090 launch. This immediately created a giant spike in demand and drove prices up, along with the incentive for scalpers. The manufacturing partners immediately saw what (some) people were willing to pay (to the scalpers) and jacked up prices so they could get their cut. It is still so bad in the case of the 5090 that MSRP prices from AIBs have skyrocketed 30%-50%. PNY had cards at the original $1,999.99 MSRP, and now those same cards can't be found for less than $2,999.99.
By contrast, look at how AMD launched its 9000 series of GPUs: each Micro Center reportedly had hundreds on hand (and it sure looked like it from the pictures floating around). Folks were just walking in until noon and still able to get a GPU on launch day. Multiple restocks happened across many retailers immediately after launch. Are there still some inflated prices in the 9000 series GPUs? Yes, but we're not talking a 50% increase. Having some high-priced AIBs has always occurred, but what Nvidia has done by intentionally under-supplying the market is awful.
I personally have been trying to buy a 5090 FE since launch. I have been awake attempting to add to cart for every drop on BB but haven't been successful. I refuse to pay the inflated MSRP for cards that haven't been that well reviewed. My 3090 is fine... At this point, I'm so frustrated by NVidia I'll likely just piss off for this generation and hope AMD comes out with something that has 32GB+ of VRAM at a somewhat reasonable price.
It's $4.2k on Newegg; I wouldn't necessarily call it reasonably priced, even compared to NVidia.
If we're looking at the ultra high end, you can pay double that and get an RTX 6000 Pro with double the VRAM (96GB vs 48GB), double the memory bandwidth (1792 GB/s vs 864 GB/s) and much much better software support. Or you could get an RTX 5000 Pro with the same VRAM, better memory bandwidth (1344 GB/s vs 864 GB/s) at similar ~$4.5k USD from what I can see (only a little more expensive than AMD).
Why the hell would I ever buy AMD in this situation? They don't really give you anything extra over NVidia, while having similar prices (usually only marginally cheaper) and much, much worse software support. Their strategy was always "slightly worse experience than NVidia, but $50 cheaper and with much worse software support"; it's no wonder they only have less than 10% GPU market share.
> Nvidia doesn't earn more money when cards are sold above MSRP, but they get almost all the hate for it. Why would they set themselves up for that?
If you believe their public statements, because they didn't want to build out additional capacity and then have a huge excess supply of cards when demand suddenly dried up.
In other words, the charge of "purposefully keeping stock low" is something NVidia admitted to; there was just no theory of how they'd benefit from it in the present.
I haven't read the whole article, but a few things to remark on:
* The prices for Nvidia GPUs are insane. For that money you can have an extremely good PC with a good non Nvidia GPU.
* The physical GPU sizes are massive; even letting the card rest on a horizontal motherboard looks scary.
* Nvidia has still issues with melting cables? I've heard about those some years ago and thought it was a solved problem.
* Proprietary frameworks like CUDA and others are going to fall at some point; it's just a matter of time.
Looks as if Nvidia at the moment is only looking at the AI market (which, as a personal belief, has to burst at some point) and simply does not care about the non-AI GPU market at all.
I remember, many many years ago when I was a teenager and 3dfx was the dominant graphics card manufacturer, that John Carmack prophetically predicted in a gaming computer magazine (the article was about Quake I) that the future wasn't going to be 3dfx and Glide. Some years passed and, sure enough, 3dfx was gone.
Perhaps this is just the beginning of the same story that happened with 3dfx. I think AMD and Intel have a huge opportunity to balance the market and bring Nvidia down, both in the AI and gaming spaces.
I have only heard excellent things about Intel's Arc GPUs in other HN threads, and if I need to build a new desktop PC from scratch there's no way I'm paying the prices that Nvidia is pushing onto the market; I'll definitely look at Intel or AMD.
High-end GPUs have, over the last 5 years, slowly turned from an enthusiast product into a luxury product.
5 or maybe 10 years ago, a high-end GPU was needed to run games at reasonably eye-candy settings. In 2025, $500 mid-range GPUs are more than enough. Folks all over can barely tell the difference between High and Ultra settings, DLSS vs FSR, or DLSS FG vs Lossless Scaling. There's just no point in competing at the $500 price point anymore, so Nvidia has largely given up there, leaving it to the AMD-built consoles and integrated graphics like AMD APUs, which offer good value at the low end, mid-range, and high end.
Maybe the rumored Nvidia PC, or the Switch 2, can bring some resurgence.
What strategy? They charge more because manufacturing costs are higher: cost per transistor hasn't changed much since 28nm [0], but chips have more and more transistors. What do you think that does to the price?
10 years ago, $650 would buy you a top-of-the-line gaming GPU (GeForce GTX 980 Ti). Nowadays, $650 might get you a mid-range RX 9070 XT if you miraculously find one near MSRP.
That is $880 in today's terms. And in 2015 Apple was already shipping a 16nm SoC, while the GeForce GTX 980 Ti was still on 28nm. Two process nodes behind.
In their never-ending quest to find ways to suck more money out of people, one natural extension is to just turn the thing into a luxury good, and that alone seems to justify the markup.
This is why new home construction is expensive - the layout of a home doesn’t change much but it’s trivial to throw on some fancy fixtures and slap the deluxe label on the listing.
Or take a Toyota, slap some leather seats on it, call it a Lexus and mark up the price 40% (I get that these days there are more meaningful differences but the point stands)
This and turning everything into subscriptions alone are responsible for 90% of the issues I have as a consumer
Graphics cards seem to be headed in this direction as well - breaking through that last ceiling for maximum fps is going to be like buying a Bentley (if it isn't already), whereas before it was just opting for the V8.
Nvidia's been doing this for a while now, since at least the Titan cards and technically the SLI/Crossfire craze too. If you sell it, egregiously-compensated tech nerds will show up with a smile and a wallet large enough to put a down-payment on two of them.
I suppose you could also blame the software side, for adopting compute-intensive ray tracing features or getting lazy with upscaling. But PC gaming has always been a luxury market, at least since "can it run Crysis/DOOM" was a refrain. The homogeneity of a console lineup hasn't ever really existed on PC.
I've used all of these (at 4K, 120hz, set to "balanced") since they came out, and I just don't understand how people say this.
FSR is a vaseline-like mess to me, it has its own distinct blurriness. Not as bad as naive upscaling, and I'll use it if no DLSS is available and the game doesn't run well, but it's distracting.
Lossless is borderline unusable. I don't remember the algorithm's name, but it has a blur similar to FSR. It cannot handle text or UI elements without artifacting (because it's not integrated in the engine, those don't get rendered at native resolution). The frame generation causes almost everything to have a ghost or afterimage - UI elements and the reticle included. It can also reduce your framerate because it's not as optimized. On top of that, the way the program works interferes with HDR pipelines. It is a last resort.
DLSS (3) is, by a large margin, the best offering. It just works and I can't notice any cons. Older versions did have ghosting, but it's been fixed. And I can retroactively fix older games by just swapping the DLL (there's a tool for this on GitHub, actually). I have not tried DLSS 4.
Most people either can’t tell the difference, don’t care about the difference, or both. Similar discourse can be found about FSR, frame drop, and frame stutter. I have conceded that most people do not care.
Not quite $500, but at $650, the 9070 is an absolute monster that outperforms Nvidia's equivalent cards in everything but ray tracing (which you can only turn on with full DLSS framegen and get a blobby mess anyways)
AMD is truly making excellent cards, and with a bit of luck UDNA is even better. But they're in the same situation as Nvidia: they could sell 200 GPUs, ship drivers, maintain them, deal with returns and make $100k... Or just sell a single MI300X to a trusted partner that won't make any waves and still make $100k.
Wafer availability unfortunately rules all, and as it stands, we're lucky neither of them have abandoned their gaming segments for massively profitable AI things.
Some models of 9070 use the well-proven old style PCI-E power connectors too, which is nice. As far as I'm aware none of the current AIB midrange or high end Nvidia cards do this.
I went from a 2080 Ti to a 5070 Ti. Yes, it's faster, but for the games I play, not dramatically so. Certainly not what I'm used to seeing from such a generational leap. The 5070 Ti is noticeably faster at local LLMs, and has a bit more memory, which is nice.
I went with the 5070 Ti since the 5080 didn't seem like a real step up, and the 5090 was just too expensive and wasn't in stock for ages.
If I had a bit more patience, I would have waited for the next node refresh, or for the 5090. I don't think any of the current 50-series cards besides the 5090 are worth it if you're coming from a 2080. And by worth it I mean will give you a big boost in performance.
I went from a 3070 to a 5070 Ti and it's fantastic. Just finished Cyberpunk maxed out at 4K with DLSS balanced, 2x frame gen, and Reflex 2. Amazing experience.
TSMC can only make about as many Nvidia chips as OpenAI and the other AI guys want to buy. Nvidia releases GPUs made from basically the shaving leftovers of the OpenAI products, which makes them limited in supply and expensive.
So gamers have to pay much more and wait much longer than before, which they resent.
Some YouTubers make content that profits from the resentment, so they play fast and loose with the fundamental reasons in order to make gamers even more resentful. Nvidia has "crazy prices", they say.
But they're clearly not crazy. $2,000 GPUs appear in quantities of 50+ from time to time at stores here, but they sell out in minutes. Lowering the prices would be crazy.
This is one reason; another is that Dennard scaling has stopped and GPUs have hit a memory wall with DRAM. The only reason AI hardware gets significant improvements is that it relies on big matmuls, and a lot of research has gone into getting lower-precision (now 4-bit) training working (numerical precision stability was always a huge problem with backprop).
NVIDIA is, and will be for at least the next year or two, supply constrained. They only have so much capacity at TSMC for all their chips, and the lion's share of that is going to enterprise chips, which sell for an order of magnitude more than the consumer chips.
It's hard to get too offended by them shirking the consumer market right now when they're printing money with their enterprise business.
Not personally offended, but when a company makes a big stink around several gross exaggerations (performance, price, availability) it's not hard to understand why folks are kicking up their own stink.
Nvidia could have said "we're prioritizing enterprise", but instead they put on a big dog and pony show about their consumer GPUs.
I really like the Gamer's Nexus paper launch shirt. ;)
If they believed they were going to continue selling AI chips at those margins they would:
- outbid Apple on new nodes
- sign commitments with TSMC to get the capacity in the pipeline
- absolutely own the process nodes they made cards on that are still selling way above retail
NVIDIA has been posting net earnings in the 60-90 range over the last few years. If you think that's going to continue? You book the fab capacity hell or high water. Apple doesn't make those margins (which is what on paper would determine who is in front for the next node).
And what if Nvidia booked the capacity but the orders didn't come? What if Nvidia's customers aren't willing to commit? How expensive is it, and how much prepayment is needed, for TSMC to break ground on a new fab?
These are the same questions for the Apple fans asking Apple to buy TSMC. The fact is, it isn't so simple. And even if Nvidia were willing to pay for it, TSMC wouldn't do it for Nvidia alone.
The real issue here is actually harebrained youtubers stirring up drama for views. That's 80% of the problem. And their viewers (and readers, for that which makes it into print) eat it up.
Idiots doing hardware installation, with zero experience, using 3rd party cables incorrectly, posting to social media, and youtubers jumping on the trend for likes.
These are 99% user error issues drummed up by non-professionals (and, in some cases, people paid by 3rd party vendors to protect those vendors' reputation).
And the complaints about transient performance issues with drivers, drummed up into apocalyptic scenarios, again, by YouTubers who are putting this stuff under a microscope for views, are universal across every single hardware and software product. Everything.
Claiming "DLSS is snake oil" and similar things is just an expression of the complete lack of understanding of the people involved in these pot-stirring contests. Like... the technique obviously couldn't magically multiply the hardware's ability to generate frames using the primary method. It is exactly as advertised: it uses machine learning to approximate it. And it's some fantastic technology that is now ubiquitous across the industry. Support and quality will increase over time, just like with every _quality_ hardware product early in its lifespan.
It's all so stupid and rooted in greed by those seeking ad-money, and those lacking in basic sense or experience in what they're talking about and doing. Embarrassing for the author to so publicly admit to eating up social media whinging.
If you've ever watched a GN or LTT video, they never claimed that DLSS is snakeoil. They specifically call out the pros of the technology, but also point out that Nvidia lies, very literally, about its performance claims in marketing material. Both statements are true and not mutually exclusive. I think people like in this post get worked up about the false marketing and develop (understandably) a negative view of the technology as a whole.
> Idiots doing hardware installation, with zero experience, using 3rd party cables incorrectly
This is not true. Even GN reproduced the melting of the first-party cable.
Also, why shouldn't you be able to use third-party cables? Fuck DRM too.
I'm referring to the section header in this article. Youtubers are not a truly hegemonic group, but there's a set of ideas and narratives that pervade the group as a whole that different subsets buy into, and push, and that's one that exists in the overall sphere of people who discuss the use of hardware for gaming.
Well, I can't speak for all YouTubers, but I do watch most GN and LTT videos, and the complaints are legitimate; they are not random jabronis yolo'ing hardware installations.
As far as I know, neither of them have had a card unintentionally light on fire.
The whole thing started with Derbauer going to bat for a cable from some 3rd party vendor that he'd admitted he'd already plugged in and out of various cards something like 50 times.
The actual instances that youtubers report on are all reddit posters and other random social media users who would clearly be better off getting a professional installation. The huge popularity for enthusiast consumer hardware, due to the social media hype cycle, has brought a huge number of naive enthusiasts into the arena. And they're getting burned by doing hardware projects on their own. It's entirely unsurprising, given what happens in all other realms of amateur hardware projects.
Most of those whinging about their issues are false positives: user error. The actual failure rates (and there are device failures) are far lower, and that's what warranties are for.
I'm sure the failure rates are blown out of proportion, I agree with that.
But the fact of the matter is that Nvidia has shifted from a consumer business to B2B, and they don't even give a shit about pretending to care anymore. People take issue with that, understandably, and when you couple it with the false marketing, the lack of inventory, the occasional hardware failure, missing ROPs, insane prices that nobody can afford, and all the other shit that's wrong with these GPUs, then this is the end result.
GN were the OG "fake framers", going back to their constant casting of shade on DLSS, ignoring it in their reviews, and crapping on RT.
AI upscaling, AI denoising, and RT were clearly the future even 6 years ago. CDPR and the rest of the industry knew it, but outlets like GN pushed a narrative (borderline conspiracy) that the developers were somehow out of touch and didn't know what they were talking about.
There is a contingent of gamers who play competitive FPS, most of whom are, like in all casual competitive hobbies, not very good. But they ate up the 240Hz rasterization-above-all meat GN was feeding them. Then they think they are the majority and speak for all gamers (as every loud minority on the internet does).
Fast forward 6 years and NVidia is crushing the Steam top 10 GPU list, AI rendering techniques are becoming ubiquitous, and RT is slowly edging out rasterization.
Now that the data is clear, the narrative is that most consumers are "suckers" for purchasing NVidia, Nintendo, etc. And the content creator economy will be there to tell them they are right.
Edit: I believe some of these outlets also had a chip on their shoulder regarding NVidia going way back. So AMD's poor RT performance and lack of any competitive answer to the DLSS suite for YEARS had them lying to themselves about where the industry was headed. Essentially they were running interference for AMD. Now that FSR4 is finally here, it's like AI upscaling is suddenly OK.
Remember when nvidia got caught dropping 2 bits of color information to beat ati in benchmarks? I still can't believe anyone has trusted them since! That is an insane thing to do considering the purpose of the product.
For as long as they have competition, I will support those companies instead. If they all fail, I guess I will start one. My spite for them knows no limits
People need to start asking more questions about why the RTX 50 series (Blackwell) has almost no performance uplift over the RTX 40 series (Ada/Hopper), and also why, conveniently, it's impossible to find B200s.
I am a volunteer firefighter and hold a degree in electrical engineering. The shenanigans with their shunt resistors, and ensuing melting cables, is in my view criminal. Any engineer worth their salt would recognize pushing 600W through a bunch of small cables with no contingency if some of them have failed is just asking for trouble. These assholes are going to set someone's house on fire.
I hope they get hit with a class action lawsuit and are forced to recall and properly fix these products before anyone dies as a result of their shoddy engineering.
And a video referenced in the original post, describing how the design changed from one that proactively managed current balancing, to simply bundling all the connections together and hoping for the best: https://youtu.be/kb5YzMoVQyw
Also, like, I kind of want to play with these things, but also I'm not sure I want a computer that uses 500W+ in my house, let alone just a GPU.
I might actually be happy to buy one of these things, at the inflated price, and run it at half voltage or something... but I can't tell if that is going to fix these concerns or they're just bad cards.
It's not the voltage, it's the current you'd want to halve. The wire gauge required to carry power depends on the current load. That's why, when I first saw these new connectors and the loads they were being tasked with, it was a wtf moment for me. Better to just avoid them in the first place though.
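Rough numbers for why it's the current that matters (assuming the full 600 W budget split evenly across the connector's six 12 V pins; the per-pin rating is from memory, so treat it as approximate):

    600 W / 12 V   = 50 A total
    50 A / 6 pins  ≈ 8.3 A per pin   (terminals rated around 9.5 A, so very little margin)
    heating goes as I^2 * R, so a pin forced to carry ~17 A dissipates ~4x the heat of one at 8.3 A

Halving the power draw halves the current; halving the voltage at the same power would actually double it.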
It's crazy, you don't even need to know about electricity after you see a thermal camera on them operating at full load. I'm surprised they can be sold to the general public, the reports of cables melting plus the high temps should be enough to force a recall.
Has anyone made 12VHPWR cables that replace the 12 little wires with 2 large gauge wires yet? That would prevent the wires from becoming unbalanced, which should preempt the melting connector problem.
As a bonus, if the gauge is large enough, the cable would actually cool the connectors, although that should not be necessary since the failure appears to be caused by overloaded wires dumping heat into the connector as they overheat.
Might help a little bit, by heatsinking the contacts better, but the problem is the contact resistance, not the wire resistance. The connector itself dangerously heats up.
Or at least I think so? Was that a different 12VHPWR scandal?
Another problem is when the connector is angled, several of the pins may not make contact, shoving all the power through as few as one wire. A common bus would help this but the contact resistance in this case is still bad.
It would help, but my intuition is that the thin steel of the contact would not move the heat fast enough to make a significant difference. Only way to really know is to test it.
I thought that the contact resistance caused the unbalanced wires, which then overheat alongside the connector, giving the connector’s heat nowhere to go.
They don't specify 12 smaller wires for nothing when 2 larger ones would do. There are concerns here with mechanical flexibility (12 wires have a smaller allowable bend radius than 2 larger ones with the same ampacity).
One option is to use two very wide, thin insulated copper sheets as cable. Still has a good bend radius in one dimension, but is able to sink a lot of power.
This article goes much deeper than I expected, and is a nice recap of the last few years of "green" gpu drama.
Liars or not, the performance has not been there for me in any of my usecases, from personal to professional.
A system from 2017/2018 with an 8700K and an 8GB 2080 performs so closely to the top end, expensive systems today that it makes almost no sense to upgrade at MSRP+markup unless your system is older than this.
Unless you need specific features only on more recent cards, there are very few use cases I can think of needing more than a 30 series card right now.
> A system from 2017/2018 with an 8700K and an 8GB 2080 performs so closely to the top end, expensive systems today
This is in no way true and is quite an absurd claim. Unless you meant for some specific isolated purpose restricted purely to yourself and your performance needs.
> there are very few use cases I can think of needing more than a 30 series card right now.
How about I like high refresh and high resolutions? I'll throw in VR to boot. Which are my real use cases. I use a high refresh 4K display and VR, both have benefited hugely from my 2080Ti > 4090 Shift.
I mean, most people probably won't directly upgrade. Their old card will die, or eventually nvidia will stop making drivers for it. Unless you're looking around for used cards, something low end like a 3060 isn't that much cheaper for the length of support you're going to get.
Unless nvidia's money printing machine breaks soon, expect the same to continue for the next 3+ years. Crappy expensive cards with a premium on memory with almost no actual video rendering performance increase.
> Unless you're looking around for used cards, something low end like a 3060 isn't that much cheaper for the length of support you're going to get.
This does not somehow give purchasers more budget room now, but they can buy 30-series cards in spades and, as a little bonus, not have to worry about the same heating and power-delivery problems.
I don’t want to jump on nvidia but I found it super weird when they clearly remote controlled a Disney bot onto the stage and claimed it was all using real time AI which was clearly impossible due to no latency and weirdly the bot verifying correct stage position in relation to the presenter. It was obviously the Disney bot just being controlled by someone off stage.
I found it super alarming because why would they fake something on stage to the extent of just lying. I know Steve Jobs had backup phones, but just claiming a robot is autonomous when it isn't, I just feel it was scammy.
It reminded me of when Tesla had remote controlled Optimus bots. I mean I think that’s awesome like super cool but clearly the users thought the robots were autonomous during that dinner party.
I have no idea why I seem to be the only person bothered by “stage lies” to this level. Tbh even the Tesla bots weren’t claimed to be autonomous so actually I should never have mentioned them but it explains the “not real” vibe.
Not meaning to disparage just explaining my perception as a European maybe it’s just me though!
EDIT > I'm kinda surprised by the weak arguments in the replies. I love both companies, I am just offering POSITIVE feedback: that it's important (in my eyes) to be careful not to pretend in certain specific ways, or it makes the viewer question the foundation (which we all know is SOLID and good).
EDIT 2 > There actually is a good rebuttal in the replies, although apparently I have "reading comprehension skill deficiencies". It's just my pov that they were insinuating the robot was aware of its surroundings, which is fair enough.
See this snippet: "Operator Commands Are Merged:
The control system blends expressive animation commands (e.g., wave, look left) with balance-maintaining RL motions"
I will print a full retraction if someone can confirm my gut feeling is correct
Having worked on control systems a long time ago, that's a 'nothing' statement: the whole job of the control system is to keep the robot stable/ambulating, regardless of whatever disturbances occur. It's meant to reject the forces induced due to waving exactly as much as bumping into something unexpected.
It's easier to stabilise from an operator initiated wave, really; it knows it's happening before it does the wave, and would have a model of the forces it'll induce.
I tried to understand the point of your reply but I'm not sure what your point was - I only seemed to glean "it's easier to balance if the operator is moving it".
Please elaborate unless I'm being thick.
EDIT > I upvoted your comment in any case as I'm sure it's helping
'control system' in this case is not implying remote control, it's referring to the feedback system that adjusts the actuators in response to the sensed information. If the motion is controlled automatically, then the control loop can in principle anticipate the motion in a way that it could not if it was remote controlled: i.e. the opposite, it's easier to control the motions (in terms of maintaining balance and avoiding overstressing the actuators) if the operator is not live puppeteering it.
It's that there's nothing special about blending "operator initiated animation commands" with the RL balancing system. The balance system has to balance anyway; if there was no connection between an operator's wave command and balance, it would have exactly the same job to do.
At best the advantage of connecting those systems is that the operator command can inform the balance system, but there's nothing novel about that.
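A toy illustration of that last point (nothing to do with Disney's actual controller; just a minimal 1-D "keep the body upright" loop with made-up gains): if the balance loop is told about the scripted arm-wave before it happens, it can cancel it with feedforward instead of reacting to it after the fact.

    # Minimal 1-D balance loop: a body disturbed by a known, scripted "arm wave"
    def simulate(feedforward, steps=200, dt=0.01):
        x, v = 0.0, 0.0                        # tilt and tilt rate
        kp, kd = 200.0, 30.0                   # hypothetical PD gains
        max_tilt = 0.0
        for k in range(steps):
            wave = 1.0 if 50 <= k < 100 else 0.0   # disturbance from the wave animation
            u = -kp * x - kd * v               # feedback keeps the body upright
            if feedforward:
                u -= wave                      # the command is known ahead of time
            a = wave + u                       # net torque-like acceleration
            v += a * dt
            x += v * dt
            max_tilt = max(max_tilt, abs(x))
        return max_tilt

    print("reactive only   :", round(simulate(False), 4))
    print("with feedforward:", round(simulate(True), 4))
    # With feedforward the scripted wave is cancelled almost exactly; the purely
    # reactive loop still recovers, it just has to absorb a transient first.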
"RL is not AI" "Disney bots were remote controlled" are major AI hypebro delulu moment lol
Your understanding of AI and robotics is more cucumber- than pear-shaped. You're making very little technical sense here. Challenges and progress in robotics aren't where you think they are. It's all propagandistic content you're basing your understanding on.
If you're getting information from TikTok or YouTube Shorts style content, especially around Tesla bros - get the hell out of it at Ludicrous Speed. Or consume way more of it so thoroughly that you cannot be deceived anymore despite blatant lies everywhere. Then come back. They're all plain wrong and it's not good for you.
> I don’t want to jump on nvidia but I found it super weird when they clearly remote controlled a Disney bot onto the stage and claimed it was all using real time AI which was clearly impossible due to no latency and weirdly the bot verifying correct stage position in relation to the presenter. It was obviously the Disney bot just being controlled by someone off stage.
I don't know what you're referring to, but I'd just say that I don't believe what you are describing could have possibly happened.
Nvidia is a huge corporation, with more than a few lawyers on staff and on retainer, and what you are describing is criminal fraud that any plaintiff's lawyer would have a field day with. So, given that, and since I don't think people who work at Nvidia are complete idiots, I think whatever you are describing didn't happen the way you are describing it. Now, it's certainly possible there was some small print disclaimer, or there was some "weasel wording" that described something with ambiguity, but when you accuse someone of criminal fraud you want to have more than "hey this is just my opinion" to back it up.
Tefal literally sells a rice cooker that boasts "AI Smart Cooking Technology" while not even containing a microcontroller and just being controlled by the time-honored technology of "a magnet that gets hot". They also have lawyers.
AI doesn't mean anything. You can claim anything uses "AI" and just define what that means yourself. They could have some basic anti-collision technology and claim it's "AI".
They're soaked eyebrows deep in Tiktok style hype juice, believing that latest breakthrough in robotics is that AGIs just casually started walking and talking on their own and therefore anything code controlled by now is considered proof of ineptitude and fake.
It's complete cult crazy talk. Not even cargocult, it's proper cultism.
Disney are open about their droids being operator controlled. Unless nvidia took a Disney droid and built it to be autonomous (which seems unlikely) it would follow that it is also operator controlled. The presentation was demonstrating what Disney had achieved using nvidia’s technology. You can see an explainer of how these droids use machine learning here: https://youtube.com/shorts/uWObkOV71ZI
If you think the droid was autonomous then I guess that is evidence that nvidia were misrepresenting (if not lying).
Having seen these droids outside of the nvidia presentation and watching the nvidia presentation, I think it’s obvious it was human operated and that nvidia were misleading people.
I assume any green accounts that are just asking questions with no research are usually lying. Actual new users will just comment and say their thoughts to join the community.
It seems to me like both cases raised by OP - the Disney droids and Optimus - are cases of people making assumptions and then getting upset that their assumptions were wrong and making accusations.
Neither company was very forthcoming about the robots being piloted, but neither seems to be denying it either. And both seem to use RL / ML techniques to maintain balance, locomotion, etc. Not unlike Boston Dynamics' bots, which are also very carefully orchestrated by humans in multiple ways.
Yet he lists all the RL stuff that we know is used in the robot. He isn't staying silent or saying "this robot is aided by AI", or better yet, not commenting on the specifics (which would have been totally ok); instead he is saying "This is real life simulation", which it isn't.
EDIT > apparently I am wrong - thank you for the correction everyone!
I have written motion control firmwares for 20+ years, and "this is real time simulation" has very domain-specific meaning to me. "Real time" means the code is responding to events as they happen, like with interrupts, and not via preemptible processing which could get out of sync with events. "simulation" is used by most control systems from simple PID loops to advanced balancing and motion planning.
It is clearly - to me at least - doing both of those things.
I think you're reading things into what he said that aren't there.
Yea, this seems like the initial poster has reading comprehension skill deficiencies and is blaming NVIDIA for lying about a point they never made. NVIDIA is even releasing some of the code they used to power the robot, which further proves that they in no way said the robot was not being operator controlled, just that it was using AI to make its movement look more fluid.
I seem to remember multiple posts on large tech websites having the exact same opinion/conclusion/insinuation as the one you originally had, so not necessarily comprehension problem on your part. My opinion: Nvidia's CEO has a problem communicating in good faith. He absolutely knew what he was doing during that little stage show, and it was absolutely designed to mislead people toward the most "AI HYPE, PLEASE BUY GPUs, MY ROBOT NEEDS GPUS TO LIVE" conclusion
I wonder if the 12VHPWR connector is intentionally defective to prevent large-scale use of those consumer cards in server/datacenter contexts?
The failure rate is just barely acceptable in a consumer use-case with a single card, but with multiple cards the probability of failure (which takes down the whole machine, as there's no way to hot-swap the card) makes it unusable.
I can't otherwise see why they'd persevere on that stupid connector when better alternatives exist.
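The scaling argument is just compounded probability (the per-connector failure rate below is made up purely for illustration):

    # If one connector fails with probability p over its life, a machine with n
    # of them sees at least one failure with probability 1 - (1 - p)^n.
    p = 0.005                                  # hypothetical 0.5% per-connector rate
    for n in (1, 2, 4, 8):
        machine_failure = 1 - (1 - p) ** n
        print(f"{n} connector(s): {machine_failure:.1%} chance of at least one failure")
    # 0.5% on a gaming box is an annoyance; ~4% across an 8-GPU server is not.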
It boggles my mind that an army of the most talented electrical engineers on earth somehow fumble a power connector and then don’t catch it before shipping.
Sunk cost fallacy and a burning (literal) desire to have small artistic things. That's probably also the reason the connector was densified so much and, clearly, released with so VERY little tolerance for error, human and otherwise.
IANAL, but knowingly leaving a serious defect in your product at scale for that purpose would be very bad behavior, and juries tend not to like that sort of thing.
However, as we’ve learned from the Epic vs Apple case, corporations don’t really care about bad behavior — as long as their ulterior motives don’t get caught.
Anyone else getting a bit disillusioned with the whole tech hardware improvements thing? Seems like every year we get less improvement for higher cost and the use cases become less useful. Like the whole industry is becoming a rent seeking exercise with diminishing returns. I used to follow hardware improvements and now largely don't because I realised I (and probably most of us) don't need it.
It's staggering that we are throwing so many resources at marginal improvements for things like gaming, and I say that as someone whose main hobby used to be gaming. Ray tracing, path tracing, DLSS, etc at a price point of $3000 just for the GPU - who cares when a 2010 cell shaded game running on an upmarket toaster gave me the utmost joy?
And the AI use cases don't impress me either - seems like all we do each generation is burn more power to shove more data through and pray for an improvement (collecting sweet $$$ in the meantime).
Another commenter here said it well, there's just so much more you can do with your life than follow along with this drama.
What stands out to me is that it's not just the hardware side; software production to make use of it and realize the benefits on offer doesn't seem to be running smoothly either, at least for gaming. I'm not sure nvidia really cares too much, though, as there's no market pressure on them where it's a weakness; if consumer GPUs disappeared tomorrow they'd be fine.
A few months ago Jensen Huang said he sees quantum computing as the next big thing he wants nvidia to be a part of over the next 10-15 years (which seems like a similar timeline as GPU compute), so I don't think consumer GPUs are a priority for anyone. Gaming used to be the main objective with byproducts for professional usage, for the past few years that's reversed where gaming piggybacks on common aspects to compute.
Your disillusionment is warranted, but I'll say that on the Mac side the grass has never been greener. The M chips are screamers year after year, the GPUs are getting ok, the ML cores are incredible and actually useful.
Good point, we should commend genuinely novel efforts towards making baseline computation more efficient, like Apple has done as you say. Particularly in light of recent x86 development which seems to be "shove as many cores as possible on a die and heat your apartment while your power supply combusts" (meanwhile the software gets less efficient by the day, but that's another thing altogether...). ANY DAY of the week I will take a compute platform that's no-bs no-bells-and-whistles simply more efficient without the manufacturer trying to blow smoke up our asses.
> The RTX 50 series are the second generation of NVIDIA cards to use the 12VHPWR connector.
This is wrong. The 50 series uses 12V-2x6, not 12VHPWR. The 30 series was the first to use 12VHPWR. The 40 series was the second to use 12VHPWR and the first to use 12V-2x6. The 50 series was the second to use 12V-2x6. The female connectors are what changed in 12V-2x6; the male connectors are identical between 12V-2x6 and 12VHPWR.
The guy accuses Nvidia of not doing anything about that problem, but ignores that they did with the 12V-2x6 connector, which, as far as I can tell, has had far fewer issues.
It is a connector. None of the connectors inside a PC have those. They could add them to the circuitry on the PCB side of the connector, but that is entirely separate from the connector.
That said, the industry seems to be moving to adding detection into the PSU, given seasonic’s announcement:
Finally, I think there is a simpler solution, which is to change the cable to use two large gauge wires instead of 12 individual ones to carry current. That would eliminate the need for balancing the wires in the first place.
Previous well-designed video cards used the technologies I described. Eliminating the sense circuits and fusing is a recent development.
I do like the idea of just using big wires. It’d be so much cleaner and simpler. Also using 24 or 48V would be nice, but that’d be an even bigger departure from current designs.
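The appeal of a higher bus voltage is easy to show with rough numbers (the cable resistance below is illustrative): current scales as 1/V, and resistive heating in the cable scales as I², i.e. as 1/V².

    # The same 600 W delivered at different bus voltages
    power_w = 600.0
    cable_resistance_ohm = 0.01                # illustrative round-trip resistance
    for v in (12.0, 24.0, 48.0):
        i = power_w / v
        loss_w = i ** 2 * cable_resistance_ohm
        print(f"{v:4.0f} V bus: {i:5.1f} A, ~{loss_w:4.1f} W heating the cable")
    # 12 V -> 50 A; 24 V -> 25 A (a quarter of the heating); 48 V -> 12.5 A (1/16).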
It seems incredibly wrong to assume that there was only 1 issue with 12VHPWR. 12V-2x6 was an improvement that eliminated some potential issues, not all of them. If you want to eliminate all of them, replace the 12 current-carrying wires with 2 large gauge wires. Then the wires cannot become unbalanced. Of course, the connector would need to split the two into 12 very short wires to be compatible, but those would be recombined on the GPU's PCB into a single wire.
(context: 12VHPWR and 12V-2x6 are the exact same thing. The latter is supposed to be improved and totally fixed, complete with the underspecced load-bearing "supposed to be" clause.)
> The competing open standard is FreeSync, spearheaded by AMD. Since 2019, NVIDIA also supports FreeSync, but under their “G-Sync Compatible” branding. Personally, I wouldn’t bother with G-Sync when a competing, open standard exists and differences are negligible[4].
Open is good, but the open standard itself is not enough. You need some kind of testing/certification, which is built in to the G-Sync process. AMD does have a FreeSync certification program now which is good.
If you rely on just the standard, some manufacturers get really lazy. One of my screens technically supports FreeSync but I turned it off day one because it has a narrow range and flickers very badly.
The article complains about issues with consumer GPUs but those are nowadays relegated to being merely a side hobby project of Nvidia, whose core business is enterprise AI chips. Anyway Nvidia still has no significant competition from AMD on either front so they are still getting away with this.
Deceptive marketing aside, it's true that it's sad that we can't get 4K 60 Hz with ray tracing with current hardware without some kind of AI denoising and upscaling, but ray tracing is really just _profoundly_ hard so I can't really blame anyone for not having figured out how to put it in a consumer pc yet. There's a reason why pixar movies need huge render farms that take lots of time per frame. We would probably sooner get gaussian splatting and real time diffusion models in games than nice full resolution ray tracing tbh.
Yes, but why should I care, provided the product they have already sold me continues to work? How does this materially change my life because Nvidia doesn't want to go steady with me anymore?
With the rise of LLM training, Nvidia's main revenue stream switched to datacenter GPUs (>10x gaming revenue). I wonder whether this has affected the quality of these consumer cards, including both their design and production processes:
It’s reasonable to argue that NVIDIA has a de facto monopoly in the field of GPU-accelerated compute, especially due to CUDA (Compute Unified Device Architecture). While not a legal monopoly in the strict antitrust sense (yet), in practice, NVIDIA's control over the GPU compute ecosystem — particularly in AI, HPC, and increasingly in professional content creation — is extraordinarily dominant.
Antitrust in the strict sense doesn't require an actual monopoly to trigger, just using your standing in the market to gain unjust advantages. That does not require a monopoly situation, merely a strong position used wrongly (like abusing vertical integration). Standard Oil, to take a famous example, never had more than a 30% market share.
Breaking up a monopoly can be a solution to that, however. But having a large part of a market by itself doesn't trigger antitrust legislation.
Agreed. An excellent summary of a lot of missteps that have been building for a while. I had seen that article on the power connector / shunt resistors and was dumbfounded at the seemingly rank-amateurish design. And although I don't have a 5000 series GPU, I have been astonished at how awful the drivers have been for the better part of a year.
As someone who fled the AMD/ATi ecosystem due to its quirky unreliability, Nvidia and Intel have really shit the bed these days (I also had the misfortune of "upgrading" to a 13th gen Intel processor just before we learned that they cook themselves).
I do think DLSS supersampling is incredible but Lord almighty is it annoying that the frame generation is under the same umbrella because that is nowhere near the same, and the water is awful muddy since "DLSS" is often used without distinction
Aka "Every beef anyone has ever had with Nvidia in one outrage-friendly article."
If you want to hate on Nvidia, there'll be something for you in there.
An entire section on 12vhpwr connectors, with no mention of 12V-2x6.
A lot of "OMG Monopoly" and "why won't people buy AMD" without considering that maybe ... AMD cards are not considered by the general public to be as good _where it counts_. (Like benefit per Watt, aka heat.)
Maybe it's all perception, but then AMD should work on that perception. If you want the cooler CPU/GPU, perception is that that's Intel/Nvidia. That's reason enough for me, and many others.
Availability isn't great, I'll admit that, if you don't want to settle for a 5060.
> The RTX 4090 was massive, a real heccin chonker. It was so huge in fact, that it kicked off the trend of needing support brackets to keep the GPU from sagging and straining the PCIe slot.
This isn't true. People were buying brackets with 10 series cards.
That bit does seem a bit whiney. AMD's latest offerings are quite good, certainly better value for money. Why not buy that? The only shame is that they don't sell anything as massive as Nvidia's high end.
Choosing the vendor-locked-in, standards-hating brand does tend to mean that you inevitably get screwed when they decide to massively inflate their prices and there's nothing you can do about it. That does tend to make you worse off, yes.
Not that AMD was anywhere near being in a good state 10 years ago. Nvidia still fucked you over.
I sometimes wonder if people getting this salty over "fake" frames actually realise every frame is fake even in native mode. Neither is more "real" than the other, it's just different.
A friend of mine is a SW developer at Nvidia, working on their drivers. He was complaining lately that he is required to fix a few bugs in the driver code for the new card (RTX?), while not being provided with the actual hardware. His pleas to be sent this HW were ignored, but the demand to fix by a deadline kept being pushed.
He actually ended up buying older but somewhat similar used hardware with his personal money, to be able to do his work.
Not even sure if he was eventually able to expense it, but I wouldn't be surprised if not, knowing how big-company bureaucracy works...
Oh man, you haven't gotten into their AI benchmark bullshittery. There's factors of 4x on their numbers that are basically invented whole cloth by switching units.
I disagree with some of the article’s points - primarily, that nVidia’s drivers were ever “good” - but the gist I agree with.
I have a 4070 Ti right now. I use it for inference and VR gaming on a Pimax Crystal (2880x2880x2). In War Thunder I get ~60 FPS. I’d love to be able to upgrade to a card with at least 16GB of VRAM and better graphics performance… but as far as I can tell, such a card does not exist at any price.
Nvidia is full of shit, but this article is full of shit, too.
A lot of human slop, some examples:
- 12VHPWR is not at fault / the issue. As the article itself points out, the missing power balancing circuit is to blame. The 3090 Ti had both 12VHPWR and the balancing power circuit and ran flawlessly.
- Nvidia G-Sync: Total non-issue. G-Sync native is dead. Since 2023, ~1000 Freesync Monitors have been released, and 3(!!) G-Sync native Monitors.
- The RTX 4000 series is not still expensive, it is again expensive. It was much cheaper a year before RTX 5000 release
- Anti-Sag Brackets were a thing way before RTX 4000
with Intel also shitting the bed, it seems like AMD is poised to pick up “traditional computing” while everybody else runs off to chase the new gold rush. Presumably there’s still some money in desktops and gaming rigs?
A lot of those Xeon e-waste machines were downright awful, especially for the "cheap gaming PC" niche they were popular in. Low single-core clock speeds, low memory bandwidth for desktop-style configurations and super expensive motherboards that ran at a higher wattage than the consumer alternatives.
> THIS will be the excitement everyone is looking for.
Or TSMC could become geopolitically jeopardized somehow, drastically increasing the secondhand value of modern GPUs even beyond what they're priced at now. It's all a system of scarcity, things could go either way.
Sure, eventually. Then in 2032, you can enjoy the raster performance that slightly-affluent people in 2025 had for years.
By your logic people should be snatching up the 900 and 1000-series cards by the truckload if the demand was so huge. But a GTX 980 is like $60 these days, and honestly not very competitive in many departments. Neither it nor the 1000-series have driver support nowadays, so most users will reach for a more recent card.
More a sequence of potential events than a timeframe.
High-end GPUs are already useless for gaming (a low-end GPU is enough), which was their traditional source of demand. They've been floating on artificial demand for a while now.
There are two markets that currently could use them: LLMs and Augmented Reality. Both of these are currently useless, and getting more useless by the day.
CPUs are just piggybacking on all of this.
So, lots of things hanging on unrealized promises. It will pop when there is no next use for super high-end GPUs.
War is a potential user of such devices, and I predict it could be the next thing after LLMs and AR. But then if war breaks out in such a scale to drive silicon prices up, lots of things are going to pop, and food and fuel will boom to such a magnitude that will make silicon look silly.
I think it will pop before it comes to the point of war driving it, and it will happen within our lifetimes (so, not a Nostradamus-style prediction that will only be realized long after I'm dead).
Local LLMs are becoming more popular and easier to run, and Chinese corporations are releasing extremely good models of all sizes, in many cases under MIT or similar terms. The amount of VRAM is the main limiter, and more of it would help with gaming too.
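Rough sizing shows why VRAM is the limiter (rule-of-thumb numbers; real usage also depends on context length, KV cache and runtime overhead):

    # Approximate VRAM for the weights alone: parameters * bytes per parameter
    def weight_gib(params_billions, bits):
        return params_billions * 1e9 * (bits / 8) / 2**30

    for params_b in (7, 14, 32, 70):
        line = ", ".join(f"{bits}-bit: {weight_gib(params_b, bits):5.1f} GiB"
                         for bits in (16, 8, 4))
        print(f"{params_b:>3}B model -> {line}")
    # A 32B model at 4-bit is ~15 GiB before the KV cache: workable on a 24 GB
    # card, hopeless on the 8-12 GB most consumer cards ship with.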
From a market perspective, LLMs sell GPUs. Doesn't even matter if they work or not.
From the geopolitical tensions perspective, they're the perfect excuse to create infrastructure for a global analogue of the Great Firewall (something that the Chinese are pioneers of, and catching up to the plan).
From the software engineering perspective, LLMs are a nuisance, a distraction. They harm everyone.
Really? What about textures? Any ML that the new wave of games might use? For instance, while current LLMs powering NPC interactions would be pretty horrible, what about in 2 years time? You could have arbitrary dialogue trees AND dynamically voiced NPCs or PCs. This is categorically impossible without more VRAM.
> the perfect excuse to create infrastructure for a global analogue of the Great Firewall
Yes, let's have more censorship and kill the dream of the Internet even deader than it already is.
> From the software engineering perspective, LLMs are a nuisance, a distraction. They harm everyone.
You should be aware that reasonable minds can differ on this issue. I won't defend companies forcing the use of LLMs (it would be like forcing the use of vim or any other tech you dislike), but I disagree that they are a nuisance, a distraction, or a universal harm. It's all down to choices and fit for the use case.
I don't see how GPU factories could be running in the event of war "in such a scale to drive silicon prices up". Unless you mean that supply will be low and people will be scavenging TI calculators for processors to make boxes playing Tetris and Space Invaders.
This is the exact model in which WWII operated. Car and plane supply chains were practically nationalized to support the military industry.
If drones, surveillance, satellites become the main war tech, they'll all use silicon, and things will be fully nationalized.
We already have all sorts of hints of this. Doesn't need a genius to predict that it could be what happens to these industries.
The balance with food and fuel is more delicate though. A war with drones, satellites and surveillance is not like WWII, there's a commercial aspect to it. If you put it on paper, food and fuel project more power and thus, can move more money. Any public crisis can make people forget about GPUs and jeopardize the process of nationalization that is currently being implemented, which still depends on relatively peaceful international trade.
CPU and GPU compute will be needed for military use processing the vast data from all sorts of sensors. Think about data centres crunching satellite imagery for trenches, fortifications and vehicles.
> satellite imagery for trenches, fortifications and vehicles
Dude, you're describing the 80s. We're in 2025.
GPUs will be used for automated surveillance, espionage, brainwashing and market manipulation. At least that's what the current batch of technologies implies.
The only thing stopping this from becoming a full dystopia is that delicate balance with food and fuel I mentioned earlier.
It has become pretty obvious that entire wealthy nations can starve if they make the wrong move. Turns out GPUs cannot produce calories, and there's a limit to how much of a market you can manipulate to produce calories for you.
Uhh, these 12VHPWR connectors seem like a serious fire risk. How are they not being recalled? I just got a 5060 Ti, and now I'm wishing I'd gone AMD instead... what the hell :(
Whoa, the stuff covered in the rest of the post is just as egregious. Wow! Maybe it's time to figure out which AMD model compares performance-wise and sell this thing, jeez.
The thing is, company culture is a real thing. And some cultures are invasive/contagious like kudzu both internally to the company and into adjacent companies that they get comped against. The people get to thinking a certain way, they move around between adjacent companies at far higher rates than to more distant parts of their field, the executives start sitting on one another's boards, before you know it a whole segment is enshittified, and customers feel like captives in an exploitation machine instead of parties to a mutually beneficial transaction in which trade increases the wealth of all.
And you can build mythologies around falsehoods to further reinforce it: "I have a legal obligation to maximize shareholder value." No buddy, you have some very specific restrictions on your ability to sell the company to your cousin (ha!) for a handful of glass beads. You have a legal obligation to bin your wafers the way it says on your own box, but that doesn't seem to bother you.
These days I get a machine like the excellent ASUS Proart P16 (grab one of those before they're all gone if you can) with a little 4060 or 4070 in it that can boot up PyTorch and make sure the model will run forwards and backwards at a contrived size, and then go rent a GB200 or whatever from Latitude or someone (seriously check out Latitude, they're great), or maybe one of those wildly competitive L40 series fly machines (fly whips the llama's ass like nothing since Winamp, check them out too). The GMTek EVO-X1 is a pretty capable little ROCm inference machine for under 1000, its big brother is nipping at the heels of a DGX Spark under 2k. There is good stuff out there but it's all from non-incumbent angles.
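For anyone wondering what that "make sure the model runs forwards and backwards at a contrived size" step looks like, here's a minimal sketch (the model and sizes are placeholders, not the setup described above):

    import torch
    import torch.nn as nn

    device = "cuda" if torch.cuda.is_available() else "cpu"

    # Tiny stand-in model at a contrived size, just to prove the plumbing works
    model = nn.Sequential(
        nn.Linear(512, 1024), nn.ReLU(),
        nn.Linear(1024, 512), nn.ReLU(),
        nn.Linear(512, 10),
    ).to(device)

    x = torch.randn(8, 512, device=device)             # small fake batch
    y = torch.randint(0, 10, (8,), device=device)

    loss = nn.functional.cross_entropy(model(x), y)    # forward pass
    loss.backward()                                     # backward pass
    print(f"ok on {device}, loss={loss.item():.3f}")
    # If this runs end to end locally, the same code is worth pointing at a
    # rented big-GPU box for the real-sized run.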
I don't game anymore but if I did I would be paying a lot of attention to ARC, I've heard great things.
Fuck the cloud and their ancient Xeon SKUs that cost more than Latitude charges for 5GHz EPYC. Fuck the NVIDIA gaming retail rat race; it's an electrical as well as a moral hazard in 2025.
It's a shame we all have to be tricky to get what used to be a halfway fair deal 5-10 years ago (and 20 years ago they passed a HUGE part of the scaling bonanza down to the consumer), but it's possible to compute well in 2025.
Dude, no one talks about this and it drives me up the wall. The only way to guarantee modern CPUs from any cloud provider is to explicitly provision really new instance types. If you use any higher-level abstracted services (Fargate, Cloud Run, Lambda, whatever) you get Salvation Army second-hand CPUs from 15 years ago, you're billed by the second so the slower, older CPUs screw you over there, and you pay a 30%+ premium over the lower-level instances because it's a "managed service". It's insane and extremely sad that so many customers put up with it.
Bare metal is priced like it always was but is mad convenient now. latitude.sh is my favorite, but there are a bunch of providers that are maybe a little less polished.
It's also way faster to deploy and easier to operate now. And mad global, I've needed to do it all over the world (a lot of places the shit works flawlessly and you can get Ryzen SKUs for nothing).
Protip: burn a partition with Ubuntu 24.04 LTS, which is the default on everything, and use that as "premium IPMI", even if you run Ubuntu. You can always boot into a known-perfect thing with all the tools to tweak whatever. If I even have to restart one I just image it; it's faster than launching a VM on EC2.
Nobody’s going to read this, but this article and sentiment is utter anti-corporate bullshit, and the vastly congruent responses show that none of you have watched the historical development of GPGPU, or do any serious work on GPUs, or keep up with the open work of nvidia researchers.
The spoiled gamer mentality is getting old for those of us that actually work daily in GPGPU across industries, develop with RTX kit, do AI research, etc.
Yes, they've had some marketing and technical flubs, as any giant publicly traded company will, but their balance of research-driven development alongside corporate profit necessities is unmatched.
And no I don’t work for nvidia. I’ve just been in the industry long enough to watch the immense contribution nvidia has made to every. single. field. The work of their researchers is astounding, it’s clear to anyone that’s honestly worked in this field long enough. It’s insane to hate on them.
Maybe the average consumer doesn't agree they are being treated like shit? The Steam top 10 GPU list is almost all NVidia. Happy customers or duped suckers? I've seen the latter sentiment a lot over the years, and discounting consumers' preferences never seems to lead to correct predictions of outcomes.
> Pretty much all upscalers force TAA for anti-aliasing and it makes the entire image on the screen look blurry as fuck the lower the resolution is.
I feel like this is a misunderstanding, though I admit I'm splitting hairs here. DLSS is a form of TAA, and so is FSR and most other modern upscalers. You generally don't need an extra antialiasing pipeline if you're getting an artificially supersampled image.
We've seen this technique variably developed across the lifespan of realtime raster graphics; first with checkerboard rendering, then TAA, then now DLSS/frame generation. It has upsides and downsides, and some TAA implementations were actually really good for the time.
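The core of every temporal technique mentioned here is the same accumulation step; a toy single-pixel sketch (deliberately ignoring the reprojection and history-rejection logic that makes real TAA hard):

    # Exponential accumulation of jittered samples -- the heart of TAA-style AA
    import random

    random.seed(0)
    TRUTH = 0.37                               # what this pixel should converge to
    def jittered_sample():                     # one noisy, jittered sample per frame
        return TRUTH + random.uniform(-0.3, 0.3)

    history = jittered_sample()
    alpha = 0.1                                # weight given to the newest frame
    for _ in range(60):
        history = alpha * jittered_sample() + (1 - alpha) * history
    print(f"accumulated estimate: {history:.3f} (ground truth {TRUTH})")
    # Real TAA adds motion-vector reprojection and history rejection; DLSS swaps
    # the hand-tuned heuristics for a network, but the jitter-and-accumulate core
    # (and its ghosting/blur failure modes) is the same.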
Every kind of TAA that I've seen creates artifacts around fast-moving objects. This may sound like a niche problem only found in fast-twitch games but it's cropped up in turn-based RPGs and factory/city builders. I personally turn it off as soon as I notice it. Unfortunately, some games have removed traditional MSAA as an option, and some are even making it difficult to turn off AA when TAA and FXAA are the only options (though you can usually override these restrictions with driver settings).
The sad truth is that with rasterization every renderer needs to be designed around a specific set of antialiasing solutions. Antialiasing is like a big wall in your rendering pipeline, there's the stuff you can do before resolving and the stuff you can do afterwards. The problem with MSAA is that it is pretty much tightly coupled with all your architectural rendering decisions. To that end, TAA is simply the easiest to implement and it kills a lot of proverbial birds with one stone. And it can all be implemented as essentially a post processing effect, it has much less of the tight coupling.
MSAA only helps with geometric edges, shader aliasing can be combatted with prefiltering but even then it's difficult to get rid of it completely. MSAA also needs beefy multisample intermediate buffers, this makes it pretty much a non-starter on heavily deferred rendering pipelines, which throw away coverage information to fit their framebuffer budget. On top of that the industry moved to stochastic effects for rendering all kinds of things that were too expensive before, the latest being actual realtime path tracing. I know people moan about TAA and DLSS but to do realtime path tracing at 4k is sort of nuts really. I still consider it a bit of a miracle we can do it at all.
Personally, I wish there was more research by big players into things like texture space lighting, which makes shading aliasing mostly go away, plays nice with alpha blending and would make MSAA viable again. The issue there is with shading only the stuff you see and not wasting texels.
There's another path, which is to raise the pixel densities so high we don't need AA (as much) anymore, but I'm going to guess it's a) even more expensive and b) not going to fix all the problems anyway.
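To put a rough number on the "beefy multisample intermediate buffers" point above (back-of-the-envelope, ignoring compression and driver tricks):

    # Rough framebuffer sizes at 4K with and without MSAA
    width, height = 3840, 2160
    pixels = width * height

    def mib(n_bytes):
        return n_bytes / 2**20

    for samples in (1, 4, 8):
        color = pixels * samples * 8           # RGBA16F, 8 bytes per sample
        depth = pixels * samples * 4           # 24-bit depth + 8-bit stencil
        print(f"{samples}x: color {mib(color):6.0f} MiB, depth {mib(depth):6.0f} MiB")
    # And that is one color target; a deferred G-buffer multiplies the color cost
    # by every attachment, which is why deferred rendering + MSAA rarely happens.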
It's not that it's difficult to turn off TAA: it's that so many modern techniques do not work without temporal accumulation and anti-aliasing.
Ray tracing? Temporal accumulation and denoising. Irradiance cache? Temporal accumulation and denoising. most modern light rendering techniques cannot be done in time in a single frame. Add to that the fact that deferred or hybrid rendering makes implementing MSAA be anywhere between "miserable" and "impossible", and you have the situation we're in today.
A lot of this is going to come down to taste so de gustibus and all that, but this feels like building on a foundation of sand. If the artifacts can be removed (or at least mitigated), then by all means let's keep going with cool new stuff as long as it doesn't detract from other aspects of a game. But if they can't be fixed, then either these techniques ought to be relegated to special uses (like cutscenes or the background, kinda like the pre-rendered backdrops of FF7) or abandoned/rethought as pretty but impractical.
The 4090 was released coming up on 3 years ago and is currently going for about 25% over launch MSRP, USED. GPUs are literally an appreciating asset. It is complete insanity and an infuriating situation for an average consumer.
I honestly don't know why nvidia didn't just suspend their consumer line entirely. It's clearly no longer a significant revenue source and they have thoroughly destroyed consumer goodwill over the past 5 years.
>I honestly don't know why nvidia didn't just suspend their consumer line entirely.
It's ~$12 billion a year with a high gross margin by the standards of every other hardware company. They want to make sure neither AMD nor Intel get that revenue they can invest into funding their own AI/ML efforts.
>How is it that one can supply customers with enough stock on launch consistently for decades, and the other can’t?
I guess the author is too young and didn't go through the iPhone 2G to iPhone 6 era. It's also worth remembering it wasn't too long ago that Nvidia was sitting on nearly ONE full year of GPU stock unsold. That completely changed how Nvidia does supply chain management and forecasting, which unfortunately had a negative impact all the way into the 50 series. I believe they have since changed, and next gen should be better prepared. But you can only do so much when AI demand is seemingly unlimited.
>The PC, as gaming platform, has long been held in high regards for its backwards compatibility. With the RTX 50 series, NVIDIA broke that going forward. PhysX.....
Glide? What about all the audio driver APIs before that? As much as I wish everything were backward compatible, that is just not how the world works. Just like with any old game, you need some fiddling to get it to work. And they even make the code available so people can actually do something rather than resort to emulation or reverse engineering.
>That, to me, was a warning sign that maybe, just maybe, ray tracing was introduced prematurely and half-baked.
Unfortunately that is not how it works. Do we want to go back through everything from pre-3dfx to today and count how many things we thought were great ideas for 3D accelerators, only for them to be replaced by better ideas or implementations? These ideas were good on paper but didn't work well. We then learn from them and iterate.
>Now they’re doing an even more computationally expensive version of ray tracing: path tracing. So all the generational improvements we could’ve had are nullified again......
How about: path tracing is simply a better technology? Game developers also don't have to use any of this tech; the article acts as if Nvidia forces all games to use it. Gamers want better graphics quality, and artists and graphics assets are already by far the most expensive item in game development, with the cost still increasing. Hardware improvements are what allow that quality to be achieved at lower cost (to game developers).
>Never mind that frame generation introduces input lag that NVIDIA needs to counter-balance with their “Reflex” technology,
No. That is not why "Reflex" tech was invented. Nvidia spends R&D on 1000 fps monitors as well, and potentially sub-1ms frame monitors. They have always been latency sensitive.
------------------------------
I have no idea how modern Gamers became what they are today. And this isn't the first time I have read it, even on HN. You don't have to buy Nvidia. You have AMD and now Intel (again). Basically I can summarise it in one sentence: Gamers want Nvidia's best GPU for the lowest price possible, or a price they think is acceptable, without understanding the market dynamics or anything about supply chains or manufacturing. They also want higher "generational" performance, like 2x every 2 years. And if they don't get it, it is Nvidia's fault. Not TSMC, not Cadence, not Tokyo Electron, not Isaac Newton or the laws of physics. But Nvidia.
Nvidia's PR tactics aren't exactly new in the industry. Every single brand does something similar. Do I like it? No. But unfortunately that is how the game is played. And Apple is by far the worst offender.
I do sympathise with the cable issue though. And it's not the first time Nvidia has had thermal issues. But then again, they are also the ones constantly pushing the boundary forward. And AFAIK the issues aren't as bad as with the 40 series, but some YouTubers seem to be making a bigger issue of it than most. Supply will be better, but TSMC 3nm is fully booked. The only possible solution would be to make consumer GPUs less capable of AI workloads, or to build AI GPUs on the leading-edge node while consumer parts stay a node behind to split the capacity problem. I would imagine that is part of the reason why TSMC is accelerating its 3nm capacity increase on US soil. Nvidia is now also large enough, and has enough cash, to take on more risk.
Yeah, computer graphics has always been "software trickery" all the way down. There are valid points to be made about DLSS being marketed in misleading ways, but I don't think it being "software trickery" is a problem at all.
Exactly. Running games at a lower resolution isn't new. I remember changing the size of the viewport in the original DOOM 1993 to get it to run faster. Making a lower resolution look better without having to run at a higher resolution is the exact same problem anti-aliasing has been tackling forever. DLSS is just another form of AA that is now so advanced, you can go from an even lower resolution and still look good.
So even when I'm running a game at native resolution, I still want anti-aliasing, and DLSS is a great choice then.
It's one thing to rely on a technique like AA to improve visual quality with negligible drawbacks. DLSS is entirely different though, since upscaling introduces all kinds of graphical issues, and frame generation[1] even more so, while adding considerable input latency. NVIDIA will claim that this is offset by its Reflex feature, but that has its own set of issues.
So, sure, we can say that all of this is ultimately software trickery, but when the trickery is dialed up to 11 and the marketing revolves entirely on it, while the raw performance is only slightly improved over previous generations, it's a clear sign that consumers are being duped.
[1]: I'm also opposed to frame generation from a philosophical standpoint. I want my experience to be as close as possible to what the game creator intended. That is, I want every frame to be generated by the game engine; every object to look as it should within the world, and so on. I don't want my graphics card to create an experience that approximates what the creator intended.
This is akin to reading a book on an e-reader that replaces every other word with one chosen by an algorithm. I want none of that.
I don't think we are? Article talks about DLSS on RTX 20 series cards, which do not support DLSS frame-gen:
> What always rubbed me the wrong way about how DLSS was marketed is that it wasn’t only for the less powerful GPUs in NVIDIA’s line-up. No, it was marketed for the top of the line $1,000+ RTX 20 series flagship models to achieve the graphical fidelity with all the bells and whistles.
The article doesn't make the best argument to support the claim but it's true that NVIDIA is now making claims like '4090 level performance' on the basis that if you turn on DLSS multi-frame generation you suddenly have Huge Framerates when most of the pixels are synthesized instead of real.
Personally I'm happy with DLSS on balanced or quality, but the artifacts from framegen are really distracting. So I feel like it's fair to call their modern marketing snake oil since it's so reliant on frame gen to create the illusion of real progress.
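The arithmetic behind that kind of claim is simple (illustrative numbers; 4x is the advertised maximum for multi-frame generation):

    # Displayed fps vs. rendered fps with frame generation
    rendered_fps = 30                          # what the GPU actually renders
    for generated_per_rendered in (0, 1, 3):   # off, 2x FG, 4x MFG
        displayed = rendered_fps * (1 + generated_per_rendered)
        print(f"{1 + generated_per_rendered}x: {displayed:3d} fps on screen, "
              f"input still sampled every ~{1000 / rendered_fps:.0f} ms")
    # 30 rendered fps can be marketed as "120 fps", while responsiveness stays at
    # roughly 30-fps level (slightly worse, since frames are held for pacing).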
Here's something I don't understand: Why is it that when I go look at DigitalOcean's GPU Droplet options, they don't offer any Blackwell chips? [1] I thought Blackwell was supposed to be the game-changing hyperchip that carried AI into the next generation, but the best many providers still offer are Hopper H100s? Where are all the Blackwell chips? It's been oodles of months.
Apparently AWS has them available in the P6 instance type, but the only configuration they offer has 2TB of memory and costs... $113/hr [2]? Like, what is going on at Nvidia?
Where the heck is Project Digits? Like, I'm developing this shadow opinion that Nvidia actually hasn't built anything new in three years, but they fill the void by talking about hypothetical newtech that no one can actually buy + things their customers have built with the actually good stuff they built three years ago. Like, consumers can never buy Blackwell because "oh Enterprises have bought them all up" then when Microsoft tries to buy any they say "Amazon bought them all up" and vice-versa. Something really fishy is going on over there. Time to short.
> So 7 years into ray traced real-time computer graphics and we’re still nowhere near 4K gaming at 60 FPS, even at $1,999.
The guy is complaining that a product can't live up to his standard, while dismissing a barely noticeable proposed trade-off that can make it possible because it's «fake».
I'm so happy to see someone calling NVIDIA out for their bullshit. The current state of GPU programming sucks, and that's just an example of the problems with the GPU market today.
The lack of open source anything for GPU programming makes me want to throw my hands up and just do Apple. It feels much more open than pretending that there's anything open about CUDA on Linux.
Another perspective: Nvidia customer support on their Mellanox purchase... is total crap. It's the worst of corporate America... paper-pushing bureaucratic guys who slow-roll stuff... getting to a smart person behind the customer reps requires one to be an ape in a bad mood 5x... I think they're so used to that now that unless you go crazy mode their take is... well, I guess he wasn't serious about his ask and he dropped it.
Here's another Nvidia/Mellanox BS problem: many Mellanox NICs are finalized or post-assembled by, say, HP. So if you have an HP "Mellanox" NIC, Nvidia washes their hands of anything detailed. It's not ours; HP could have done anything to it, what do we know? So one phones HP... and they have no clue either, because it's really not their IP or their drivers.
It's a total cluster-bleep, and more and more it's why corporate America sucks.
Corporate America actually resembles the state of government a lot too. Deceptive marketing, inflated prices that leave the average Joe behind, and low quality products on top of all that.
In the 1980s maybe a course correction was needed to help capitalism. But it's over corrected by 30%. I'm not knocking corporate america or capitalism in absolute terms. I am saying customers have lost power... whether it's phone trees, right to fix, a lack of accountability (2008 housing crisis), the ability to play endless accounting games to pay lower taxes plus all the more mundane things ... it's gotten out of whack.
`I think it also has a way to automatically translate CUDA calls`
I suspect the thing you're referring to is ZLUDA[0], it allows you to run CUDA code on a range of non NVidia hardware (for some value of "run").
[0] https://github.com/vosen/ZLUDA
It's mostly about AI training at this point. The software for this only supports CUDA well.
AMD RT is still slower than Nvidia's.
I don't have much experience with ROCm for large training runs, but NVIDIA is still shit with driver + CUDA version + other things. The only simplification comes from Ubuntu and other distros that already do the heavy lifting by installing all the required components without much configuration.
Oh I'm sure. The thing is that with AMD I have the same luxury, and the wretched thing still doesn't work, or has regressions.
Nvidia is the high end, AMD is the mid segment, and Intel is the low end. In reality I am playing HellDivers at 4K with 50-60 FPS on a 6800 XT.
Traditionally the NVIDIA drivers have been more stable on Windows than the AMD drivers. I chose an AMD card because I wanted a hassle-free experience on Linux (well, as much as you can).
I've used an AMD card for a couple of years.
It's been great. Flawless, in fact.
> AMD GPUs aren't good enough.
Software. AMD has traditionally been really bad at their drivers. (They also missed the AI train and are trying to catch up).
I use Linux and have learned not to touch AMD GPUs (and to a lesser extent CPUs due to chipset quality/support) a long time ago. Even if they are better now, (I feel) Intel integrated (if no special GPU perf needed) or NVidia are less risky choices.
This is wrong. For 14 years the recommendation on Linux has been AMD.
Because the AMD drivers are good and open-source, and AMD cares about bug reports. The Nvidia driver can and will create issues because it's closed-source, and for years Nvidia avoided supporting Wayland. Now Nvidia has published source code but refuses to merge it into Linux and Mesa (facepalm). While Nvidia comes up with proprietary stuff, AMD brought us Vulkan and FreeSync, already supported Wayland well with implicit sync (like Intel), and has used the regular video-acceleration APIs for a long time.
Meanwhile Nvidia:
https://registry.khronos.org/OpenGL/extensions/NV/NV_robustn...
Their bad drivers still don't handle simple actions like a VT switch or suspend/resume. If a developer doesn't know about that extension, the users suffer for years. Okay, but that is probably only a short-term solution? It has been Nvidia's short-term solution since 2016!
https://www.phoronix.com/news/NVIDIA-Ubuntu-2025-SnR
I've been using a 4090 on my Linux workstation for a few years now. It's mostly fine - with the occasional bad driver version randomly messing things up. I'm using Linux Mint. Mint uses X11, which, while silly, means suspend/resume works fine.
NVIDIA's drivers also recently completely changed how they worked. Hopefully that'll result in a lot of these long term issues getting fixed. As I understand it, the change is this: The nvidia drivers contain a huge amount of proprietary, closed source code. This code used to be shipped as a closed source binary blob which needed to run on your CPU. And that caused all sorts of problems - because it's Linux and you can't recompile their binary blob. Earlier this year, they moved all the secret, proprietary parts into a firmware image instead which runs on a coprocessor within the GPU itself. This then allowed them to - at last - open-source (most? all?) of their remaining Linux driver code. And that means we can patch and change and recompile that part of the driver. And that should mean the wayland & kernel teams can start fixing these issues.
In theory, users shouldn't notice any changes at all. But I suspect all the nvidia driver problems people have been running into lately have been fallout from this change.
> I use Linux and have learned not to touch AMD GPUs (and to a lesser extent CPUs due to chipset quality/support) a long time ago. Even if they are better now, (I feel) Intel integrated (if no special GPU perf needed) or NVidia are less risky choices.
Err, what? While you're right about Intel integrated GPUs being a safe choice, AMD has long since been the GPU of choice for Linux -- it just works. Whereas Nvidia on Linux has been flaky for as long as I can remember.
Had major problems with xinerama, suspend/resume, vsync, probably a bunch of other stuff.
That said, I've been avoiding AMD in general for so long the ecosystem might have really improved in the meantime, as there was no incentive for me to try and switch.
Recently I've been dabbling in AI where AMD GPUs (well, sw ecosystem, really) are lagging behind. Just wasn't worth the hassle.
NVidia hw, once I set it up (which may be a bit involved), has been pretty stable for me.
I run llama.cpp using Vulkan on AMD GPUs, no need to install any drivers (or management software for that matter, nor any need to taint the kernel, meaning if I have an issue it's easy to get support). For example, the other day when a Mesa update had an issue, I had a fix in less than 36 hours (without any support contract or fees), and `apt-mark hold` did a perfect job until there was a fix. Performance for me is within a couple of percentage points, and with under-volting I get better joules per token.
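For the curious, a minimal sketch of that kind of setup using the llama.cpp Python bindings (llama-cpp-python); the build flag, model filename, and prompt below are illustrative assumptions, not details from the comment above:

    # Assumes llama-cpp-python was built with the Vulkan backend, e.g.
    #   CMAKE_ARGS="-DGGML_VULKAN=on" pip install llama-cpp-python
    # On AMD this runs through Mesa's Vulkan driver; no ROCm or vendor
    # kernel module is required.
    from llama_cpp import Llama

    llm = Llama(
        model_path="./models/some-model.Q4_K_M.gguf",  # hypothetical local GGUF file
        n_gpu_layers=-1,  # offload every layer to the GPU via Vulkan
        n_ctx=4096,
    )

    out = llm("Explain why Vulkan avoids the need for ROCm here:", max_tokens=128)
    print(out["choices"][0]["text"])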
>Err, what? While you're right about Intel integrated GPUs being a safe choice, AMD has long since been the GPU of choice for Linux -- it just works. Whereas Nvidia on Linux has been flaky for as long as I can remember.
Not OP, but I had the same experience in the past with AMD. I bought a new laptop, and within 6 months AMD decided my card was obsolete and stopped providing drivers, forcing me to stay stuck on an older kernel/X11. So I switched to NVIDIA, and after 2 PC changes I still use NVIDIA since the official drivers work great. I really hope AMD is putting in the effort this time to keep older generations of cards working on the latest kernels/X11; maybe my next card will be AMD.
But this is an explanation of why some of us older Linux users have bad memories of AMD: we had good reason to switch over to NVIDIA and no good reason to switch back to AMD.
They have never been flaky on the x11 desktop
> I have also realized that there is a lot out there in the world besides video games
My main hobby is videogames, but since I can consistently play most games on Linux (that has good AMD support), it doesn't really matter.
AMD isn't even bad at video games; it's just PyTorch that doesn't work so well.
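For context, a hedged sketch of where the PyTorch-on-AMD confusion tends to start: ROCm builds of PyTorch reuse the torch.cuda API surface, so everything hinges on which build you installed and whether your card is supported. The install URL in the comments below is an assumption for illustration:

    # Probe whether a ROCm build of PyTorch can see the AMD GPU.
    # Assumes PyTorch was installed from a ROCm wheel index, e.g.
    #   pip install torch --index-url https://download.pytorch.org/whl/rocm6.0
    # (the exact ROCm version in that URL is an assumption).
    import torch

    print(torch.__version__)          # ROCm builds report something like "2.x.x+rocm..."
    print(torch.version.hip)          # HIP version string on ROCm builds, None on CUDA builds
    print(torch.cuda.is_available())  # ROCm devices are exposed through the torch.cuda API
    if torch.cuda.is_available():
        print(torch.cuda.get_device_name(0))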
Frames per watt they aren't as good. But they are still decent.
They seem to be close? The RX 9070 is the 2nd most efficient graphics card this generation according to TechPowerUp and they also do well when limited to 60Hz, implying their joules per frame isn't bad either.
Efficiency: https://tpucdn.com/review/gigabyte-geforce-rtx-5050-gaming-o...
Vsync power draw: https://tpucdn.com/review/gigabyte-geforce-rtx-5050-gaming-o...
The variance within Nvidia's line-up is much larger than the variance between brands, anyway.
I run 9070s (non XT) and in combination with under-volting it is very efficient in both joules per frame and joules per token. And in terms of purchase price it was a steal compared to similar class of NVidia cards.
The RX 9070XT goes toe-to-toe with the RTX 4080 in many benchmarks, and costs around 2/3 MSRP. I'd say that's a pretty big win!
TCO per FPS is almost certainly cheaper.
You are certainly right that this group has little spending self-control. There is just about no limit to how abusive companies like Hasbro, Nvidia and Nintendo can be and still rake in record sales.
They will complain endlessly about the price of a RTX 5090 and still rush out to buy it. I know people that own these high end cards as a flex, but their lives are too busy to actually play games.
I'm not saying that these companies aren't charging "fair" prices (whatever that means), but for many hardcore gamers their spending per hour is tiny compared to other forms of entertainment. They may buy a $100 game and play it for over 100 hours. Maybe add another $1/hour for the console. Compared to someone who frequents the cinema, goes to the pub, or has many other common hobbies, it can be hard to say that gamers are getting screwed.
Now it is hard to draw a straight comparison. Gamers may spend a lot more time playing so $/h isn't a perfect metric. And some will frequently buy new games or worse things like microtransactions which quickly skyrocket the cost. But overall it doesn't seem like the most expensive hobby, especially if you are trying to spend less.
> I have also realized that there is a lot out there in the world besides video games
My favorite part about being a reformed gaming addict is the fact that my MacBook now covers ~100% of my computer use cases. The desktop is nice for Visual Studio but that's about it.
I'm still running a 5700XT in my desktop. I have absolutely zero desire to upgrade.
Put Linux on it, and you can even run software raytracing on it for games like Indiana Jones! It'll do something like ~70 fps medium 1080p IIRC.
No mesh shader support though. I bet more games will start using that soon.
Same here - actually, my PC broke in early 2024 and I still haven't fixed it. I quickly found out that without gaming, I no longer have any use for my PC, so now I just do everything on my MacBook.
I'm a reformed gaming addict as well and mostly play games over 10 years old, and am happy to keep doing that.
Still have 2080 RTX on primary desktop, it's more than enough for GUI.
Just got a PRO 6000 96GB for model tuning/training/etc. The cheapest 'good enough' option for my needs.
PCI reset bug makes it necessary to upgrade to 6xxx series at least.
> I'm still running a 5700XT in my desktop. I have absolutely zero desire to upgrade.
Same boat. I have 5700XT as well and since 2023, used mostly my Mac for gaming.
Same here. I got mine five years ago when I needed to upgrade my workstation to do work-from-home, and it's been entirely adequate since then. I switched the CPU from an AMD 3900 to a 5900, but that's the only upgrade. The differences from one generation to the next are pretty marginal.
> I have also realized that there is a lot out there in the world besides video games
...and even if you're all in on video games, there's a massive amount of really brilliant indie games on Steam that run just fine on a 1070 or 2070 (I still have my 2070 and haven't found a compelling reason to upgrade yet).
> I have also realized that there is a lot out there in the world besides video games, and getting all in a huff about it isn’t worth my time or energy.
I think more and more people will realize games are a waste of time for them and go on to find other hobbies. As a game developer, it kinda worries me. As a gamer, I can't wait for gaming to be a niche thing again, haha.
The games industry is now bigger than the movies industry. I think you're very wrong about this, as games are engaging in a way other consumption based media simply cannot replicate.
I played video games since I was a teenager. Loved them, was obsessed with them. Then sometime around 40 I just gave up. Not because of life pressure or lack of time but because I just started to find them really boring and unfulfilling. Now I’d much rather watch movies or read. I don’t know if the games changed or I changed.
I get that, I go through periods of falling in and out of them too after having grown up with them. But there is a huge fraction of my age group (and a little older) that have consistently had games as their main "consumption" hobby throughout.
And then there's the age group younger than me, for whom games are not only a hobby but also a "social place to be", I doubt they'll be dropping gaming entirely easily.
"it's just a fad"
Nah. Games will always be around.
Fortunately for your business model, there's a constant stream of new people to replace the ones who are aging out. But you have to make sure your product is appealing to them, not just to the same people who bought it last decade.
Also playing PC video games doesn't even require a Nvidia GPU. It does sorta require Windows. I don't want to use that, so guess I lost the ability to waste tons of time playing boring games, oh no.
Out of the 11 games I've bought through Steam this year, I've had to refund one (1) because it wouldn't run under Proton, two (2) had minor graphical glitches that didn't meaningfully affect my enjoyment of them, and two (2) had native Linux builds. Proton has gotten good enough that I've switched from spending time researching if I can play a game to just assuming that I can. Presumably ymmv depending on your taste in games of course, but I'm not interested in competitive multiplayer games with invasive anticheat which appears to be the biggest remaining pain point.
My experience with running non-game windows-only programs has been similar over the past ~5 years. It really is finally the Year of the Linux Desktop, only few people seem to have noticed.
It depends on the games you play and what you are doing. It is a mixed bag IME. If you are installing a game that is several years old it will work wonderfully. Most guides assume you have Arch Linux or are using one of the "gaming" distros like Bazzite. I use Debian (I am running Testing/Trixie RC on my main PC).
I play a lot of HellDivers 2, and despite what a lot of Linux YouTubers say, it doesn't work very well on Linux. The recommendation I got from people was to change distro. I do other stuff on Linux. The game slows down right when you need it to be running smoothly, no matter what resolution/settings you set.
Anything with anti-cheat probably won't work very well if at all.
I also wanted to play the old Command and Conquer games. Getting the fan-made patchers (not the games themselves) to run properly, the ones that fix a bunch of bugs EA/Westwood never fixed and add mod support, was more difficult than I cared to bother with.
Fedora 42, Helldivers 2
Make sure to change your Steam launch options to:
PULSE_LATENCY_MSEC=84 gamemoderun %command%
This will use gamemode to run it, give it priority, put the system in performance power mode, and will fix any pulse audio static you may be having. You can do this for any game you launch with steam, any shortcut, etc.
It's missing probably 15fps on this card between windows and Linux, and since it's above 100fps I really don't even notice.
It does seem to run a bit better under gnome with Variable Refresh Rate than KDE.
I will be honest, I just gave up. I couldn't get consistent performance on HellDivers 2. Many of the things you have mentioned I've tried, and found they don't make much of a difference or made things worse.
I did get it running nicely for about a day, and then an update was pushed and it ran like rubbish again. The game runs smoothly when initially running the map, and then there's a massive dip in frames for several seconds. This is usually when one of the bugs is jumping at you.
This game may work better on Fedora/Bazzite or <some other distro> but I find Debian to be super reliable and don't want to switch distro. I also don't like Fedora generally as I've found it unreliable in the past. I had a look at Bazzite and I honestly just wasn't interested. This is due to it having a bunch of technologies that I have no interest in using.
There are other issues that are tangential but related.
e.g.
I normally play on Super HellDive with other players in a Discord VC. Discord / Pipewire seems to reset my sound for no particular reason, and my Plantronics headset mic (good headset, not some gamer nonsense) will not be found. This requires a restart of pipewire/wireplumber and Discord (in that order). This happens often enough that I have a shell script alias called "fix_discord".
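A rough sketch of what such a fix_discord helper might look like, written in Python for illustration; the systemd user-service names and the Discord process/binary names are assumptions based on the description above, not the commenter's actual script:

    # fix_discord: restart the audio stack, then Discord (in that order),
    # as described above. Assumes systemd user services named pipewire,
    # pipewire-pulse and wireplumber, and a "discord" binary on PATH.
    import subprocess, time

    subprocess.run(["systemctl", "--user", "restart",
                    "pipewire", "pipewire-pulse", "wireplumber"], check=True)
    time.sleep(2)  # give the audio stack a moment to come back up

    subprocess.run(["pkill", "-x", "Discord"], check=False)  # fine if it wasn't running
    subprocess.Popen(["discord"])  # relaunch so it re-enumerates audio devices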
I have weird audio problems on HDMI (AMD card) thanks to a regression in the kernel (Kernel 6.1 with Debian worked fine).
I could mess about with this for ages and maybe get it working or just reboot into Windows which takes me all of a minute.
It is just easier to use Windows for Gaming. Then use Linux for work stuff.
I used Debian for about 15 years.
Honestly? Fedora is really the premier Linux distro these days. It's where most of the development is happening, by far.
All of my hardware, some old, some brand new (AMD card), worked flawlessly out of the box.
There was a point when you couldn't get me to use an rpm-based distro if my life depended on it. That time is long gone.
I don't want to use Fedora. Other than having found it unreliable, I switched to Debian because I was fed up with all the Windows-isms/corporate stuff that was enabled by default in the distro I was trying to get away from.
It's the same reason I don't want to use Bazzite. It misses the point of using a Linux/Unix system altogether.
I also learned a long time ago Distro Hopping doesn't actually fix your issues. You just end up either with the same issues or different ones. If I switched from Debian to Fedora, I suspect I would have many of the same issues.
e.g. if an issue is in the Linux kernel itself, such as HDMI audio on AMD cards having random noise, I fail to see how changing from one distro to another would help. Fedora might have a custom patch to fix this, but I could also take that patch and build my own kernel image (which I've done in the past, btw).
The reality is that most people doing development for various project / packages that make the Linux desktop don't have the setup I have and some of the peculiarities I am running into. If I had a more standard setup, I wouldn't have an issue.
Moreover, I would be using FreeBSD/OpenBSD or some other more traditional Unix system and ditch Linux if I didn't require some Linux-specific applications. I am considering moving to something like Artix / Devuan in the future if I do decide to switch.
My hesitation is around high-end settings: can Proton run 240Hz at 1440p on high settings? I'm switching anyway soon and might just have a separate machine for gaming, but I'd rather it be Linux. SteamOS looks promising if they release it for PC.
Proton often has better performance than gaming under Windows - partly because Linux is faster - so sure, it can run those settings.
The only games in my library at all that don't work on linux are indie games from the early 2000s, and I'm comfortable blaming the games themselves in this case.
I also don't play any games that require a rootkit, so..
good move, thats why i treat my windows install as a dumb game box, they can steal whatever data they want from that i dont care. i do my real work on linux, as far away from windows as i can possibly get.
Same way I treat my Windows machine, but that's also the reason I won't be swapping it to Linux any time soon. I use different operating systems for different purposes for a reason. It's great for compartmentalization.
When I am in front of Windows, I know I can permit myself to relax, breathe easy and let off some steam. When I am not, I know I am there to learn/earn a living/produce something etc. Most probably do not need this, but my brain does, or I would never switch off.
What works for me is having different Activities/Workspaces in KDE - they have different wallpapers, pinned programs in the taskbar, the programs themselves launch only in a specific Activity. I hear others also use completely different user accounts.
> It does sorta require Windows.
The vast majority of my gaming library runs fine on Linux. Older games might run better than on Windows, in fact.
True for single player, but if you're into multiplayer games anti-cheat is an issue.
If a game requires invasive anticheat, it is probably something I won't enjoy playing. Most likely the game will be full of cheaters anyway.
And yes, I rarely play anything online multiplayer.
Proton/Steam/Linux works damn nearly flawlessly for /most/ games. I've gone through an Nvidia 2060, a 4060, and now an AMD 6700 XT. No issues even for release titles at launch.
Steam's Wine thing works quite well. And yes, you need to fiddle and do workarounds, including giving up on getting some games to work.
Yeah Proton covers a lot of titles. It’s mainly games that use the most draconian forms of anticheat that don’t work.
It's Linux, what software doesn't need fiddling to work?
Other than maybe iOS what OSes in general don't need fiddling these days to be usable?
Yeah, but it's not worth it. Apparently the "gold" list on ProtonDB is games that allegedly work with tweaks. So like, drop in this random DLL and it might fix the game. I'm not gonna spend time on that.
Last one I ever tried was https://www.protondb.com/app/813780 with comments like "works perfectly, except multiplayer is completely broken" and the workaround has changed 3 times so far; also it lags no matter what. Gave up after stealing 4 different DLLs from Windows. It doesn't even have anticheat, it's just because of some obscure math library.
> Yeah, but it's not worth it. Apparently the "gold" list on ProtonDB is games that allegedly work with tweaks. So like, drop in this random DLL and it might fix the game. I'm not gonna spend time on that.
I literally never had to do that. Most tweaking I needed to do was switching proton versions here and there (which is trivial to do).
I've been running opensuse+steam and I never had to tweak a dll to get a game running. Albeit that I don't exactly chase the latest AAA, the new releases that I have tried have worked well.
Age of Empires 2 used to work well, without needing any babying, so I'm not sure why it didn't for you. I will see about spinning it up.
Seems a bit calculated and agreed across the industry. What can really make sense of Microsoft's acquisitions and ruining of billion dollar IPs? It's a manufactured collapse of the gaming industry. They want to centralize control of the market and make it a service based (rent seeking) sector.
I'm not saying they all got together and decided this together but their wonks are probably all saying the same thing. The market is shrinking and whether it's by design or incompetence, this creates a new opportunity to acquire it wholesale for pennies on the dollar and build a wall around it and charge for entry. It's a natural result of games requiring NVidia developers for driver tuning, bitcoin/ai and buying out capacity to prevent competitors.
The wildcard I can't fit into this puzzle is Valve. They have a huge opportunity here but they also might be convinced that they have already saturated the market and will read the writing on the wall.
I think the reason you see things like Blizzard killing off Overwatch 1 is because the Lindy effect applies in gaming as well. Some things are so sticky and preferred that you have to commit atrocities to remove them from use.
From a supply/demand perspective, if all of your customers are still getting high on the 5 (or 20) year old supply, launching a new title in the same space isn't going to work. There are not an infinite # of gamers and the global dopamine budget is limited.
Launching a game like TF2 or Starcraft 2 in 2025 would be viewed as a business catastrophe by the metrics most AAA studios are currently operating under. Monthly ARPU for gamers years after purchasing the Orange Box was approximately $0.00. Giving gamers access to that strong of a drug would ruin the demand for other products.
Petition related to companies like Blizzard killing games: https://eci.ec.europa.eu/045/public/#/screen/home
I purchased "approximately $0.00" in TF2 loot boxes. How much exactly? Left as an exercise to the reader.
When were microtransactions added to TF2? Probably years after the initial launch, and they worked so well the game became f2p.
People forget that TF2 was originally 20 dollars before hitting the F2P market.
I paid full price for the orange box
This is too clever for me, I think - 0?
Approximately. +/- 0
The video game industry has been through cycles like this before. One of them (the 1983 crash) was so bad it killed most American companies and caused the momentum to shift to Japan for a generation. Another one I can recall is the "death" of the RTS (real-time strategy) genre around 2010. They have all followed a fairly similar pattern and in none of them that I know of have things played out as the companies involved thought or hoped they would.
I worked in the video game industry from the 90s through to today. I think you are over generalizing or missing the original point. It's true that there have been boom and busts. But there are also structural changes. Do you remember CD-ROMs? Steam and the iPhone were structural changes.
What Microsoft is trying to do with Gamepass is a structural change. It may not work out the way that they plan but the truth is that sometimes these things do change the nature of the games you play.
But the thing is that Steam didn't cause the death of physical media. I absolutely do remember PC gaming before Steam, and between the era when it was awesome (StarCraft, Age of Empires, Unreal Tournament, Tribes, etc.) and the modern Steam-powered renaissance, there was an absolutely dismal era of disappointment and decline. Store shelves were getting filled with trash like "40 games on one CD!" and each new console generation gave retailers an excuse to shrink shelf space for PC games. Yet during this time, all of Valve's games were still available on discs!
I think Microsoft's strategy is going to come to the same result as Embracer Group. They've bought up lots of studios and they control a whole platform (by which I mean Xbox, not PC) but this doesn't give them that much power. Gaming does evolve and it often evolves to work around attempts like this, rather than in favor of them.
I am not saying that about Steam. In fact Steam pretty much saved triple A PC gaming. Your timeline is quite accurate!
>> Microsoft's strategy is going to come to the same result as Embracer Group.
I hope you are right.
If I were trying to make a larger point, I guess it would be that big tech companies (Apple, MSFT, Amazon) don't want content creators to be too important in the ecosystem and tend to support initiatives that emphasize the platform.
> big tech companies (Apple, MSFT, Amazon) don't want content creators to be too important in the ecosystem
100%. The platforms' ability to monetize in their favor is directly proportional to their relative power vs the most powerful creatives.
Thus, in order to keep more money, they make strategic moves that disempower creatives.
Not in the game industry but as a consumer this is very true. One example: ubiquitous access to transactions and payment systems gave a huge rise to loot boxes.
Also mobile games that got priced at $0.99 meant that only the unicorn level games could actually make decent money so In-App Purchases were born.
But also I suspect it is just a problem where as consumers we spend a certain amount of money on certain kinds of entertainment and if as a content producer you can catch enough people’s attention you can get a slice of that pie. We saw this with streaming services where an average household spent about $100/month on cable so Netflix, Hulu, et al all decided to price themselves such that they could be a portion of that pie (and would have loved to be the whole pie but ironically studios not willing to license everything to everyone is what prevented that).
Thankfully, RTS is healthy again! (To your point about cycles)
What RTS games are you playing now, please?
AoE2, baby. Still going strong, decades after launch.
And AoE4, one of the few high profile RTS games of the past years, is dead.
If it's manufactured it implies intent. Someone at Microsoft is doing it on purpose and, presumably, thinks it'll benefit them. I'm not sure how this can be seen as a win for them. They invested a massive amount of money into buying all those game studios. They also admitted Xbox hardware is basically dead. So the only way they can get any return on that investment is third-party hardware: either PlayStation or PC. If I were to choose, it would be PC for MS. They already have Game Pass, and Windows is the gaming OS. By giving business to Sony they would undermine those.
I don't think Nvidia wants a gaming collapse either. They might not prioritize it now, but they definitely know that it will persist in some form. They bet on AI (and crypto before it) because those are lucrative opportunities, but there's no guarantee they will last. So they squeeze as much as they can out of those while they can. They definitely want gaming as a backup. It might be not as profitable, and more finicky since it's a consumer market, but it's much more stable in the long run.
Valve is a private company so doesn’t have the same growth at all costs incentives. To Microsoft, the share price is everything.
> It's a manufactured collapse of the gaming industry. They want to centralize control of the market and make it a service based (rent seeking) sector.
It also won’t work, and Microsoft has developed no way to compete on actual value. As much as I hate the acquisitions they’ve made, even if Microsoft as a whole were to croak tomorrow I think the game industry would be fine.
New stars would arise; others suggesting the games industry would collapse and go away is like saying the music industry collapsing would stop people from making music.
Yes games can be expensive to make, but they don't have to be, and millions will still want new games to play. It is actually a pretty low bar for entry to bring an indie game to market (relative to other ventures). A triple A studio collapse would probably be an amazing thing for gamers, lots of new and unique indie titles. Just not great for profit for big companies, a problem I am not concerned with.
As much as they've got large resources, I'm not sure what projects they could reasonably throw a mountain of money at and expect to change things, and presumably benefit from in the future, instead of doing it just to be a force of chaos in the industry. Valve's efforts all seem to orbit around the store; that's their main business, and everything else seems like a loss-leader to get you buying through it, even if it comes across as a pet project of a group of employees.
The striking one for me is their Linux efforts; at least as far as I'm aware they don't do a lot that isn't tied to the Steam Deck (or similar devices) or running games available on Steam through Linux. Even the Deck APU is derived from the semi-custom work AMD did for the consoles; they're benefiting from a second, later harvest of work that MS/Sony invested (hundreds of millions?) in many years earlier. I suppose a lot of it comes down to what Valve needs to support their customers (developers/publishers); they don't see the point in pioneering and establishing some new branch of tech with developers.
I've always played a few games for many hours as opposed to many games for one playthrough. Subscription just does not make sense for me, and I suspect that's a big part of the market. Add to this the fact that you have no control over it and then top it off with potential ads and I will quit gaming before switching to subs only. Luckily there is still GoG and Steam doesn't seem like it will change but who knows.
This post is crazy nonsense: bad games companies have always existed, and the solution is easy: don't buy their trash. I buy mostly smaller indie games these days just fine.
Nvidia isn't purposely killing anything; they are just following the pivot into the AI nonsense. They have no choice: if they are in a unique position to make 10x by a pivot, they will, even if it might be a dumpster fire of a house of cards. It's immoral to just abandon the industry that created you, but companies have always been immoral.
Valve has an opportunity to what? Take over the video card hardware market? No. AMD and Intel are already competitors in that market and can't get any foothold (until, hopefully now, consumers have no choice but to shift to them).
This really makes no sense:
> This in turn sparked rumors about NVIDIA purposefully keeping stock low to make it look like the cards are in high demand to drive prices. And sure enough, on secondary markets, the cards go way above MSRP
Nvidia doesn't earn more money when cards are sold above MSRP, but they get almost all the hate for it. Why would they set themselves up for that?
Scalpers are a retail wide problem. Acting like Nvidia has the insight or ability to prevent them is just silly. People may not believe this, but retailers hate it as well and spend millions of dollars trying to combat it. They would have sold the product either way, but scalping results in the retailer's customers being mad and becoming some other company's customers, which are both major negatives.
Scalpers are only a retail-wide problem if (a) factories could produce more, but they calculated demand wrong, or (b) factories can't produce more, they calculated demand wrong, and under-priced MSRP relative to what the market is actually willing to pay, thus letting scalpers capture more of the profits.
Either way, scalping is not a problem that persists for multiple years unless it's intentional corporate strategy. Either factories ramp up production capacity to ensure there is enough supply for launch, or MSRP rises much faster than inflation. Getting demand planning wrong year after year after year smells like incompetence leaving money on the table.
The argument that scalping is better for NVDA is coming from the fact that consumer GPUs no longer make a meaningful difference to the bottom line. Factory capacity is better reserved for even more profitable data center GPUs. The consumer GPU market exists not to increase NVDA profits directly, but as a marketing / "halo" effect that promotes decision makers sticking with NVDA data center chips. That results in a completely different strategy where out-of-stock is a feature, not a bug, and where product reputation is more important than actual product performance, hence the coercion on review media.
Scalping and MSRP-baiting have been around for far too many years for nVidia to claim innocence. The death of EVGA's GPU line also revealed that nVidia holds most of the cards in the relationship with its "partners". Sure, Micro Center and Amazon can only do so much, and nVidia isn't a retailer, but they know what's going on and their behavior shows that they actually like this situation.
Yeah wait, what happened with EVGA? (guess I can search it up, of course) I was browsing gaming PC hardware recently and noticed none of the GPUs were from EVGA .. I used to buy their cards because they had such a good warranty policy (in my experience)... :\
EVGA was angry because nVidia wouldn't pay them for attempts at scalping which failed.
In 2022, citing a lack of respect from Nvidia, low margins, and Nvidia's control over partners as just a few of the reasons, EVGA ended its partnership with Nvidia and ceased manufacturing Nvidia GPUs.
> I used to buy their cards because they had such a good warranty policy (in my experience)... :\
It's so wild to hear this, as in my country they were not considered anything special over any other third-party retailer; we have strong consumer protection laws, which means it's all much of a muchness.
Think of it this way: the only reason the 40 series and above are priced like they are is because they saw how willing people were to pay during the 30 series scalper days. This over-representation by the rich is training other customers that Nvidia GPUs are worth that much, so when they increase prices again people won't feel offended.
Did you just casually forget about the AI craze we are in the midst of? Nvidia still selling GPUs for gamers at all is a surprise to be honest.
Is AMD doing the same? From another post in this thread:
> Nowadays, $650 might get you a mid-range RX 9070 XT if you miraculously find one near MSRP.
If yes, then it's an industry-wide phenomenon.
> Nvidia doesn't earn more money when cards are sold above MSRP
How would we know if they were?
> Scalpers are a retail wide problem. Acting like Nvidia has the insight or ability to prevent them is just silly.
Oh trust me, they can combat it. The easiest way, which is what Nintendo often does for the launch of its consoles, is to produce an enormous number of units before launch. A steady supply to retailers absolutely destroys folks' ability to scalp. Yes, a few units will be scalped, but most scalpers will be underwater if there is a constant resupply. I know this because I used to scalp consoles during my teens and early twenties, and Nintendo's consoles were the least profitable and most problematic because they really try to supply the market. The same with iPhones: yeah, you might have to wait a month after launch to find one if you don't pre-order, but you can get one.
It's widely reported that most retailers had maybe tens of cards per store, or a few hundred nationally, for the 5090's launch. This immediately created a giant spike in demand and drove prices up, along with the incentive for scalpers. The manufacturing partners immediately saw what (some) people were willing to pay (to the scalpers) and jacked up prices so they could get their cut. It is still so bad in the case of the 5090 that MSRP prices from AIBs have skyrocketed 30%-50%. PNY had cards at the original $1,999.99 MSRP, and now those same cards can't be found for less than $2,999.99.
By contrast, look at how AMD launched its 9000 series of GPUs: each MicroCenter reportedly had hundreds on hand (and it sure looked like it from the pictures floating around). Folks were just walking in until noon and still able to get a GPU on launch day. Multiple restocks happened across many retailers immediately after launch. Are there still some inflated prices in the 9000 series GPUs? Yes, but we're not talking a 50% increase. Having some high-priced AIBs has always occurred, but what Nvidia has done by intentionally under-supplying the market is awful.
I personally have been trying to buy a 5090 FE since launch. I have been awake attempting to add to cart for every drop on BB but haven't been successful. I refuse to pay the inflated MSRP for cards that haven't been that well reviewed. My 3090 is fine... At this point, I'm so frustrated by NVidia I'll likely just piss off for this generation and hope AMD comes out with something that has 32GB+ of VRAM at a somewhat reasonable price.
Switch 2 inventory was amazing, but how was RX 9070 inventory remotely sufficient? News at the time was all about how limited its availability was https://www.tweaktown.com/news/103716/amd-rx-9070-xt-stock-a...
Not to mention it's nowhere to be found on Steam Hardware Survey https://store.steampowered.com/hwsurvey/videocard/
The 9070 XT stock situation went about like this; I bought a 5070 Ti instead.
>Oh trust me, they can combat it.
As has been explained by others, they can't. Look at the tech used by the Switch 2 and then look at the tech in the Nvidia 50 series.
And Nintendo didn't destroy scalpers; in many markets they are still not meeting demand despite producing "an enormous number of units before launch".
The W7900 has 48 GB and is reasonably priced.
It's $4.2k on Newegg; I wouldn't necessarily call it reasonably priced, even compared to NVidia.
If we're looking at the ultra high end, you can pay double that and get an RTX 6000 Pro with double the VRAM (96GB vs 48GB), double the memory bandwidth (1792 GB/s vs 864 GB/s) and much much better software support. Or you could get an RTX 5000 Pro with the same VRAM, better memory bandwidth (1344 GB/s vs 864 GB/s) at similar ~$4.5k USD from what I can see (only a little more expensive than AMD).
Why the hell would I ever buy AMD in this situation? They don't really give you anything extra over NVidia, while having similar prices (usually only marginally cheaper) and much, much worse software support. Their strategy was always "slightly worse experience than NVidia, but $50 cheaper and with much worse software support"; it's no wonder they only have less than 10% GPU market share.
> Nvidia doesn't earn more money when cards are sold above MSRP, but they get almost all the hate for it. Why would they set themselves up for that?
If you believe their public statements, because they didn't want to build out additional capacity and then have a huge excess supply of cards when demand suddenly dried up.
In other words, the charge of "purposefully keeping stock low" is something NVidia admitted to; there was just no theory of how they'd benefit from it in the present.
which card's demand suddenly dried up? Can we buy their excess stock already? please?
I didn't say that happened. I said that was why NVidia said they didn't want to ramp up production. They didn't want to end up overextended.
Nvidia shareholders make money when share price rises. Perceived extreme demand raises share prices
I haven't read the whole article, but a few remarks:
* The prices for Nvidia GPUs are insane. For that money you can have an extremely good PC with a good non-Nvidia GPU.
* The physical GPU sizes are massive; even letting the card rest on a horizontal motherboard looks scary.
* Nvidia still has issues with melting cables? I heard about those some years ago and thought it was a solved problem.
* Proprietary frameworks like CUDA and others are going to fall at some point, it's just a matter of time.
Looks as if Nvidia at the moment is only looking at the AI market (which, as a personal belief, has to burst at some point) and simply does not care about the non-AI GPU market at all.
I remember many, many years ago, when I was a teenager and 3dfx was the dominant graphics card manufacturer, that John Carmack prophetically predicted in a gaming computer magazine (the article was about Quake I) that the future wasn't going to be 3dfx and Glide. Some years passed and, effectively, 3dfx was gone.
Perhaps this is just the beginning of the same story that happened with 3dfx. I think AMD and Intel have a huge opportunity to balance the market and bring Nvidia down, both in the AI and gaming space.
I have only heard excellent things about Intel's ARC GPUs in other HN threads, and if I need to build a new desktop PC from scratch there's no way I'm paying the prices that Nvidia is pushing to the market; I'll definitely look at Intel or AMD.
High-end GPUs have, over the last 5 years, slowly been turning from an enthusiast product into a luxury product.
5 or maybe 10 years ago, a high-end GPU was needed to run games at reasonably eye-candy settings. In 2025, $500 mid-range GPUs are more than enough. Folks all over can barely tell the difference between High and Ultra settings, DLSS vs FSR, or DLSS FG and Lossless Scaling. There's just no point competing at the $500 price point any more; Nvidia has largely given up there, relegating it to the AMD-built consoles and integrated graphics like AMD APUs, which offer good value at the low end, medium end, and high end.
Maybe the rumored Nvidia PC, or the Switch 2, can bring some resurgence.
The fact that we're calling $500 GPUs "midrange" is proof that Nvidia's strategy is working.
What strategy? They charge more because manufacturing costs are higher; cost per transistor hasn't changed much since 28nm [0] but chips have more and more transistors. What do you think that does to the price?
[0]: https://www.semiconductor-digest.com/moores-law-indeed-stopp...
10 years ago, $650 would buy you a top-of-the-line gaming GPU (GeForce GTX 980 Ti). Nowadays, $650 might get you a mid-range RX 9070 XT if you miraculously find one near MSRP.
That is $880 in today's terms. And in 2015 Apple was already shipping a 16nm SoC. The GeForce GTX 980 Ti was still on 28nm, two node generations behind.
Keeping with inflation ($650 to $880), it'd get you a 5070 Ti.
$650 of 2015 USD is around $875 of 2025 USD fwiw
I bought a new machine with an RTX 3060 Ti back in 2020 and it's still going strong, no reason to replace it.
same, 2080 Super here, I even do AI with it
I think this is the even broader trend here
In their never ending quest to find ways to suck more money out of people, one natural extension is to just turn the thing into a luxury good and that alone seems to justify the markup
This is why new home construction is expensive - the layout of a home doesn’t change much but it’s trivial to throw on some fancy fixtures and slap the deluxe label on the listing.
Or take a Toyota, slap some leather seats on it, call it a Lexus and mark up the price 40% (I get that these days there are more meaningful differences but the point stands)
This and turning everything into subscriptions alone are responsible for 90% of the issues I have as a consumer
Graphics cards seem to be headed in this direction as well - breaking through that last ceiling for maximum fps is going to be like buying a Bentley (if it isn't already), whereas before it was just opting for the V8.
Nvidia's been doing this for a while now, since at least the Titan cards and technically the SLI/Crossfire craze too. If you sell it, egregiously-compensated tech nerds will show up with a smile and a wallet large enough to put a down-payment on two of them.
I suppose you could also blame the software side, for adopting compute-intensive ray tracing features or getting lazy with upscaling. But PC gaming has always been a luxury market, at least since "can it run Crysis/DOOM" was a refrain. The homogeneity of a console lineup hasn't ever really existed on PC.
Just going to focus on this one:
> DLSS vs FSR, or DLSS FG and Lossless Scaling.
I've used all of these (at 4K, 120hz, set to "balanced") since they came out, and I just don't understand how people say this.
FSR is a vaseline-like mess to me, it has its own distinct blurriness. Not as bad as naive upscaling, and I'll use it if no DLSS is available and the game doesn't run well, but it's distracting.
Lossless is borderline unusable. I don't remember the algorithm's name, but it has a blur similar to FSR. It cannot handle text or UI elements without artifacting (because it's not integrated in the engine, those don't get rendered at native resolution). The frame generation causes almost everything to have a ghost or afterimage - UI elements and the reticle included. It can also reduce your framerate because it's not as optimized. On top of that, the way the program works interferes with HDR pipelines. It is a last resort.
DLSS (3) is, by a large margin, the best offering. It just works and I can't notice any cons. Older versions did have ghosting, but it's been fixed. And I can retroactively fix older games by just swapping the DLL (there's a tool for this on GitHub, actually). I have not tried DLSS 4.
Maybe I over exaggerated, but I was dumbfounded myself reading people’s reaction to Lossless Scaling https://www.reddit.com/r/LinusTechTips/s/wlaoHl6GAS
Most people either can’t tell the difference, don’t care about the difference, or both. Similar discourse can be found about FSR, frame drop, and frame stutter. I have conceded that most people do not care.
I’ve used fsr 4 and dlss 4, I’d say fsr 4 is a bit ahead of dlss 3 but behind dlss 4. No more vaseline smear
Not quite $500, but at $650, the 9070 is an absolute monster that outperforms Nvidia's equivalent cards in everything but ray tracing (which you can only turn on with full DLSS framegen and get a blobby mess anyways)
AMD is truly making excellent cards, and with a bit of luck UDNA is even better. But they're in the same situation as Nvidia: they could sell 200 GPUs, ship drivers, maintain them, deal with returns and make $100k... Or just sell a single MI300X to a trusted partner that won't make any waves and still make $100k.
Wafer availability unfortunately rules all, and as it stands, we're lucky neither of them have abandoned their gaming segments for massively profitable AI things.
Some models of 9070 use the well-proven old style PCI-E power connectors too, which is nice. As far as I'm aware none of the current AIB midrange or high end Nvidia cards do this.
As I understand it, for the 50-series nvidia requires the 12VHPWR connector
I have a 2080 that I'm considering upgrading but not sure which 50 series would be the right choice.
I went from a 2080 Ti to a 5070 Ti. Yes it's faster, but for the games I play, not dramatically so. Certainly not what I'm used to from such a generational leap. The 5070 Ti is noticeably faster at local LLMs, and has a bit more memory, which is nice.
I went with the 5070 Ti since the 5080 didn't seem like a real step up, and the 5090 was just too expensive and wasn't in stock for ages.
If I had a bit more patience, I would have waited till the next node refresh, or for the 5090. I don't think any of the other current 50-series cards besides the 5090 are worth it if you're coming from a 2080. And by worth it I mean will give you a big boost in performance.
I went from a 3070 to 5070 Ti and it's fantastic. Just finished Cyberpunk Max'd out at 4k with DLSS balanced, 2x frame gen, and reflex 2. Amazing experience.
Grab a used/refurb 3090 then. Probably as legendary a card as the 1080 Ti.
Just pray that it's a 3090 under that lid when you buy it second hand
TSMC can only make about as many Nvidia chips as OpenAI and the other AI guys want to buy. Nvidia releases GPUs made from basically the shaving leftovers of the OpenAI products, which makes them limited in supply and expensive.
So gamers have to pay much more and wait much longer than before, which they resent.
Some youtubers make content that profits from the resentment, so they play fast and loose with the fundamental reasons in order to make gamers even more resentful. Nvidia has "crazy prices", they say.
But they're clearly not crazy. $2000 GPUs appear in quantities of 50+ from time to time at stores here, but they sell out in minutes. Lowering the prices would be crazy.
This is one reason; another is that Dennard scaling has stopped and GPUs have hit a memory wall for DRAM. The only reason AI hardware gets the significant improvements is that they are using big matmuls, and a lot of research has gone into getting lower-precision (now 4-bit) training working (numerical precision stability was always a huge problem with backprop).
NVIDIA is, and will be for at least the next year or two, supply constrained. They only have so much capacity at TSMC for all their chips, and the lion's share of that is going to enterprise chips, which sell for an order of magnitude more than the consumer chips.
It's hard to get too offended by them shirking the consumer market right now when they're printing money with their enterprise business.
Not personally offended, but when a company makes a big stink around several gross exaggerations (performance, price, availability) it's not hard to understand why folks are kicking up their own stink.
Nvidia could have said "we're prioritizing enterprise", but instead they put on a big dog and pony show about their consumer GPUs.
I really like the Gamer's Nexus paper launch shirt. ;)
They could rapidly build their own new factories, but they don't.
Are you saying Nvidia could spin up their own chip fabs in short order?
If they believed they were going to continue selling AI chips at those margins they would:
- outbid Apple on new nodes
- sign commitments with TSMC to get the capacity in the pipeline
- absolutely own the process nodes they made cards on that are still selling way above retail
NVIDIA has been posting net earnings in the 60-90 range over the last few years. If you think that's going to continue? You book the fab capacity hell or high water. Apple doesn't make those margins (which is what on paper would determine who is in front for the next node).
And what if Nvidia booked the capacity but the orders didn't come? What if Nvidia's customers aren't going to commit? How expensive is it, and how much prepayment is needed, for TSMC to break ground on a new fab?
These are the same questions Apple fans ask when they ask Apple to buy TSMC. The fact is it isn't so simple. And even if Nvidia were willing to pay for it, TSMC wouldn't do it just for Nvidia alone.
Yeah, I agree my "if" is doing a lot of lifting there. As in, "if Jensen were being candid and honest when he goes on stage and said things".
Big if, I get that.
Somebody should let Intel know.
They could be more honest about it though.
I was under the impression that a ton of their sales growth last quarter was actually from consumers. DC sales growth was way lower than I expected.
"It's hard to get too offended by them shirking the consumer"
BS! Nvidia isn't entitled. I'm not obligated. Customer always has final say.
The problem is a lot of customers can't or don't stand their ground. And the other side knows that.
Maybe you're a well trained "customer" by Nvidia just like Basil Fawlty was well trained by his wife ...
Stop excusing bs.
The real issue here is actually harebrained youtubers stirring up drama for views. That's 80% of the problem. And their viewers (and readers, for that which makes it into print) eat it up.
Idiots doing hardware installation, with zero experience, using 3rd party cables incorrectly, posting to social media, and youtubers jumping on the trend for likes.
These are 99% user error issues drummed up by non-professionals (and, in some cases, people paid by 3rd party vendors to protect those vendors' reputation).
And the complaints about transient performance issues with drivers, drummed up into apocalyptic scenarios, again, by youtubers who are putting this stuff under a microscope for views, are universal across every single hardware and software product. Everything.
Claiming "DLSS is snakeoil", and similar things are just an expression of the complete lack of understanding of the people involved in these pot-stirring contests. Like... the technique obviously couldn't magically multiply the ability of hardware to generate frames using the primary method. It is exactly as advertised. It uses machine learning to approximate it. And it's some fantastic technology, that is now ubiquitous across the industry. Support and quality will increase over time, just like every _quality_ hardware product does during its early lifespan.
It's all so stupid and rooted in greed by those seeking ad-money, and those lacking in basic sense or experience in what they're talking about and doing. Embarrassing for the author to so publicly admit to eating up social media whinging.
If you've ever watched a GN or LTT video, they never claimed that DLSS is snakeoil. They specifically call out the pros of the technology, but also point out that Nvidia lies, very literally, about its performance claims in marketing material. Both statements are true and not mutually exclusive. I think people like in this post get worked up about the false marketing and develop (understandably) a negative view of the technology as a whole.
> Idiots doing hardware installation, with zero experience, using 3rd party cables incorrectly
This is not true. Even GN reproduced the melting of the first-party cable.
Also, why shouldn't you be able to use third-party cables? Fuck DRM too.
I'm referring to the section header in this article. Youtubers are not a truly hegemonic group, but there's a set of ideas and narratives that pervade the group as a whole that different subsets buy into, and push, and that's one that exists in the overall sphere of people who discuss the use of hardware for gaming.
Well, I can't speak for all youtubers, but I do watch most GN and LTT videos and the complaints are legitimate; they are not random jabronis yolo'ing hardware installations.
As far as I know, neither of them have had a card unintentionally light on fire.
The whole thing started with Derbauer going to bat for a cable from some 3rd party vendor that he'd admitted he'd already plugged in and out of various cards something like 50 times.
The actual instances that youtubers report on are all reddit posters and other random social media users who would clearly be better off getting a professional installation. The huge popularity for enthusiast consumer hardware, due to the social media hype cycle, has brought a huge number of naive enthusiasts into the arena. And they're getting burned by doing hardware projects on their own. It's entirely unsurprising, given what happens in all other realms of amateur hardware projects.
Most of those who are whinging about their issues are false-positive user errors. The actual failure rates (and there are device failures) are far lower, and that's what warranties are for.
I'm sure the failure rates are blown out of proportion, I agree with that.
But the fact of the matter is that Nvidia has shifted from a consumer business to B2B, and they don't even give a shit about pretending they care anymore. People take issue with that, understandably, and when you couple that with the false marketing, the lack of inventory, the occasional hardware failure, missing ROPs, insane prices that nobody can afford, and all the other shit that's wrong with these GPUs, then this is the end result.
GN were the OG "fake framers", going back to them constantly casting shade on DLSS, ignoring it in their reviews, and also crapping on RT.
AI upscaling, AI denoising, and RT were clearly the future even 6 years ago. CDPR and the rest of the industry knew it, but outlets like GN pushed a narrative (borderline conspiracy) that the developers were somehow out of touch and didn't know what they were talking about.
There is a contingent of gamers who play competitive FPS. Most of them are, as in all casual competitive hobbies, not very good. But they ate up the 240Hz rasterization-be-all meat GN was feeding them. Then they think they are the majority and speak for all gamers (as every loud minority on the internet does).
Fast forward 6 years and NVidia is crushing the Steam top 10 GPU list, AI rendering techniques are becoming ubiquitous, and RT is slowly edging out rasterization.
Now that the data is clear the narrative is most consumers are "suckers" for purchasing NVidia, Nintendo, and etc. And the content creator economy will be there to tell them they are right.
Edit: I also believe some of these outlets had chips on their shoulders regarding NVidia going way back. So AMD's poor RT performance and lack of any competitive answer to the DLSS suite for YEARS had them lying to themselves about where the industry was headed. Essentially they were running interference for AMD. Now that FSR4 is finally here, it's like AI upscaling is finally okay.
Remember when Nvidia got caught dropping 2 bits of color information to beat ATI in benchmarks? I still can't believe anyone has trusted them since! That is an insane thing to do considering the purpose of the product.
For as long as they have competition, I will support those companies instead. If they all fail, I guess I will start one. My spite for them knows no limits
People need to start asking more questions about why the RTX 50 series (Blackwell) has almost no performance uplift over the RTX 40 series (Ada/Hopper), and also how, conveniently, it's impossible to find B200s.
[flagged]
> https://linustechtips.com/topic/1497989-amd-caught-cheating-...
The forum post you linked was an April Fools' joke.
It's sort of funny, because the second comment is:
"Kinda rather not do april 1st jokes like this as it does get cached and passed around after the fact without it being clear."
Egg, meet face. Pretty funny that this was obviously "Google, find posts that prove my point" with nary a further shred of investigation.
I am a volunteer firefighter and hold a degree in electrical engineering. The shenanigans with their shunt resistors, and ensuing melting cables, is in my view criminal. Any engineer worth their salt would recognize pushing 600W through a bunch of small cables with no contingency if some of them have failed is just asking for trouble. These assholes are going to set someone's house on fire.
I hope they get hit with a class action lawsuit and are forced to recall and properly fix these products before anyone dies as a result of their shoddy engineering.
Apparently somebody did sue a couple years back. Anyone know what happened with the Lucas Genova vs. nVidia lawsuit?
EDIT: Plaintiff dismissed it. Guessing they settled. Here are the court documents (alternately, shakna's links below include unredacted copies):
https://www.classaction.org/media/plaintiff-v-nvidia-corpora...
https://www.classaction.org/media/plaintiff-v-nvidia-corpora...
A GamersNexus article investigating the matter: https://gamersnexus.net/gpus/12vhpwr-dumpster-fire-investiga...
And a video referenced in the original post, describing how the design changed from one that proactively managed current balancing, to simply bundling all the connections together and hoping for the best: https://youtu.be/kb5YzMoVQyw
> NOTICE of Voluntary Dismissal With Prejudice by Lucas Genova (Deckant, Neal) (Filed on 3/10/2023) (Entered: 03/10/2023)
Sounds like it was settled out of court.
[0] https://www.docketalarm.com/cases/California_Northern_Distri...
Do those mention failing to follow Underwriters Laboratory requirements?
I’m curious whether the 5090 package was not following UL requirements.
Would that make them even more liable?
Part of me believes that the blame here is probably on the manufacturers and that this isn’t a problem with Nvidia corporate.
GamersNexus ftw as always
Also, like, I kind of want to play with these things, but also I'm not sure I want a computer that uses 500W+ in my house, let alone just a GPU.
I might actually be happy to buy one of these things, at the inflated price, and run it at half voltage or something... but I can't tell if that is going to fix these concerns or they're just bad cards.
It's not the voltage, it's the current you'd want to halve. The wire gauge required to carry power safely depends on the current load. That's why, when I first saw these new connectors and the loads they were being tasked with, it was a WTF moment for me. Better to just avoid them in the first place, though.
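A rough back-of-the-envelope sketch of why the current, not the voltage, is what matters here; the wire resistance and length are assumed ballpark figures for 16 AWG copper, not the actual 12VHPWR spec:

    # Heat in each wire scales with the square of its current (P = I^2 * R).
    POWER_W = 600.0
    VOLTAGE_V = 12.0
    WIRE_RES_OHM_PER_M = 0.013   # ~16 AWG copper, rough assumption
    WIRE_LEN_M = 0.6             # assumed cable length

    total_current = POWER_W / VOLTAGE_V  # 50 A across the whole connector

    for good_pins in (6, 3, 1):
        amps_per_wire = total_current / good_pins
        heat_w = amps_per_wire ** 2 * WIRE_RES_OHM_PER_M * WIRE_LEN_M
        print(f"{good_pins} pins carrying load: {amps_per_wire:.1f} A/wire, "
              f"~{heat_w:.1f} W dissipated per wire")

With all six 12V pins sharing the load each wire dissipates well under a watt; if the load crowds onto one wire, the same cable has to shed roughly 20 W along its length, which is where the melting starts.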
It's crazy, you don't even need to know about electricity after you see a thermal camera on them operating at full load. I'm surprised they can be sold to the general public, the reports of cables melting plus the high temps should be enough to force a recall.
With the 5080 using 300W, talking about 500W is a bit of an exaggeration, isn't it?
Has anyone made 12VHPWR cables that replace the 12 little wires with 2 large gauge wires yet? That would prevent the wires from becoming unbalanced, which should preempt the melting connector problem.
As a bonus, if the gauge is large enough, the cable would actually cool the connectors, although that should not be necessary since the failure appears to be caused by overloaded wires dumping heat into the connector as they overheat.
Might help a little bit, by heatsinking the contacts better, but the problem is the contact resistance, not the wire resistance. The connector itself dangerously heats up.
Or at least I think so? Was that a different 12VHPWR scandal?
Contact resistance is a problem.
Another problem is when the connector is angled, several of the pins may not make contact, shoving all the power through as few as one wire. A common bus would help this but the contact resistance in this case is still bad.
A common bus that is not also overheating would cool the overheating contact(s).
It would help, but my intuition is that the thin steel of the contact would not move the heat fast enough to make a significant difference. Only way to really know is to test it.
I think it's both contact and wire resistance.
It is technically possible to solder a new connector on. LTT did that in a video. https://www.youtube.com/watch?v=WzwrLLg1RR4
I thought that the contact resistance caused the unbalanced wires, which then overheat alongside the connector, giving the connector’s heat nowhere to go.
Or 12 strands in a single sheath so it's not overly rigid.
They don't specify 12 smaller wires for nothing when 2 larger ones would do. There are concerns here with mechanical compatibility (12 wires have a smaller allowable bend radius than 2 larger ones with the same ampacity).
One option is to use two very wide, thin insulated copper sheets as cable. Still has a good bend radius in one dimension, but is able to sink a lot of power.
This article goes much deeper than I expected, and is a nice recap of the last few years of "green" gpu drama.
Liars or not, the performance has not been there for me in any of my usecases, from personal to professional.
A system from 2017/2018 with an 8700K and an 8GB 2080 performs so closely to the top end, expensive systems today that it makes almost no sense to upgrade at MSRP+markup unless your system is older than this.
Unless you need specific features only on more recent cards, there are very few use cases I can think of needing more than a 30 series card right now.
> A system from 2017/2018 with an 8700K and an 8GB 2080 performs so closely to the top end, expensive systems today
This is in no way true and is quite an absurd claim, unless you meant some specific isolated purpose restricted purely to yourself and your performance needs.
> there are very few use cases I can think of needing more than a 30 series card right now.
How about I like high refresh rates and high resolutions? I'll throw in VR to boot. Those are my real use cases: I use a high-refresh 4K display and VR, and both have benefited hugely from my 2080 Ti to 4090 shift.
I mean, most people probably won't directly upgrade. Their old card will die, or eventually Nvidia will stop making drivers for it. Unless you're looking around for used cards, something low end like a 3060 isn't that much cheaper once you factor in the length of support you're going to get.
Unless nvidia's money printing machine breaks soon, expect the same to continue for the next 3+ years. Crappy expensive cards with a premium on memory with almost no actual video rendering performance increase.
> Unless you're looking around for used cards, the price difference between something low end like a 3060 isn't that much less in price for the length of support you're going to get.
This does not somehow give purchasers more budget room now, but they can buy 30-series cards in spades and, as a little bonus, not have to worry about the same heating and power delivery problems.
I don't want to jump on Nvidia, but I found it super weird when they clearly remote controlled a Disney bot onto the stage and claimed it was all using real-time AI. That was clearly impossible given the lack of latency and, weirdly, the bot verifying its correct stage position in relation to the presenter. It was obviously the Disney bot just being controlled by someone off stage.
I found it super alarming, because why would they fake something on stage to the extent of just lying? I know Steve Jobs had backup phones, but claiming a robot is autonomous when it isn't just feels scammy to me.
It reminded me of when Tesla had remote controlled Optimus bots. I mean I think that’s awesome like super cool but clearly the users thought the robots were autonomous during that dinner party.
I have no idea why I seem to be the only person bothered by “stage lies” to this level. Tbh even the Tesla bots weren’t claimed to be autonomous so actually I should never have mentioned them but it explains the “not real” vibe.
Not meaning to disparage just explaining my perception as a European maybe it’s just me though!
EDIT > I'm kinda surprised by the weak arguments in the replies. I love both companies, I am just offering POSITIVE feedback: it's important (in my eyes) to be careful not to pretend in certain specific ways, or it makes the viewer question the foundation (which we all know is SOLID and good).
EDIT 2 > There actually is a good rebuttal in the replies. Although apparently I have "reading comprehension skill deficiencies", it's just my POV that they were insinuating the robot was aware of its surroundings, which is fair enough.
As I understand it the Disney bots do actually use AI in a novel way: https://la.disneyresearch.com/publication/design-and-control...
So there’s at least a bit more “there” there than the Tesla bots.
I believe it's RL-trained only.
See this snippet: "Operator Commands Are Merged: The control system blends expressive animation commands (e.g., wave, look left) with balance-maintaining RL motions"
I will print a full retraction if someone can confirm my gut feeling is correct.
Having worked on control systems a long time ago, that's a 'nothing' statement: the whole job of the control system is to keep the robot stable/ambulating, regardless of whatever disturbances occur. It's meant to reject the forces induced due to waving exactly as much as bumping into something unexpected.
It's easier to stabilise from an operator initiated wave, really; it knows it's happening before it does the wave, and would have a model of the forces it'll induce.
I tried to understand the point of your reply but I'm not sure what it was; all I seemed to glean was "it's easier to balance if the operator is moving it".
Please elaborate, unless I'm being thick.
EDIT > I upvoted your comment in any case, as I'm sure it's helping
'control system' in this case does not imply remote control; it refers to the feedback system that adjusts the actuators in response to the sensed information. If the motion is controlled automatically, then the control loop can in principle anticipate the motion in a way it could not if it were remote controlled. I.e. the opposite: it's easier to control the motions (in terms of maintaining balance and avoiding overstressing the actuators) if the operator is not live-puppeteering it.
Apologies, yes, "control system" is somewhat niche jargon. "Balance system" is probably more appropriate.
Well, "control system" is a proper term that anyone with a decent STEM education has understood for the last 150 years.
Thank you for the explanation
It's that there's nothing special about blending "operator initiated animation commands" with the RL balancing system. The balance system has to balance anyway; if there was no connection between an operator's wave command and balance, it would have exactly the same job to do.
At best the advantage of connecting those systems is that the operator command can inform the balance system, but there's nothing novel about that.
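To make the parent's point concrete, here is a toy sketch (not Disney's or NVIDIA's actual controller) of a balance loop where a scripted wave is just a known, feedforward disturbance layered on top of ordinary feedback:

    # Toy balance controller: feedback handles any disturbance, feedforward
    # cancels the one the controller scheduled itself (the wave animation).
    def balance_step(tilt, tilt_rate, wave_torque_expected, kp=80.0, kd=12.0):
        feedback = -kp * tilt - kd * tilt_rate      # reacts to measured error
        feedforward = -wave_torque_expected         # cancels a known, scheduled motion
        return feedback + feedforward

    # A scripted wave gives the loop a feedforward term it can plan around;
    # a live-puppeteered wave does not, so only feedback is left to cope.
    torque_scripted = balance_step(tilt=0.02, tilt_rate=0.1, wave_torque_expected=1.5)
    torque_puppeteered = balance_step(tilt=0.02, tilt_rate=0.1, wave_torque_expected=0.0)

Either way the feedback term has to do its job, which is why "the balance system blends in operator commands" is not, by itself, a remarkable claim.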
"RL is not AI" "Disney bots were remote controlled" are major AI hypebro delulu moment lol
Your understanding of AI and robotics is more cucumber- than pear-shaped. You're making very little technical sense here. The challenges and progress in robotics aren't where you think they are. It's all propagandistic content you're basing your understanding on.
If you're getting information from TikTok or YouTube Shorts style content, especially around Tesla bros - get the hell out of it at Ludicrous Speed. Or consume way more of it so thoroughly that you cannot be deceived anymore despite blatant lies everywhere. Then come back. They're all plain wrong and it's not good for you.
Only as opposed to what? VLAM/something else more trendy?
Not just you.
I hate being lied to, especially if it's so the liar can reap some economic advantage from having the lie believed.
Yeah. I have a general rule that I don't do business with people who lie to me.
I can’t even imagine what kind of person would not follow that rule.
Do business with people that are known liars? And just get repeatedly deceived?
…Though upon reflection that would explain why the depression rate is so high.
> I don't want to jump on Nvidia, but I found it super weird when they clearly remote controlled a Disney bot onto the stage and claimed it was all using real-time AI. That was clearly impossible given the lack of latency and, weirdly, the bot verifying its correct stage position in relation to the presenter. It was obviously the Disney bot just being controlled by someone off stage.
I don't know what you're referring to, but I'd just say that I don't believe what you are describing could have possibly happened.
Nvidia is a huge corporation, with more than a few lawyers on staff and on retainer, and what you are describing is criminal fraud that any plaintiff's lawyer would have a field day with. So, given that, and since I don't think people who work at Nvidia are complete idiots, I think whatever you are describing didn't happen the way you are describing it. Now, it's certainly possible there was some small print disclaimer, or there was some "weasel wording" that described something with ambiguity, but when you accuse someone of criminal fraud you want to have more than "hey this is just my opinion" to back it up.
Tefal literally sells a rice cooker that boasts "AI Smart Cooking Technology" while not even containing a microcontroller and just being controlled by the time-honored technology of "a magnet that gets hot". They also have lawyers.
AI doesn't mean anything. You can claim anything uses "AI" and just define what that means yourself. They could have some basic anti-collision technology and claim it's "AI".
They're soaked eyebrows-deep in TikTok-style hype juice, believing that the latest breakthrough in robotics is that AGIs just casually started walking and talking on their own, and that therefore anything code-controlled is now considered proof of ineptitude and fakery.
It's complete cult crazy talk. Not even cargo cult, it's proper cultism.
There's also a very thick coat of hype in https://www.nvidia.com/en-us/glossary/ai-factory/ and related material, even though the underlying product (an ML training cluster) is real.
[dead]
Disney are open about their droids being operator controlled. Unless nvidia took a Disney droid and built it to be autonomous (which seems unlikely) it would follow that it is also operator controlled. The presentation was demonstrating what Disney had achieved using nvidia’s technology. You can see an explainer of how these droids use machine learning here: https://youtube.com/shorts/uWObkOV71ZI
If you think the droid was autonomous then I guess that is evidence that nvidia were misrepresenting (if not lying).
Having seen these droids outside of the nvidia presentation and watching the nvidia presentation, I think it’s obvious it was human operated and that nvidia were misleading people.
I think it's cool you disagree with me; it would be nice to hear a counter-argument though.
[dead]
I assume any green accounts that are just asking questions with no research are usually lying. Actual new users will just comment and say their thoughts to join the community.
[dead]
It seems to me like both cases raised by OP - the Disney droids and Optimus - are cases of people making assumptions and then getting upset that their assumptions were wrong and making accusations.
Neither company was very forthcoming about the robots being piloted, but neither seems to be denying it either. And both seem to use RL / ML techniques to maintain balance, locomotion, etc. Not unlike Boston Dynamics' bots, which are also very carefully orchestrated by humans in multiple ways.
Haters gonna hate (downvotes just prove it - ha!)
If you look at the video, he basically says "this is real time simulation... can you believe it": https://www.youtube.com/shorts/jD5y1eQ3Y_o
Yet he lists all the RL stuff that we know is used in the robot. He isn't staying silent, or saying "this robot is aided by AI", or better yet not commenting on the specifics (which would have been totally OK); instead he is saying "this is real time simulation", which it isn't.
EDIT > apparently I am wrong - thank you for the correction everyone!
I have written motion control firmwares for 20+ years, and "this is real time simulation" has very domain-specific meaning to me. "Real time" means the code is responding to events as they happen, like with interrupts, and not via preemptible processing which could get out of sync with events. "simulation" is used by most control systems from simple PID loops to advanced balancing and motion planning.
It is clearly - to me at least - doing both of those things.
I think you're reading things into what he said that aren't there.
ok thanks
Yea, this seems like the initial poster has reading comprehension deficiencies and is blaming NVIDIA for lying about a point they never made. NVIDIA is even releasing some of the code they used to power the robot, which further proves that they in no way said the robot was not being operator controlled, just that it was using AI to make its movement look more fluid.
fair enough, upvoted.
I seem to remember multiple posts on large tech websites with the exact same opinion/conclusion/insinuation as the one you originally had, so it's not necessarily a comprehension problem on your part. My opinion: Nvidia's CEO has a problem communicating in good faith. He absolutely knew what he was doing during that little stage show, and it was absolutely designed to mislead people toward the most "AI HYPE, PLEASE BUY GPUs, MY ROBOT NEEDS GPUS TO LIVE" conclusion.
[flagged]
Ableton Live is from Europe :)
You win the award for instant karma
oof!
And it has fallen vastly behind other DAWs
Crazy talk. All the others have been playing catchup and still aren’t there with some things.
I just want Acid Pro on Mac
How so?
I wonder if the 12VHPWR connector is intentionally defective to prevent large-scale use of those consumer cards in server/datacenter contexts?
The failure rate is just barely acceptable in a consumer use-case with a single card, but with multiple cards the probability of failure (which takes down the whole machine, as there's no way to hot-swap the card) makes it unusable.
I can't otherwise see why they'd persevere on that stupid connector when better alternatives exist.
It boggles my mind that an army of the most talented electrical engineers on earth somehow fumble a power connector and then don’t catch it before shipping.
Sunk cost fallacy and a (literally) burning desire to have small, artistic things. That's probably also the reason the connector was densified so much and, clearly, released with so VERY little tolerance for error, human and otherwise.
They use the 12VHPWR on some datacenter cards too.
IANAL, but knowingly leaving a serious defect in your product at scale for that purpose would be very bad behavior, and juries tend not to like that sort of thing.
However, as we’ve learned from the Epic vs Apple case, corporations don’t really care about bad behavior — as long as their ulterior motives don’t get caught.
Anyone else getting a bit disillusioned with the whole tech hardware improvements thing? Seems like every year we get less improvement for higher cost and the use cases become less useful. Like the whole industry is becoming a rent seeking exercise with diminishing returns. I used to follow hardware improvements and now largely don't because I realised I (and probably most of us) don't need it.
It's staggering that we are throwing so many resources at marginal improvements for things like gaming, and I say that as someone whose main hobby used to be gaming. Ray tracing, path tracing, DLSS, etc at a price point of $3000 just for the GPU - who cares when a 2010 cell shaded game running on an upmarket toaster gave me the utmost joy? And the AI use cases don't impress me either - seems like all we do each generation is burn more power to shove more data through and pray for an improvement (collecting sweet $$$ in the meantime).
Another commenter here said it well, there's just so much more you can do with your life than follow along with this drama.
What stands out to me is that it's not just the hardware side; software production to make use of it and realize the benefits doesn't seem to be running smoothly either, at least for gaming. I'm not sure Nvidia really cares too much, though, as there's no market pressure on them where it's a weakness: if consumer GPUs disappeared tomorrow they'd be fine.
A few months ago Jensen Huang said he sees quantum computing as the next big thing he wants nvidia to be a part of over the next 10-15 years (which seems like a similar timeline as GPU compute), so I don't think consumer GPUs are a priority for anyone. Gaming used to be the main objective with byproducts for professional usage, for the past few years that's reversed where gaming piggybacks on common aspects to compute.
Your disillusionment is warranted, but I'll say that on the Mac side the grass has never been greener. The M chips are screamers year after year, the GPUs are getting ok, the ML cores are incredible and actually useful.
Good point, we should commend genuinely novel efforts towards making baseline computation more efficient, like Apple has done as you say. Particularly in light of recent x86 development which seems to be "shove as many cores as possible on a die and heat your apartment while your power supply combusts" (meanwhile the software gets less efficient by the day, but that's another thing altogether...). ANY DAY of the week I will take a compute platform that's no-bs no-bells-and-whistles simply more efficient without the manufacturer trying to blow smoke up our asses.
I remember when it was a serious difference, like PS1-PS3 was absolutely miraculous and exciting to watch.
It's also fun that no matter how fast the hardware seems to get, we seem to fill it up with shitty bloated software.
Our stock investments are going up so ...... What can we do other than shrug
> The RTX 50 series are the second generation of NVIDIA cards to use the 12VHPWR connector.
This is wrong. The 50 series uses 12V-2x6, not 12VHPWR. The 30 series was the first to use 12VHPWR. The 40 series was the second to use 12VHPWR and the first to use 12V-2x6. The 50 series was the second to use 12V-2x6. The female connectors are what changed in 12V-2x6; the male connectors are identical between 12V-2x6 and 12VHPWR.
Nitpicking it doesn't change the fact that the 12v2x6 connector _also_ burns down.
The guy accuses Nvidia of not doing anything about that problem, but ignores that they did with the 12V-2x6 connector, which, as far as I can tell, has had far fewer issues.
It still has no fusing, sensing, or load balancing for the individual wires. It is a fire waiting to happen.
It is a connector. None of the connectors inside a PC have those. They could add them to the circuitry on the PCB side of the connector, but that is entirely separate from the connector.
That said, the industry seems to be moving to adding detection into the PSU, given seasonic’s announcement:
https://www.tomshardware.com/pc-components/power-supplies/se...
Finally, I think there is a simpler solution, which is to change the cable to use two large gauge wires instead of 12 individual ones to carry current. That would eliminate the need for balancing the wires in the first place.
Previous well-designed video cards used the technologies I described. Eliminating the sense circuits and fusing is a recent development.
I do like the idea of just using big wires. It’d be so much cleaner and simpler. Also using 24 or 48V would be nice, but that’d be an even bigger departure from current designs.
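For what it's worth, here is a conceptual sketch of the per-wire shunt sensing described above, with made-up shunt values and thresholds rather than anything from a real board:

    # Per-wire current sensing: measure the voltage drop across a small
    # shunt resistor on each 12 V input and flag any wire over its limit.
    SHUNT_OHMS = 0.005          # assumed shunt resistor value
    MAX_AMPS_PER_WIRE = 9.5     # assumed safe per-pin limit

    def check_wires(shunt_voltages_mv):
        currents = [mv / 1000.0 / SHUNT_OHMS for mv in shunt_voltages_mv]
        for i, amps in enumerate(currents):
            if amps > MAX_AMPS_PER_WIRE:
                # A real design would throttle the card or shift load across
                # phases; here we just flag the fault.
                print(f"wire {i}: {amps:.1f} A over limit, throttle or shut down")
        return currents

    # One poorly seated connector: most of the current crowds onto wire 0.
    check_wires([160.0, 42.0, 41.0, 5.0, 4.0, 3.0])

The point is simply that the board can only react to an imbalance it can actually measure; bundle everything into one unmonitored rail and there is nothing to react to until something melts.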
The 50 series connectors burned up too. The issue was not fixed.
It seems incredibly wrong to assume that there was only one issue with 12VHPWR. 12V-2x6 was an improvement that eliminated some potential issues, not all of them. If you want to eliminate all of them, replace the 12 current-carrying wires with 2 large-gauge wires. Then the wires cannot become unbalanced. Of course, the connector would need to split the two into 12 very short wires to be compatible, but those would be recombined on the GPU's PCB into a single conductor.
(context: 12VHPWR and 12V-2x6 are the exact same thing. The latter is supposed to be improved and totally fixed, complete with the underspecced load-bearing "supposed to be" clause.)
They are not the exact same thing.
https://www.corsair.com/us/en/explorer/diy-builder/power-sup...
> And I hate that they’re getting away with it, time and time again, for over seven years.
Nvidia's been at this way longer than 7 years. They were cheating at benchmarks to control a narrative back in 2003. https://tech.slashdot.org/story/03/05/23/1516220/futuremark-...
> The competing open standard is FreeSync, spearheaded by AMD. Since 2019, NVIDIA also supports FreeSync, but under their “G-Sync Compatible” branding. Personally, I wouldn’t bother with G-Sync when a competing, open standard exists and differences are negligible[4].
Open is good, but the open standard itself is not enough. You need some kind of testing/certification, which is built in to the G-Sync process. AMD does have a FreeSync certification program now which is good.
If you rely on just the standard, some manufacturers get really lazy. One of my screens technically supports FreeSync but I turned it off day one because it has a narrow range and flickers very badly.
The article complains about issues with consumer GPUs but those are nowadays relegated to being merely a side hobby project of Nvidia, whose core business is enterprise AI chips. Anyway Nvidia still has no significant competition from AMD on either front so they are still getting away with this.
Deceptive marketing aside, it's true that it's sad that we can't get 4K 60 Hz with ray tracing with current hardware without some kind of AI denoising and upscaling, but ray tracing is really just _profoundly_ hard so I can't really blame anyone for not having figured out how to put it in a consumer pc yet. There's a reason why pixar movies need huge render farms that take lots of time per frame. We would probably sooner get gaussian splatting and real time diffusion models in games than nice full resolution ray tracing tbh.
I get ray tracing at 4K 60Hz with my 4090 just fine
Really? I can't even play Minecraft (DXR: ON) at 4K 60Hz on a RTX 5090...
Maybe another regression in Blackwell.
If you are a gamer, you are no longer NVIDIA's most important customer.
Sounds like an opening for AMD then. But as long as NVidia has the best tech I'll keep buying it when it's time to upgrade.
A revelation on-par with Mac users waking up to learn their computer was made by a phone company.
Barely even a phone company, more like an app store and microtransaction services company
Yes, but why should I care, provided the product they have already sold me continues to work? How does it materially change my life that Nvidia doesn't want to go steady with me anymore?
Haven't been for a while. Not since crypto bros started buying up GPUs for coin mining.
With the rise of LLM training, Nvidia's main revenue stream switched to datacenter GPUs (>10x gaming revenue). I wonder whether this has affected the quality of these consumer cards, including both their design and production processes:
https://stockanalysis.com/stocks/nvda/metrics/revenue-by-seg...
It’s reasonable to argue that NVIDIA has a de facto monopoly in the field of GPU-accelerated compute, especially due to CUDA (Compute Unified Device Architecture). While not a legal monopoly in the strict antitrust sense (yet), in practice, NVIDIA's control over the GPU compute ecosystem — particularly in AI, HPC, and increasingly in professional content creation — is extraordinarily dominant.
> NVIDIA's control over the GPU compute ecosystem — particularly in AI, HPC
The two largest supercomputers in the world are powered by AMD. I don't think it's accurate to say Nvidia has a monopoly on HPC.
Source: https://top500.org/lists/top500/2025/06/
Antitrust in the strict sense doesn't require an actual monopoly to trigger; it's about using your standing in the market to gain unjust advantages. That doesn't require a monopoly situation, just a strong position used wrongly (like abusing vertical integration). Standard Oil, to take a famous example, never had more than a 30% market share.
Breaking up a monopoly can be a solution to that, however. But having a large part of a market by itself doesn't trigger antitrust legislation.
Thanks ChatGPT!
This was an efficient, well written, TKO.
Agreed. An excellent summary of a lot of missteps that have been building for a while. I had watched that video on the power connector/shunt resistors and was dumbfounded at the seemingly rank-amateurish design. And although I don't have a 5000 series GPU, I have been astonished at how awful the drivers have been for the better part of a year.
As someone who fled the AMD/ATI ecosystem due to its quirky unreliability, Nvidia and Intel have really shit the bed these days (I also had the misfortune of "upgrading" to a 13th gen Intel processor just before we learned that they cook themselves).
I do think DLSS supersampling is incredible, but Lord almighty is it annoying that frame generation is under the same umbrella, because that is nowhere near the same thing, and the water is awfully muddy since "DLSS" is often used without distinction.
Because they won't sell you an in-demand high-end GPU for cheap? Well TS
Not to mention that they are currently in stock at my local microcenter.
Aka "Every beef anyone has ever had with Nvidia in one outrage-friendly article."
If you want to hate on Nvidia, there'll be something for you in there.
An entire section on 12vhpwr connectors, with no mention of 12V-2x6.
A lot of "OMG Monopoly" and "why won't people buy AMD" without considering that maybe ... AMD cards are not considered by the general public to be as good _where it counts_. (Like benefit per Watt, aka heat.) Maybe it's all perception, but then AMD should work on that perception. If you want the cooler CPU/GPU, perception is that that's Intel/Nvidia. That's reason enough for me, and many others.
Availability isn't great, I'll admit that, if you don't want to settle for a 5060.
Consumer GPUs have felt like a "paper launch" for the past few years.
It's like they're purposely not selling them because they allocated 80% of their production to enterprise only.
I just hope the new fabs come online as early as possible, because these prices are insane.
> The RTX 4090 was massive, a real heccin chonker. It was so huge in fact, that it kicked off the trend of needing support brackets to keep the GPU from sagging and straining the PCIe slot.
This isn't true. People were buying brackets with 10 series cards.
> With over 90% of the PC market running on NVIDIA tech, they’re the clear winner of the GPU race. The losers are every single one of us.
Of course the fact that we overwhelmingly chose the better option means that… we are worse off or something?
That bit does seem a bit whiney. AMD's latest offerings are quite good, certainly better value for money. Why not buy that? The only shame is that they don't sell anything as massive as Nvidia's high end.
Choosing the vendor-locked-in, standards-hating brand does tend to mean you inevitably get screwed when they decide to massively inflate their prices and there's nothing you can do about it. That does tend to make you worse off, yes.
Not that AMD was anywhere near being in a good state 10 years ago. Nvidia still fucked you over.
I sometimes wonder if people getting this salty over "fake" frames actually realise every frame is fake even in native mode. Neither is more "real" than the other, it's just different.
A friend of mine is a SW developer at Nvidia, working on their drivers. He was complaining lately that he is required to fix a few bugs in the driver code for the new card (RTX?) while not being provided with the actual hardware. His pleas to be sent this HW were ignored, but the demand to fix the bugs by a deadline kept being pushed.
He actually ended up buying older but somewhat similar used hardware with his personal money to be able to do his work.
I'm not even sure he was eventually able to expense it, but I wouldn't be surprised if not, knowing how big-company bureaucracy works...
Read this in good faith but I don't see how it's supposed to be Nvidia's fault?
How could Nvidia realistically stop scalper bots?
Oh man, you haven't gotten into their AI benchmark bullshittery. There are factors of 4x on their numbers that are basically invented whole cloth by switching units.
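To make the unit-switching concrete, here is purely illustrative arithmetic with a made-up baseline (not NVIDIA's actual published specs), showing how a headline number can grow several-fold without any extra silicon:

    # Hypothetical chip rated at 100 TFLOPS of dense FP16 math.
    base_fp16_dense = 100.0

    fp8_dense = base_fp16_dense * 2    # half-width math doubles throughput
    fp4_dense = base_fp16_dense * 4    # quarter-width doubles it again
    fp4_sparse = fp4_dense * 2         # 2:4 structured sparsity doubles it on paper

    print(f"FP16 dense: {base_fp16_dense:.0f}  FP8 dense: {fp8_dense:.0f}  "
          f"FP4 dense: {fp4_dense:.0f}  FP4 + sparsity: {fp4_sparse:.0f}")

Quote the last number next to a competitor's (or last generation's) first number and the marketing slide writes itself, even though nothing got faster for like-for-like workloads.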
I disagree with some of the article’s points - primarily, that nVidia’s drivers were ever “good” - but the gist I agree with.
I have a 4070 Ti right now. I use it for inference and VR gaming on a Pimax Crystal (2880x2880x2). In War Thunder I get ~60 FPS. I’d love to be able to upgrade to a card with at least 16GB of VRAM and better graphics performance… but as far as I can tell, such a card does not exist at any price.
Nvidia is full of shit, but this article is full of shit, too. A lot of human slop, some examples:
- 12VHPWR is not at fault / the issue. As the article itself points out, the missing power-balancing circuit is to blame. The 3090 Ti had both 12VHPWR and the balancing power circuit and ran flawlessly.
- Nvidia G-Sync: Total non-issue. G-Sync native is dead. Since 2023, ~1000 Freesync Monitors have been released, and 3(!!) G-Sync native Monitors.
- The RTX 4000 series is not still expensive, it is again expensive. It was much cheaper a year before RTX 5000 release
- Anti-Sag Brackets were a thing way before RTX 4000
> ... NVENC are pretty much indispensable
What's so special about NVENC that Vulkan video or VAAPI can't provide?
> AMD also has accelerated video transcoding tech but for some reason nobody seems to be willing to implement it into their products
OBS works with VAAPI fine. Looking forward to them adding Vulkan video as an option.
Either way, as a Linux gamer I haven't touched Nvidia in years. AMD is a way better experience.
with Intel also shitting the bed, it seems like AMD is poised to pick up “traditional computing” while everybody else runs off to chase the new gold rush. Presumably there’s still some money in desktops and gaming rigs?
Right now, all silicon talk is bullshit. It has been for a while.
It became obvious when old e-waste Xeons were turned into viable, usable machines, years ago.
Something is obviously wrong with this entire industry, and I cannot wait for it to pop. THIS will be the excitement everyone is looking for.
A lot of those Xeon e-waste machines were downright awful, especially for the "cheap gaming PC" niche they were popular in. Low single-core clock speeds, low memory bandwidth for desktop-style configurations and super expensive motherboards that ran at a higher wattage than the consumer alternatives.
> THIS will be the excitement everyone is looking for.
Or TSMC could become geopolitically jeopardized somehow, drastically increasing the secondhand value of modern GPUs even beyond what they're priced at now. It's all a system of scarcity, things could go either way.
They were awful compared to newer models, but for the price of nothing, pretty good deal.
If no good use is found for high-end GPUs, secondhand models will be like AOL CDs.
Sure, eventually. Then in 2032, you can enjoy the raster performance that slightly-affluent people in 2025 had for years.
By your logic people should be snatching up the 900 and 1000-series cards by the truckload if the demand was so huge. But a GTX 980 is like $60 these days, and honestly not very competitive in many departments. Neither it nor the 1000-series have driver support nowadays, so most users will reach for a more recent card.
Do you have a timeframe for the pop? I need some excitement.
More a sequence of potential events than a timeframe.
High-end GPUs are already useless for gaming (a low-end GPU is enough), their traditional source of demand. They've been floating on artificial demand for a while now.
There are two markets that currently could use them: LLMs and Augmented Reality. Both of these are currently useless, and getting more useless by the day.
CPUs are just piggybacking on all of this.
So, lots of things hanging on unrealized promises. It will pop when there is no next use for super high-end GPUs.
War is a potential user of such devices, and I predict it could be the next thing after LLMs and AR. But then if war breaks out in such a scale to drive silicon prices up, lots of things are going to pop, and food and fuel will boom to such a magnitude that will make silicon look silly.
I think it will pop before it comes to the point of war driving it, and it will happen within our lifetimes (so, not a Nostradamus-style prediction that will only be realized long-after I'm dead).
Local LLMs are becoming more popular and easier to run, and Chinese corporations are releasing extremely good models of all sizes, in many cases under MIT or similar terms. The amount of VRAM is the main limiter, and more of it would help with gaming too.
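As a rough illustration of why VRAM is the limiter, a back-of-the-envelope estimate (rule-of-thumb overhead figure, not exact for any particular runtime):

    # Weights need (params * bits / 8) bytes; KV cache, activations and
    # buffers add a few GB on top. All numbers are rough assumptions.
    def vram_gb(params_billion, bits_per_weight, overhead_gb=2.0):
        weights_gb = params_billion * 1e9 * (bits_per_weight / 8) / 1e9
        return weights_gb + overhead_gb

    for params, bits in [(8, 16), (8, 4), (32, 4), (70, 4)]:
        print(f"{params}B model @ {bits}-bit: ~{vram_gb(params, bits):.0f} GB VRAM")

Even aggressively quantized mid-size models blow past the 8-16 GB found on most consumer cards, which is exactly where the product segmentation bites.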
Gaming needs no additional VRAM.
From a market perspective, LLMs sell GPUs. Doesn't even matter if they work or not.
From the geopolitical tensions perspective, they're the perfect excuse to create infrastructure for a global analogue of the Great Firewall (something that the Chinese are pioneers of, and catching up to the plan).
From the software engineering perspective, LLMs are a nuisance, a distraction. They harm everyone.
> Gaming needs no additional VRAM.
Really? What about textures? Any ML that the new wave of games might use? For instance, while current LLMs powering NPC interactions would be pretty horrible, what about in 2 years time? You could have arbitrary dialogue trees AND dynamically voiced NPCs or PCs. This is categorically impossible without more VRAM.
> the perfect excuse to create infrastructure for a global analogue of the Great Firewall
Yes, let's have more censorship and kill the dream of the Internet even deader than it already is.
> From the software engineering perspective, LLMs are a nuisance, a distraction. They harm everyone.
You should be aware that reasonable minds can differ in this issue. I won't defend companies forcing the use of LLMs (it would be like forcing use of vim or any other tech you dislike), but I disagree about being a nuisance, distraction, or a universal harm. It's all down to choices and fit for use case.
I don't see how GPU factories could be running in the event of war "in such a scale to drive silicon prices up". Unless you mean that supply will be low and people will be scavenging TI calculators for processors to make boxes playing Tetris and Space Invaders.
Why not?
This is the exact model in which WWII operated. Car and plane supply chains were practically nationalized to support the military industry.
If drones, surveillance, satellites become the main war tech, they'll all use silicon, and things will be fully nationalized.
We already have all sorts of hints of this. Doesn't need a genius to predict that it could be what happens to these industries.
The balance with food and fuel is more delicate though. A war with drones, satellites and surveillance is not like WWII, there's a commercial aspect to it. If you put it on paper, food and fuel project more power and thus, can move more money. Any public crisis can make people forget about GPUs and jeopardize the process of nationalization that is currently being implemented, which still depends on relatively peaceful international trade.
CPU and GPU compute will be needed for military use processing the vast data from all sorts of sensors. Think about data centres crunching satellite imagery for trenches, fortifications and vehicles.
> satellite imagery for trenches, fortifications and vehicles
Dude, you're describing the 80s. We're in 2025.
GPUs will be used for automated surveillance, espionage, brainwashing and market manipulation. At least that's what the current batch of technologies implies.
The only thing stopping this from becoming a full dystopia is that delicate balance with food and fuel I mentioned earlier.
It has become pretty obvious that entire wealthy nations can starve if they make the wrong move. Turns out GPUs cannot produce calories, and there's a limit to how much of a market you can manipulate to produce calories for you.
Hell, yeah. I'm in for some shared excitement too if y'all want to get some popcorn.
Uhh, these 12VHPWR connectors seem like a serious fire risk. How are they not being recalled? I just got a 5060 Ti, and now I'm wishing I went AMD instead... what the hell :(
Whoa, the stuff covered in the rest of the post is just as egregious. Wow! Maybe time to figure out which AMD models compares performance-wise and sell this thing, jeez.
The thing is, company culture is a real thing. And some cultures are invasive/contagious like kudzu both internally to the company and into adjacent companies that they get comped against. The people get to thinking a certain way, they move around between adjacent companies at far higher rates than to more distant parts of their field, the executives start sitting on one another's boards, before you know it a whole segment is enshittified, and customers feel like captives in an exploitation machine instead of parties to a mutually beneficial transaction in which trade increases the wealth of all.
And you can build mythologies around falsehoods to further reinforce it: "I have a legal obligation to maximize shareholder value." No buddy, you have some very specific restrictions on your ability to sell the company to your cousin (ha!) for a handful of glass beads. You have a legal obligation to bin your wafers the way it says on your own box, but that doesn't seem to bother you.
These days I get a machine like the excellent ASUS Proart P16 (grab one of those before they're all gone if you can) with a little 4060 or 4070 in it that can boot up PyTorch and make sure the model will run forwards and backwards at a contrived size, and then go rent a GB200 or whatever from Latitude or someone (seriously check out Latitude, they're great), or maybe one of those wildly competitive L40-series fly machines (fly whips the llama's ass like nothing since Winamp, check them out too). The GMTek EVO-X1 is a pretty capable little ROCm inference machine for under $1,000; its big brother is nipping at the heels of a DGX Spark under $2k. There is good stuff out there, but it's all from non-incumbent angles.
I don't game anymore but if I did I would be paying a lot of attention to ARC, I've heard great things.
Fuck the cloud and their ancient Xeon SKUs that cost more than Latitude charges for 5GHz EPYC. Fuck the NVIDIA gaming retail rat race; it's an electrical as well as a moral hazard in 2025.
It's a shame we all have to be tricky to get what used to be a halfway fair deal 5-10 years ago (and 20 years ago they passed a HUGE part of the scaling bonanza down to the consumer), but it's possible to compute well in 2025.
Nice advertorial. I hope you got paid for all of those plugs.
I wish! People don't care what I think enough to monetize it.
But I do spend a lot of effort finding good deals on modern ass compute. This is the shit I use to get a lot of performance on a budget.
Will people pay you to post on HN? How do I sign up?
> Fuck the cloud and their ancient Xeon SKUs
Dude, no one talks about this and it drives me up the wall. The only way to guarantee modern CPUs from any cloud provider is to explicitly provision really new instance types. If you use any higher-level abstracted services (Fargate, Cloud Run, Lambda, whatever) you get salvation-army second-hand CPUs from 15 years ago, you're billed by the second so the slower, older CPUs screw you over there, and you pay a 30%+ premium over the lower-level instances because it's a "managed service". It's insane and extremely sad that so many customers put up with it.
Bare metal is priced like it always was but is mad convenient now. latitude.sh is my favorite, but there are a bunch of providers that are maybe a little less polished.
It's also way faster to deploy and easier to operate now. And mad global, I've needed to do it all over the world (a lot of places the shit works flawlessly and you can get Ryzen SKUs for nothing).
Protip: burn a partition with Ubuntu 24.04 LTS (which is the default on everything) and use that as "premium IPMI", even if you run Ubuntu. You can always boot into a known-perfect environment with all the tools to tweak whatever. If I ever have to restart one, I just image it; it's faster than launching a VM on EC2.
All symptoms of being number one.
Customers don’t matter, the company matters.
Competition sorts out such attitudes quick smart, but AMD never misses a chance to copy Nvidia's strategy in any way it can, and Intel is well behind.
So for now, you’ll eat what Jensen feeds you.
Nobody’s going to read this, but this article and sentiment is utter anti-corporate bullshit, and the vastly congruent responses show that none of you have watched the historical development of GPGPU, or do any serious work on GPUs, or keep up with the open work of nvidia researchers.
The spoiled gamer mentality is getting old for those of us that actually work daily in GPGPU across industries, develop with RTX kit, do AI research, etc.
Yes, they've had some marketing and technical flubs, as any giant publicly traded company will have, but their balance of research-driven development alongside corporate profit necessities is unmatched.
And no I don’t work for nvidia. I’ve just been in the industry long enough to watch the immense contribution nvidia has made to every. single. field. The work of their researchers is astounding, it’s clear to anyone that’s honestly worked in this field long enough. It’s insane to hate on them.
Their contribution to various fields and the fact that they treat the average consumer like shit nowadays are not mutually exclusive.
Also, nobody ever said they hate their researchers.
Maybe the average consumer doesn't agree they are being treated like shit? The Steam top 10 GPU list is almost all NVidia. Happy customers or duped suckers? I've seen the latter sentiment a lot over the years, and discounting consumers' preferences never seems to lead to correct predictions of outcomes.
Or maybe the average consumer bought them while still being unhappy about the overall situation?
It pains me to be on the side of "gamers" but I would rather support spoiled gamers than modern LLM bros.
> Pretty much all upscalers force TAA for anti-aliasing and it makes the entire image on the screen look blurry as fuck the lower the resolution is.
I feel like this is a misunderstanding, though I admit I'm splitting hairs here. DLSS is a form of TAA, and so is FSR and most other modern upscalers. You generally don't need an extra antialiasing pipeline if you're getting an artificially supersampled image.
We've seen this technique variably developed across the lifespan of realtime raster graphics; first with checkerboard rendering, then TAA, then now DLSS/frame generation. It has upsides and downsides, and some TAA implementations were actually really good for the time.
Every kind of TAA that I've seen creates artifacts around fast-moving objects. This may sound like a niche problem only found in fast-twitch games but it's cropped up in turn-based RPGs and factory/city builders. I personally turn it off as soon as I notice it. Unfortunately, some games have removed traditional MSAA as an option, and some are even making it difficult to turn off AA when TAA and FXAA are the only options (though you can usually override these restrictions with driver settings).
The sad truth is that with rasterization every renderer needs to be designed around a specific set of antialiasing solutions. Antialiasing is like a big wall in your rendering pipeline, there's the stuff you can do before resolving and the stuff you can do afterwards. The problem with MSAA is that it is pretty much tightly coupled with all your architectural rendering decisions. To that end, TAA is simply the easiest to implement and it kills a lot of proverbial birds with one stone. And it can all be implemented as essentially a post processing effect, it has much less of the tight coupling.
MSAA only helps with geometric edges, shader aliasing can be combatted with prefiltering but even then it's difficult to get rid of it completely. MSAA also needs beefy multisample intermediate buffers, this makes it pretty much a non-starter on heavily deferred rendering pipelines, which throw away coverage information to fit their framebuffer budget. On top of that the industry moved to stochastic effects for rendering all kinds of things that were too expensive before, the latest being actual realtime path tracing. I know people moan about TAA and DLSS but to do realtime path tracing at 4k is sort of nuts really. I still consider it a bit of a miracle we can do it at all.
Personally, I wish there was more research by big players into things like texture space lighting, which makes shading aliasing mostly go away, plays nice with alpha blending and would make MSAA viable again. The issue there is with shading only the stuff you see and not wasting texels.
There's another path, which is to raise the pixel densities so high we don't need AA (as much) anymore, but I'm going to guess it's a) even more expensive and b) not going to fix all the problems anyway.
That's just called super sampling. Render at 4k+ and down sample to your target display. It's as expensive as it sounds.
No, I mean high pixel densities all the way to the display.
SSAA is an even older technique than MSAA but the results are not visually the same as just having a really high-DPI screen with no AA.
It's not that it's difficult to turn off TAA: it's that so many modern techniques do not work without temporal accumulation and anti-aliasing.
Ray tracing? Temporal accumulation and denoising. Irradiance cache? Temporal accumulation and denoising. most modern light rendering techniques cannot be done in time in a single frame. Add to that the fact that deferred or hybrid rendering makes implementing MSAA be anywhere between "miserable" and "impossible", and you have the situation we're in today.
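For anyone unfamiliar, here is a minimal sketch of the temporal accumulation idea these techniques share; real TAA/DLSS implementations and RT denoisers also reproject the history buffer with motion vectors and reject mismatched samples, which is omitted here:

    # Blend each new, noisy frame into a running history buffer instead of
    # trying to resolve everything within a single frame.
    import numpy as np

    def accumulate(history, current, alpha=0.1):
        # alpha = weight of the new frame; smaller alpha = smoother but ghostier
        return (1.0 - alpha) * history + alpha * current

    history = np.zeros((720, 1280, 3), dtype=np.float32)
    for _ in range(16):
        # stand-in for a 1-sample-per-pixel ray traced frame
        noisy_frame = np.random.rand(720, 1280, 3).astype(np.float32)
        history = accumulate(history, noisy_frame)

This is also exactly where the ghosting complaints come from: the history only converges if the scene holds still long enough, and anything that moves drags stale samples along with it.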
A lot of this is going to come down to taste so de gustibus and all that, but this feels like building on a foundation of sand. If the artifacts can be removed (or at least mitigated), then by all means let's keep going with cool new stuff as long as it doesn't detract from other aspects of a game. But if they can't be fixed, then either these techniques ought to be relegated to special uses (like cutscenes or the background, kinda like the pre-rendered backdrops of FF7) or abandoned/rethought as pretty but impractical.
So, there is a way to make it so that TAA and various temporal techniques look basically flawless. They need a _lot_ of information and pixels.
You need a 4k rendering resolution, at least. Modern effects look stunning at that res.
Unfortunately, nothing runs well at 4k with all the effects on.
The 4090 was released coming up on 3 years ago and is currently going for about 25% over launch MSRP used. GPUs are literally an appreciating asset. It is complete insanity and an infuriating situation for the average consumer.
I honestly don't know why nvidia didn't just suspend their consumer line entirely. It's clearly no longer a significant revenue source and they have thoroughly destroyed consumer goodwill over the past 5 years.
>I honestly don't know why nvidia didn't just suspend their consumer line entirely.
It's ~$12 billion a year with a high gross margin by the standards of every other hardware company. They want to make sure neither AMD nor Intel get that revenue they can invest into funding their own AI/ML efforts.
>How is it that one can supply customers with enough stock on launch consistently for decades, and the other can’t?
I guess the author is too young and didn't go through the iPhone 2G to iPhone 6 era. It's also worth remembering that it wasn't too long ago that Nvidia was sitting on nearly ONE full year of GPU stock unsold. That completely changed how Nvidia does supply chain management and forecasting, which unfortunately has had a negative impact all the way to the 50 series. I believe they have since adjusted and next gen should be better prepared. But you can only do so much when AI demand is seemingly unlimited.
>The PC, as gaming platform, has long been held in high regards for its backwards compatibility. With the RTX 50 series, NVIDIA broke that going forward. PhysX.....
Glide? What about all the audio driver APIs before that? As much as I wish everything were backward compatible, that is just not how the world works. Just like with any old game, you need some fiddling to get it to work. And they even make the code available so people can actually do something, rather than resort to emulation or reverse engineering.
>That, to me, was a warning sign that maybe, just maybe, ray tracing was introduced prematurely and half-baked.
Unfortunately that is not how it works. Do we want to go back from pre-3dfx to today and count how many ideas we thought were great for 3D accelerators, only for them to be replaced by better ideas or implementations? These ideas were good on paper but didn't work well. We then learn from them and iterate.
>Now they’re doing an even more computationally expensive version of ray tracing: path tracing. So all the generational improvements we could’ve had are nullified again......
How about: path tracing is simply a better technology? Game developers also don't have to use any of this tech. The article acts as if Nvidia forces all games to use it. Gamers want better graphics quality; artists and graphics assets are already by far the most expensive item in game development, and the cost is still increasing. Hardware improvements are what allow those to be achieved at lower cost (to game developers).
>Never mind that frame generation introduces input lag that NVIDIA needs to counter-balance with their “Reflex” technology,
No. That is not why "Reflex" tech was invented. Nvidia spends R&D on 1000 fps monitors as well, and potentially sub-1ms frame monitors. They have always been latency sensitive.
------------------------------
I have no idea how modern gamers became what they are today. And this isn't the first time I have read this kind of thing, even on HN. You don't have to buy Nvidia. You have AMD and now Intel (again). Basically I can summarise it in one line: gamers want Nvidia's best GPU for the lowest price possible, or a price they think is acceptable, without understanding the market dynamics or anything about supply chains or manufacturing. They also want higher "generational" performance, like 2x every 2 years. And if they don't get it, it is Nvidia's fault. Not TSMC, not Cadence, not Tokyo Electron, not Isaac Newton or the laws of physics. But Nvidia.
Nvidia's PR tactic isn't exactly new in the industry. Every single brand do something similar. Do I like it? No. But unfortunately that is how the game is played. And Apple is by far the worst offender.
I do sympathise with the cable issue though. And this is not the first time Nvidia has had thermal issues. But then again, they are also the ones constantly pushing the boundary forward. And AFAIK the issue isn't as bad as it was on the 40 series, but some YouTubers seem to be making a bigger issue of it than most. Supply will get better, but TSMC 3nm is fully booked. The only possible solution would be to make consumer GPUs less capable for AI workloads, or to keep AI GPUs on the leading-edge node and consumer GPUs always a node behind, to split the capacity problem. I would imagine that is part of the reason why TSMC is accelerating its 3nm capacity increase on US soil. Nvidia is now also large enough and has enough cash to take on more risk.
Sounds about right :D
it would be "just" capitalist to call these fuckers out for real, on the smallest level.
you are safe.
Why does the hero image of this website says "Made with GIMP"? I've never seen a web banner saying "Made with Photoshop" or anything similar.
I don't know why it says that, but GIMP is an open-source project so it makes sense for fans to advertise it.
Were you on the internet in the 90s? Lots of banners like that on every site.
This guy makes some good points, but he clearly has a bone to pick. Calling DLSS snake oil was where I stopped reading.
Yeah, computer graphics has always been "software trickery" all the way down. There are valid points to be made about DLSS being marketed in misleading ways, but I don't think it being "software trickery" is a problem at all.
Exactly. Running games at a lower resolution isn't new. I remember changing the size of the viewport in the original DOOM 1993 to get it to run faster. Making a lower resolution look better without having to run at a higher resolution is the exact same problem anti-aliasing has been tackling forever. DLSS is just another form of AA that is now so advanced, you can go from an even lower resolution and still look good.
So even when I'm running a game at native resolution, I still want anti-aliasing, and DLSS is a great choice then.
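Quick arithmetic on how much shading work internal-resolution scaling saves; the scale factors below are the commonly cited approximate DLSS presets, used here as assumptions rather than official numbers:

    # Pixel counts shrink with the square of the linear scale factor.
    native = (3840, 2160)
    presets = {"Quality": 0.67, "Balanced": 0.58, "Performance": 0.50}

    native_pixels = native[0] * native[1]
    for name, scale in presets.items():
        w, h = int(native[0] * scale), int(native[1] * scale)
        share = (w * h) / native_pixels
        print(f"{name}: render {w}x{h}, shade ~{share:.0%} of native pixels")

Shading roughly a quarter to half the pixels and reconstructing the rest is where the big framerate numbers come from, which is also why people argue about how honest those numbers are.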
It's one thing to rely on a technique like AA to improve visual quality with negligible drawbacks. DLSS is entirely different though, since upscaling introduces all kinds of graphical issues, and frame generation[1] even more so, while adding considerable input latency. NVIDIA will claim that this is offset by its Reflex feature, but that has its own set of issues.
So, sure, we can say that all of this is ultimately software trickery, but when the trickery is dialed up to 11 and the marketing revolves entirely on it, while the raw performance is only slightly improved over previous generations, it's a clear sign that consumers are being duped.
[1]: I'm also opposed to frame generation from a philosophical standpoint. I want my experience to be as close as possible to what the game creator intended. That is, I want every frame to be generated by the game engine; every object to look as it should within the world, and so on. I don't want my graphics card to create an experience that approximates what the creator intended.
This is akin to reading a book on an e-reader that replaces every other word with one chosen by an algorithm. I want none of that.
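As a rough illustration of the latency point above, here's a back-of-the-envelope sketch. The numbers are assumptions, not measurements of any particular game or GPU, and the model (interpolation has to wait for the next real frame before it can synthesize the in-between one) is a simplification.

    # Illustrative numbers only; not measurements.
    base_fps = 40                          # frames the engine actually renders per second
    base_frame_time_ms = 1000 / base_fps   # 25 ms between real frames

    # Interpolation-style frame generation needs the *next* real frame before it
    # can synthesize the in-between frame, so presentation is delayed by roughly
    # one real frame (the cost of generating the fake frame is ignored here).
    added_delay_ms = base_frame_time_ms

    displayed_fps = base_fps * 2           # 2x frame generation: 80 frames shown per second
    print(f"Displayed: {displayed_fps} fps, but input is still sampled {base_fps}x per second")
    print(f"Extra latency from holding back real frames: ~{added_delay_ms:.0f} ms")

In other words, the counter goes up but responsiveness is still tied to the real frame rate, and the buffering makes it slightly worse; this is the gap that Reflex is supposed to paper over.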
I don't disagree about frame gen, but upscaling and its artifacts are neither new nor unique to DLSS. Even later PS3 games upscaled from 720p to 1080p.
But we're not talking about resolution here. We're talking about interpolation of entire frames, multiple frames.
I don't think we are? The article talks about DLSS on RTX 20 series cards, which do not support DLSS frame gen:
> What always rubbed me the wrong way about how DLSS was marketed is that it wasn’t only for the less powerful GPUs in NVIDIA’s line-up. No, it was marketed for the top of the line $1,000+ RTX 20 series flagship models to achieve the graphical fidelity with all the bells and whistles.
The article doesn't make the best argument to support the claim but it's true that NVIDIA is now making claims like '4090 level performance' on the basis that if you turn on DLSS multi-frame generation you suddenly have Huge Framerates when most of the pixels are synthesized instead of real.
Personally I'm happy with DLSS on balanced or quality, but the artifacts from framegen are really distracting. So I feel like it's fair to call their modern marketing snake oil since it's so reliant on frame gen to create the illusion of real progress.
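To spell out why those headline numbers feel like an illusion, a tiny worked example with made-up figures (the base frame rate and the 4x multi-frame-generation factor are assumptions, not benchmarks of any card):

    # Hypothetical example of how multi-frame generation inflates headline numbers.
    real_fps = 30           # frames actually rendered by the engine
    mfg_factor = 4          # "4x multi-frame generation": 3 synthesized per real frame

    marketed_fps = real_fps * mfg_factor
    synthesized_share = (mfg_factor - 1) / mfg_factor

    print(f"Marketing slide: {marketed_fps} fps")                       # 120 fps
    print(f"Frames produced by the game engine: {real_fps} per second") # 30
    print(f"Synthesized share of displayed frames: {synthesized_share:.0%}")  # 75%

So a "4x" claim can mean three out of every four displayed frames never came from the game engine at all, which is why comparing such a number against an older card's native frame rate is misleading.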
Nothing new, it's just enshittification.
Call it delusions or conspiracy theories, whatever, I don't care, but it seems to me that NVIDIA wants to vendor-lock the whole industry.
If all game developers begin to rely on NVIDIA technology, the industry as a whole puts customers in a position where they are forced to give in.
The public's perception of RTX's softwarization (DLSS), and the fact that NVIDIA gets to coin the technical terms, says it all.
They have a long-term plan, and that plan is:
- make all the money possible
- destroy all competition
- vendor lock the whole world
When I see that, I can't help myself but to think something is fishy:
https://i.imgur.com/WBwg6qQ.png
A bit hyperbolic
Here's something I don't understand: Why is it that when I go look at DigitalOcean's GPU Droplet options, they don't offer any Blackwell chips? [1] I thought Blackwell was supposed to be the game-changing hyperchip that would carry AI into the next generation, but the best many providers still offer are Hopper H100s? Where are all the Blackwell chips? It's been oodles of months.
Apparently AWS has them available in the P6 instance type, but the only configuration they offer has 2TB of memory and costs... $113/hr [2]? Like, what is going on at Nvidia?
Where the heck is Project Digits? Like, I'm developing this shadow opinion that Nvidia actually hasn't built anything new in three years, but they fill the void by talking about hypothetical new tech that no one can actually buy, plus things their customers have built with the actually good stuff they built three years ago. Like, consumers can never buy Blackwell because "oh, Enterprises have bought them all up," then when Microsoft tries to buy any they say "Amazon bought them all up," and vice versa. Something really fishy is going on over there. Time to short.
[1] https://www.digitalocean.com/products/gpu-droplets
[2] https://aws.amazon.com/ec2/pricing/on-demand/
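For a sense of scale, a quick back-of-the-envelope on the price quoted above. The ~$113/hr figure is taken from the comment, not independently verified; actual AWS on-demand pricing varies by region and over time.

    # Rough cost of the P6 configuration mentioned above, assuming the quoted
    # ~$113/hr on-demand rate (an assumption; real pricing varies).
    hourly = 113.0
    monthly = hourly * 24 * 30
    yearly = hourly * 24 * 365

    print(f"~${monthly:,.0f} per month")   # ~$81,360 per month
    print(f"~${yearly:,.0f} per year")     # ~$989,880 per year

At roughly a million dollars a year for a single on-demand instance, it's not hard to see why smaller cloud providers aren't racing to stock these parts.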
Finally someone
I’m sorry but this framing is insane
> So 7 years into ray traced real-time computer graphics and we’re still nowhere near 4K gaming at 60 FPS, even at $1,999.
The guy is complaining that a product can't live up to his standard, while dismissing a barely noticeable trade-off that could make it possible, because it's «fake».
I'm so happy to see someone calling NVIDIA out for their bullshit. The current state of GPU programming sucks, and that's just an example of the problems with the GPU market today.
The lack of open source anything for GPU programming makes me want to throw my hands up and just do Apple. It feels much more open than pretending that there's anything open about CUDA on Linux.
Another perspective: Nvidia customer support on their Mellanox purchase ... is total crap. It's the worst of corporate America ... paper-pushing bureaucratic guys who slow-roll stuff ... getting to a smart person behind the customer reps requires one to be an ape in a bad mood 5x ... I think they're so used to that now that unless you go crazy mode their take is ... well, I guess he wasn't serious about his ask and he dropped it.
Here's another Nvidia/Mellanox BS problem: many Mellanox NIC cards are finalized or post-assembled by, say, HP. So if you have an HP "Mellanox" NIC, Nvidia washes their hands of anything detailed: it's not ours, HP could have done anything to it, what do we know? So one phones HP ... and they have no clue either, because it's really not their IP or their drivers.
It's a total cluster-bleep, and more and more it's why corporate America sucks.
Corporate America actually resembles the state of government a lot too. Deceptive marketing, inflated prices that leave the average Joe behind, and low quality products on top of all that.
In the 1980s maybe a course correction was needed to help capitalism. But it's overcorrected by 30%. I'm not knocking corporate America or capitalism in absolute terms. I am saying customers have lost power ... whether it's phone trees, right to repair, a lack of accountability (2008 housing crisis), the ability to play endless accounting games to pay lower taxes, plus all the more mundane things ... it's gotten out of whack.
I'm guessing you have an HP "Mellanox"? Because Connect-X support is great.
>I'm guessing you have an HP "Mellanox"? Because Connect-X support is great.
I'll have to take your word on that.
And if I take your word for it: ergo, non-Connect-X support sucks.
So that's a "sucks" on the table yet again ... for what, the 3rd time? Nvidia sucks.