Forget About Platform Exclusives; Here Come The PC GPU Exclusives!
from the please-just-stop dept
Of all the things in the gaming industry that annoy me, exclusivity deals have to rank near the very top. The idea that any title, but in particular third-party titles, could be exclusive to certain platforms, such as Xbox or PlayStation, is anathema to how art and culture distribution is meant to work. I understand why they’re a thing; I just think they shouldn’t be. And exclusivity deals tend to taint many other aspects of the industry. You need only look at all of the convoluted fights Microsoft engaged in with regulators after gobbling up a bunch of large game studios to see the vascular reach exclusivity has in the industry.
The PC gaming community has generally had to put up with less of this sort of thing. Sure, some titles are console exclusives and that sucks, but PC gamers haven’t had to pay much attention to the base hardware and software they use to play games. And, yes, certainly there is some of this, particularly for those who want to play games on MacOS or Linux systems, but it’s generally been at a much smaller scale. One Reddit thread from several years ago even picked up on this and wondered aloud whether hardware exclusives in PC gaming would ever become a thing.
The latest squabble on r/gaming between console owners over exclusive games has got me thinking. What prevents something like this from happening with GPUs? After the GPP thing, I think it is pretty clear Nvidia is willing to do almost anything to control the market. I despise the idea of selling hardware with exclusives: I think hardware should stand on its own merits. The whole idea of PC gaming is to have choice, to have control over your machine. GPU exclusives would ruin this idea, in some ways. Could Nvidia pay for a popular game to run only on their hardware?
Well, it didn’t exactly happen that way with the recently released space epic Starfield, but it did happen with a specific graphical feature within the game. See, Bethesda, whose parent company ZeniMax is itself now owned by Microsoft, inked a deal with AMD. The result is that one of the more popular graphics features found on Nvidia graphics cards, DLSS, is not supported in the game, but AMD’s version of it is.
As IGN noticed, the open-world RPG’s settings menu currently only supports the latest iteration of AMD’s FidelityFX Super Resolution feature, FSR2, meaning players with Intel or Nvidia graphics cards that use different machine learning upscaling algorithms are out of luck. AMD gaming chief Frank Azor wouldn’t confirm if that was a requirement for its partnership with Bethesda, but recently told The Verge the studio could support DLSS if it wanted. “If they want to do DLSS, they have AMD’s full support,” he said.
Frankly, I don’t believe that, and I don’t think you should, either. If all of the graphical features in AMD’s rivals’ chipsets were free for Bethesda to use, then what would be the point of the deal AMD signed with Bethesda? And why in the world would Bethesda want to deny Nvidia chip owners the graphical abilities of machine-learning upscaling? If you’re not a PC gamer, this might all sound like gibberish, but DLSS is no small deal.
For now, if you’re an Nvidia owner, this has sort of been fixed for you thanks to the modding community.
The good news is that a “Starfield Upscaler” which allows players to replace FSR2 with DLSS or XESS was one of the first mods uploaded to the NexusMods website after the game went live. It’s not bug free and some PC players are still reporting issues getting their preferred upscaling tech to work, but it’s a start and will no doubt continue to get refined in the days ahead.
Bethesda’s exclusive partnership with AMD caused a big controversy when it was announced earlier this summer precisely because of the chip company’s pattern of locking out competitors’ features. The whole point of PC gaming is that it’s supposed to give players freedom to pick and choose their preferred builds, unlike on consoles where fans are locked into the manufacturer’s ecosystem.
Exactly. And the fact that this splintering of the PC gaming ecosystem, ostensibly the result of exclusivity deals with hardware component manufacturers, is beginning to rear its ugly head is not a good thing. I’m generally loath to make slippery slope arguments, but this sure does feel like the first shot fired in what might be a longer, and very dumb, war among chipset manufacturers.
Filed Under: exclusives, gpu, pc gaming, video games
Companies: amd, bethesda, microsoft, nvidia, zenimax


Comments on “Forget About Platform Exclusives; Here Come The PC GPU Exclusives!”
Except they’re not GPU exclusives. AMD’s solutions under their FidelityFX/GPUOpen brand are 99% of the time open source* and vendor-neutral; they run on any hardware that supports semi-recent shader models.
Nvidia’s solutions, meanwhile, are proprietary and hardware-locked to only their newest cards. It’s sad to see Techdirt and a lot of news outlets take such an anti-FOSS, anti-competitive stance.
*IIRC there’s one component that’s still fully proprietary, though I don’t remember which, and right now their shader node software is under what I think is a source-available license.
Re:
So AMD’s FSR2 is available for Intel and Nvidia as downloadable plug-and-play drivers? That’s GREAT news!
But, um, if so, why would Kotaku say that Intel and Nvidia cards that use different upscaling models were SoL? And why would the modding community see compatibility with DLSS or XESS as something they would need a mod for, when a simple Free and Open Source driver is right at hand?
What? Perhaps there aren’t translations of FSR2 available for those cards, or perhaps not for free? Or perhaps they require knowing the secret handshake, and the password to get into the Computer Lounge Speakeasy? Why sir! How very anti-FOSS of these customers, not using hardware that has FOSS drivers conveniently available that work with Starfield!
Re: Re:
Look, not everyone is a graphics developer, I get that. I’m not trying to be rude here, but you have a fundamental misunderstanding of how all this works.
>So AMD’s FSR2 is available for Intel and Nvidia as downloadable plug-and-play drivers? That’s GREAT news!
FSR and the rest of the FidelityFX suite are exclusively available in source form, as code libraries and shaders, on one of AMD’s many GitHub repos. Once built, the entire suite, from FSR to CAS to those new blur effects, is plug and play. Or as plug and play as DLSS and XeSS are. It’s why there are mods for DLSS-exclusive games integrating FSR. The reverse doesn’t need to happen, however, because…
>What? Perhaps there aren’t translations of FSR2 available for those cards, or perhaps not for free?
This is not how it works. Unlike XeSS and DLSS (presumably both written in some proprietary vendor-specific solution or in GPU ASM directly; there’s no way to tell, though DLSS at least is probably written in CUDA/Nvidia PTX), FSR is written in shading languages, of which there are multiple options: GLSL, a vendor-neutral standard published by the Khronos Group (members include AMD, Valve, Apple, Nvidia, etc.), and HLSL (maintained by Microsoft). These shading languages are compiled to vendor-specific GPU ASM by the drivers directly, or go through a vendor-neutral intermediate like SPIR-V or DXIL (Khronos and Microsoft respectively).
It implicitly and automatically works on all non-AMD hardware, including things like Intel integrated graphics and mobile graphics processors.
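To make that concrete, here’s a minimal sketch of the GLSL-to-SPIR-V step, assuming Google’s shaderc library is installed. The little kernel below is an invented stand-in, not actual FSR code:

```cpp
// compile_to_spirv.cpp -- sketch: GLSL -> vendor-neutral SPIR-V via shaderc.
// Build (assuming the shaderc SDK): g++ compile_to_spirv.cpp -lshaderc_shared
#include <shaderc/shaderc.hpp>
#include <cstdio>
#include <string>
#include <vector>

int main() {
    // Invented stand-in kernel; a real FSR pass is far more involved.
    const std::string src = R"(
        #version 450
        layout(local_size_x = 8, local_size_y = 8) in;
        layout(binding = 0, rgba8) uniform readonly  image2D inputImg;
        layout(binding = 1, rgba8) uniform writeonly image2D outputImg;
        void main() {
            ivec2 p = ivec2(gl_GlobalInvocationID.xy);
            imageStore(outputImg, p, imageLoad(inputImg, p));
        })";

    shaderc::Compiler compiler;
    shaderc::CompileOptions options;
    auto result = compiler.CompileGlslToSpv(
        src, shaderc_glsl_compute_shader, "upscale_pass.comp", options);

    if (result.GetCompilationStatus() != shaderc_compilation_status_success) {
        std::fprintf(stderr, "%s\n", result.GetErrorMessage().c_str());
        return 1;
    }

    // The same SPIR-V words feed AMD, Nvidia, and Intel drivers alike;
    // each driver lowers them to its own GPU ISA.
    std::vector<uint32_t> spirv(result.cbegin(), result.cend());
    std::printf("compiled to %zu words of SPIR-V\n", spirv.size());
    return 0;
}
```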
Vendor extensions, i.e. hardware-specific features, have to be explicitly requested and specified. If Intel and Nvidia wanted to do what AMD is doing, they could have published an extension that exposes DLSS and XeSS hardware. I think there might be some effort within Khronos for this to happen, but it hasn’t yet.
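To give a sense of what “explicitly requested” means in practice, here’s a rough sketch against the Vulkan API that walks each GPU’s device extension list. Note that the extension name it searches for is purely hypothetical; nothing like it has actually shipped:

```cpp
// list_device_extensions.cpp -- sketch: vendor features only show up as
// device extensions an app must query and opt into explicitly.
// Build (assuming the Vulkan SDK): g++ list_device_extensions.cpp -lvulkan
#include <vulkan/vulkan.h>
#include <cstdio>
#include <cstring>
#include <vector>

int main() {
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_0;
    VkInstanceCreateInfo ci{};
    ci.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    ci.pApplicationInfo = &app;
    VkInstance instance;
    if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    std::vector<VkPhysicalDevice> gpus(gpuCount);
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

    // Hypothetical name: what a DLSS-style extension *would* look like
    // if Nvidia published one; no such extension exists today.
    const char* wanted = "VK_NV_deep_learning_upscale";

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);
        uint32_t extCount = 0;
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, nullptr);
        std::vector<VkExtensionProperties> exts(extCount);
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount,
                                             exts.data());
        bool found = false;
        for (const auto& e : exts)
            if (std::strcmp(e.extensionName, wanted) == 0) found = true;
        std::printf("%s: %s %s\n", props.deviceName, wanted,
                    found ? "available" : "not exposed");
    }
    vkDestroyInstance(instance, nullptr);
    return 0;
}
```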
Though from what I’ve read of the source, FSR doesn’t actually even do that (which makes sense; AMD doesn’t publish many hardware extensions, so there’d be nothing to gain from it). It doesn’t use anything hardware specific. It’s only “optimized for AMD” in the sense that its default memory access patterns and local data sizes are tuned for AMD’s current cards. That is extremely trivial to change for other hardware.
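As a toy illustration of that kind of tuning: the PCI vendor IDs below are real, but the tile sizes are made-up numbers for illustration, not anything taken from the FSR source:

```cpp
// tuned_dispatch.cpp -- sketch: per-vendor "optimization" can be as simple
// as picking different workgroup/tile defaults, not different shader code.
#include <cstdint>
#include <cstdio>

// Well-known PCI vendor IDs.
constexpr uint32_t kVendorAMD    = 0x1002;
constexpr uint32_t kVendorNvidia = 0x10DE;
constexpr uint32_t kVendorIntel  = 0x8086;

struct TileConfig {
    uint32_t width;
    uint32_t height;
};

// Made-up illustrative numbers; real sweet spots come from profiling.
TileConfig pickTileConfig(uint32_t pciVendorId) {
    switch (pciVendorId) {
        case kVendorAMD:    return {8, 8};
        case kVendorNvidia: return {16, 8};
        case kVendorIntel:  return {16, 16};
        default:            return {8, 8};  // safe generic fallback
    }
}

int main() {
    for (uint32_t id : {kVendorAMD, kVendorNvidia, kVendorIntel}) {
        const TileConfig t = pickTileConfig(id);
        std::printf("vendor 0x%04X -> %ux%u tile\n", id, t.width, t.height);
    }
    return 0;
}
```

Swapping defaults per vendor like this changes scheduling behavior, not the shader logic itself.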
>But, um, if so, why would Kotaku say that Intel and Nvidia cards that use different upscaling models were SoL?
Because AMD’s solution is so generalized that it completely ignores most hardware-specific features (not that there’s any way for them to actually integrate those into FSR; there would be if the vendors published extensions), which is a GOOD thing.
Modern AMD is well known for pushing open source, vendor-neutral solutions. (So is Intel, actually, which is what makes XeSS a bit sad, especially when basically every other part of their stack embraces open source.)
It’s Nvidia which has always been hostile and standoffish to any and all forms of vendor neutrality or open standards.
Re: Re:
Because Kotaku isn’t a good outlet for technical discussions. Their main focus is gaming news, not the technical bits of games. How good they are at the former is YMMV (I don’t think Kotaku is particularly good at that either), but gaming outlets in general tend to have very poor technical understanding of the games they’re covering.
Usually I’d trust technical coverage from, well… tech outlets like Ars Technica more for this sort of thing.
Just look at how often gamers blame “the engine” for all their woes, and remember that Kotaku more or less republishes that type of opinion verbatim. Or the number of people who think that console hardware a few years old needs immediate replacement because it’s “outdated,” without any actual concern for whether console developers are having issues with it.
All are basically “blame the wizard” opinions, and they’re treated as gospel in gaming outlets.
Re:
Except those shots were fired with all the games that exclusively used the proprietary, hardware-locked DLSS upscaler that folks had to mod the open source FSR into over the last couple of years.
Re:
I think you’re missing the point. The concern isn’t whether or not any upscaling is available on NVidia cards, the concern is that a specific technology has been blocked due to business reasons as opposed to technical reasons.
If it is true that disabling DLSS support was a condition of the partnership between AMD and Bethesda, then that’s an example of product functionality being actively hindered for a substantial number of users.
Regardless of your position on the players in the GPU market, that’s a negative development. It’s important to call out bad behavior even if it’s perpetrated by a company you like.
Re: Re:
Based on other posts and what I’ve heard elsewhere it appears that that’s not the case and this is entirely manufactured drama; however, even if it wasn’t, proprietary Nvidia solutions need to go, and yesterday. Their behavior has been completely unacceptable for a very long time.
Discounting the ability of Bethsoft’s software engineers…
You’re still gonna need help from nVidia to implement DLSS and other nVidia-related tech.
And there have been rumors that nVidia might not have that help available as the company pivots to “AI”…
It's nVidia's fault...
The problem here is not what it appears to be, and it is being stoked by nVidia propaganda to blame Bethesda etc. instead of nVidia itself (as you’d expect? 😛).
There is a really good reason why Starfield doesn’t support nVidia graphics cards anywhere near as well as AMD’s, and it’s because nVidia has completely stripped its gaming support division of resources for its AI binge.
This demonstrates just how much difference nVidia’s USUAL support makes for their own fortunes and well-being in games, by showing what it’s like without it…
So you are saying that initially supporting only an upscaling technology that works across AMD, Nvidia and Intel GPUs made over the last 8 years or so is bad, versus initially supporting only DLSS, which is restricted to Nvidia’s 2000-, 3000- and 4000-series cards?
Re:
Techdirt bizarrely taking the side of locked-down closed ecosystems that are limited to proprietary hardware.
I feel Techdirt has missed the mark with this one, although that is likely because the Kotaku article also did, failing to mention very important details.
Of the three options, FSR is the only one that is open source and, importantly, works on all GPUs. And “all GPUs” includes older Nvidia cards that DLSS itself does not run on.
I think choosing FSR was Bethesda’s call, and a fairly obvious decision to make if you are trying to limit scope and ship a game as massive as Starfield. It may seem easy because a mod did it in only a few days, but an official DLSS integration is held to a much higher standard. It requires months of testing, working with proprietary software, and back and forth with Nvidia engineers. It requires training the actual machine learning model on the game, which you cannot do until the game is in a mostly complete state.
If I was a game developer I would likely make the same decision.
Re:
Correction: IGN not Kotaku. Same point applies.
FSR and DLSS are easily interchangeable. It’s a simple drop-in replacement that a modder who knows what they are doing can do in an afternoon, and has been for years now. That was super beneficial when FSR2 first dropped and games only had native DLSS support, like CP77, where PotatoOfDoom’s FSR2 mod was the most downloaded mod for the game; it was so popular that CDPR made their own version, which was inferior for several iterations until around patch 1.6. A simple dll switcheroo and bingo: the DLSS slider in the game’s menus chooses which FSR mode you want, while still saying DLSS.
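Conceptually, the switcheroo works because both upscalers sit behind the same narrow call boundary, so a mod can slot a different backend in behind the game’s back. Here’s a toy sketch of that idea; all of the names are invented, and the real mods reimplement Nvidia’s NGX entry points, which is considerably more involved:

```cpp
// upscaler_swap.cpp -- toy sketch of why the "dll switcheroo" works: the
// game codes against one narrow interface, so any backend honoring it can
// be slotted in behind the game's back. All names here are invented.
#include <cstdio>
#include <memory>

// The narrow boundary the game codes against.
struct Upscaler {
    virtual ~Upscaler() = default;
    virtual const char* name() const = 0;
    virtual void evaluate(int inW, int inH, int outW, int outH) = 0;
};

struct DlssBackend : Upscaler {
    const char* name() const override { return "DLSS"; }
    void evaluate(int inW, int inH, int outW, int outH) override {
        std::printf("DLSS: %dx%d -> %dx%d\n", inW, inH, outW, outH);
    }
};

// What a mod effectively injects: same interface, different engine.
struct Fsr2Backend : Upscaler {
    const char* name() const override { return "FSR2 (behind the DLSS slider)"; }
    void evaluate(int inW, int inH, int outW, int outH) override {
        std::printf("FSR2: %dx%d -> %dx%d\n", inW, inH, outW, outH);
    }
};

int main() {
    // The game still *thinks* it loaded DLSS; the mod swapped the backend.
    std::unique_ptr<Upscaler> upscaler = std::make_unique<Fsr2Backend>();
    std::printf("menu says DLSS, backend is: %s\n", upscaler->name());
    upscaler->evaluate(1280, 720, 2560, 1440);
    return 0;
}
```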
I eagerly await Tim's next article....
…about how we’re all big meanies to Microsoft for not supporting UWP, and this is why we can’t have nice things.
Ah an MBA has had another thought about how they can make a few more bucks….
I look forward to them charging to enable different graphic options.
You must have a GTX 9999 or $50 to enable play on your pitiful GTX 8888
AMD's already lubed up the slope.
I remember AMD signing a deal with DICE to have Battlefield 4 optimized for AMD hardware.
Except they’re not really, because FSR works on a wide range of GPUs, including Intel and nVidia cards, due to being open source.
So back in the 90s PC gaming and console gaming were pretty separate. PCs couldn’t really handle the sprite-based games while consoles were pretty underpowered compared to what most PCs were running. Plus PCs didn’t really have much in the way of gamepad support, so even if you did emulate consoles and play those games, it was kind of a pain to do it on a keyboard or with a joystick.
That all changed with the PlayStation. Graphically, PSX games looked and ran a lot better than their PC counterparts, mostly because TVs provided free antialiasing, and again, gamepad support made a big difference. Ironically, many of these games (Tomb Raider, for example) actually ran at a higher resolution on PC but looked worse, because the blurriness of a TV made the game look a bit more realistic without letting you see where every texture connected the way you would on a PC.
But this was relatively short-lived. Once hardware acceleration became the norm, PC gaming was able to pull off some amazing graphics that the PSX (or N64) just couldn’t match. I still remember playing Half-Life for the first time, watching that opening tram ride, when I accidentally hit the mouse and realized I wasn’t watching a movie, but rather the game itself.
But Sony came blasting back with the PS2. Even though it didn’t have graphics as good as a PC, it did have a DVD player built in and a lot of really good exclusives like FFX. And PC gaming was starting to die. You could tell because the PC section of the game stores kept getting smaller and smaller, while the console ones grew and grew.
Then Steam came along. I’m sure it wasn’t just Steam, but they provided a way for PC gamers to download games (legally) and provided enough of a market to keep PC games alive while the console companies blew tons of cash on consoles that didn’t make them any money. Even Nintendo lost money on that generation.
And honestly, everything these days has converged. I’m not going to hook my PC up to a big TV, because I like playing certain games at a desk. But others I like to play sitting on my couch. And others I like being able to play on a handheld before I go to sleep. I just got a Steam Deck to play my PS2 games, since my PS2 has been having some issues lately (not bad for a 22-year-old console).
So why the hell would any company want to make a game-playing experience worse? PC gaming may be effectively “on top” right now, just based on the fact that most AAA games come out for PC as well as consoles. Even JRPGs, which historically got shitty ports years later, are now released on PC at the same time as their console counterparts. But this has changed several times before, and it can change again. Console makers aren’t going to sit still while people migrate to PC gaming. They’re going to innovate, and for companies like Microsoft/Bethesda to throw grit in their own gears is just idiotic.
Tim, this is completely ass-backwards.
AMD’s decision to support open standards that are supported by all manufacturers isn’t anti-competitive, Nvidia’s decision to make its own proprietary standard that’s only supported by its cards is.
What’s next, an article about how Firefox is being anticompetitive by not supporting Chrome’s proposed web DRM?
Re:
Adding: Is it anticompetitive when games support Vulkan but not DirectX or Metal?
Um…
PhysX? DX11? DirectX?
Definitely not something new on GPUs. Definitely something that has happened with CPU features as well.
RTX
This isn’t new.
nVidia Ansel and RTX, for example.
Games have been incorporating Ansel and RTX features, which only work on nVidia hardware, for years.
If you’re an AMD GPU user, you’re almost a second-class citizen in games. I’m one of said users; it’s annoying.
Re:
I’m a Linux gamer and AMD’s support of open source has been a game-changer. Not just on the desktop; it’s the reason something like the Steam Deck can exist.
Re: Re:
This is the primary reason I switched to AMD.
I’d argue AMD’s hardware is inferior to nVidia’s, but their commitment to open source and to pushing open standards is why I’m happy to support them, even if it comes at a cost in other areas.
Re: Re: Re:
Yeah. When Nvidia was the better choice for Linux, I used Nvidia; now that it isn’t anymore, I use AMD. It really is as simple as that.
Re: Re: Re:
I hope DLSS vs FSR eventually goes the way of G-Sync vs FreeSync (which is based on open VRR standards). Sure, you can pay extra for a G-Sync certified monitor, but the gains are minimal compared to the generic VRR that’s available on most TVs and monitors now; even Nvidia gave in and supports regular VRR.
Thank you commenters
I’ll admit the comments here are, mostly, quite helpful, and I won’t pretend to be a technical expert on upscaling technology. What I will commit to is digging into this a bit further, and if there turns out to be reason for a follow-up post on this topic, I will write one.
I typically have a decent amount of trust in outlets like IGN, but hey, I’m capable of missing the nuance on stuff, and I’m sure they are too.
Thank you again to the commenters for the feedback.
Re:
I think you’ve got the right idea here — that there are GPU manufacturers pushing exclusive support for graphical features that only work on their cards — but you’re blaming the wrong company for it in this case. AMD is pushing a standard that’s inclusive and supported by its competitors; Nvidia isn’t.
Bad Take
I don’t see this often here, but definitely a bad take on TD’s part.
It’s pretty well known that GPUs with non-AMD chipsets do in fact support FSR.
You can’t really blame AMD for pushing their open standard as opposed to the arguably superior (for now) Nvidia DLSS.
Remember when you had to pay hundreds of dollars extra for a monitor that supported Nvidia’s proprietary G-Sync technology to be a “real” PC gamer? AMD introduced the open FreeSync, which was also arguably inferior when it was released. Eventually, Nvidia begrudgingly added support to their drivers, even if you had to jump through hoops to enable it. FreeSync continued to improve and basically killed off proprietary G-Sync.
This is a case where I believe the open standard could and should come out on top again.
The Real Issue is
re:
Not sure how DLSS is a fix when FSR2 is intended for, and works on, both AMD and nVidia cards. To what level is for time and software versioning to decide.
I think the real issue is that there isn’t a common standard.
As consumers, it’s on us to decide between G-Sync and FreeSync when we really just want a monitor that works with our choice of video card.
The same applies to the graphics card. We don’t want to deal with which approach works best for the monitor or the game; we simply want whatever we buy to work – period!
To AMD and nVidia:
Get out of competition mode and enter cooperative mode on this topic; it will go a long way toward helping consumers in this era of timid hardware purchasing.
Re:
Well, one of these standards uses proprietary Nvidia hardware and cannot function without it. The other does not, and is thus hardware-agnostic.
So if we’re going for a common standard, there’s only really one option to pick.
AMD uses an open source platform for their code.
The issue is totally the opposite of what is reported here.
The problem is that Intel and Nvidia utilise proprietary code with major NDAs and rights documents attached.
That a company chooses not to jump into bed over proprietary code… so be it.
You really fucked the dog on this one, Tim.