Is Nvidia Playing Fair With Their New Development Tools?

from the dirty-tricks dept

There are some heavy details in all of this, many of them at least somewhat technical, so let’s dispense with the typical introductions and get right to the meat of this GPU industry sandwich. It’s no secret to anyone paying attention to the video game industry that the graphics processor war has long been waged primarily between rivals Nvidia and AMD. What you may not realize is just how involved those two companies are with the developers that use their cards and tools. It makes sense, of course, that the two primary players in PC GPUs would want to get involved with game developers to make sure their code is optimized for the systems on which their games will be played. That way, gamers end up with games that run well on the cards in their systems, buy more games, buy more GPUs, and everyone is happy. According to AMD, however, Nvidia is attempting to lock out AMD’s ability to get involved with developers who use the Nvidia GameWorks toolset, and the results can already be seen in the hottest game of the season thus far.

Some as-brief-as-possible background to get things started. First, the GameWorks platform appears to be immensely helpful to developers creating graphically impressive games.

Developers license these proprietary Nvidia technologies like TXAA and ShadowWorks to deliver a wide range of realistic graphical enhancements to things like smoke, lighting, and textures. Nvidia engineers typically work closely with the developers on the best execution of their final code. Recent examples of Nvidia GameWorks titles include Batman: Arkham Origins, Assassin’s Creed IV: Black Flag, and this week’s highly anticipated Watch Dogs.

Now, while this is and should be a licensing-revenue win for Nvidia, aspects of the GameWorks licensing agreement may actually extend that win into a realm that threatens the larger gaming ecosystem. As mentioned previously, both Nvidia and AMD have traditionally worked extremely closely with developers, even going so far as assisting them in optimizing the game code itself to offer the best experience on their respective cards. How? Well, I’ll let AMD’s PR lead, Robert Hallock, chime in.

“Gameworks represents a clear and present threat to gamers by deliberately crippling performance on AMD products (40% of the market) to widen the margin in favor of NVIDIA products,” Hallock told me in an email conversation over the weekend. But wait, it stands to reason that AMD would be miffed over a competitor having the edge when it comes to graphical fidelity and features, right? Hallock explains that the core problem is deeper: “Participation in the Gameworks program often precludes the developer from accepting AMD suggestions that would improve performance directly in the game code—the most desirable form of optimization. The code obfuscation makes it difficult to perform our own after-the-fact driver optimizations, as the characteristics of the game are hidden behind many layers of circuitous and non-obvious routines,” Hallock continues. “This change coincides with NVIDIA’s decision to remove all public Direct3D code samples from their site in favor of a ‘contact us for licensing’ page. AMD does not engage in, support, or condone such activities.”

In other words, the dual symbiotic relationships that have always existed between developers and both Nvidia and AMD become one-sided, with AMD locked out of the process in some very important ways. It means that essential information repositories and lines of communication for development and game code optimization become, in effect, proprietary to Nvidia. And, lest you think one shouldn’t simply take the word of a rival PR flack on this kind of thing, other tech journalists not only agree, but predicted this exact outcome nearly a year ago, when the GameWorks program was first rolled out.

“AMD is no longer in control of its own performance. While GameWorks doesn’t technically lock vendors into Nvidia solutions, a developer that wanted to support both companies equally would have to work with AMD and Nvidia from the beginning of the development cycle to create a vendor-specific code path. It’s impossible for AMD to provide a quick after-launch fix. This kind of maneuver ultimately hurts developers in the guise of helping them.”

Forbes’ Jason Evangelho then digs into the title du jour, Watch Dogs, an Ubisoft production developed within the GameWorks platform. When a tech journalist is this surprised by how stark the difference in performance is between two rival GPU manufacturers, it’s worth taking him seriously.

I’ve been testing it over the weekend on a variety of newer AMD and Nvidia graphics cards, and the results have been simultaneously fascinating and frustrating. It’s evident that Watch Dogs is optimized for Nvidia hardware, but it’s staggering just how un-optimized it is on AMD hardware. I guarantee that when the game gets released, a swarm of upset gamers are going to point fingers at AMD for the sub-par performance. Their anger would be misplaced.

The graphic above may not appear all that staggering at first, until you understand the cards involved and what it actually represents. The two cards in question aren’t remotely in the same category of power and cost when compared to one another. That AMD card that is barely keeping up with the Nvidia card is a $500 workhorse, while the Nvidia card is a mid-range $300 staple of their linecard. Both cards were updated with the latest drivers for Watch Dogs prior to testing. The problem, as suggested above, is that the level of optimization done for the Nvidia cards far outpaces what’s been done on AMD’s end and it is thanks to the way the GameWorks platform is licensed and controlled. Games outside of that platform, with the exact same cards being tested, tell a far different story.

To further put this in perspective, AMD’s 290x graphics card performs 51% better than Nvidia’s 770 on one of the most demanding PC titles around, Metro: Last Light — which also happens to be an Nvidia optimized title. As you would expect given their respective prices, AMD’s flagship 290x can and should blow past Nvidia’s 770 and compete with Nvidia’s 780Ti on most titles. To really drive the point home, my Radeon 290x can hit 60fps on Metro: Last Light with High quality settings and 4x anti-aliasing, at a higher resolution of 1440p.

There’s some history here, with Nvidia having a reputation for being more proprietary than AMD, which has always been seen as more of an open-source, open-dialogue, open-competition company. Indeed, Nvidia even has some history of trying to hide collusion with competitors behind trade secret law. But if it’s allowed to simply lock up the open dialogue that everyone agrees makes for the best gaming ecosystem all around, the results could be quite poor for the PC gaming community as a whole. Particularly if upset AMD GPU owners who aren’t aware of the background end up pointing the finger at their co-victims of Nvidia rather than at the villain itself.

Companies: amd, nvidia


Comments on “Is Nvidia Playing Fair With Their New Development Tools?”

36 Comments
Anonymous Coward says:

So, it’s not patent protection, it’s keeping information that Nvidia is developing secret from a competitor? It seems that many of the posters here have been arguing that keeping secrets is a better alternative than patents. While the secrets in this case may be hurting AMD, it still seems like Nvidia is playing by generally accepted rules. Sure, you may be critical of Nvidia for hurting AMD’s customers, but they ARE competitors, and it would seem like Nvidia would do anything to maintain an edge, and they are not using patents or copyright to do that. So, what’s the problem? Keeping secrets is now bad too?

Anonymous Coward says:

Re: Re: Re: Re:

When has AMD done such a thing?

Generally things work like this: Nvidia introduces a proprietary solution, and AMD pushes for the same thing at lower cost and as an actual standard. Take the G-Sync stuff, where you need specific Nvidia hardware in the display, while AMD got the very same functionality introduced into the VESA DisplayPort standard. Or the proprietary CUDA stuff, where AMD has been helping to establish OpenCL.

Nvidia is the one doing the shady business practices. AMD actually does things to help the overall computing world instead of locking things up behind proprietary crap.

Anonymous Coward says:

Re: Re:

It’s an anti-competitive practice that interferes with the business relationship between developers and AMD. Real competition is making a better product that customers want to purchase over your competitor’s product. It’s not preventing developers from working, or being able to work, with your competitor.

AC says:

Re: Re: Re:

About time someone got it right. Did this commenter read the article? Did anyone spouting “Nvidia did nothing wrong” read it? The whole of it is that if you use the GameWorks platform from Nvidia, you are NOT allowed to get help making the game run better on AMD cards as well. So this GameWorks scheme is the problem, not AMD and Nvidia keeping secrets.

Anonymous Coward says:

Re: Re: Re: Re:

Frankly, I haven’t seen a single ounce of proof that people are actively prevented from working with AMD if they work with GameWorks. They are prevented from giving others the source for the GameWorks middleware – but not their own builds.

On the other hand, AMD often prevents developers from giving builds to Nvidia until a few days before release. Tomb Raider, anyone?

Aerilus says:

Nvidia is just pissed that AMD is riding their bumper when they used to have ATI lapped. AFAIK, AMD chips are in the new Xbox and PlayStation. I don’t know why you would want to optimize for Nvidia if you are releasing on console. AMD has better Linux support, better support for OpenGL, and a better price-to-performance ratio. What’s not to love?

Anonymous Coward says:

AMD’s no better than Nvidia in this space and them trying to play the sole victim here is laughable. Look at any single AMD-optimized game and you get the same result – AMD performance will far outpace Nvidia performance when trying AMD-proprietary settings. Tomb Raider, anyone? Mantle?

And the idea that AMD is seen as less proprietary or any more open is equally laughable. AMD would ship stripped .o files to partners while Nvidia sent out source. Instead of working on an open low-level standard, AMD asked developers to write yet another code path for an already over-stratified set of platforms.

Optimization is an overall problem with PC games because there are far too many sets of hardware to be optimized on all of them all of the time. So yes, when Nvidia invests a large amount of resources into making sure their code path is optimized, that time isn’t just taken from Nvidia; it’s taken from the developers. Yes, it’s sad that the market is in such a state that in order to have a phenomenal-looking game (if you can even consider Watch Dogs that), it’s going to be phenomenal only on a specific set of hardware. Need I remind people that Watch Dogs had already been delayed?

Speaking of which, have these people seen Watch Dogs? It’s not exactly much to talk about. Frankly it doesn’t seem well optimized in general, but that’s just me.

Anonymous Coward says:

Re: Re:

Considering that AMD have managed to leverage a display standard in order to improve frame synchronisation rates (FreeSync using DisplayPort 1.2a) and openly distribute their GPU drivers for Linux, and they aren’t actively harming consumers’ collective choice, I’d say that AMD have the upper hand here.

That’s not to say that Watch Dogs isn’t horribly optimised, because it is. But I would give more latitude to AMD over Nvidia in the GPU space over the decisions it makes.

DB (profile) says:

Disclaimer: I’m a little biased on this topic, but I’m fairly well informed. Evaluate for yourself.

The press battle talks about this as tools, which is a nebulous term. It’s really pretty much libraries, and education on how to use those libraries.

Nvidia spent years and huge bucks developing the techniques to make these effects, and more money writing and tuning the code necessary to implement the ideas. Many movie and game effects weren’t written by the studios; they came from Nvidia.

AMD does the same thing. Or I should say used to do the same. When they ran into a cash crunch they stopped investing. ATI was still very profitable, but rather than investing in the long-term market position it was used as a cash cow to subsidize the money-losing parts of AMD (the entire rest of the company, including the Global Foundry wafer-start commitments). It was probably needed to save the company, but you still have to look at it as cutting R&D.

What’s happening now is what is supposed to happen in a competitive marketplace. Companies that invest in the next generation have a more advanced product and a competitive advantage. Companies that don’t invest, or invest in the wrong area, end up with a less desirable product.

AMD chose to invest in next generation console chips. They won every major console, displacing Nvidia GPU IP. Nvidia invested in visual computing, including libraries that do realistic rendering, simulating massive numbers of objects, flame effects, fog effects, hair, facial and gesture realism, etc. AMD has.. TressFX for hair.

These features add huge value to games. Or I should say, figuring out how to do these things is innovative and inventive. Being able to do these effects in real time is astonishing. People don’t know how to put a price on these features. But they can compare two implementations, and buy the one that does the best job. Or pay the same for two that have equivalent performance.

In the GPU business the hard metric has long been compute speed and FPS (frames per second). That’s easy to measure. But increasingly customers have realized that driver quality and feature support — the expensive part of the GPU business — is far more important. It’s hard to measure.. until features like this make a huge difference that can be measured in FPS.

Anonymous Coward says:

Re: Re: Response to: Anonymous Coward on May 30th, 2014 @ 9:11pm

Installing nvidia drivers on Linux is an extreme pain, at least when you are learning how to Linux (me, 3 years ago), especially when Ubuntu 12.04 didn’t have a drivers manager thing (which made me leave Ubuntu forever). I had to boot into recovery mode at a root prompt to install drivers for my 2 GeForce 230GTs working in SLI.

Anonymous Coward says:

Re: Re: Re: Response to: Anonymous Coward on May 30th, 2014 @ 9:11pm

Ubuntu does, and DID, have a drivers manager thing; nVidia prevented them from using it for nVidia drivers.

In Ubuntu 12.04, you had the Restricted Drivers Manager (separate from the manager under Software Sources) to manage proprietary drivers, which wasn’t palatable enough to nVidia, so they nixed the auto-configs to force you to install from the command line. Keep in mind this was still while nVidia was officially ‘working with’ Linux.

Still, at that point, you could have installed the nouveau (non-proprietary) drivers from the Restricted Drivers Manager.

rasz_pl (user link) says:

Sure they are

Just like that time they told Ubisoft to remove DX 10.1, or else they would be excluded from the “meant to be played” free-money program:

http://www.bit-tech.net/news/hardware/2008/05/12/ubisoft-caught-in-assassin-s-creed-marketing-war/1

Or that time they “helped” Crytek with tessellation, coincidentally just as they released a GPU with super-fast tessellation performance:

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/2

Or when they were shipping PhysX compiled for a 586/x87 target instead of using modern SSE/AVX instructions.

http://arstechnica.com/gaming/2010/07/did-nvidia-cripple-its-cpu-gaming-physics-library-to-spite-intel/
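
To make the x87-versus-SSE point concrete, here is a toy C sketch of my own (not code from PhysX or from the Ars Technica piece): the same scalar float source can be aimed at the legacy x87 FPU or at SSE purely through build flags, and the x87 path is typically slower on modern CPUs.

    /* Illustrative only: ordinary scalar float math. Whether it compiles to
       x87 stack instructions or SSE scalar instructions is a build-flag
       decision, e.g. with GCC for a 32-bit target:
         gcc -m32 -O2 -mfpmath=387          physx_style.c   (legacy x87)
         gcc -m32 -O2 -msse2 -mfpmath=sse   physx_style.c   (modern SSE)
       The function and numbers are invented for this example. */
    #include <stdio.h>

    float dot3(const float *a, const float *b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    int main(void) {
        float a[3] = {1.0f, 2.0f, 3.0f};
        float b[3] = {4.0f, 5.0f, 6.0f};
        printf("%f\n", dot3(a, b)); /* prints 32.000000 either way; only speed differs */
        return 0;
    }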

BTW, Watch Dogs on consoles is optimized for AMD GPUs PERFECTLY. But as soon as that same x86 codebase moves from console to PC, it gets the NVIDIA GameWorks “upgrade” and BAM, there is your result.

“Meant to be played” is an NVIDIA program that “certifies” your game and slaps a 2-second Nvidia banner on every launch of said game. Nvidia pays developers for it like it’s just an ad impression, but if you look deeper into the contract it gets pretty iffy. Nvidia gives you money, but also tells you how to optimize, what features to implement, and how. They not only advise, they directly give you code to inject into your product. Code that makes competitors’ hardware appear slower.

Intel used to do the very same thing to AMD with their ICC compiler. The compiler injected a piece of code that checked the CPU vendor string EVERY TIME your code ran. Change the vendor string = the program gets faster ON THE SAME HARDWARE.

http://www.osnews.com/story/22683/Intel_Forced_to_Remove_quot_Cripple_AMD_quot_Function_from_Compiler_
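
For anyone curious what that kind of check looks like, here is a minimal, hypothetical C sketch of vendor-string dispatch. It reads CPUID leaf 0 and branches on the vendor text rather than on the features the CPU actually reports; it is my own illustration of the pattern the OSNews article describes, not code taken from ICC.

    /* Hypothetical illustration of vendor-string dispatch (not ICC code):
       pick a code path based on who made the CPU, not on what it supports. */
    #include <cpuid.h>
    #include <stdio.h>
    #include <string.h>

    static void read_vendor(char vendor[13]) {
        unsigned int eax, ebx, ecx, edx;
        __get_cpuid(0, &eax, &ebx, &ecx, &edx);
        memcpy(vendor + 0, &ebx, 4);  /* CPUID returns the string in EBX, EDX, ECX order */
        memcpy(vendor + 4, &edx, 4);
        memcpy(vendor + 8, &ecx, 4);
        vendor[12] = '\0';
    }

    int main(void) {
        char vendor[13];
        read_vendor(vendor);
        if (strcmp(vendor, "GenuineIntel") == 0)
            puts("fast SSE/AVX path");      /* chosen by vendor name...              */
        else
            puts("generic fallback path");  /* ...even if this CPU supports SSE/AVX  */
        return 0;
    }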

FTC forced Intel to stop this practice. Who will force Nvidia to stop??

DB (profile) says:

Actually, almost all of those situations make the opposite point from the one you suggest.

For instance, Nvidia helped Crytek use a new, sophisticated feature of their upcoming GPUs. Nobody was misled. It was simply: “Use this technique to get better visual quality. It was too expensive to do with old GPUs, but we now have the hardware and software advances to make it feasible.”

How is that legitimately bad for anyone but competitors?

And the “The way it’s meant to be played” program.. Nvidia advises game creators on how to make their game faster by looking at how effectively it’s utilizing the GPU, and by moving some of the physical simulation from the CPU to the GPU. They aren’t making the game slower for AMD users; they are making the game faster for Nvidia users.

How is that legitimately bad for anyone but competitors?

It’s not at all like the Intel compiler generating deliberately bad code for AMD processors. It wasn’t a case of not generating special extension instructions such as SSE. Telling the compiler to optimize for AMD resulted in code that would run slower everywhere. Telling it to optimize for generic Intel processors, without SSE, resulted in code that ran fast on AMD as well. That’s trying to trip your competitor rather than running faster yourself.

Anonymous Coward says:

Re: Re:

You mean like how they “fixed” DX10, where displacement mapping (essentially tessellation) was stripped out because Nvidia was unable to make a compatible GPU in the required timeframe? Coincidentally, that was the reason ATI’s Radeon HD 2900 series performed like arse: it was built for the un-neutered DX10.

Nvidia is one of the shadiest businesses in the current hardware industry.

Oh, and also, they use paid shills to perpetuate the “AMD makes horrible drivers” myth (BTW, it was nvidia drivers that actively destroyed hardware because they fucked up fan control, and I also remember the statistics from Microsoft where nvidia drivers were responsible for about 10 times as many system crashes as AMD/ATI drivers) and to generally badmouth AMD hardware, especially when new GPU releases are upcoming.

Fact is, Nvidia lies and cheats, they have been for a very long time now.

Michael Donnelly (profile) says:

Market forces should handle this.

1) Ubisoft knew the product would under-perform on AMD hardware of a similar class. They chose to release it this way.

2) People buying the game with such hardware will be disappointed. A large number of them will understand point #1.

3) Such folks will become reluctant to buy further Ubisoft games.

The normal pain-feedback cycle applies here, unlike in many other non-competitive situations (broadband, anything related to movies/records, etc). If Ubisoft can make more money pissing off 40% of their target market, great. If not, they’ll work harder to make sure the game performs well on both chipsets.

Laying any blame at nVidia’s feet (or AMD’s) is silly. Ubisoft makes the call, Ubisoft reaps the results. They don’t have a monopoly on the market and Watch Dogs isn’t big enough to make people switch display adapters.

DB (profile) says:

G-sync was an innovation.. an actual invention.

I don’t use those words casually.

It’s something completely obvious, but only in retrospect. Before there was a solution, nobody knew it was a problem.

The usual competition in the GPU wars is an immediate claim of “that doesn’t matter”, followed by implementing a similar feature a year or two later, when it suddenly matters.

A good example is fixing frame rate variability, which caused jerky game play even with high FPS rates.

The reaction to G-sync was ‘no one cares’ followed a few days later by ‘[oh shit]’ and a mad scramble by AMD to find something.. anything.. that would have the same benefits.

AMD figured out an alternate approach. The AMD approach is not quite as good, but it still gets most of the benefit. It was possible to reprogram the latest AMD hardware to put out a variable sync signal, and that capability could be quickly added to an existing VESA standard.

AMD would not have done this on their own. The motivation was an innovation by Nvidia. AMD was strictly reacting.

looncraz (profile) says:

Re: Re:

Variable frame-rate displays have been around in various forms for decades. The only thing new is that it is being done on LCDs instead of CRTs or projectors.

Also, it takes far longer to get amendments to a standard ratified than it does to build a frame buffering/repeating device to fake variable frame rate display. Yes, fake. LCD tech requires refresh at certain intervals to maintain the image, so the nVidia solution is to replay the old image at that next interval, then play the new image as soon as it is ready (even if an LCD refresh isn’t needed).

This is just display-level double buffering – something that really shouldn’t require much extra display logic or hardware. You can keep the last completed frame in the frame buffer of the video card and time updates relative to the forced display refresh (20 Hz as an example). If you can get a completed frame to the frame buffer before the next forced refresh, you update the frame buffer and trigger a display refresh with a signal to the display (only new display ‘logic’ required). This update resets the clock until the next required refresh.

If you can’t make the deadline, you simply leave the frame buffer intact and wait until the refresh, then you update the buffer again and trigger a display redraw.
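
As a rough sketch of the timing logic described above, here is a small, self-contained C simulation of my own; the 33 ms maximum hold time, the frame-completion times, and every name in it are invented for illustration, and it is not code from any shipping G-Sync or FreeSync implementation.

    /* Simulated variable-refresh presentation: show a new frame the moment it
       is ready, and only replay the old frame when the panel's maximum hold
       time expires. All values are made up for this sketch. */
    #include <stdio.h>

    #define MAX_HOLD_MS 33  /* panel must be refreshed at least this often */

    int main(void) {
        int frame_done_ms[] = { 10, 22, 70, 95, 160 };  /* when the GPU finishes each frame */
        int n = sizeof frame_done_ms / sizeof frame_done_ms[0];

        int next_frame = 0;    /* index of the next frame the GPU will finish */
        int shown_frame = -1;  /* frame currently held on the panel           */
        int last_refresh = 0;  /* simulated time of the last panel refresh    */

        for (int t = 0; t <= 200; t++) {  /* one loop iteration = 1 ms of simulated time */
            if (next_frame < n && t >= frame_done_ms[next_frame]) {
                shown_frame = next_frame++;   /* new frame ready: refresh immediately */
                last_refresh = t;             /* and reset the hold-time clock        */
                printf("t=%3d ms: refresh with new frame %d\n", t, shown_frame);
            } else if (t - last_refresh >= MAX_HOLD_MS) {
                last_refresh = t;             /* deadline hit: replay the old frame   */
                printf("t=%3d ms: refresh, replaying frame %d\n", t, shown_frame);
            }
        }
        return 0;
    }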

nVidia’s solution is crude and invasive – the only purpose it has is to make it appear like they are doing something special or innovative.

rasz_pl (user link) says:

Technically no one is prevented. Nvidia simply will not “certify” your game as “meant to be played”, and in effect WON’T PAY YOU, the developer, a huge bribe^^^ad revenue.

This is not a case of Nvidia helping optimize, they are paying developers for CRIPPLING their products.

You don’t get to sell millions of copies as a graphics card bundle http://www.geforce.com/getwatchdogs by simply optimizing; you bend over backwards to please your slave master.

Grrr says:

A biased post. Here is the Nvidia response

From the Forbes article linked in a previous comment:

To put this particular argument to bed, I told Cebenoyan I wanted crystal clear clarification, asking “If AMD approached Ubisoft and said ‘We have ideas to make Watch Dogs run better on our hardware,’ then Ubisoft is free to do that?”

“Yes,” he answered. “They’re absolutely free to.”

And there’s nothing built in to GameWorks that disables AMD performance? “No, never.”

Perhaps more fascinating was Nvidia’s response when I flipped the situation around. What about AMD-partnered titles like Battlefield 4 and Tomb Raider? How much lead time did Nvidia receive, and how much would they need, to optimize Nvidia GPUs for those games? While I didn’t receive a direct answer, what I got was Nvidia returning fire.

“It varies. There have been times it’s been more challenging because of what we suspect stems from deals with the competition,” Cebenoyan says. “It doesn’t happen often. But when it does there’s a fair amount of scrambling on our part. I can tell you that the deals that we do, and the GameWorks agreements, don’t have anything to do with restricting anyone’s access to builds.”
