
"Nvidia, stop being a DICK"

Replies

  • ZacD
    ZacD ngon master
    I have mixed feelings about this. Imagine if Nvidia decided to make GameWorks open source and allowed AMD to implement the tech for their own hardware. Nvidia would no longer have any incentive to dump money into R&D, because AMD could quickly implement that tech for a much lower cost than what Nvidia spent researching it. If there was no benefit to investing money into R&D, Nvidia would stop doing it, and there would be less competition and fewer new and exciting developments. We wouldn't have seen G-Sync or FreeSync, HairWorks, VXGI, ShadowPlay, etc., without competition. 

    Also, there are a lot of assumptions that Nvidia is pressuring developers to only optimize for Nvidia and to intentionally gimp AMD performance, which there has never been any evidence for. What makes more sense is that the performance gap comes from developer incompetence/constraints, or from the game being developed on Nvidia hardware first. 

    Ideally, I'd like to see Nvidia release their tech for AMD and other developers to use after a timed period of exclusivity. AMD has to play the underdog, the victim, claiming that everything they are doing is for the consumer; it's the only way they can try to get ahead. 
  • ambershee
    ambershee polycounter lvl 17
    Studios do tend to develop primarily on Nvidia hardware in my experience - this is because Nvidia completely dominate the top end of the GPU market.

    ATI/AMD are somewhat competitive because they compete on a price/performance basis, but they never seem to have a model that competes with Nvidia's top-end set; they're constantly stuck playing catch-up.
  • Skinpop
    Skinpop polycounter lvl 9
    hopefully vulkan and dx 12 will remedy the situation. 
  • marks
    marks greentooth
    From a development point of view too, I've seen tons of stuff that simply doesn't work correctly on AMD cards - when most of your team have nvidia in their workstations it makes sense that games may end up better optimised for nvidia cards.
  • m4dcow
    m4dcow interpolator
    ZacD said:
    I have mixed feelings about this. Imagine if Nvidia decided to make GameWorks open source and allowed AMD to implement the tech for their own hardware. Nvidia would no longer have any incentive to dump money into R&D, because AMD could quickly implement that tech for a much lower cost than what Nvidia spent researching it. If there was no benefit to investing money into R&D, Nvidia would stop doing it, and there would be less competition and fewer new and exciting developments. We wouldn't have seen G-Sync or FreeSync, HairWorks, VXGI, ShadowPlay, etc., without competition. 

    The response on this forum is going to be a lot different than on some random gaming forum, because many of us work in the industry, or have gone to GDC.

    Nvidia spends a lot of time and money on R&D, and really has no incentive to open source a lot of this stuff. Many of the technologies used in GameWorks started as papers that other people, or even Nvidia themselves, presented at SIGGRAPH many years ago, put into an easy package for developers to make use of. They are basically trying to find ways to get developers to use new tech so people will buy newer, more powerful video cards.

    Their drivers tend to be more stable and when they aren't, fixes come relatively quickly.

    They provide many tools and lots of support to the development community, and in many cases make a lot of cool shit possible. ATI used to do the same sort of stuff; after AMD bought them up, not so much. TressFX is the only thing I can really think of.

    While one might say having these technologies optimized for both AMD and Nvidia would be a good thing for the gaming community, you might instead have them not developed at all, or it would fall to middleware companies to innovate, and then smaller games would not be able to implement said tech due to cost.

    When AMD actually does R&D on some new tech, rather than just recreating all the tech in GameWorks, then maybe they can have a talk about opening up APIs.

  • ZacD
    ZacD ngon master
    I wouldn't say AMD hasn't done anything for R&D. TressFX was out before Nvidia's HairWorks, and I'd say it's a bit better. There's also the whole Mantle/Vulkan API, and they were quick to release FreeSync, which is the cheaper option and comparable to G-Sync. 

    But I do agree on drivers: what takes Nvidia a day or a week to update/fix takes AMD a week to a month. 

    And developers can be slow to implement new tech. Remember Me (2013) on last-gen consoles had capsule-based ambient/reflection occlusion and some advanced reflection/cubemap features, tech that wasn't even implemented in Unreal Engine 4 by the end of 2015. There are still next-gen games coming out without proper PBR.
  • Rellek
    As an open source developer, as well as one who works on proprietary products, I can see the case for both sides. I'm for open sourcing what could be considered standard frameworks. It would not be a loss for Nvidia to release their R&D software; it would improve it, since right now you have AMD and Nvidia essentially wasting money developing the same software. Instead you would have the R&D of two major companies creating the best standards possible, as well as blanket adoption of new features in games/simulations. The only cases for closed software are market control, and source control in the sense that you can freely adapt open source projects into your own. The latter is unethical and goes against licensing clauses, but since no one else can look at your code, no one can sue. Which is the case being made in the video. I'm very sure Nvidia will not shift gears to open source anytime soon, as the current marketplace is still dominated by proprietary software via rules on copyright. The one exception would be instances where low cost and extreme performance/tweaking are needed. 
  • m4dcow
    m4dcow interpolator

    @ZacD I forgot about Mantle and I suppose that's a big one, freesync not so much.

    If they made their tech easy for devs to implement and/or made good tools for them, maybe something like TressFX would be in more than one game by now.

  • echofourpapa
    echofourpapa polycounter lvl 4
    This isn't the first time Nvidia has drawn such criticism:

    https://www.youtube.com/watch?v=qqCGaSw5Bes

    And to Nvidia's credit, last I checked they've been much more helpful to the open-source driver community since.
  • EarthQuake
    Speaking as someone who works in software dev, on a real-time renderer, I've lost count of the number of fixes we've had to implement because something doesn't work on ATI, and we go out of our way to avoid 3rd-party dependencies; that is, we typically write our own rendering features to work on as many GPUs as possible rather than relying on proprietary tech. We develop and test on both nVidia and ATI. If your game/app doesn't work or run well on your ATI card, it's because you bought an ATI card, not because nVidia is a big bad bully.

    The whole premise of this video is asinine and childish. nVidia is mean... because they support developers... and help to make their drivers work better with the dev's software/games? What should nVidia do, purposely cripple their hardware or stop giving support so it isn't so hard for AMD to compete?


  • Macrow
    Macrow polycounter lvl 8
    More than any advantage of open-source philosophy, I stand by this principle: a business has the right to free enterprise.  And, more often than not, we benefit more from closed-source development than we give it credit for.

    If NVIDIA's doing something that's hurting their competition, well, that's just too bad.  Welcome to free enterprise.  That's the risk of running a corporation.  This isn't some camp sitting by a fire, singing kumbaya--it's competitive business.

    And AMD opening up their products as open-source doesn't make them some sort of brilliant savior for the future of technology--they're opening up because they have little other choice.  They can't beat NVIDIA, so they go for the next appeal: Being for the "open" community, and hoping big names adopt their products without barriers.

    What AMD does isn't necessarily what NVIDIA *should* be doing.  Is this open way working for AMD?  Are they on top any more today than a decade ago?  No offense to AMD--I like their being competition for NVIDIA.  But just looking at NVIDIA, I say: Hey, their business model's working, right?  More often than not, gamers, game developers, and graphics artists alike choose NVIDIA in our graphics-related industry, right?

    NVIDIA's doing just what a company's expected to do: Make money and ensure the future of the company.  They're not out to determine social values, entertain ideals over development philosophy, or abide by one's arbitrary opinion over ethics.  They're there to thrive as a company and provide services that their competition can't.

    I think if NVIDIA had gone all "open-source" with their biggest products, we'd never see the level of consistency and quality we've come to expect from NVIDIA's top-of-the-line products.  The reason NVIDIA's ahead of AMD is that they've secured the kind of partnerships with other companies, and the kind of quality, that no one in the graphics industry matches.

    If AMD struggles to match it, then that's not NVIDIA's problem, really.  Neither is it truly the problem of the average consumer, since the average consumer isn't the one constantly trying to keep up with the bleeding edge of NVIDIA's products like the niche enthusiasts are.  It's enthusiasts like this YouTuber who spend the most money, at a more frequent rate, so naturally, it makes sense for NVIDIA to target them with higher prices for their bigger products--it's the cost of chasing the bleeding edge.

    NVIDIA has a wide range of products, from affordable to cutting-edge.  Simply changing their whole successful business approach, just to please some guy trying to save money on this endless pursuit of better graphics, wouldn't be wise on NVIDIA's part.

    Not that it's even apt for every product to go open-source.  Yes, some things benefit from it.  Some things don't.  Sometimes quality's better found closed-source.  You think NVIDIA's ShadowPlay feature, which this YouTuber raves over, would've been achieved through open-source means?  Not a chance.  It came from focused, full-time, unified, dedicated development and tons of in-house testing and quality control, which only closed-source development affords.

    Open-source has its perks, but oftentimes, you get something like Blender: Too many cooks in the kitchen, no real game plan, ad-hoc contributions with no real standardization, too many ideas and not enough leadership, and trying to master so many aspects that it never truly masters any one of them.

    Now, even Blender's a great idea in concept, and a useful tool for various purposes, but in execution, it could stand to benefit from some behind-closed-doors development.  I think Blender's a prime example why open-source isn't always some dream ideal for technology.  It's useful, it's inspiring, it's good for the poorer folks, but it's not leading the big industry any time soon.  Similar could be said of AMD.

    Since NVIDIA hasn't struggled with selling their cutting-edge products (though, funny enough, they've struggled more in selling their mobile chips, because mobile gaming so largely ignores big horsepower), they have little need to change what's working best for them.  If you were a major business, chances are, you'd keep doing what works for you, too.
  • Bek
    Bek interpolator
    m4dcow said:

    @ZacD I forgot about Mantle and I suppose that's a big one, freesync not so much.

    AMD also gave the Khronos Group full access to their Mantle API so Vulkan could get a huuuuge head start. That's something done for the good of the community rather than a direct monetary incentive.

    If it could be proven that nvidia sabotages a game's performance on AMD cards, that'd be a good reason to buy AMD, unless you don't care about supporting shitty business practices. But that tessellation stuff doesn't seem deeply malicious, merely petty if done solely to hurt AMD performance. And even that seems unlikely given more reasonable explanations.
  • Panupat
    Panupat polycounter lvl 17
    I don't see why NVidia has to try to adopt something else when they already have their own thing. OK, you can compare AMD to a saint for giving out their technology as open source, but I'm not gonna call NVidia evil just because they don't share...
  • ZacD
    ZacD ngon master
    Yeah the most likely situation with Crysis 2 is:

    Crytek released Crysis 2. Gamers complained it wasn't as straining on systems as Crysis 1. The small Crytek team left to do the DX11 build had to implement something quickly that brought most computers to their knees but still ran well on high-end systems. They implemented tessellation that may or may not have been provided by Nvidia. Crytek's computers probably had Nvidia GPUs; it worked well enough on their rigs, so they shipped it. 

    Obviously they didn't spend much time on the DX11 build of the game, because it looks like most of the displacement maps were just generated from a program like CrazyBump. The AMD fanboy conspiracy theory is that Nvidia wanted their cards to run Crysis 2 DX11 benchmarks better so more people would buy Nvidia cards, and so Nvidia bribed Crytek to use levels of tessellation that would gimp only AMD cards. 
  • AtticusMars
    AtticusMars greentooth
    Apparently making your tech closed source and working directly with developers to support them in implementing it is called industrial sabotage now?

    AMD's decision to opensource a lot of their tech is admirable but it also makes business sense for a company that has low market share and can't offer the same quality of direct support for their products as Nvidia. AMD and Nvidia are both publicly traded companies with shareholders, the notion that either of them are doing anything motivated by some sort of gamer altruism strikes me as incredibly naive.

    While we're at it we may as well make Nvidia release all of their patents. Patents are inherently a legally granted monopoly, just as bad if not worse than closed source development. Imagine how competitive AMD cards would be if they could just reproduce everything Nvidia did with none of the cost associated with developing or supporting it!
  • Skinpop
    Skinpop polycounter lvl 9
    Apparently making your tech closed source and working directly with developers to support them in implementing it is called industrial sabotage now?
    Not quite, but as a programmer it annoys me that nvidia does game-specific driver optimizations to get the best performance. It means that unless you have a good relationship with nvidia, your product will run worse on their hardware. I don't like the idea of the GPU being a black box, which is very much the case right now. With DX12 and Vulkan that should change somewhat; devs will finally have the API and tools to do what nvidia has been doing all along. It's no coincidence that Mantle came from AMD, and it wouldn't surprise me if nvidia dislikes the recent push for close-to-the-metal APIs, since it might have a negative impact on their advantage.
  • ambershee
    ambershee polycounter lvl 17
    Skinpop said:
    Apparently making your tech closed source and working directly with developers to support them in implementing it is called industrial sabotage now?
    Not quite, but as a programmer it annoys me that nvidia does game-specific driver optimizations to get the best performance. It means that unless you have a good relationship with nvidia, your product will run worse on their hardware.
    No, it doesn't. Chances are your product already ran better on Nvidia hardware than it did on AMD; this is down to the fact that Nvidia hardware is just plain superior and has more stable driver support, whether we like it or not. Game specific driver optimizations and fixes are just the cherry on the cake that Nvidia provide for popular titles, not due to any special standing between the developer and Nvidia, but because it makes business sense for them to do so; if a high volume of popular product runs better on an Nvidia setup, then customers are going to be buying Nvidia setups.
  • echofourpapa
    echofourpapa polycounter lvl 4
    I don't know; Nvidia is forgiven for a lot because they make great chips and there is no real competition. But Apple and Microsoft are all too often vilified for doing the same thing.

    And there exists a strong trend to embrace open source (.NET and UE4 being the two biggest examples relevant to the forum, besides possibly Android).

    I get the merits of a closed and carefully curated system, but opening a system has its own set of merits. 
  • alekseypindrus
    alekseypindrus polycounter lvl 10
    echofourpapa, UE4 is not open source.

    Yes, you have access to source code.
    No, you can't redistribute it or use it as you wish.
    No, it's not free to use — royalty or upfront payment needed if you sell your product.

    https://www.unrealengine.com/eula
    https://www.unrealengine.com/faq


  • Bek
    Bek interpolator
    ambershee said:
      the fact that Nvidia hardware is just plain superior
    Superior to what? ATI/AMD have at various times had the fastest cards available; they've also at times had the cards with the best price/performance ratio. Just like nvidia. Claiming one company's hardware to be always superior seems like fanboyism, unless you're an engineer with hard data to back up the claim that nvidia hardware is 'plain superior'.

    That's also a pretty ridiculous claim if you remember cases where ATI hardware has been 'superior' (for a certain purpose — claiming something to be superior is kinda vague) to nvidia's, like in the early days of crossfire/SLI, or the modern "crossfire" cards with two chips on the one PCB like the 3870x2. What nvidia hardware was superior to that exactly? And I'm not looking to debate minor technical details or argue what was better at x and y, I'm just saying nvidia had no comparable hardware. The 3870x2 was a clear hardware win for ATI; I'm sure there are other examples.
  • gumustdo
    gumustdo triangle
    My 2 cents: as a digital artist working in 3D, yeah, software like 3D-Coat and Blender's Cycles renderer benefits from Nvidia hardware thanks to its CUDA cores, which kind of locks my choice of GPU straight to the Nvidia brand.

    And it's already been like that since the days of Quadro (which nowadays isn't that important for 3D content creation anymore).

    So besides focusing on consumers and making technology open source, one other thing AMD should do is have tech for us content creators too.
  • m4dcow
    m4dcow interpolator
    Bek said:
    ambershee said:
      the fact that Nvidia hardware is just plain superior
    Superior to what? ATI/AMD have at various times had the fastest cards available; they've also at times had the cards with the best price/performance ratio. Just like nvidia. Claiming one company's hardware to be always superior seems like fanboyism, unless you're an engineer with hard data to back up the claim that nvidia hardware is 'plain superior'.

    That's also a pretty ridiculous claim if you remember cases where ATI hardware has been 'superior' (for a certain purpose — claiming something to be superior is kinda vague) to nvidia's, like in the early days of crossfire/SLI, or the modern "crossfire" cards with two chips on the one PCB like the 3870x2. What nvidia hardware was superior to that exactly? And I'm not looking to debate minor technical details or argue what was better at x and y, I'm just saying nvidia had no comparable hardware. The 3870x2 was a clear hardware win for ATI; I'm sure there are other examples.
    While the large performance jumps tended to go to whoever was due to update their line first, Nvidia's drivers have almost always been superior, due to the support and involvement in the game dev community mentioned above.
  • Bek
    Bek interpolator
    So I am told; I have not noticed it — though my last nvidia gpu was two cards ago. My friends with nvidia cards have had driver issues now and then (specifically with Shadowplay, which was mentioned in the video). I've had issues with AMD drivers as well, but I don't have enough data to be swayed one way or the other. Others with more experience might have horror stories, which is fine. It'd be interesting if there were some independent 'rate my driver release' service; I wonder if you'd see any 'bias' towards specific OSes or games for either company. 'Bias' is probably the wrong word there, as you couldn't distinguish between good/bad coding on the game's behalf and good/bad work on the GPU hard/software, but it'd still be interesting.

    Thinking about it, the idea that anything is developed specifically for a certain brand is discouraging, that just seems anti-consumer from the beginning. Competition and choice are good for us consumers; I wouldn't want my options limited to X because it's the only thing capable of doing Y.
  • ambershee
    ambershee polycounter lvl 17
    Bek said:
    ambershee said:
      the fact that Nvidia hardware is just plain superior
    Superior to what? ATI/AMD have at various times had the fastest cards available; they've also at times had the cards with the best price/performance ratio. Just like nvidia. Claiming one company's hardware to be always superior seems like fanboyism, unless you're an engineer with hard data to back up the claim that nvidia hardware is 'plain superior'.

    That's also a pretty ridiculous claim if you remember cases where ATI hardware has been 'superior' (for a certain purpose — claiming something to be superior is kinda vague) to nvidia's, like in the early days of crossfire/SLI, or the modern "crossfire" cards with two chips on the one PCB like the 3870x2. What nvidia hardware was superior to that exactly? And I'm not looking to debate minor technical details or argue what was better at x and y, I'm just saying nvidia had no comparable hardware. The 3870x2 was a clear hardware win for ATI; I'm sure there are other examples.
    Nvidia's top-end cards have never been outperformed in benchmarks by an AMD offering in nearly ten hardware generations (which is how long ago the 3870x2 was, by the way; the GPU market moves fast). Besides, we're talking about now, not eight years ago. AMD have no comparable offering to something like the GTX 980 Ti; none of their hardware even comes close to it in terms of performance. The R9 Fury X is outperformed by the GTX 780 Ti, and is outperformed in terms of both performance and price by the GTX 970.



    Like I said before in one of my earlier posts, AMD competes in terms of price and performance, not only in GPUs but also CPUs. The top-end cards are the sole domain of Nvidia, and this is why developer workstations are almost exclusively running on Nvidia hardware (ergo, the discussion in this thread). Unfortunately for AMD, they are now also starting to lose the mid-tier market to Nvidia, and have completely lost the low-end market to Intel.
  • Shrike
    Shrike interpolator
    It has always been that AMD offered better price/performance and Nvidia/Intel offered the better high end. In the low to lower-middle range, AMD is far more cost effective, and their integrated CPU/GPU solutions are also far better. That seems to be their market; I'm not sure they try that hard to get high-end GPUs out.
  • Bek
    Bek interpolator
    ambershee said:
      none of their hardware even comes close to it in terms of performance.
    Er, that benchmark you linked shows an R9 Fury X with a score of 8229 and a 780 Ti with a score of 8976, a difference of 747 points. That seems pretty comparable to me (though obviously from that chart the 780 Ti is the better buy, with a higher score and a lower price by $30). But to say nothing comes close seems hyperbolic. I also seem to remember that during the bitcoin mining craze AMD cards were preferred for their raw number crunching per watt, which is hardly a sign of inferior hardware. Though I've no sources to back that up.

    Also not to be an ass but you can't say "never" and then add a time limit. It's either never or 'hasn't in...'. But good point, the 3870x2 was 8 years ago. It's a shame one company has dominated the top-tier for that long.
  • Marine
    Marine polycounter lvl 19
    ambershee said:
    AMD have no comparable offering to something like the GTX980ti - none of their hardware even comes close to it in terms of performance. The R9 Fury X is outperformed by the GTX 780 Ti, and is outperformed in terms of both performance and price by the GTX970.
    At standard resolutions sure. But at 4k, which I'd consider high-end, they are competitive with the 980ti http://www.anandtech.com/show/9390/the-amd-radeon-r9-fury-x-review/14 
  • ambershee
    ambershee polycounter lvl 17
    In the article you linked, the Fury X comes out lower than the 980 Ti in every comparison except Far Cry 4? It does perform reasonably in several of the comparisons though. The verdict at the end is that the 980 Ti is just the plain superior card for the money:

    Once you get to a straight-up comparison, the problem AMD faces is that the GTX 980 Ti is the safer bet. On average it performs better at every resolution, it has more VRAM, it consumes a bit less power, and NVIDIA’s drivers are lean enough that we aren’t seeing CPU bottlenecking that would impact owners of 144Hz displays. To that end the R9 Fury X is by no means a bad card – in fact it’s quite a good card – but NVIDIA struck first and struck with a better card, and this is the situation AMD must face. At the end of the day one could do just fine with the R9 Fury X, it’s just not what I believe to be the best card at $649.
    Regarding the info in my previous post and the points I was making with those benchmarks:
    1) The R9 Fury X is outperformed by Nvidia's GTX 780 Ti - the latter was released in 2013 whereas the former was released in 2015. I was pointing out that according to those benchmarks, AMD's current best offering is comparable to what Nvidia brought to the table over two years ago.
    2) Nvidia's current high end offering is of course not the GTX780 Ti, it is the GTX980 Ti, which obliterated the R9 Fury X in these benchmarks. Whilst the 980 Ti is not actually close to the top end of the benchmark table (which is Nvidia all the way up) it is the best performing consumer orientated card. The R9 Fury X is AMD's top performing card, so I really don't think it's 'hyperbolic' to say nothing comes close when the differences in those scores was so drastic.
    3) For price / performance the R9 Fury X is a fairly poor choice versus the GTX970, which is both cheaper and much better. It does not appear in the comparison image because Passmark only allows me to compare three cards at once.
  • MmAaXx
    MmAaXx polycounter lvl 10
    Macrow said:

    Open-source has its perks, but oftentimes, you get something like Blender: Too many cooks in the kitchen, no real game plan, ad-hoc contributions with no real standardization, too many ideas and not enough leadership, and trying to master so many aspects that it never truly masters any one of them.

    Now, even Blender's a great idea in concept, and a useful tool for various purposes, but in execution, it could stand to benefit from some behind-closed-doors development.  I think Blender's a prime example why open-source isn't always some dream ideal for technology.  It's useful, it's inspiring, it's good for the poorer folks, but it's not leading the big industry any time soon.  Similar could be said of AMD.
    Seriously, are you comparing Blender with the strategic choices of super huge corporations such as AMD and Nvidia?
    You obviously know nothing about open source. I'll give you a short list of open source programs and gems that define a standard:
    OpenGL, Linux, Bullet physics, Android, Firefox, Chrome, LibreOffice, LSCM (the most used unwrap tech), Ptex, OpenSubdiv, Alembic, FLAC, etc.

    As you can see, open source doesn't mean a failed business idea; most of the time, thanks to open source code, you can actually play with cool features in your expensive software.

    cheers.
  • Shrike
    Shrike interpolator
    There's a big difference between community-sourced development and making your "finished" work open source, like AMD does.
  • echofourpapa
    echofourpapa polycounter lvl 4
    @alekseypindrus  I guess there's a bit of a difference between OSS and FOSS. UE4 is open source (it's on GitHub); anyone can freely download and modify it. You can't redistribute it, and Epic wants payment for a released product using UE4, but that doesn't make it closed-source software.
    And, if I'm not mistaken, they will consider community pull requests when appropriate.

    Either way, like @Shrike said, Nvidia doesn't need to make anything community developed, and open source (see Chrome, Red Hat, UE4, Android) is not the same thing as community-developed FOSS (Chromium, CentOS, OGRE, CyanogenMod).


    Opening up software/systems/processes can make the difference between being a leader people want to follow vs a leader people have to follow. Epic has made that transition IMO, and it would be great if more places, like Nvidia, did the same.
  • Aabel
    Aabel polycounter lvl 6
    Aren't all the current consoles running AMD GPUs? If I were an executive at Nvidia, I would be doing everything I could to make sure the PC versions of as many games as possible look several orders of magnitude better than they do on console, even on PCs running AMD hardware.

    Restricting GameWorks to Nvidia GPUs slows adoption of the technologies and creates bitterness in the PC gaming community. Nvidia is slowly starting to allow certain GameWorks tech to run on AMD hardware. I hope they continue; it would be nice for PC devs to feel confident using Nvidia software technology without worrying about alienating a small portion of the market.
  • JedTheKrampus
    JedTheKrampus polycounter lvl 8
    The whole UE4 thing is why the free software community has the term "source available." There's a certain subset of software licenses that officially qualifies as "open source" and a slightly different subset that officially qualifies as "free software", and UE4 is neither of them for a variety of reasons including the royalty. So, it's proprietary software where you can get and modify the source code and get those changes integrated back into the engine.

    You can read all about it here if you're interested. http://www.gnu.org/philosophy/free-software-for-freedom.en.html