Hello to all,
I've made several posts in different CG/3D software forums trying to find a solution for my problem, but nothing yet. So I'll post here too, hoping someone can help me :poly121:
I bought a new rig 2-3 days ago: i7 2600K; GF 560 Ti; 16 GB Corsair 1600 MHz; 750 W Corsair PSU; Asus P8P67-M Pro; etc.
I have a problem with the Maya viewports: they are very slow. Just creating a simple cone and smoothing it out 4 times by 2 divisions, which gives me around 2.5 million tris, my frame rate in default and high-quality rendering is between 4-6 FPS, which is very low. It's very laggy.

I searched the net for solutions and tried everything, and nothing has worked so far. I tried every possible driver for my GeForce and nothing worked. I tried the same scene on my old rig, which has a Radeon 4890 (a card two generations older), and I get 20 FPS. It's like the Maya viewport doesn't even use my GPU. I actually tried uninstalling the Nvidia drivers and running the same scene with no driver, only the generic VGA adapter, and got the exact same result. On the other hand, I've done a lot of benchmarking, and my games and scores are exactly what you would expect from this GPU.

I tried the same scene in the Max trial version, and again the same thing: 3-4 frames in OpenGL for 2 million tris, which is just a shame. I re-installed Windows twice; the second time I formatted the HDD, installed fresh Windows, and the first thing I did was install only the Nvidia drivers, nothing else, and then install Maya. And nothing, same thing.
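As a side note (not from the original post), the triangle count from that smoothing recipe is easy to sanity-check: each Catmull-Clark division splits a face into four, so one smooth with 2 divisions multiplies the face count by 16. A rough sketch, assuming a default ~40-face Maya cone:

```python
# Rough sanity check of the test scene's poly count (illustrative, not from
# the original post). One smooth with d divisions multiplies faces by 4**d.
def faces_after_smoothing(base_faces, smooths, divisions_per_smooth):
    """Face count after `smooths` smooth operations of `divisions_per_smooth` levels each."""
    return base_faces * (4 ** divisions_per_smooth) ** smooths

# A default Maya cone has roughly 40 faces (20 sides plus a 20-triangle cap).
print(faces_after_smoothing(40, smooths=4, divisions_per_smooth=2))
# → 2621440, in the same ballpark as the ~2.5 million tris reported above
```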
I'm very confused, so I need help from Master Yoda! :poly142: Please.
Thanks in advance!
Replies
I remember way back when I used to use Maya it was always an issue trying to find just the right driver that wouldn't cause viewport errors or performance issues. Have you tried all the different driver versions?
In any case, good luck!
Never buying an Nvidia again. It's switched around: ATI/AMD make the best graphics cards for our use these days, since Nvidia is forcing you to buy Quadros for game art.
@OP: Hope you get your problem fixed.
Okay, so... I contacted Nvidia support with a 3-page report (that was before I found out about the problem), telling them all my findings and all the tests I did: that a 2-generation-old ATI kicks their latest-generation GeForce in the nuts, and that there is a whole community angry about the last two generations of cards. By the way, it's true: when I uninstall my drivers and open Maya with only the generic VGA adapter and replicate the same scene, I get the exact same frame rate. Bottom line, we don't get any GPU support in the viewports; it's like I don't have a GPU at all.

A few minutes ago I got a response from Nvidia. They told me they didn't know about this bug, even though I told them it has been replicated many times by different people and all 400 and 500 series cards have it. Still, they don't know about it, and they want help replicating the issue so it can be passed to the Nvidia labs. So I'll be doing that now. If I have any news, I'll reply.
If someone has a Radeon 6950, please let me know what frame rate you get.
Thanks
Geexile, I tried with their report form and never got a reply. Hell, I spoke with one of the reps at PAX and he checked my settings and admitted it was a driver issue. But the problem, he said, is that all such fixes are prioritized for the Quadro. He could put in a request/notice, but he couldn't guarantee any fix. New drivers have come and gone since then with the same issue present, so I can only assume he didn't follow through, or they ignored him.
I KNOW it's a driver issue because Nitrous (and OpenGL in my case) work fine. You want some really weird shit? Their drivers are so whacked that my card will crash in Max if I have it open too long at full screen on the laptop screen. I can see the card getting to 60 degrees Celsius and bam, the card crashes. I have to reboot to re-enable it. If I don't maximize Max, it won't get as hot. But what really throws it for a loop? If I use the external monitor for Max alone, at full resolution, the card doesn't get as hot and doesn't crash.
That's just how badly unoptimized at least the M versions are. Optimus, my ass...
A little googling brings me to this. I'm not sure if this method still works since the guide is 2+ years old now. Unfortunately I don't have a 400 or 500 series card to test it out on.
EDIT:
Never mind a little more research says this isn't possible with modern Nvidia cards.
I have a GTX570 and haven't had any problems, but then again I don't use Max and can't specifically attest to anyone else's problems.
While it probably is the case that Nvidia has crippled the GeForce cards for certain things, they at least work most of the time, and don't turn into the shitfest that is ATI/AMD with 3D stuff. I know there are plenty of people who have those cards working without issue, but when someone has crazy erratic behaviour with a 3D application that ends up being hardware/driver related, more often than not it's AMD/ATI.
Hopefully this next generation ATI/AMD can get their driver act together as it relates to 3D development, but there isn't really much incentive for them, so I'll believe it when I see it.
I wish I could help more though but we mostly got Quadros at work. It's definitely an interesting issue. Sucks on nVidia's part if they really do this on purpose.
So thanks anyway to everyone. I'll let you know how it goes with the Radeon.
Thanks
Windows 7 64-bit, Q6600 and 8 GB RAM. I can double-check the driver version I'm running, but I'm pretty sure I updated it to the one BF3 forced me to get.
@oxynary: I don't use Max as a modeling package, just to play with MAXScript and analyze scripts. Haven't had any problems.
I did the same thing as your scene: a cone smoothed to 2.6 million polys. I was getting about 8 FPS in Maya 2012 with a GTX 580, then changed the display to Viewport 2.0 and the FPS runs more around the 180 FPS mark.
Even smoothed to 42 million polys, it runs at 15 FPS under Viewport 2.0 :O
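For anyone who wants to flip the same switch from script rather than the panel menu, here's a minimal sketch using Maya's Python commands (the panel name `modelPanel4` is an assumption for the default perspective panel; the import guard just lets the snippet load outside Maya):

```python
# Sketch: set a Maya model panel's renderer to Viewport 2.0 ("vp2Renderer").
try:
    import maya.cmds as cmds  # only available inside a running Maya session
except ImportError:
    cmds = None

def enable_viewport2(panel="modelPanel4"):
    """Switch `panel` to Viewport 2.0 and return the active renderer name."""
    if cmds is None:
        return "maya not available"
    cmds.modelEditor(panel, edit=True, rendererName="vp2Renderer")
    return cmds.modelEditor(panel, query=True, rendererName=True)

print(enable_viewport2())
```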
Anyway, I'm waiting for my new Radeon to arrive in the next 2-3 days. I already returned the 560 Ti. When investing in such expensive cards, I was hoping everything would be smooth, not to have to hunt for workarounds and spend energy on technical stuff, so I decided to try my luck with a Radeon.
Let us know how the ATI card runs the same scene.
I chose Nvidia over ATI because of how often they update drivers and also add additional functionality/features, like SSAO in games that don't normally support it. Also big performance increases in new drivers for new games and/or old games.
Doesn't seem to be a new issue.
I'm planning to get a new rig in February, so I'll probably just go with AMD. I'm a little concerned about the PhysX performance though. Not that I can't play games without it, but it would be interesting to play with as an aspiring game FX artist.
Sorry I didn't reply earlier, but I had a lot of stuff to do.
So I have a lot of information to share, starting with a comparison test of both cards; after that I'll give the official info from Nvidia itself.
First of all, I'm using the exact same system, same Windows, same software. The only things different are the cards and, obviously, the drivers.
The cards were tested in Maya 2012, Max 2012 (trial version), Blender, Unity, UDK, Mudbox, Photoshop, and Marmoset Toolbag. That's the software side; sorry, I wasn't able to test Softimage, Cinema 4D, etc. I just don't have them and I'm not familiar with those programs, but still, all the 3D creation software showed almost the same results. For games, I tested Rage, Skyrim, Deus Ex, and Dead Space 2; those are all the new games I have on my HDD. I don't have a lot of time to play.
I'm using the default settings for both cards; however, even after a lot of changes to the settings, there was almost no impact on performance within the 3D software.
I`m comparing
Gigabyte GF GTX 560 Ti, 1GB N560OC-1GI Windforce Factory Clock
VS.
Asus AMD/Ati Radeon 6950, 1GB , EAH6950 DCII Factory Clock
The price difference is very small, just a couple of bucks in my case.
Same scene tested in Max and Maya.
2.6 mil tris
Default (OpenGL) rendering
Max - GF: 3-4 FPS; RA: 38-39 FPS
Maya - GF: 4-6 FPS; RA: 44-46 FPS
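Taking the midpoints of those FPS ranges (an assumption for illustration, not a measurement from the post), the rough speedup of the Radeon over the GeForce works out to about 9-11x:

```python
# Rough speedup from the FPS ranges quoted above, using range midpoints
# (an illustrative assumption, not a measurement from the post).
def midpoint(lo, hi):
    return (lo + hi) / 2

max_speedup  = midpoint(38, 39) / midpoint(3, 4)  # 3ds Max: 38.5 / 3.5
maya_speedup = midpoint(44, 46) / midpoint(4, 6)  # Maya:    45.0 / 5.0
print(round(max_speedup, 1), round(maya_speedup, 1))  # → 11.0 9.0
```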
So there it is: the Radeon is a total beast and brutally kills the GeForce. I tested all the games; they had almost the same frame rates, with the Radeon 2-3 frames higher.
Some other pluses for the Radeon: 4 DisplayPorts and 2 DVI ports (one of which can be adapted to HDMI), which means you can use up to 6 monitors. And because it uses DisplayPort, I was able to free up a DVI port on my monitor and hook up my PS3. However, that's pretty specific to the manufacturer. So far I've tested all the software mentioned at the beginning with driver version 11.12 and everything runs great; I didn't encounter a single glitch. 11.12 looks stable for now. So the bottom line is: if, like me, you're looking for a consumer graphics card with great performance in viewports and games, then I would definitely choose the Radeon (why? well, check the scores I posted).
If you constantly use software that needs CUDA, like Mari, or if you want a professional card and have a lot of money to spend, then I guess a Quadro would be better. Anyway, I'm pretty happy with this card.
Nevertheless, I continued to communicate with Nvidia. There are a lot of people who use GeForce, or will use GeForce, so I felt I had to find as much information as possible; if that info helps someone, it will be cool.
So I helped Nvidia replicate the problems in Maya and Max with detailed information, and we communicated a lot. Here is the last email I received; it's a quote, and it came directly from Nvidia.
So if I get an update, I'll let everyone know, but it was officially confirmed by Nvidia too. So if someone is going to buy a card, I think it would be safer to go for a Quadro, or, for less money, a Radeon.
Hope that helps.
Cheers
I'm glad you were able to get your problem solved though
It sounds like you talked to the same guy I did at PAX. He basically said the same thing. I think pointing out that, as game developers, we work on the same consumer hardware as the end user would be prudent to making our case. They seem to want to lump us in with the CAD/CAM and realtime-simulation camp. We are not. If only they knew their own audience better, especially us, the creators who make consumers want to purchase Nvidia cards to display all the neato bits...
It's like you almost need to go above tech support and talk to a suit to attempt to make them understand they are shooting themselves in the foot. Maybe someone at a bigger company like Valve (which they would ignore, since Valve went ATI): a head honcho calling one of their VPs and explaining how stupid their policy is.
Did you get his direct email or just a general one? Maybe if we start an email campaign just showing how prevalent consumer cards are in our industry, it might help this guy make a case to get the priority changed.
From the beginning, this guy let me know that no bug regarding this topic had been reported in the Nvidia bug database until now. I believe that's because people report poor viewport performance without comparing it to anything, and Nvidia responds to this by default, without even looking:
"Buy a Quadro." And they don't even bother to log it as a bug or look at it.
Because they don't use 3D apps, they can't know what frame rate should be expected. That's why I emphasized the huge difference between the Radeon card and older GeForce cards (I basically gave them the Radeon results in every mail), which forced them to actually test a Radeon card and see the difference with their own eyes. By the way, from the beginning I let them know that there is a whole community with these problems, some of them making games, the same games for which consumers buy Nvidia cards. But I think they didn't care about that as much as they cared about AMD/ATI making them look "pathetic". But that's not important; what's important is having results. If that works for them and can stimulate them to dig deeper, then cool. I think there will be an answer soon enough. If I have any news, I'll let you know.
@EmAr : Thanks
Talk about lame... I was trying to upgrade from the Quadro FX 3500 that I bought in 2007. On my test scene the GTX 560 Ti performed about the same, if not slightly worse. I was dumbfounded... benchmarking sites rank the 560 as 6 times faster than my old card. Next time I'll seek out package-specific benchmarks, I think.
Anyhow, thanks again.
I just submitted a question to Nvidia too, saying how poor their cards' performance is compared to ATI consumer-level cards in Maya/Max.
Hopefully they get enough complaints that they do something about their drivers.
So far I've been working all week with my new Radeon, and 11.12 looks stable enough; no problems whatsoever. I still don't have any news from Nvidia, but if they decide to answer, I'll post the answer here.
@mezza550: If you're going to buy the same Radeon and test it the same way, one thing to keep in mind is that immediately after you smooth the object you'll get around 20 FPS for several seconds; keep rotating the camera for a few seconds and the FPS will jump to around 45-47. I don't know why that is; probably vertical sync is turned on somewhere, but I couldn't find it. So I tried to literally slow the Radeon down until it reached the FPS of the 560 Ti, and I couldn't believe it!!! Check for yourself; here is an image, and it's still rotating. It's not crashing or anything; I thought Maya would crash, but no.
Only around 1-2 FPS lower, this Radeon managed to handle around 20x more polys: 2.5 million vs. 51 million. That's in the default viewport, not Viewport 2.0.
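To put those numbers in perspective, here's a back-of-the-envelope throughput comparison (polygons drawn per second ≈ scene polys × FPS); the exact FPS values of 5 and 4 are assumptions picked from the ranges quoted above:

```python
# Illustrative throughput comparison (polys drawn per second ~= polys * FPS).
# The exact FPS values (5 and 4) are assumptions taken from the ranges above.
def throughput(polys, fps):
    return polys * fps

gf = throughput(2.5e6, 5)   # GeForce 560 Ti: ~2.5M polys at ~5 FPS
ra = throughput(51e6, 4)    # Radeon 6950:   ~51M polys at ~4 FPS
print(ra / gf)              # → about 16x the raw polygon throughput
```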
I'm really disappointed with my 560 Ti's performance; really sad to see it choke on relatively simple scenes.
Realtime shaders?
Since you're getting less FPS even with non-shader work on Nvidia consumer cards, which would correlate to even more slowdown with larger scenes: yes, don't get Nvidia consumer cards.
I have a 555M in my new laptop and can't get Blender to work correctly; I have screen issues with menus showing up and stuff. You're telling me this is because nVidia deliberately fucked with the card? MOTHERFUCKERS
It makes me think that performance might be a combination of the particular 3d app you are using + the GPU rather than it just being the GPUs fault.
Right. It has always been the other way around. People having issues with ATI, their shitty drivers, etc. And nVidia being the recommended GPU to use with 3D apps. This sounds like bad drivers on nVidia's part, and it being an issue that will be solved shortly. So far, I have zero issues with my GTX 570.
In fact, I've always turned people away from ATI, because of stability issues I've had with them in the past.
Bull. This issue, at least in the mobile versions, has been present for at least 2 years, as that's how far back I went on drivers. I also tried a 335M and a 540M; BOTH exhibited the same issue. This is NOT GPU-generation dependent.
It is a driver issue. But if you read Nvidia's response on the first page, they have no plans to do anything about it. So unless there is something you're not sharing about your connection with nVidia's inner workings, you're speaking ignorantly.
As far as ATI goes, yes, they did have shit drivers at one point. But in the past few years since AMD took over, the drivers and cards have gotten to the point of surpassing nVidia for our use. The people who parrot the nVidia cards are out of touch.
Andreas. Try whitelisting the exe for Generations in the nVidia menu.
I think if a campaign were organized, it would be best to target both houses (not only Nvidia but also Autodesk).
A line in the sand: supporting the game industry, or claiming to support the game industry, means consumer card support.
I was wanting to upgrade from my MSI 8800 GT 512 MB card soon.
I was looking around on Newegg and saw an XFX Double D HD-695X-CDFC Radeon HD 6950 for $249 after rebate. I've never been a fan of ATI, but this could change.
Does anyone think Nvidia will fix this OpenGL issue soon? I just want to get a DX11 card soon.
This also means the 6xxx series will go down in price very soon.
I have 2 cards, an 8800 GT and a 550 Ti. The 8800 GT works perfectly with all versions of 3ds Max, and it's even faster on the latest 2012. The 550 Ti is faster than the 8800 GT when tested on the older Max 9, but when I run Max 2012, the slow viewport problem is there. So it's a problem between the new cards and new software; I hope it's only a bug in the drivers.
No drivers have fixed this issue for me yet.
GTX 580 3 GB, still low frame rates with higher poly counts in the viewport.
Though using Viewport 2.0 is a lot faster, and necessary when you go over a few million polys.
The 400 series was quite good compared to the 500 series.
I don't think so. If anything, the 600 series may have worse performance due to having less compute power than the 500 series.
http://download.autodesk.com/us/qualcharts/2011/maya2011_qualifiedgraphics_win.pdf
I'm pretty sure I remember seeing 400 series Nvidia cards on some list at some point, but they seem to have a new system where it just shows professional cards.
Even when they did certify consumer hardware, it was only for certain versions of drivers, because the card makers have a habit of breaking things.
I remember just before the 500 series released, Nvidia issued a driver update which broke High Quality mode in Maya. This wasn't an issue for people with 400 series cards, who could just step back a version, but people with 500 series cards had to wait a couple of weeks for a driver update, because the only driver versions that supported their cards had the bug.
As for the 600 series/Kepler not being supported: I'm pretty sure Autodesk just doesn't want anyone bitching at them like with the 500 series. The first Kepler Quadro, the K5000, was only announced a few days ago, so I'm sure it will be supported in time, and any quirks with the Kepler architecture should filter down to the consumer cards. Plus, per the post above, the 600 series seems to be working fine for some people.
I know it isn't on the certified hardware list, but it will work. The 680 cards are what I'm curious about, because Autodesk even specifically mentions they are not compatible with Maya, as stated here: http://usa.autodesk.com/adsk/servlet/index?siteID=123112&id=16820654
The question is what exactly is incompatible about it.
However, it makes no mention of the GTX 580 cards, so presumably they work well enough, just not as well as the Quadros that are on the certified list.