
Slow viewports in Maya with new Nvidia card!

Geexile
Hello to all,
I've made several posts on different CG/3D software forums trying to find a solution for my problem, but nothing yet. So I'll post here too, and I'm hoping someone can help me :poly121:
I bought a new rig 2-3 days ago: i7 2600K, GF 560 Ti, 16 GB Corsair 1600 MHz, 750W Corsair, Asus P8P67-M Pro, etc.
My problem is with Maya's viewports. They are very slow: just creating a simple cone and smoothing it 4 times by 2 divisions, which gives me around 2.5 mil tris, my frame rate in both default and high quality rendering is between 4-6 FPS, which is very low. It's very laggy. I searched the net for solutions and tried everything; nothing has worked so far, and I tried every possible driver for my GF. I tried the same scene on my old rig with a Radeon 4890 (a card two generations older) and I get 20 FPS. It's like the Maya viewport doesn't even use my GPU. In fact, I uninstalled the Nvidia drivers and tried the same scene with no driver at all, only generic VGA, and got exactly the same result. On the other hand, I did a lot of benchmarking, and my games and scores are exactly what I would expect from this GPU. I tried the same scene in the Max trial version, and again the same thing: 3-4 frames in OpenGL for 2 mil tris, which is just a shame. I reinstalled Windows twice; the second time I formatted the HDD, installed a fresh Windows, installed only the Nvidia drivers and then Maya, and... nothing. Same thing. (If you want to replicate my test scene, there's a rough script at the end of this post.)
I'm very confused, so I need help from Master Yoda! :poly142: Please.
Thanks in advance!
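
Here is a rough sketch of my test scene in Maya Python (just my quick approximation of the steps above, not an exact recipe; exact tri counts may differ on your machine). Paste it into the Script Editor's Python tab:

```python
# Rough repro sketch: build the cone test scene and show the frame-rate HUD.
import maya.cmds as cmds
import maya.mel as mel

cmds.file(new=True, force=True)
cone = cmds.polyCone(name='viewportTestCone')[0]

# Smooth 4 times at 2 divisions each, as described above.
for _ in range(4):
    cmds.polySmooth(cone, divisions=2)

print('Triangles: %d' % cmds.polyEvaluate(cone, triangle=True))

# Toggle the frame-rate HUD (same as Display > Heads Up Display > Frame Rate),
# then tumble the camera and watch the FPS readout.
mel.eval('setFrameRateVisibility(1);')
```

Then just orbit the perspective camera and read the FPS off the HUD.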

Replies

  • Ben Apuna
I thought Nvidia had crippled certain aspects of its OpenGL support with the GeForce 400 and 500 series cards in order to push their pro Quadro cards. That might be what's causing your problems. See what the Blender community has discovered about this issue here. EDIT: or maybe it was this thread. EDIT2: A thread on the issue right here on Polycount.

I remember way back when I used to use Maya, it was always an issue trying to find just the right driver that wouldn't cause viewport errors or performance issues. Have you tried all the different driver versions?

    In any case, good luck!
  • oXYnary
It's more than OpenGL; they screwed with DirectX as well. In Max you can't use DX mode and rotate the view without the models popping around on screen, and that's with simple models too.

I'm never buying Nvidia again. It's switched around: ATI/AMD are the best graphics cards for our use these days, since Nvidia is forcing you to buy Quadros for game art.
  • haiddasalami
@oXYnary: I have a 560 Ti here and nothing is wrong with Maya on my end.

    @OP: Hope you get your problem fixed.
  • Geexile
Yeah, I can totally understand you. I have a 4890 and it gives me 4-5 times better performance and has never given me any problems. Has anyone tried a 6950? I'm thinking of switching to that eventually. If you have one, can you please tell us your frame rate?
OK, so I contacted Nvidia support with a 3-page report :D (that was before I found out about the problem), telling them all my findings and all the tests I did: that a two-generation-old ATI card kicks their latest-generation GeForce in the nuts, and that there is a whole community angry about the last two generations of cards. BTW, truly, when I uninstall my drivers and open Maya with only generic VGA, replicating the same scene gives me the exact same frame rate. Bottom line: we get no GPU support in the viewports at all; it's as if I don't have a GPU. A few minutes ago I got a response from Nvidia. They told me they didn't know about this bug, even though I told them it has been replicated many times by different people and that all 400 and 500 series cards have it. Still, they didn't know about it, and they want help replicating the issue so it can be passed to the Nvidia labs, so I'll be doing that now. If I have any news, I'll reply.
If anyone has a Radeon 6950, please let me know what frame rate you get.
    Thanks
  • oXYnary
haiddasalami, try Max and get back to me.

Geexile, I tried their report form and never got a reply. Hell, I spoke with one of the reps at PAX and he checked my settings and admitted it was a driver issue. But the problem, he said, was that all such things were prioritized for the Quadro. He could put in a request/notice, but he couldn't guarantee any fix. Since new drivers have come and gone since then with the same issue present, I can only assume he didn't follow through, or they ignored him.

I KNOW it's a driver issue because Nitrous (and OpenGL in my case) work fine. You want some really weird shit? Their drivers are so whacked that my card will crash in Max if I have it open too long at full screen on the laptop display. I can watch the card hit 60 degrees Celsius and bam, the card crashes, and I have to reboot to re-enable it. If I don't maximize Max, it won't get as hot. But what throws it for a loop? If I use the external monitor for Max alone, at full resolution, the card doesn't get as hot and doesn't crash.

That's just how badly unoptimized at least the M versions are. Optimus, my ass...
  • Geexile
I see. Well, I'll give it a try, at least for the sake of other people, but I'm switching to ATI in this case while I still have time. I spoke earlier with the shop where I bought my parts (I didn't know what the issue was yet; I guessed it was something technical), and they will exchange the card if I go back within the next 2 days, so I think I'll do that. Meanwhile I'll swap the cards between my two systems and see what happens. Thanks oXYnary
  • Entity
I can confirm the lackluster performance of newer Nvidia cards. My old 8800 runs circles around the 560 Ti I have now, and the ATI 5570 card before that performed like a champ as well! I've run into all sorts of problems in Maya with this card, even after I went through different drivers.
  • Ben Apuna
I remember way, waaaay back there was a way to "hack" GeForce cards into using Quadro drivers. Is this still possible? Has anyone tried it? Does it help with these issues?

    A little googling brings me to this. I'm not sure if this method still works since the guide is 2+ years old now. Unfortunately I don't have a 400 or 500 series card to test it out on.

    EDIT:

Never mind, a little more research says this isn't possible with modern Nvidia cards.
  • m4dcow
    oXYnary wrote: »
I'm never buying Nvidia again. It's switched around: ATI/AMD are the best graphics cards for our use these days, since Nvidia is forcing you to buy Quadros for game art.

    I have a GTX570 and haven't had any problems, but then again I don't use Max and can't specifically attest to anyone else's problems.

While it's probably the case that Nvidia has crippled the GeForce cards for certain things, they at least work most of the time and don't turn into the shitfest that ATI/AMD is with 3D stuff. I know there are plenty of people who have those cards working without issue, but when someone has some crazy, erratic behaviour with a 3D application that ends up being hardware/driver related, more often than not it's AMD/ATI.

Hopefully next generation ATI/AMD can get their driver act together as it relates to 3D development, but there isn't really much incentive for them, so I'll believe it when I see it.
  • Kwramm
No problems here with ATI; dunno where all the hate comes from. The 6970 works stable and fast for me in Maya and Max (so did the 9800 I had years ago in between nVidia cards).
I wish I could help more, but we've mostly got Quadros at work. It's definitely an interesting issue. Sucks on nVidia's part if they really do this on purpose.
  • Geexile
Well, I tried the 560 Ti on two more totally different systems, one AMD-based and one Intel-based, and got the same results. So I think that narrows it down to a driver issue, or Nvidia wanting to get richer, which is just stupid. I don't intend to hand over one bag of money just to be able to work, and then another to test my work in a game engine, or to play games at all; I'm not buying 2 cards just because Nvidia is greedy. I never had a problem with ATI; only once did I have a problem with Marmoset, and a driver rollback fixed it. So I'm going to exchange my 560 Ti for a 6950, which has 2 GB instead of 1 and comes with 4 DisplayPorts, 1 HDMI and 2 DVI ports, so I can use 7 monitors!!! Not like the Nvidia, where I have only 3 ports and can't use more than two monitors; when you plug in a 3rd, the first one goes off.
So thanks to everyone; I'll let you know how it goes with the Radeon.
  • Geexile
@haiddasalami Can you please tell me which Windows you are using? Vista or 7? 64-bit or 32-bit?
    Thanks
  • haiddasalami
    Geexile wrote: »
@haiddasalami Can you please tell me which Windows you are using? Vista or 7? 64-bit or 32-bit?
    Thanks

Windows 7 64-bit, Q6600 and 8 GB RAM. I can double-check the driver version I'm running, but I'm pretty sure I updated to the one BF3 forced me to get.

@oXYnary: I don't use Max as a modeling package; I'm just using it to play with MaxScript and analyze scripts. Haven't had any problems.
  • pinkbox
Try using Maya's Viewport 2.0 if you're on a newer version of Maya; it runs a lot faster than the standard viewport. Though Viewport 2.0 is really only good for displaying your model, not so good for modelling with :( (You can also switch it by script; see the snippet at the end of this post.)

I did the same thing as your scene: a cone smoothed to 2.6 mil polys was getting about 8 FPS in Maya 2012 with a GTX 580. Then I changed the display to Viewport 2.0, and the FPS runs more around the 180 FPS mark.

Even smoothed to 42 mil polys it runs at 15 FPS under Viewport 2.0 :O
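
    If you don't want to dig through the panel menus each time, something like this should do it (just a quick sketch for Maya 2012; panel names can vary on your setup):

    ```python
    # Quick sketch: switch the model panel under focus to Viewport 2.0.
    import maya.cmds as cmds

    panel = cmds.getPanel(withFocus=True)
    if cmds.getPanel(typeOf=panel) == 'modelPanel':
        # 'vp2Renderer' is Viewport 2.0; 'base_OpenGL_Renderer' is the default viewport.
        cmds.modelEditor(panel, edit=True, rendererName='vp2Renderer')
    ```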
  • Geexile
:) Yep, I already tried that. I'm using 2012, and Viewport 2.0 behaves a lot better than 2011's, but that viewport still needs a lot of work: it doesn't support dynamics, hair, a proper poly counter, etc. Not that I use dynamics or hair very frequently, but there's a bigger chance I'll use those things than CUDA or PhysX, so...
Anyway, I'm waiting for my new Radeon to arrive in the next 2-3 days. I already gave back the 560 Ti. When I was investing in such expensive cards I was hoping everything would be smooth, not that I'd have to hunt for workarounds and spend energy on technical stuff, so I decided to try my luck with a Radeon :)
  • pinkbox
Yup, hopefully Viewport 2.0 gets developed further and can be used instead of the standard display.

Let us know how the ATI card runs the same scene :)
  • Geexile
Yep, sure thing :) Before I gave back the 560 Ti I ran benchmarks on all the games and software I currently have, which includes UDK, Mudbox, Maya, Skyrim, Rage and so on, so when I get the new card I'll post my results. ;)
  • mLichy
    I have used Nvidia for a long time and never had any issues. I have a GTX570 now and love it. Runs basically anything I throw at it, on high settings, just fine.

I choose Nvidia over ATI because of how often they update drivers and also add extra functionality/features, like SSAO in games that don't normally support it. Also big performance increases in new drivers for new and/or old games.
  • EmAr
The sad part is that this doesn't seem to be a new issue: http://area.autodesk.com/forum/autodesk-3ds-max/installation---hardware---os/asus-geforce-gtx-480-problems-in-3ds-max-2011/page-1/

I'm planning to get a new rig in February, so I'll probably just go with AMD. I'm a little concerned about the PhysX performance though. Not that I can't play games without it, but it would be interesting to play with as an aspiring game FX artist.
  • Geexile
    Hello folks,
Sorry I didn't reply earlier, but I had a lot of stuff to do.
I have a lot of information to share, starting with a comparison test of both cards; after that I'll give you official info from Nvidia itself :)
First of all, I'm using the exact same system, same Windows, same software. The only things different are the cards and, obviously, the drivers :)
The cards were tested in Maya 2012, Max 2012 (trial version), Blender, Unity, UDK, Mudbox, Photoshop and Marmoset Toolbag. That's the software side; sorry, I wasn't able to test Softimage, Cinema 4D etc., I just don't have them and I'm not familiar with those programs, but all the 3D creation software showed almost the same results. For games I tested Rage, Skyrim, Deus Ex and Dead Space 2; those are all the new games I have on my HDD, I don't have a lot of time to play :)
I'm using default settings for both cards; however, even after a lot of settings changes there is almost no impact on performance within the 3D software.
I'm comparing:
Gigabyte GF GTX 560 Ti, 1 GB, N560OC-1GI Windforce, factory clock
vs.
Asus AMD/ATI Radeon 6950, 1 GB, EAH6950 DCII, factory clock

The price difference is very small, just a couple of bucks in my case.

Same scene tested in Max and Maya:
2.6 mil tris
Default (OpenGL) rendering
Max: GF 3-4 FPS; Radeon 38-39 FPS
Maya: GF 4-6 FPS; Radeon 44-46 FPS
(A rough script for measuring the FPS follows below.)
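
    In case anyone wants to compare numbers the same way: I read the FPS off the viewport, but a rough script like this gives similar ballpark figures by spinning the camera and forcing redraws (my own quick hack, not a proper benchmark):

    ```python
    # Rough FPS measurement: spin the persp camera and force viewport redraws.
    # Quick hack for ballpark numbers only, not a scientific benchmark.
    import time
    import maya.cmds as cmds

    def tumble_fps(frames=100):
        start = time.time()
        for _ in range(frames):
            cmds.rotate(0, 3, 0, 'persp', relative=True, objectSpace=True)
            cmds.refresh(force=True)  # force a full viewport redraw
        return frames / (time.time() - start)

    print('Average: %.1f FPS' % tumble_fps())
    ```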

So there it is: the Radeon is a total beast and brutally kills the GeForce. I tested all the games too; they had almost the same frame rates, with the Radeon 2-3 frames ahead.
Some other pluses for the Radeon: 4 DisplayPorts and 2 DVI (one of which can be adapted to HDMI), which means you can use up to 6 monitors. And because it uses DisplayPort, I was able to free the DVI from my monitor and hook up my PS3 :) However, that's pretty specific to the manufacturer. So far I've tested all the software mentioned at the beginning with driver version 11.12 and everything runs great; I didn't encounter a single glitch. 11.12 looks stable for now. So the bottom line is: if you're looking for a consumer graphics card with great performance in viewports and games, like me, then I would definitely choose the Radeon (why? well, check the scores I posted :) ).
If you constantly use software that needs CUDA, like MARI, or if you want a professional card and have a lot of money to spend, then I guess a Quadro would be better. Anyway, I'm pretty happy with this card.

Nevertheless, I continued to communicate with Nvidia. There are a lot of people who use GeForce, or will use GeForce, so I felt I had to find as much information as possible; if that info helps someone, cool :)
So I helped Nvidia replicate the problems in Maya and Max with detailed information, and we communicated a lot. Here is the last email I received; it's a quote, and it came directly from Nvidia:
    Response via Email 12/15/2011 10:57 AM

    Thanks for the update.
I just got an update from the lab and it appears we were able to easily replicate this poor viewport performance with Maya. In fact we are actually seeing worse fps, only 2-3 fps on our test system. We also compared with Radeon and frankly we look pathetic. I escalated the bug to engineering for review and it was immediately flagged as low priority since this is a consumer card. The GeForce card is not targeted at professional CAD/DCC applications like Maya and 3ds Max. We recommend the Quadro cards for such applications since they are better optimized for those applications and will be much faster. Both the engineering and marketing teams have started to comment on the bug, and so far no one is surprised by the poor fps; this is likely the expected result for a GeForce card.

I'm steering everyone toward the fact that we are so much slower than an AMD/ATI card, which is also a consumer card. I still don't know which way this bug will go: either it's not a bug and this is expected since it's a GeForce card, or they decide that we are too far behind AMD/ATI and invest in more performance tuning. I've submitted a bug for both Maya and 3ds Max, and the low fps have been confirmed. I don't know what's next, but both bugs are still under review. I'll update as soon as there is any new development.

So, if I get an update I'll let everyone know, but it has now been officially confirmed by Nvidia too. If you're going to buy a card, I think it would be safer to go for a Quadro, or, for less money, a Radeon ;)
Hope that helps
    Cheers
  • EmAr
Thanks for the info. I was suspicious when some manufacturers like XFX switched to "AMD only" when the DX11 cards were first coming out. Then nVidia had the lock-up problem, and now this...

    I'm glad you were able to get your problem solved though :)
  • oXYnary
    Geexile wrote: »
    Response via Email 12/15/2011 10:57 AM

    Thanks for the update.
I just got an update from the lab and it appears we were able to easily replicate this poor viewport performance with Maya. In fact we are actually seeing worse fps, only 2-3 fps on our test system. We also compared with Radeon and frankly we look pathetic. I escalated the bug to engineering for review and it was immediately flagged as low priority since this is a consumer card. The GeForce card is not targeted at professional CAD/DCC applications like Maya and 3ds Max. We recommend the Quadro cards for such applications since they are better optimized for those applications and will be much faster. Both the engineering and marketing teams have started to comment on the bug, and so far no one is surprised by the poor fps; this is likely the expected result for a GeForce card.

I'm steering everyone toward the fact that we are so much slower than an AMD/ATI card, which is also a consumer card. I still don't know which way this bug will go: either it's not a bug and this is expected since it's a GeForce card, or they decide that we are too far behind AMD/ATI and invest in more performance tuning. I've submitted a bug for both Maya and 3ds Max, and the low fps have been confirmed. I don't know what's next, but both bugs are still under review. I'll update as soon as there is any new development.

It sounds like you talked to the same guy I did at PAX. He basically said the same thing. I think pointing out that, as game developers, we work on the same consumer hardware as the end user would be prudent to making our case. They seem to want to lump us in with the CAD/CAM and realtime simulation camp. We are not. If only they knew their own audience better, especially us, the creators who make consumers want to purchase Nvidia cards to display all the neato bits...

It's like you almost need to go above tech support and talk to a suit to make them understand they are shooting themselves in the foot. Maybe someone at a bigger company like Valve (which they would ignore, since Valve went ATI). A head honcho calling one of their VPs, explaining how stupid their policy is...

Did you get his direct email or just a general one? Maybe if we start an email campaign just showing how prevalent consumer cards are in our industry, it might help this guy make a case to get the priority changed.
  • Geexile
@oXYnary Nope, I'm communicating with them through the Bug Report System directly on the Nvidia website.
From the beginning this guy let me know that no bug on this topic had been reported in the Nvidia bug database until now. I believe that's because people report poor viewport performance without comparing it to anything, and Nvidia responds by default, without even looking:
"Buy a Quadro", and doesn't even bother to log it as a bug or look at it.
Because they don't use 3D apps, they can't know what frame rate to expect. That's why I emphasized the huge difference from the Radeon card and older GeForce cards (I basically gave them the Radeon results in every mail :) ), which forced them to actually test a Radeon card and see the difference with their own eyes. BTW, from the beginning I let them know that there is a whole community with these problems, some of them making games, the same games consumers buy Nvidia cards for. But I don't think they cared about that as much as they cared about AMD/ATI making them look "pathetic" :) That's not important, though; what's important is to have results. If that works for them and can stimulate them to dig deeper, then cool. I think there will be an answer soon enough. If I have any news I'll let you know.

@EmAr: Thanks :)
  • mezza550
That is so nuts. Thanks so much for doing the research and posting it, Geexile. I'm in the process of returning a GTX 560 Ti too, and I'll replace it with the same card you did.

Talk about lame... I was trying to upgrade from the Quadro FX 3500 I bought in 2007. On my test scene file the GTX 560 Ti performed about the same, if not slightly worse. I was dumbfounded... Benchmarking sites rank the 560 as 6 times faster than my old card. Next time I'll seek out package-specific benchmarks, I think.

Anyhow, thanks again.
  • pinkbox
    Thanks for testing the difference between these 2 cards.

I just submitted a question to Nvidia as well, saying how poor their cards' performance is compared to ATI consumer-level cards in Maya/Max.

    Hopefully they get enough complaints that they do something about their drivers >_<
  • Geexile
Hello, yep, sure, no problem.
I've been working all week with my new Radeon, and 11.12 looks stable enough; no problems whatsoever. I still don't have any news from Nvidia, but if they decide to answer I'll post the answer here.
@mezza550 If you're going to buy the same Radeon and test it the same way, one thing to keep in mind is that immediately after you smooth the object you'll get around 20 FPS for several seconds; keep rotating the camera for a few seconds and the FPS will jump to around 45-47. I don't know why that is; there's probably vertical sync turned on somewhere, but I couldn't find it. So I tried to literally slow the Radeon down until it reached the FPS of the 560 Ti, and I couldn't believe it!!! Check for yourself; here's an image, and it's still rotating, not crashing or anything. I thought Maya would crash, but no.
[screenshot: 1.jpg]

At only around 1-2 FPS lower, this Radeon manages to handle around 20x more polys: 2.5 million vs. 51 million :) And that's in the default viewport, not Viewport 2.0.
  • Entity
Nice to see you're getting good performance out of the Radeon. The trick is to find the right drivers and then you're all set.

I'm really disappointed with my 560 Ti's performance; really sad to see it choke on relatively simple scenes :/
  • dempolys
    If you're solely worried about shaders and don't care about games, does ATI vs Nvidia really matter? Like is one a terrible choice?
  • oXYnary
    dempolys wrote: »
    If you're solely worried about shaders and don't care about games, does ATI vs Nvidia really matter? Like is one a terrible choice?


    Realtime shaders?

You're getting lower FPS even with non-shader work on Nvidia consumer cards, which would correlate to even more slowdown with larger scenes. So yes: don't get Nvidia consumer cards.
  • Andreas
    Ben Apuna wrote: »
I thought Nvidia had crippled certain aspects of its OpenGL support with the GeForce 400 and 500 series cards in order to push their pro Quadro cards. That might be what's causing your problems. See what the Blender community has discovered about this issue here. EDIT: or maybe it was this thread. EDIT2: A thread on the issue right here on Polycount.

I remember way back when I used to use Maya, it was always an issue trying to find just the right driver that wouldn't cause viewport errors or performance issues. Have you tried all the different driver versions?

    In any case, good luck!

I have a 555M in my new laptop and can't get Blender to work correctly; I have screen issues with menus showing up and stuff. You're telling me this is because nVidia deliberately fucked with the card? MOTHERFUCKERS
  • Ben Apuna
Well, the story is kind of strange as far as I can tell. IIRC when Modo 501 came out, people were complaining about ATI's performance and crashing while the people with Nvidias were having great experiences.

It makes me think that performance might be a combination of the particular 3D app you are using plus the GPU, rather than it just being the GPU's fault.
  • Will Faucher
    Ben Apuna wrote: »
Well, the story is kind of strange as far as I can tell. IIRC when Modo 501 came out, people were complaining about ATI's performance and crashing while the people with Nvidias were having great experiences.

It makes me think that performance might be a combination of the particular 3D app you are using plus the GPU, rather than it just being the GPU's fault.

Right. It has always been the other way around: people having issues with ATI, their shitty drivers, etc., and nVidia being the recommended GPU to use with 3D apps. This sounds like bad drivers on nVidia's part, and an issue that will be solved shortly. So far, I have zero issues with my GTX 570.

    In fact, I've always turned people away from ATI, because of stability issues I've had with them in the past.
  • Andreas
I am (was?) an nVidia fanboy, but this recent 'Optimus' bullshit has left me without a camp to go to. I just bought Sonic Generations on Steam, and it won't recognize the card because of it; it insists on using my i7's integrated graphics. It looks fine, but I'm not getting the frame rate I want.
  • oXYnary
    Offline / Send Message
    oXYnary polycounter lvl 18
    Prophecies wrote: »
an issue that will be solved shortly.

Bull. This issue, at least in the mobile versions, has been present for at least 2 years, as that's how far back I went on drivers. I also tried a 335M and a 540M; BOTH exhibited the same issue. This is NOT GPU-generation dependent.

It is a driver issue. But if you read Nvidia's response on the first page, they have no plans to do anything about it. So unless there's something you're not sharing about your connections inside nVidia's workings, you're speaking ignorantly.

As for ATI: yes, they did have shit drivers at one point. But in the past few years, since AMD took over, the drivers and cards have gotten to the point of surpassing nVidia for our use. The people who parrot the nVidia line are out of touch.

Andreas, try whitelisting the exe for Generations in the nVidia menu.
  • claydough
    oXYnary wrote: »
It sounds like you talked to the same guy I did at PAX. He basically said the same thing. I think pointing out that, as game developers, we work on the same consumer hardware as the end user would be prudent to making our case. They seem to want to lump us in with the CAD/CAM and realtime simulation camp. We are not. If only they knew their own audience better, especially us, the creators who make consumers want to purchase Nvidia cards to display all the neato bits...

It's like you almost need to go above tech support and talk to a suit to make them understand they are shooting themselves in the foot. Maybe someone at a bigger company like Valve (which they would ignore, since Valve went ATI). A head honcho calling one of their VPs, explaining how stupid their policy is...

Did you get his direct email or just a general one? Maybe if we start an email campaign just showing how prevalent consumer cards are in our industry, it might help this guy make a case to get the priority changed.

I think if a campaign were organized, it would be best to target both houses (not only Nvidia but Autodesk too).
A line in the sand: supporting the game industry, or claiming to support the game industry, means consumer card support.
  • claydough
I would be fine with AMD, but for a 3D enthusiast it has been equally frustrating waiting for their solution to finally mature, gain support, and work with Eyefinity.
  • chrismaddox3d
After reading this I wonder if I should wait for the 600-series Nvidia cards.
I've been wanting to upgrade from my MSI 8800GT 512 MB card soon.
I was looking around on Newegg and saw an XFX Double D HD-695X-CDFC Radeon HD 6950 for $249 after rebate; I've never been a fan of ATI, but this could change that.
Does anyone think Nvidia will fix this OpenGL issue soon? I just want to get a DX11 card soon.
  • Entity
Chris, the 8800GT is a solid card for 3D work; I see no improvement with a 560 Ti except for gaming. Also, I'd hold off on buying a new card: ATI has just released, or is about to release, their new 7xxx series. From what I've seen that thing is a beast, with a 30-40% increase in performance (new architecture) over their previous flagship cards.

    This also means the 6xxx series will go down in price very soon.
  • ajr2764
Well, this is an interesting read, because I've been putting together a list of parts to build my new system in a few months and I had decided to get a GeForce GTX 550 Ti. I've always leaned toward Nvidia; I may have to do some more research.
  • nikkin
I want to add something important:
I have 2 cards, an 8800GT and a 550 Ti. The 8800GT works perfectly with all versions of 3ds Max, and it's even faster in the latest 2012. The 550 Ti is faster than the 8800GT when tested in the older Max 9, but when I run Max 2012 the slow viewport problem is there. So it's a problem between the new cards and the new software; I hope it's only a bug in the drivers.
  • KonginChains
Can anyone confirm whether the GTX 580 cards still have the same abysmal viewport performance to this date, or has the issue been resolved by a driver update? Otherwise, I guess the next best choice for me would be an ATI FirePro V7900. I could have gotten a Radeon, but I heard the FirePro is better for professional graphics while the Radeon is preferred for gaming. What do you guys think?
  • pinkbox
    @KonginChains

No drivers have fixed this issue for me yet.

The GTX 580 3GB still gets low frame rates with higher poly counts in the viewport, though using Viewport 2.0 is a lot faster and is necessary once you go over a few million polys.
  • gaganjain
On another note:
Peter Zoppi ran a few tests on his new 680 4 GB card; here are his views:
"I did some non-scientific testing with my 680 w/ 4 gigs of RAM. Mudbox seems to handle pretty well in the viewport. I had the default head subdivided up to 32 million polygons and viewport performance was smooth for me. Sculpting speed didn't seem to suffer either, unless I used a large brush at such a high poly count, which I don't do anyway; so for me, Mudbox handles well with the card. I haven't done a ton of work because I've been super busy at work, but it seems to work well.

Maya's regular viewport is a little sluggish with a high-poly mesh. I imported a 4.3 million poly mesh from Mudbox and was getting a few frames per second in the regular viewport; however, Viewport 2.0 is lightning fast with no slowdowns on the same mesh. I'm not sure what more I can test at this point. It seems sufficient for me currently. I had the GTX 580 3GB previously and that served me well."
  • KonginChains
Peter Zoppi's results are great, but they still don't explain why Autodesk labeled the GTX 600-series cards as not compatible with Maya in their certified hardware FAQ. I emailed Autodesk asking that question, but still no reply. I would gladly get the newer GTX 680 over the 580, but it also feels like I'm risking stumbling into a compatibility issue with that card that I'd regret in the long run.
  • Marine
The 580 isn't on the certified hardware list either. Autodesk doesn't officially support consumer cards; never have, AFAIK.
  • PolyHertz
Soooo... the OpenGL problems from the Fermi cards (400/500 series) are mostly solved with Kepler (600 series), then?
  • gaganjain
    PolyHertz wrote: »
Soooo... the OpenGL problems from the Fermi cards (400/500 series) are mostly solved with Kepler (600 series), then?
I don't think so.
The 400 series was quite good compared to the 500 series.
  • pinkbox
    PolyHertz wrote: »
Soooo... the OpenGL problems from the Fermi cards (400/500 series) are mostly solved with Kepler (600 series), then?

I don't think so. If anything, the 600 series may have worse performance due to having less compute power than the 500 series?
  • m4dcow
    Marine wrote: »
The 580 isn't on the certified hardware list either. Autodesk doesn't officially support consumer cards; never have, AFAIK.
They have, as per this link:
    http://download.autodesk.com/us/qualcharts/2011/maya2011_qualifiedgraphics_win.pdf

I'm pretty sure I remember seeing 400-series Nvidia cards on some list at some point, but they seem to have a new system where it only shows professional cards.

    Even when they did certify consumer hardware, it was only for certain versions of drivers, because the card makers have a habit of breaking things.

I remember just before the 500 series released, Nvidia issued a driver update which broke High Quality mode in Maya. This wasn't an issue for people with 400-series cards, who could just step back a version, but people with 500-series cards had to wait a couple of weeks for a driver update, because the only driver versions that supported their cards had the bug.

As for the 600 series/Kepler not being supported: I'm pretty sure Autodesk just doesn't want anyone bitching at them like with the 500 series. The first Kepler Quadro, the K5000, was only announced a few days ago, so I'm sure it will be supported in time, and any quirks of the Kepler architecture should filter down to the consumer cards. Plus, per the post above, the 600 series seems to be working fine for some people.
  • KonginChains
    Marine wrote: »
The 580 isn't on the certified hardware list either. Autodesk doesn't officially support consumer cards; never have, AFAIK.

I know it isn't on the certified hardware list, but it will work. The 680 cards are what I'm curious about, because Autodesk specifically mentions they are not compatible with Maya, as stated here: http://usa.autodesk.com/adsk/servlet/index?siteID=123112&id=16820654
The question is what exactly is incompatible.

However, it makes no mention of the GTX 580 cards, so presumably they work well enough, just not as well as the Quadros on the certified list.