
Best Video Card for modeling?

Just curious if a card such as an eVGA or ATI would be better for 3D modeling/rendering than a Matrox? What are some advantages and disadvantages of each? Thanks.

Replies

  • EarthQuake
    Matrox still makes video cards?

    High-end Nvidia or ATI cards are your best bet; Nvidia even just put out a DX10 card. It all depends on how much money you have to spend. Framing it as "Matrox or ATI" is a bit off. Also, I'm assuming by eVGA you mean Nvidia, as eVGA is a third-party vendor for the GeForce series of cards.
  • Richard Kain
    Personally, I'd wait. Vista and DirectX 10 are coming soon, and cards that take advantage of those two platforms are beginning to come out. In another year you will be able to get a decent DirectX 10-optimized card for a reasonable price. If you can't wait that long, get something inexpensive to tide you over. Try one of the more budget-priced ($50 - $75) cards. Black Friday is coming up soon, and a lot of the major electronics retailers are going to have some pretty decent budget cards for cheap.

    The specific manufacturer isn't as important as the chipset the card uses (Nvidia or ATI). There are pros and cons to each, but both are good for modeling/rendering; they both work well with DirectX, which is usually a rendering option in major modeling software. Generally speaking, ATI is more flexible in its rendering options, but its drivers often lag behind. Nvidia is more rigid and doesn't provide as many rendering options, but they update their drivers more often, which often results in more frequent performance gains.

    The more important factor in rendering is memory. The more MB your card has, the more likely it is to perform well. Try to get a 512MB 3D card if you can find one for a reasonable price. More RAM is almost always a good thing for real-time applications, and I've always found that it helps cut down rendering time.
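    For the real-time side, at least, it's easy to see why more VRAM helps: texture memory adds up fast. Here's a rough back-of-the-envelope sketch (the numbers are illustrative, assuming uncompressed RGBA8 textures with full mip chains, not measured on any real card):

    ```python
    def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
        # Approximate VRAM footprint of one uncompressed texture.
        # A full mip chain adds roughly 1/3 on top of the base level
        # (1 + 1/4 + 1/16 + ... approaches 4/3).
        base = width * height * bytes_per_pixel
        return base * 4 // 3 if mipmaps else base

    # How many 1024x1024 RGBA8 textures fit on a 512 MB card?
    budget = 512 * 1024 * 1024
    per_texture = texture_bytes(1024, 1024)  # ~5.3 MB each with mips
    print(budget // per_texture)  # 96
    ```

    In practice the framebuffer, vertex data, and compression change the totals, but the point stands: high-res textures eat VRAM quickly.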
  • Daz
    [ QUOTE ]

    The more important factor in rendering is memory. The more MB your card has, the more likely it is to perform well. Try to get a 512MB 3D card if you can find one for a reasonable price. More RAM is almost always a good thing for real-time applications, and I've always found that it helps cut down rendering time.

    [/ QUOTE ]

    I'm always up for being corrected, but I think this paragraph could be misleading. I'm fairly certain that your video card's RAM isn't used by 3D apps for offline rendering at all; system RAM is.
  • Ryno
    I believe that is correct, Daz. The graphics card handles on-the-fly display, whereas rendering is a calculation handled by the processor and system memory.

    Oh, and DX 10 cards are almost here now: http://www.firingsquad.com/hardware/nvidia_geforce_8800_preview/

    Matrox cards tend to handle 3D quite poorly. Workstation cards are focused on good OpenGL performance and can really push polygons. Unfortunately, we game makers don't always benefit from this, as few of our games use OpenGL; they use DirectX instead. By purchasing a high-end gaming card that runs DirectX well, we can effectively preview complex shaders and see our textures in the 3D package much as they will appear in the game. For those of us working on PC games, we will also be able to playtest our artwork in the game itself on our own machines, since that's exactly what our graphics card is designed for.
  • gamedev
    I'm usually a fan of Nvidia gaming cards. You can still play games on maxed-out settings and enjoy more than decent 3D performance. Nvidia cards and drivers tend to support OpenGL better as well. I'd agree that system RAM is far more important, and that waiting for a DirectX 10 card would be nice. Keep in mind, though, that it will be a while before there is wide support.
  • Joao Sapiro
    buy something small and cheap, like four Nvidia 7950s in SLI :)
  • Sage
    I kinda have a similar question. I'm wondering which ATI and Nvidia graphics families will show normal maps and DirectX 9 shaders in the 3D app's viewport. I currently have a GeForce4 Ti 4400 with 128 MB of DDR RAM, but it won't show DirectX 9 shaders or Cg shaders. I'm looking for something good and cheap that fits in an AGP slot. Thanks in advance.

    Daz is correct: normal graphics cards don't help with rendering. However, there are ray-tracing video cards that cost a few thousand dollars that do help with rendering; I doubt they would be good for gaming. When I bought my workstation I thought having a 3Dlabs Oxygen card would be really nice, since it cost $200. But it sucked for games and had problems showing alpha masks in the viewports. Waste of money.

    Alex
  • Black_Dog
    Sage: a GeForce4 Ti can actually do normal mapping with its fixed-function hardware. If you need to run SM2.0 shaders on top of that, you need an ATI 9x00 or Nvidia 5x00 card or newer. You should be able to find something like a second-hand 9600 at silly-low prices.
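    As a rough summary of the card generations being discussed in this thread (the families and cutoffs below are assumptions for illustration, not an exhaustive or authoritative list), the shader-model support looks something like:

    ```python
    # Rough shader-model support by GPU family, per the advice above.
    # Illustrative only; check a specific card's specs before buying.
    SHADER_MODEL = {
        "GeForce4 Ti": 1.3,      # fixed-function normal mapping, no SM2.0
        "GeForce FX 5x00": 2.0,
        "Radeon 9x00": 2.0,
        "GeForce 6x00": 3.0,
    }

    def runs_sm2(family):
        # True if the family can run Shader Model 2.0 shaders.
        return SHADER_MODEL.get(family, 0.0) >= 2.0

    print(runs_sm2("GeForce4 Ti"))  # False
    print(runs_sm2("Radeon 9x00"))  # True
    ```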

    [ QUOTE ]
    The graphics card handles on-the-fly display, whereas rendering is a calculation handled by the processor and memory.

    [/ QUOTE ]
    Interestingly, there are a few schemes that use graphics cards to accelerate offline rendering. Check out Nvidia's Gelato, for instance.

    I think we'll be seeing more of that kind of thing down the road.
  • Ryno
    Sage, make sure that you are running the most current version of DirectX, which is available from Microsoft's downloads. It looks like any Nvidia card from the 6000 series should handle DX9 fine. I'm not sure about anything previous to that, but I'm seeing GeForce 6800 GTs on Pricewatch for about $135 right now. Personally, I might step up to a 7950 GT (currently $279) when the 8800s hit the street. I'm guessing that within a few months they'll be down to $230ish.

    I've had really good results with Nvidia-based cards for running Max, and have seen some weird stuff with ATIs due to their drivers. But that's just my experience.
  • Motz
    Go above a 6800; you'll thank yourself for spending the extra money. I have a 6800 and it's getting to the stage where it's naggingly slow. But Nvidia has always been a great companion to 3ds Max.
  • Sage
    Thanks for the feedback. I can see normal maps fine with my current card when I play games or render out a scene. The card just won't run real-time shaders that require DirectX 9c or OpenGL higher than 1.1. Do the latest Detonator drivers update those? I can see some of the DirectX 9 shaders in XSI and Max, but not the useful ones for checking how my normal maps look in real time. My card won't run any of the shaders Ben Cloward wrote that require DirectX 9. I did install DirectX 9c, since Warhammer 40K and HL2 required it.


    Alex
  • Daz
    I would definitely concur with you there, mate, but the use of the phrase 'cut down render time' could definitely be misleading in the context of a discussion about graphics cards. To me it certainly suggests non-realtime rendering :)