
Introducing the GeForce GTX 680 GPU

chrismaddox3d polycounter lvl 17
http://www.geforce.com/whats-new/articles/introducing-the-geforce-gtx-680-gpu?sf3580994=1/
http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&N=-1&isNodeId=1&Description=gtx+680&x=0&y=0
I am still on an 8800 GT 512 MB video card myself; this GTX 680 is just looking good.
Anyone plan to get this?
I see Newegg has some for sale and coming soon at the link I posted above.
Wanting to get a DX11 card, but at the $499 range I may get an older DX11 card that should be cheaper now.

Replies

  • 3DLee
    If you're in the US/Canada, NewEgg has had clearance GTX 480s for $210 recently. If you set up an email alert on slickdeals.net you'll get a notification if they go on sale again.
  • chrismaddox3d polycounter lvl 17
    3DLee wrote: »
    If you're in the US/Canada, NewEgg has had clearance GTX 480s for $210 recently. If you set up an email alert on slickdeals.net you'll get a notification if they go on sale again.

    Yep, I get emails from Newegg all the time; they have decent sales.
    I will have to try the Slickdeals email sign-up, sounds good.
    I was thinking about a GTX 560 Ti card in the $200 range, or a little less if I wait.
    Thinking all video card prices will drop since the GTX 680 came out this week.
  • Tokusei polycounter lvl 10
    In the same boat as you, Chris, hoping to pick up a DX11 card this week as my current GPU is locking up if I have it on for too long. Hoping the prices on the 400/500 series cards drop soon ;)
  • Calabi polycounter lvl 12
    I'm tempted by this, but supposedly there will be better cards later, better for compute/CUDA functions at least.
  • claydough polycounter lvl 10
    Still can't play Metro 2033 or Crysis in HD at 60fps?
    http://benchmarkreviews.com/index.php?option=com_content&task=view&id=877&Itemid=72
    http://3dvision-blog.com/7516-the-new-nvidia-geforce-gtx-680-kepler-finally-making-an-appearance/

    [image: Crysis_Warhead_Benchmark.jpg]


    ( after all, isn't Metro 2033 the new "will it run Crysis?" since Crysis 2 )
    [image: Metro-2033_DX11_Benchmark.jpg]

    2 gigs of memory?
    Sounds incremental when 1.5 GB was a roadblock for 3D Vision Surround with Battlefield 3's heavy G-buffer requirement ( rough numbers sketched below ).
    I am definitely going to tri-SLI eventually ( jes cus I am a gear addict and my soul will kill any sleep till I do ).
    But I am betting/hoping that there will be a 3 GB version eventually, just like the 580s, and will upgrade as soon as EVGA comes out with that version.
    ( I usually prefer reference boards for 3rd-party waterblock fun )
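    For a rough sense of why memory gets tight in surround, here is a quick back-of-the-envelope sketch; the render-target count and formats are assumptions for illustration, not Battlefield 3's actual G-buffer layout:

        # Back-of-the-envelope VRAM estimate for a deferred G-buffer at surround resolution.
        # Render-target count and formats below are assumed for illustration; engines vary.
        width, height = 5760, 1080              # 3x 1080p surround, ignoring bezel correction
        pixels = width * height
        gbuffer_targets = 4                      # e.g. albedo, normals, material params, emissive (assumed)
        bytes_per_target = 8                     # RGBA16F (assumed)
        depth_bytes = 4                          # 32-bit depth/stencil
        gbuffer_mb = pixels * (gbuffer_targets * bytes_per_target + depth_bytes) / 2**20
        print(f"G-buffer alone: ~{gbuffer_mb:.0f} MB")   # ~214 MB
        # Add back buffers, HDR/post targets and shadow maps, then textures and geometry,
        # and in stereo 3D several targets are duplicated per eye, so 1.5 GB fills up fast.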

    I am hoping that the benchmarks are indicative of drivers that are not close to leveraging the card's power. Otherwise the promise of 4 times the power of Fermi was grossly exaggerated? :poly121:

    On the other hand, I am still excited about Maxwell, which does not seem that far off into the future.
    ( still have my fingers crossed that the next generation of consoles has power equal to what is touted in a Haswell/Maxwell combination, considering they will both probably be released within a year of each other )
    It would be sad if consoles were crippled performance-wise, comparatively, right out of the gate.
  • m4dcow interpolator
    chrismaddox3d wrote: »
    http://www.geforce.com/whats-new/articles/introducing-the-geforce-gtx-680-gpu?sf3580994=1/
    http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&N=-1&isNodeId=1&Description=gtx+680&x=0&y=0
    I am still on an 8800 GT 512 MB video card myself; this GTX 680 is just looking good.
    Anyone plan to get this?
    I see Newegg has some for sale and coming soon at the link I posted above.
    Wanting to get a DX11 card, but at the $499 range I may get an older DX11 card that should be cheaper now.

    I remember when the 500 series came out, the first set of drivers that supported them didn't allow for "High Quality" rendering mode in Maya (i.e. displaying normal maps), and that didn't get rectified until 2-3 months later.

    I think this is a good card. It was supposed to be the replacement for the 560, but when Nvidia realized it could beat/compete with the 7970, they jacked up the price by $200 and named it the 680.
  • claydough polycounter lvl 10
    Calabi wrote: »
    I'm tempted by this, but supposedly there will be better cards later, better for compute/CUDA functions at least.
    Exactly; mature 28nm will hopefully make for an exciting leap in what will be possible.
  • m4dcow interpolator
    claydough wrote: »
    I am hoping that the benchmarks are indicative of drivers that are not close to leveraging the card's power. Otherwise the promise of 4 times the power of Fermi was grossly exaggerated? :poly121:
    The card they were talking about was the GK110, which was going to be the 680 but would release quite a bit later; like I said in my other post, the GK104 was more than able to compete, so they decided to boost their margins.

    Maybe that means that the 680 drops by quite a bit when the GK110 releases.
  • claydough polycounter lvl 10
    m4dcow wrote: »
    I remember when the 500 series came out, the first set of drivers that supported them didn't allow for "High Quality" rendering mode in Maya (i.e. displaying normal maps), and that didn't get rectified until 2-3 months later.

    I think this is a good card. It was supposed to be the replacement for the 560, but when Nvidia realized it could beat/compete with the 7970, they jacked up the price by $200 and named it the 680.

    That sounds like the same horror story I have been hearing? :poly122:

    For the last 3 months the 560 replacement was supposed to be released first, in the $3xx price range, and it was touted with the same performance numbers we are seeing with the 680.
    If true, such a flash of contemptuous greed makes me want to puke.
  • Kbrom12 polycounter lvl 8
    My 2GB Palit GTX 560 runs whatever I want it to... but I'll definitely be checking these out; maybe I can get a nice upgrade to a 580 for cheap since the 600 series just came out.
  • jipe polycounter lvl 17
    claydough wrote: »
    If true, such a flash of contemptuous greed makes me want to puke.
    I would love it if the GTX 680 launched at $300, but I don't think it's contemptuous greed on Nvidia's part; it's just the realities of the free market. The 680 is considerably faster than current $300 cards (and even AMD's top card, which is more expensive), and I'm sure 28nm yields aren't that great right now (meaning higher production expenses). Expecting a company to completely upend its current product structure for a new release is a bit crazy. When Intel's Ivy Bridge chips debut, they'll slot into similar price points to those occupied by current Sandy Bridge chips. And if/when Nvidia introduces a larger Kepler chip (one that beats the current 680 in performance), prices will adjust accordingly.
  • chrismaddox3d polycounter lvl 17
    http://www.techspot.com/news/47849-ivy-bridge-e-delayed-until-second-half-of-2013.html
    Speaking of Ivy Bridge, I read it was delayed until the second half of 2013.
    I think this happened because Intel has no competition right now.
  • m4dcow interpolator
    chrismaddox3d wrote: »
    http://www.techspot.com/news/47849-ivy-bridge-e-delayed-until-second-half-of-2013.html
    Speaking of Ivy Bridge, I read it was delayed until the second half of 2013.
    I think this happened because Intel has no competition right now.

    Well, that's Ivy Bridge-E, which is a sort of performance-oriented version, and considering that Sandy Bridge-E (the X79 chipset) only launched around the beginning of this year, it isn't too bad.

    Ivy Bridge for 1155 socket mobos launches April 29th, with some dual core versions launching in June.
  • Seirei
    chrismaddox3d wrote: »
    http://www.techspot.com/news/47849-ivy-bridge-e-delayed-until-second-half-of-2013.html
    Speaking of Ivy Bridge, I read it was delayed until the second half of 2013.
    I think this happened because Intel has no competition right now.

    Delaying progress just to make more money... :poly118:

    At this rate I'll never see space!
  • katana polycounter lvl 14
    Still running an NVIDIA GeForce 9800 GTX+... I rebuilt this beast about 5 months ago, and the graphics card was the only thing left to do... I'll look into this card as well.
  • Jesse Moody polycounter lvl 17
    I'm good with my EVGA GTX 570 for a while longer... Maybe in another year or 2...
  • claydough polycounter lvl 10
    All the incremental numbers coming from Intel and Ivy have me cold, considering render-relevant Cinebench numbers never show anything other than incremental results. Even with a 980/990X I see nothing close to exciting that would lead me to believe that the added cores from Ivy-E would be worth another $900. I am holding out all my hopes for Haswell!

    Jesse Moody wrote: »
    I'm good with my EVGA GTX 570 for a while longer... Maybe in another year or 2...
    Maybe at the end of this year??

    I wonder if the earlier rumors of a GTX 780 release will actually pan out?
    I would certainly jump at the chance for the original chip advertised!

    Hmmm...
    Will there be a GTX 780 before the end of the year?
    I would easily tri-SLI that on a day 1 purchase!!!

    http://wccftech.com/leaked-nvidia-generation-performance-slide-pits-upcoming-flagship-gk100-kepler-based-geforce-gtx-780-gtx-580/
    [image: leaked Nvidia performance slide, GK100 / GTX 780 vs GTX 580]
    Performance was tested in a total of 8 games and Kepler seriously obliterated the GTX 580 in every one of them. The chart shows a performance increase ranging from 70% to 130%, which is some serious improvement. The GTX 780 would surely be a card worth the wait when it arrives in Q2 2012, but the prices would be higher than the AMD counterparts, which are now expected to release on 22nd December 2011. Also, where did the GTX 600 series go? Would Nvidia abandon it like the GeForce 300 series and make it a mobile-only segment?
  • chrismaddox3d polycounter lvl 17
    How long do you think it will be till price drops happen?
    Maybe on a GTX 550 Ti or 560 Ti model, either of which I'd like.
    Just wanting it for DX11 so I can mess around with it in UDK and maybe do a little gaming.
    Also, I read somewhere about DX 11.1; anyone know much about it?
  • Seirei
    I'll probably stay with my 580 GTX until the next console generation hits the market and we have another leap in graphics, and even then, only if my actual card falls short performance-wise.

    I don't see any practical improvement I could gain from these newer cards.

    Feels like I would waste a lot of money, since I doubt that even a 780 GTX would support whatever the newest DirectX version will be by then. But I'm open to a surprise. :)
  • claydough polycounter lvl 10
    Seirei wrote: »
    I'll probably stay with my 580 GTX until the next console generation hits the market and we have another leap in graphics, and even then, only if my actual card falls short performance-wise.

    I don't see any practical improvement I could gain from these newer cards.

    Feels like I would waste a lot of money, since I doubt that even a 780 GTX would support whatever the newest DirectX version will be by then. But I'm open to a surprise. :)

    Considering how much money I might waste... that is a very important consideration...
    I am hoping the next generation of consoles requires a major hardware upgrade.
    ( can the revolution please begin? please? )
    And I would buy expecting my next investment to last throughout that next cycle ( even though I would probably upgrade multiple times in that period; I usually keep a renderfarm of boxes in rotation ). It would be a shame if DirectX 12 did not come out till Maxwell?

    Otherwise I agree. If Maxwell and the next gen of consoles are being released at the same time around 2014, and no other hardware will support the shading languages used...
    then why bother?
  • Rabbid_Cheeze
    Still well above the 8000/9000/200 series in both heat and power draw (or so I've read). Will wait for the 660.
  • Cojax polycounter lvl 10
    I would get a few of these cards to replace my 580, but I want to wait for the mega version of this card... kinda like a 590. Will they make a dual-GPU card for this series?
  • Kwramm interpolator
    Good they're finally bringing down power consumption to be more in line with the ATI cards... that damn 1000 watt PSU of the ole XPS did make a dent in my power bill.
  • MainManiac polycounter lvl 11
    You fucking kidding me? I just got my 560 Ti in December.
  • R3D interpolator
    I'll take 50!

    Also, now the 560 Ti will drop in price considerably :D
  • Brendan polycounter lvl 8
    I've heard a lot about the 560Ti. Is it worth picking one of these up as a general purpose thing or waiting for the 600/700 cards?
  • Ace-Angel polycounter lvl 12
    frell wrote: »
    You fucking kidding me? I just got my 560 Ti in December.

    I don't think ANY generation of cards lasted as long as the 8800 series did before being dropped.
  • MainManiac polycounter lvl 11
    So far, from what gameplay I've seen, the card is pretty much useless if you already have a 560 Ti+ on a SINGLE monitor. Most fps gains are about 40, but they're well above 60, so it isn't a HUGE visual difference; you just won't hit rock bottom on your min fps.
  • claydough polycounter lvl 10
    frell wrote: »
    So far, from what gameplay I've seen, the card is pretty much useless if you already have a 560 Ti+ on a SINGLE monitor. Most fps gains are about 40, but they're well above 60, so it isn't a HUGE visual difference; you just won't hit rock bottom on your min fps.

    That depends on what you consider a demanding game that you would be upgrading for in the first place.

    If you wanted to assure an ideal average of at least 60fps...
    Benchmarks showing Metro 2033 far below that at 1920x1080 are a bit depressing.
    ( as well as some Batman: AC benchmarks I have seen, as well as Crysis Warhead :( )
    Hopefully the Kepler supercard is just hiding till driver optimizations expose its true power.

    For an enthusiast nut who prefers multi-monitor and 3D...
    This is hardly the card that would finally allow me to run at 6050x1080 without any slowdown at 120Hz or in 3D ( a rough frame-time budget is sketched at the end of this post ).

    Thinking ahead to a configuration that could easily power a next generation console requirement...
    ( The best of what is possible today representing what seems like an incremental upgrade does not inspire confidence in a console generation that might have untold levels of Geomerics ray-tracing fidelity, expensive soft dynamic shadows whose equally expensive filtering assures accuracy without acne, or huge bone counts for mega fun muscle-system fantasies )

    Moore's law sure does seem to have hit a huge sticky plasma wall... ( between depressing Ivy numbers and Kepler, I am holding out for Haswell and Maxwell, which do not seem so far off into the future really )
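    For reference, a quick frame-time budget sketch of the targets mentioned above; this is plain arithmetic, nothing measured:

        # Frame-time budgets for the refresh/frame-rate targets discussed above.
        def frame_budget_ms(fps):
            """Milliseconds available per frame at a given average frame rate."""
            return 1000.0 / fps

        for label, fps in [("60 fps single screen", 60),
                           ("120 Hz / active 3D, per eye", 120),
                           ("the ~45 fps Metro 2033 result above", 45)]:
            print(f"{label}: {frame_budget_ms(fps):.1f} ms per frame")

        # 60 fps leaves ~16.7 ms per frame and 120 Hz only ~8.3 ms, while a 45 fps average
        # is already spending ~22.2 ms, before adding surround resolution or stereo on top.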
  • claydough polycounter lvl 10
    fer instance:
    [image: Crysis_Warhead_Benchmark.jpg]

    With a standard resolution like 1920x1080 HD, the GTX 680 only represents 6 more frames a second over a GTX 580 in Crysis? ( and still does not reach an ideal rate of 60fps unless you overclock... thank god for overclocking )

    A little better distance between the flagship models in Metro 2033:
    [image: Metro-2033_DX11_Benchmark.jpg]

    But at 45 fps, Kepler does not seem to represent the huge advancement in hardware power it was touted as ( 1.5x - 2x the power of Fermi? )
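    To put that delta next to the touted multiplier, a minimal sketch; the GTX 580 baseline is an assumed round number for illustration, and only the "+6 fps" and "1.5x - 2x" figures come from the discussion above:

        # Rough check of the claimed 1.5x-2x uplift against the delta quoted above.
        # The GTX 580 baseline of 50 fps is an assumed round number, not a measured result.
        gtx580_fps = 50.0                      # assumed Crysis Warhead baseline at 1920x1080
        gtx680_fps = gtx580_fps + 6            # "6 more frames a second" from the chart above
        speedup = gtx680_fps / gtx580_fps
        print(f"Estimated uplift: {speedup:.2f}x ({(speedup - 1) * 100:.0f}% faster)")  # ~1.12x
        print("Touted uplift: 1.5x - 2.0x")
        # A ~1.1x gain at this resolution is well short of 1.5x-2x, which is the point above.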
  • Andreas polycounter lvl 11
    Cannot for the life of me comprehend why people run multiple monitors with the borders running down them... there can't be any immersion at all.
  • MainManiac polycounter lvl 11
    Many Eyefinity monitors have very thin borders.
  • Andreas polycounter lvl 11
    I can understand borderless screens, if such a thing exists, for sure. That would be great. But anything else, especially black... boggles my mind. Maybe for multiplayer, when it's more about performance, I guess? Or a driving game, if the body of the car is situated at the edges of the screens?
  • leilei polycounter lvl 14
    Ace-Angel wrote: »
    I don't think ANY generation of cards lasted as long as the 8800 series did before being dropped.
    GeForce 256. In fact, the mass-marketed GeForce2 MX and GeForce4 MX held gaming back a bit and made it last longer in the market; even the piss-poor FX 5200 made the GeForce2 last a bit longer, because the FX series was popular and slow at any shader operations, keeping DX7-level hardware support around until sometime like 2007, when that all got shed as UE3 and Orange Box-era Source started to really come around.