If you're in the US/Canada, NewEgg has had clearance GTX 480s for $210 recently. If you set up an email alert on slickdeals.net you'll get a notification if they go on sale again.
Yep, I get emails from Newegg all the time; they have decent sales.
I'll have to try the Slickdeals email signup, that sounds good.
I was thinking about a GTX 560 Ti in the $200 range, or a little less if I wait.
Thinking all video card prices will drop since the GTX 680 came out this week.
In the same boat as you, Chris; hoping to pick up a DX11 card this week, as my current GPU locks up if I have it on for too long. Hoping the prices on the 400/500 series cards drop soon.
( after all, isn't Metro 2033 the new "will it run Crysis?" since Crysis 2 )
2 gigs of memory?
Sounds incremental when 1.5GB was a roadblock for 3D Vision Surround with Battlefield 3's heavy G-buffer requirement.
I am definitely going to tri-SLI eventually ( just 'cause I am a gear addict and my soul will kill any sleep till I do ).
But I am betting/hoping that there will be a 3GB version eventually, just like with the 580s, and I will upgrade as soon as EVGA comes out with that version.
( I usually prefer reference boards for third-party waterblock fun )
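For anyone wanting to sanity-check why surround resolutions chew through VRAM so fast, here's a rough back-of-envelope G-buffer estimate. This is a simplified sketch with assumed numbers (four 4-byte render targets plus depth), not Frostbite's actual layout, and the G-buffer is only one slice of total VRAM use next to textures and geometry:

```python
# Rough G-buffer VRAM estimate for a deferred renderer.
# Assumed layout (illustrative only): four render targets at
# 4 bytes/pixel plus a 4-byte depth buffer.

def gbuffer_mb(width, height, targets=4, bytes_per_pixel=4, depth_bytes=4):
    """Return approximate G-buffer size in megabytes."""
    pixels = width * height
    total_bytes = pixels * (targets * bytes_per_pixel + depth_bytes)
    return total_bytes / (1024 * 1024)

# Single 1080p monitor vs. a triple-1080p surround setup:
single = gbuffer_mb(1920, 1080)    # ~39.6 MB
surround = gbuffer_mb(5760, 1080)  # ~118.7 MB, triple the pixels
```

Triple the monitors means triple the G-buffer footprint before you even touch texture memory, which is why 1.5GB cards hit a wall in surround long before they do on a single screen.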
I am hoping that the benchmarks are indicative of drivers that are not yet close to leveraging the card's power. Otherwise, was the promise of 4 times the power of Fermi grossly exaggerated? :poly121:
On the other hand, I am still excited about Maxwell, which does not seem that far off in the future.
( still have my fingers crossed that the next generation of consoles has power equal to what is touted for a Haswell/Maxwell combination, considering they will probably both be released within a year of each other )
It would be sad if consoles were crippled performance-wise, comparatively, right out of the gate.
I remember when the 500 series came out, the first set of drivers that supported them didn't allow for "High Quality" rendering mode in Maya (i.e. displaying normal maps), and that didn't get rectified until 2-3 months later.
I think this is a good card, although it was supposed to be the replacement for the 560, but when Nvidia realized it was able to beat/compete with the 7970, they jacked up the price by $200 and named it the 680.
The card they were talking about was the GK110, which was going to be the 680 but will release quite a bit later; like I said in my other post, the GK104 was more than able to compete, so they decided to boost their margins.
Maybe that means the 680 drops by quite a bit when the GK110 releases.
That sounds like the same horror story I have been hearing. :poly122:
For the last 3 months the 560 replacement was supposed to be released first, in the $3xx price range, and was touted with the same performance numbers we are seeing with the 680.
If true, such a flash of contemptuous greed makes me want to puke.
My 2GB Palit GTX 560 runs whatever I want it to... but I'll definitely be checking these out. Maybe I can get a nice upgrade to a 580 for cheap, since the 600 series just came out.
I would love it if the GTX 680 launched at $300, but I don't think it's contemptuous greed on Nvidia's part; it's just the realities of the free market. The 680 is considerably faster than current $300 cards (and even AMD's top card, which is more expensive), and I'm sure 28nm yields aren't that great right now (meaning higher production expenses). Expecting a company to completely upend their current product structure for a new release is a bit crazy. When Intel's Ivy Bridge chips debut, they'll slot into price points similar to those occupied by current Sandy Bridge chips. And if/when Nvidia introduces a larger Kepler chip (one that beats the current 680 in performance), prices will adjust accordingly.
Well, that's Ivy Bridge-E, which is a sort of performance-oriented version, and considering that Sandy Bridge-E (X79 chipset) only launched around the beginning of this year, it isn't too bad.
Ivy Bridge for socket 1155 mobos launches April 29th, with some dual-core versions launching in June.
Still running an Nvidia GeForce 9800 GTX+... I rebuilt this beast about 5 months ago, and the graphics card was the only thing left to do... I'll look into this card as well.
All the incremental numbers coming from Intel and Ivy Bridge leave me cold, considering render-relevant Cinebench numbers never show anything other than incremental results. Even with a 980/990X I see nothing close to exciting that would lead me to believe the added cores from Ivy-E would be worth another $900. I am holding out all my hopes for Haswell!
Performance was tested in a total of 8 games, and Kepler seriously obliterated the GTX 580 in every one of them. The chart shows a performance increase ranging from 70% to 130%, which is some serious improvement. The GTX 780 would surely be a card worth the wait when it arrives in Q2 2012, but the prices would be higher than the AMD counterparts, which are now expected to release on 22nd December 2011. Also, where did the GTX 600 series go? Would Nvidia abandon it like the GeForce 300 series and make it a mobile-only segment?
How long do you think it will be till price drops happen?
Maybe on a GTX 550 Ti or 560 Ti model; I like either.
Just wanting it for DX11 so I can mess around with it in UDK and maybe a little gaming.
Also, I read somewhere about DX 11.1; anyone know much about it?
I'll probably stay with my GTX 580 until the next console generation hits the market and we have another leap in graphics, and even then, only if my current card falls short performance-wise.
I don't see any practical improvement I could gain from these newer cards.
Feels like I would waste a lot of money, since I doubt that even a GTX 780 would support whatever the newest DirectX version will be by then. But I'm open to a surprise.
Considering how much money I might waste... that is a very important consideration.
I am hoping the next generation of consoles requires a major hardware upgrade.
( can the revolution please begin? please? )
And I would buy expecting my next investment to last throughout that next cycle ( even though I would probably upgrade multiple times in that period; I usually keep a renderfarm of boxes in rotation ). Would be a shame if DirectX 12 did not come out till Maxwell.
Otherwise I agree. If Maxwell and the next gen of consoles are both released around 2014 and no other hardware will support the shading languages used...
then why bother?
I would get a few of these cards to replace my 580, but I want to wait for the mega version of this card... kinda like a 590. Will they make a dual-GPU card for this series?
Good, they're finally bringing power consumption down to be more in line with the ATI cards... that damn 1000-watt PSU in the ole XPS did make a dent in my power bill.
So far, from what gameplay I've seen, the card is pretty much useless if you already have a 560 Ti or better on a SINGLE monitor. Most fps gains are about 40, but they're well above 60 already, so it isn't a HUGE visual difference; you just won't hit rock bottom on your min fps.
That depends on what you consider a demanding game that you would be upgrading for in the first place.
If you wanted to assure an ideal average of at least 60fps...
benchmarks showing Metro 2033 far below that at 1920x1080 are a bit depressing.
( as well as some Batman: Arkham City benchmarks I have seen, as well as Crysis Warhead )
Hopefully the Kepler supercard is just hiding till driver optimizations expose its true power.
For an enthusiast nut who prefers multi-monitor and 3D...
this is hardly the card that would finally allow me to run at 6050x1080 without any slowdown at 120Hz or in 3D.
Thinking ahead to a configuration that could easily power a next-generation console requirement...
( the best of what is possible today representing what seems like an incremental upgrade does not inspire confidence in a console generation that might have untold levels of Geomerics ray-traced fidelity, expensive soft dynamic shadows whose equally expensive filtering assures accuracy without acne, or huge bone counts for mega-fun muscle-system fantasies )
Moore's law sure does seem to have hit a huge sticky plasma wall... ( between depressing Ivy numbers and Kepler, I am holding out for Haswell and Maxwell, which do not seem so far off in the future really )
At a standard resolution like 1920x1080 HD, the GTX 680 only represents 6 more frames a second over a GTX 580 in Crysis? ( and still does not reach an ideal rate of 60fps unless you overclock... thank god for overclocking )
A little better distance between the flagship models in Metro 2033:
but at 45 fps, Kepler does not seem to represent the huge advancement in hardware power it was being touted as ( 1.5x - 2x the power of Fermi? )
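To put fps deltas like that in perspective, it helps to convert them to frame times, since a 6 fps gain means very different things at 40 fps versus 100 fps. A quick sketch; the specific fps numbers below are illustrative, not tied to any one benchmark:

```python
# Convert fps to per-frame render time, and show why the same fps
# delta matters more at low framerates than at high ones.

def frame_time_ms(fps):
    """Milliseconds spent rendering each frame at a given fps."""
    return 1000.0 / fps

def speedup(old_fps, new_fps):
    """Relative performance gain, e.g. 1.15 means 15% faster."""
    return new_fps / old_fps

# A hypothetical 580 -> 680 jump from 39 to 45 fps in a heavy title:
gain = speedup(39, 45)                            # ~1.15x, i.e. ~15% faster
saved = frame_time_ms(39) - frame_time_ms(45)     # ~3.4 ms saved per frame

# The same +6 fps at already-high framerates saves far less time:
saved_high = frame_time_ms(100) - frame_time_ms(106)  # ~0.57 ms
```

So a small-looking fps bump in a sub-60fps title is actually where the upgrade matters most; the same delta above 60fps is barely half a millisecond per frame.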
I can understand borderless screens, if such a thing exists, for sure. That would be great. But anything else, especially black... boggles my mind. Maybe for multiplayer, when it's more about performing, I guess? Or a driving game, if the body of the car is situated at the edges of the screens?
I don't think ANY generation of cards lasted as long as the 8800 series did before being dropped.
GeForce 256. In fact, the mass-marketed GeForce2 MX and GeForce4 MX held gaming back a bit and lasted longer in the market; even the piss-poor FX 5200 made GeForce2-era tech last a bit longer, because the FX series was popular and slow at any shader operations. That kept DX7-level hardware supported until sometime like 2007, when it all got shed as UE3 and Orange Box-era Source really started to come around.
http://benchmarkreviews.com/index.php?option=com_content&task=view&id=877&Itemid=72
http://3dvision-blog.com/7516-the-new-nvidia-geforce-gtx-680-kepler-finally-making-an-appearance/
Talking of Ivy Bridge, I read it was delayed until the second half of 2013.
I think this happened because Intel has no competition right now.
Delaying progress just to make more money... :poly118:
At this rate I'll never see space!
Maybe at the end of this year?
I wonder if earlier rumors of a GTX 780 release will actually pan out.
I would certainly jump at the chance for the original chip advertised!
Hmmm...
Will there be a GTX 780 before the end of the year?
I would easily tri-SLI that on a day 1 purchase!!!
http://wccftech.com/leaked-nvidia-generation-performance-slide-pits-upcoming-flagship-gk100-kepler-based-geforce-gtx-780-gtx-580/
Also, now the 560 Ti will drop in price considerably