I can't name one game that looks a step up from it, one that looks better than Crysis 3, Far Cry, Battlefield 3, Metro, etc.
Personally I think it's the lighting model that's much better, not the actual artwork. Almost all of the E3 demos looked like they were using a physically based lighting model, something that hasn't been widely rolled out in PC games just yet.
Once games like The Witcher 3 are available on both console and PC, you'll see the difference properly.
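For context, the defining property of a physically based lighting model is energy conservation: a surface never reflects more light than it receives, which is what older ad-hoc models routinely got wrong. A minimal sketch in Python of checking that property for a Lambertian diffuse term (illustrative only, not any engine's actual shading code):

```python
import math

def lambert_brdf(albedo: float) -> float:
    """Energy-conserving Lambertian diffuse BRDF: constant albedo / pi."""
    return albedo / math.pi

def hemispherical_reflectance(brdf_value: float, steps: int = 10_000) -> float:
    """Numerically integrate brdf * cos(theta) over the hemisphere.

    For a physically based BRDF this must be <= 1 (energy conservation);
    for Lambert it comes out equal to the albedo.
    """
    total = 0.0
    dtheta = (math.pi / 2) / steps
    for i in range(steps):
        theta = (i + 0.5) * dtheta
        # solid-angle element sin(theta) dtheta dphi; phi integrates to 2*pi
        total += brdf_value * math.cos(theta) * math.sin(theta) * dtheta
    return total * 2 * math.pi

# An albedo-0.8 surface reflects 80% of incoming light, never more:
print(round(hemispherical_reflectance(lambert_brdf(0.8)), 4))  # -> 0.8
```

The division by pi is exactly the kind of normalisation that separates "physically based" from "looks about right"; without it the surface would reflect pi times more energy than it receives.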
It's not that great. Take a look at the screenshots from the 'gameplay' thus far (and bear in mind this is demo gameplay, not real gameplay, so it will look much better than the final game):
The lighting and shadowing is not great; it's adequate, but very muddy. Texturing is fairly high resolution, but also quite dull. There's no real indication of a decent global illumination setup, nor any kind of reflection setup (look at the nasty generic cube mapping on the reflective panel on the left). The image is also so heavily post-processed that it's causing artefacting around the emissive structures (which do not cast any light).
It's a good-looking game, that's for sure, but it's hardly beyond anything we already have. From a technical perspective, it's still behind. Its saving graces are the quantity of polygons, light sources and particles being thrown around.
You know I had to. Max settings Crysis 3, in-game. This is what we have right now. But next gen is not only about graphics; it's about how comfortable it is to actually just put in a game and start playing without installing this and that.
However, from a graphics standpoint, I don't think the new consoles will be years ahead of a high-end PC. Heck, I even think they're behind.
Who cares about graphics? I care about gameplay, and that has stagnated for years.
I don't know if it's because imaginations or abilities have declined, but we're still playing dull games where you wander down corridors and your only form of expression is to shoot and kill things. That's what most of E3 was; no wonder gaming is declining.
We've had this discussion for years. Every E3, someone watches a trailer of a game he doesn't like and gets tunnel vision while posting about a doomed and stagnant games industry.
If we were to remove the last 10 years of games, we would lose a ton of fantastic games and real progress.
It's largely gameplay that sells for me, which is why my consoles are gathering dust and my games library is increasingly filling with indie boxes instead of AAAs. I love a good looking game, but if it's just more mindless wandering through corridors with a gun, I'll usually pass.
Graphics won't see such an exponential rise in quality. Improvements aren't linear; they're diminishing, and the curve is flattening off in a big way right now.
A sizable chunk of the budget will be going into true 1080p, which to the naked eye is really only discernible when compared side by side on TVs larger than 50". Completely irrelevant to most of the public.
High-end PC is where we'll be: better shader models, better lighting, and better post-processing.
AKA: marginal improvements in graphics.
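The 50" claim can be sanity-checked with the standard 1-arcminute visual acuity rule of thumb: past the distance where a pixel subtends less than an arcminute, extra resolution is invisible. A rough sketch (the acuity figure is a common assumption; individual eyes and content vary):

```python
import math

ARCMINUTE = math.radians(1 / 60)  # approximate limit of 20/20 acuity

def max_useful_distance_inches(diagonal_in: float, horizontal_px: int,
                               aspect=(16, 9)) -> float:
    """Farthest viewing distance (inches) at which individual pixels
    are still resolvable, assuming ~1 arcminute of visual acuity."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)
    pixel_pitch = width_in / horizontal_px
    # small-angle approximation: pitch / distance = ARCMINUTE
    return pixel_pitch / ARCMINUTE

for res in (1280, 1920):
    d = max_useful_distance_inches(50, res)
    print(f"{res}px wide on a 50-inch TV: pixels blur past ~{d / 12:.1f} ft")
```

On a 50" set this puts the 720p cutoff around 10 ft and the 1080p cutoff around 6.5 ft, which is roughly why the difference only shows up on big screens at typical couch distances.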
Using a series of obviously cross-gen screenshots is a very inaccurate comparison.
Compare the latter screens to Gears of War 1.
There's a difference between high fidelity and high aesthetic value. The latter can be achieved without the former. I'd have to say the state of aesthetics from the big publishers is pretty poor; the most used palette is a range of desaturated brown, red, and grey. Fidelity can't make a game on its own, and too many have tried to do just that. They've gotten into a habit of creating corridor shooters, putting the majority of their development into making the graphics give people the oohs and aahs. Then they have little room left for providing the player a unique narrative.
I'd gladly sacrifice some visual fidelity if it meant I had the freedom to find a solution to an obstacle or narrative choice that the designers never thought of. Whatever happened to letting the player write their own story as they play through the game? Why does it always have to be shoot the bad guy? Why can't you choose a path that ends the conflict without violence? Why can't you let your conscience determine how to handle a volatile situation in a unique way? Why can't the player defuse a hostage scenario by going in unarmed and negotiating a peaceful surrender? Why can't I just skirt around a combat zone and get to my objective without killing or being killed, just because I didn't do it the way the designers decided I should? You know, using critical thinking to solve a problem instead of just blasting it with a gun all the time.
I get it; I've been making games for almost 15 years now. What I'm saying, I guess, is that I'm tired of the cries of "What about the gameplay?!" as if it's some sort of high-ground, intellectual position to take. It's not.
Ugly games don't sell (and no, Minecraft isn't ugly).
Of course gameplay is important. It has the word "game" right in it.
We're paying $400 or $500 for very marginal graphical improvements?
No, it goes beyond graphics, but on the surface, that's what most people perceive. For me, I'm a huge Battlefield 3 fan, but I was always disappointed that the console version not only didn't have all the environmental details, but we also had lower player limits, and fewer objectives, than the PC. The PC could have a 64 man server, but the consoles top out at 24. Many of the PC maps have 5 objectives, while the consoles have 3. The PC also gets more vehicles on their maps. These are all because of limitations in the current gen consoles.
BF4 on the next-gen consoles is going to have everything the PC has (or so we're told at the moment). I'm rather excited about that, and that's what I'm paying for. I'm also excited to see what developers will do with the new power under the hood.
I'm slightly disappointed that they weren't designed with more power though. I seem to recall, when the current gen was developed, the GPUs were top of the line (not quite available yet). This time, it seems like the GPUs are closer to a budget gaming PC.
Yes well obviously you have much more refined tastes than the rest of us lowly dudebros who can't appreciate anything but simpleminded linear gameplay and thinly veiled power fantasies. I'm sure it must be frustrating watching the rest of us juvenile manchildren enjoy things you find dull and distasteful.
Luckily for you, Sony and (to some extent) Microsoft are both trying to make inroads with independent developers to bring more diverse and experimental games to their consoles. The last three games I played were The Bridge, Metro Last Light, and Antichamber, all of which I thought were excellent. When we get to the point where I can play a similarly diverse range of games (and production quality) on a console, I'll be very excited, and I do believe that is going to happen in the next generation.
Huh? Sorry, I didn't realise I made a personal insult to dudebros, as you call yourself. I like linear experiences sometimes, but we could see some innovation within that, like better AI, or not taking control away from the player by putting the best bits in a cutscene.
Gameplay is why people play games over and above other media, so what's wrong with having a look at it and attempting to solve some of the hard problems? It could lead to some things that even dudebros can enjoy.
You made the tired argument about how games are all about "wandering down corridors where your only expression is shooting and killing" (corrected for grammar), trying to back it up with the claim that game sales are in decline.
Never mind that both these statements are false and that you're retrospectively changing your complaint; if all you're looking for is improved AI and more player control, then I don't think you have much to worry about going into the next gen.
So fully freeform games with entirely dynamic worlds, no baked lighting, and the fidelity of a Blizzard cinematic are just around the corner?
No, it's more that graphical fidelity is approaching a lower ROI in terms of player interest. That is to say, high fidelity is getting ubiquitous enough that people won't be salivating for the latest GPU melter, because every game has it now. Given the state of the used games market, I think gamers are already itching for a good gameplay experience that they're not getting. People buy AAA blockbusters, finish them in a week or less, and take them back for credit on the next one. Clearly, gameplay has become disposable and isn't compelling enough for the player to put in more than a few hours before they sell it back.
That comparison is like saying Shadow of the Colossus HD was as good as PS3 games could ever look. Far Cry 3 was built for the consoles, and any extra features on PC are just fortunate extras.
There really haven't been many PC-targeted titles, since most target the consoles; the Total War series would be one example.
Games are getting a new target platform, and PC gamers will finally benefit from it.
The difference between DX7 and DX9 was astronomical.
The difference between DX9 and DX11 is largely incremental.
Faked GI vs. real GI... most people couldn't tell the difference unless they're side by side.
1080p vs. 720p... most people couldn't tell the difference unless they're side by side.
MS is still playing the "if you hope to get on quickly in a given calendar month without issues, you'd better have a big-name publisher behind you" angle for their console.
Also, as several other small devs have already pointed out alongside this article, it shows that MS is simply not in touch with its consumers, period.
Then again, unlike Sony (who even sells health insurance and has its own ISP in Japan), MS is in a position where it kind of cannot conduct proper market research.
Let's just hope studios have learnt how to port a game properly... so many good games have suffered from console crap (Metro 2033 springs to mind).
And honestly, how hard is it to release a mouse and keyboard (or similar control device) for consoles? With so many FPS games on console now, I find it mind-boggling that you can't get official devices. Are they just not doing them because it would give those players an unfair advantage?
See: "to some extent"
Microsoft has already stated that they're open to changing their policy on requiring a publisher for independent developers if it becomes a major problem.
So not true. I thought something was wrong when I first started playing The Last of Us; turns out 720p just looks that shitty. Such a shame, because all that great art and those environments would look amazing in 1080p.
I don't doubt an artist with a highly trained eye looking for visual fidelity in backgrounds could spot the difference.
He said "most people". Which is true. My wife couldn't give a toss which flavor of HD we're watching ... she just doesn't care and doesn't see much of a difference.
Most people also won't know that their tv is over scanning that 720p input to something less, and they would not care at all if their input had a criminally high latency.
They also can't tell good art apart from a game with high-res textures and fancy effects.
What is true, though, is that they will notice something different is going on with any improvement they don't understand, even if they can't pinpoint it, or they'll find it super annoying to squint at something blurry in the distance in their 720p games.
1080p is a much needed resolution to finally support by default.
It's not just about resolution either. It's about what our brains can see as recognisable visuals. I first played a 360 on a lowly 480i television and was still blown away by the awesome visuals. Sure the latest jump is less perceptible, but does anybody really think they're not going to notice the nice graphics on the fancy new machines? Even more so when you can only get the latest Halo or whatnot on those 'next-gen systems.'
It will depend a lot on how well your screen scales for 720p. Native 720p sets and HD CRTs should look fine, but a lot of 1080p sets are just terrible at scaling lower resolution signals.
I have my PS3 hooked up via component cables to an HD CRT, and games look perfectly fine on it. I tried hooking it up to a 1080p monitor a few months ago, and any game that was 720p looked much, much worse than it ever had before.
LCD displays don't do well with images that are not at their native resolution; CRTs don't have that issue. But I wouldn't want to switch back to a CRT just for resolution scaling.
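The scaling problem above comes down to arithmetic: 720 to 1080 is a non-integer 1.5x factor, so source pixels can't map evenly onto destination pixels. A toy nearest-neighbour sketch of one scanline shows the uneven duplication (real TVs use bilinear or fancier filters, which trade this unevenness for blur):

```python
def upscale_nearest(row, dst_width):
    """Nearest-neighbour upscale of one row of pixel values."""
    src_width = len(row)
    return [row[i * src_width // dst_width] for i in range(dst_width)]

# Integer 2x scale (like 960 -> 1920): every pixel duplicated uniformly.
print(upscale_nearest([1, 2, 3, 4], 8))  # -> [1, 1, 2, 2, 3, 3, 4, 4]

# Non-integer 1.5x scale (like 720 -> 1080): some pixels get doubled
# and some don't, so edges shimmer; filtering hides that behind blur.
print(upscale_nearest([1, 2, 3, 4], 6))  # -> [1, 1, 2, 3, 3, 4]
```

A native-720p panel or a CRT avoids the mapping entirely, which is why the same signal can look fine there and muddy on a 1080p LCD.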
You're forgetting that these two games are just ports of each other. They're not going to make an entirely new set of prop models and level layouts just for the PC version; they only throw in somewhat better effects, lighting and high-res textures.
edit: of course there are cases when just swapping the shader out for a better one makes a significant difference:
It's a bit hard to judge from just screenshots because you can cherry pick and games very rarely have a screen where there's absolutely no motion, excluding loading screens.
Find a game that supports both DX11 and DX9 (like Tomb Raider), force it into DX9, and you'll notice a huge difference.
Also there's a lot you can do with really strong texture work and lighting. Dear Esther and some parts of Half Life 2 come to mind.
That's the second one, though. I think they improved it for Crysis 3.
The best comparison would be the original Crysis (not the sequels). From what I gather, the 360 port really struggled (it runs at around 22-25 fps), and IMHO the PC version looks a bit better than its successors, likely because it was originally intended to be PC-only.
For comparison, here's the original Crysis on PC. And the screenshot is from 2008, so it's running on age-appropriate hardware.
Anyway, all I wanted to say with the previous image is that you can definitely notice a simple shader swap, so imagine if the polycounts and such get a boost too. You will definitely notice a difference between the Xbox 360 and the Xbox 180.
I mean... the next gen is supposed to look like this:
Which looks a lot nicer than the Xbox 360 screenshots we've been posting. Native 1080p and higher texture resolutions for starters, making it less of a blurry mess. Plus there seems to be much more use of complex shaders, like blending in the snow, rather than having the blend baked into a tiling texture.
It might be a bullshot, sure, but all of these things are possible on current PC hardware, which is essentially what the PS4 and X1 are.
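A per-pixel snow blend of the kind described might look roughly like this (names and the threshold are made up for illustration; a real engine would do this in a pixel shader with texture samples rather than flat colours):

```python
def snow_blend(rock, snow, normal_y, snow_line=0.6):
    """Blend rock and snow colours per pixel by how upward-facing the
    surface is (normal_y = y component of the surface normal, 0..1).

    Unlike a blend baked into a tiling texture, this reacts to the
    actual geometry under every pixel, so snow sits on ledges and
    slides off cliffs no matter how the texture tiles.
    """
    t = (normal_y - snow_line) / (1.0 - snow_line)
    t = max(0.0, min(1.0, t))  # clamp blend factor to [0, 1]
    return tuple(r * (1.0 - t) + s * t for r, s in zip(rock, snow))

rock = (0.35, 0.30, 0.28)   # brownish rock albedo (illustrative)
snow = (0.95, 0.95, 0.97)   # near-white snow albedo (illustrative)
print(snow_blend(rock, snow, 1.0))   # flat ground: pure snow
print(snow_blend(rock, snow, 0.0))   # sheer cliff: pure rock
```

The cost is a few extra shader instructions and texture fetches per pixel, which is exactly the kind of thing the new GPUs have headroom for and the 360/PS3 mostly didn't.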
The only piece that seems worthwhile is the controller, which I will be getting to play games on PC.
"Controller
Now this was a real winner. An almost unqualified improvement over the already excellent 360 controller. The d-pad is vastly improved and the analogue sticks are less rubbery, coming with a neat dip that your thumb can rest in comfortably. The sticks are a little smaller, which may cause issues for people with big hands. However, it’s the triggers that perhaps impress the most, with contoured edges that your forefingers can curl around. There’s also haptic feedback, with the triggers rumbling and resisting when appropriate. Great for driving games like Forza, and potentially interesting for use in action and horror games. One area of concern, though, is that the distance between the triggers and the shoulder buttons is quite large. Arguably it will just be a case of getting used to the difference, and a lot of games won’t find it an issue, but a game of FIFA in which you have to switch between trigger and shoulder buttons quickly and regularly could be a problem."
That was the one thing I really liked about the Xbone: the controller. I really like the idea of having feedback on the trigger buttons, so it might emulate a gun trigger better. I would love it if game makers emulated a gun trigger properly, so that people couldn't spam the trigger as much and it might create trigger fatigue (a good thing).
Replies
If the question is "How amazing are 'console' games going to look 7 years later?"... pretty much like high-end PC games now.
Troll...
C'mon guys, don't be like Kotaku, who complained that the Kinect costs $150 (as if production and developer legroom costs somehow don't exist).
http://ps4on.com/wp-content/uploads/2013/02/killzone4battle.jpg
I do. I don't buy ugly games.
You need both of those things to have a mega hit.
hehe
This is the jump in graphics we'll be getting:
http://techasino.blogspot.ca/2011/03/crysis-2-pc-vs-360-mp-comparison-shots.html
And I for one can't wait to fork out $900 for consoles that offer that staggering amount of visual improvement.
(That was also sarcasm)
^_^
Their vetting process still remains an issue.
But most consumers can't. It's a hugely diminishing return compared to the jump from 480 to 720.
http://m.digitaltrends.com/home-theater/720p-vs-1080p-can-you-tell-the-difference-between-hdtv-resolutions/
Crysis is a far better comparison because it pushes the limits of both.
Oh shit, my bad. That's bad.
http://www.joystiq.com/2013/06/26/xbox-one-kinect-for-windows-sdk-applications-live-now-cost-400/
Yep, that URL: "Kinect 2.0 to drive targeted, integrated Xbox One advertising platform."
It sounds hideous