I'd like to see some tech demos for this that show off things like rooms constructed out of and filled with items made of refractive glass, or a fun house made of warped mirrors. Basically, show us what scenarios/designs it makes possible in games, at its current level of development, that were previously unfeasible.
Also, DX12 is mentioned, but what about Vulkan and OpenGL?
I guess this means we will see Volta GPUs announced soon. GTC maybe?
The Pascal series has been out for almost 2 years at this point, so Nvidia should be releasing something soon. Rumors are all over the place though, and it's really hard to get a clear picture of Nvidia's schedule.
There's no way that'll look decent enough on this gen's hardware, and I'm sure even Volta would be hard pressed to get close. I'm wondering if this is a status/statement kind of thing, meant to become immortalized sort of like Crysis.
To be fair, it was rendered using an "undisclosed number of NVIDIA Volta GPUs". It is rumored to be 3 Volta cards, and at $8k a pop that is well outside the range of reasonable for games. Film/TV studios looking to shave time, energy and money out of their pipelines might be more of the target market.
Not that we can't celebrate this as a big step forward, we can totally do that, but don't expect this to be the next thing everyone is working on.
> To be fair, it was rendered using an "undisclosed number of NVIDIA Volta GPUs". It is rumored to be 3 Volta cards, and at $8k a pop that is well outside the range of reasonable for games.
It was the DGX Station according to Nvidia: 4 Tesla V100 (Volta) cards. The workstation is available in the US for $69,000 according to this source; in the UK it's around £75,000 according to Scan. Don't know why it costs that much more over here, but whatever. Anyway, you're absolutely right, it'll definitely be used in film/TV; this quality in games won't be available for a few more years.
This is why, even on Volta, I expect the Metro Exodus game to not meet expectations when it implements this ray tracing.
Rasterized graphics will still be around for a long time anyway!
Just went to a GDC talk yesterday by Nvidia, talking about their raytracing efforts. It's early yet, but they're showing the writing on the wall.
We're where film rendering was 10 years ago, facing a revolution in rendering, transitioning from raster to rays. A Bug's Life, for example, was a turning point. Some good things to learn from that era, very applicable for us.
Nvidia was saying basically, get ready, this is what's coming. No more light baking, less hand-tuned fakery.
Big takeaway was that this will free up artists, make us happy.
> To be fair, it was rendered using an "undisclosed number of NVIDIA Volta GPUs". It is rumored to be 3 Volta cards, and at $8k a pop that is well outside the range of reasonable for games.
I remember that the first Unreal Engine 3 Samaritan demo originally ran on 3 graphics cards, but a few months later, Epic got it down to one card (although I believe it dropped the render resolution from 2500p to 1080p).
I posted before that real-time ray tracing was already being used in movie production, so it was only a matter of time before it trickled down to games.
> I posted before that real-time ray tracing was already being used in movie production, so it was only a matter of time before it trickled down to games.
When was real-time ray tracing done in movie production, given that this is the first time it's ever been done to decent quality? Do you mean real-time feedback using a ray tracer in movie production, i.e. Redshift or ProRender? Similar words, whole different meaning.
Those demos aren't rendering the whole image with ray tracing... just some passes, layered on top of the game's rasterized output... that's not the same thing Redshift or any other GPU renderer is doing...
> using RTX technology to include real-time ray traced Global Illumination
I would take it with a huge grain of salt that ray tracing is responsible for rendering the full frame, given that quality of output. I took the quote above from the description on the video; it's not a fully ray-traced scene, only a GI pass.
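To make the "only a GI pass" point concrete: a hybrid frame like that is essentially a rasterized image with a ray-traced lighting term folded in at the end. Here's a minimal sketch of that kind of combine; the buffer names, the numbers, and the simple multiply/add compositing are illustrative assumptions on my part, not anything taken from the actual demo:

```python
import numpy as np

# Toy hybrid-rendering composite: the rasterizer produces the usual buffers,
# and the ray tracer only contributes an indirect-lighting (GI) pass that is
# folded into the final shading. Everything here is an illustrative assumption.

H, W = 4, 4  # tiny "frame" for demonstration

albedo = np.full((H, W, 3), 0.5)         # rasterized material color
direct_light = np.full((H, W, 3), 0.8)   # rasterized/analytic direct lighting
ray_traced_gi = np.full((H, W, 3), 0.2)  # denoised indirect-lighting pass from ray tracing

# Combine: direct + indirect lighting, modulated by albedo (a deliberately
# simplified lighting model, just to show where the ray-traced pass slots in)
final = albedo * (direct_light + ray_traced_gi)

print(final[0, 0])  # -> [0.5 0.5 0.5], i.e. 0.5 * (0.8 + 0.2) per channel
```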
Not Volta either. The new specs look crazy too... Excuse me if this is a stupid question, but is NVLink a new alternative for sharing resources between cards? For example, with AFR you flip-flop rendering between cards, but that means the content needs to be loaded into both (or more) of them. With this tech, is that not the case anymore?
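For what it's worth, here's a toy model of the memory-footprint difference that question is getting at: classic AFR keeps a full copy of the scene's resources on every GPU, while a pooled setup over a fast GPU-to-GPU link would in principle only need one copy. The sizes and the "fully pooled" behavior are assumptions for illustration, not confirmed NVLink behavior:

```python
# Toy model only: AFR (alternate frame rendering) duplicates scene resources
# per GPU, whereas a hypothetical pooled setup over a fast link shares one copy.
# The numbers and the pooling assumption are made up for illustration.

scene_resources_gb = 9.0   # textures, geometry, etc. (made-up figure)
num_gpus = 2

afr_total_gb = scene_resources_gb * num_gpus   # each GPU holds its own full copy
pooled_total_gb = scene_resources_gb           # one shared copy, assumed visible to both

print(f"AFR total VRAM used:    {afr_total_gb} GB")
print(f"Pooled total VRAM used: {pooled_total_gb} GB")
```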
It's all based on more floating-point precision. I hope to get the RTX, and it has real-time ray-traced reflections (you'd have to watch the whole talk by Jensen to get that last part).
I hope there will be a tech demo that shows the user freely navigating the scene in a "chaotic" way, like in a game. So far these look like choreographed videos, which kind of defeats the purpose of showing it's "real-time".
The Battlefield V demo was really impressive. I honestly thought we were still 10 years away from seeing these kinds of improved shadows and accurate reflections from ray-tracing integrated into games. The performance hit may be huge, but at least we're already here today.
I'm particularly excited with this being integrated into rendering plugins such as V-Ray.
PC Games Hardware posted their first impressions. According to them, the RTX mode was running on a 2080 Ti at Full HD and came with a lot of performance issues, many of which probably come from the DX12 mode that is needed for the ray-tracing features (and which famously has frame-pacing issues in all Frostbite games).
I've been wondering if real-time ray tracing is finally capable of rendering the refraction of the iris through the cornea. That'd be awesome: we could just model a concave iris and a convex cornea every time, and it would get refracted correctly.
For example, the current best eye material is in Epic's own "DigitalHuman" project they share for UE4. But it's so complex to understand, with its nodes and a material function, and it requires a specific mesh with specific UVs, that it really restricts the artists who use it.
At least Nvidia is mentioning refractions in their tech papers too, which is why I was wondering whether we as artists can finally create eyes for characters more easily.
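Incidentally, the refraction the ray tracer would have to evaluate at the cornea is just Snell's law applied per hit. Here's a small standalone sketch of that math; the cornea index of refraction (~1.376) is the standard physiological figure, while the ray direction and surface normal are made-up toy inputs:

```python
import numpy as np

def refract(incident, normal, n1, n2):
    """Refract a unit incident direction at a surface whose unit normal points
    toward the incoming ray, going from index of refraction n1 into n2
    (vector form of Snell's law). Returns None on total internal reflection."""
    eta = n1 / n2
    cos_i = -np.dot(normal, incident)
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None  # total internal reflection
    cos_t = np.sqrt(1.0 - sin2_t)
    return eta * incident + (eta * cos_i - cos_t) * normal

# Toy example: a view ray hitting the cornea at a slight angle.
# Air is n ~= 1.0, the cornea roughly 1.376; the ray bends toward the normal,
# which is why a modelled concave iris would appear shifted/magnified.
incident = np.array([0.0, -0.4, -0.9])
incident /= np.linalg.norm(incident)   # make it a unit vector
normal = np.array([0.0, 0.0, 1.0])     # surface normal at the hit point

bent = refract(incident, normal, 1.0, 1.376)
print(bent)  # direction of the refracted ray continuing toward the iris
```

The point being: with refraction traced per ray, an artist would only need the geometry (convex cornea, concave iris) and an index of refraction; the apparent shift of the iris would fall out of this math instead of being faked in a node graph.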
I was about to order a 2080 Ti before the release, but now I'm very hesitant. The benchmarks for non-ray-tracing performance are out from various hardware-testing YouTubers, and the improvement between the 10-series top cards and this isn't as good as you would expect from a generational leap... Some games barely show any improvement, while others show 20-40% improvement depending on the title. I'm a little bit disappointed so far.
Replies
> Also, DX12 is mentioned, but what about Vulkan and OpenGL?
> The Pascal series has been out for almost 2 years at this point, so Nvidia should be releasing something soon. Rumors are all over the place though, and it's really hard to get a clear picture of Nvidia's schedule.
AMD announces their own thing.
I'm ready.
I'd imagine the first benefits we see will be in offline rendering, baking tools and simulation tech.
https://www.youtube.com/watch?v=J3ue35ago3Y
https://www.youtube.com/watch?v=tjf-1BxpR9c
> Not that we can't celebrate this as a big step forward, we can totally do that, but don't expect this to be the next thing everyone is working on.
https://www.fxguide.com/featured/epics-unreal-real-time-ray-tracing-gdc-day2-part-2/
> We're where film rendering was 10 years ago, facing a revolution in rendering, transitioning from raster to rays. A Bug's Life, for example, was a turning point. Some good things to learn from that era, very applicable for us.
> Nvidia was saying basically, get ready, this is what's coming. No more light baking, less hand-tuned fakery.
> Big takeaway was that this will free up artists, make us happy.
> I posted before that real-time ray tracing was already being used in movie production, so it was only a matter of time before it trickled down to games.
Guerrilla Games did similar things for Killzone 3 years ago...
https://youtu.be/_29M8F-sRsU?t=3586
http://www.youtube.com/watch?v=XByzjj9JTBM
http://www.youtube.com/watch?v=3jb3flTRykQ
http://www.youtube.com/watch?v=9A81NeQgJFE
I like those area shadows.
I have to get one of those cards.