It doesn't really just 'plug in'; it takes quite a bit of work to integrate an alternative physics engine, since UE3 is already pretty reliant on PhysX.
I suspect UE4 is largely reliant on APEX, although I'd imagine GPU particles would use an alternative as required for console platforms.
One of the most important developments in Unreal Engine 4 is the use of a system where all lighting is dynamic. Every light, shadow, and bounce. By doing this, it will no longer be necessary to spend time waiting for lighting to rebuild, and it removes the guesswork of fine-tuning level lighting.
Particles
UE4 looks to change that significantly with a drastically-improved GPU-augmented particle system. The system is capable of simulating over a million particles (compared to previous-generation's hundreds, at best), and is designed to allow real-time design interaction, as well as vector fields.
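None of that comes with source, of course, but the core idea of a vector-field-driven particle update is simple enough to sketch. The C++ below is purely illustrative - VectorField, Particle and UpdateParticles are invented names, and on the GPU the same update would run in a shader with one thread per particle rather than a CPU loop.

```cpp
#include <algorithm>
#include <vector>

// Illustrative sketch only, not Epic's code.
struct Vec3 { float x = 0.f, y = 0.f, z = 0.f; };

// Hypothetical vector field: a 3D grid of velocity samples over a unit cube.
// 'cells' must contain nx * ny * nz entries.
struct VectorField
{
    int nx = 1, ny = 1, nz = 1;
    std::vector<Vec3> cells;

    Vec3 Sample(const Vec3& p) const  // nearest-neighbour; real code would filter
    {
        auto idx = [](float v, int n) { return std::min(std::max(int(v * n), 0), n - 1); };
        return cells[(idx(p.z, nz) * ny + idx(p.y, ny)) * nx + idx(p.x, nx)];
    }
};

struct Particle { Vec3 pos, vel; float life = 0.f; };

void UpdateParticles(std::vector<Particle>& particles, const VectorField& field, float dt)
{
    for (Particle& p : particles)
    {
        Vec3 push = field.Sample(p.pos);  // the vector field steers the particle
        p.vel = { p.vel.x + push.x * dt, p.vel.y + push.y * dt, p.vel.z + push.z * dt };
        p.pos = { p.pos.x + p.vel.x * dt, p.pos.y + p.vel.y * dt, p.pos.z + p.vel.z * dt };
        p.life -= dt;
    }
}
```

The interesting part isn't the maths, it's that with the whole update living on the GPU, per-particle cost becomes cheap enough to push counts into the millions.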
Kismet
Unreal Kismet is being evolved to a far more powerful system. Epic claims that you will be able to create a mod entirely using the updated visual scripting system. The next generation of Kismet now allows for scripting of object behaviors, as well as the previous functionality for levels. [...] For those wishing to customize further, programmers can click on a property and edit the C++ code directly, with no rebuild time required.
Unrealscript/C++
In the past, gameplay code existed in UnrealScript. UnrealScript is the scripting language which forms the core of current community mods, and much of the gameplay code of all previous Unreal Engine titles.
However, UnrealScript is being removed.
In its place, the engine will be 100% C++, and highly optimized. DLLs will still be supported on PC, but this is a significant change for almost every Unreal developer operating today, whether hobbyist or professional. This may have some very interesting ramifications in their development community.
So supposedly Kismet is getting beefed up. Will be interesting to see if it'll truly allow making custom gameplay.
Indeed, especially as that's the current claim with Kismet in UDK (which is a complete fallacy given that even the Jazz demo had custom nodes written specifically for it!)
I'm really excited for this (naturally for the particle side of things). I'm curious though as to how much functionality we'll lose with materials when using GPU particles. A million particles is nice, but I always believed the power of VFX in UDK lies in the incredible customization & freedom we have with our materials.
If I'm understanding it right, it means we could do a small game-play demo without programmers at all. Using Kismet for the same functionality we currently get from Unrealscript.
It's correct - however you'll be working at roughly the level that UnrealScripters used to: defining vars and functions, and dictating overall flow.
In the past, gameplay code existed in UnrealScript. UnrealScript is the scripting language which forms the core of current community mods, and much of the gameplay code of all previous Unreal Engine titles.
However, UnrealScript is being removed.
In its place, the engine will be 100% C++, and highly optimized. DLLs will still be supported on PC, but this is a significant change for almost every Unreal developer operating today, whether hobbyist or professional. This may have some very interesting ramifications in their development community.
And Eat 3D staffers and purchasers of their UnrealScript DVDs collectively swear.
Not to mention the tens of thousands more that have spent time learning the language. C++ is mind-bendingly tricky, surely this is going to annoy most people?
C++ is not a hard language. Really. It's an urban legend created by Java programmers.
Or I should say it this way: every language is hard once you start delving into its tricks and hacks. And you do, eventually.
For gameplay coding? No. In the long run I think C++ backed by a framework and libraries will be far better than US.
C++ isn't horrific - but it is way more open to people writing hideous, abusive code than UnrealScript is, and pointers take some getting used to. All-in-all though it's more beneficial in the long run, even if I am somewhat attached to UnrealScript (heck, I made my career out of that language).
ok, so it's 2012. Can we have sorted transparency now?
This is something I've never fully understood. From realtime engines to 3D apps, I don't know of any program that has proper transparency sorting. Is there some deep-seated limitation in how geometry is rendered that makes this such an issue? It seems like this should have been solved many years ago.
It's possible, it's just expensive. When you have 100 opaque objects, you just find and deal with the one which is in front. When you have 100 translucent objects, you have to put 100 objects in order, then deal with each of them in turn. I don't know if there are any particularly elegant solutions for that kind of thing yet.
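To make that cost concrete, here's a toy C++ sketch of the difference being described - purely illustrative, with made-up names like DrawItem and DrawScene, not actual engine code:

```cpp
#include <algorithm>
#include <vector>

struct Vec3 { float x, y, z; };
struct DrawItem { Vec3 position; bool translucent; };

void DrawOpaque(const DrawItem&)  { /* submit normally; the depth buffer resolves visibility */ }
void DrawBlended(const DrawItem&) { /* submit with alpha blending enabled */ }

static float DistSq(const Vec3& a, const Vec3& b)
{
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

void DrawScene(std::vector<DrawItem>& items, const Vec3& cameraPos)
{
    // Opaque: submission order barely matters, the Z buffer sorts per pixel.
    for (const DrawItem& item : items)
        if (!item.translucent)
            DrawOpaque(item);

    // Translucent: must be blended back-to-front, so gather and sort every
    // visible translucent item by distance from the camera, every frame.
    std::vector<const DrawItem*> translucent;
    for (const DrawItem& item : items)
        if (item.translucent)
            translucent.push_back(&item);

    std::sort(translucent.begin(), translucent.end(),
              [&](const DrawItem* a, const DrawItem* b)
              { return DistSq(a->position, cameraPos) > DistSq(b->position, cameraPos); });

    for (const DrawItem* item : translucent)
        DrawBlended(*item);
}
```

And even this per-object sort isn't "100% correct": two intersecting translucent meshes, or one big sheet of particles wrapping around another object, can't be ordered properly without splitting them up per pixel, which is where the real expense starts.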
Surprising then that nobody has seen fit to make a bundle by creating some adequately-competent middleware.
That's the thing, it's not just a matter of people not wanting the problem solved or someone not putting the time into it. Sorting translucency 100% correctly hasn't been fast enough in the past. Meaning you could do it, it's just slow.
It's a really complex problem. For example, say you sort correctly, and now you want to do DOF. Fast traditional DOF methods are done in a deferred way: you take the final image and blur it based on depth, but translucency mucks all that up. You have a pixel on the screen that represents multiple depths in your final image, but your Z buffer can only contain one depth, so what do you do? Something is getting blurred incorrectly. Do you blur your translucency separately? If so, you have to decide how many layers you want to do, and then you have to composite it somehow back into your scene. It's easy sometimes to be like "Why doesn't this just work" but it really is a deep problem.
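As a small illustration of that conflict (simplified, with made-up parameters, not any engine's actual DOF code): a deferred DOF pass picks a blur radius per pixel from the one depth it has, which is exactly the assumption translucency breaks.

```cpp
#include <cmath>

// Roughly how a deferred DOF pass decides how much to blur each pixel.
// The whole scheme assumes exactly ONE depth per pixel.
float CircleOfConfusion(float pixelDepth, float focusDepth, float focusRange, float maxBlurRadius)
{
    // Further from the focal plane => larger blur radius, clamped to a maximum.
    float d = std::fabs(pixelDepth - focusDepth) / focusRange;
    return std::fmin(d, 1.0f) * maxBlurRadius;
}

// A translucent pixel might be smoke at 5m blended over a wall at 50m, but the
// Z buffer can only hold one of those depths. Whatever CircleOfConfusion()
// returns for that pixel, one of the two layers gets the wrong amount of blur.
// The usual workaround is to render translucency into its own buffer(s), blur
// those with their own depths, and composite back - which is exactly the
// "how many layers do you afford" decision described above.
```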
Well, any chance for that new order-independent translucency that AMD presented a year or two ago? http://www.youtube.com/watch?v=IjylP1q5BpU (AMD DX11 Demo - Mecha in HD)
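As far as I know that demo is the per-pixel linked-list approach: every translucent fragment is appended to a list for its pixel, and a resolve pass sorts and blends each list at the end. A very rough CPU-side sketch of the idea, with invented names - the real thing lives in DX11 shaders writing to UAVs:

```cpp
#include <algorithm>
#include <vector>

struct Fragment { float depth; float r, g, b, a; };

// One growable fragment list per pixel; the GPU version packs this into a
// single big buffer with per-pixel head pointers (the "linked list" part).
struct OITBuffer
{
    int width = 0, height = 0;
    std::vector<std::vector<Fragment>> perPixel;

    OITBuffer(int w, int h) : width(w), height(h), perPixel(size_t(w) * h) {}

    // Submission: append in any order, no sorting needed at draw time.
    void AddFragment(int x, int y, const Fragment& f)
    {
        perPixel[size_t(y) * width + x].push_back(f);
    }

    // Resolve: sort each pixel's fragments back-to-front, then alpha-blend
    // them over whatever opaque colour is already in 'dest'.
    void ResolvePixel(int x, int y, float dest[3])
    {
        auto& frags = perPixel[size_t(y) * width + x];
        std::sort(frags.begin(), frags.end(),
                  [](const Fragment& a, const Fragment& b) { return a.depth > b.depth; });
        for (const Fragment& f : frags)
        {
            dest[0] = f.r * f.a + dest[0] * (1.0f - f.a);
            dest[1] = f.g * f.a + dest[1] * (1.0f - f.a);
            dest[2] = f.b * f.a + dest[2] * (1.0f - f.a);
        }
    }
};
```

The catch is memory and bandwidth: you don't know up front how many fragments will land on any given pixel, so you have to budget for the worst case, which is a big part of why it hasn't been a drop-in feature for full games.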
It's easy sometimes to be like "Why doesn't this just work" but it really is a deep problem.
I totally agree but I feel like I would personally rather see a new engine that deals with this deep issue rather than adding features like tessellation or fancy dof and particle effects.
It's not like no one is trying to solve the issue - there has been a huge amount of research, both academic and industrial, to find a neat solution to the problem. The caveat is that to date, no one has really found a particularly viable solution suitable for use in a game - although people are getting close (but their techniques often have flaws).
Usually these single-feature demos show off something that is only just viable on the current hardware at the time. Which means that while it's great research, it doesn't always fit within a full rendering package (performance-wise).
I also should make it clear that I'm not saying this problem won't be solved or become better, I just wanted to shed some light on how complex of a problem translucency is.
I think the change to full C++ coding makes a lot of sense - isn't C++ the industry standard (or at least it was)? I know recently that's started to change, but I'd guess nearly all coders in the gaming industry have some experience with it.
I think the change to full C++ coding makes a lot of sense - isn't C++ the industry standard (or at least it was)? I know recently that's started to change, but I'd guess nearly all coders in the gaming industry have some experience with it.
It is, but it's far from being the industry standard when it comes to gameplay programming.
Yes, but it being in C++ means that there's nothing stopping a studio from implementing whichever scripting language they prefer on top of it without much difficulty.
It is, but it's far from being the industry standard when it comes to gameplay programming.
Ok. What is the industry standard when it comes to >>gameplay<< programming?
Lua, Python, C#, Java, Perl?
Guys, you should realize that a language itself is merely a bunch of words whose meaning is defined by the compiler.
What makes a language appropriate (or not) for coding gameplay is the framework around it - in this case, the UE4 libraries built around it.
C++ will prove much better. For everyone, in the long run.
It's good they abandoned UnrealScript. Now they can focus on developing more engine-focused tools. Yes, even for programmers, since it's now built around an industry-standard language.
Yeah, I'd have liked them to choose D, for example, as I like its semantics better than C++, but you can't have everything.
It's been possible for years. You have always been able to fiddle with values in memory while applications are running - there is also the concept of 'just in time' compilation.
I've been doing a bit of research into Realtime Global Illumination. It's kinda a hobby of mine - reading various white papers and watching tech demos, even if I can't really understand too much of the technical programming aspects. :poly122: (Maybe someday!)
I've got a feeling that UE4 is using Geomerics' Enlighten for its realtime radiosity lighting, based on a lot of their GDC 2012 videos using UDK and their deep integration into the engine. Although I guess Autodesk/Illuminate Labs' Beast is integrated the same way, those guys aren't doing the same thing, i.e. realtime/runtime GI.
Anyway, if it is Enlighten or something similar, I hope it doesn't take as much manual setup and preprocessing as their 'Maya plugin workflow' video suggests. I guess it still produces better results than Cry3's LPVs, and is faster and more realistically achievable than voxel cone tracing, even in its current form.
I am hoping that UE4 will be able to get some sharper/higher-res GI than the really low-frequency broad GI that Enlighten appears to produce in realtime, or that they at least couple it with some good Screen Space Directional Occlusion to fill in those details:
Enlighten Realtime Preview GI:
VS.
Baked GI (Also done with Enlighten!):
Either way, I CAN'T WAIT to get my hands on some real-time GI tech. Crysis 2's was fun to play with, to say the least. :thumbup:
That baked GI looks nice - you can easily see why AO is needed in textures when comparing those screenshots. It'll be nice when we can just let the lighting do all the work.
You can have a broad wash of low-frequency GI and use SSAO for contact shadows; no need to spend computing power on a very accurate GI solution which will be lost in dirt and grime anyway.
There's a reason they decided to go with this wide open desert scene. Try this with an indoor scene and you will see why you also need the high and medium frequency GI. BF3 didn't really sell me on the indoor environments because of this.
I also wish they had glossy reflections and indirect specular.
Compare this to the video I posted earlier with voxel cone tracing and prepare to have your jaw drop.
Yeah, I checked that voxel cone tracing vid. It looks impressive on the Sponza atrium, but does not hold up at all on the last (complex) scene. The result seems rather splotchy, unpredictable and full of artifacts.
That's why I'd rather have simple, approximated color bleeding that I can use as a tool, instead of a half-working GI solution that tries to take over everything and leaves me with artifacts to deal with.
It's true that interior scenes are a lot trickier and subtler, but nothing a couple of carefully placed additional lights can't deal with. You can't expect to place only a window light and get a clean, aesthetically pleasing lighting solution in a fraction of a second anyway.
There's a reason they decided to go with this wide open desert scene. Try this with an indoor scene and you will see why you also need the high and medium frequency GI. BF3 didn't really sell me on the indoor environments because of this.
I also wish they had glossy reflections and indirect specular.
Compare this to the video I posted earlier with voxel cone tracing and prepare to have your jaw drop.
On the other hand, I didn't have time to watch every corner while shooting people ;p.
It's still a good trade-off, compilation vs real-time.
So E3 is now called 'Engine' or something. I think this is about the 5th time I've heard that a company has nothing 'new' to show the crowd, but has a surprise for game development.
It's almost as if GDs don't want to make games, and are releasing tools so everyone else does the job for them...
I think that's the standard time slot for the gametrailers episode on Spike TV. Not sure if there's a live stream on the Gametrailers site too, but they always post full episodes a few hours later anyway.
Some info about the episode:
Q: Are we going to see a full reveal of Unreal Engine 4?
Geoff: Yes, we will be unveiling Unreal Engine 4 during E3 week on Spike. I went down to Epic and filmed with Cliff, Tim Sweeney and the whole crew. Full half-hour special on the engine and tech.
Q: F-yes. Hopefully it will be a treat for people more into the development side and the gritty details, not just the "Those graphics are so graphic!". But I guess all that info will come out sooner rather than later anyway.
Geoff: For sure, we are showing off the editor and a lot of the tech. Very in depth, glad to hear you are excited!
You can have a broad wash of low-frequency GI and use SSAO for contact shadows; no need to spend computing power on a very accurate GI solution which will be lost in dirt and grime anyway.
The current implementation of SSAO in games bugs me. It shows up in direct lighting, when it should only show up in shadowed/overcast areas - not anything that involves direct lighting. That, and it can make low polycounts and transparent planes look bad. Every grass clump in Skyrim gets a black shadow under it with SSAO turned on; I'm sure there's probably a way to fix it, but it really annoys me XD Buildings look a lot better, but it doesn't help too much with the landscape.
The current implementation of SSAO in games bugs me. It shows up in direct lighting, when it should only show up in shadowed/overcast areas - not anything that involves direct lighting.
Actually plenty of games have that implemented in their SSAO.
I'd like to see comparison screenshots; I figured any deferred rendering solution should be able to easily fix it. But as far as UDK goes, there's no "fix".
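For what it's worth, the "fix" usually amounts to where the occlusion term is applied in the lighting composite: multiply it into the ambient/indirect contribution only, and leave direct light (which already has shadow maps) alone. A purely illustrative sketch, not UDK's or any particular game's shader code:

```cpp
// Illustrative shading composite showing where SSAO belongs.
struct Color { float r, g, b; };

Color ShadePixel(const Color& albedo,
                 const Color& directLight,    // sun/lamp contribution, already shadow-mapped
                 const Color& indirectLight,  // ambient / low-frequency GI
                 float ssao)                  // 0 = fully occluded, 1 = fully open
{
    // The version that causes the complaint above: AO multiplied over everything,
    // which darkens surfaces sitting in full direct light.
    //   lit = (direct + indirect) * ssao;

    // AO applied to the indirect term only:
    Color lit = {
        directLight.r + indirectLight.r * ssao,
        directLight.g + indirectLight.g * ssao,
        directLight.b + indirectLight.b * ssao,
    };
    return { albedo.r * lit.r, albedo.g * lit.g, albedo.b * lit.b };
}
```

In a deferred renderer this split is easy because direct and indirect terms already live in separate passes; in a forward pipeline it depends on how the shader combines its lighting terms.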
Replies
I can't comment on specific things but I wouldn't assume too much.
QFT. Better transparency support would be great.
Highlights for me:
Lighting
Particles
Kismet
Unrealscript/C++
WANT!
YES! :poly118:
Then it'll probably suit you just fine
Also, is it going to be for all? Or for big developers only?
Hmm, why not? I've been expecting this for a while.
Seems kinda weird for it to be at 1 AM eastern.
And new logo:
beautiful!