I assume it is still using Y- though? I'd prefer Y+ but I doubt that will change. Not that it is a huge deal.
It's really a DirectX issue. DX wants to use Y-, while OpenGL uses Y+.
I don't know why. Probably Microsoft wanted to defy the standard and make its own instead.
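If you ever need to move a bake between the two conventions, inverting the green channel is all it takes. A minimal sketch, assuming an 8-bit RGB normal map and Pillow (the file names are just placeholders):

    # Convert a DirectX-style (Y-) normal map to OpenGL-style (Y+) by
    # inverting the green channel. Assumes an 8-bit RGB image and Pillow.
    from PIL import Image

    img = Image.open("normal_directx.png").convert("RGB")
    r, g, b = img.split()
    g = g.point(lambda v: 255 - v)  # flip the Y (green) channel
    Image.merge("RGB", (r, g, b)).save("normal_opengl.png")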
Here's an update on the Xnormal / Synced normals pipeline.
We found 2 steps to better sync normal map rendering:
The first step is a setting in xnormal you can change to make it better synced with our rendering:
Click the plug icon on the bottom left
Click Tangent basis calculators tab
Select Mikk - TSpace plugin
Click Configure button
Check the Compute binormal in pixel shader box.
( I will make sure the docs are updated with this)
The second step was a change to our renderer and will be available in the next release. The good news is you can go ahead and make this xnormal change and when the next version of Unreal Engine is available to you, your art will look better.
Could you give a breakdown of what exactly this changes? I'd be interested to know a bit more about the tech behind it rather than just blindly accepting it
The simple explanation is:
Normal maps add per-pixel surface detail to a model by taking a high-poly model, looking at the normals and tangents of the low poly, and baking the detail on top of that. If your target game engine reads the normal map with exactly the same tangent basis as the baking program, there shouldn't be any shading errors (besides baking errors). The problem is there's no complete standard for tangent basis, and most programs will bake and read normal maps in different ways.
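To put the idea of a "tangent basis" in concrete terms: the renderer rebuilds a little coordinate frame from the mesh normal and tangent at each pixel and uses it to turn the normal-map sample into a world-space normal. This is only an illustrative numpy sketch of that idea, not Unreal's or MikkTSpace's exact math:

    import numpy as np

    def decode_sample(rgb):
        # Map an 8-bit RGB normal-map sample from [0, 255] to a vector in [-1, 1].
        return np.array(rgb, dtype=float) / 127.5 - 1.0

    def tangent_to_world(n_ts, normal, tangent, handedness=1.0):
        # Rebuild the bitangent per pixel (roughly what the "compute binormal
        # in pixel shader" option refers to) instead of using a stored one.
        n = normal / np.linalg.norm(normal)
        t = tangent / np.linalg.norm(tangent)
        b = np.cross(n, t) * handedness
        tbn = np.column_stack((t, b, n))  # tangent-basis matrix
        world = tbn @ n_ts
        return world / np.linalg.norm(world)

    # A "flat" pixel (128, 128, 255) should come back as the mesh normal itself.
    print(tangent_to_world(decode_sample((128, 128, 255)),
                           normal=np.array([0.0, 0.0, 1.0]),
                           tangent=np.array([1.0, 0.0, 0.0])))

If the baker builds that frame one way and the engine builds it another, the same pixels get bent into slightly different world-space normals, which is exactly the seam and smoothing errors people are seeing.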
I can't wait until everybody uses the same standard for normal maps. Why has there been no standard set yet?
For the simple reason that most of this tech is developed separately, by different companies, at different times. Look at PBR workflow and terminology right now: We have diffuse, albedo, and base color all describing the same input, depending on the tool you're using.
For better or for worse, Autodesk rules the major 3D market, so we can at least lean on FBX a little bit instead of trying to make do with OBJ. And UE4 is going to take over the world, so everybody will be bent towards Epic's will.
Is this with the new preview build? I tested Quack's meshes with the new "Compute binormal in pixel shader" setting in 4.0.2 - still getting the line where the tris are (like Quack's previous image).
I guess I'll wait for 4.1 to try Modo>Xnormal>UE4 again.
Yeah kind of surprised that there's no real way to easily import uncompressed images into unreal.
Also disappointed that there's no more vertex lighting, for some assets it was pretty useful.
Despite that, I'm still loving UE4, it's such a great improvement, motivated me to mess around with lots of stuff again outside work, thanks Epic!
synergy11, guess it would be interesting for Eat3D or such to make tutorials, but honestly the stuff Epic has put in the documentation will already get you pretty far, it's fairly easy to learn!
Yeah kind of surprised that there's no real way to easily import uncompressed images into unreal.
I'm like 99% sure you can just import uncompressed textures right out of the box. In fact, I'm like 85% sure that you can't import compressed textures at all.
Yeah... specifically I'm referencing the fact that, for example in Unity, you can set a texture's compression type to "truecolour", which is completely uncompressed.
Gotcha, it's something we've discussed doing in the past. I'll bring it up again; it makes a bit more sense now that there's a much broader audience for the engine and a much wider use case.
I'm like 99% sure you can just import uncompressed textures right out of the box. In fact, I'm like 85% sure that you can't import compressed textures at all.
I wasn't talking about the compression of the source file, which is obviously uncompressed, but what Unreal does with it, which is compress it to some DXT format (or BC5 for normal maps, etc.)
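For a bit of context on why BC5 gets used for normal maps: it stores only the X and Y channels, and the shader rebuilds Z, so all the bits go to the two channels that matter. A tiny sketch of that reconstruction (illustrative only, not Unreal's actual shader code):

    import math

    def reconstruct_z(x, y):
        # BC5 keeps only X and Y of a unit-length normal; Z is rebuilt as
        # sqrt(1 - x^2 - y^2), clamped so compression error can't go negative.
        return math.sqrt(max(0.0, 1.0 - x * x - y * y))

    print(reconstruct_z(0.0, 0.0))  # flat pixel -> 1.0
    print(reconstruct_z(0.6, 0.0))  # tilted pixel -> 0.8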
The light baking seems a lot more efficient in UE4. It turned me off at first too, but it's much more flexible and less prone to artifacts/blotchiness than UE3/UDK.
Has the way vertex painting works changed from UDK?
Granted I didn't get much exposure to UDK, but I did hook up a vertex painting material.
Now in UE4, I have to paint on all 3 channels for it to take effect; if I just paint in the red channel, it paints a red color.
My graph, in case I fudged up somewhere.
You are pulling the full color from the vertex color and then adding that to the texture multiplied by the red vertex color channel. The full vertex color is obviously going to be red if you just paint red. Not sure what the intention of that add node is, but I suspect you will get the result you are after if you just remove it (old leftovers?).
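Spelling out the node math with some made-up pixel values (this is just the arithmetic the graph is doing, not UE4 code):

    import numpy as np

    vertex_color = np.array([1.0, 0.0, 0.0])  # painted pure red
    texture      = np.array([0.5, 0.5, 0.5])  # mid-grey texture sample

    with_add    = vertex_color + texture * vertex_color[0]  # current graph
    without_add = texture * vertex_color[0]                 # add node removed

    print(with_add)     # [1.5 0.5 0.5] -> clamps towards red, hence the red tint
    print(without_add)  # [0.5 0.5 0.5] -> the texture shows where red is painted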
I'm running into something strange with screenshots losing saturation. Capturing a high-res screenshot and opening it in Photoshop gives me a capture that seems washed out, like it lost vibrancy. Hitting Print Screen gives me the same. The weirdest part is that using the Snipping Tool gives me the correct-looking image in the preview, but pasting it into Photoshop, or saving it and viewing it in Photoshop or Windows Photo Viewer, it loses its vibrancy as well. Opening a .jpg saved through the Snipping Tool in Internet Explorer or Chrome gives me the correct image. It almost looks like some kind of linear/gamma space thing, but that seems like a stretch.
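If it is a gamma thing, the usual symptom is the sRGB curve getting applied twice somewhere in the capture path (or a colour profile being ignored). A rough illustration of what double-encoding does to a mid grey; this is just the standard sRGB math, nothing specific to Unreal's capture code:

    def linear_to_srgb(c):
        # Standard sRGB encode for a channel value in [0, 1].
        return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

    mid_grey = 0.5
    once  = linear_to_srgb(mid_grey)  # ~0.735, what the display should receive
    twice = linear_to_srgb(once)      # ~0.873, curve applied twice: lifted, washed out
    print(once, twice)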
Not very familiar with UDK and now UE4 but did anyone get this to work: https://wiki.unrealengine.com/Rama's_Vertex_Snap_Editor_Plugin ?
Also, anyone got any tips on actor/model placement and organisation in the scene? All the engines I've tried (CryEngine, Unity, UE4) have a fairly old-school way of object placement, population, copying, etc. Coming from Max, it always feels like I am 10 times faster in Max setting up the same scene than in UE4. Stuff like this: https://www.youtube.com/watch?v=ePmIAoT2p58 + object painting and generally better tools for that would be amazing.
Yay, at last I've finally got me ue4!!
Been working on this for 2 sleepless days, this is my animation originally made in max but I decided to port to ue4 to avoid rendering and go deeper in game development software, but mostly just to avoid rendering lol.
Questions about hair: My character is bald :shifty: Anyone know where I can get some really awesome tutorials for realtime hair with planes, and also any ideas on how I can make that hair dynamic, e.g. something like apex cloth for hair simulation? I really like the hair on the character below made by this guy http://simpmonster.cghub.com/ I think it would look great in ue4, especially with some apex cloth sim to make it dynamic.
And also a tutorial on subsurface scattering in ue4 or udk.
Does anyone have any idea if Epic is planning on having some kind of dynamic hair simulator for ue4, like NVIDIA HairWorks??
I wasn't talking about the compression of the source file, which is obviously uncompressed, but what Unreal does with it, which is compress it to some DXT format (or BC5 for normal maps, etc.)
There actually was a secret way to do it in UE3 - I'm curious if they preserved it in UE4. If you set the LOD group to colorlookup and reimport, it never compresses the texture (or at least, it used to never compress). That's normally reserved for 1D super-small textures, but it's a handy trick for certain other (small) textures. I'm not sure what that would do to a normal map though.
Could someone tell me: if I were to learn this tutorial from Eat3D about the particle system in UDK, could I do the same in Unreal Engine 4? Or would it just strictly apply to UDK?
think digging through there, reading the wiki, and mucking around with it should teach ya a bit! Once you've read some things and followed some tutorials, would recommend opening up the sample content and dissecting the existing particles in them to learn more!
Cascade and the material editor in UE4 are very similar to UE3. I haven't seen that DVD, but I'd think 90% of it would transfer to UE4. The biggest difference is that anything set up in Kismet would be totally different now that UE4 uses Blueprint.
And also a tutorial on subsurface scattering in ue4 or udk.
I'd also like to know this. I've got it working just fine in the material viewer but in game it doesn't behave the same
EDIT: ......nevermind, my setup was right apart from the light source being static, and for some reason it didn't work until I closed and re-opened UE4. Behold some crappy textures that at least work!
Hey, I was sitting at my computer and I noticed something cool.
When the metal part of the pencil is in direct light, the reflection on the table is very alive. When I take it out of the sunlight, the reflection becomes more diffuse or disappears. Is there a way to do this in UE4?
Following on from my post earlier I've been playing around more to make sure everything is working as expected and had to make a few alterations to what I was storing in which map channel.
Thought I'd share my final setup for anyone else wanting to do PBR with SubSurface:
I downloaded the Realistic Room Rendering demo from the marketplace and have been studying how they go about putting the materials together. I noticed, for example, that for the wood floor they used several normal maps to create a nice-looking material.
They used the same wood texture for the doors, but it's a lot flatter because they didn't add as much detail to them as they did to the floors.
I thought it was because they already have the baked normal of the door, but they could just add the extra wood detail within the material editor, right? Also, is there a reason they didn't use a normal created from the actual wood texture itself? It seemed they just used a lot of detail normals instead.
The roughness texture they used was the red channel of the original wood texture, but I thought that with roughness, towards black is glossy and towards white is rough/matte (within UE4). So how come, in the texture, the grooves within the wood grain are darker than the surface?
Replies
Have you tested this with Modo to Xnormal to UE4?:eek:
It's really a DirectX issue. DX wants to use Y-, while OpenGL uses Y+.
I don't know why. Probably Microsoft wanted to defy the standard and make its own instead.
Could you give a breakdown of what exactly this changes? I'd be interested to know a bit more about the tech behind it rather than just blindly accepting it
The simple explanation is:
Normal maps add per-pixel surface detail to a model by taking a high-poly model, looking at the normals and tangents of the low poly, and baking the detail on top of that. If your target game engine reads the normal map with exactly the same tangent basis as the baking program, there shouldn't be any shading errors (besides baking errors). The problem is there's no complete standard for tangent basis, and most programs will bake and read normal maps in different ways.
Here's an extreme-case scenario, but it's a good example http://www.3pointstudios.com/tmp/3point_3psteaser.png
Handplane has a few really good videos about this
https://www.youtube.com/watch?v=m-6Yu-nTbUU
There's another 2 videos that go really well with this.
I can't wait until everybody uses the same standard for normal maps. Why has there been no standard set yet?
For the simple reason that most of this tech is developed separately, by different companies, at different times. Look at PBR workflow and terminology right now: We have diffuse, albedo, and base color all describing the same input, depending on the tool you're using.
For better or for worse, Autodesk rules the major 3D market, so we can at least lean on FBX a little bit instead of trying to make do with OBJ. And UE4 is going to take over the world, so everybody will be bent towards Epic's will.
Probably won't/can't happen. Everyone has unique requirements for platform, texture format, optimization desires, etc...
I guess I'll wait for 4.1 to try Modo>Xnormal>UE4 again.
Fantastic news!
You mean for textures?
Anyone know if Eat3D or 3DMotive are working on some video tutorials for it?
Money waiting to be made here!
Also disappointed that there's no more vertex lighting, for some assets it was pretty useful.
Despite that, I'm still loving UE4, it's such a great improvement, motivated me to mess around with lots of stuff again outside work, thanks Epic!
synergy11, guess it would be interesting for Eat3D or such to make tutorials, but honestly the stuff Epic has put in the documentation will already get you pretty far, it's fairly easy to learn!
I'm like 99% sure you can just import uncompressed textures right out of the box. In fact, I'm like 85% sure that you can't import compressed textures at all.
Hey Jordan,
Yeah... specifically I'm referencing the fact that, for example in Unity, you can set a texture's compression type to "truecolour", which is completely uncompressed.
You can import DDS textures (including with custom mip-maps).
That's a good list!
Granted I didn't get much exposure to UDK, but I did hook up a vertex painting material.
Now in UE4, I have to paint on all 3 channels for it to take effect; if I just paint in the red channel, it paints a red color.
My graph, in case I fudged up somewhere.
You are pulling the full color from the vertex color and then adding that to the texture multiplied by the red vertex color channel. The full vertex color is obviously going to be red if you just paint red. Not sure what the intention of that add node is, but I suspect you will get the result you are after if you just remove it (old leftovers?).
Any ideas?
Also, anyone got any tips on actor/model placement and organisation in the scene? All the engines I've tried (CryEngine, Unity, UE4) have a fairly old-school way of object placement, population, copying, etc. Coming from Max, it always feels like I am 10 times faster in Max setting up the same scene than in UE4. Stuff like this: https://www.youtube.com/watch?v=ePmIAoT2p58 + object painting and generally better tools for that would be amazing.
Been working on this for 2 sleepless days, this is my animation originally made in max but I decided to port to ue4 to avoid rendering and go deeper in game development software, but mostly just to avoid rendering lol.
Questions about hair: My character is bald :shifty: Anyone know where I can get some really awesome tutorials for realtime hair with planes, and also any ideas on how I can make that hair dynamic, e.g. something like apex cloth for hair simulation? I really like the hair on the character below made by this guy http://simpmonster.cghub.com/ I think it would look great in ue4, especially with some apex cloth sim to make it dynamic.
And also a tutorial on subsurface scattering in ue4 or udk.
Does anyone have any idea if Epic is planning on having some kind of dynamic hair simulator for ue4, like NVIDIA HairWorks??
There actually was a secret way to do it in UE3 - I'm curious if they preserved it in UE4. If you set the LOD group to colorlookup and reimport, it never compresses the texture (or at least, it used to never compress). That's normally reserved for 1D super-small textures, but it's a handy trick for certain other (small) textures. I'm not sure what that would do to a normal map though.
http://eat3d.com/advanced_vfx1
https://www.unrealengine.com/blog/a-new-cascade-vfx-tutorial-series
https://wiki.unrealengine.com/Visual_Effects:_Lesson_01:_Material_Particle_Color
https://wiki.unrealengine.com/Visual_Effects:_Lesson_02:_Using_Depth_Fade
also people have been posting tutorials on our forums:
https://forums.unrealengine.com/forumdisplay.php?12-Community-Content-Tools-and-Tutorials
https://forums.unrealengine.com/showthread.php?2263-Tutorial-Creating-a-Simple-Particle-System
think digging through there, reading the wiki, and mucking around with it should teach ya a bit! Once you've read some things and followed some tutorials, would recommend opening up the sample content and dissecting the existing particles in them to learn more!
I'd also like to know this. I've got it working just fine in the material viewer but in game it doesn't behave the same
EDIT: ......nevermind, my setup was right apart from the light source being static, and for some reason it didn't work until I closed and re-opened UE4. Behold some crappy textures that at least work!
When the metal part of the pencil is in direct light, the reflection on the table is very alive. When I take it out of the sunlight, the reflection becomes more diffuse or disappears. Is there a way to do this in UE4?
When viewed from a distance, the reflection was still there.
Isn't specular when it's light from the light source being bounced back?
The pencil needed to be in that area of light to bounce an image of itself onto the table (not the sun).
You can see there's a specular reflection on the pencil when it went outside that strip of light (but not a reflection).
Edit: Here's a quick drawing to better understand my point.
Thought I'd share my final setup for anyone else wanting to do PBR with SubSurface:
Hope it proves useful for someone else
(once again, excuse the crappy test maps)
They used the same wood texture for the doors, but it's a lot flatter because they didn't add as much detail to them as they did to the floors.
I thought it was because they already have the baked normal of the door, but they could just add the extra wood detail within the material editor, right? Also, is there a reason they didn't use a normal created from the actual wood texture itself? It seemed they just used a lot of detail normals instead.
The roughness texture they used was the red channel of the original wood texture, but I thought that with roughness, towards black is glossy and towards white is rough/matte (within UE4). So how come, in the texture, the grooves within the wood grain are darker than the surface?