Albedo is, if I remember correctly, the amount of diffuse light reflected back by a surface, as opposed to self-illuminating surfaces. Damn my English... it's the diffuse color of the object :P
I have a bunch of UIScenes that I am loading background textures into. When they load up they are all pixelated and take a second to "pop" into the actual resolution/quality. How can I get them to load at full quality from the beginning?
I am having a serious problem with skeletal meshes clipping in and out depending on camera position and need to know the workaround. It makes making cinematics the way you want nearly impossible. I have tried updating my ActorX to the latest version, updating drivers, the latest UT3 patches, etc., but still the same story. The only workaround is to set bRotationOnly in the anim tab, but keyframing movement looks unnatural. If anyone knows a workaround for this, I sure would appreciate it. Thanks!
Is there such a thing as TOO much BSP in a scene? Like for an environment, could I make all my buildings bases be BSP and then just have window, door and trim meshes that I place around that BSP to make it decorative?
I'm just curious if this is considered bad form or not.
You mean the CustomLighting node? I've got that, and I use Unreal Tournament 3. It's at the bottom of the material's properties. Look at my images on the previous page to see what I mean.
hey man
Yeah, it's not in UT3 or GOW builds (or any public ones, I don't think); it's some in-house build feature from whoever added it. Or maybe it's an upcoming feature (hopefully!)
Though of course, in a studio, anyone could add custom stuff to it...
aerynSun
You can indeed have too much BSP imo. It's nowhere near as easy to edit and re-iterate as static meshes. There are other, performance related issues for it as well. But all of this still just comes down to studio workflow, and how you are expected to work there. There may indeed be people who create pure BSP levels with minimal static mesh decos, who knows!
It's a map to mask thickness, which you can paint by hand or generate with xNormal, I think.
Hey Neox,
I was just wondering how you went about putting an outline around the whale in your airborn game.
I'm just having a rough time wrapping my mind around how it would work in unreal. I've been trying doing something with a fresnal and a dot to isolate the edges, but I can't figure out how to get a outline to draw around a character.
It's all just UT3, no fancy new features. Lighting is baked in Maya, and I created a shader in Unreal that can read the directional lightmaps and use them in combination with the normal maps. But in those screens it's just one light source with dynamic shadows, really nothing fancy.
I do, yes, but still, it's only a UT3 build we are working with, so we can't do anything more than you can; besides, I can find information better than you can :P
But most of the stuff that defines Airborn is very basic once you get how it works. The shading and outline are really no magic. I'm totally bad at math, so it takes me ages to analyze the formulas behind it, but once I get it, it's mostly very simple stuff.
tiledshot 2 (the 2 states that you want the image to be twice as large as the screen resolution; if you put 1, you get the screen resolution, if you put 3, three times the resolution, and so forth)
tiledshot 6 128 (this is good for post-effect-heavy levels: it will generate a resolution that is 6 times the screen resolution, and overlap an area of 128 pixels for each tile. It creates tiles that overlap a little bit to ensure that fullscreen blurring or distortion post-processing effects will match.)
F9 (if you're in game, I believe this is the default for screenshots without using the console command)
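The arithmetic behind tiledshot is simple enough to sketch in a few lines of Python (the function name is just for illustration):

```python
def tiledshot_size(screen_w, screen_h, factor):
    """Final image size for `tiledshot <factor>`: the shot is rendered
    as tiles and stitched to <factor> times the screen resolution in
    each dimension."""
    return screen_w * factor, screen_h * factor

# e.g. at 1280x720, `tiledshot 2` yields a 2560x1440 image
print(tiledshot_size(1280, 720, 2))  # → (2560, 1440)
```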
Directional lightmaps? How does that work? I'm still a little shady on light maps. I know that for unreal I need to create a second UV set that doesn't have any overlapping UVs. But that's the extent of what I know to do with them. I'm unfamiliar with what else can be done.
I assumed that Unreal just uses this second set to store shadowing information on that particular object.
You light your scene in Maya before hand before taking it into unreal? Or was this project just a special circumstance?
I remember reading somewhere that someone used an ambient occlusion pass render as a light map. I don't know if that makes any sense to you, because it most certainly is over my head.
When I went back and read over the post concerning the outline in Unreal, I came across you discussing the "zbuffer" quite a bit. I guess really the whole discussion was a bit over my head. I'm just a little confused as to how a zbuffer could be utilized. The only way I know how to get a zbuffer render is by checking it on in the Render Settings in Maya and rendering out an image of my model. I'm quite unsure as to how a static render could be used for a model going into Unreal. Is it baked in some way?
When ImSlightlyBored gave his explanation of what to do for the outline, I just got quite lost. :P It sounded like a Texture2DCubeParameter could be used for his 8-way zbuffer? I guess just what nodes to use, and where and how to use them, is really my problem.
You mentioned a "soft alpha" as well. What exactly is a "soft alpha"?
Thank you so much for taking the time to answer all my questions. I really really do appreciate all your help!
I have a seam problem in Unreal that I can't seem to solve. This is a specific mesh normals seam too. I have 2 meshes that, when in Maya, flow together fine in both regular and high quality rendering. I import it into Unreal using the collada exporter, and snap the 2 pieces together. Pop a point light in and boom! Seam.
It's a seam that looks like the normals are inverted, but that just isn't so. Both meshes have the same smoothing groups, which is smoothed tangents. All I have on the material is a constant at .5 plugged into the Diffuse. No normal map or anything.
They are both water tight pieces as well. I've tried not making them water tight, like deleting out the pieces that snap up to one another. I've tried using multiple smoothing groups, by hardening the corners.
Nothing seems to work. It seems that I'm either exporting it wrong or missing a setting on the material. I really have no idea, so I'm hoping someone can assist me with this.
Below are some screens from Maya and UT3
These screens are from my trial run without it being water tight. You can see that in Unreal it looks like the normals on one are going in the opposite direction, which Maya says they are not.
Here are my export settings out of Maya for Collada:
Let me know if you want to see any more screens. Thanks a lot for any assists I can get with this!
How exactly does one go about making a light map in Unreal? I made one in Photoshop (just changed to greyscale and rendered clouds over everything), but for one thing the import keeps saying it is bad (targa format), and I can't find any instructions on how to set it up in the Material Editor.
Also, I am trying to find the render-double-sided option, because I have some meshes you will need to see the inside of. I found something called Forcesdoublesidedshadow; is that it?
Hey, me again. First off, let me say thanks for all the help I have been getting; thanks to you guys this class project has been moving along on schedule. But as with anything, one thing gets fixed and another breaks. This problem I have had since the beginning; it's been nearly a month and I still can't figure it out.
as you can see in the images
I have random faces missing. I checked the normals and they were fine. All over the level I have small parts of meshes missing, or just random faces missing whose normals are just fine: bolts too big to make with normal or displacement mapping, so I made them from meshes, just gone; or dials and piping missing some of their faces.
I thought it was a size thing, like things too small just weren't exporting, but the screw in the first picture is, size-wise, bigger than the player.
I have no idea what is going on, and this is really bad because several important things are suffering from this
I had that problem once before as well. Though for me it was because I beveled certain corners in my geometry without re-sewing the UVs and re-baking my normal map.
You can also check and make sure your normals aren't locked. Just do a quick unlock of them, set them to face and then reapply normal angle to your object. If you have Maya then all of this is under the Normals menu in the Polygons submenu, if you're a Max user then I have no idea what the equivalent of that is. :P
Also, once you do a re-check of your geometry, re-export and re-import into Unreal with a different name. Don't override the previous static mesh. I've had issues where it'll just keep the previous problem even though it had been solved. Re-importing a fresh mesh seemed to help.
The one main thing you can check if it IS a normals issue is to turn on Two Sided on the material like nrek showed previously.
If they show up when you turn on Two Sided, then it's a normals issue; if they don't, check your UVs. If it were me, I'd just redo the UV map on one piece that's messed up to test it really quickly.
Hopefully that helps you.
Also for my previous post as to the problem I was having, I got it solved. It was a light map/geometry issue. The light map res was too low and I had colliding faces in my geometry.
Well, I tried what you suggested and still no luck; none of my tests provided favorable results, so back to square one. Though at this point we can pretty much rule out normals issues, unless there is something more I am missing.
Thanks for the help anyway man, really appreciate it
Well, multiply the sine by 0.5 so it will have half the speed (I think that's the way I did it, not sure right now, I would have to test it; or maybe change the Time input for the sine by lowering it, not really sure now). And as for the brightening-up thing, why not just clamp the result of the sine?
erm if you want to stop a sine wave from going from -1 to 1 and instead have it 0 - 1 (is that what you are talking about? I am not sure.)
multiply it by .5, which effectively halves the range of the wave ( -.5 to .5) then add .5 to it to shift the wave up.
to change a wave's speed, I believe it has the option to do that from within the node itself.
edit: alright so you want just a slide down from 1 to 0 then no change? I think you have to use cosine to start at 1 though I'm not sure.
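A quick sketch of that remapping in plain Python (the same Multiply/Add steps you would build as material nodes; names here are just for illustration):

```python
import math

def remap_sine(t, speed=1.0):
    """Take a sine that swings -1..1 and shift it into 0..1:
    multiply by 0.5 (range becomes -0.5..0.5), then add 0.5."""
    return math.sin(t * speed) * 0.5 + 0.5

# the remapped wave stays inside [0, 1]
samples = [remap_sine(t * 0.1) for t in range(200)]
print(min(samples) >= 0.0 and max(samples) <= 1.0)  # → True
```

The `speed` factor scales the time input, which changes how fast the wave oscillates without touching its range, matching the two separate tweaks discussed above.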
Why not just control this with a Material Instance Actor, Matinee and a scalar parameter? It's cheaper (sine waves are quite expensive) and much more controllable than having to dick around with maths for something that can be done with just one expression and a custom curve.
It has much more than the material itself, tbh. It's sometimes far easier and more efficient to set the material up and then just create the animated sections via a curve.
you control values via curves, so you can have bezier, step, etc, just really easy. Also set them to loop infinitely or play out only once, when the matinee has been told to play.
I believe it also lets you sync up your animated mat with sound fx too, but don't quote me on that. Basically, very useful!
So if I had my lightmaps already as .tga files, how would I make each specific object use them as lightmaps?
As said, you have to create your own shader and use it how you want it to be used (multiplied, added, overlaid, whatever). Best would be to create it with a texture parameter so you can just apply a Material Instance onto your models and then load the lightmaps into the instanced materials. It's not like clicky-clicky, load lightmap; you definitely have to dig a bit deeper into shader creation to build one that does everything you want.
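To make the idea concrete, here is a rough per-pixel sketch in Python of what such a lightmap shader boils down to in the simplest cases. The function and mode names are hypothetical; in the editor these would just be Multiply/Add nodes fed by a TextureSampleParameter2D:

```python
def apply_lightmap(diffuse, lightmap, mode="multiply"):
    """Combine a baked lightmap value (0..1 per channel) with the
    diffuse color, channel by channel, in the chosen blend mode."""
    ops = {
        "multiply": lambda d, l: d * l,
        "add": lambda d, l: min(1.0, d + l),  # clamp to avoid blowout
    }
    op = ops[mode]
    return tuple(op(d, l) for d, l in zip(diffuse, lightmap))

# a mid-grey wall lit by a warm baked lightmap
print(apply_lightmap((0.5, 0.5, 0.5), (1.0, 0.8, 0.6)))  # → (0.5, 0.4, 0.3)
```

Multiply is the classic lightmap combine (shadowed texels darken the diffuse); the other modes are there because, as said, the shader author decides how the map is used.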
How do I make transparency maps in Unreal, like for windows and glass?
You import a 32-bit image, or an image to act as a mask to control transparency, and apply it to the mesh. In the material you have an option to tag it as a material that uses transparency.
Don't know if it's already been made, but would anyone be interested in a master material and instance based off Neox's toon shader? Only, of course, if Mr. Neox will give us his permission.
harryscary - there are several ways to do that... one being to use the image like Lamont said, or you can simply apply a Constant, a Constant3Vector, or a Fresnel with a normal map plugged into it... all into the "Opacity" of the material.
Question:
You talk about "making your own shader". I don't even know where to start with that in UE3. The only option I have, that I'm aware of, within the standard editor that comes with Unreal Tournament, is to create a material based on the shaders available within the engine.
is there some hidden command, or shader editor which most people aren't aware of?
also, would you mind sharing some of your processes on how you got your fake SSS to work?
Thanks Lamont and nfrrty. I had been messing with it, but no luck. When it comes to programs I can usually mess with them for a few hours and figure things out, but this is showing itself more and more to be something where I can't just do that anymore.
I will give what you said a try, hope it works for me
1) A dot product between the inverse surface normal and the light, clamped from 0 to 1, and finally distance attenuated;
You create a dot product, a surface normal (either a normal map or a (0,0,1) vector) and a light vector, and plug those into the dot product. Then create a ConstantClamp, plug in the result of the dot product, and then you can multiply it with a distance-dependent term.
In that case it's really easy to copy, unlike the formula from Valve's TF2 shader, where I had to reconstruct every part of the formula without really knowing what I was doing.
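For the curious, the same step sketched as plain math in Python. The names and the exact attenuation term are assumptions for illustration; any distance-dependent factor does the job:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def clamp01(x):
    return max(0.0, min(1.0, x))

def sss_part1(normal, light_dir, distance, falloff=1.0):
    """Step 1 as described above: dot the *inverted* surface normal
    with the light vector, clamp to 0..1, then attenuate by distance.
    The 1/(1 + d*falloff) term is just a stand-in attenuation."""
    inv_normal = tuple(-c for c in normal)
    wrap = clamp01(dot(inv_normal, light_dir))
    return wrap / (1.0 + distance * falloff)

# a surface facing away from the light gets the full term...
print(sss_part1((0.0, 0.0, 1.0), (0.0, 0.0, -1.0), 0.0))  # → 1.0
# ...and a surface facing the light is clamped to zero
print(sss_part1((0.0, 0.0, 1.0), (0.0, 0.0, 1.0), 0.0))   # → 0.0
```

That inversion is exactly why the term lights up the back side of the object, which is the whole point of the fake SSS.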
1) A dot product between the inverse surface normal and the light, clamped from 0 to 1, and finally distance attenuated;
2) Another half-lambert term between the inverse normal and view, also distance attenuated;
3) Scaling of subsurface color by red, green and blue extinction coefficients.
ok, so i've done parts 1 + 2. the third i don't really understand, but i'm assuming a constant 3 vector, or a texture sample with whatever colour i want the scatter to be, would work?
i've copied your half lambert shader from the TF2 sample you made (thanks ). but, how do i connect them together? and do they end up multiplied into the diffuse channel?
Oh sorry, there is one mistake: you have to invert the surface normal, as written in the text.
I interpreted 3 as the multiply of the SSS color, to colorize the SSS effect,
and then just add that result to the "SSS part 2".
If you have that, plug it into CustomLighting, not Diffuse, and set the material to use custom lighting instead of Phong.
About what you plug where and how to combine: the best thing is to plug each result into CustomLighting first, so you see what is happening. In the case of the "SSS part 1" term, the result should be a white-to-black gradient running from the shadowed part to the lit part. Once you see it, plug in the shading part; you should get a very soft shading curve. Now, what do you want to do with "SSS part 1"? Of course you want to brighten up the parts that are facing away from the light source, so add it onto "SSS part 2" (I'd call that the "shading group" instead). Mathematically that's totally easy: your "SSS part 1" goes (thanks to the clamp) from 0 to 1, so everything that is black won't add to the shading group, and the rest will brighten it up, so the effect of SSS appears.
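The combine step sketched the same way. The half-Lambert curve here is an assumption standing in for "the shading part"; the key point is that the clamped part-1 term can only ever brighten:

```python
def clamp01(x):
    return max(0.0, min(1.0, x))

def half_lambert(n_dot_l):
    """Valve-style half-Lambert: remap -1..1 to 0..1 and square it,
    giving the very soft shading curve mentioned above."""
    return (n_dot_l * 0.5 + 0.5) ** 2

def combine(n_dot_l, sss_part1):
    """Add the clamped SSS term onto the shading group: where part 1
    is 0 (black) the shading is untouched; everywhere else it
    brightens, reading as light bleeding through the surface."""
    return half_lambert(n_dot_l) + clamp01(sss_part1)

print(combine(1.0, 0.0))    # fully lit face, no SSS → 1.0
print(combine(-1.0, 0.4))   # shadowed face with some SSS → 0.4
```

Because addition never darkens, black regions of the SSS term leave the base shading exactly as it was, which matches the description above.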
finally, is there a way (do you think) to control the SSS? like, if there are metal surfaces, could i use a SSS texture to say "these parts have NO sss at all, and places like the ears and nose have more sss than normal"?
Just think simple: say you have an image rendered and an overall SSS solution rendered, and you composite the two. What would you do in Photoshop to control where the SSS appears and where it doesn't? Think of it as a post-process, not something you would handle inside your 3D app; the solution is very simple.
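Spelled out, the Photoshop trick is just a multiply by a painted mask before the add; a minimal sketch (the mask values are hypothetical):

```python
def masked_sss(shading, sss, mask):
    """Multiply the SSS term by a painted 0..1 mask before adding it:
    black in the mask (metal parts) kills the effect entirely, white
    (ears, nose) lets it through at full strength."""
    return shading + sss * mask

print(masked_sss(0.2, 0.5, 0.0))  # metal: mask 0 → plain shading, 0.2
print(masked_sss(0.2, 0.5, 1.0))  # ear: mask 1 → full SSS added, 0.7
```

Intermediate grey values in the mask scale the effect smoothly, so one texture channel can hold "how much SSS" for the whole model.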
Replies
Cheers man! I made the change and thought it would work, but someone else forgot to update the packages from the repo! DOH! Looks great now.
see image http://www.treadster.com/images/meshProblem.jpg
As for the BSP question: it's a path I wouldn't like to take, personally.
Are you using a custom lighting plugin, or did you program it yourself?
Alternatively, you can just print screen in the Engine itself. You risk getting steps, but with the right video settings, it should be all good.
http://www.hourences.com/book/tutorialsue3lightmap.htm
Then for the double sided question, go into the material editor of the mat that will be on your object. And its right here...
that explains it, I was looking in the mesh properties not the material
hum okay I will look into that hopefully I can get this working
thanks big time nrek
Thanks Moody for the help with that!
As far as I know that is only possible via script.
Time is a cruel mistress, but I haven't forgotten.
(I'd like to do a lot more aside from those as well, but can't outside of work. Fail.)
So many ways to get transparency, it's ridiculous.
http://udn.epicgames.com/Three/MaterialsCompendium.html#Custom
What you do in HLSL/GLSL is math, and you can do most of that just with nodes; instead of translating the formula into code, you translate it into nodes.
In the case of this thread:
http://www.gamedev.net/community/forums/topic.asp?topic_id=481494
it's a perfect set of instructions for how to do it in Unreal.