Is there a way, using spawn time in cascade, to zero the time node in the materials editor? I want the linear interpolation I've set up for a particle's material to begin when each particle spawns.
Edit: Found out that you use the DynamicParameter node for particles in the material editor and cascade, have it set to emitter or spawn time (I used emitter time for better results), and set a value method (then a refresh of the node within cascade). Got some interesting results.
How do you create visible light, like light beams through dust or smoke? A link to a tutorial or anything would be great. Thanks.
The area of the UDN that had this in detail is on the developer side. You can either search for "volumetric" in the asset browser and study the shader and try to understand it, or just copy it. And while it does do volumetrics, it will take some work to get it the way you want.
Pretty much just go to UDN and volumetric is right there.
I've looked through this thread, but didn't see it come up yet - apologies if it has.
I'm trying to import a model from Blender, but it hasn't been working out. Here are my results:
The left one is a .dae file, the right one a .psk. For a comparison, here is the same model, with the same textures, in Marmoset (except that an odd polygon got added somewhere in that process).
Am I doing something wrong with UDK or should I look into other software to export my models with?
Question: Why can't I get a simple point light to bake out when I render lighting? My dominant directional light gives me some bounced lighting no problem, but a point light that's in my scene isn't contributing at all.
@ ZWebbie - One thing I've learned about DAE/FBX from applications is that they might do things a little different than others. Lightwave DAE export screws up UV info for UDK. If you can, post the original Blender DAE file because I'd like to compare Maya/Lightwave/Blender DAE files. The Maya one works perfect.
zwebbie: Are you setting the normal map compression settings to normal map on import or just using default?
The mesh info also looks borked, but I'm not sure why. Never went the blender export route.
OK, the UT3 thread disappeared; however, I guess this is still an issue in UDK.
Could someone tell me why offsetting UVs works differently with ScalarParameters than with Constants?
An example: with UV coordinates I could add a Constant2Vector, but there is no two-component scalar parameter, right? So I have to append one float to another (pardon my limited English, I hope someone can understand this) to get X and Y as XY or RG, and if I append another one to it, I get XYZ or RGB or UVW, right?
Now if I take that vector appended from scalar parameters (RG/XY/UV) and add it to my UVs, it doesn't offset G/Y/V; in fact it doesn't offset anything. But if I plug in Constants instead, it does offset, with the exact same inputs.
So I searched further and found a solution: it appears that Unreal handles Y/Z differently for scalar parameters and constants. To move the UVs on Y/V, I have to append R/X/Y with G/Z/W, append a B/Y/V to it, and then ComponentMask it to RB to get the exact same result as with Constants.
I had this mistake in my outline for ages and only noticed it when I wanted to antialias it the brute-force way (blur the scene and use an edge detection as a mask so it only blurs the outsides; thanks to Rob for the hint!) and it never blurred the top and bottom pixels of the scene... Now I have to change all the Y/V values to read B instead of G of my append vector o_O
I said this before (I don't think it was in this thread) but I could never get any DAE file from Blender to import correctly into UE3. I'm also thinking this is a Blender issue... And no, I'm not knocking the program either :P
Hi, have you seen the tutorial Importing into Unreal Editor 3 at Eat 3D?
Is it possible to fade an object / staticmesh away using a trigger?
For example, say you were making a sidescroller where you had the player run behind some rock meshes in the foreground and you wanted those rocks to fade away. Is it possible to achieve this? It would also be useful for situations where you wanted to isolate certain exteriors and interiors.
These would be static meshes.
About the only way I could think of doing it would be to set up the object's material with some sort of opacity. Trigger the opacity to start decreasing once that trigger is entered, then kill the static mesh completely once the opacity is at 0.
You'd need to switch the material to a translucent one in script; you don't want it to render with translucency all the time. Even though Unreal now supports lit translucency, I guess it would be way too expensive, and sorting will be a mess if all the rocks have a translucent material. So you could create a normal material for your rocks and a second one with a scalar parameter for the opacity, which you scale in script.
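For what it's worth, here is a rough UnrealScript sketch of that second-material approach. The class name, asset path, timings, and the 'Opacity' parameter name are all made up for illustration; SetMaterial, CreateAndSetMaterialInstanceConstant, and SetScalarParameterValue are the standard UE3 calls.

```unrealscript
// Hypothetical sketch: fade a mesh by driving a scalar parameter
// on a runtime MaterialInstanceConstant.
class FadingRock extends DynamicSMActor
    placeable;

var MaterialInstanceConstant FadeMIC;
var float FadeAlpha;

function StartFade()
{
    // Swap in the translucent version of the material (assumed asset path),
    // then create a MIC so we can change its parameters at runtime.
    StaticMeshComponent.SetMaterial(0, Material'MyPkg.RockFadeMat');
    FadeMIC = StaticMeshComponent.CreateAndSetMaterialInstanceConstant(0);
    FadeAlpha = 1.0;
    SetTimer(0.033, true, 'TickFade');
}

function TickFade()
{
    FadeAlpha -= 0.05;
    // 'Opacity' must match the ScalarParameter name in the material.
    FadeMIC.SetScalarParameterValue('Opacity', FadeAlpha);
    if (FadeAlpha <= 0.0)
    {
        ClearTimer('TickFade');
        SetHidden(true); // "kill" the mesh once fully transparent
    }
}
```

The timer loop is just the simplest self-contained version; the same fade could be kicked off from Kismet or Matinee instead.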
@Lamont: Sure, here's a download. There are other version of the export script, but they produce even worse results. Let me know if you find any substantial differences, I'm curious.
@Vailias: I tried that and the normal map seems to work fine on other objects - it's a .tga from xNormal, after all
@DarthNater: Agh, it must be an issue with Blender, then. I heard there's new Collada code coming up, so let's hope it works better.
@[iWi]: Thanks, that tutorial seems to be doing much the same as I'm doing, except for the exporting.
So I guess it's a Blender problem then - can't say I'm surprised. I'll try Maya or a Max trial one of these days and let you guys know if that does indeed solve my problems.
I heard that there was another Collada exporter for Blender back when I used to use it. Maybe you could find it and try that. I'm pretty sure the Collada exporter in Blender now is 3rd party, so you can't blame Blender this time.
Okay, so I'm having a problem with camera movement inside unreal editor. I hope this image gets across what I'm having problems with: http://3dryan.blogspot.com/
So, you are looking at the top view of a room with the green square being the camera and the red line its path. After I get the camera movement looking good in Matinee, I do a "save all", and it asks me if I want to exit interpolation editing mode. I hit yes, and when I check the camera movement in Matinee, it gets offset like in the picture I posted. The starting position of the camera gets moved while the actual rotation/translation values are still correct. The picture is to show that the camera still moves correctly, but all the keyframes get translated in X for some odd reason. Anyone know why it's doing this? Any help would be greatly appreciated.
Thanks for the quick reply! I'll try this out but I'm still confused as to why the camera gets moved when I exit interpolation mode. Do you know why this is caused in the first place? I didn't touch anything in editor mode after I saved the level.
I think by default it's trying to interpret the camera's keyed position relative to where it is initially placed in the level. If you change its movement track to "World Frame", Unreal uses the camera's keyed values as exact positions in relation to the world instead.
Zwebbie - I opened your file in a text editor, and it seems you got WAY more vert data than what is needed for your poly count. So then I imported into UDK and yeah I think your exporter is borked as all your polys are unwelded.
Thanks for the information.
Shortly after posting my question I actually found a way to fade (More like a quick cut) through kismet + matinee, which seems to work pretty good.
I know that wasn't intended for me, but I'm bored so I'll answer :P
Yes and no. You can write UnrealScript files and turn them into Kismet nodes. So essentially you could script that action, make it a kismet node, and use that, or just script it and never use kismet at all
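To make that concrete, here is a bare-bones sketch of a custom Kismet action in UnrealScript. The class name and category are hypothetical, and the body just hides the targets as a stand-in for whatever the node should actually do; SequenceAction, Activated(), and the Targets array are the standard UE3 pieces.

```unrealscript
// Hypothetical minimal Kismet action. Activated() fires when the
// node's input link is triggered in the Kismet graph.
class SeqAct_FadeTargets extends SequenceAction;

event Activated()
{
    local int i;
    local Actor A;

    // Targets is filled from the actors wired into the node's Target input.
    for (i = 0; i < Targets.Length; i++)
    {
        A = Actor(Targets[i]);
        if (A != None)
        {
            A.SetHidden(true); // stand-in for the real fade logic
        }
    }
}

defaultproperties
{
    ObjName="Fade Targets"
    ObjCategory="Custom"
}
```

Once it's compiled as part of a script package, the node shows up in the Kismet right-click menu under the category given in defaultproperties.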
Cool, thanks for the info!
Next question! Does anyone know why, when I preview my level with the game view, it loses some of its color? I don't have a post process volume or anything like that. Maybe there's a setting somewhere?
How do I apply a scene material and a new post process type to a PP volume? I've got my material created and setup using Scene Texture Sample parameter. I've created a post process chain that inserts my material into the line. So...how do I get this to apply to my scene or volume?
I don't see a checkbox or dropdown or any field to specify this, and the references in the UDN documentation direct me to examples that don't exist in my UDK install or are obsolete.
I know this is quite common all over! I've tried the horizontal mirroring... that doesn't seem to help! I've tried different lightmap UVs, but there isn't much point playing with that, as it still has issues before baking.
So, a quick question. How does everyone present their wireframes for a portfolio when using unreal? The wireframe view it has looks like rubbish. I tried applying a UV snapshot texture to the objects in my scene, to give it the wireframe look, but mipmapping makes it hard to see the lines in the wireframe.
Speaking of mipmapping, has anyone found a solution to unreal lowering the resolution of your textures, besides making your textures TEXTUREGROUP_SKYBOX and telling it to never stream? This only works to a certain extent for me.
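On the streaming question, the per-group mip behaviour comes from the TextureLODSettings entries in the engine ini. A hedged sketch (the exact file, section, and default sizes vary by UDK build), showing the skybox group clamped so no top mips get dropped:

```ini
[SystemSettings]
; LODBias drops that many top mips from every texture in the group;
; MaxLODSize caps the largest mip that will ever be loaded.
TEXTUREGROUP_Skybox=(MinLODSize=512,MaxLODSize=4096,LODBias=0)
```

Combined with the per-texture NeverStream flag, that is about as much control as UE3 exposes without code changes.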
What's the normal map like, razorb?
try the model without a normal map, see if it still has issues.
edit:
3DRyan, there is no nice way of presenting a model wireframe in UE3. Best off screenshotting it in max. It'll look cleaner, read easier. UE's will be triangulated which isn't easy to read at all, and importing a UV snapshot is just cumbersome tbh
Szark, your colors get a little desaturated in the game view due to it being tonemapped. This is something that is applied to the post process so that the highlights don't get banding and crappy colors as they get bright, the downside is colors get -slightly- less contrast. The positives of this tonemapping function far outweigh that negative though
Is it okay to group meshes that are going to be imported into the Unreal 3 editor, or does every piece you import have to be separate? I ask because I've been having some problems with placing decals on grouped meshes and then having them crash.
Blastradius: Everyone correct me if I'm wrong, but I think every time you export from your 3d package to unreal, you can only treat it as one solid object. At least that's how it is for me with Maya. Not sure about Max. I would just export each piece separately either way.
Slightlybored and ben: Thanks for the help. I've figured out how I'm gonna do the wireframes. Sometimes it seems that unreal does it good enough with a wireframe texture applied, and other times a maya screenshot is better. Depends on how close you are to the object. Kudos!
So here's my problem: I made this prop for our team game class, and it looks fine in the mesh editor in UDK under the dynamic lighting, as seen here.
Now, when the lighting gets built, it seems like it is inverting the lighting on half of the mesh, and it gives this horrid seam. It was a lot worse; I've tried several different lightmaps, lightmap resolutions, and welding UVs on the main unwrap, so please help me :poly122:
Show us your normal UVs and your lightmap UVs, the chances are you are just mirroring the wrong way on your diffuse/normal/spec UVs.
Here's a little tutorial I made about mirroring UVs for normal mapped assets in Unreal. More info from UDN.
Also related here's a good tutorial on making efficient lightmap UVs, though with the introduction of Lightmass in UDK you don't need to pad the outer border anymore, just between the shells.
Hello, I'm working on a bunch of objects and was wondering if I could get some insight into how textures are handled versus what's on the screen. Say I had modeled a spoon and a box, and the box was on screen but the spoon wasn't, yet they share the same texture maps. My question is: does this waste memory because it renders the whole, say, 512x512 map out at one time? BTW, the engine is UDK.
It works nicely for cube maps too; just choose "New Render To Texture Cube" and "SceneCaptureCubeMapActor". You'll know where if you follow the tutorial...
Replies
Neither. It's just pressing L and clicking..
http://eat3d.com/free_ue3_import
Should help!
Yeah, everything's inside the LMV.
The last part of this post should help you with that camera problem.
@Neox: By script do you mean kismet?
Here is a link from Chris Arnold, a former co-worker on WAR:
http://www.chrisarnoldart.com/pages/tk/war_tk.html
It's a compilation of how the textures blend from bottom layer to top layer.
Anyone have any ideas?
Any help is appreciated. Thanks!
To get a wireframe view while ingame, press Tab to bring up the console, then type in the command show meshedges and hit enter.
EDIT:
ImSlightlyBored is right about the triangulation though, no way around that unfortunately.
I might as well mention the tiledshot command while I'm at it for taking highres screenshots.
Oh and the TEXTUREGROUP_SKYBOX, lod bias -2 thing has never really worked for me either
Yeah, I know the normal map is ugly! But this whole model was just a test model ;( (excuse!)
in max
Thanks for that, Jordan! Much appreciated!
I found this one, do you think this old UT2004 tutorial would still work before I go down this road?
http://www.psyonix.com/ImportingVehiclesTutorial/
I would really like a tank rigging tutorial if you have seen one then please post the link.
Cheers, Claude
Please...'cause I'm kinda stuck now...
http://www.hourences.com/book/tutorialsue3reflections.htm
sorry again.