With that it is possible to share libraries. Awesome! Thanks, man! The only thing I'm missing is a confirmation if you really want to delete a tool, because there is no way back - maybe there should be a "safety" question. Would there be a way to select multiple tools and then drag & drop/delete them? Also, in my case the loaded lib wasn't restored the next time I closed the tool and reopened it.
But anyway - this is an awesome tool! Many thanks for sharing!
Yeah, I could have a Yes/No for the Delete I suppose.
I did have multi-select set up at first, but I'm not sure I can get it to work properly. Right now my code is all set up to work with one tool at a time, so I'd have to rewrite all of it.
Yeah, the library doesn't load on start, mainly because if you had a huge library open, closed the tool and forgot about it, you might not want all that loading when you open it again. But I could maybe add an option to load the last library on open.
Or you could have a "recently used libs" dropdown box - like the recent files list in 3ds Max - or maybe make the folder dialog (where I select the lib) start in the last used directory.
I started working on a software renderer in C# just for fun and learning.
I also made a half-finished Quake 2 BSP loader, which is where the level geometry is from.
It renders the entire map at that resolution at about 40 fps without any occlusion culling. It's about 10,000 triangles.
I've been working on setting up the Razer/Sixense device as a mocap input for Maya. You get around 1 degree of rotation and 1 mm of translation precision at 60 Hz. I don't have spectacular demo videos, but hopefully you get the idea, and if people are interested I'd love to hear suggestions on how to improve it.
I started working on a software renderer in C# just for fun and learning.
I also made a half-finished Quake 2 BSP loader, which is where the level geometry is from.
It renders the entire map at that resolution at about 40 fps without any occlusion culling. It's about 10,000 triangles.
That's really cool. How are you going about that? Using any documents/tutorials, or just from previous knowledge?
Some previous knowledge and some Google :P The triangle rasterization part is based on this, but it does vertex interpolation in 2D screen space instead of 3D, so it creates those nasty affine artifacts in the UVs. I need to fix that.
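For what it's worth, the standard fix is perspective-correct interpolation: interpolate each attribute divided by depth, plus 1/z itself, and divide per pixel. A minimal Python sketch (the function name and barycentric-input convention are mine, not from the renderer):

```python
# Perspective-correct UV interpolation: lerp u/z, v/z and 1/z across
# the triangle in screen space, then divide at each pixel. Lerping u
# and v directly (affine) is what causes the warped-texture artifacts.

def perspective_correct_uv(bary, verts):
    """bary: barycentric weights (w0, w1, w2) at the pixel.
    verts: three (u, v, z) tuples, z being view-space depth."""
    inv_z    = sum(w / v[2]        for w, v in zip(bary, verts))
    u_over_z = sum(w * v[0] / v[2] for w, v in zip(bary, verts))
    v_over_z = sum(w * v[1] / v[2] for w, v in zip(bary, verts))
    return u_over_z / inv_z, v_over_z / inv_z

# Midpoint of an edge whose endpoints sit at depth 1 and 3: the affine
# answer would be u = 0.5, but the perspective-correct u is pulled
# toward the nearer vertex.
u, v = perspective_correct_uv((0.5, 0.5, 0.0),
                              [(0.0, 0.0, 1.0),
                               (1.0, 0.0, 3.0),
                               (0.0, 1.0, 2.0)])
```

With all three depths equal the divides cancel and this degenerates to plain affine interpolation, which is why flat walls viewed head-on can look fine even in an affine rasterizer.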
I'm not sure if I had this in the public release, but I've added 2 more options to my Object Placer tool.
One is Array Along Edge, which lets you array/clone objects along edges, spaced evenly.
The second, which I just added, is Align To Edge Direction, pictured below. It aligns each object to its chosen edge so that its local Z points along the edge direction and its local Y always points outward along the edge normal. That way all objects are oriented the same in local space, so you could easily rotate them all by the same value and get the same result.
Edit: I'm also adding a Face option now, so they all orient the same (hopefully) to the selected faces. So far it works great.
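As a rough illustration of the math (this is not the actual MaxScript - the names and the half-step spacing convention are my assumptions), the even spacing and the per-edge frame could look like:

```python
def array_along_edge(p0, p1, normal, count):
    """Evenly space `count` placements along the edge p0 -> p1, and
    build one frame for all of them: local Z along the edge direction,
    local Y along the edge normal, local X completing the right-handed
    basis. `normal` must not be parallel to the edge."""
    def sub(a, b):
        return tuple(x - y for x, y in zip(a, b))

    def normalize(vec):
        length = sum(x * x for x in vec) ** 0.5
        return tuple(x / length for x in vec)

    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    edge = sub(p1, p0)
    z_axis = normalize(edge)
    y_axis = normalize(normal)
    x_axis = cross(y_axis, z_axis)
    # Inset half a step from each end so all gaps (including those to
    # the endpoints) are equal; the real tool may instead place objects
    # on the endpoints themselves.
    positions = [tuple(p + (i + 0.5) / count * d for p, d in zip(p0, edge))
                 for i in range(count)]
    return positions, (x_axis, y_axis, z_axis)
```

Sharing one frame per edge is what makes "rotate them all by the same value" behave identically for every clone.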
Wow, that's awesome! Vector displacements can be used like normal maps too though, can't they? I think Valve used them for the water in Portal 2.
Using your method above, you'd have to have a super high-poly mesh - a person's head, for example - in order to get those details nice and smooth though, right?
I'm going to answer both at the same time if you don't mind.
The term "vector map" is somewhat misleading, actually. A normal map is also a vector map.
In Portal 2 and Left 4 Dead 2, they used vector maps to distort the flow of water in U and V. In other words, the blue channel was left empty. There are multiple ways of creating these types of maps. In UDK you can easily plug them into the UV coords of a TextureSample.
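The usual wiring (illustrative only - this is not the exact UDK node graph) is: sample the two-channel map, recentre it around 0.5, scale it, and add it to the UVs before the main texture lookup. A tiny sketch:

```python
def distort_uv(u, v, flow_rg, strength):
    """Offset a UV lookup by a two-channel flow/vector-map sample.
    flow_rg is the (R, G) sample in [0, 1]; 0.5 means "no flow", so the
    sample is recentred to [-0.5, 0.5] before scaling."""
    du = (flow_rg[0] - 0.5) * strength
    dv = (flow_rg[1] - 0.5) * strength
    return u + du, v + dv

# Full red, zero green pushes the lookup +U and -V:
shifted = distort_uv(0.5, 0.5, (1.0, 0.0), 0.2)
```

Since only R and G are used, the B channel is free, which is why the water maps could leave it empty.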
What I'm showing in the video, though, is quite limited (under DirectX9).
You can't just plug in a "vector map" in WorldPositionOffset.
Here is why: (Source: UDN)
The World Position Offset material input is executed in the vertex shader. This is different from all the other existing material inputs, which are applied in the pixel shader. A large number of the material nodes exposed in the material editor won't work in vertex shaders, specifically anything involving a texture lookup.
With DirectX 11, however, you can, which is great! You can have different "vector maps" for the same mesh.
Currently my solution only allows one "vector map", which needs to be baked as vertex color - which, in fact, is probably more precise than a bitmap.
There are also other problems that I have been able to fix, such as normals, tangents and binormals not updating correctly as the mesh deforms.
WorldPositionOffset is quite limited in general; it is purely for visual effects. Things such as bullet decals will only be rendered on the "original", unaltered state of the mesh. To sum it up: the CPU does not know what's going on with the mesh, only the GPU does.
Currently I'm trying to work around the limitation of one vector map per static mesh. I'm getting there!
Some previous knowledge and some Google :P The triangle rasterization part is based on this, but it does vertex interpolation in 2D screen space instead of 3D, so it creates those nasty affine artifacts in the UVs. I need to fix that.
Ah, so you're using the verts then? So you'd have to have a dense mesh and bake it down for this to work properly?
Well, it depends on what you want to accomplish. The example in my video might not be the best one, since I didn't optimize the mesh at all. So you don't really need as many polygons as shown in the video!
Imagine you wanted to morph a plane into a nose. You would take the mesh of the nose... and flatten it until it's a square plane. The polycount of the mesh is totally up to you. If the ear looks good with 500 polygons, then great! The ear in the video would look just as good with 30% of the polygons; I just didn't bother to optimize it for the video, sorry :P
If, however, you intend to reuse a mesh in order to morph it into multiple shapes (multiple vertex colors), you'll probably need more polygons.
If you have a column, for example, and you want to bend it, you'll need enough polygons for it to look right, meaning it will probably need extra segments. If you want to turn that column into a face (why not? lol), you'll have to make sure you can achieve it with the current polycount. Or you can always add extra polygons to the column for the sole purpose of getting the face right.
So, in other words, you need to plan ahead what you want to morph the static mesh into; you can't go nuts and try to morph a 2-triangle plane into an airplane - unless it's a paper airplane, hehe.
One of the things I want to try out later on is taking a Portal-type chamber and morphing it into something completely different, like a cave. It should be doable; the poly count shouldn't be high, and performance-wise it should cause no problems.
Here is another video of the displacement. This time I'm showing how instanced actors can have different morph targets, since vertex paint data can be stored per actor. So there is no need to have 4 unique actors. This should reduce resource size dramatically if you plan to have an environment full of morphing floors and walls :poly124:
Here is a screenshot of the possible inaccuracies in one of the resulting meshes. (Notice the "jagged" wireframe in some locations.) This is due to the fact that XYZ coordinates have to be converted to RGB values (0-255) and rounded up or down to fit.
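The rounding can be sketched like this (the maximum displacement range is an assumed parameter, and the real bake may map the range differently):

```python
def encode_offset(offset, max_disp):
    """Quantize an XYZ vertex offset into 8-bit RGB vertex color.
    Each axis is mapped from [-max_disp, max_disp] to [0, 255]."""
    rgb = []
    for component in offset:
        t = component / (2.0 * max_disp) + 0.5   # -> [0, 1]
        rgb.append(int(t * 255 + 0.5))           # round to nearest step
    return tuple(rgb)

def decode_offset(rgb, max_disp):
    """What the shader side does: expand the color back into an offset."""
    return tuple((c / 255.0 - 0.5) * 2.0 * max_disp for c in rgb)

# Round-tripping shows the error the jagged wireframe makes visible:
# with a 64-unit range, one 8-bit step is 128/255 ~= 0.5 units, so each
# axis can be off by up to ~0.25 units.
roundtrip = decode_offset(encode_offset((3.7, -1.2, 0.0), 64.0), 64.0)
```

The smaller the displacement range you bake against, the smaller each step is, which is one reason a tightly fitted range looks smoother.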
The other good news is that this should run on UDK Mobile.
Added another tool to my Object Placer. I took some ideas from other scripts and made my own version with a nice feature set, I think. I also have a Freeform version (the button above this one) that works on the faces of the actual objects rather than the bounds.
I would like the faces to highlight when you are on a bounds side, instead of showing the circle, but I'm not sure I can do that - at least not with a material. Maybe with gw drawing methods, or by creating a plane to match the face somehow.
Here I'm experimenting with replacing a 3ds Max viewport with a UDK viewport, in order to check changes between 3ds Max and UDK in realtime.
Thanks to the MaxScript sub-forum at CGTalk, there are a ton of useful bits of code that I have learnt from!
Norman, that's completely awesome. I've managed to get the object-wise Unreal<->Max sync working, but that viewport sync is fantastic. Is it a C++ DLL plugin thing?
Here I'm experimenting with replacing a 3ds Max viewport with a UDK viewport, in order to check changes between 3ds Max and UDK in realtime.
That takes the crown! Please release it with proper documentation; I'd love to learn how you did that.
I've seen the extended viewport thing before - or rather, someone had a script that let you dock IE, or really anything you want, inside Max, I think.
Yes exactly! That's what got me thinking!
So in the script there are actually a couple more things happening.
First, I'm sending a message with SendMessage from 3ds Max to create a new floating viewport in UDK. (For this you need to find the toolbar handle and send the proper button-press commands, and so on.)
Then I find the handle I'm interested in. In this case I don't want the whole floating window, just the viewport itself. Once you have that, you can create a new dialog in 3ds Max with the UDK viewport in it. (This is essentially what that bit of code you found does.)
Then you have to automate the process of replacing the active viewport with the extended one.
There were also bugs I had to deal with. If you close the extended viewport with the UDK viewport in it, it destroys the actual UDK viewport; UDK won't be able to find it and will crash. So I need to make sure the proper WM_CLOSE message is sent to the UDK parent handle when closing the 3ds Max extended viewport dialog.
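For anyone curious, the handle juggling looks roughly like this in Python/ctypes (the window title and the "first child is the viewport" shortcut are my assumptions - the real handles have to be hunted down with something like Spy++):

```python
import ctypes

WM_CLOSE = 0x0010

def _user32():
    # Deferred so the module can at least be imported off-Windows.
    return ctypes.windll.user32

def adopt_viewport(host_hwnd, udk_title="Unreal Development Kit"):
    """Find UDK's floating viewport and reparent it into our dialog."""
    u32 = _user32()
    top = u32.FindWindowW(None, udk_title)            # assumed title
    child = u32.FindWindowExW(top, None, None, None)  # assumed: first child
    if child:
        u32.SetParent(child, host_hwnd)
    return child

def release_viewport(child, udk_parent):
    """Hand the HWND back, then tell the UDK-side floating window to
    close itself, so UDK never goes looking for a destroyed viewport."""
    u32 = _user32()
    u32.SetParent(child, udk_parent)
    u32.SendMessageW(udk_parent, WM_CLOSE, 0, 0)
```

The key point is the last function: the host dialog must give the viewport back and let UDK destroy its own window, otherwise UDK crashes exactly as described above.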
There is still a LOT of stuff to fix/implement, so I wouldn't want to get anyone's hopes up. This is more of an R&D/just-for-fun kind of side project than anything else. Hopefully I'll have a "finished" and stable version some day.
It would certainly help if I had an actual programmer helping me out with the Unreal 3 side of things. But it's my understanding that you can only mess with the editor's code if you have a license, so a programmer would have to hack his way around the UDK.
EDIT: Right now I'm trying to figure out a way to quickly get the list of static meshes in the scene into 3ds Max, as well as a way to select static meshes from 3ds Max. I have a couple of ideas, but they are just ugly workarounds - like everything else, lol.
Eh, if it works, it works! Haha. That's what I've had to do sometimes, because I didn't have access to (or knowledge of) C++ or whatever. Getting the handle is a good idea. I did something similar when doing an automated OBJ import/export between Max and Maya.
My hack was doing things like setting the Max/Maya windows active and dropping in .ms or .mel scripts via C# from my form to start exporting OBJs and whatnot.
By the way, it looks like you're sending transforms from Max to UDK in realtime?
Seems like it would be better, performance-wise, to send after the transforms are done if possible - not per change. Or is that what you're doing?
I can't remember what it was doing at the time I uploaded that video, but I did indeed change it so that it only sends the data after the transformation is done. So moving objects in 3ds Max was fast.
But even then, 3ds Max 2011 and 2012 were still slow. Strangely enough, with 2009 and 2010 it went really, really well. I also tried it with 2011 and 2012 on another machine, and it went well there too. There must be something wrong with my 2011 and 2012 installs.
This was back in June; I haven't touched the script since then. I'll be working on it over the next couple of days (hopefully).
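The "send only after the transformation is done" fix is essentially a debounce. A sketch in Python (names are illustrative; the real thing is MaxScript callbacks):

```python
import time

class TransformSync:
    """Coalesce per-frame transform callbacks and send only once the
    user has been idle for `quiet` seconds (a debounce)."""

    def __init__(self, send, quiet=0.25):
        self.send = send            # callable(node, matrix)
        self.quiet = quiet
        self.pending = None
        self.last_change = 0.0

    def on_transform_changed(self, node, matrix):
        # Called on every change; only the latest value is kept.
        self.pending = (node, matrix)
        self.last_change = time.monotonic()

    def tick(self):
        # Poll regularly (e.g. from a UI timer) and flush when idle.
        if self.pending and time.monotonic() - self.last_change >= self.quiet:
            self.send(*self.pending)
            self.pending = None
```

`tick()` would be driven by a timer in the tool's UI; 250 ms of quiet is an arbitrary default, not what the script uses.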
Yeah, I've noticed things like that between versions as well.
Although for me, Max 2012 was normally faster than previous versions. Doing anything with many objects was like 2-3x faster in 2012 vs. 2010 on my end, if not more.
I've kept scripting in 2010 most of the time, but lately I've been using 2012 and copying files back to my 2010 directory to check stuff there.
I'm using 2011 for now; 2012 just seems too buggy for me, especially with the whole Nitrous viewport thing. I know they have fixed a lot of things with the service packs, but there are still a couple of things that are only half implemented (DirectX materials crashing Max, for example). I think I'm just going to avoid 2012 and wait for 2013.
Yeah, they definitely broke a ton of stuff in 2012. Given how much they overhauled, that's sort of expected, but it was pretty bad at first... really bad, actually.
I've had good results with Nitrous, though. I have a GTX 570 now, and had a GTX 260 before; both perform well with Realistic mode on. I can get high FPS even with 9 million tris in the scene in Realistic mode with lights and shadows.
How? UDK mobile ignores the material networks. I was under the impression that you couldn't write your own custom shaders.
Are you sure? I haven't really looked into it, but if those nifty vertex-animated flags in Epic Citadel work, I don't see why the vertex deformation shown in my video shouldn't.
Yeah, that's actually a canned animation that is part of the shaders they let you use. UDK Mobile shaders are like the CryEngine Illum shader, but much more limited; you can't add any custom functionality.
I'm not sure I'm following you... The animation of those flags is done with WorldPositionOffset, am I right? Aren't you allowed to use WorldPositionOffset? Is that what you mean? If so, holy crap, that sucks! o_O
Edit: never mind... those flags are actually skeletal meshes! :poly115: I was under the impression they were using WorldPositionOffset.
You can't use any nodes at all; they have no effect on the material for mobile (unless you use auto-flatten or bake the lighting in - but like the name says, it's FLAT).
Mobile materials are made in a separate rollout under the node editor. It's a very limited shader with a few permutations for various effects, like bump offset. The only thing you can do is assemble a shader from their template - nothing custom. Maybe if you're a UE3 source licensee you might be able to do something like what you're asking.
Just checked the link. Well, this sucks. It seems definitely possible from a technical point of view; they just won't let you play with it. Meh.
IS F*CKING AWESOME
Interface:
Controlling IK [ame="http://www.youtube.com/watch?v=zhiJaRDYX2w"]impSixense - YouTube[/ame]
Simple test vid: [ame="http://www.youtube.com/watch?v=f7drjSIkvGw"]sixense.test - YouTube[/ame]
The idea is to use mesh pieces for every sprite to pack them even tighter on a texture map - like UV mapping with mesh islands, but all automated.
Is that automated right now for the spline, or did you create it? It seems like it would be quite challenging to get it to automate that properly.
Very nice. What algorithms do you use?
[ame="http://www.youtube.com/watch?v=Iuv4TY2-gdA"]3D Vector Displacement - UDK (DX9 WorldPositionOffset) - Test#1 - YouTube[/ame]
Check this article series for proper interpolation:
http://chrishecker.com/Miscellaneous_Technical_Articles
[ame="http://www.youtube.com/watch?v=ovelfgNiyJk"]3D Vector Displacement - UDK (DX9 WorldPositionOffset) - Test #2 - YouTube[/ame]
[ame="http://www.youtube.com/watch?v=MPxhC35TZvA"]3D Vector Displacement - UDK (DX9 WorldPositionOffset) - Test #3 - YouTube[/ame]
http://matthewlichy.com/boundsPlacer/boundsPlacer.html
UDK Sync is kind of an experiment; I'm trying to bring 3ds Max and UDK closer together.
I think I already posted this video some time ago:
[ame="http://www.youtube.com/watch?v=nnfdj5VVVjs"]UDK Sync 0.2 - YouTube[/ame]
To sum it up: you can take advantage of all kinds of placement tools in 3ds Max.
In that video I had both windows side by side, which was a bit inconvenient. So here is my attempt at fixing that:
[ame="http://www.youtube.com/watch?v=LDgwbrhOVSY"]UDK Sync 0.3 Preview - YouTube[/ame]
http://www.matthewlichy.com/pivot_bounds/pivot_bounds.html
It's more likely than I thought.
http://udn.epicgames.com/Three/MobileMaterialReference.html
[ame="http://www.youtube.com/watch?v=dD5iqQIp1RA"]UnityGUI using Windows Forms - YouTube[/ame]