Here's how to exchange arguments between JSX and Flash. There are some drawbacks, and a solution in case you can't live with them.
PASSING ARGS TO JSX FUNCTIONS:
CSXSInterface.instance.evalScript("myFunctionA"); // no args
CSXSInterface.instance.evalScript("myFunctionB", someValue);
CSXSInterface.instance.evalScript("myFunctionC", someValue, someOtherValue);
You should be able to pass any ordinary data type (string, int, float)
drawback1: JSX will always receive a string
drawback2: no complex types such as arrays (need to verify this)
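To illustrate drawback1, here's a sketch of what the JSX side has to do; the function name and the conversion are illustrative, not from the CSXS docs:

```javascript
// Flash calls evalScript("myFunctionB", 42), but JSX receives the string "42".
// The JSX handler has to convert arguments back to the types it expects.
function myFunctionB(someValue) {
    var asNumber = parseInt(someValue, 10); // drawback1: everything arrives as a string
    return asNumber * 2;
}

// Simulate what the bridge does: the argument comes through stringified.
var doubled = myFunctionB("42");
```

For floats use parseFloat, and for booleans compare against the string "true".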
RETURN VALUES FROM JSX:
jsx:
function returnSomething() {
    // do something...
    var myString = "Hello Flash!";
    var strXMLResult = '<object><property id="bSuccess"><true/></property>';
    strXMLResult += '<property id="retString"><string>' + myString + '</string></property>';
    strXMLResult += '</object>';
    return strXMLResult;
}
flash:
var sr:SyncRequestResult = CSXSInterface.instance.evalScript("myFunction");
if (sr.status == SyncRequestResult.COMPLETE) {
    if (sr.data) {
        var or:Object = sr.data as Object;
        if (or.hasOwnProperty("retString")) {
            var myReturnValue:String = or.retString;
            // do something with myReturnValue
        }
    }
}
We can only return a string! And it must be XML formatted like this! Inside this XML string we can encapsulate our real return data. The "bSuccess" property is mandatory, and so are the <object> tags. After the bSuccess property we can add more of our own return properties (we only have one here: retString).
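Assembling that XML by hand gets old fast; a tiny helper along these lines can automate it. This is only a sketch: note that it performs no XML escaping of the values, which is exactly the escaping drawback noted below.

```javascript
// Build the CSXS-style result XML from a plain object of string properties.
// Sketch only: values are inserted verbatim, with no XML escaping.
function buildResultXML(props) {
    var xml = '<object><property id="bSuccess"><true/></property>';
    for (var key in props) {
        xml += '<property id="' + key + '"><string>' + props[key] + '</string></property>';
    }
    return xml + '</object>';
}

var out = buildResultXML({ retString: "Hello Flash!" });
```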
drawback1: assembling this XML string can be annoying if you want to return many values
drawback2: strings inside the XML string have to be escaped on the JSX side and un-escaped on the Flash side!
SOLUTION: enter JSON!
My personal choice to overcome those drawbacks was to just JSON-encode any parameters. This takes care of type conversions, encapsulation (we only need one return parameter in the XML), escaping and Unicode issues!
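A sketch of the idea: everything travels through one XML property. The property name "json" and the URI-encoding step are my own choices here, not something the CSXS API mandates, and in ExtendScript this assumes a JSON implementation like json2.js is loaded (see below):

```javascript
// JSX side: JSON-encode all return data into ONE property of the result XML.
// encodeURIComponent keeps XML-special characters out of the payload.
function returnManyValues() {
    var payload = JSON.stringify({ name: "cube01", verts: 8, tags: ["prop", "lowpoly"] });
    return '<object><property id="bSuccess"><true/></property>' +
           '<property id="json"><string>' + encodeURIComponent(payload) + '</string></property>' +
           '</object>';
}

// Flash side (mirrored here in plain JS): pull the one property out and decode.
var xml = returnManyValues();
var decoded = JSON.parse(decodeURIComponent(xml.match(/<string>(.*)<\/string>/)[1]));
```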
There's a small downside to this: Photoshop executes the .jsx file in some sort of sandbox, which is a DIFFERENT directory than Plug-ins/Panels, and any external .js files will not be copied there. So we have to #include or eval external .js files from a (semi-)absolute path. This could be the Plug-ins/Panels folder or Presets/Scripts.
Then load the .js like this:
var SCRIPTS_FOLDER = decodeURI(app.path + '/' + localize("$$$/ScriptingSupport/InstalledScripts=Presets/Scripts"));
$.evalFile(SCRIPTS_FOLDER + "/json2.js");
Put this code on top of your .jsx then you can use everything from json2.js.
If you put json2.js into the Scripts folder and you want to hide it so it doesn't show up in the Photoshop Scripts menu, add this on top:
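The snippet itself didn't survive here. As far as I know, the usual trick is a javascriptresource block inside a comment at the top of the file; treat the exact tags as an assumption to verify:

```javascript
// Photoshop scans .jsx files in Presets/Scripts for this resource block.
// <menu>hide</menu> should keep the script out of the File > Scripts menu.
/*
<javascriptresource>
<menu>hide</menu>
</javascriptresource>
*/
```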
Flash panels were supported in PS CS3; they were called patch panels. But Adobe dropped this and replaced it with the CS SDK. It seems they worked in a different way and were not as integrated as Flash panels in CS5. Some info about them here: http://blog.drwoohoo.com/?p=654 and a patch panel in action in Illustrator CS3: http://vimeo.com/324573
I hate customizing my ZBrush UI for each version, so I decided to work on a separate toolbar that works alongside the current version or whatever the next version is.
It has a keyboard mode, ctrl+alt/shift (pressed down) combinations, that can be toggled on/off, so it works great on a tablet without a keyboard.
It's a lot of work, mostly designing the UI. I'm making this in C#.
It won't make changes to or interfere with ZBrush; it only assigns some new hotkeys to brushes/buttons that don't have hotkeys (it won't override ZBrush's default hotkeys).
Is there a reason the brush buttons are spaced the way they are? It seems like you could get another row and a few more columns in there if they were closer.
Lets you place a pivot point in a sequence, individual sprites, or a mix of both. The way this works is by assuming that the pivot point is always the center of the sprite canvas. So when you change the pivot point it will actually extend the canvas size of each sprite in such a way that the center of each new image is where you placed the pivot.
It detects sequences automatically and matches the sequence canvas size to the bounds of all frames of the sequence.
This is a powerful feature, especially if you pack the sprites later anyway and lose all that extra space, because the offset and center of the original sprite can always be restored in the engine.
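The canvas-extension step the description implies can be sketched as a little math (a hypothetical function, not the tool's actual code): to make the pivot the center, each canvas dimension has to grow to twice the larger distance from the pivot to the edges.

```javascript
// Given a sprite of size (w, h) and a pivot (px, py) in sprite coordinates,
// compute the new canvas whose center lands exactly on the pivot, plus the
// offset at which the original sprite gets pasted into it.
function extendCanvasForPivot(w, h, px, py) {
    var newW = 2 * Math.max(px, w - px); // cover the larger horizontal side
    var newH = 2 * Math.max(py, h - py); // cover the larger vertical side
    return {
        width: newW,
        height: newH,
        offsetX: newW / 2 - px, // paste position of the original sprite
        offsetY: newH / 2 - py
    };
}

// Example: a 100x80 sprite with the pivot placed at (30, 60).
var r = extendCanvasForPivot(100, 80, 30, 60);
```

With these numbers the canvas grows to 140x120 and the sprite is pasted at (40, 0), so the pivot (30, 60) sits at the new center (70, 60).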
Flash panels tutorial part 3: packaging your extension
2 ways of packaging:
* using extension manager using a .mxi file - for simple in-house deploys
* manually - for adobe exchange AND if you want to restrict resizing of your panel
Yes, you heard right: setting an initial size for your panel or not allowing your panel to be resized requires you to package it! Doesn't that suck?!
USING EXTENSION MANAGER
Create a folder for your package (e.g. myPackage) and drop the .swf and any other needed files in it.
Create a .mxi file and put it in that folder too. E.g. it can have the same name as your package (e.g. myPackage.mxi). Open the .mxi in a text editor:
Double click the .mxi. The extension manager will launch and ask you for a name for the target zxp file - which is your packaged extension.
MANUAL PACKAGING:
For this to work you need to create a security certificate. You can also buy a certificate from a vendor like VeriSign or Thawte. What's the point of this? The certificate ensures that no 3rd party (like an evil haxxor) modified the extension, and that what you're installing is the same thing the author himself packaged.
We're making our own certificate though, because the other options cost money. (You can ask your IT department to update your company's root certificates, or something like that, and make your own certificate trusted; then Adobe Extension Manager won't spit out a warning every time you install your plugin.)
1) Download Adobe Configurator - google for it. It's free. Install it.
2) Make a folder e.g. myPackage.
Put the .swf, .jsx and everything else belonging to your panel inside this folder.
Inside the myPackage folder make a folder called "CSXS"
Inside the CSXS folder create a text file "manifest.xml"
Here's an example manifest.xml:
Way down you can see where we can specify the flash panel's dimensions.
3) Open the command line to package your extension. Make sure java.exe is in the path. Adobe installs a default Java at C:\ProgramData\Adobe\CS5\jre\bin , so you can just use that.
java.exe -jar "C:\CreativeSuiteSDK\CS Flex SDK 3.4.0\lib\adt.jar" -certificate -cn MyCompanyOrName 1024-RSA myCert.p12 mypassword
java.exe -jar "C:\Program Files (x86)\Adobe\Adobe Configurator 3\ucf.jar" -package -storetype PKCS12 -keystore myCert.p12 -storepass mypassword myPackage.zxp -C "myPackage"
This creates your packaged extension named myPackage.zxp. You can double-click it and the Adobe Extension Manager should install it. Or you can open it with WinZip and have a look at what's inside.
HYBRID EXTENSIONS
One problem with the manually packaged extensions is that they can contain ONLY a .swf, an optional .jsx and optional rollover .png images, and NOTHING else. If you want to distribute anything else, you have to double-package your extension (like double-fried pork!): package once manually, then package again with the .mxi method.
In the .mxi you can copy additional files - e.g. .js or .jsxinc files or compiled C++ plugins or whatever.
I still have to add the support of cages and presets.
About the preset, I was thinking about creating an empty transform (aka group) in the scene with a lot of custom parameters. So if you want to share your settings you simply send the scene and reload the transform in the script which will read the custom attributes.
What do you guys think about this ?
I'm thinking about an external txt file otherwise, but it's hard to know which is better. I'm not a professional (hey, still a student), so I would like an external opinion.
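For comparison, the external-file option could be as small as this (a generic sketch; the preset fields are made up for illustration):

```javascript
// Presets as a JSON text file: trivially diffable and shareable without
// sending a whole scene along.
var preset = { cageOffset: 0.35, rayDistance: 10.0, padding: 4 };

function serializePreset(p) { return JSON.stringify(p, null, 2); }
function parsePreset(text) { return JSON.parse(text); }

var roundtripped = parsePreset(serializePreset(preset));
```

The custom-attributes approach wins when settings should travel with the scene; the text file wins for versioning and sharing settings without sending scenes around.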
Some more work on the Unity IDE. Proper context-sensitive autocomplete is working, including generics. There are a few small things to do, like determining whether instance or static members should be shown, and icons indicating what each item represents.
Working on a Ghosting Tool for animation. We are doing a lot of gameplay mock ups at work right now, and have found the built in ghosting tool lacking.
Monster, for something like that you're just referencing the model at those certain frame positions, right? Maybe additional functionality for seeing curves? Think I saw that in one of the Maya videos on some tech artist's reel.
Martins Upitis got realtime area lights of rectangular shape and arbitrary texture (even video! shown at 7:16 in the first video) to work in the candy build of Blender 3D. Specular and diffuse area lighting work excellently and are supposedly very efficient.
Supposedly, up until recently (before Wall-E, I think?), even Pixar weren't simulating actual GI, but instead were faking the effect through area lights. I can't wait until more engines get this feature; there are so many things you can do with area lights. The CryEngine for Crysis 3 will also supposedly have this.
Now all we need is an efficient realtime area soft shadowing method.
The gather function/colorbleeding workflow has been around since shortly after ray tracing was introduced (2002). It was also available before that by using a rayserver to connect to BMRT. The workflow is: generate a ptc, turn it into a brickmap, then gather.
My understanding is that they try to stick with Area lights when it will afford them a similar effect at a faster framerate. I don't remember where exactly I heard/read this, but I am pretty sure it was from a Geomerics GDC/Siggraph presentation on a realtime Area Shadowing feature they have been developing. I will try and find the source when I get a chance.
Awesome stuff! I downloaded it and tried it out, it works great! Frame rate wouldn't go higher than my screen refresh rate (75hz), but the frame rate was a solid 75 fps with 10 area lights and a good number of polys.
Edit: After playing with it some more, this is one of the coolest things I've gotten to mess around with in a very long time! I may even start using Blender's viewport for realtime beauty shots now, it looks really good.
Working on an xNormal baking assistant for 3ds Max, a bit like Froyok's, made from the one I've made at work but more evolved and with more options (as people can have a lot of different workflows).
Sure, This is a screen from the "work" version, as you can see it's pretty basic and minimal.
It's made to work with naming suffixes and a folder structure.
So to make it more evolved, I'm currently adding a settings window to control which maps to bake and their options, plus the xNormal baking settings (bucket size, folder location, xNormal location, etc.). Lots of rollouts, yeah!
Your newest article is very interesting. Failure... more so the "fear" of failure is very interesting to me. I love that you cited the app Alchemy. That app is very inspirational to me because of its main motivation to not remove the fear but make you embrace it and learn that it's ok.
I had never heard of the Alchemy app; I will have to check it out. The idea of shifting the risky perception of failure onto the machine, and thus focusing on the process, is interesting.
Here is a new Zbrush script I have been working on.
Smart Dynamesh. This script takes the guesswork out of getting an acceptable poly density with DynaMesh. No more moving the DynaMesh Resolution slider around and repeatedly undoing to get the polygon count you want. The script will create a DynaMesh near the "Target Polycount" slider value. This is especially useful for hard surfaces where you know you will need a dense mesh.
Use this script instead of pressing Dynamesh initially, then just Dynamesh as normal. This script is also great for reducing the polycount of your Dynamesh when the mesh gets too heavy; reducing the polycount this way has very little detail loss.
[ame="http://www.youtube.com/watch?v=fngVhl2D2Q0"]Smart DynaMesh Preview - YouTube[/ame]
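I don't know how the script finds the resolution internally, but since polycount grows monotonically with the resolution value, a binary search is one plausible sketch. Here countPolysAt is a made-up stand-in for "apply DynaMesh at this resolution and read back the polycount":

```javascript
// Sketch: find a DynaMesh resolution that lands near a target polycount,
// assuming polycount increases monotonically with resolution.
function findResolution(countPolysAt, target, lo, hi) {
    for (var i = 0; i < 20; i++) {
        var mid = (lo + hi) / 2;
        if (countPolysAt(mid) < target) lo = mid; else hi = mid;
    }
    return (lo + hi) / 2; // interval is tiny after 20 halvings
}

// Toy model: pretend polycount scales with the square of the resolution.
var res = findResolution(function (x) { return x * x; }, 10000, 8, 2048);
```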
This wannabe-TA (me) is working on a material tool for Maya meant to aid our level designer.
Written in MEL ofc, and I'm building it around a stripped-down version of the original hypershade. Currently stripping away all the bullshit in it - I just want to keep the material stuff from it, and then add my own stuff and shortcuts, custom tools, whatever.
I posted an update to my plugin and it includes the Smart Dynamesh shown in the video a few posts back. I would love to hear what you guys think, or if you have any ideas for future scripts.
The new version of Mari (1.6) now has a flow map painter and shader built in.
You can paint flow in 3D and see the results in real time as well, it seems:
[vv]55865390[/vv]
Hello poly-people,
tool I've been working on for the last few weeks:
CryENGINE 3 exporter for Softimage (tested with ModTool 7.5 and 2012 SP1). Website: https://sites.google.com/site/andescp/softcry
Features:
Code is pretty ugly (and/or unnecessary) in some places, will be cleaned up sometime soon.
I bet there are still bugs somewhere, if you find any, please report them on GitHub or here.
Any suggestions on functionality or whatever are appreciated!
There is some cool tech I was able to use with this project, so check out the making-of link if you are interested: http://labs.soapcreative.com/GangsterSquad/ - the game can be played here: http://www.toughjustice.com.au/ - I'll have some more screens and videos of the making process once I catch up a bit.
Cool! I know quite a few people who love using XSI and would probably love using this!
RenderHJS >> The game was fun to play; too bad my laptop doesn't support touchpad+keys at the same time. I could only last to level 3 (standing still every wave is kinda hard). Good performance for a flash game.
Nothing fancy here, but while working on a CGFX shader in Maya, I decided to have a look at the attribute editor interface.
So here it is:
* added the possibility to group parameters by using the semantic
* added a menu to display parameters either the classic way or by group (because not all shaders will support this option, and it's up to the user to decide)
* fixed the bug with the technique selection in Maya 2011
* added texture swatches to the sampler parameters
* the sampler parameters now have the correct labels (i.e. 'Diffuse Map' instead of 'diffuseSampler')
* available for Maya 2011, 2012 and 2013
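The group-by-semantic idea can be sketched like this; the parameter shapes are hypothetical, not the actual Maya/CGFX code:

```javascript
// Bucket shader parameters by their semantic so the attribute editor can
// show one collapsible group per semantic.
function groupBySemantic(params) {
    var groups = {};
    params.forEach(function (p) {
        var key = p.semantic || "Ungrouped"; // fall back for unannotated params
        (groups[key] = groups[key] || []).push(p.name);
    });
    return groups;
}

var g = groupBySemantic([
    { name: "diffuseSampler", semantic: "Diffuse" },
    { name: "diffuseTint",    semantic: "Diffuse" },
    { name: "specPower",      semantic: "Specular" }
]);
// g.Diffuse holds both diffuse parameters, g.Specular holds specPower
```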
Here is an example with a modified version of XoliulShader:
I am looking forward to playing around with some of the new engines this year, particularly because higher specs like DX11, more memory, faster storage, etc. will become the norm. Hopefully the new consoles that are supposedly being announced this year will help set a good pace. (:poly122:) So what are you guys looking forward to?
I am looking forward to new, strong hardware platforms. DirectX versions after 9 didn't matter until now, and I hope that with the new Xbox and PlayStation models we'll get a bigger push towards, or perhaps beyond, what's already possible on PC.
The Ouya and the Steam Box (both Linux-based) look interesting as well, and I think this year is only getting us more middleware to play around with and port to all those various software/hardware platforms.
To be honest though, for many things I am not looking forward that much. Windows 8 is not my cup of tea, and Autodesk and Adobe are still stuck on their old horses, which they patch and build on with every release (often not for the better).
Professional PC/workstation solutions are shrinking in favor of the consumerization move. Everything is dumbed down and ported with a primary focus on mobile devices. Those are things that worry me a bit these days.
Replies
First release.
Just uploaded a new version of my flow map script; please download, install and test:
[ame="http://www.youtube.com/watch?v=3JUYFGkmE1Y&feature=plcp"]http://www.youtube.com/watch?v=3JUYFGkmE1Y&feature=plcp[/ame]
[ame="http://www.youtube.com/watch?v=f8Foaf7u7Qk"]http://www.youtube.com/watch?v=f8Foaf7u7Qk[/ame]