Hi friends! Not sure if I'm in the right place, since my topic is heavily technical, but of course it's about game art. Anyway, this is pure self-advertisement: I write articles about...
"Fakes & Tricks of games which impressed me"
It's about crazy ideas and awesome tech fakes that create a cool effect. I write about this in my blog and have collected everything I can remember. Feel free to comment on my stuff, explain what I didn't understand, or tell me about the stuff that impressed you.
The problem seems to be Photoshop. Its gradients, even though they are labeled as linear, are not linear in the resulting image. I don't know much about color profiles and the like within Photoshop; there might be a way around it, maybe working with the gradients in 16 bit (and only saving in 8 bit) would help, but I didn't test it.
My result with a gradient rendered in Maya looked the way one would expect.
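As a side note on why a linear workflow makes such a visible difference (a general illustration, not a claim about what Photoshop's gradient tool does internally): a pixel stored as mid-gray in an 8-bit sRGB image decodes to a much smaller value once it is treated as linear data, so the dark end of a hand-authored ramp gets crushed toward black.

```python
def srgb_to_linear(c):
    # Standard sRGB decode for a normalized 0..1 channel value.
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

v = 128 / 255  # "50% gray" as stored in an sRGB image
print(round(srgb_to_linear(v), 3))  # ~0.216, i.e. far darker once read as linear data
```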
For uncompressed textures use TC_NormalmapUncompressed. It throws away the blue channel and reduces the resolution by 50%, but it keeps the rest uncompressed. Make sure to set the UnpackMin values in the texture back to 0 for all channels then (the normal-map import sets UnpackMin to -1).
alfalfasprossen
Thanks man! I have to check this out! It's stuff like this that makes me hate 3D & tech. I mean, why does it have to be so complicated? It's really weird how black the lower left corner gets in Photoshop...
yeah, the joy of linear workflow
(I just checked: the gradient textures actually look very much alike. I had tested something with sRGB turned on when taking the screenshot, so the difference between the two textures is barely visible at all, yet the results are horribly different :poly136: )
Edit: well ok, the Photoshop one is still definitely darker in the lower left parts.
I know, but I tried to build it on my own and then play around with it, just to understand how exactly it works. Really nice stuff! I will write it up soon.
Sorry guys, I have another question. I couldn't find anything useful on Google, and it's a bit weird. Originally I tried to invert the green channel of the texture, because in the TexCoord node the darkest corner is top/left, while in my texture it was bottom/left. Since then I've been getting weird results. I then restored the texture to how it was, but the weird stuff didn't disappear.
Part 1: Import a test texture. All fine, no problem (compression TC_Default).
Part 2: Import a UV color texture (compression TC_Default). Now I get weird colors in the Material Editor, while in the Content Browser it looks like it should.
Part 3: Use the UV color texture as the UV input for the test texture: it tiles 2x. WTF? Why?
From your images I suppose you imported it using TC_Normalmap_whatever, since your colors are that blueish pink stuff. Or, if not directly for this texture, did you import it before as a normal map and use the same name? It might keep the settings for the texture and only update the underlying values.
Anyway:
Check the settings of your texture (Texture Properties). Make sure the UnpackMin values are set to 0. If you imported it as a normal map, they will be set to -1.
A range of -1 to 1 in UV coordinates will repeat your texture twice across a unit area.
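In numbers, here is a rough sketch of how I understand that UnpackMin/UnpackMax remap (the property names are from the UE3 texture settings, the exact formula here is my assumption):

```python
def unpack(stored, unpack_min=0.0, unpack_max=1.0):
    # Remap a stored 0..1 channel value into the [unpack_min, unpack_max] range.
    return unpack_min + stored * (unpack_max - unpack_min)

# Imported as a normal map: UnpackMin = -1, so 0..1 becomes -1..1.
print(unpack(0.0, -1.0), unpack(1.0, -1.0))  # -1.0 1.0 -> spans two UV units = texture tiles twice
# UnpackMin reset to 0: the values stay 0..1 -> exactly one tile.
print(unpack(0.0, 0.0), unpack(1.0, 0.0))    # 0.0 1.0
```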
Glad you managed to work it out in the end, Simon. Nice write-up, and a nice demo too, Tamarin!
Once I created the texture and an example shader, I passed this on to the effects guys, who created the full effect you see in the video. So really I didn't do much. And of course the method is inspired by lots of other people's work.
I remember we had a lot of screens in the game and I thought it would be cool to have something more interesting on them, something less "2d". Of course you could use render targets and render an actual 3d scene to a texture, sometimes that's fine, but that does incur extra memory and rendering costs.
When running the shader with the rendered UV texture for the first time, I was kind of surprised it actually worked in terms of the scrolling. When you think about it, it makes total sense, but it was a "could it be that simple?" moment.
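For anyone who wants to poke at the idea outside an engine, here is a minimal CPU-side sketch of the indirection as I understand it from Mikiex's description (the names and the nearest-neighbour lookup are purely illustrative): the pre-rendered texture stores the UVs of a 3D surface per pixel, and the shader uses those UVs plus an animated offset to sample the actual screen content.

```python
import numpy as np

def fake_3d_screen(uv_lookup, content, scroll):
    """uv_lookup: HxWx2 float array, a pre-rendered image storing a 3D surface's
    UVs per pixel (red = U, green = V). content: the flat screen texture.
    scroll: (u, v) offset animated over time."""
    h, w = content.shape[:2]
    uv = (uv_lookup + np.asarray(scroll)) % 1.0  # pan the looked-up UVs; the wrap gives the tiling
    x = (uv[..., 0] * (w - 1)).astype(int)       # nearest-neighbour sample of the content texture
    y = (uv[..., 1] * (h - 1)).astype(int)
    return content[y, x]
```

The scrolling "just works" because the offset is added before the final texture fetch, exactly like a regular panner on ordinary UVs.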
Mikiex
Glad you like it. And thanks again for mentioning this; it was really interesting to learn how you did it. Personally, I look at it again and again and can't believe how well it works.
I'm surprised that StarForge uses the same 'layered' glow for its lights; it doesn't make sense for a game in 2013.
I mean, HW had shader limitations, so they made do back in the day, but SF doesn't benefit in any way from this, especially since you can get up close and personal with the spotlights and aren't in a position to hide the trick effectively without layering many versions of them.
A simple cone that fades away via shader once it intersects with other geometry is a much better way to accomplish this, unless I missed something.
Maybe a procedural solution? They're generating each layer on the fly from the origin and stop generating after "X" length or if it hits another vertex? Hmmm.
Also, about the flat planes behind the Interceptor: maybe they're distance-based glows? They follow the engines pretty closely, and from the picture there is a faint 'smog'-like effect outside of the trail? Can't really tell, and I don't have the game.
Or hell, they could even be spline-like delayed rigs, where every time the ship turns they record the bent vertex positions for the tail and where it's supposed to start?
Hmmm. Would you consider doing something on how Journey managed its sand?
Hm, good question. The problem is, I don't own a PS3 and never played Journey, and even if I did, there are no tools for investigating on a console :,( But thanks for the tip, I'll have to watch a Let's Play. If you're interested in sand in general, check this out (starts at slide 90): http://gdcvault.com/play/1015898/The-Tricks-Up-Our-Sleeves
This here will create a mask that masks out the opacity on an object based on an arbitrary vector coordinate.
So, more or less, you animate the values of that vector to control this, in conjunction with a plane that moves along with it.
It could all be made into an actor with UnrealScript, where, as you move the actor (which would be the plane), it changes the value of the vector that drives this effect.
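A tiny sketch of the masking math equil describes, with made-up names (in the real thing this would live in the material, with an UnrealScript actor animating the plane): opacity comes from the signed distance of each point to a plane, so moving the plane sweeps the visible region across the object.

```python
def hyperspace_mask(point, plane_point, plane_normal, falloff=10.0):
    # Signed distance of the point to the plane (plane_normal assumed normalized).
    d = sum((p - o) * n for p, o, n in zip(point, plane_point, plane_normal))
    # Remap the distance to a 0..1 opacity with a soft edge 'falloff' units wide.
    return max(0.0, min(1.0, d / falloff + 0.5))

# Animating plane_point along plane_normal sweeps the mesh in or out of existence.
print(hyperspace_mask((0, 0, 50), (0, 0, 40), (0, 0, 1)))  # 1.0 -> fully visible
print(hyperspace_mask((0, 0, 38), (0, 0, 40), (0, 0, 1)))  # 0.3 -> fading out behind the plane
```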
Froyok & equil
Thanks for the links! Really interesting!
Kurt Russell Fan Club
Thanks man! Hm, nice link... I don't understand the shader code and stuff, but it looks really nice!
P.S. I added your link to the blog.
bb0x
Hm right... good idea. Maybe it's some kind of "soft particle"...
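In case "soft particle" doesn't ring a bell, the usual trick (just my guess at what is going on here, not a confirmed breakdown of the game) is to compare the depth of the glow geometry with the scene depth behind it and fade the alpha wherever the two get close, which hides the hard intersection line:

```python
def soft_particle_alpha(pixel_depth, scene_depth, fade_distance=20.0):
    # Fade alpha to 0 as the glow surface approaches the opaque scene behind it.
    return max(0.0, min(1.0, (scene_depth - pixel_depth) / fade_distance))

print(soft_particle_alpha(100.0, 101.0))  # 0.05 -> almost touching the wall, nearly invisible
print(soft_particle_alpha(100.0, 200.0))  # 1.0  -> plenty of space behind, fully visible
```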
All
If you have a PC version of the new Tomb Raider or Walking Dead: Survival Instinct that you don't need, I would love to check out some stuff in these games. I already bought Metal Gear Rising and Dead Space 3 to make these articles, but it's getting a bit expensive.
Decimation Master doesn't really work that well if you go below 10%.
(I guess that's why it's called DECImation Master, heh...)
Going any lower than the current polycount, 4,685 (even after re-analyzing/decimating), destroys that image entirely. And this is just the flat Plane3D with PolyPaint importance set to 100% in Decimation Master's settings.
Cool stuff though. Probably with some tweaking on more practical meshes it will work out nicely. It is necessary to have a pretty blurry/dreamy texture; sharper stuff will not work nearly as well.
Not as far as I'm aware. I don't think any of the Mudbox versions support vertex painting, and up until recently their decimation/retopo tools have been severely lacking compared to ZBrush's.
Just because the translated variations of the textures are in the game data doesn't mean they don't use a system for auto-translation. They could have a simple tool for designing localized textures by compositing existing textures and font-based text, where the text would be translated along with all the other things that get translated. The tool could offer texture/font layer transformation, masking with other textures, and blending modes, which would give you enough control to make nice-looking textures but still be able to swap out the text automatically. A system like this would probably build all of the needed variations during a full build instead of doing it at runtime on the user's computer, which could either be slow or have compatibility issues if it's hardware accelerated.
The graffiti looks like it's probably handmade in Photoshop, but the other standard font stuff looks like a good candidate for automatic translation.
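A build-time tool like the one commander_keen describes could be sketched in a few lines; this is purely illustrative (the file names, strings and layout are made up), using Pillow to stamp translated text onto a shared base texture, one output per locale:

```python
from PIL import Image, ImageDraw, ImageFont  # Pillow, used here only for illustration

STRINGS = {"en": "EXIT", "de": "AUSGANG", "fr": "SORTIE"}  # would come from the game's string tables

def build_localized_signs(base_path, font_path, out_pattern):
    base = Image.open(base_path).convert("RGBA")
    font = ImageFont.truetype(font_path, 48)
    for locale, text in STRINGS.items():
        sign = base.copy()
        ImageDraw.Draw(sign).text((32, 32), text, font=font, fill=(255, 255, 255, 255))
        sign.save(out_pattern.format(locale))  # one texture per language

# build_localized_signs("sign_base.tga", "stencil.ttf", "sign_{}.tga")
```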
When I worked at Cryptic there was a tool in the editor for doing this.
^That reminds me of Allegorithmic's Substance procedurally generated materials/textures.
It's also interesting if you tie it back to how this can be used to save space, especially considering how many Blizzard games let you download just a small portion of the game and start playing while the rest downloads in the background in the order it's needed, essentially streaming the game...
commander_keen
Yeah, that's right. I asked Julian and I hope to hear back from him about how the SC2 team did it. Thanks for your comment, really interesting to hear!
Computron
That's right. Really impressive how they stream the content while playing
I was just talking to my colleague about that effect last week! I was convinced the artists just placed a cone around the lights with vertex colour and alpha, but I never noticed that it actually got occluded. Very nice, I like the lo-fi appeal of the gif you posted.
Transparent fresnel in a vertex offset shader, maybe with multiple passes?
The terrain could be a 3D noise texture, colored by world axis and mixed with the view direction?
Hmm, I want to make it...
Does anybody know of tools that can tessellate arbitrary models this way? My idea is to bake lightmaps into vertex colors like this. It would be a great performance and resource saver for mobile graphics.
MightyPea
Thanks for your reply! Oh, the effect video you posted is sick! I like it very much... why haven't we seen something like this in more games? Hm, maybe because there aren't that many space games... but it looks really cool!
cupsster
I have no idea what you're talking about. But if you want to explain it with some more words, I would love to read it.
diamond3
I'm not sure if this is necessary. I mean, you can bake lighting information into vertex colors with 3ds Max, and for basic stuff this is totally fine. What you want is to take the final geometry (with UVs already done) and re-organize the geometry depending on the lighting info, but without losing the UVs and without losing the general form of the original shape. That sounds crazy complicated.
What I could imagine is that you just invest some more geometry detail (like in Stalker, where a lot of geometry was used) and bake the lighting into the vertices.
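As a rough sketch of the baking step itself (not any particular tool's API, just the idea): sample the baked lightmap at each vertex's lightmap UV and store the result as that vertex's color.

```python
def lightmap_to_vertex_colors(vertex_uvs, lightmap):
    """vertex_uvs: one (u, v) lightmap coordinate in 0..1 per vertex.
    lightmap: H x W x 3 nested lists (or an array) with the baked lighting."""
    h, w = len(lightmap), len(lightmap[0])
    colors = []
    for u, v in vertex_uvs:
        x = min(int(u * (w - 1)), w - 1)  # nearest sample; a real tool would filter
        y = min(int(v * (h - 1)), h - 1)
        colors.append(lightmap[y][x])
    return colors
```

The sharp-shadow problem discussed below is exactly the limitation of this: the result can only be as sharp as the vertex density allows, hence the idea of cutting extra edges along the shadow boundary.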
Does anybody know of tools that can tessellate arbitrary models this way? My idea is to bake lightmaps into vertex colors like this. It would be a great performance and resource saver for mobile graphics.
I tried it in 3ds Max but couldn't find a nice workflow. I was baking GI to radiosity and then extracting the radiosity mesh with a script. I tested this on mobile; it works, but it needs workflow optimization and better shaders on mobile.
For your purpose just use radiosity and mesh decimation. A special algorithm that can cut the mesh along the shadow umbra and subdivide the resulting cut would be handy... Any MAXScript guru here?
UPDATE diamond3
I hope you could understand what I wrote.
But in games where the geometry you can use is very limited, this would lead to blurry lighting. The method that diamond3 described could give you very sharp shadows without a lot of geometry.
I've seen something like this before in some mobile games. God of War on PS2 uses something like this too. You can see that it's vertex-lit, and you can notice that they made arbitrary cuts into the geometry to allow such sharp shadows. A way of automating that would be great, and I'd be interested too!
Under Modify > Convert, Maya has a function called "Texture to Geometry" that lets you cut geometry according to an input texture. I've used it to do kind of what you guys are talking about, by taking lightmap textures into Photoshop and applying a posterize filter with at least 3 levels. Using that as an input I got some pretty nasty results, but after a bunch of manual cleanup they were actually usable. I wonder if you could combine this with that ZBrush trick.
Replies
For testing it yourself:
Photoshop-created texture (.tga)
Maya-created texture (.tga)
You need to set the smoothness of the gradient to 0%.
YOU ARE A FOX! It works! THX! Awesome! Here's an example:
And if you don't know where this setting is, here you go:
#28 007 Legends - The World
http://dl.dropbox.com/u/16703380/scrollingUV/scrollingUV.html
No progress with the vertex-colored skybox. I think it just took some really skilled artists to create that effect.
NEW
#29 Homeworld 2 - Engines
No idea, but I'm very curious about them whites.
Pretty awesome! Julian Love gave us great information about the bubbles. With shader code examples!
#27 Diablo 3 - Resource Bubbles
Tamarin
I added your demo link to the 007 article. Thanks again!
#30 Homeworld 2 - Hyperspace
http://www.thatgamecompany.com/forum/viewtopic.php?p=11191#p11191
http://www.thatgamecompany.com/forum/viewtopic.php?p=11274#p11274
http://advances.realtimerendering.com/s2012/thatgamecompany/SandRenderingInJourney_thatgamecompany.pptx
That Homeworld effect is probably achieved using clip planes (a programming thing, not so much an art trick).
For the white glow/outline thingy: it might be similar to how water shaders fade at intersecting geometry, using the z-depth trick.
#31 Bioshock - Glossiness
#32 Starcraft 2 - Localization
Is this possible with Mudbox?
#33 Doom 3 - Volumetric Glow
I came across this video and love the effects in there to death:
https://www.youtube.com/watch?v=dNm0l1L8O1A
Particularly the kind of effect on things seen at an angle, where it looks like a heat haze or something.
I added a link to my 3ds Max > Doom 3 modding nodes and a link to a Reddit discussion about the article.
Maybe not in this thread, but of course in the tech section. Maybe open a new topic? And feel free to post your results here.
Yes, that sounds interesting...
Thanks for sharing
#34 Doom 3 - HDUI
equil
THX for the tip!
bobmartien
Thanks man!
#35 Scribble Cel
Hi guys, I just want to mention that I created a Twitter account for the blog, so if you want to connect, click click:
NEW
#36 Lego Batman - Crawler