Are there any tools that will let me take the RGB data and make a DXT1 with no alpha? I thought the only difference between the formats was the full greyscale alpha in the DXT5, so in theory, if something could strip away that channel, it would work as a DXT1? Glancing over the NVIDIA Texture Tools command line, I don't see a command like this.
Replies
But recompressing shouldn't change it drastically anyhow; they use the same basic RGB compression. Just load the DXT5 in Photoshop, ditch the alpha, and save as DXT1.
No, this is not how DXT compression works.
More info: http://en.wikipedia.org/wiki/S3_Texture_Compression
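Roughly, the difference at the block level (just a sketch to illustrate the layouts, not a working converter; the function name is made up): DXT1 stores each 4x4 pixel block in 8 bytes (two RGB565 endpoint colours plus 2-bit indices), while DXT5 stores 16 bytes per block - 8 bytes of alpha data followed by an 8-byte colour block in the same layout as DXT1. The catch is that DXT1 switches a block into 3-colour + transparent mode when colour0 <= colour1, which DXT5 colour blocks never do, so just copying the colour halves across can decode differently:

```python
import struct

def dxt5_colour_halves_to_dxt1(dxt5_blocks: bytes) -> bytes:
    """Copy the 8-byte colour half out of each 16-byte DXT5 block.

    Illustration only - a real converter would re-encode blocks where the
    endpoint ordering would flip DXT1 into its 3-colour + transparent mode.
    """
    assert len(dxt5_blocks) % 16 == 0
    out = bytearray()
    for off in range(0, len(dxt5_blocks), 16):
        # bytes 0-7:  alpha endpoints + 3-bit alpha indices (what gets "stripped")
        # bytes 8-15: colour block, same layout as a DXT1 block
        colour_block = dxt5_blocks[off + 8 : off + 16]
        c0, c1 = struct.unpack_from("<HH", colour_block)
        if c0 <= c1:
            # DXT1 would decode this block in 3-colour mode (index 3 becomes
            # transparent black), which DXT5 never does - so a naive copy
            # isn't guaranteed to look the same.
            pass
        out += colour_block
    return bytes(out)
```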
You're going to have to re-save it; saving from the DXT5 to DXT1 is probably gonna lose a little more quality, but hopefully not too much. I would definitely re-save from the original PSD or a lossless image if possible.
http://www.3dfergy.com/3.html
Thanks for the responses.
http://developer.nvidia.com/object/photoshop_dds_plugins.html
Our engine automatically converts TGA files to the appropriate file format at run-time so we never even have to deal with DXT compression. Everything gets flagged as the correct type and compressed accordingly - LATC for normal-maps, DXT5 for specular with gloss in alpha channel, DXT1 for diffuse only or DXT5 if the diffuse has opacity in the alpha channel.
It seems like Unreal tech works in a similar way - you load in TGAs and they get compressed accordingly (and in both engines you get the option to specify texture flags which affect how they're compressed).
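Roughly, the flag-to-format decision boils down to something like this (a minimal sketch; the flag names and mapping are made up for illustration, not pulled from either engine):

```python
# Hypothetical texture flags -> compression format mapping, mirroring the
# pipeline described above. None of these names come from a real engine.
def pick_format(usage: str, has_alpha: bool) -> str:
    if usage == "normal":
        return "LATC"              # two-channel format for normal maps
    if usage == "specular" and has_alpha:
        return "DXT5"              # gloss packed into the alpha channel
    if usage == "diffuse":
        return "DXT5" if has_alpha else "DXT1"  # opacity decides the format
    return "DXT1"                  # sensible default for plain colour maps

# Example: a diffuse TGA with no alpha channel gets flagged as DXT1.
print(pick_format("diffuse", has_alpha=False))  # -> DXT1
```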
I always found this pretty nice since you never deal with compressed textures yourself, it just happens magically in the background. Are there any real upsides to directly using DDS?
The only thing I can think of is that you get to edit mip-maps manually if you need to. Personally I have never had to do this, but I've heard some people occasionally need to edit mip-maps for quality reasons.
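If you did need to hand-edit mips, one low-tech approach is just to dump each level out as its own file before compression and edit those - a rough sketch using Pillow (assuming it's installed; the file naming is arbitrary):

```python
from PIL import Image  # assumes Pillow is available

def dump_mip_chain(path: str) -> None:
    """Write each mip level of an image out as its own file for hand-editing."""
    img = Image.open(path)
    w, h = img.size
    level = 0
    while True:
        # Resample down to this mip's size and save it as a separate TGA.
        img.resize((w, h), Image.LANCZOS).save(f"{path}.mip{level}.tga")
        if w == 1 and h == 1:
            break
        w, h = max(1, w // 2), max(1, h // 2)
        level += 1

dump_mip_chain("diffuse.tga")  # hypothetical input file
```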
I hadn't considered the memory-in-application thing; since I'm a character/weapon/item guy at work, I tend to deal with scenes that only contain a few textures (a bunch of 1k-2k diffuse, normal, spec, etc.), so texture memory and load time is never really a problem.
Good point about getting to see the compressed image in the 3D app, although in our pipeline we tend to view our stuff directly in the game's model viewer (everything gets exported and processed immediately, so reloading the current model & textures only takes a fraction of a second in most cases), and we get to see it using the compressed textures at that point, with all the post-processing and colour correction from whatever game atmosphere is applied. We can't do that in Maya without writing a custom OpenGL renderer for the viewport; the closest you can get is a .fx shader (which we have, and it works fairly well, but it's still not a 1:1 correspondence with what you will see in-game with all the bloom/tinting/depth-of-field etc. going on).
Everything gets exported from Maya and Photoshop using custom scripts and the game detects changed files to auto-reload them, so it's a pretty fast turnaround time (hit F8 in PS to save your layer groups to separate TGA files, then alt-tab to the game's model viewer and they auto-update and compress on the fly).
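The auto-reload part of that is really just watching file timestamps - a tiny sketch of the idea (a simple polling loop with made-up names, not our actual tool code):

```python
import os
import time

def watch_tgas(folder: str, on_change) -> None:
    """Poll a folder of TGAs and call on_change(path) whenever one is re-saved."""
    seen = {}
    while True:
        for name in os.listdir(folder):
            if not name.lower().endswith(".tga"):
                continue
            path = os.path.join(folder, name)
            mtime = os.path.getmtime(path)
            if seen.get(path) != mtime:
                seen[path] = mtime
                on_change(path)  # e.g. recompress the TGA and tell the viewer to reload
        time.sleep(0.25)

# watch_tgas("textures/", lambda p: print("recompress + reload:", p))
```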
We don't tend to build large levels in Maya alone, we have the in-game editor for that, so again texture memory isn't a problem as it's all streamed and managed by the game itself rather than Maya or Max (which always seem to be hugely inefficient when it comes to managing texture memory!).
Interesting points though, I guess it really depends on the sort of work you're doing and what tech you're working with, which will determine the best workflow.
jk
Mop, yeah, my whole career I've been creating environments in Maya or Max as the level editor rather than an external editor/viewer, so you guys are probably way more used to what-you-see-is-what-you-get. All we see in Maya is diffuse textures and our blend shader; lighting, post FX, and atmosphere all need to be viewed in game.