I'm just thinking - if using the compressed texture override is necessary for use of this mod, why not look into ways of making the game make the most of it...
It's very difficult. No source code.
Yeah, I suspected as much. It's a shame - I think 1C/Maddox Games would do well to follow the example set by Volition and simply publish the source code while retaining the IP rights. That would give the modding community much more freedom while still keeping profit coming in from the game.
Newer ATI cards' 3Dc compression algorithm automatically compresses normal maps, no? So if one uses an ATI card, there's no need for the game to compress.
It doesn't work that way.
3Dc compression is based on DXT5, but instead of storing RGB plus alpha, it stores two independently compressed channels. This makes it ideal for (tangent-space) normal maps, which only use two channels.
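To make that concrete, here's a small Python sketch (my own illustration, not anything from IL-2) of the storage arithmetic: both DXT5 and 3Dc (ATI2) pack each 4x4 texel block into 16 bytes, but 3Dc spends both 8-byte halves on two independently compressed single channels, instead of one alpha half plus one RGB half.

```python
# Per-block storage of DXT5 vs 3Dc (ATI2) for 4x4 texel blocks.
# Both use 16 bytes per block, so their total sizes are identical;
# 3Dc just allocates those bytes as two single-channel halves (X and Y).

def blocks(width, height):
    """Number of 4x4 blocks covering a width x height texture."""
    return ((width + 3) // 4) * ((height + 3) // 4)

def dxt5_bytes(width, height):
    # DXT5 block: 8 bytes interpolated alpha + 8 bytes colour endpoints/indices
    return blocks(width, height) * (8 + 8)

def ati2_bytes(width, height):
    # 3Dc / ATI2 block: two 8-byte DXT5-alpha-style channels (X and Y)
    return blocks(width, height) * (8 + 8)

print(dxt5_bytes(1024, 1024))  # 1048576 bytes = 1 MiB for a 1024x1024 map
```

Same footprint either way; the win with 3Dc is quality, since each normal-map channel gets its own full compression block rather than sharing one RGB block.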
However, the fact that a GPU supports a new format does not mean that it will somehow know to convert arbitrary texture data to that format before using it. The compression needs to happen before the texture is sent to the GPU: if it's uncompressed, the GPU treats it as a plain bitmap; if it's compressed and the GPU recognizes the format, it can use it directly.
IL-2 is an old game from a time when DXT compression was a new technology. That's why it is referred to by its old/original name - S3 Texture Compression, or S3TC. If that is enabled, the game applies a compression algorithm to all (appropriate) textures and sends the compressed textures to the GPU for rendering.
If texture compression is not enabled, it just sends the textures as their datatype defines them. Depending on how wisely the program is coded, it might even change the texture type for 8-bit textures when it encounters them. It can also decrease colour depth to 16-bit (actually 12-bit RGB, 4-bit alpha), which probably reduces the memory footprint as well, but S3TC/DXT compression is more memory-efficient.
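For comparison, a rough footprint calculation for one 1024x1024 texture at the formats mentioned above (a sketch of my own; single mip level only, and the 16 bpp row assumes the 12-bit RGB + 4-bit alpha packing just described):

```python
# Rough VRAM footprint of one 1024x1024 texture at different storage rates.
# Mipmap chains (which add roughly a third on top) are ignored for simplicity.

def raw_bytes(width, height, bits_per_texel):
    """Storage in bytes for a texture at a given bits-per-texel rate."""
    return width * height * bits_per_texel // 8

W = H = 1024
rgba8 = raw_bytes(W, H, 32)   # full 32-bit colour: 4 MiB
rgb4a4 = raw_bytes(W, H, 16)  # 12-bit RGB + 4-bit alpha: 2 MiB (halved)
dxt5 = raw_bytes(W, H, 8)     # DXT5: 16 bytes per 4x4 block = 8 bits/texel: 1 MiB
print(rgba8, rgb4a4, dxt5)    # 4194304 2097152 1048576
```

So dropping to 16-bit colour halves the footprint, while DXT5 quarters it - which is why the compressed path wins on memory even before considering DXT1.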
However, it is simply impossible for a GPU to independently determine that some texture is a normal map and, based on that diagnosis, apply an advanced, normal-map-specific compression format. That would require a level of communication between the software, the graphics API, and the GPU driver that IL-2 most likely does not have. There would need to be some sort of identification flag telling the graphics API that the texture is intended to be used as a normal map, along with a request to compress it on the fly - which is another thing that GPUs themselves don't do! - before loading it into VRAM.
For me, with TexCompress=2 and TexFlags.TexCompressARBExt=1, I get unwanted artifacts, most noticeable on a/c skins. With both set to 0, true colour on my 2048 skin returns. Just thought I'd mention that. But please correct me if I'm wrong.
I am not surprised that texture compression causes artefacts; that is by design, although, like I said in an earlier post, the settings of the on-the-fly compression algorithm are probably optimized for compression speed rather than quality. Remember, though, that increasing resolution decreases the apparent size of the artefacts and masks them more effectively among the other details of the texture.
Most likely TexCompress=2 is only necessary if you are exceeding your GPU's VRAM capacity; when it needs to swap the memory contents several times per frame, you start getting slowdowns. Activating TexCompress and TexCompressARBExt can reduce texture memory use by 75-83%. If you can manage without compression, I don't think it's strictly necessary for functionality.
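That 75-83% range checks out arithmetically: DXT5 against 32-bit RGBA is a 4:1 ratio (75% saved), and DXT1 against 24-bit RGB is 6:1 (about 83%). A quick sketch:

```python
# Verify the quoted 75-83% memory reduction for DXT-compressed textures.

def reduction_percent(raw_bpp, compressed_bpp):
    """Memory saved by compression, as a percentage of the raw size."""
    return 100.0 * (1.0 - compressed_bpp / raw_bpp)

print(reduction_percent(32, 8))  # DXT5 (8 bpp) vs RGBA8888 -> 75.0
print(reduction_percent(24, 4))  # DXT1 (4 bpp) vs RGB888   -> ~83.3
```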