I've altered Atmosphere.class to apply a varying rate of temperature decrease with altitude based on the surface temperature. In other words, I implement a varying environmental lapse rate. Formerly, the same temp decrease per unit of altitude increase was applied no matter the surface air temperature. This caused the temp at 7,000m (where condensation trails form) to vary from +5C to -56.5C (the lowest permitted temp) as the surface temp ranged from +50C down to -11C.
My scheme has a steeper temp decrease for hot maps and a more gradual temp decrease for cold maps. The stock lapse rate is retained for a surface temp of +15C. And so at the 7,000m condensation alt, the temp ranges from -5C to -56.5C as the surface temp ranges from +50C down to -28C.
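The idea can be sketched roughly as below. This is my own illustrative reconstruction, not the actual decompiled Atmosphere.class code: the class/method names are hypothetical, and the linear scaling of lapse rate with surface temp (the `k` factor) is an assumption chosen to match the +15C and +50C figures quoted above; the real curve is evidently nonlinear, since no single linear factor also reproduces the -50C example further down.

```java
// Illustrative sketch only; names and the linear interpolation are assumptions.
public final class LapseRateSketch {
    static final double MIN_TEMP_C = -56.5;    // lowest permitted temp
    static final double STOCK_LAPSE = 6.5e-3;  // stock lapse rate, degC per metre

    // Stock scheme: fixed lapse rate regardless of surface temp.
    static double stockTemp(double surfaceTempC, double altM) {
        return Math.max(MIN_TEMP_C, surfaceTempC - STOCK_LAPSE * altM);
    }

    // Variable scheme: lapse rate scales with surface temp, matching the
    // stock 6.5 C/km at +15C. Hotter maps cool faster with altitude,
    // colder maps more slowly. k is an assumed sensitivity (degC/km per degC).
    static double variableLapse(double surfaceTempC) {
        double k = 0.04;
        return (6.5 + k * (surfaceTempC - 15.0)) * 1e-3; // degC per metre
    }

    static double variableTemp(double surfaceTempC, double altM) {
        return Math.max(MIN_TEMP_C, surfaceTempC - variableLapse(surfaceTempC) * altM);
    }
}
```

With this toy scaling, a +50C surface gives a 7.9 C/km lapse and about -5C at 7,000m, while a +15C surface reproduces the stock profile exactly.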
To put it in a bit better perspective:
- At 7,000m, for a surface temp of +50C, my scheme's temp is 10C colder than stock (-5C vs. +5C).
- For a surface temp of -50C, the stock height at which the temp falls to the minimum permitted -56.5C is 1,000m; my scheme reaches that same -56.5C at 3,000m.
Now, whether the stock scheme or my modified one is in effect, I feel that given the huge range of possible surface temps, the height at which condensation trails occur should vary at least somewhat; in colder weather the condensation alt can be made lower. However, if the threshold height tracked the dynamically determined surface temp during play, contrails could switch on and off as the surface temp drifts over the course of a longer mission. So the in-play surface temp would not be a wise variable on which to base the threshold height. Rather, some other more constant temp, read or calculated from a map's load.ini, would be safer to use.
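To make the suggestion concrete, here is one way the threshold could be derived once at mission load from a fixed map temp. Again, this is a hypothetical sketch: the class name, the shift of 50m per degree, and the clamping bounds are all my assumptions, not anything taken from the game code.

```java
// Hypothetical sketch: compute the contrail threshold altitude once, from a
// constant per-map surface temp (e.g. read from load.ini at mission load),
// rather than from the dynamically varying in-play surface temp.
public final class ContrailThreshold {
    static final double BASE_ALT_M = 7000.0;  // stock contrail altitude
    static final double BASE_TEMP_C = 15.0;   // reference surface temp
    static final double ALT_PER_DEG = 50.0;   // assumed metres of shift per degC

    // Lower the threshold on cold maps, raise it on hot ones, within bounds.
    static double thresholdAltM(double mapSurfaceTempC) {
        double alt = BASE_ALT_M + ALT_PER_DEG * (mapSurfaceTempC - BASE_TEMP_C);
        return Math.max(4000.0, Math.min(9000.0, alt));
    }
}
```

Because the input is a constant for the whole mission, the threshold never moves mid-flight, which avoids the on/off flickering described above.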