Interlacing would require taking the odd lines for the left eye view and the even lines for the right eye view, then interleaving them into a single frame (assuming the odd/even = left/right convention applies for those monitors/TVs that can read interlaced stereo images like that).
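Just to illustrate what I mean, here's a minimal sketch of that row-interleaving in C. It assumes both eye views are already rendered into flat buffers of the same size; the function name and parameters are made up for illustration, not any real API:

```c
#include <stdint.h>
#include <string.h>

/* Hypothetical sketch: build one row-interleaved frame from two eye views.
   Odd rows come from the left-eye buffer, even rows from the right-eye
   buffer, matching the odd/even = left/right convention assumed above. */
void interlace_stereo(const uint8_t *left, const uint8_t *right,
                      uint8_t *out, int width, int height, int bytes_per_pixel)
{
    size_t row_bytes = (size_t)width * bytes_per_pixel;
    for (int y = 0; y < height; y++) {
        /* Row 0 counts as even here; flip the test if the display
           expects the opposite convention. */
        const uint8_t *src = (y % 2 != 0) ? left : right;
        memcpy(out + (size_t)y * row_bytes,
               src + (size_t)y * row_bytes, row_bytes);
    }
}
```

Note that each eye only ever contributes half the rows, which is exactly where the resolution loss below comes from.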
It would, however, mean losing half the vertical resolution, and with a game like IL-2 that could be disastrous: planes are often small enough to appear as a single pixel, which would then be visible to one eye only. That would be very distracting. For just flying around it would probably work, but I wouldn't expect very good results for gaming purposes.
However, if it were possible to make an interface that outputs a left eye framebuffer and a right eye framebuffer, it might be possible to build a true 3D interface. Instead of applying a colour filter to the images and then combining them into a single frame, it would just pass the images to the 3D display in whatever format the device can handle; whether that's two separate video streams or one stream containing two images per frame, I don't know. It would be worth looking into regardless, especially since 3D-capable monitors and televisions are becoming more and more popular.
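For the "two images per frame" case, one common layout is side-by-side packing into a double-wide frame. Here's a hypothetical sketch of that, again with made-up names and no claim about what any particular display actually accepts:

```c
#include <stdint.h>
#include <string.h>

/* Hypothetical sketch: pack left/right eye frames into one double-wide
   side-by-side frame, a common "two images per frame" stereo layout.
   The output buffer must hold width * 2 * height pixels. */
void pack_side_by_side(const uint8_t *left, const uint8_t *right,
                       uint8_t *out, int width, int height, int bytes_per_pixel)
{
    size_t row_bytes = (size_t)width * bytes_per_pixel;
    for (int y = 0; y < height; y++) {
        /* Left view fills the left half of the output row... */
        memcpy(out + (size_t)y * 2 * row_bytes,
               left + (size_t)y * row_bytes, row_bytes);
        /* ...right view fills the right half. */
        memcpy(out + (size_t)y * 2 * row_bytes + row_bytes,
               right + (size_t)y * row_bytes, row_bytes);
    }
}
```

Unlike the interlaced version, this keeps the full resolution of both eye views, which is why it would be so much nicer for a game where targets can be a single pixel.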
Anaglyph stereo is great, but it would be a quantum leap forward to have support for real full-colour 3D, regardless of what technique the device uses.
However, I'm completely unsure whether it would even be possible to do this. I don't know if programs such as DeviceLink can do this kind of thing.