My rig is a 2019-vintage i5 9600 overclocked to 4.8 GHz, with an RTX 2060 GPU driving a 2560x1440, 144 Hz monitor.
Thus far I've always run at the maximum refresh rate, to take advantage of the 144 Hz screen. But I came to dislike the large variability in frame rate, which depends on graphical load and the computation of heavier action. It made even 60 FPS stretches feel comparatively 'choppy', simply by contrast with the >2x faster frame rates under light loads. On top of that, the jumping frame rate seemed to affect TrackIR update rates, with brief periods of what looked like 30 updates/sec despite ~60 FPS performance.
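That 'choppy' contrast is easier to see in frame-time terms: a drop from ~144 FPS to 60 FPS more than doubles the time each frame stays on screen, which the eye reads as stutter even though 60 FPS is smooth in isolation. A quick sanity-check of the arithmetic (the numbers here are just the FPS figures mentioned above):

```python
def frame_time_ms(fps):
    """Time each frame is displayed, in milliseconds, at a given frame rate."""
    return 1000.0 / fps

print(f"144 FPS -> {frame_time_ms(144):.1f} ms per frame")  # ~6.9 ms
print(f" 60 FPS -> {frame_time_ms(60):.1f} ms per frame")   # ~16.7 ms
print(f" 30 FPS -> {frame_time_ms(30):.1f} ms per frame")   # ~33.3 ms
```

So bouncing between light-load and heavy-load scenes swings the per-frame interval by roughly 10 ms, whereas a 60 FPS cap holds it near a steady 16.7 ms.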
And so I tried an experiment earlier this evening. The image below shows all my Nvidia Control Panel settings (three screenshots stitched together, since the window the settings appear in can't be resized). Of particular note are the two frame rate settings, both of which I've set to 60 FPS. (The Background Application Max Frame Rate setting may play no role here, unless it applies to the TrackIR app running concurrently?)
Right away I rather liked this. The far more consistent frame rate, which the console reports as fluctuating between 59 and 61 (unless a heavy graphical/computational load drags it lower), is much more enjoyable. And TrackIR performance does seem that bit more consistent as well.
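For anyone curious what a frame-rate limiter like this is doing under the hood: conceptually it just sleeps off whatever is left of the frame budget after the render work finishes, so light-load frames no longer race ahead of heavy ones. A minimal sketch of that idea (this is an illustration of the general technique, not Nvidia's actual implementation, and the 4 ms "work" figure is an invented stand-in for a lightly loaded frame):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame at 60 FPS

def run_frames(n_frames, simulate_work):
    """Run n_frames, sleeping off any unused budget so no frame finishes early."""
    frame_times = []
    for _ in range(n_frames):
        start = time.perf_counter()
        simulate_work()  # stand-in for render + game logic
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)  # cap the frame rate
        frame_times.append(time.perf_counter() - start)
    return frame_times

# Light load: ~4 ms of "work" per frame, padded out to the ~16.7 ms budget,
# so the effective rate lands near 60 FPS instead of running away to 144+.
times = run_frames(30, lambda: time.sleep(0.004))
avg_fps = len(times) / sum(times)
print(f"average FPS: {avg_fps:.1f}")
```

Heavy-load frames that blow past the budget simply aren't padded, which matches the observed behavior of the cap: steady 59-61 under light load, dipping lower only when the scene genuinely costs more.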