On Refresh rates and FPS.
Toll last edited by
I stumbled across the following from the AnandTech forums:
It basically goes into a bit of detail on frame sync times, FPS and input lag. There was some interesting stuff in there that I hadn’t considered before. Chiefly, setting your FPS cap to a number that is not a multiple of your monitor’s refresh rate, e.g. set it at 73 for a 60 Hz monitor or at 157 for a 120 Hz monitor. You basically set it as high as possible, whilst also keeping it offset from your monitor’s refresh rate. It also mentioned using a 1000 Hz mouse.
This is with Vsync turned off to minimize perceived input lag.
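As a rough sketch of that cap-picking advice (the helper name `pick_fps_cap` and the "stay at least a few fps away from any multiple of the refresh rate" threshold are my own illustrative assumptions, not from the article):

```python
def pick_fps_cap(display_hz, sustain_fps):
    """Highest cap <= sustain_fps that stays a few fps away from any
    multiple of display_hz, so tearlines keep drifting instead of
    sitting still. The margin of 3 fps is an arbitrary illustration."""
    cap = sustain_fps
    # distance from the nearest multiple of the refresh rate
    while min(cap % display_hz, display_hz - cap % display_hz) < 3:
        cap -= 1
    return cap

print(pick_fps_cap(120, 240))  # 237: 240 is exactly 2x120, so back off
print(pick_fps_cap(60, 73))    # 73: already well offset from 60
```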
Taken from here.
Allow me to introduce myself as the author of The Blur Busters Blog, which covers LCD refreshing in depth; that background is important for a good understanding of how tearing occurs. Quote:
True, but flaky rendering automatically means more visible tearing. Read below to see why.
Just wanted to chime in with my experience about VSYNC OFF and how to reduce tearing problems, because I am very sensitive to tearing even when fps > Hz.
| What if we still had 60 FPS, but used a 120 Hz screen instead? The late frames do not have to wait for 16.7 ms but rather 8.3 ms and this improves smoothness quite a bit; we get back 59 effective FPS. So a user switching to a 120 Hz screen will say motion is smoother even at 60 FPS, because his new screen displays 59 FPS instead of 52. |
If it were a perfect 60fps@120Hz, then it only has less input lag; motion is not smoother. However, microstutters last only 8.33ms rather than 16.7ms.
Also, some VSYNC OFF considerations:
How does an LCD refresh? How does it relate to understanding tearing?
I have two high speed videos of how an LCD refreshes – an old 2007 LCD and a new 2012 LightBoost LCD. I also have a page that shows a high speed video comparison between LCD and CRT. These are helpful in understanding how displays refresh top-to-bottom – and give an understanding of how tearing occurs, because of the top-to-bottom scanout from the computer to the display, especially when multiple frames are essentially visually “spliced” into the same refresh (causing the tearline effect).
Tearing Interaction between fps and Hz (Harmonic Frequencies)
Another consideration: if you have a powerful GPU that runs capped out at a game’s framerate limit (e.g. Source engine games with a configurable fps_max), then when you play with VSYNC OFF, you really do not want a frame cap (fps_max) that creates harmonic frequencies between fps and Hz – for example, an fps_max of 60, 120 or 180 on a 60Hz display, especially with a powerful graphics card (e.g. a Titan). The reason is that you will get nearly-stationary or slowly-moving tearlines, as the splice of the previous frame cuts into the next frame at fairly synchronized intervals. For example, I can see two very clear (nearly-stationary) tearlines during fast turns in an older Source Engine game (without AA) when I configure it to fps_max 240 on my 120 Hz display, because my GTX 680 can easily run capped-out at 240 fps. Likewise, you will have more visible tearing on a 60 Hz display if you cap at 59/60/61 (one persistent tearline), at 119/120/121 (two persistent tearlines) or at 179/180/181 (three persistent tearlines), assuming your GPU is powerful enough to always run fully capped out at the fps_max. Instead, you want the tearing to faintly and randomly wander all over the place rather than sit obnoxiously stationary in the middle of your screen. Uncapping (e.g. fps_max 999), or setting an odd, non-harmonic value as high as you can run microstutter-free (e.g. fps_max 317), can significantly reduce the appearance of tearing.
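The stationary-tearline harmonic is easy to verify with a toy simulation (a sketch that assumes perfectly even frame pacing, which real games never quite achieve):

```python
from fractions import Fraction

def tearline_positions(fps, hz, refreshes=4):
    """Scan position (0 = top of screen, as a fraction of screen height)
    at which each new frame splices into the ongoing refresh.
    Uses exact rational arithmetic to avoid float drift."""
    period = Fraction(1, hz)      # duration of one top-to-bottom scan
    frame_dt = Fraction(1, fps)   # time between frame flips
    positions = []
    t = Fraction(0)
    while t < refreshes * period:
        positions.append(float((t % period) / period))
        t += frame_dt
    return positions

# 240 fps on 120 Hz: tearlines sit at the same two spots every refresh
print(tearline_positions(240, 120))  # [0.0, 0.5, 0.0, 0.5, ...]
# 317 fps on 120 Hz: the splice points wander, so no stationary tearline
print(tearline_positions(317, 120))
```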
How does microstutter increase tearing?
Microstutters include abnormal moments of longer frame render times. Longer frame render times mean bigger offsets during the mid-refresh “splices” between frames (the next frame being output immediately, without waiting for the top-down refresh to finish first). Bigger offsets mean more visible tearing.
Eliminate all weak links that make tearing more visible
Stutters give opportunity for bigger-offset tearlines.
Inconsistent frame rendertimes give opportunity for bigger-offset tearlines.
More powerful CPU means less chance for microstutters to happen.
Faster SSD will reduce game-load-related stutters.
Better GPU will reduce stutters.
1000 Hz mouse will reduce stutters.
Whatever you do, eliminate ALL your weak links as much as you can.
Vsync ON vs Adaptive vs OFF
Love it or hate it, it’s worth mentioning that adaptive VSYNC has less average input lag than VSYNC ON, but more average input lag than VSYNC OFF – a compromise if you’re very sensitive to tearing. It can in fact have roughly similar input lag to VSYNC OFF with an fps_max cap roughly equal to Hz; adaptive VSYNC simply synchronizes smartly on the fly (“pushes” the tearline just off the edge of the screen), making it look like VSYNC ON. Competition gamers still prefer VSYNC OFF, but it’s worth keeping in mind if you play solo or you really, really, really hate tearing. It isn’t the solution if you’re a pro/competitive gamer.
But if you’re going to cap at fps=Hz with VSYNC OFF anyway, then you might as well use “Adaptive VSYNC” instead – there’s virtually no difference in input lag (in a properly designed implementation) between Adaptive and an fps=Hz framecap (e.g. fps_max 60 combined with VSYNC OFF on a 60 Hz display). A good Adaptive VSYNC implementation simply pushes the tearline to the top/bottom edge of the screen whenever fps matches Hz, making the tearline invisible without adding any further unnecessary latency. It’s worth knowing this little detail that makes it less evil than forced double buffering…
Less input lag can occur with fps_max far beyond Hz
Running at 300fps on a 60Hz or 120Hz display is beneficial because it has less input lag. Fresher frames are delivered (and spliced into the existing refresh during the existing top-to-bottom scan). So running fps at 3x the refresh rate (e.g. 180fps at 60Hz), you have 3 subsections of frames, looking as if spliced on top of each other – the top third being the oldest frame (rendered 3/180sec ago), the middle third the 2nd oldest (rendered 2/180sec ago), and the bottom third the newest frame (rendered just 1/180sec ago). Give or take, the positions of the tearlines can vary, depending on the timing of frame renders relative to the vertical blanking interval (the pause between frames – on old analog TVs, that’s the black bar you see when the VHOLD adjustment is bad and the picture is rolling). So having a massive framerate far beyond Hz benefits input lag in VSYNC OFF situations. And the tearlines are smaller as a bonus (less horizontal shift between the slices means tearing is less noticeable), assuming frame render times are consistent.

But a problem occurs when you get lots of microstutters (poor consistency between frames; varying frame render times): these will often cause more noticeable tearing, because even at say 300fps, if you get a microstutter that lasts 1/60sec, you’ve got a bigger tear offset between the previous frame and the next frame. If all frames are rendered an equal 1/300sec apart, then the horizontal tearing offsets are tiny, and hard to see. On some cards, microstutters may happen more often when you uncap the framerate (because of the need to pause everything to execute garbage collection, if everything’s been running flat-out at maximum speed). So you then want to add a framecap; but then you need to be mindful of harmonics between fps and Hz. Where possible, if microstutters aren’t bad and there aren’t any uncapping problems (e.g. CPU starvation; increased microstutter), just simply uncap your framerate.
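A quick back-of-the-envelope check of those slice ages (a sketch; it assumes fps is an exact multiple of Hz and perfectly even frame pacing):

```python
def slice_ages_ms(fps, hz):
    """Approximate age of each horizontal band within one refresh when
    fps is an exact multiple of hz: oldest frame at the top, freshest
    at the bottom. Simplified model; ignores scanout and render jitter."""
    slices = fps // hz            # frames spliced into each refresh
    frame_dt_ms = 1000.0 / fps    # time between frame flips
    return [frame_dt_ms * (slices - i) for i in range(slices)]

print(slice_ages_ms(180, 60))  # top third ~16.7 ms old, bottom ~5.6 ms
```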
If you do cap, then avoid capping at a multiple of your Hz when doing VSYNC OFF (unless you’re using Adaptive VSYNC)
A 1000 Hz mouse reduces tearing
If you love VSYNC OFF gaming and are sensitive to tearing, then you definitely WANT a 1000 Hz mouse running at a full 1000 reports per second (not 250 or 500), something far beyond your display Hz and GPU framerate. This avoids stutters caused by aliasing between the mouse report rate and the refresh rate / framerate. For example, with a cheap mouse (125 Hz) on a 120 Hz or 60 Hz display, you will get about five microstutters per second. This is the harmonic beat frequency, where the mouse gives you two bigger movement steps during one frame. This is noticeable during fast panning motion when you have software-based mouse smoothing turned off – and you don’t want software-based mouse smoothing, because that increases input lag.
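That beat-frequency claim checks out with simple arithmetic (a sketch; `mouse_aliasing` is my own illustrative helper, and it models frames and mouse reports as perfectly evenly spaced):

```python
def mouse_aliasing(mouse_hz, display_hz):
    """With evenly spaced reports and frames, most frames contain `base`
    mouse reports, but (mouse_hz - base*display_hz) frames per second
    get one extra report -- a visibly bigger movement step.
    Returns (stutters per second, relative size of the bigger step)."""
    base = mouse_hz // display_hz
    extra_frames_per_sec = mouse_hz - base * display_hz
    return extra_frames_per_sec, 1.0 / base

print(mouse_aliasing(125, 60))   # (5, 0.5): 5 stutters/sec, 50% bigger steps
print(mouse_aliasing(1000, 60))  # (40, 0.0625): jumps exist but are tiny
```

This also shows why a 1000 Hz mouse helps: the occasional extra report is only about 6% of a frame’s worth of movement, versus 50% for a 125 Hz mouse.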
How to minimize tearing while having minimum input lag & maximum fluidity
So, for powerful GPU users who are very sensitive to tearing even at fps > Hz, here is how to get the best fluidity, low input lag, and no tearing with VSYNC OFF:
- You want as insanely high an fps as you can get. Uncap if possible (unless uncapping is buggy). If uncapping is bad, then cap as high as you can.
- You don’t want harmonics. Avoid fps being a multiple of Hz (stationary tearline problem). Avoid mouse Hz close to display Hz (increases judder). Avoid mouse Hz close to fps_max (increases judder).
- You want consistent frame latency; consistent render times between frames.
- You don’t want microstutters; otherwise more visible tearing occurs, because there are bigger-offset tearlines during the moments of stutter.
This results in less visible tearing, best fluidity & low input lag simultaneously.
Do insane framerates really look better?
Yes, if frame render times are consistent. That is, if you prefer VSYNC OFF – then believe it or not, 500fps (with consistent frame rendertimes) at 60 Hz can look much better than ~60fps at 60 Hz. You get many MUCH-smaller splices of many different (fresher) frames in one refresh. The tearline offsets become very tiny as a result (the top part of the refresh being almost 16.7ms old, the bottom part being nearly 0ms old). This is a situation where a 1000 Hz mouse makes a quite noticeable fluidity difference, since the more accurate mouse position updates result in more consistent and smaller-offset splices during insane-high-framerate VSYNC-OFF gaming.
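The "much smaller splices" point follows directly from how far the image moves between frames (a sketch using a made-up pan speed purely for illustration):

```python
def tear_offset_px(pan_speed_px_per_s, fps):
    """Horizontal offset visible across a tearline during a steady pan:
    simply the distance the scene travels between consecutive frames."""
    return pan_speed_px_per_s / fps

# assuming a fast 3000 px/s pan (illustrative number, not from the article)
print(tear_offset_px(3000, 60))   # 50.0 px: an obvious tear
print(tear_offset_px(3000, 500))  # 6.0 px: barely visible
```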
I am someone who is sensitive to tearing even if fps > Hz (even at 300fps @ 120 Hz). This is what I’ve discovered that greatly reduces tearing, at least in games with consistent frame render times (e.g. Source engine games).
If you want the most perfect possible on-screen motion and aren’t concerned about a bit of input lag (e.g. solo playing), then play with VSYNC ON.
BlurBusters.com Blog – Eliminating Motion Blur on LCD’s
Terminallyhill last edited by
Toll last edited by
Also, I found out that DirectX doesn’t do real Triple Buffering; it effectively renders ahead.
For the sake of minimal input lag, you should use Double Buffering with v-sync off, with an FPS cap that sits between multiples of your monitor’s refresh rate to randomise the screen tearing; this reduces the impact of tearing. Furthermore, if you can’t guarantee FPS higher than the refresh rate, then you may be better off setting your FPS cap to just under your monitor’s refresh rate and using v-sync off with double buffering, to minimise wasted time waiting for frames.
zombojoe last edited by
my problem with vsync has always been the input lag