Vertical Sync
Apr 14 2012, 11:10 pm
By: Oh_Man  

Apr 14 2012, 11:10 pm Oh_Man Post #1

Find Me On Discord (Brood War UMS Community & Staredit Network)

Can someone please explain to me in noob-friendly terms why vertical sync is good? I mean, why do games always have it turned ON by default (the Diablo 3 beta, for example)?

What benefits does vertical sync give? What is 'screen tearing'? If I'm not suffering from screen tearing, are there any other benefits to having vertical sync on?
Does it really butcher FPS? What exactly is 'triple buffering', and how does it stop vertical sync from tanking the framerate?

Cheers.




Apr 14 2012, 11:17 pm Lanthanide Post #2



Vertical sync prevents screen tearing by locking the game's frame rate to your monitor's refresh rate. I'm not entirely sure it works the same way on LCDs as it does on CRTs, though, since the way the screen refreshes is obviously different.

Turning on vertical sync therefore caps your frame rate. But personally I don't believe that seeing 'torn' frames is of any benefit to the player. Probably only obsessive people at the very highest level of play would really be impacted by vertical sync, and everyone else will just be posers who think it makes a difference to their ability when it really doesn't.

Personally I always turn it on when it's available.
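A minimal sketch of what "locking to the monitor's refresh rate" means for a game loop, assuming a 60 Hz display and a hypothetical render() that takes about 4 ms; real vsync is handled by the driver and swap chain rather than by sleeping, so this is only an illustration of why the frame rate gets capped:

[code]
import time

REFRESH_HZ = 60                    # assumed monitor refresh rate
REFRESH_PERIOD = 1.0 / REFRESH_HZ  # ~16.7 ms between vertical blanks

def render():
    """Stand-in for the game's drawing work (hypothetical)."""
    time.sleep(0.004)              # pretend one frame takes 4 ms to draw

def game_loop_with_vsync(duration=1.0):
    """Present only at 'vertical blanks': the frame rate can never exceed 60."""
    frames = 0
    start = time.perf_counter()
    next_vblank = start
    while time.perf_counter() - start < duration:
        render()
        next_vblank += REFRESH_PERIOD
        wait = next_vblank - time.perf_counter()
        if wait > 0:
            time.sleep(wait)       # this wait is the "sync"; without it we'd loop ~250 times/s
        frames += 1
    return frames

print(game_loop_with_vsync(), "frames drawn in one second")  # ~60
[/code]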




Apr 14 2012, 11:19 pm Zycorax Post #3

Grand Moderator of the Games Forum

I read a pretty good article about v-sync a while ago, but I can't remember where. Think this should cover most of it though: http://www.tweakguides.com/Graphics_9.html




Apr 15 2012, 12:34 am Oh_Man Post #4

Find Me On Discord (Brood War UMS Community & Staredit Network)

Nice article, ty. I'm just going to have it default OFF, unless I see screen tearing.




Apr 15 2012, 11:08 pm O)FaRTy1billion[MM] Post #5

👻 👾 👽 💪

I usually turn it on because drawing excess frames between frames your monitor displays is entirely useless and a waste of processing time. :\
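As a rough illustration of that point, assuming a 60 Hz monitor and a hypothetical game rendering 150 fps with vsync off (with tearing, slices of the extra frames do reach the screen, but never as complete images):

[code]
REFRESH_HZ = 60     # assumed monitor refresh rate
RENDER_FPS = 150    # hypothetical uncapped frame rate with vsync off

shown = min(RENDER_FPS, REFRESH_HZ)   # at most one complete frame per refresh
wasted = RENDER_FPS - shown
print(f"{wasted} of {RENDER_FPS} rendered frames per second "
      f"({wasted / RENDER_FPS:.0%}) are never shown as a complete image")
[/code]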

Post has been edited 1 time(s), last time on Apr 15 2012, 11:15 pm by FaRTy1billion.



TinyMap2 - Latest in map compression! ( 7/09/14 - New build! )
EUD Action Enabler - Lightweight EUD/EPD support! (ChaosLauncher/MPQDraft support!)
EUDDB - topic - Help out by adding your EUDs! Or Submit reference files in the References tab!
MapSketch - New image->map generator!
EUDTrig - topic - Quickly and easily convert offsets to EUDs! (extended players supported)
SC2 Map Texture Mask Importer/Exporter - Edit texture placement in an image editor!

Apr 15 2012, 11:43 pm Lanthanide Post #6



Quote from O)FaRTy1billion[MM]
I usually turn it on because drawing excess frames between frames your monitor displays is entirely useless and a waste of processing time. :\
Well, it really doesn't matter if you're 'wasting' processing time, because the alternative is that the GPU sits there idle. Unless you actually had some other application that could use the GPU while the game wasn't, but I very much doubt that.

So really, in practical terms, it becomes a potential waste of electricity. In real terms this is going to be very, very tiny: the GPU is still going to be fully powered for that second; it'll simply be drawing 60 frames instead of maybe 150. It's not like it's going from drawing 150 frames to 1 frame, where there could be more of a power saving.




Apr 16 2012, 5:19 am Aristocrat Post #7



Quote from Lanthanide
So really, in practical terms, it becomes a potential waste of electricity. In real terms this is going to be very, very tiny: the GPU is still going to be fully powered for that second; it'll simply be drawing 60 frames instead of maybe 150. It's not like it's going from drawing 150 frames to 1 frame, where there could be more of a power saving.

Turning on VSync in Minecraft made the total power drain of my CPU/GPU go from 34W to 15W (For comparison, idle is 11W). This translates to significantly longer battery life if I play it with VSync on. For desktop GPUs, this could translate into a difference between 40W and 250W load consumption.

I would recommend you actually test these things out before claiming they make no difference.




Apr 16 2012, 7:48 am ShadowFlare Post #8



If you have a very fast GPU, turning on V-Sync for games that would run with FPS way over your monitor's refresh rate will make your GPU do less work, use less power, and generate less heat. Capping your frame rate in some way is always a good thing for such games. If you use a CRT monitor, you may potentially see a benefit for frame rates up to double your refresh rate, but you aren't likely to see this on an LCD.

With V-Sync off, tearing is most visible for 3d first-person games or games with fast full-screen scrolling or flashing. You will see a line where above you see one frame and below you see a different frame.




Apr 16 2012, 8:35 am Oh_Man Post #9

Find Me On Discord (Brood War UMS Community & Staredit Network)

Yes, yes; my question has been answered! All of you - DISPERSE!




Apr 16 2012, 9:22 am Lanthanide Post #10



Quote from Aristocrat
Turning on VSync in Minecraft made the total power drain of my CPU/GPU go from 34W to 15W (For comparison, idle is 11W). This translates to significantly longer battery life if I play it with VSync on. For desktop GPUs, this could translate into a difference between 40W and 250W load consumption.

I would recommend you actually test these things out before claiming they make no difference.
Wow, that is a big difference. Mine was more of an estimate of what kind of difference it would make, based on my reading around the subject. For example, the article that Zycorax linked to didn't make any mention of potential power savings. It probably depends on the card as well as the game, I'd imagine.




Apr 16 2012, 1:13 pm Aristocrat Post #11



Yeah, the gains are going to be smaller for a game you can't render above 60 FPS (StarCraft II goes from 53W to ~40-45W and Skyrim from 52W to 33-40W, with their framerates capped at ~30). But it'll always save some amount of power. 30 FPS actually doesn't feel any laggier than an irregular number like 38 FPS.

Enabling triple buffering makes the power savings go away, but the framerate increases and it feels smoother than 30 FPS (and smoother than the same framerate without VSync).

It is worth noting that you should not use unbuffered VSync if the game only renders slightly above 60 FPS on average and dips below 60 FPS when a lot of detail is onscreen. This happens with Super Street Fighter IV, which I usually get ~80 FPS in: perceived "lag spikes" occur when a particularly detailed ultra is used and VSync momentarily caps the framerate to 30, because the card can no longer reach 60 FPS.
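A sketch of why the cap can snap straight from 60 down to 30 with plain double-buffered vsync, using illustrative numbers only: a frame that takes even slightly longer than one ~16.7 ms refresh interval has to wait for the vblank after the one it just missed.

[code]
import math

REFRESH_HZ = 60
PERIOD_MS = 1000.0 / REFRESH_HZ   # ~16.7 ms between vertical blanks

def effective_fps(render_ms):
    """Double-buffered vsync: a finished frame is held until the next vblank."""
    vblanks_per_frame = math.ceil(render_ms / PERIOD_MS)
    return REFRESH_HZ / vblanks_per_frame

print(effective_fps(12.5))  # fast enough for ~80 fps uncapped -> 60.0
print(effective_fps(18.0))  # just misses one vblank           -> 30.0 (the momentary cap)
[/code]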

EDIT> I don't have a desktop card that draws significant amounts of power to test with, so could someone who does check what the actual differences are? My earlier estimate was based on Anand's data for a Radeon 7970's load/idle power consumption.

Post has been edited 1 time(s), last time on Apr 16 2012, 1:32 pm by Aristocrat.




Apr 17 2012, 3:43 pm rockz Post #12

ᴄʜᴇᴇsᴇ ɪᴛ!

Reducing the total number of frames dramatically increases input lag. If you need to be accurate down to the millisecond, you'll want to turn it off.

I wish there were an easier way to limit fps to like 200 or something in all games.
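For a game whose loop you control, a sleep-based limiter is easy to sketch. This is a toy example: the 200 fps target and the do_frame() placeholder are made up, and for games you can't modify, a driver-level limiter or vsync is the practical route.

[code]
import time

TARGET_FPS = 200                  # hypothetical cap, per the post above
FRAME_BUDGET = 1.0 / TARGET_FPS   # 5 ms per frame

def do_frame():
    """Placeholder for one tick of game logic + rendering."""
    pass

def limited_loop(duration=1.0):
    frames = 0
    deadline = time.perf_counter()
    end = deadline + duration
    while time.perf_counter() < end:
        do_frame()
        deadline += FRAME_BUDGET
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)  # real limiters often busy-wait the last bit for precision
        frames += 1
    return frames

print(limited_loop(), "frames in one second")  # ~200
[/code]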



"Parliamentary inquiry, Mr. Chairman - do we have to call the Gentleman a gentleman if he's not one?"

Apr 17 2012, 4:12 pm Aristocrat Post #13



Given that most consumers don't seem to mind the ~200ms input lag of touchscreens, I think they can live with the theoretical maximum of 16ms input lag from a 60FPS VSync'd game. In online play, the latency is most likely significantly greater than 16ms as well. Only in offline play does this become worthy of concern.
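For reference, that theoretical maximum is just one refresh interval at 60 Hz, i.e. the longest a finished frame can sit waiting for the next vertical blank:

[code]
REFRESH_HZ = 60
vsync_worst_ms = 1000.0 / REFRESH_HZ   # longest a finished frame can wait for the next vblank
print(f"worst case added by vsync at {REFRESH_HZ} Hz: {vsync_worst_ms:.1f} ms")       # ~16.7 ms
print(f"~200 ms touchscreen latency is {200 / vsync_worst_ms:.0f}x that worst case")  # ~12x
[/code]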




Apr 17 2012, 8:22 pm Lanthanide Post #14



Quote from rockz
Reducing the total number of frames dramatically increases input lag. If you need to be accurate down to the millisecond, you'll want to turn it off.

I wish there were an easier way to limit fps to like 200 or something in all games.
I'm not really sure how you get 'input lag' from only showing the frames at the rate of your monitor. It seems to me that the 'lack' of input lag would result in torn screens, e.g. the top half of the screen doesn't match the bottom half because the top half has responded to your action while the bottom half hasn't. But within 1/60th of a second you would have seen the full image anyway...




Apr 18 2012, 4:31 am Aristocrat Post #15



Quote from Lanthanide
I'm not really sure how you get 'input lag' from only showing the frames at the rate of your monitor.

At precisely 60 FPS you are rendering each frame in ~16 milliseconds. This means that every time you see a frame, it's built from game data ~16 milliseconds old: at t=17ms you are seeing the game as it was at t=0ms, at t=33ms you are seeing it as it was at t=17ms, etc.

At precisely 200 FPS, you render a frame in 5 milliseconds. Assuming every part of the screen takes equal time to render, at t=17ms the top 2/5 of the screen will show the game as it was at t=15ms, while the bottom 3/5 will show the game as it was at t=10ms. The input lag in this case is only 7 milliseconds (or 2ms, if the input is reflected in the top 2/5 of the screen). However, you get screen tearing.

This is also the reason why CrossFire/SLI in AFR mode has more input lag than a single GPU. If you get exactly 60 FPS with four cards in SLI, assuming no microstutter, each card renders one frame in ~67 milliseconds. That means what's onscreen is going to be what the game was like ~67 milliseconds ago. At that point the input delay is noticeable for dedicated gamers.
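A small worked version of those figures, under the same idealised assumptions as the post (constant render times, game state sampled at the start of each frame, a 60 Hz refresh; newest_state_age() is just a helper for this sketch):

[code]
import math

REFRESH_MS = 1000.0 / 60   # one refresh interval, ~16.7 ms

def newest_state_age(render_ms, refresh_at_ms=REFRESH_MS):
    """Age of the game state in the newest finished frame at a refresh instant
    (data assumed to be sampled when the frame starts rendering)."""
    finished = math.floor(refresh_at_ms / render_ms) * render_ms
    state_time = finished - render_ms
    return refresh_at_ms - state_time

print(round(newest_state_age(1000 / 60), 1))  # 60 fps render  -> 16.7 ms behind
print(round(newest_state_age(5.0), 1))        # 200 fps render -> 6.7 ms behind (the ~7 ms above);
                                              # regions torn in later come from even newer frames
print(round(4 * REFRESH_MS, 1))               # 4-way AFR at 60 fps total: ~66.7 ms per frame in flight
[/code]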




Apr 18 2012, 5:15 am Lanthanide Post #16



But the human eye can't see 200 frames per second anyway. This site says the median reaction time is 215 milliseconds: http://www.humanbenchmark.com/tests/reactiontime/index.php

So I don't see any benefit to having very high frame rates: you can't see them, and even if you could, you're unlikely to be able to react to them unless you're at the very top of the competitive ladder.

Here's the wiki page on reaction time: http://en.wikipedia.org/wiki/Reaction_time




Apr 18 2012, 5:23 am Aristocrat Post #17



Quote from Lanthanide
But the human eye can't see 200 frames per second anyway. This site says the median reaction time is 215 milliseconds: http://www.humanbenchmark.com/tests/reactiontime/index.php

So I don't see any benefit to having very high frame rates: you can't see them, and even if you could, you're unlikely to be able to react to them unless you're at the very top of the competitive ladder.
Read and think over my post again; the picture displayed by the monitor for a game rendered at 200FPS is still more "recent" than the image that is displayed if the game is rendered at 60FPS, even if the monitor is at 60Hz for both scenarios. Hence, the additional input lag is perceived.

I play fighting games casually, and I can already see and count frames. My mean reaction time is ~350ms (approx. 22 frames) according to that site, but I am still significantly hampered by network latencies greater than 120ms (which translates to an additional input delay of approximately 8 frames). Input delay does make a huge difference even if it's only a fraction of your "reaction time", because it screws up your timings for directional inputs, move buffers, Roman cancels, etc.

The same applies to StarCraft in the form of precise unit micromanagement. Imagine how difficult it would be to micro mutalisks if your clicks showed up onscreen a few frames after you actually clicked because you used SLI to get the game running on Ultra at 60 FPS; if you have >60 APM, every time you click, the mutalisks will still be moving to the location you indicated multiple clicks ago. That is a huge discrepancy that will mess with your brain and inevitably cause you to screw up.
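The millisecond-to-frame conversions in that paragraph are just the times divided by the ~16.7 ms frame interval of a 60 fps game; a quick check of the figures quoted:

[code]
FRAME_MS = 1000.0 / 60           # one frame of a 60 fps game

print(round(350 / FRAME_MS))     # ~350 ms reaction time  -> 21 frames (roughly the 22 quoted)
print(round(120 / FRAME_MS))     # 120 ms network latency -> ~7 frames (roughly the 8 quoted)
print(round(4 * FRAME_MS))       # the 4-way AFR example  -> ~67 ms, i.e. about 4 frames of delay
[/code]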




Apr 18 2012, 5:40 am Lanthanide Post #18



Quote
Read and think over my post again; the picture displayed by the monitor for a game rendered at 200FPS is still more "recent" than the image that is displayed if the game is rendered at 60FPS, even if the monitor is at 60Hz for both scenarios.
Read and think over my post. It doesn't matter what the monitor is showing 'right now' because your brain is still processing what it was showing 1/10th of a second ago.

This article investigates input lag in depth, and the difference between vsync on and off in TF2: http://www.anandtech.com/show/2803/6

Quote
The number of complaints we've seen on the net about Fallout 3 and input lag (though notably a subjective improvement over Oblivion) combined with our own experience lead us to believe that somewhere between TF2's snappy 89ms worst case input latency (which we still couldn't feel) and Fallout 3's ~50% longer latency we cross the point of perceptibility and distraction. That 100ms number certainly looks like a good point for developers to keep input latency below.

Emphasis mine.

My point is that unless the rest of your system is slow, you're unlikely to notice the input lag induced by enabling vsync. Check the next page of the article, which breaks down the various sources of input lag; vsync doesn't make up much of it in the average setup.




Apr 18 2012, 11:06 am NudeRaider Post #19

We can't explain the universe, just describe it; and we don't know whether our theories are true, we just know they're not wrong. >Harald Lesch

Well I *think* "input lag" is a technical problem of some sort and has nothing to do with what Aristo explained.
That doesn't make what Aristo said invalid though. It's probably best explained with an example:

At time t=0ms all game information is gathered and the computer starts rendering the image. For the sake of simplicity, let's say the first monitor refresh is also at t=0ms.
Since there's no new render yet at t=0, the monitor will show outdated data from a frame ago.
At t=5ms the computer is done rendering the first 200fps frame, but since the monitor is still drawing the old frame, the new frame waits in the frame buffer unused.
At t=10ms the 2nd 200fps frame is put into the frame buffer, still unused. (The 60fps frame is still in the making.)
At t=15ms the 3rd 200fps frame goes into the buffer, waiting.
At t=16ms the first 60fps frame goes into the buffer and the monitor starts drawing from the buffer in both scenarios.
That means at that moment you get to see game information from t=0ms (16ms before the current refresh, in the 60fps case) or from t=15ms (1ms before the current refresh, in the 200fps case). Then you add a reaction time of 200ms, which means with 200fps you get to react (in this example) after 201ms, while with 60fps it's after 216ms.

Disclaimer: Haven't thought this through to the end, but I gotta go now. Correct me wherever necessary.
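A quick sketch of that timeline in code, under the same simplifying assumptions (a refresh every ~16.7 ms and renders finishing exactly on schedule; buffer_fill_times() is just a helper for this example):

[code]
REFRESH_MS = 1000.0 / 60   # ~16.7 ms between monitor refreshes

def buffer_fill_times(render_ms, until_ms):
    """Times at which newly finished frames land in the frame buffer."""
    t, events = render_ms, []
    while t <= until_ms:
        events.append(round(t))
        t += render_ms
    return events

until = REFRESH_MS         # up to the second refresh, t ~= 17 ms
print("200 fps renderer fills the buffer at", buffer_fill_times(5.0, until), "ms")         # [5, 10, 15]
print(" 60 fps renderer fills the buffer at", buffer_fill_times(REFRESH_MS, until), "ms")  # [17]
# At the t~=17 ms refresh, the 200 fps machine has a frame finished at t=15 ms to show;
# the 60 fps machine's only finished frame is the one it began building at t=0.
[/code]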




Apr 18 2012, 11:40 pm rockz Post #20

ᴄʜᴇᴇsᴇ ɪᴛ!

Almost everyone uses triple buffering. That means you store 3 frames ahead of time: if you're at 60 fps, that's 50 ms; if you're at 200 fps, that's 15 ms.
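Spelling out the arithmetic in that "three frames stored ahead" model:

[code]
QUEUE_DEPTH = 3   # frames stored ahead, per the post

for fps in (60, 200):
    frame_ms = 1000.0 / fps
    print(f"{fps:>3} fps: {QUEUE_DEPTH} queued frames = {QUEUE_DEPTH * frame_ms:.0f} ms of delay")
# 60 fps  -> 50 ms
# 200 fps -> 15 ms
[/code]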

If I'm wrong (which I may be), just know this: people can tell the difference. All of the explanations of "but this is inconceivable" don't really do anything when I can genuinely tell the difference between 800 fps input lag and 60 fps input lag in most games. To me it's a fact, and there are thousands of others who agree with me on the matter. I would say millions, but it's likely they don't care, or aren't using their computers for time-sensitive tasks in an urgent fashion.

It's the same reason why mouse acceleration is utterly horrid for gaming.

Finally, this whole thing about "human reaction time" is moot. There may be a reaction time of 200 ms, but it's not like the human brain is always 200 ms behind. What if you start to act 200 ms early? The net reaction time would be 0 ms, right? When it comes to games, there are many things that stay predictable, which makes it easier to guess where they will be in the future - for example, a virtual person running in a straight line at a constant speed, or even with constant acceleration.

Post has been edited 1 time(s), last time on Apr 18 2012, 11:48 pm by rockz.



"Parliamentary inquiry, Mr. Chairman - do we have to call the Gentleman a gentleman if he's not one?"
