When you’re building your PC gaming station, you can think of it like fine-tuning a sports car: for peak performance, you need all the mechanics working together in harmony.
And just like a car, if your PC is lacking in one area, it can hinder the machine’s overall performance. On that note, one of the most important relationships in your PC is the one between your GPU and your gaming monitor.
If these two aren’t working in sync, you could wind up with several infuriating issues: screen-tearing, stuttering, and input lag, to name a few.
Fortunately for us, technology has been developed to help with this syncing problem, namely with V-Sync, AMD’s FreeSync, and NVIDIA’s G-Sync.
You’ve likely heard of these before, but may still find yourself wondering: “What are V-Sync, FreeSync, and G-Sync, and how are they different?”
Today, we’re going to discuss just that.
But before we delve into the nuances of each, let’s first explore the causes of screen-tearing and stuttering so we can better understand what these technologies do to prevent them.
Screen-tearing occurs when your GPU outputs frames faster than your monitor’s slower refresh rate can display them. The result is an overload of frames for your monitor to update, which leads to two or more frames overlapping and produces a jarring, torn-in-half look on your screen.
Here’s an example:
Let’s say your gaming monitor has a refresh rate of 60Hz (Learn more about refresh rates here) but your GPU begins to render frames at a faster rate—during less graphically demanding scenes, for example. Because the GPU’s frame rate exceeds your monitor’s refresh rate, your monitor will still be working to show one image while the GPU is already pushing out the next. Since your monitor isn’t able to refresh the images quickly enough, two or more images begin to overlap.
Image stuttering occurs when these roles are reversed: instead of the monitor failing to keep up, the GPU renders images at a slower rate than your monitor displays them, so the monitor repeats old frames. This results in choppy images.
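To make the rate mismatch concrete, here’s a rough sketch in Python (the numbers are illustrative; no real driver works this way) that counts, over one second, how many refresh cycles receive too many frames (tear risk) or no new frame at all (a repeated, stuttery frame):

```python
def sync_artifacts(refresh_hz: float, fps: float, seconds: float = 1.0):
    """Count refresh cycles that receive two or more frames (tear risk)
    or no new frame at all (a repeated, stuttery frame)."""
    frame_interval = 1.0 / fps            # time between finished GPU frames
    refresh_interval = 1.0 / refresh_hz   # time between monitor refreshes
    torn = repeated = 0
    frames_seen = 0
    for i in range(int(seconds * refresh_hz)):
        window_end = (i + 1) * refresh_interval
        # total frames the GPU has completed by the end of this refresh
        total = int(window_end / frame_interval + 1e-9)
        new_frames = total - frames_seen
        frames_seen = total
        if new_frames > 1:
            torn += 1       # two or more frames landed in one refresh
        elif new_frames == 0:
            repeated += 1   # nothing new arrived: the old frame shows again
    return torn, repeated

# A 90 fps GPU on a 60 Hz monitor risks tearing on some refreshes;
# a 30 fps GPU on the same monitor repeats frames instead.
print(sync_artifacts(60, 90))
print(sync_artifacts(60, 30))
```

A matched pair—60 fps on a 60Hz panel—produces neither artifact, which is exactly the state all three sync technologies below are chasing.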
While these artifacts won’t directly affect your game performance, they can be SUPER aggravating. So to fix these issues, you need synchronized communication between your monitor and GPU.
One way to achieve better communication is to upgrade to higher-end hardware…but we aren’t all made of money, and that’s where these different sync technologies help us out.
V-Sync, short for vertical synchronization, is the most basic form of sync technology. Although it acts on the same basic principle as the likes of AMD’s FreeSync and NVIDIA’s G-Sync, it’s actually pretty different.
First of all, V-Sync is a setting found in most games. When it’s turned on, V-Sync limits the number of frames the GPU renders to match your monitor’s maximum refresh rate. Even if your GPU is able to output more frames than your monitor can handle, it won’t. Therefore, V-Sync gets rid of any screen-tearing and stuttering issues.
BUT, and this is a huge “but,” your GPU will still render frames and queue them up before they reach your monitor. These ‘extra’ frames sitting in the GPU’s buffer can create some pretty serious input lag.
Reminder: input lag is when you give a device a command—type a message on your phone, open a folder on your computer, click ‘shoot’ in a video game—and the device doesn’t respond to the command immediately. As you can imagine, this can greatly affect game performance and is especially detrimental for competitive gamers who rely on quick reactions.
Because V-Sync can cause input lag, most gamers opt to turn it off and suffer through unpleasant tearing and stuttering instead.
Unless you are playing single-player or the stakes are low, I suggest you do the same.
Luckily, V-Sync isn’t our only option when it comes to synchronizing our GPU and monitor. AMD developed FreeSync and NVIDIA developed G-Sync.
How do these differ from V-Sync and what makes them better?
AMD's FreeSync and NVIDIA's G-Sync are both sync technologies developed to reduce screen-tearing and stuttering without introducing input lag, for smoother overall game performance. They're implemented differently from V-Sync, but they have the same goal in mind: getting your monitor to communicate with your GPU more efficiently.
So, how are FreeSync and G-Sync different from V-Sync?
It's pretty simple, really. While V-Sync slows down your GPU's output so your monitor can keep up, AMD and NVIDIA have instead developed their technologies to adapt the refresh rate of your monitor to the speed of your GPU’s frame rate. This helps reduce screen-tearing and stuttering while also avoiding input lag caused by using V-Sync.
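In other words, instead of clamping the GPU, the monitor’s refresh rate follows the frame rate—at least within the panel’s variable-refresh range. Here’s a rough sketch of the idea (the 48–144Hz window and the low-framerate fallback are illustrative assumptions, not any specific product’s spec):

```python
def adaptive_refresh_hz(fps: float, vrr_min: float = 48,
                        vrr_max: float = 144) -> float:
    """Refresh rate a variable-refresh monitor would run at for a given
    frame rate. vrr_min/vrr_max describe an assumed panel range."""
    if fps > vrr_max:
        return vrr_max             # the panel can't refresh any faster
    if fps < vrr_min:
        # Low-framerate-compensation style fallback: display each frame
        # several times so the panel stays inside its working range.
        multiple = 2
        while fps * multiple < vrr_min:
            multiple += 1
        return fps * multiple
    return fps                     # in range: one refresh per frame

# Inside the range, refresh tracks the GPU exactly: 90 fps -> 90 Hz.
```

Because each refresh shows exactly one complete frame, there’s nothing to tear and no queue of rendered frames waiting in the GPU to add input lag.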
How are FreeSync and G-Sync different from one another?
First and foremost, the major difference between the two is cost, with FreeSync being the budget option.
AMD’s FreeSync is cheaper because it allows manufacturers to incorporate FreeSync technology in their monitors without paying any licensing fees, making it much easier to find a FreeSync monitor and allowing more variability in the level of specs.
Nowadays, nearly every monitor on the market, gaming or not, supports FreeSync (as long as it isn’t a G-Sync monitor, of course). AMD made it even more accessible by building on VESA’s DisplayPort Adaptive-Sync standard.
NVIDIA’s G-Sync, on the other hand, strictly enforces its requirements for manufacturing a G-Sync monitor. Not only must the monitor pass a series of undisclosed tests, but NVIDIA also requires manufacturers to install a proprietary G-Sync module in the monitor (instead of relying on the DisplayPort Adaptive-Sync standard). Because NVIDIA requires more rigorous testing and dedicated hardware, G-Sync monitors typically cost a couple hundred dollars more. But you will know for a fact that you’re getting a high-quality product.
To use these technologies, you also need the matching graphics card: a G-Sync monitor requires an NVIDIA graphics card, and a FreeSync monitor requires an AMD one.
However, in January 2019, at CES in Las Vegas, NVIDIA announced a new development for its graphics cards: GeForce GTX 10 series and RTX 20 series GPUs can now be used with certain FreeSync monitors that have passed a series of rigorous tests.
Hundreds of monitors were tested, and only a handful were selected as “G-Sync Compatible,” but NVIDIA also allows users to manually turn on G-Sync in FreeSync monitors, even if they didn’t pass NVIDIA’s testing.
Manually turning on G-Sync in monitors that didn’t pass might mean the result won’t be up to par with monitors that did, but the artifacts could be minor and you may still reap some benefits. Learn more about NVIDIA’s G-Sync Compatibility in this article.
Now, let’s take a look at a few monitors in ViewSonic’s line of premier gaming monitors.
Every monitor in ViewSonic’s XG lineup of gaming monitors—unless it’s a G-Sync monitor—supports AMD’s FreeSync technology, so you can’t go wrong. We’re going to highlight two of the newest monitors launched in 2019 below.
This 24” monitor is built especially for the competitive gamer who needs a monitor that reacts as fast as they do.
A 35” ultra-wide curved gaming monitor with QHD resolution is the perfect fit for those who want to game, and then some.
You already have the powerful graphics card from NVIDIA, now all you need is a monitor that can handle that kind of output. Here are a few that can get the job done:
If you want to learn more about ViewSonic ELITE gaming monitors, go to https://www.viewsonic.com/gaming/