G-Sync and FreeSync are synchronization technologies designed to eliminate screen tearing and stuttering during video playback and gaming. G-Sync is developed by Nvidia and works exclusively with Nvidia graphics cards. FreeSync is AMD’s alternative, made to work with AMD GPUs. Both technologies ensure smoother visuals, especially during fast-paced gameplay.

Why Synchronization Between GPU and Monitor Matters

When you play games or videos, your graphics card generates frames and sends them to the monitor. Monitors traditionally operate at a fixed refresh rate, for example 60 Hz, 120 Hz, or 144 Hz. Without VRR (variable refresh rate), the monitor refreshes the screen at strictly fixed intervals. You may remember that in the past, when you changed the refresh rate, the monitor would pop up a window asking you to apply the settings, and if you didn't click OK within 10 seconds, the refresh rate you had set was rolled back. Starting around 2013, HDMI and DisplayPort began to support feedback from the monitor to the graphics card, so the computer could learn not just which fixed refresh rates the monitor supports, but eventually the whole range over which it can vary them.
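
To put those numbers in perspective, here is a tiny sketch (plain arithmetic, not tied to any particular driver or API) of how much time the GPU gets per frame at common fixed refresh rates:

```python
# A fixed refresh rate gives the GPU a fixed time budget per frame:
# interval (ms) = 1000 / refresh rate (Hz).
for refresh_hz in (60, 120, 144):
    frame_budget_ms = 1000 / refresh_hz
    print(f"{refresh_hz:3d} Hz -> the monitor expects a new frame every {frame_budget_ms:.2f} ms")
```

If the GPU misses that window, the problems described below begin.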

However, in games, some frames take longer to generate than others. If a new frame becomes ready partway through a screen refresh, the monitor ends up showing parts of two different frames at once: the rows that have already been scanned out still belong to the previous frame, while the rows below come from the new one. The visible seam between them is the effect known as tearing, which is particularly noticeable and annoying during sudden movements, such as in first-person shooters (FPS).
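
As a rough illustration of where the tear line comes from, here is a simplified model; the resolution, refresh rate, and frame timing below are assumptions chosen for the example, not measurements:

```python
# Simplified model of tearing on a fixed-refresh monitor.
# The panel scans the image out top to bottom over one refresh interval.
# If the GPU swaps in a new frame partway through that scanout, everything
# above the current scan position is still the old frame and everything
# below it comes from the new frame -- that boundary is the visible tear.

REFRESH_HZ = 60
SCREEN_ROWS = 1080
REFRESH_INTERVAL_MS = 1000 / REFRESH_HZ          # ~16.67 ms per full scanout

def tear_row(new_frame_arrives_ms: float):
    """Row at which the tear appears if the new frame lands mid-scanout, else None."""
    if not 0 < new_frame_arrives_ms < REFRESH_INTERVAL_MS:
        return None                              # frame arrived outside this scanout
    progress = new_frame_arrives_ms / REFRESH_INTERVAL_MS
    return int(progress * SCREEN_ROWS)

# Example: the new frame is delivered 10 ms into a 16.67 ms scanout.
print(f"Tear line around row {tear_row(10.0)} of {SCREEN_ROWS}")   # roughly row 648
```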

G-Sync and FreeSync solve this problem with VRR technology, dynamically synchronizing the monitor's refresh rate to the actual frame rate produced by the GPU, so that only completed frames are displayed and tearing disappears. The very first versions weren't true VRR: the monitor still ran at a fixed rate, and frames that weren't ready in time were simply skipped or duplicated. Once HDMI and DisplayPort gained full VRR support, newer versions of G-Sync and FreeSync allowed the refresh rate to change dynamically, with the monitor genuinely following the frequency of the graphics card.
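
The difference is easiest to see in a small timing simulation. The sketch below (with invented per-frame render times) compares when each finished frame actually reaches the screen on a fixed 60 Hz monitor versus an idealized VRR display:

```python
# When does a finished frame actually appear on screen?
# Fixed refresh: the frame waits for the next 60 Hz tick, which causes stutter
# (and, without V-Sync, tearing). VRR: the monitor starts a refresh as soon as
# the frame is ready, within its supported range (ignored here for simplicity).
# The per-frame render times are invented purely for illustration.

import math

FIXED_INTERVAL_MS = 1000 / 60                       # ~16.67 ms between 60 Hz refreshes
render_times_ms = [14.0, 19.0, 22.0, 15.0, 30.0]    # how long the GPU took on each frame

finished = 0.0
print("frame  finished at  shown (fixed 60 Hz)  shown (VRR)")
for i, rt in enumerate(render_times_ms, start=1):
    finished += rt
    shown_fixed = math.ceil(finished / FIXED_INTERVAL_MS) * FIXED_INTERVAL_MS
    shown_vrr = finished
    print(f"{i:5d}  {finished:9.1f}ms  {shown_fixed:17.1f}ms  {shown_vrr:9.1f}ms")
```

With VRR the gap between "frame finished" and "frame shown" disappears, which is exactly the smoothness gamers notice.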

G-Sync and FreeSync: history and how they work

If you’ve ever played PC games with high-quality graphics, chances are you’ve encountered screen tearing or input lag. These issues usually happen when the refresh rate of your monitor doesn’t match the number of frames your graphics card is producing. To solve this problem, GPU manufacturers developed adaptive sync technologies. The two most popular solutions are NVIDIA’s G-Sync and AMD’s FreeSync.

G-Sync, introduced by NVIDIA in 2013, was a groundbreaking innovation. Unlike older approaches such as V-Sync, where the graphics card adjusts to the monitor’s refresh rate, G-Sync does the opposite: it allows the monitor to dynamically adjust its refresh rate based on the GPU’s frame output. This results in remarkably smooth visuals, no screen tearing, and minimal input lag. G-Sync works across a wide range of refresh rates. Initially it supported 30 to 144 Hz, and now it goes up to 360 Hz or even higher, depending on the monitor. When the frame rate drops below the monitor’s minimum supported refresh rate, G-Sync can even duplicate frames to keep the experience smooth.
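
The frame-duplication idea can be sketched roughly like this; the 30-144 Hz panel range and the logic are illustrative assumptions, since the real behaviour is implemented inside the G-Sync module and the GPU driver:

```python
# Rough sketch of frame duplication when the game runs slower than the panel's
# minimum refresh rate. The 30-144 Hz range is an assumed example; the real
# logic lives inside the G-Sync module / GPU driver.

MIN_REFRESH_HZ = 30
MAX_HOLD_MS = 1000 / MIN_REFRESH_HZ          # longest one refresh can last (~33.3 ms)

def repeats_for(frame_time_ms: float) -> int:
    """Show each frame enough times that every refresh stays within the panel's range."""
    repeats = 1
    while frame_time_ms / repeats > MAX_HOLD_MS:
        repeats += 1
    return repeats

for fps in (120, 60, 40, 24, 15):
    frame_time = 1000 / fps
    n = repeats_for(frame_time)
    panel_hz = 1000 / (frame_time / n)
    print(f"{fps:3d} fps -> each frame shown {n}x, panel refreshes at ~{panel_hz:.0f} Hz")
```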

However, G-Sync isn’t without drawbacks. It’s a proprietary technology, meaning it only works with NVIDIA graphics cards. That might not seem like a big issue, but there’s more. G-Sync requires a special hardware module to be built directly into the monitor — a proprietary chip made by NVIDIA. These chips aren’t cheap, which means G-Sync monitors are often $50–100 more expensive than similar models without it. As a result, while G-Sync was revolutionary in 2013, it was also expensive and limited in availability.

In response, the Video Electronics Standards Association (VESA) introduced an open technology called Adaptive Sync in 2014. It aimed to achieve the same results as G-Sync — adaptive refresh rates and smooth visuals — but without the need for costly hardware. Adaptive Sync works between 9 and 240 Hz and is part of the DisplayPort standard, starting from version 1.2a. However, to make it functional, it required firmware support in monitors, compatible GPU drivers, OS-level support, and game-level integration. This integration work was championed by AMD.

In 2015, AMD released its own implementation of Adaptive Sync and called it FreeSync. It quickly gained traction because it was cheap and easy to integrate, so much so that today it’s actually hard to find a gaming monitor without FreeSync support. AMD didn’t stop there. They expanded support to HDMI connections starting with version 1.4 and, in 2017, introduced FreeSync 2, which added HDR support and low framerate compensation, features similar to what G-Sync offered. Later, FreeSync 2 was renamed FreeSync Premium Pro, and standard FreeSync with refresh rates of 120 Hz or higher was rebranded as FreeSync Premium. While these naming strategies might seem a bit confusing, AMD deserves credit for helping to popularize adaptive sync as an industry standard.

NVIDIA also continued to evolve its technology, adding HDR support in 2017 under what is now branded G-Sync Ultimate. So now both companies offer high-end, feature-rich solutions. But for a long time, users were stuck in a tough spot. If you had an NVIDIA GPU but only a FreeSync monitor, you were out of luck: you had to either buy a pricey G-Sync monitor or switch to an AMD GPU just to take advantage of adaptive sync.

Thankfully, that has changed. NVIDIA now supports what they call “G-Sync Compatible” monitors — essentially FreeSync monitors that have been tested and certified to work well with NVIDIA cards. This has opened up many more affordable options for gamers without sacrificing compatibility.

In the end, G-Sync and FreeSync are different approaches to solving the same problem. G-Sync delivers a premium experience with a higher price tag, while FreeSync focuses on affordability and wide adoption. Regardless of which one you choose, both technologies have greatly improved the gaming experience by eliminating tearing, reducing input lag, and making visuals feel much smoother.

VRR: Adaptive synchronization on TVs and consoles

If your TV is equipped with an HDMI 2.1 port, it can support adaptive sync as well, since HDMI 2.1 is the version that introduced VRR, or Variable Refresh Rate, to the HDMI standard.

This technology works with both NVIDIA and AMD Radeon graphics cards, because HDMI VRR is essentially the same idea as VESA Adaptive Sync, just implemented within HDMI. It was VRR that made adaptive synchronization available on the newer generations of consoles. What’s more, Microsoft implemented it on the Xbox One S and One X before HDMI 2.1 even existed, adding VRR support ahead of the official standard.

So today it’s safe to say that adaptive synchronization has become a de facto industry standard. It is supported not only by AMD and NVIDIA graphics cards, but also by modern TVs, game consoles, and even integrated graphics from Intel, starting from the 11th generation of processors.
