Is G-Sync worth it in 2019? An in-depth review.
Virtually every monitor now marketed as a gaming monitor supports some form of adaptive refresh rate technology. The two technologies currently on the market are G-Sync from Nvidia and FreeSync from AMD.
What is refresh rate technology?
All monitors, including TVs, use what is called a refresh rate to display an image on screen. The vast majority of monitors have a refresh rate of around 60Hz, which means the image refreshes 60 times per second. Higher-end monitors can deliver a higher refresh rate, sometimes 120Hz or more. A refresh rate lower than 60Hz will probably give you a headache after extended use.
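The relationship between refresh rate and how long each image stays on screen is simple arithmetic. Here is a small illustrative helper (the function name is ours, not part of any monitor API):

```python
# Refresh rate (Hz) vs. the time between screen refreshes (ms).
# Purely illustrative arithmetic for the figures quoted above.

def frame_time_ms(refresh_rate_hz: float) -> float:
    """Time between successive screen refreshes, in milliseconds."""
    return 1000.0 / refresh_rate_hz

for hz in (60, 120, 144):
    print(f"{hz} Hz -> a new image every {frame_time_ms(hz):.2f} ms")
# 60 Hz -> a new image every 16.67 ms
# 120 Hz -> a new image every 8.33 ms
# 144 Hz -> a new image every 6.94 ms
```

So doubling the refresh rate from 60Hz to 120Hz halves the time each frame is held on screen, which is why fast motion looks smoother.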
Why would I want a higher refresh rate? In gaming, a higher refresh rate provides better image response, so image transitions in fast-moving scenarios are much clearer. A display with a higher refresh rate will show a crisper image of a fast-moving object than a display with a lower refresh rate. However, other factors also affect how crisp the image is, such as response time and panel type.
For the monitor to run at its maximum refresh rate, the graphics card driving it must deliver enough frames per second to meet or exceed that refresh rate (120 FPS for 120Hz). The more demanding a game is in terms of graphics quality, or the higher the resolution of the monitor, the harder it is for a graphics card to deliver enough frames per second. Screen tearing can occur whenever frame delivery is not synchronised with the monitor's refresh cycle; it is most noticeable when the graphics card achieves more frames per second than the refresh rate of the screen (say 190 FPS on a 120Hz panel).
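The tearing condition can be illustrated with a toy model (this is our own simplified simulation, not driver code): the monitor scans out at fixed intervals, and if the GPU swaps the frame buffer partway through a scan-out, the top and bottom of the screen show different frames.

```python
# Toy model of screen tearing: a refresh "tears" if a GPU buffer swap
# lands strictly inside that refresh's scan-out window.

def count_torn_refreshes(fps: int, refresh_hz: int, seconds: int = 1) -> int:
    """Count refreshes in which an unsynchronised buffer swap occurs."""
    # Work in integer nanoseconds to avoid floating-point edge cases.
    frame_ns = 1_000_000_000 // fps        # GPU finishes a frame this often
    refresh_ns = 1_000_000_000 // refresh_hz  # monitor scans out this often
    torn = 0
    for i in range(seconds * refresh_hz):
        start = i * refresh_ns
        end = start + refresh_ns
        # First buffer swap after this scan-out begins.
        first_swap = (start // frame_ns + 1) * frame_ns
        if start < first_swap < end:
            torn += 1
    return torn

print(count_torn_refreshes(fps=190, refresh_hz=120))  # 120 (every refresh tears)
print(count_torn_refreshes(fps=120, refresh_hz=120))  # 0 (swaps align with refreshes)
```

At 190 FPS on a 120Hz panel every scan-out window contains a mid-scan swap, whereas at a matched 120 FPS the swaps land exactly on refresh boundaries and no tearing occurs in this model.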
To eliminate screen tearing, you can either enable V-Sync within the game settings or use G-Sync or FreeSync. Whereas V-Sync can be used on every monitor, G-Sync and FreeSync can only be used on monitors that support one of those technologies. The advantage of G-Sync or FreeSync over V-Sync is that the adaptive refresh rate technologies do not introduce input lag: instead of delaying frames, they adapt the monitor's refresh rate to match the FPS being achieved by the graphics card.
Brief explanation of Nvidia G-Sync and FreeSync
These adaptive refresh rate technologies are built into the monitor and provide an interface between the monitor and the graphics card. Through this interface, the graphics card can tell the monitor to change its refresh rate to match the FPS the card is generating. Thus, the refresh rate of the monitor changes in real time during gameplay depending on the FPS.
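Conceptually, the panel's refresh rate tracks the GPU's frame rate within a supported range. The sketch below is a toy model, not a real driver API; the 48-144Hz range is an assumed example of a typical adaptive sync window.

```python
# Toy model of adaptive sync: the panel refreshes in step with the GPU's
# frame rate, clamped to the panel's supported range (assumed 48-144 Hz here).

def adaptive_refresh_hz(gpu_fps: float,
                        min_hz: float = 48.0,
                        max_hz: float = 144.0) -> float:
    """Refresh rate the panel runs at for a given GPU frame rate."""
    return max(min_hz, min(gpu_fps, max_hz))

for fps in (30, 90, 200):
    print(f"GPU at {fps} FPS -> panel refreshes at {adaptive_refresh_hz(fps):.0f} Hz")
# GPU at 30 FPS -> panel refreshes at 48 Hz
# GPU at 90 FPS -> panel refreshes at 90 Hz
# GPU at 200 FPS -> panel refreshes at 144 Hz
```

Inside the supported window every frame is displayed as soon as it is ready, so there is nothing to tear; outside the window the panel falls back to its minimum or maximum rate.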
Since G-Sync monitors require certification and a proprietary chip from Nvidia, they are more expensive than their FreeSync counterparts. Furthermore, the G-Sync module can limit the capabilities of the monitor. A good example is LG's 34GK950F and 34GK950G. Both monitors use the same panel, but the 34GK950F delivers a refresh rate 20Hz higher than the 34GK950G. The FreeSync version also supports HDR400 and 8bit+FRC colour depth, while the G-Sync version does not support HDR and tops out at 8bit colour depth.
A monitor with G-Sync requires an Nvidia graphics card: AMD graphics cards do not support G-Sync. A monitor with FreeSync has traditionally required an AMD graphics card, although Nvidia has since enabled FreeSync support on its cards for certain monitors.
Is G-Sync worth it over FreeSync in 2019?
Users purchasing a new monitor or graphics card used to have to choose between an AMD card with a FreeSync monitor and an Nvidia card with a G-Sync monitor, since FreeSync was not supported by Nvidia cards and vice versa. As of Nvidia driver version 417.71, released on January 15, 2019, this is no longer a problem. Nvidia now supports some FreeSync monitors without the need for G-Sync hardware inside the monitor. You can see how to enable FreeSync from the Nvidia control panel here. Thus, if you have one of the Nvidia-approved FreeSync monitors and an Nvidia card, you can make use of FreeSync. If the monitor is approved by Nvidia to work with an Nvidia card, the adaptive refresh rate technology will work as well as it would on a monitor with G-Sync.
However, if the monitor is not on Nvidia's official list of supported FreeSync monitors, then it might not be fully supported, and artefacts or screen blackouts might be experienced when using FreeSync with an Nvidia graphics card. Note that we have tested non-approved monitors and were still able to get FreeSync working perfectly with an Nvidia graphics card. You can view a list of FreeSync monitors which work with Nvidia graphics cards here.
Using a FreeSync monitor with an Nvidia card does not necessarily mean that FreeSync will work well. Thus, our conclusion is not a clear-cut recommendation that a FreeSync monitor is always the way to go.
First off, we recommend you take a look at our list of Nvidia-supported FreeSync monitors here. If the monitor you are looking at works well with an Nvidia graphics card, then we suggest you go for it.
However, if the monitor you are looking at is not on our list and you have an Nvidia graphics card, then we suggest you consider a G-Sync monitor instead.
If you have an AMD graphics card, a G-Sync monitor will definitely not work, since AMD graphics cards do not support Nvidia G-Sync.