
If you want smooth gameplay without screen tearing, and you want the high frame rates your Nvidia graphics card is capable of, Nvidia’s G-Sync adaptive sync technology is a feature you’ll want in your next monitor.

To get this feature, you can spend a lot on a monitor with G-Sync built in, like the high-end $1,999 Acer Predator X27, or you can spend less on a FreeSync monitor that has G-Sync compatibility by way of a software update. (As of this writing, there are 15 monitors that support the upgrade.)

However, there are still hundreds of FreeSync models that will likely never get the feature. According to Nvidia, “not all monitors go through a formal certification process, display panel quality varies, and there may be other issues that prevent gamers from receiving a noticeably improved experience.”

But even if you have an unsupported monitor, it may be possible to turn on G-Sync. You may even have a good experience — at first. I tested G-Sync with two unsupported models, and, unfortunately, the results just weren’t consistent enough to recommend over a supported monitor.

The 32-inch AOC CQ32G1 curved gaming monitor, for example, which is priced at $399, presented no issues when I played Apex Legends and Metro: Exodus — at first. Then some flickering started appearing during gameplay, though I hadn’t made any changes to the visual settings. I also tested it with Yakuza 0, which, surprisingly, served up the worst performance, even though it’s the least demanding title that I tested. Whether it was in full-screen or windowed mode, the frame rate was choppy.

Another unsupported monitor, the $550 Asus MG279Q, handled both Metro: Exodus and Forza Horizon 4 without any noticeable issues. (It’s easy to confuse the MG279Q for the Asus MG278Q, which is on Nvidia’s list of supported FreeSync models.) In Nvidia’s G-Sync benchmark, there was significant tearing early on, but, oddly, I couldn’t re-create it.

Before you begin, note that in order to achieve the highest frame rates with or without G-Sync turned on, you’ll need to use a DisplayPort cable. If you’re using a FreeSync monitor, chances are good that it came with one. But if not, they aren’t too expensive.

First, download and install the latest driver for your GPU, either from Nvidia’s website or through the GeForce Experience, Nvidia’s Windows 10 app that can tweak graphics settings on a per-game basis. All of Nvidia’s drivers since mid-January 2019 have included G-Sync support for select FreeSync monitors. Even if you don’t own a supported monitor, you’ll probably be able to toggle G-Sync on once you install the latest driver. Whether it will work well after you do turn the feature on is another question.

Once the driver is installed, open the Nvidia Control Panel. On the side column, you’ll see a new entry: Set up G-Sync. (If you don’t see this setting, switch on FreeSync using your monitor’s on-screen display. If you still don’t see it, you may be out of luck.)

Check the box that says “Enable G-Sync Compatible,” then click “Apply” to activate the settings. (The settings page will inform you that your monitor is not validated by Nvidia for G-Sync. Since you already know that is the case, don’t worry about it.)

Nvidia offers a downloadable G-Sync benchmark, which should quickly let you know if things are working as intended. If G-Sync is active, the animation shouldn’t exhibit any tearing or stuttering. But since you’re using an unsupported monitor, don’t be surprised if you see some iffy results. Next, try out some of your favorite games. If something is wrong, you’ll realize it pretty quickly.

There’s a good resource to check out on Reddit, where the PC community has created a huge list of unsupported FreeSync monitors, documenting each monitor’s pros and cons with G-Sync switched on. These real-world findings are insightful, but what you experience will vary depending on your PC configuration and the games that you play.

Vox Media has affiliate partnerships. These do not influence editorial content, though Vox Media may earn commissions for products purchased via affiliate links. For more information, see our ethics policy.


Information on this error message is REALLY sketchy online. Some say that the G-Sync LCD panel is hardwired to the dGPU and that the iGPU is connected to nothing. Some say that the dGPU is connected to the G-Sync LCD through the iGPU. Some say that they got the MUX switch working after an intentional ordering of BIOS update, then iGPU drivers, then dGPU drivers on a clean install.

I"m suspecting that if I connect an external 60hz IPS monitor to one of the display ports on the laptop and make it the only display, the Fn+F7 key will actually switch the graphics because the display is not a G-Sync LCD panel. Am I right on this?

If I"m right on this, does that mean that if I purchase this laptop, order a 15inch Alienware 60hz IPS screen and swap it with the FHD 120+hz screen currently inside, I will also continue to have MUX switch support and no G-Sync? The price for these screens is not outrageous.


In 2019, Nvidia introduced its G-Sync Compatibility program. It was somewhat shocking because it allowed monitors to don a G-Sync badge without investing in Nvidia’s proprietary hardware. Ever since, monitors new and old, including those that use AMD FreeSync technology, have been eligible for G-Sync Compatibility certification if they pass Nvidia’s tests. Additionally, Nvidia’s G-Sync Compatibility program is retroactive and internally funded, complicating the idea of G-Sync monitors carrying a tax.

Since G-Sync Compatibility’s arrival, we’ve tried running G-Sync on all gaming monitors that have arrived in our lab, whether they’re Nvidia-certified or not. The vast majority of FreeSync-only gaming monitors we tested successfully ran G-Sync (and you can learn how in our article How to Run G-Sync on a FreeSync Monitor).

So what’s the deal? Is either a standard G-Sync or G-Sync Compatibility certification truly necessary to use Adaptive-Sync with an Nvidia graphics card? Should Nvidia gamers nix a FreeSync-only monitor from their shopping list, or is it safe to assume they’ll be able to run G-Sync on it just fine? And is there any risk in running G-Sync on a non-certified display?

“Sometimes it might be okay, might be acceptable, satisfactory. Other times it might not. But we leave that choice to the end user,” Nvidia’s Sharma told Tom’s Hardware, echoing sentiments shared on Nvidia’s own website, which details how gamers can run G-Sync on monitors without certification while noting that it “may work, it may work partly, or it may not work at all.”

“And it"s easy enough to try,” Sharma added. “Take a monitor that you know is fixed frequency, an old one, 60 Hz, and then go into the video control panel, and turn on G-Sync Compatible and then go play a game. And check -- if the monitor doesn"t like the signal it’s getting it’ll just show a black screen [or something].”

Some monitor vendors also approve giving G-Sync a try. Jason Maryne, product marketing manager for LCD monitors at ViewSonic, which has G-Sync Compatible, G-Sync and FreeSync-only gaming monitors, told us that he’s not aware of any reason gamers shouldn’t try running G-Sync on ViewSonic’s FreeSync monitors.

“Every gamer with a FreeSync monitor should attempt to use G-Sync drivers (if they have an Nvidia graphics card), but their experience may not be as optimal as Nvidia wants it to be,” Maryne told Tom’s Hardware.

“In general, we recommend only using our monitors in the manner that they’ve been tested and certified to maximize performance and the end-user experience, which is why given the current great performance of FreeSync paired with the higher refresh specs of our monitors, there should not be any need to run our monitors in any un-supported modes,” Paul Collas, VP of product at Monoprice, which has FreeSync monitors but no G-Sync Compatible ones, told Tom’s Hardware.

We’ve also encountered some limitations in our testing. You can’t run the feature with HDR content, and we’ve been unable to run overdrive while running G-Sync on a FreeSync-only monitor.

If you want something that’ll fight screen tearing with your Nvidia graphics card right out of the box with satisfactory performance regardless of frame rates, official G-Sync Compatibility has its perks. To get why, it’s first important to understand what real Nvidia-certified G-Sync Compatibility is.

G-Sync Compatibility is a protocol within VESA’s DisplayPort 1.2 spec. That means it won’t work with HDMI connections (barring G-Sync Compatible TVs) or Nvidia cards older than the GTX 10-series. As you can see on Nvidia’s website, G-Sync Compatibility is available across a range of PC monitor brands, and even TVs (LG OLEDs, specifically) carry G-Sync Compatible certification.

Each monitor that gets the G-Sync Compatible stamp not only goes through the vendor’s own testing, as well as testing required to get its original Adaptive-Sync (most likely FreeSync) certification, but also Nvidia testing on multiple samples. Most monitors that apply for G-Sync Compatibility fail -- the success rate is under 10%, according to Nvidia’s Sharma. Earning the certification is about more than just being able to turn G-Sync on in the GeForce Experience app.

In May, Nvidia announced that only 28 (5.6%) of the 503 Adaptive-Sync monitors it tested for its G-Sync Compatibility program passed its tests. Apparently, 273 failed because their variable refresh rate (VRR) range wasn’t at least 2.4:1 (lowest refresh rate to highest refresh rate). At the time, Nvidia said this meant gamers were unlikely to get any of the benefits of VRR. Another 202 failed over image quality problems, such as flickering or blanking. “This could range in severity, from the monitor cutting out during gameplay (sure to get you killed in PvP MP games), to requiring power cycling and Control Panel changes every single time,” Nvidia explained. The remaining 33 monitors failed simply because they were no longer available.
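That 2.4:1 figure is the one hard number Nvidia has published, and it’s easy to check against any spec sheet. As a rough illustration, here is our own sketch of that single criterion in Python; it is not Nvidia’s test suite, which also covers flicker, blanking and other artifacts:

    # Sketch of the published 2.4:1 VRR range criterion only.
    def meets_vrr_range_requirement(vrr_min_hz, vrr_max_hz):
        return vrr_max_hz / vrr_min_hz >= 2.4

    print(meets_vrr_range_requirement(48, 144))  # True  (ratio 3.0)
    print(meets_vrr_range_requirement(48, 100))  # False (ratio ~2.08)

So a 48-144Hz monitor clears the range bar, while a 48-100Hz one fails on range alone.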

“Some were okay, some were good, but many were just outright bad. And when I say bad, it’s that the visual experience was very poor,” Sharma said. “And people would see things like extreme flicker in games, they’d see corruption on the stream, they’ve had difficulty enabling [VRR]. The supported VRR range would be really small.”

With G-Sync Compatible certification, your monitor’s guaranteed to run Nvidia VRR at the monitor’s maximum refresh rate, but that’s not the case with non-certified monitors, and the issue’s even worse if the monitor’s overclocked.

There should also be no flickering, which happens when the monitor changes brightness levels at different refresh rates, a common characteristic of LCD panels, Sharma said. In our own testing, we’ve found that running G-Sync on a non-certified monitor sometimes results in a lot of flickering in windowed screens.

Nvidia also looks for “corruption artifacts,” according to the exec, and VRR must be enabled by default without having to go into the on-screen display (OSD) or any PC settings.

G-Sync has long been associated with a price premium, so we often hear people question if perfectly good monitors are rejected from the G-Sync Compatibility program. Notably, Nvidia’s G-Sync Compatibility program isn’t pay-to-play -- at least not directly. The program is entirely internally funded. As Nvidia’s Sharma put it, “There’s no cost for anybody but Nvidia.”

“One of the key requirements was to have FreeSync enabled by default; (however, most monitors require that you enable FreeSync in the OSD). If not, the monitor was automatically disqualified, even if there weren’t any other issues,” Maryne noted.

As you can see, G-Sync Compatibility with a capital C entails more than just being able to turn G-Sync on. So, your experience running G-Sync on a monitor Nvidia hasn’t certified (because either it was never submitted or it failed the tests) will vary. This makes sense when you think about what exactly certifications are for: creating a standardized baseline of performance.

But it’s possible that a monitor lacking Nvidia’s seal of approval had some form of G-Sync testing performed. Pixio, for example, makes budget gaming monitors and doesn’t have any official G-Sync Compatible or standard G-Sync monitors. That’s not because it doesn’t want to or has never submitted a passing monitor; it’s because Pixio’s never been able to submit a monitor for certification at all.

“With FreeSync we’ve always been notified by AMD from the get-go that there’s an open certification process. They reached out to vendors specifically to ensure that the proper steps are taken. Whereas with Nvidia we haven’t seen or heard anything or gotten back from them about that kind of process,” Pixio’s Park told Tom’s Hardware.

Nvidia told us that any monitor vendor can submit a monitor for G-Sync Compatible validation. Vendors that don’t already have a relationship with Nvidia “should reach out to the G-Sync team at Nvidia, and we will work with the vendor to review the technical details and scheduling the validation testing,” a spokesperson said. However, the rep noted that the queue has been quite full, so it may take a while.

Pixio is a smaller vendor and is still looking to understand Nvidia’s G-Sync Compatible certification process in order to get some of its current and/or future monitors certified. In the meantime, Pixio does extensive testing with its gaming monitors so it can tell customers whether it thinks G-Sync will run on it satisfactorily. Since Pixio isn’t working with Nvidia, it holds G-Sync performance to the FreeSync standards it follows.

“If in the future we are able to get in touch with [Nvidia] we would definitely go by their requirements and make sure to do the proper steps ... But we let our consumers know we’re not officially certified by Nvidia but it does work with G-Sync compatibility to our current FreeSync certification process at the very least,” Park said.

Pixio’s G-Sync testing includes an accessible VRR range and testing for tearing, flicker and stutter. This occurs across GTX 10-series cards and on; however, RTX 30-series testing is less extensive since the cards are newer. Pixio tests a certain percentage of production units, with tests particularly focused around the monitor’s mainboard. Testing can range from a month to a year, depending on the features, with new models taking longer.

If you have an Nvidia graphics card, it’s true that the best gaming monitor for you will have some form of G-Sync. Sure, you can get G-Sync to run on most FreeSync monitors, but for a reliable VRR range and the promise of tear and flicker-free performance out of the box without digging into software, an official G-Sync Compatible certification is the minimum.

If you already have a FreeSync monitor, but are hoping to run G-Sync on it with an Nvidia GPU, there’s no harm in firing up GeForce Experience and enabling it. If you don’t notice any flickering or other artifacts, having to turn on G-Sync manually is a small price to pay.

Alternatively, if you’re buying a new monitor, plan on pairing it with an Nvidia card and are stuck between a FreeSync-only monitor and one with G-Sync Compatibility, weigh your options. Is a wide VRR range important to you? If you don’t have the best graphics card, are running intensive games and/or playing at high resolution, it likely is. Also consider how sensitive you are to flickering. Is this something you’d notice, and would it bother you? And do you plan on using overdrive?


Upgrading your computer with the latest technology only enhances your experience if you are able to get the maximum value out of your top-notch gear. It is clear that an AOC FreeSync monitor can stabilise and hone picture quality coming from a PC with AMD graphics cards, but thanks to recent technological improvements, it is now possible to transform an AOC FreeSync monitor into a G-Sync compatible display, working smoothly with GPUs from NVIDIA as well.

The GPU is usually not able to maintain a consistent frame rate, alternating between high spikes and sudden drops in performance. Its frame rate depends on the scenery the GPU has to display. For example, calm scenes in which there isn’t much going on demand less performance than epic, effect-laden boss fights.

When the frame rate of your GPU does not match the frame rate of your monitor, display issues occur: lag, tearing or stuttering when the monitor has to wait for new data or tries to display two different frames as one. To prevent these issues, the GPU and monitor need to be synchronised.
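To make the mismatch concrete, here is a small Python sketch (illustrative only; the 60Hz and 43 fps figures are assumed): a fixed-rate monitor regularly starts a refresh while the GPU is mid-way through producing the next frame, so a single refresh shows parts of two different frames. That is the tear.

    # Illustrative numbers: a fixed 60Hz monitor fed by a GPU at 43 fps.
    REFRESH_S = 1 / 60   # monitor refresh interval
    FRAME_S = 1 / 43     # GPU frame interval

    for r in range(5):
        scan_start = r * REFRESH_S
        scan_end = scan_start + REFRESH_S
        # newest completed frame at the top vs. the bottom of this scan-out
        frame_top = int(scan_start / FRAME_S)
        frame_bottom = int(scan_end / FRAME_S)
        tear = "  <- tear" if frame_top != frame_bottom else ""
        print(f"refresh {r}: shows frames {frame_top}..{frame_bottom}{tear}")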

With the new generations of NVIDIA graphics cards, it is possible to get the G-Sync features working on specific FreeSync AOC monitors as well. NVIDIA announced a list of certified AOC monitors which are also G-Sync compatible. Even if the AOC product is not on the list, you can still enable G-Sync on any AOC monitor and test the performance.*

Once you have installed the latest NVIDIA driver and switched on “Enable G-Sync Compatible” in the NVIDIA Control Panel, you should have successfully enabled G-Sync on your AOC FreeSync monitor. The picture quality stays perfect and you can enjoy your gaming session without disruptive image flaws.

* Please refer to the NVIDIA website for the complete list of GPUs working with FreeSync monitors and to see which monitors have been officially categorized as G-Sync Compatible.


It’s difficult to buy a computer monitor, graphics card, or laptop without seeing AMD FreeSync and Nvidia G-Sync branding. Both promise smoother, better gaming, and in some cases both appear on the same display. But what do G-Sync and FreeSync do, exactly – and which is better?

Most AMD FreeSync displays can sync with Nvidia graphics hardware, and most G-Sync Compatible displays can sync with AMD graphics hardware. This is unofficial, however.

The first problem is screen tearing. A display without adaptive sync will refresh at its set refresh rate (usually 60Hz, or 60 refreshes per second) no matter what. If the refresh happens to land between two frames, well, tough luck – you’ll see a bit of both. This is screen tearing.

Screen tearing is ugly and easy to notice, especially in 3D games. To fix it, games started to use a technique called V-Sync that locks the framerate of a game to the refresh rate of a display. This fixes screen tearing but also caps the performance of a game. It can also cause uneven frame pacing in some situations.

Adaptive sync is a better solution. A display with adaptive sync can change its refresh rate in response to how fast your graphics card is pumping out frames. If your GPU sends over 43 frames per second, your monitor displays those 43 frames, rather than forcing 60 refreshes per second. Adaptive sync stops screen tearing by preventing the display from refreshing with partial information from multiple frames but, unlike with V-Sync, each frame is shown immediately.
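In code terms the behaviour is a simple clamp. The sketch below is our own conceptual illustration in Python; the 40-144Hz VRR window is an assumed example, not any particular panel’s spec:

    def adaptive_refresh_hz(gpu_fps, vrr_min=40, vrr_max=144):
        # the panel's refresh rate tracks the GPU's frame rate,
        # clamped to the window the panel actually supports
        return max(vrr_min, min(gpu_fps, vrr_max))

    print(adaptive_refresh_hz(43))   # 43  -- refresh matches the frame rate
    print(adaptive_refresh_hz(200))  # 144 -- capped at the panel's maximum
    print(adaptive_refresh_hz(30))   # 40  -- below the floor (see LFC below)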

VESA Adaptive Sync is an open standard that any company can use to enable adaptive sync between a device and display. It’s used not only by AMD FreeSync and Nvidia G-Sync Compatible monitors but also other displays, such as HDTVs, that support Adaptive Sync.

AMD FreeSync and Nvidia G-Sync Compatible are so similar, in fact, they’re often cross compatible. A large majority of displays I test with support for either AMD FreeSync or Nvidia G-Sync Compatible will work with graphics hardware from the opposite brand.

This is how all G-Sync displays worked when Nvidia brought the technology to market in 2013. Unlike Nvidia G-Sync Compatible monitors, which often (unofficially) work with AMD Radeon GPUs, G-Sync is unique and proprietary. It only supports adaptive sync with Nvidia graphics hardware.

It’s usually possible to switch sides if you own an AMD FreeSync or Nvidia G-Sync Compatible display. If you buy a G-Sync or G-Sync Ultimate display, however, you’ll have to stick with Nvidia GeForce GPUs. (Here’s our guide to the best graphics cards for PC gaming.)

G-Sync and G-Sync Ultimate support the entire refresh range of a panel – even as low as 1Hz. This is important if you play games that may hit lower frame rates, since Adaptive Sync matches the display refresh rate with the output frame rate.

For example, if you’re playing Cyberpunk 2077 at an average of 30 FPS on a 4K display, that implies a refresh rate of 30Hz – which falls outside the range VESA Adaptive Sync supports. AMD FreeSync and Nvidia G-Sync Compatible may struggle with that, but Nvidia G-Sync and G-Sync Ultimate won’t have a problem.

AMD FreeSync Premium and FreeSync Premium Pro have their own technique of dealing with this situation called Low Framerate Compensation. It repeats frames to double the output such that it falls within a display’s supported refresh rate.
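The idea can be written out in a few lines. This is a sketch of the principle only (AMD’s actual multiplier logic is internal, and the 48-144Hz window is an assumed example): each frame is repeated just enough times to land the effective refresh rate back inside the supported range.

    def lfc_refresh_hz(fps, vrr_min=48, vrr_max=144):
        multiplier = 1
        while fps * multiplier < vrr_min:
            multiplier += 1          # each frame will be shown this many times
        return fps * multiplier

    print(lfc_refresh_hz(30))  # 60 -- each frame shown twice
    print(lfc_refresh_hz(20))  # 60 -- each frame shown three times
    print(lfc_refresh_hz(80))  # 80 -- already in range, no repetition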

Other differences boil down to certification and testing. AMD and Nvidia have their own certification programs that displays must pass to claim official compatibility. This is why not all VESA Adaptive Sync displays claim support for AMD FreeSync and Nvidia G-Sync Compatible.

You’ll sometimes see HDR marketed alongside these badges, but that’s a bunch of nonsense. Neither standard has anything to do with HDR, though it can be helpful to understand that some level of HDR support is included in those panels. The most common HDR standard, HDR10, is an open standard from the Consumer Technology Association. AMD and Nvidia have no control over it. You don’t need FreeSync or G-Sync to view HDR, either, even on each company’s graphics hardware.

Both standards are plug-and-play with officially compatible displays. Your desktop’s video card will detect that the display is certified and turn on AMD FreeSync or Nvidia G-Sync automatically. You may need to activate the respective adaptive sync technology in your monitor settings, however, though that step is a rarity in modern displays.

Displays that support VESA Adaptive Sync, but are not officially supported by your video card, require you to dig into AMD or Nvidia’s driver software and turn on the feature manually. This is a painless process, however – just check the box and save your settings.

AMD FreeSync and Nvidia G-Sync are also available for use with laptop displays. Unsurprisingly, laptops that have a compatible display will be configured to use AMD FreeSync or Nvidia G-Sync from the factory.

A note of caution, however: not all laptops with AMD or Nvidia graphics hardware have a display with Adaptive Sync support. Even some gaming laptops lack this feature. Pay close attention to the specifications.

VESA’s Adaptive Sync is on its way to being the common adaptive sync standard used by the entire display industry. Though not perfect, it’s good enough for most situations, and display companies don’t have to fool around with AMD or Nvidia to support it.

That leaves AMD FreeSync and Nvidia G-Sync searching for a purpose. AMD FreeSync and Nvidia G-Sync Compatible are essentially certification programs that monitor companies can use to slap another badge on a product, though they also ensure out-of-the-box compatibility with supported graphics cards. Nvidia’s G-Sync and G-Sync Ultimate are technically superior, but require proprietary Nvidia hardware that adds to a display’s price. This is why G-Sync and G-Sync Ultimate monitors are becoming less common.


When buying a gaming monitor, it’s important to compare G-Sync vs FreeSync. Both technologies improve monitor performance by matching the performance of the screen with the graphics card. And there are clear advantages and disadvantages of each: G-Sync offers premium performance at a higher price while FreeSync is prone to certain screen artifacts like ghosting.

So G-Sync versus FreeSync? Ultimately, it’s up to you to decide which is the best for you (with the help of our guide below). Or you can learn more about ViewSonic’s professional gaming monitors here.

In the past, monitor manufacturers relied on the V-Sync standard to ensure consumers and business professionals could use their displays without issues when connected to high-performance computers. As technology became faster, however, new standards were developed — the two main ones being G-Sync and Freesync.

V-Sync, short for vertical synchronization, is a display technology that was originally designed to help monitor manufacturers prevent screen tearing. This occurs when two different “screens” crash into each other because the monitor’s refresh rate can’t keep pace with the data being sent from the graphics card. The distortion is easy to spot as it causes a cut or misalignment to appear in the image.

V-Sync often comes in handy in gaming. For example, GamingScan reports that the average computer game operates at 60 FPS. Many high-end games operate at 120 FPS or greater, which requires the monitor to have a refresh rate of 120Hz to 165Hz. If the game is run on a monitor with a refresh rate that’s less than 120Hz, performance issues arise.

Although V-Sync technology is commonly used when users are playing modern video games, it also works well with legacy games. The reason for this is that V-Sync slows down the frame rate output from the graphics cards to match the legacy standards.

Despite its effectiveness at eliminating screen tearing, it often causes issues such as screen “stuttering” and input lag. The former is a scenario where the time between frames varies noticeably, leading to choppiness in image appearances.

Although the technology works well with low-end devices, V-Sync degrades the performance of high-end graphics cards. That’s the reason display manufacturers have begun releasing gaming monitors with refresh rates of 144Hz, 165Hz, and even 240Hz.

While V-Sync worked well with legacy monitors, it often prevents modern graphics cards from operating at peak performance. For example, gaming monitors often have a refresh rate of at least 100Hz. If the graphics card outputs content at low frame rates (e.g. 60 FPS), V-Sync would prevent the graphics card from operating at peak performance.

Since the creation of V-Sync, other technologies such as G-Sync and FreeSync have emerged to not only fix display performance issues, but also to enhance image elements such as screen resolution, image colors, or brightness levels.

Released to the public in 2013, G-Sync is a technology developed by NVIDIA that synchronizes a user’s display to a device’s graphics card output, leading to smoother performance, especially with gaming. G-Sync has gained popularity in the electronics space because monitor refresh rates often outpace the GPU’s ability to output data, which results in significant performance issues.

The most notable benefit of G-Sync technology is the elimination of screen tearing and other common display issues associated with V-Sync equipment. G-Sync equipment does this by manipulating the monitor’s vertical blanking interval (VBI).
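Conceptually, the VBI manipulation works like the sketch below (a simplification; the real module logic is proprietary, and the frame times here are invented): the monitor holds its blanking interval open until the GPU delivers the next frame, rather than refreshing on a fixed schedule.

    import random

    MIN_REFRESH_S = 1 / 144  # the panel cannot refresh faster than its max rate

    for n in range(5):
        frame_time = random.uniform(1 / 140, 1 / 45)  # stand-in for GPU render time
        vbi_hold = max(frame_time, MIN_REFRESH_S)     # the VBI stretches to fit
        print(f"frame {n}: rendered in {frame_time * 1000:5.1f} ms, "
              f"refresh fired after {vbi_hold * 1000:5.1f} ms")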

To keep pace with changes in technology, NVIDIA developed a newer version of G-Sync, called G-Sync Ultimate. This new standard is a more advanced version of G-Sync. The core features that set it apart from G-Sync equipment are the built-in R3 module, high dynamic range (HDR) support, and the ability to display 4K quality images at 144Hz.

Although G-Sync delivers exceptional performance across the board, its primary disadvantage is the price. To take full advantage of native G-Sync technologies, users need to purchase a G-Sync-equipped monitor and graphics card. This two-part equipment requirement limited the number of G-Sync devices consumers could choose from. It’s also worth noting that these monitors require the graphics card to support DisplayPort connectivity.

Released in 2015, FreeSync is a standard developed by AMD that, similar to G-Sync, is an adaptive synchronization technology for liquid-crystal displays. It’s intended to reduce screen tearing and stuttering triggered by the monitor not being in sync with the content frame rate.

Since this technology uses the Adaptive Sync standard built into the DisplayPort 1.2a standard, any monitor equipped with this input can be compatible with FreeSync technology. With that in mind, FreeSync is not compatible with legacy connections such as VGA and DVI.

The “free” in FreeSync comes from the standard being open, meaning other manufacturers are able to incorporate it into their equipment without paying royalties to AMD. This means many FreeSync devices on the market cost less than similar G-Sync-equipped devices.

As FreeSync is a standard developed by AMD, most of their modern graphics processing units support the technology. A variety of other electronics manufacturers also support the technology, and with the right knowledge, you can even get FreeSync to work on NVIDIA equipment.

Although FreeSync is a significant improvement over the V-Sync standard, it isn’t a perfect technology. The most notable drawback of FreeSync is ghosting. This is when an object leaves behind a bit of its previous image position, causing a shadow-like image to appear.

A key difference between FreeSync and FreeSync 2 devices is that with the latter technology, if the frame rate falls below the supported range of the monitor, low framerate compensation (LFC) is automatically enabled to prevent stuttering and tearing.

If you want low input lag and don’t mind tearing, then the FreeSync standard is a good fit for you. On the other hand, if you’re looking for smooth motions without tearing, and are okay with minor input lag, then G-Sync equipped monitors are a better choice.


The Dell S2716DG is Dell’s first gaming monitor, offering some of the best gaming display performance along with a list of features to match gamers’ speed of reaction while displaying clear and undistorted images for the best gaming experience. Some of the key features of this monitor are:

NVIDIA G-SYNC is a groundbreaking new display technology that delivers the smoothest gaming experience ever. G-SYNC’s revolutionary smoothness is achieved by synchronizing display refresh rates to the GPU in your GeForce GTX-powered desktop or notebook, eliminating screen tearing and minimizing display stutter and input lag.

Nvidia G-SYNC feature is automatically enabled on all supported computers. If it is not, you can manually enable the Nvidia G-SYNC feature using the Nvidia Control Panel by following these instructions:

Ensure that the latest version of the Nvidia GPU driver is installed. If not, visit the Nvidia website to download and install the latest drivers for your graphics card.

For maximum display performance with Microsoft Windows Operating Systems, set the display resolution to 2560 x 1440 pixels in the Windows display settings.

Your monitor has a built-in diagnostic tool that helps to determine if the screen abnormality you are experiencing is a problem with the monitor or with the video card on your computer.

If you do not detect any screen abnormalities upon using the built-in diagnostic tool, the monitor is functioning properly. Check the video card (GPU) and the computer.

Disable the Monitor Deep Sleep mode if you notice either of the two scenarios mentioned below. Follow the instructions under ‘Disable the Monitor Deep Sleep Mode’.

Pressing any button (except the Power button) on the front panel of your Dell S2716DG Monitor may also wake up the monitor after it goes into deep sleep.

The Dell S2716DG Monitor does not support Self-Test Feature Check (STFC). When the monitor does not detect any signal, it will display the message "Enter Power Save Mode" for 15 seconds and then will go into Deep Sleep Mode.

The result: scenes appear instantly, objects look sharper, and the gameplay is super smooth, giving you a stunning visual experience and a serious competitive edge.

Nvidia G-SYNC technology currently supports only DisplayPort video input. Using DVI to DisplayPort or HDMI to DisplayPort converters/adapters is currently not supported.

ULMB and Nvidia G-SYNC feature cannot be enabled at the same time. You can choose to eliminate screen tears or improve motion resolution but not both.


Take a look at our best gaming monitor recommendations and you’ll find pretty much all of them have support for either AMD FreeSync or Nvidia’s G-Sync variable refresh rate technology. But what exactly are G-Sync and FreeSync, and why are they so important for gaming? Below, you’ll find all the answers to your burning G-Sync FreeSync questions, including what each one does, how they’re different, and which one is best for your monitor and graphics card. I’ll also be talking about where Nvidia’s G-Sync Compatible standard fits in with all this, as well as what Nvidia’s G-Sync Ultimate, AMD’s FreeSync Premium and FreeSync Premium Pro specifications bring to the table.

Unlike V-Sync, which is a similar technology you’ll often find in a game’s settings menu, G-Sync and FreeSync don’t add input lag, and they don’t force your GPU to stick to whatever your monitor’s refresh rate is - an approach that causes stuttering with V-Sync. Instead they adjust the refresh rate on the fly.

As such, it"s worth using one or the other, but Nvidia and AMD have tweaked their technologies over the years, adding new features and different tiers with varying capabilities. Which is best? It’s time to find out.

G-Sync is Nvidia"s variable refresh rate technology, and (unsurprisingly) requires an Nvidia graphics card in order to work. That"s generally not a problem, unless you were waiting for a lesser-spotted RTX 3080, but an even bigger problem might be how much the best Nvidia G-Sync monitors tend to cost.

This is why you often only find G-Sync on higher-end monitors, as it’s simply not cost-effective to include it on cheaper ones. That said, since G-Sync is a fixed standard, you also know exactly what you’re getting whenever you buy a G-Sync enabled display: the effectiveness won’t be slightly different between monitors, as is the case with FreeSync.

If you’ve got an Nvidia card but no money left for a genuine, certified G-Sync display, all hope is not lost. The G-Sync Compatible label is given to AMD FreeSync monitors where Nvidia GPUs can essentially ride the coattails of FreeSync, granting you adaptive syncing via the onboard DisplayPort protocols instead of an expensive processor. Is it incredibly cheeky of Nvidia to borrow rival tech like this? A bit. Is it a bad deal for consumers? Nope.

That’s important, because while in theory any FreeSync monitor could support syncing with Nvidia graphics cards, in practice many – over 200, according to Nvidia – show signs of blanking, pulsing, flickering and other nasty-looking artefacts when they were put to the test. In other words, if you’re an Nvidia card owner looking to save cash on their next monitor, look out for the G-Sync Compatible label – or just check our list of G-Sync Compatible monitors.

The result is that FreeSync monitors are cheaper than their otherwise identical (or near-identical) G-Sync counterparts, often by £100 or more. There are lots more to choose from, too: whereas Nvidia lists 83 monitors with full G-Sync or G-Sync Ultimate support, AMD’s list of FreeSync monitors totals 1309. One-thousand, three-hundred and nine.

However, while a FreeSync display still needs to be certified by AMD before it can get a FreeSync sticker on the box, the standard isn’t fixed like it is with G-Sync. This means that your FreeSync experience can vary from monitor to monitor, and not all FreeSync monitors come with exactly the same features.

For starters, FreeSync’s variable refresh rate tech only works within a certain frame rate range. Some monitors have support for frame rates as low as 30fps, but most will only kick in if your frame rate is over 40fps, or even 48fps. That means that if your graphics card’s output drops below 30fps, 40fps, 48fps, or whatever the monitor’s lower limit is, FreeSync stops being effective and you don’t get any benefit whatsoever.

Some FreeSync monitors try to smooth out low-fps performance using something called Low Framerate Compensation, or LFC, which duplicates the number of frames being shown when they drop below a certain threshold – 30fps being bumped up to 60fps, for example. However, the monitor in question will need to have this feature built-in, so you may not find it on cheaper FreeSync models.

Here’s the basics, though: FreeSync is the lowest, most widely available standard, and works exactly as described a few paragraphs back. FreeSync Premium is a newer standard, with certification requiring both the LFC feature as standard and a 120Hz refresh rate.

Lastly, there’s FreeSync Premium Pro, which essentially replaces the now-defunct FreeSync 2 HDR standard. If anything, it’s just FreeSync 2 HDR by another name, its headline feature being HDR Support “with meticulous colour and luminance certification”. We still don’t know exactly where that puts FreeSync Premium Pro in relation to established HDR standards like DisplayHDR 400 and DisplayHDR 600, but at least all the games that supported FreeSync 2 HDR – including Assassin’s Creed Odyssey, Resident Evil 2 and The Division 2 – will also work with HDR via this “new” standard.

FreeSync Premium Pro also bundles in the benefits of the tiers below it, so you can be sure a monitor with this on the box will feature LFC and at least a 120Hz refresh rate.

The problem with G-Sync, though, is that it’s so damn expensive. It’s a hard sell at the best of times, and prices only climb the further you move up the sizing scale. The inconvenience of trying to ascertain different monitors’ different FreeSync ranges is, comparatively, a small price to pay.

Of course, if you’re already a determined AMD GPU owner, you don’t have much of a choice in the first place, but for those with Nvidia cards the best solution would be to take the G-Sync Compatible route. Particularly if you’re not fussed about Ultra Low Motion Blur, or don’t play enough twitchy shooters to justify native G-Sync’s lower latency, it’s a good way to secure adaptive sync while also saving yourself a bundle.


Variable refresh rate (VRR), also referred to as adaptive sync, allows the monitor to adjust its refresh rate to the output signal. This allows for games to eliminate screen tearing with less of the usual downsides of Vsync (such as stuttering). For a comprehensive look at VRR see PC Gaming Wiki.

FreeSync is AMD"s implementation of VESA"s VRR standard, and the phrases are often used interchangeably. FreeSync branded monitors should be compatible with all VESA compatible drivers.

For setup purposes, it is necessary to differentiate between “native” G-SYNC monitors, which license Nvidia’s own chip, and G-SYNC Compatible monitors: FreeSync monitors which support a subset of G-SYNC’s functionality. [1] Within the category of G-SYNC Compatible monitors, the monitor may or may not be validated by NVIDIA. [2] [3] Even if a VRR monitor has not passed NVIDIA’s validation (and thus would not be called G-SYNC Compatible in marketing material), you may still be capable of using it with G-SYNC.

If no G-SYNC option is reported, the monitor does not seem to support G-SYNC. Note that there are some FreeSync VRR monitors which are not G-SYNC compatible at all. [7]

Note: If a monitor did fail NVIDIA’s certification to be G-SYNC compatible, there may be issues with the experience such as poor image quality, flickering, or lack of VRR activation due to limited refresh rate range. [8]

Install the appropriate packages from the AUR to use VRR in GNOME. VRR needs to be enabled for each supported monitor in the Displays settings. When running on a supported and enabled monitor, GNOME automatically enables VRR for all full screen applications.

With VRR off, if the application’s FPS is less than the monitor’s native refresh rate then the bars will stutter a lot since frames are being skipped. With VRR active, the bars will always move smoothly across the screen since the screen’s refresh rate will match the application’s refresh rate. Even with VRR functional you may experience tearing, in which case you can also enable the TearFree option for AMDGPU; with both enabled there should be neither stuttering nor tearing (what is the Nvidia equivalent?).

If you are using an Nvidia GPU, you can test G-SYNC compatibility with the gl-gsync-demo package from the AUR. This program will allow you to test VRR and Vsync so you can observe the resulting effects. See the project’s Readme for more information.

According to this page: "gl-gsync-demo is made with G-SYNC but that does not matter, it will test AMD adaptive sync just fine". However, it may still not work as expected for FreeSync testing.

The monitor must be plugged in via DisplayPort. Some displays which implement (part of) the HDMI 2.1 specification also support VRR over HDMI. This is supported by the Nvidia driver, and by the AMD driver (pre-HDMI 2.1) in kernel 5.13 and later. [16]

Compositors will most likely need to be disabled before the OpenGL/Vulkan program is started (disabling compositors is not relevant or necessary on Wayland [17]).

Although tearing is much less noticeable at higher refresh rates, FreeSync monitors often have a VRR range that tops out around 90Hz, which can be much lower than their maximum refresh rate. See Change VRR Range of a FreeSync Monitor.


The Dell S2721DGF and the LG 27GP850-B are very similar, each with strengths and weaknesses. The LG has an optional black frame insertion feature, which can help reduce the amount of persistence blur seen on-screen. The Dell has a more versatile stand, as it can swivel and switch to portrait orientation on either side, and it feels a bit better built than the LG.

The MSI Optix MAG274QRF-QD and the LG 27GP850-B are similar 1440p, 27-inch monitors, but there are a few differences. The MSI has a few extra features for office use, like an ergonomic stand and a USB-C input that supports DisplayPort Alt Mode. However, colors look oversaturated, and the color accuracy is much better on the LG. The LG is also slightly better for gaming because it supports DP 1.4 bandwidth, allowing you to reach a higher refresh rate, and the motion handling is a bit better with lower frame rate signals.

The LG 27GP850-B is better than the LG 27GL850-B, but the differences are minor and might not matter to everyone. The 27GP850-B has a slightly faster refresh rate, resulting in better motion handling and a touch less motion blur behind fast-moving objects. The 27GP850-B also has an optional black frame insertion feature, but most people won’t use this when gaming anyway.

The Samsung Odyssey G7 C32G75T and the LG 27GP850-B use different panel technologies, each with strengths and weaknesses. The LG has better viewing angles, but this comes at the expense of contrast. The Samsung has much better contrast, so it’s a better choice for a dark room. The Samsung’s black frame insertion (BFI) feature is far more versatile, as it’s available across the entire refresh rate range of the monitor, as low as 60Hz, while the BFI on the LG is only available in a narrow range.

The LG 27GP83B-B and the LG 27GP850-B perform nearly identically overall. The 27GP850-B is a bit more feature-packed, with a higher refresh rate, an optional black frame insertion feature, and a built-in USB hub.

The Gigabyte M27Q X and the LG 27GP850-B are pretty similar overall. The Gigabyte has a higher native refresh rate, but this doesn’t really translate to better motion handling, as the LG looks a bit better overall, especially when gaming on a console below the monitor’s max refresh rate. The Gigabyte has better connectivity and more features, with high bandwidth USB-C and a built-in keyboard, video, and mouse switch.

The LG 27GP850-B is a bit better than the LG 27GN850-B. The 27GP850 has a higher refresh rate, resulting in a faster response time and clearer motion. The 27GP850 also has an optional black frame insertion feature to reduce the appearance of persistence blur, but it’s a bit limited and only works over a narrow refresh rate range. Finally, the 27GP850 has slightly better connectivity, with a built-in USB hub.

The LG 27GP850-B and the Samsung Odyssey G5 S27AG50 are both excellent gaming monitors with similar features. They both have a 1440p resolution with native FreeSync support and a 165Hz refresh rate, but you can overclock the refresh rate to 180Hz on the LG. Motion handling is superb on each, and they both have low input lag for gaming, but there are a few differences in other areas. The LG displays a wide color gamut for HDR content, which the Samsung doesn’t, but it doesn’t add much because neither delivers a satisfying HDR experience. The LG also has two USB 3.0 inputs, while the Samsung’s USB port is for service only, but the Samsung has much better ergonomics because you can swivel it.

The LG 27GP850-B is slightly better than the ASUS TUF Gaming VG27AQL1A for gaming, but the ASUS is better for office use. The LG has a much faster response time, resulting in clearer motion with less blur behind fast-moving objects. On the other hand, the ASUS has much better ergonomics, so it might be slightly easier to place it in an ideal viewing position.

The LG 32GP850-B and the LG 27GP850-B are nearly identical. The 32 inch model is more accurate out of the box, and the 27 inch model has better text clarity due to the higher pixel density. Other than that, the differences between these models can almost entirely be attributed to panel variance.

The Samsung Odyssey G7 S28AG70 and the LG 27GP850-B are both excellent for gaming, but they have different features. The Samsung has a 4k resolution with a 144Hz refresh rate, while the LG has a 1440p resolution and a higher 180Hz max refresh rate. The LG has a slightly better response time, especially at 60Hz, and it’s better for bright rooms because it gets brighter and has better reflection handling. However, the Samsung is a better choice for console gaming thanks to its HDMI 2.1 inputs, and it has a local dimming feature, which the LG doesn’t have, but it causes blooming around bright objects.

The ASUS ROG Swift PG279QM and the LG 27GP850-B deliver very similar performance, each with strengths and weaknesses. The ASUS has better ergonomics, so it’s easier to place it in an ideal viewing position. On the other hand, the LG has a faster response time at the max refresh rate, and it has an optional backlight strobing feature to improve the appearance of motion.

The LG 27GN950-B and the LG 27GP850-B are both great gaming monitors from the same lineup, with similar designs and gaming performances. The main difference is that the 27GN950-B is a 4k model with a 160Hz refresh rate, while the 27GP850-B is a 1440p model with a 180Hz refresh rate. In HDR, the 27GP850-B has a much wider color gamut, but the 27GN950-B gets a lot brighter to make highlights pop.

The Gigabyte M32Q is slightly better than the LG 27GP850-B. The Gigabyte has better vertical viewing angles, slightly better ergonomics, a more versatile black frame insertion feature that’s available over a wider range of refresh rates, and it has a larger screen. The Gigabyte also offers slightly better connectivity, with a built-in KVM and USB-C port.

The LG 27GP850-B is significantly better than the Samsung Odyssey G5/G55A S27AG55. The LG has much better ergonomics, so it’s easier to place it in an ideal viewing position. The LG also has much better gaming performance, with a significantly faster response time, so motion looks smoother overall, with less blur behind fast-moving objects. The LG also gets brighter and has a wider viewing angle, so the image remains accurate to the sides if you’re sitting close to the screen.

The LG 27GP850-B and the Razer Raptor 27 165Hz are both great monitors. They each have a 1440p resolution with a 165Hz native refresh rate, but you can overclock the LG to 180Hz. Motion looks better on the LG thanks to the quicker response times, and its stand can rotate into portrait mode. On the other hand, the Razer’s stand can tilt a full 90 degrees backwards, and it has a better selection of inputs because there’s a USB-C input.

The LG 27GP850-B and the Dell P3223DE are different types of 1440p monitors. The LG is a gaming monitor with a high 180Hz refresh rate and VRR support for a tear-free gaming experience. Because of that, it also has a quicker response time for smoother motion handling. On the other hand, the Dell is an office monitor with two more USB 3.0 ports compared to the LG, it has a USB-C input, and it has much better ergonomics that make it easier to place in an ideal position.


If you’re a gamer, it’s always been a challenge to balance the performance of your graphics card with your monitor. For many years people have had to live with issues like “tearing”, where the image on the screen distorts and “tears” in places, creating a distracting and unwanted experience. Tearing is a problem caused when a frame rate is out of sync with the refresh rate of the display. The only real option historically has been to use a feature called Vsync to bring both in sync with one another, but not without introducing some issues of its own at the same time, which we will explain in a moment. Back in 2014 – 15 we saw a step change in how refresh rates are handled between graphics card and monitor and the arrival of “variable refresh rate” technologies. NVIDIA and AMD, the two major graphics card manufacturers, each have their own approach to making this work, which we will look at in this article. We are not going to go into mega detail about the graphics card side of things here; there’s plenty of material online about that. We instead want to focus on the monitor side of things a bit more, as is our interest at TFTCentral.

As an introduction, monitors typically operate at a fixed refresh rate, whether that is common refresh rates like 60Hz, 120Hz, 144Hz or above. When running graphically intense content like games, the frame rate will fluctuate somewhat and this poses a potential issue to the user. The frame rate you can achieve from your system very much depends on the power of your graphics card and PC in general, along with how demanding the content is itself. This can be impacted by the resolution of your display and the game detail and enhancement settings amongst other things. The higher you push your settings, the more demand there will be on your system and the harder it might be to achieve the desired frame rate. This is where an issue called tearing can start to be a problem as the frame rate output from your graphics card can’t keep up with the fixed refresh rate of your monitor. Tearing is a distracting image artefact where the image becomes disjointed or separated in places, causing issues for gamers and an unwanted visual experience.

There were traditionally two main options available for how frames are passed from the graphics card to the monitor using a feature called Vsync, with settings simply for on or off.

At the most basic level ‘VSync OFF’ allows the GPU to send frames to the monitor as soon as they have been processed, irrespective of whether the monitor has finished its refresh and is ready to move onto the next frame. This allows you to run at higher frame rates than the refresh rate of your monitor but can lead to a lot of problems. When the frame rate of the game and refresh rate of the monitor are different, things become unsynchronised. This lack of synchronisation coupled with the nature of monitor refreshes (typically from top to bottom) causes the monitor to display a different frame towards the top of the screen vs. the bottom. This results in distinctive ‘tearing’ on the monitor that really bothers some users. Even on a 120Hz or 144Hz monitor, where some users claim that there is no tearing, the tearing is still there. It is generally less noticeable but it is definitely still there. Tearing can become particularly noticeable during faster horizontal motion (e.g. turning, panning, strafing), especially at lower refresh rates.

‘VSync ON’, by contrast, holds each completed frame until the monitor is ready to refresh, which removes tearing but introduces input lag. During Vsync ON operation, there can also sometimes be a sudden slowdown in frame rates when the GPU has to work harder. This creates situations where the frame rate suddenly halves, such as 60 frames per second slowing down to 30 frames per second. During Vsync ON, if your graphics card is not running flat-out, these frame rate transitions can be very jarring. These sudden changes to frame rates create sudden changes in lag, and this can disrupt game play, especially in first-person shooters.
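The halving falls out of simple arithmetic under the classic double-buffered Vsync model, as the quick Python sketch below shows (our own simplification, not actual driver logic): a finished frame can only be displayed on a refresh boundary, so each frame occupies a whole number of refreshes.

    import math

    def vsync_fps(render_fps, refresh_hz=60):
        # each frame is held until a refresh boundary, so a frame
        # occupies a whole number of refresh intervals
        refreshes_per_frame = math.ceil(refresh_hz / render_fps)
        return refresh_hz / refreshes_per_frame

    print(vsync_fps(60))  # 60.0
    print(vsync_fps(59))  # 30.0 -- just missing one refresh halves the rate
    print(vsync_fps(35))  # 30.0
    print(vsync_fps(25))  # 20.0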

To overcome these limitations with Vsync, both NVIDIA and AMD have introduced new technologies based on a “variable refresh rate” (VRR) principle. These technologies can be integrated into monitors allowing them to dynamically alter the monitor refresh rate depending on the graphics card output and frame rate. The frame rate of the monitor is still limited in much the same way it is without a variable refresh rate technology, but it adjusts dynamically to a refresh rate to match the frame rate of the game. By doing this the monitor refresh rate is perfectly synchronised with the GPU. You don’t get the screen tearing or visual latency of having Vsync disabled, nor do you get the stuttering or input lag associated with using Vsync. You can get the benefit of higher frame rates from Vsync off but without the tearing, and without the lag and stuttering caused if you switch to Vsync On.

NVIDIA were first to launch capability for variable refresh rates with their G-sync technology. G-sync was launched mid 2014 with the first screen we tested being the Asus ROG Swift PG278Q. It’s been used in many gaming screens since with a lot of success.

Traditionally NVIDIA G-sync required a proprietary “G-sync module” hardware chip to be added to the monitor, in place of a traditional scaler chip. This allows the screen to communicate with the graphics card to control the variable refresh rate and gives NVIDIA a level of control over the quality of the screens produced under their G-sync banner. As NVIDIA say on their website: “Every G-sync desktop monitor and laptop display goes through rigorous testing for consistent quality and maximum performance that’s optimized for the GeForce GTX gaming platform”.

This does however add an additional cost to production and therefore the retail price of these hardware G-sync displays, often in the realm of £100 – 200 compared with alternative non G-sync models. Even higher when you consider the new v2 module, which often adds £400 – 600. This is often criticized by consumers who dislike having to pay the “G-sync Tax” to get a screen that can support variable refresh rates from their NVIDIA graphics card. There have been some recent changes to this in 2019, which we will discuss later, in relation to allowing support for G-sync from other non-module screens.

There have been three generations of the G-sync module produced by NVIDIA to date, although when discussed, the first two generations are normally merged into a single “Version 1” label as they are so similar, with v2 representing the real step change in capability.

Using a hardware G-sync module in place of a traditional scaler has some positives and negatives. It is somewhat limited by the available video connections with only a single DisplayPort and single HDMI connection offered currently, no matter whether it’s a v1 or v2 module. In contrast, screens without the chip can support other common interfaces like DVI and VGA, while also allowing support for the latest USB type-C connection which is becoming increasingly popular.

Due to the lack of a traditional scaler, some OSD options might be unavailable compared with a normal screen, so things like Picture In Picture (PiP) and Picture By Picture (PbP) are not supported. The active cooling fan used so far for the v2 module has also been criticised from those who like a quieter PC, or where manufacturers have used a noisy fan.

NVIDIA G-sync screens with the hardware module generally have a nice wide variable refresh rate (VRR) range. You will often see this listed in the product spec as something like “40 – 144Hz”, or confirmed via third party testing. We have seen lots of FreeSync screens, particularly from the FreeSync 1 generation, with far more limited VRR ranges. NVIDIA also seem to be at the forefront of bringing the highest refresh rate gaming monitors to market first, so you will often see the latest and greatest models with G-sync support a fair while before alternative FreeSync options become available.

Overclocking of refresh rates on some displays has been made possible largely thanks to the G-sync module. The presence of this module, and absence of a traditional scaler has allowed previously much slower panels to be successfully overclocked to higher refresh rates. For instance the first wave of high refresh rate 34″ ultrawide screens like the Acer Predator X34 and Asus ROG Swift PG348Q had a 100Hz refresh rate, but were actually using older 60Hz native panels. The G-sync module allowed a very good boost in refresh rate, and some excellent performance improvements as a result. This pattern continues today, as you will often see screens featuring the G-sync module advertised with a normal “native” refresh rate, and then an overclocked refresh rate where the panel has been boosted. For instance there’s quite a lot of 144Hz native screens which can be boosted to 165Hz or above thanks to the G-sync module.

The above is allowing an overclock of the LCD panel, while operating the G-sync module within its specifications. We should mention briefly the capability to also overclock the G-sync module itself, pushing it a little beyond its recommended specs. This has only been done in this way once as far as we know, with the LG 34GK950G. That screen featured a 3440 x 1440 resolution panel with a natively supported 144Hz refresh rate, but it was combined with the v1 G-sync module. This was presumably to help avoid increasing costs of using the v2 module, especially as providing HDR support was not a priority. With the 3440 x 1440 @144Hz panel being used, this was beyond the bandwidth capabilities of the v1 module and so natively the screen will support up to 100Hz. It was however possible to enable an overclock of the G-sync module via the OSD overclocking feature on the monitor, pushing the refresh rate up to 120Hz as a result. The panel didn’t need overclocking here, only the G-sync module. We mention this only in case other monitors emerge where manufacturers opt to use the v1 module for cost saving benefits, but need to push its capabilities a little beyond its native support. It does seem that the chip is capable of being overclocked somewhat if needed.

From our many tests of screens featuring the hardware G-sync module, the response times of the panels and the overdrive that is used seems to be generally very reliable and consistent, producing strong performance at both low and high refresh rates. This seems to be more consistent than what we have seen from FreeSync screens so far where often the overdrive impulse is impacted negatively by changes to the screens refresh rate. NVIDIA also talk about how their G-sync technology allows for “variable overdrive” where the overdrive is apparently tuned across the entire refresh rate range for optimal performance.

G-sync modules also often support a native blur reduction mode, dubbed ULMB (Ultra Low Motion Blur). This allows the user to opt for a strobe backlight system if they want, in order to reduce perceived motion blur in gaming. It cannot be used at the same time as G-sync since ULMB operates at a fixed refresh rate only, but it’s a useful extra option for many of these G-sync module gaming screens. Of course since G-sync/ULMB are an NVIDIA technology, it only works with specific G-sync compatible NVIDIA graphics cards. While you can still use a G-sync monitor from an AMD/Intel graphics card for other uses, you can’t use the actual G-sync or ULMB functions.

It should be noted that the real benefits of G-sync really come into play when viewing lower frame rate content, around 45 – 60fps typically delivers the best results compared with Vsync on/off. At consistently higher frame rates as you get nearer to 144 fps the benefits of G-sync are not as great, but still apparent. There will be a gradual transition period for each user where the benefits of using G-sync decrease, and it may instead be better to use the ULMB feature if it’s been included, which is not available when using G-sync. Higher end gaming machines might be able to push out higher frame rates more consistently and so you might find less benefit in using G-sync. The ULMB could then help in another very important area, helping to reduce the perceived motion blur caused by LCD displays. It’s nice to have both G-sync and ULMB available to choose from certainly on these G-sync enabled displays. Soon after launch NVIDIA added the option to choose how frequencies outside of the supported range are handled. Previously it would revert to Vsync on behaviour, but the user now has the choice for various settings including Fast Sync, V-sync, no synchronisation and allowing the application to decide.

With the release of the “v2 module” NVIDIA added support for High Dynamic Range (HDR) to their hardware which is not supported from the v1 module. This allows support for HDR10 content along with G-sync VRR from high end NVIDIA graphics cards and systems.

Their system requirements at the time of writing stipulate a GeForce GTX 1050 GPU or higher with a DisplayPort 1.4 interface, along with Windows 10 for your operating system. It should be noted that the G-sync v2 module is quite significantly more expensive than the v1 module, estimated to add around £400 – 600 to the retail price of a display.