
Playing video games on a PC versus a living room game console has numerous advantages, from better textures to higher resolutions to tighter mouse-and-keyboard controls. But even on a $3,000-or-more desktop gaming PC with the latest processors and graphics cards, games can still display annoying visual artifacts, such as screen tearing and stutter. Tearing is horizontal distortion across the screen when playing a PC game, where it looks like one frame of animation is being half-written over another. It's something many PC gamers have just learned to live with.

Nvidia, maker of the popular GeForce line of graphics chips, has developed a display technology called G-Sync that promises to eliminate tearing and screen stutter and to reduce input lag (where input commands can fall out of sync with the action on-screen).

Previously, to minimize tearing, gamers had to go into the game settings, or the Nvidia control panel app, and turn on V-Sync (or vertical synchronization), a technology that dates back to the CRT monitor days. V-Sync can stop the graphics card's output from outpacing the refresh rate of the display, but at the potential cost of a serious performance hit and input lag. So most people leave V-Sync off, which leads to a problem: the next rendered frame is sent to the monitor even if the previous frame has not finished displaying. This is what causes tearing, other visual artifacts, and screen stutter.

Screen tearing without G-Sync in an Nvidia demo.

With G-Sync, a compatible graphics card (any GeForce GTX desktop card from the 600 series through the current 900 series) sends a signal to a G-Sync controller chip physically built into the monitor (yes, G-Sync requires a new, specially compatible monitor). G-Sync synchronizes the monitor's refresh rate to the GPU's render rate, so images display the moment they are ready.

We've tested the technology on several games, using a high-end desktop PC and a G-Sync monitor from Asus.
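The trade-off between no sync, V-Sync, and G-Sync described above can be illustrated with a toy timing model. This is a simplified sketch under stated assumptions, not Nvidia's implementation: a hypothetical 60 Hz panel, frame-completion times in milliseconds, and the simplification that an unaligned buffer swap always produces a visible tear.

```python
import math

REFRESH_MS = 1000 / 60.0  # hypothetical fixed 60 Hz panel (~16.7 ms per refresh)

def no_sync_tears(frame_times):
    """Without V-Sync, the buffer swaps the instant a frame finishes, so a
    frame completing mid-refresh splits that refresh between two frames.
    Count refresh intervals containing an unaligned swap (a tear)."""
    torn = set()
    for t in frame_times:
        if t % REFRESH_MS != 0:  # swap not aligned to a refresh boundary
            torn.add(int(t // REFRESH_MS))
    return len(torn)

def vsync_display_times(frame_times):
    """With V-Sync, each finished frame waits for the next fixed refresh
    tick: no tearing, but added latency, and a slow frame can land on the
    same tick as its neighbor, making the panel repeat an image (stutter)."""
    return [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in frame_times]

def gsync_display_times(frame_times):
    """With G-Sync, the panel refreshes when the GPU is ready (the panel's
    supported refresh range is ignored in this sketch)."""
    return list(frame_times)

if __name__ == "__main__":
    # GPU finishes frames at irregular times (ms), as in a real game:
    frames = [14.0, 30.0, 52.0, 63.0]
    print(no_sync_tears(frames))        # refreshes showing parts of two frames
    print(vsync_display_times(frames))  # each frame delayed to a refresh tick
    print(gsync_display_times(frames))  # displayed the moment they are ready
```

In this model the no-sync path tears on most refreshes, V-Sync delays every frame to the next tick (note the last two frames collapse onto the same tick, i.e. stutter), and G-Sync displays each frame with neither penalty.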
