
Hardware

Nvidia G-SYNC

Moderator: JC Denton

Nvidia G-SYNC

Post by Tain » 18 Oct 2013 16:13

Nvidia just announced a new monitor technology that lets the GPU control the monitor's refresh timing, eliminating the problems of v-sync (input lag, stutter when the frame rate drops below the refresh rate) while still avoiding screen tearing.

INLINE_NV_GSync_MonitorKV_Rev5.jpg


http://blogs.nvidia.com/blog/2013/10/18/g-sync/

Montreal, to me, has always meant hockey masks, maple syrup and Mounties. I never thought it was the doorway to gaming heaven.

But when we launched NVIDIA G-SYNC here today, I could see my life as a gamer getting better. Actually, I could see the rest of my life getting better, too, because we’ve been working flat out to get this out the door for several years.

The idea behind G-SYNC is simple, even if the technology isn’t. It’s to deliver visually stunning games, without the artifacts that jolt you out of the zone – like a guy who keeps standing up in front of you during a great movie.

An Obstacle to Great Gaming

Now, I’ve been gaming on PCs for just about 20 years. I started out with “Doom” and “Descent.” Ever since, I’ve been reaching for the hard stuff. A perfect evening for me is coming home after a long day at NVIDIA, cracking open a Guinness and sitting down to a session of “Starcraft II.” Or even better, jumping into a new indie release like “Antichamber.” That thing still blows my mind.

But much as I love gaming, I’ve always hated the choice you have to make when synchronizing to your monitor. With V-SYNC off you get fast input response, but images are seriously corrupted by tearing. Or you can turn V-SYNC on, but then games get laggy, and any time the GPU’s FPS falls below the refresh rate of the monitor, animation stutters badly.

Imagine if your fully armed buddies can see you. But your system won’t let you see them. The input lag can get you killed. Given the options, it’s not surprising that competitive gamers pick the lesser of two evils and run with V-SYNC off. But it’s still short of perfection.

What I want from gaming is to get immersed in the experience. I want to feel the cracked concrete under my feet. Or the buzzing jungle closing in around me. Or the flash of the grenade nearby. Stuttering and tearing are distortions that bring me back to the beige carpet in my game lounge. They make me wonder what caused them and I get jolted out of the zone pretty fast.

Getting to the Root of The Problem

This same observation got Jen-Hsun, our CEO, to commission some of the brightest minds at NVIDIA to solve this problem. We brought together about two dozen GPU architects and other senior guys to take apart the problem and look at why some games are smooth and others aren’t.

It turns out that our entire industry has been syncing the GPU’s frame-rendering rate to the monitor’s refresh rate – usually 60Hz – and it’s this syncing that’s causing a lot of the problems.

Sadly, for historic reasons, monitors have fixed refresh rates, typically 60Hz. That’s because PC monitors initially used a lot of technology from TVs, and in the U.S. we standardized on a 60Hz refresh way back in the 1940s, around the time Ed Sullivan was still a fresh face. That occurred because the U.S. power grid is based on 60Hz AC power, and setting TV refresh rates to match made early electronics easier to build. The PC industry just sort of inherited this behavior because TVs were the low-cost way to get a display.

So back at NVIDIA, we began to question whether we shouldn’t do it the other way. Instead of trying to get a GPU’s output to synchronize with a monitor refresh, what if we synchronized the monitor refresh to the GPU render rate?

No More Tearing, No More Stutters

Hundreds of engineer-years later, we’ve developed the G-SYNC module. It’s built to fit inside a display and work with the hardware and software in most of our GeForce GTX GPUs.

With G-SYNC, the monitor begins a refresh cycle right after each frame is completely rendered on the GPU. Since the GPU’s render time varies from frame to frame, the monitor’s refresh no longer has a fixed rate.

This brings big benefits for gamers. First, since the GPU drives the timing of the refresh, the monitor is always in sync with the GPU. So, no more tearing. Second, the monitor update is in perfect harmony with the GPU at any FPS. So, no more stutters, because even as scene complexity is changing, the GPU and monitor remain in sync. Also, you get the same great response time that competitive gamers get by turning off V-SYNC.
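
To make the timing difference concrete, here is a minimal sketch (not Nvidia's implementation, just an illustration of the behavior described above) that simulates when frames become visible under a fixed 60Hz refresh with V-SYNC versus a variable refresh that scans out as soon as each frame finishes. The frame times and helper names are invented for the example, and buffering details are ignored.

Code:
# Rough illustration only: compare when frames appear on screen with a fixed
# 60Hz refresh (V-SYNC) versus a G-SYNC-style variable refresh.
# Frame render times (ms) are invented for the example.

render_times_ms = [14.0, 17.5, 16.0, 21.0, 15.0, 18.5]
REFRESH_MS = 1000.0 / 60.0  # 16.67ms refresh interval at 60Hz

def vsync_display_times(render_times):
    """Each frame waits for the next fixed refresh boundary after it finishes."""
    shown, t = [], 0.0
    for r in render_times:
        t += r                                       # frame finishes rendering at t
        boundary = -(-t // REFRESH_MS) * REFRESH_MS  # round up to the next refresh
        shown.append(boundary)
    return shown

def gsync_display_times(render_times):
    """The monitor refreshes as soon as each frame is done (ignoring the panel's
    minimum/maximum refresh interval for simplicity)."""
    shown, t = [], 0.0
    for r in render_times:
        t += r
        shown.append(t)                              # shown immediately on completion
    return shown

for label, times in (("v-sync", vsync_display_times(render_times_ms)),
                     ("g-sync", gsync_display_times(render_times_ms))):
    gaps = [round(b - a, 2) for a, b in zip(times, times[1:])]
    print(label, "frame-to-frame gaps (ms):", gaps)
# Under v-sync the gaps snap to multiples of 16.67ms (a 33ms hitch when a frame
# misses a boundary); under g-sync they simply track the actual render times.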

G-SYNC moves us a little closer to gaming nirvana – a world of great image quality with no tearing, no monitor stutter, and really fast input response. That lets me get back in the zone when I game. Already, I can see my beige shag carpet and Costco art prints in their crooked frames slowly replaced by the hard rock roads and dragon fire of “Skyrim.”

Life is good again.
Tain
 
Joined: 15 Jul 2007 05:28

Post by El Chaos » 20 Oct 2013 22:30

Nvidia G-Sync: the end of screen-tear in PC gaming, Digital Foundry goes eyes-on with the revelatory new technology: http://www.eurogamer.net/articles/digit ... -pc-gaming
El Chaos
Insomnia Staff
 
Joined: 26 Jan 2009 20:34
Location: Buenos Aires, Argentina

Post by El Chaos » 10 Nov 2013 15:37

Carmack, Sweeney and Andersson unplugged, the big three rendering architects effectively did an AMA at the G-Sync launch. Here's the transcript: http://www.eurogamer.net/articles/digit ... -interview

Richard Leadbetter wrote:We're not especially keen on multi-page articles on Eurogamer these days, but the sheer scale and scope of the discussion here - over 13,000 words in its edited form - demands a way to make the content navigable, so we've split the transcript into six separate sections. You can jump straight to the topics of interest by clicking on the links below.

El Chaos
Insomnia Staff
 
Joined: 26 Jan 2009 20:34
Location: Buenos Aires, Argentina


Post by Tain » 12 Dec 2013 16:07

Anandtech posted a review: http://www.anandtech.com/show/7582/nvidia-gsync-review

Assassin’s Creed IV

I started out playing Assassin’s Creed IV multiplayer with v-sync off. I used GeForce Experience to predetermine the game quality settings, which ended up being maxed out even on my GeForce GTX 760 test hardware. With v-sync off and the display set to 60Hz, there was just tons of tearing everywhere. In AC4 the tearing was arguably even worse as it seemed to take place in the upper 40% of the display, dangerously close to where my eyes were focused most of the time. Playing with v-sync off was clearly not an option for me.

Next was to enable v-sync with the refresh rate left at 60Hz. Much of AC4 renders at 60 fps, although in some scenes both outdoors and indoors I saw frame rates drop down into the 40 - 51 fps range. Here with v-sync enabled I started noticing stuttering, especially as I moved the camera around and the complexity of what was being rendered varied. In some scenes the stuttering was pretty noticeable. I played through a bunch of rounds with v-sync enabled before enabling G-Sync.

I enabled G-Sync, once again leaving the refresh rate at 60Hz and dove back into the game. I was shocked; virtually all stuttering vanished. I had to keep FRAPS running to remind me of areas where I should be seeing stuttering. The combination of fast enough hardware to keep the frame rate in the G-Sync sweet spot of 40 - 60 fps and the G-Sync display itself produced a level of smoothness that I hadn’t seen before. I actually realized that I was playing Assassin’s Creed IV with an Xbox 360 controller literally two feet away from my PS4 and having a substantially better experience.

Batman: Arkham Origins

Next up on my list was Batman: Arkham Origins. I hadn’t played the past couple of Batman games but they always seemed interesting to me so I was glad to spend some time with this one. Having skipped the previous ones, I obviously didn’t have the repetitive/unoriginal criticisms of the game that some others seemed to have had. Instead I enjoyed its pace and thought it was a decent way to kill some time (or in this case, test a G-Sync display).

Once again I started off with v-sync off and the display set to 60Hz. For a while I didn’t see any tearing; that changed when I ended up inside a tower during the second mission of the game. I was panning across a small room and immediately encountered a ridiculous amount of tearing. This was even worse than Assassin’s Creed. What’s interesting about the tearing in Batman is that it felt less frequent than in AC4’s multiplayer, but when it happened it was substantially worse.

Next up was v-sync on, once again at 60Hz. Here I noticed sharp variations in frame rate resulting in tons of stutter. The stutter was pretty consistent both outdoors (panning across the city) and indoors (while fighting large groups of enemies). I remember seeing the stutter and noting that it was just something I’m used to expecting. Traditionally I’d fight this on a 60Hz panel by lowering quality settings to at least drive for more time at 60 fps. With G-Sync enabled, it turns out I wouldn’t have to.

The improvement to Batman was insane. I kept expecting it to somehow not work, but G-Sync really did smooth out the vast majority of stuttering I encountered in the game - all without touching a single quality setting. You can still see some hiccups, but they are the result of other things (CPU limitations, streaming textures, etc…). That brings up another point about G-Sync: once you remove GPU/display synchronization as a source of stutter, all other visual artifacts become even more obvious. Things like aliasing and texture crawl/shimmer become even more distracting. The good news is you can address those things, often with a faster GPU, which all of a sudden makes G-Sync an even smarter play on NVIDIA’s part. Playing with G-Sync enabled raises my expectations for literally all other parts of the visual experience.

Sleeping Dogs

I’ve been wanting to play Sleeping Dogs ever since it came out, and the G-Sync review gave me the opportunity to do just that. I like the premise and the change of scenery compared to the sandbox games I’m used to (read: GTA), and at least thus far I can put up with the not-quite-perfect camera and fairly uninspired driving feel. The bigger story here is that running Sleeping Dogs at max quality settings gave my GTX 760 enough of a workout to really showcase the limits of G-Sync.

With v-sync (60Hz) on I typically saw frame rates around 30 - 45 fps, but there were many situations where the frame rate would drop down to 28 fps. I was really curious to see what the impact of G-Sync was here since below 30 fps G-Sync would repeat frames to maintain a 30Hz refresh on the display itself.

The first thing I noticed after enabling G-Sync was that my instantaneous frame rate (according to FRAPS) dropped from 27-28 fps down to 25-26 fps. This is the G-Sync polling overhead I mentioned earlier. Now not only did the frame rate drop, but the display had to start repeating frames, which resulted in a substantially worse experience. The only solution here was to decrease quality settings to get frame rates back up again. I was glad I ran into this situation as it shows that while G-Sync may be a great solution to improve playability, you still need a fast enough GPU to drive the whole thing.
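
A little arithmetic helps make the Sleeping Dogs result concrete. The snippet below is only a simplified model of the frame-repeat behavior the review describes, with an assumed 30Hz floor and an assumed ~6.9ms scan-out time for a 144Hz panel; the real module's logic and timings are Nvidia's and are not documented here.

Code:
# Simplified model (assumptions, not Nvidia's actual logic): when a frame takes
# longer than the panel's maximum hold time, the module re-scans the old frame,
# and the new frame cannot be shown until that re-scan completes.
MAX_HOLD_MS = 1000.0 / 30.0     # ~33.3ms: assumed longest the panel holds a frame
SCAN_MS = 1000.0 / 144.0        # ~6.9ms: assumed scan-out time of a 144Hz panel

def visible_gap_ms(frame_time_ms):
    """Approximate time between visible updates for one rendered frame."""
    if frame_time_ms <= MAX_HOLD_MS:
        return frame_time_ms                # the frame drives a refresh as soon as it is ready
    repeat_done = MAX_HOLD_MS + SCAN_MS     # re-scan of the previous frame finishes here
    return max(frame_time_ms, repeat_done)  # the new frame has to wait for that re-scan

for fps in (40, 31, 28, 26):
    ft = 1000.0 / fps
    gap = visible_gap_ms(ft)
    print(f"{fps:>2} fps render -> ~{gap:.1f}ms between updates (~{1000.0 / gap:.0f} fps visible)")
# Just under 30 fps the visible rate falls to roughly 25 fps in this model, which
# is in the same ballpark as the drop observed above once frame repetition kicks in.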

Dota 2 & Starcraft II

The impact of G-Sync can also be reduced at the other end of the spectrum. I tried both Dota 2 and Starcraft II with my GTX 760/G-Sync test system and in both cases I didn’t have a substantially better experience than with v-sync alone. Both games ran well enough on my 1080p testbed to almost always be at 60 fps, which made v-sync and G-Sync interchangeable in terms of experience.

Bioshock Infinite @ 144Hz

Up to this point all of my testing kept the refresh rate stuck at 60Hz. I was curious to see what the impact would be of running everything at 144Hz, so I did just that. This time I turned to Bioshock Infinite, whose integrated benchmark mode is a great test as there’s tons of visible tearing or stuttering depending on whether or not you have v-sync enabled.

Increasing the refresh rate to 144Hz definitely reduced the amount of tearing visible with v-sync disabled. I’d call it a substantial improvement, although not quite perfect. Enabling v-sync at 144Hz got rid of the tearing but still kept a substantial amount of stuttering, particularly at the very beginning of the benchmark loop. Finally, enabling G-Sync fixed almost everything. The G-Sync on scenario was just super smooth with only a few hiccups.

What’s interesting to me about this last situation is that if 120/144Hz reduces tearing to the point where you’re OK with it, G-Sync may be a solution to a problem you no longer care about. If you’re hyper-sensitive to tearing, however, there’s still value in G-Sync even at these high refresh rates.
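
One way to see why the higher refresh rate blunts tearing: with v-sync off, a torn image only persists until the next scan-out begins, so the worst-case lifetime of a tear shrinks with refresh rate. A trivial calculation (my illustration, not from the review):

Code:
# With v-sync off, a torn frame stays on screen at most until the next scan-out,
# so the worst-case persistence of a tear shrinks as the refresh rate rises.
for hz in (60, 120, 144):
    print(f"{hz}Hz: a tear persists for at most ~{1000.0 / hz:.2f}ms")
# 60Hz -> ~16.67ms, 120Hz -> ~8.33ms, 144Hz -> ~6.94ms; roughly 2.4x shorter at
# 144Hz, which is why unsynchronized tearing is less objectionable there.
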
Tain
 
Joined: 15 Jul 2007 05:28

Post by El Chaos » 26 Jan 2014 18:01

Nvidia G-Sync review, the future of display technology:
http://www.eurogamer.net/articles/digit ... ync-review

Richard Leadbetter wrote:G-Sync is the best possible hardware solution to the age-old problems of screen-tear and v-sync judder. By putting the graphics card fully in charge of the screen refresh, effectively we have the visual integrity of v-sync along with the ability to run at unlocked frame-rates - something previously only possible by enduring ugly screen-tear. Out of the box, G-Sync provides a clearly superior experience, but it is not quite the magic bullet that solves all the issues of fluidity in PC gaming - something needs to change software-side too.

When we first looked at G-Sync at Nvidia's Montreal launch event, we marvelled at the sheer consistency of the experience in the pendulum and Tomb Raider demos. A drop down to 45fps incurred a little ghosting (frames were displayed on-screen for longer than the 60Hz standard 16.67ms, after all) but the fluidity of the experience looked very, very similar to the same demos running at 60fps - a remarkable achievement. However, the reason they looked so good was because of the regularity in the frame-rate - and that's not something typically associated with PC gaming. By running games completely unlocked, actual consistency while you play remains highly variable. G-Sync can mitigate the effects of this - but only to a certain degree.

There's a frame-rate threshold where the G-Sync effect begins to falter. It'll change from person to person, and from game to game, but across our testing, we found the sweet spot to be between 50-60fps in fast action games. Continual fluctuations beneath that were noticeable and while the overall presentation is preferable to v-sync, it still looked and felt not quite right. Owing to the nigh-on infinite combination of different PC components in any given unit, the onus will be on the user to gauge his quality settings effectively to hit the window, and equally importantly, the developer should aim for a consistent performance level across the game. It's no use tweaking your settings for optimal gameplay, only to find that the next level of the title incurs a much heavier GPU load. And if our G-Sync testing has taught us anything, it's that - within reason - consistent frame-rates are more important than the fastest possible rendering in any given situation.

That being the case, with careful application, G-Sync opens up a lot more possibilities. In the era of the 60Hz monitor, the most consistent, judder-free experience we can get is either with a locked 60fps, or else the console standard 30fps. As we've discussed, theoretically, with G-Sync, target frame-rate could be set anywhere (40fps for example) and locked there without the stutter you'd have on a current display. To make that possible, ideally what we really need to see is the introduction of frame-rate limiters in PC graphics settings. This has interesting implications for benchmarking GPUs because suddenly, lowest frame-rates become that much more important than the review-standard averages.
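
The frame-rate limiter idea is easy to sketch. The loop below is a generic pacing sketch of my own (not from the article or any particular engine); render and present are placeholder callables, and the point is simply that frames are released no faster than the chosen target, which a variable-refresh display can then show the instant they arrive.

Code:
import time

# Generic frame limiter sketch: cap presentation at an arbitrary target such as
# 40 fps and let a variable-refresh display start a refresh the moment each
# frame is presented. render() and present() are placeholders, not a real API.
TARGET_FPS = 40
FRAME_INTERVAL = 1.0 / TARGET_FPS            # 25ms between presented frames

def run_capped(render, present, n_frames=200):
    next_deadline = time.perf_counter()
    for _ in range(n_frames):
        render()                             # do the frame's work
        now = time.perf_counter()
        if now < next_deadline:
            time.sleep(next_deadline - now)  # hold the finished frame until its 25ms slot
        present()                            # on a G-Sync display this starts a refresh
        # Re-base after an overrun so one slow frame does not snowball into a backlog.
        next_deadline = max(next_deadline + FRAME_INTERVAL, time.perf_counter())

# Do-nothing placeholders standing in for real rendering and presentation:
run_capped(lambda: None, lambda: None, n_frames=5)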

Overall, G-Sync is a hardware triumph, but the quest for a consistent, enjoyable gameplay experience is far from over. By eliminating the video artefacts, G-Sync lays bare the underlying problems of wildly variable gameplay frame-rates in PC gaming and highlights the problems of inconsistent input latency. If the hardware issue is now fixed, what's required now are software solutions to make the most of this exceptional technology.
El Chaos
Insomnia Staff
 
Joined: 26 Jan 2009 20:34
Location: Buenos Aires, Argentina

Post by icycalm » 05 Feb 2014 04:39

http://shmups.system11.org/viewtopic.php?f=3&t=49132

bulbousbeard wrote:Bought a G-Sync monitor; impressions here

* G-Sync is incredibly rough around the edges right now. It's buggy, things don't work consistently, and you're constantly losing your ULMB (Ultra Low Motion Blur) settings when you switch from application to application. The latest Nvidia beta drivers don't even WORK with G-Sync. This is NOT consumer ready yet. This is prototype hardware as far as I'm concerned. I wouldn't buy this expecting to use it as your main monitor.

* The Asus VG248QE monitor that Nvidia built the G-Sync module for is a piece of shit. It has a laughably awful backlight, color reproduction gives off the distinct whiff of ass, and 1080p isn't even close to enough resolution to do HLSL properly. You're really looking at 1600p minimum before HLSL even starts to look decent. This monitor isn't good enough for a high quality arcade cabinet. You're going to want to wait for at least the 27" Asus 1440p monitor coming out in Q2 this year.

* ULMB (Ultra Low Motion Blur) is the new G-Sync strobed backlight method, and it's pretty fantastic on the desktop. I can use auto-scroll in my web browser, rapidly scroll through a page, and still make out most of the text because it's not blurred to fuck. I don't have to stop to read the text anymore.

* Even though Nvidia's pretending that most applications will "just work" with G-Sync, emulation is an entirely different story. If G-Sync monitors had been widely available in 1997, MAME would have been designed a lot differently. The basic throttling mechanisms that emulators use simply don't make sense anymore when you have a G-Sync monitor. MAME's core requires significant work to really take advantage of this technology. [A rough sketch of what pacing to the emulated machine's own rate looks like follows at the end of this post.]

* MAME and G-Sync don't really work that well together as it stands. For example, if I have HLSL on, G-Sync works fine and everything is smooth, but if I disable HLSL and just run D3D with prescale, I can see noticeable tearing. It's almost as if G-Sync doesn't actually turn on unless the GPU is doing a certain amount of work. It's got to be a bug. The company that sold me this monitor actually said that a second revision of G-Sync DIY kits is going to be released; I think that the first run is glitchy and has a lot of problems.

* GroovyMAME's current black frame insertion solution doesn't work well. It periodically flashes white or black. It's far from ready to go for a cabinet.

* With G-Sync, you can run GroovyMAME on an LCD with frame_delay without getting that weird tearing effect. I think VERY low input lag solutions are possible with this setup. You could do double the native refresh rate AND framedelay. It seems like you could get input lag down to virtually nothing.

* BSNES is a piece of shit program. It might be the most accurate SNES emulator in the world, but it's a shitty Windows program. It stutters, it hitches, it doesn't even have a full screen exclusive mode. It sucks. It doesn't work with G-Sync at all. Useless emulator. Who cares if you're accurate when the program itself doesn't run well? I wouldn't ever actually want to PLAY A GAME with BSNES in its current state, so what good is it?

* Kega Fusion does work with G-Sync, but it has no HLSL options, so it looks like crap.

* Guacamelee works with G-Sync and even supports 120Hz and 144Hz refresh rates natively.

* Playing the Mega Play version of Sonic the Hedgehog 2 in Groovy UME with black frame insertion on is pretty staggering. There's basically no motion blur--even when running around at maximum speed. You don't even believe what you're seeing initially. It really does give you a glimmer of a possible future in which we don't need CRTs anymore for a decent picture in fast-moving games. That's the NICE part of black frame insertion. Here's the bad part...

* Black frame insertion isn't a silver bullet. It DOES look worse than an LCD backlight running normally. It's as if a faint fog is covering the monitor. Color quality is diminished. It might partially be this piece of shit VG248QE panel, but the colors aren't vibrant at all. A great CRT is VIVACIOUS. The colors are so bright and vibrant. With black frame insertion on, it's like you're walking around on a cloudy day in the rain. I'm not entirely sold on it. [A sketch of how BFI paces refreshes follows this list.]

* Interestingly, G-Sync seems to make a more compelling difference in newer PC games than it does in MAME. Diablo 3, which is notorious for being a stuttery, hitchy pile of crap, runs butter-smooth with G-Sync on. It makes a huge difference. If you've played Diablo 3, you've probably experienced the game's constant hiccups. They're almost entirely gone with G-Sync on. It's like playing DOOM with the frame cap removed for the first time. It's a crisp slap in the face.

* Contrary to the bullshit and misinformation that's been spread, G-Sync works with OpenGL. It works perfectly with Quake 3.

* This G-Sync monitor is DisplayPort only. I can see why people dislike DisplayPort; my BIOS doesn't even show with this piece of crap. I don't get an image until Windows has booted.
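
On the black frame insertion bullets above: the general technique is easy to sketch. This is my own illustration of the idea, not GroovyMAME's or Nvidia's code, and it assumes a panel running at twice the content rate; every other refresh shows black, so each image is lit for only half as long, which cuts sample-and-hold blur roughly in half at the cost of brightness.

Code:
# Generic black frame insertion sketch (my illustration, not GroovyMAME's code):
# run the panel at twice the content rate and show black on every other refresh.
CONTENT_HZ = 60
PANEL_HZ = CONTENT_HZ * 2                    # one image refresh plus one black refresh per frame

def bfi_schedule(n_content_frames):
    """Yield (time_ms, what_is_on_screen) for 60Hz content on a 120Hz panel."""
    refresh_ms = 1000.0 / PANEL_HZ           # ~8.3ms per panel refresh
    t = 0.0
    for frame in range(n_content_frames):
        yield (t, f"frame {frame}")          # the image is lit for one panel refresh...
        t += refresh_ms
        yield (t, "black")                   # ...then blanked for the next one
        t += refresh_ms

for when, what in bfi_schedule(3):
    print(f"{when:6.1f}ms: {what}")
# Each image persists ~8.3ms instead of ~16.7ms, so motion blur drops sharply,
# but the screen is dark half the time, hence the dimmer, foggier look noted above.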

I can more or less see how it's going to play out in the future. A 4K-resolution, 30" OLED monitor with G-Sync or a similar technology is basically going to be the tech that can practically replace a CRT (really, be even better than a CRT).
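
On the emulator throttling point above: a fixed-refresh display forces an emulator to fit, say, a 57.5Hz arcade game into 16.67ms slots, which means either tearing or judder, whereas on a variable-refresh display the emulator can pace purely to the emulated machine's own frame period and present immediately. The loop below is my rough sketch of that idea, not MAME's actual code; emulate_frame and present are placeholders.

Code:
import time

# Rough sketch (mine, not MAME's main loop): throttle to the emulated machine's
# own frame period and hand each finished frame straight to the display. On a
# variable-refresh monitor this yields one refresh per emulated frame, with no
# tearing and no 60Hz judder for machines that never ran at exactly 60Hz.
EMULATED_HZ = 57.5                   # e.g. an arcade board that runs just under 60Hz
FRAME_PERIOD = 1.0 / EMULATED_HZ     # ~17.4ms

def emulate_frame():
    pass                             # placeholder: run one frame of the emulated machine

def present():
    pass                             # placeholder: on G-Sync this starts a refresh immediately

def run(n_frames=5):
    next_time = time.perf_counter()
    for _ in range(n_frames):
        emulate_frame()
        now = time.perf_counter()
        if now < next_time:
            time.sleep(next_time - now)   # throttle to the original machine's speed...
        present()                         # ...and present without waiting for a fixed 60Hz tick
        next_time += FRAME_PERIOD

run()
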
icycalm
Hyperborean
 
Joined: 28 Mar 2006 00:08
Location: Tenerife, Canary Islands

Post by SriK » 08 Jul 2015 08:58

Seven new G-Sync monitors are on the way, including a 27-inch 1440p 144Hz IPS display: http://www.pcgamer.com/seven-new-g-sync ... s-display/

Wes Fenlon wrote:Nvidia didn't just show up to Computex 2015 with the new 980 Ti in tow: the graphics company is also showing off seven new G-Sync monitors from Acer and Asus. And if you weren't already excited about G-Sync, these monitors might do it: the spread includes three 4K monitors, three ultrawide options, and a new, very exciting IPS model: the 27-inch 2560x1440, 144 Hz IPS Asus PG279Q.

Let me explain why that last one is a big deal. Our favorite gaming monitor is a G-Sync monitor, the great ASUS RoG Swift. But there's been one major obstacle standing between Nvidia's variable refresh technology and true greatness: color quality. To achieve high refresh rates like 144 Hz, most G-Sync displays have used TN panels, which are fast and cheap. But TN panels suffer when it comes to color quality, offering a more muted picture and poor viewing angles that look even more washed out if you're not looking at them straight on. IPS displays offer far more vivid colors and great viewing angles, but they're more expensive, and generally slower to refresh (hence visible ghosting on some IPS monitors). Finally, it looks like that's changing.

Acer has already released a 144 Hz 2560x1440 G-Sync monitor, which is also a 27-inch model. It costs a steep $800, but Asus introducing its own model may shake things up a bit. And with both Acer and Asus now offering 144 Hz IPS displays, we'll hopefully be seeing more high refresh IPS monitors in the near future, stretching from a more affordable 1080p to a premium 4K.

Here's the full list of new G-Sync monitors coming down the pipe.

  • Asus PG279Q, 2560x1440, 27", IPS, 144Hz
  • Acer Z35, 2560x1080, 35", VA, 144Hz
  • Acer X34, 3440x1440, 34", IPS, 75Hz
  • Asus PG34Q, 3440x1440, 34", IPS, 60Hz
  • Acer XB271HK, 4K, 27", IPS, 60Hz
  • Acer XB281HK, 4K, 28", TN, 60Hz
  • Asus PG27AQ, 4K, 27", IPS, 60Hz

Nvidia also announced today that an update for G-Sync will add support for windowed mode. We don't have release dates or prices on any of these monitors yet, but most or all of them should be at Computex, where we'll get some hands-on time to test them out. Eventually they'll join our list of the currently available G-Sync and FreeSync monitors.
SriK
 
Joined: 05 Nov 2011 15:12

Post by El Chaos » 08 Jan 2017 22:25

G-SYNC HDR Monitors At CES 2017, Gaming’s Ultimate Displays Just Got Better: http://www.geforce.com/whats-new/articl ... t-ces-2017
El Chaos
Insomnia Staff
 
Joined: 26 Jan 2009 20:34
Location: Buenos Aires, Argentina

Post by icycalm » 15 Jul 2021 10:30

I must be getting old because I don’t understand what he’s saying. It sounds important though?

https://shmups.system11.org/viewtopic.php?t=68786

cave hermit wrote:Mame low latency flag on a G-sync monitor: fire and forget?

I've been messing around with groovymame and groovyarcade on an OptiPlex I bought off eBay a while back, and I went so far as to flash a 15kHz BIOS mod for an ATI graphics card in preparation to hook it up to my PVM (I've been using a 31kHz PC CRT prior to that).

After tating my PVM twice to play Dodonpachi on my MiSTer, I decided I would stop doing so given how rare and finicky these old PVMs are. As for playing Groovymame on my PC CRT, honestly, if I'm just doing it for the CRT experience there's not much point in compromising with a 31kHz monitor and artificial scanlines.

I'm planning to get a spare pivoting LCD monitor for Switch/PS4 shmups, and I noticed the monitor had freesync, so I was considering dropping another $125 for an RX 550 on eBay so I could use groovymame and freesync with the OptiPlex, when I realized something painfully obvious:

I have a PC with an RTX 3080 (an Nvidia card) connected to a high-end native G-sync monitor. Also, said monitor pivots (although I can't actually pivot it at the moment due to my desk having this stupid mini shelf in the way, but hey, future consideration). Thus I could just use plain old mame with the recently added low latency flag.

So I downloaded the latest version of MAME, turned on low latency, disabled bilinear filtering, enabled integer scaling, and booted up Dodonpachi. And I don't know if low latency was working correctly, but I didn't notice any input lag in Dodonpachi compared to my MiSTer hooked up to a tate PVM, and I made it to stage 5 on my first credit.

Most of this post was just kind of rambling, but the reason I made this thread was to ask,

Is there anything else I actually need to do besides simply enabling the low latency flag in order to get the full benefit? Obviously I have G-sync enabled, but I also have driver-level V-sync enabled, and I'm not sure if that has any effect.


It sounds like he’s saying G-Sync is better than anything besides genuine low-res CRTs.
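
For reference, the settings described in the quoted post map onto a handful of mame.ini entries. This is only a sketch based on my reading of current MAME option names, not verified against his exact version, so check the output of mame -showusage before relying on it:

Code:
# Sketch only: my best understanding of the mame.ini entries matching the
# settings described above; verify the names with "mame -showusage".

# the recently added low-latency frame presentation option
lowlatency     1
# disable bilinear filtering
filter         0
# force integer scaling (disallow non-integer stretch factors)
unevenstretch  0
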
icycalm
Hyperborean
 
Joined: 28 Mar 2006 00:08
Location: Tenerife, Canary Islands

