At CES this week, AMD made an
unusual announcement about Nvidia’s new G-Sync technology. According to the company’s senior engineers, AMD can
replicate many of the advantages of Nvidia’s G-Sync tech through the use of
what are called dynamic refresh rates. Multiple generations of AMD video cards
have the ability to alter refresh rates on the fly, with the goal of saving
power on mobile displays. Some panel makers offer support for this option,
though the implementation isn’t standardized. AMD engineers demoed their own
implementation, dubbed “FreeSync,” on a laptop at the show.
AMD’s windmill FreeSync demo application. Unfortunately, it’s impossible to find video demos on YouTube that don’t ruin the G-Sync or FreeSync effect.
Dynamic refresh rates would
theoretically work like G-Sync by specifying how long the display remains
blank on a frame-by-frame basis, allowing for smoother overall motion. AMD has
stated that the reason the feature didn’t catch on was a lack of demand — but
if gamers want to see G-Sync-like technology, AMD believes it can offer an
equivalent. AMD also told Tech Report that it believes triple buffering can offer a solution to
many of the same problems G-Sync addresses. AMD’s theory as to why Nvidia built
an expensive hardware solution for this problem is that Nvidia wasn’t capable
of supporting G-Sync in any other fashion.
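To make the distinction concrete, here’s a rough sketch of the idea in Python. It’s a toy, single-buffered model under assumptions of my own — the 60Hz fixed interval, the 30–144Hz panel limits, and the frame times are illustrative numbers, not figures from AMD or Nvidia — comparing a fixed-refresh display, where a finished frame waits for the next scheduled vblank, with a variable-refresh display that simply holds the blanking interval until the frame is ready.

```python
# Toy model contrasting a fixed 60 Hz vsync display with a variable-refresh
# ("G-Sync-like") display. Numbers and panel limits are illustrative only.

FIXED_INTERVAL_MS = 1000 / 60        # conventional 60 Hz panel
MIN_INTERVAL_MS = 1000 / 144         # assumed fastest refresh the panel allows
MAX_INTERVAL_MS = 1000 / 30          # assumed slowest refresh the panel allows

def fixed_vsync(render_times_ms):
    """Each finished frame waits for the next fixed vblank before it appears."""
    shown, done, last = [], 0.0, 0.0
    for render in render_times_ms:
        done += render                                    # frame finishes rendering
        vblank = -(-done // FIXED_INTERVAL_MS) * FIXED_INTERVAL_MS  # ceil to next vblank
        if vblank <= last:                                # slot already used; wait one more
            vblank = last + FIXED_INTERVAL_MS
        shown.append(vblank)
        last = vblank
    return shown

def variable_refresh(render_times_ms):
    """The display holds its blanking interval until the frame is ready."""
    shown, last = [], 0.0
    for render in render_times_ms:
        # scan out as soon as the frame is done, clamped to the panel's limits
        interval = min(max(render, MIN_INTERVAL_MS), MAX_INTERVAL_MS)
        last += interval
        shown.append(last)
    return shown

if __name__ == "__main__":
    frames = [14.0, 22.0, 18.0, 30.0, 15.0]               # uneven render times, in ms
    print("fixed 60 Hz :", [round(t, 1) for t in fixed_vsync(frames)])
    print("variable    :", [round(t, 1) for t in variable_refresh(frames)])
```

In the fixed-refresh case every frame lands on a multiple of 16.7 ms no matter when it actually finished, so uneven render times turn into visible judder; in the variable-refresh case the on-screen cadence tracks the render cadence, which is the effect both G-Sync and a FreeSync-style dynamic refresh rate are after.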
Nvidia rebuts
Nvidia, unsurprisingly, has a
different view of the situation. Tech Report spoke to Tom Peterson, who stated that the difference between a laptop and a desktop
running a software equivalent of G-Sync is that laptop displays are typically
connected using embedded DisplayPort or the older LVDS standard. Standalone
monitors, in contrast, have their own internal scaling solutions, and these
chips typically don’t support variable refresh rates.
I think Nvidia is probably being
honest on that score. The G-Sync FPGA is fairly hefty, with 768MB of onboard
memory, and the list of compatible monitors is limited. Nvidia has a long-standing interest
in keeping its technology proprietary, but it also has reasons to extend
G-Sync as widely as possible for as little up-front cost as possible. A G-Sync
upgrade kit for $50 that fits any modern monitor would sell more units than a
$100 or $150 kit that only fits a limited number of displays or that requires a
new LCD purchase.
Nvidia’s G-Sync includes a 768MB
buffer combined with a custom FPGA.
It’s entirely possible that both
companies are telling the truth on this one. AMD may be able to implement a
G-Sync-like technology on supported panels, and it could work with the
manufacturers of scaler ASICs if G-Sync starts catching on for Nvidia. Nvidia,
meanwhile, is probably telling the truth when it says it had to build its own
hardware solution because existing chips for desktop displays weren’t doing the
job.
Whether this works out to a
significant halo effect for Nvidia in the long run will come down to price and
time-to-market. In the past, Nvidia took the lead on computing initiatives like
PhysX and CUDA, getting out in front on technical capability, while
industry-wide standards followed along at a slower pace. The impact on the
consumer market has been mixed — PhysX definitely delivered some special effects
that AMD didn’t match, but CUDA’s impact on the consumer space has been small
(its HPC success
is another story altogether).
The difference between these
technologies and G-Sync is that monitors are fairly long-lived. Buy a G-Sync
monitor today, and you have the benefits for five years or more. Some games
benefit from G-Sync more than others, but once Nvidia smooths out the development
pipeline, we should see a consistent stream of titles that run better in that
mode. It’s not like hardware PhysX,
which was never supported by more than a handful of major games in any given
year. In the long run, if panel makers start building variable refresh rates
into their own displays, then the need for Nvidia-specific G-Sync technology
may fade out — but that doesn’t mean the company can’t make a pretty penny off
the concept while it lasts. And since it’ll take time for panel manufacturers
to adopt the capability if they choose to do so, Nvidia has a definite
window of opportunity on the technology.