There are plenty of users looking for FreeSync displays, and those of us in the US will have to wait a bit longer. However, AMD sends word today that FreeSync displays are now available in select regions in EMEA (Europe, Middle East, and Africa). We're still awaiting final information on pricing, and we've asked AMD for details on which monitors are shipping.

While we can't come to any real conclusions without true hands-on testing of FreeSync displays across a variety of games, this should hopefully be a pretty straightforward piece of hardware. At this time I'd argue that the panel technology is just as important as the adaptive sync support, as having to go with a TN panel to get higher refresh rates tends to be a case of one step forward, one step back. Thankfully, there should be IPS FreeSync displays (alongside TN models) available.

Of course, having a FreeSync display won't do you much good without an appropriate FreeSync-enabled driver, and AMD announced that a publicly available FreeSync driver will be posted on their website on March 19. There's a caveat that's just as important, however: a driver with support for CrossFire configurations won't be available until the following month. If you're running an AMD GPU and have been looking forward to adaptive refresh rates, the wait is nearly over.

Update: In the UK at least, OverclockersUK has several FreeSync models available. The BenQ XL2730Z at £498, the LG Flatron 34UM67 for £500, and the Acer XG270HU for £430 are listed, with the BenQ in stock.

Comments

  • Alexvrb - Thursday, March 5, 2015

    The Cult of NVIDIA is so different? I think not.

    Anyway, most people saw Mantle as a stop-gap and a means to force MS and Khronos to get off their butts. It worked, too. But Adaptive-Sync is a VESA standard... it's already open, and others will implement it if it does well (Intel, for example).
  • D. Lister - Friday, March 6, 2015

    Most people saw Mantle as a technological edge for GCN over the competition, and a significant reason for buying (and recommending) a new AMD GPU or APU over the alternatives. Many of those same people are only NOW saying "Mantle was not really a game-changing feature to keep the aging GCN architecture competitive, but just a stop-gap and a means to force MS and Khronos to release DX12, teehehe."

    As for the "cult of Nvidia", we are doing quite well thank you. We recently renovated our temple and bought flowing green robes for our priests. Come visit us sometime. Let us show you the glory of power savings and lower TDP, and the splendor of a unified, 1st-party software suite that is the GeForce Experience. Let us lead you on a path to true graphic contentment that can only be achieved with stable drivers. All hail our lord, that Korean dude whose name I always forget.
  • Murloc - Friday, March 6, 2015

    I had to uninstall it because it wanted me to update my drivers to a version that isn't supported by my main GPU, but is supported by the newer and very low-end GPU I'm using only for sending audio over HDMI to my AVR.
    When it did update, it made a mess.
  • silverblue - Friday, March 6, 2015

    I'll try not to throw a Fermi-shaped stone into your glass house of power savings. Legitimate question - does NVIDIA still sell Fermi derivatives within the current product stack?
  • D. Lister - Friday, March 6, 2015

    The 4xx series, you mean? Bah, and once people used to ride jalopies and steam locomotives, lol.
  • silverblue - Friday, March 6, 2015

    The 5xx series as well, plus various lower end options since (there were three flavours of the GT 730, one being based on GF108, all of which came out last June).
  • D. Lister - Friday, March 6, 2015

    The 5xx series was actually a fair improvement over the TDP/wattage disaster that was the 4xx series. In terms of performance per watt especially, the disparity with the AMD equivalents was much smaller.

    Still, that was 4 gens ago (or 3, not counting the 8xxM parts). At this point, I wouldn't recommend buying a Fermi GPU.

    As for the x20/x30 parts - that's the bargain basement of performance, where power and heat aren't really significant issues anyway.
  • silverblue - Friday, March 6, 2015

    I'd be annoyed if I had a hungry budget card. :)
  • tuxfool - Friday, March 6, 2015

    I'm not likely to go to the likes of you to see what "most" people think. Almost as much BSing and FUD as that Jian fellow.
  • D. Lister - Friday, March 6, 2015

    @tuxfool

    That's good - form your own opinions. Don't let anyone tell you how to think. Not the likes of me, not a website journalist, not a corporate marketing rep, a politician, or a clergyman. Understand this fact of modern life: 99.99% of the stuff we see, hear, or read is complete, and often commercially fabricated, bullshit. AAMOF, every time we read or hear anything from anyone, our first thought should be, "this is bullshit, unless unequivocally proven otherwise." Skepticism is a survival skill.
