Briefly announced and discussed during AMD’s 2015 GPU product presentation yesterday morning was AMD’s forthcoming dual Fiji video card. The near-obligatory counterpart to the just-announced Radeon R9 Fury X, the unnamed dual-GPU card will be taking things one step further with a pair of Fiji GPUs on a single card.

Meanwhile, as part of yesterday evening's AMD-sponsored PC Gaming Show, CEO Dr. Lisa Su took the stage for a few minutes to show off AMD's recently announced Fury products, capping things off with the first public showcase of the still-in-development dual-GPU card.

There’s not too much to say right now since we don’t know its specifications, but of course for the moment AMD is focusing on size. With 4GB of VRAM for each GPU on-package via HBM technology, AMD has been able to design a dual-GPU card that’s shorter and simpler than their previous dual-GPU cards like the R9 295X2 and HD 7990, saving space that would have otherwise been occupied by GDDR5 memory modules and the associated VRMs.

Meanwhile, on the card itself we can see that it uses a PLX 8747 switch to provide PCIe switching between the two GPUs and the shared PCIe bus, while on the power delivery side the card uses a pair of 8-pin PCIe power connectors. At this time no further details are being released, so we'll have to see what AMD is up to once they're ready to reveal more about the video card.
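
Though AMD isn't quoting a board power, the connector configuration puts a rough ceiling on it, assuming the card stays within the standard PCIe limits of 75W from the x16 slot and 150W per 8-pin connector:

75W (slot) + 2 × 150W (8-pin) = 375W

That 375W is the maximum the slot and connectors can deliver in-spec rather than a rated TDP, but it gives an idea of the power envelope AMD has available to feed two Fiji GPUs.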

Comments

  • loguerto - Thursday, June 18, 2015 - link

    (•_•) .. <-----------------Nvidia
    ∫\ \___( •_•) <---------------------CPUGPUGURU
    _∫∫ _∫∫ɯ \ \

    ( ͡° ͜ʖ ͡°)
  • FlushedBubblyJock - Tuesday, June 23, 2015 - link

    6GB is already thin, so yes at least 8 is needed, but the 390X just doesn't have the core power to justify 8GB.
  • Nagorak - Thursday, June 18, 2015 - link

    Future proofing is kind of pointless on a high-end GPU. Yeah, it sounds weird, but I'd argue it's true. The people who buy these halo products are the same ones who upgrade to the latest and greatest whenever something new is released. They won't be using the card in a year and a half.

    If you only upgrade every couple years and need to be concerned about future proofing, then you shouldn't be buying the super high end stuff anyway.
  • sabrewings - Saturday, June 20, 2015 - link

    That's a wide generalization. I don't care about future proofing, and I don't buy the latest and greatest every year. I bought a 980 Ti because it's aligned with when I was building my PC. My last PC was put together in early 2008, and I was still running my Q6600 and GTX 275 (playing Elite, no less) up until this month. I buy the best to make it last, but I don't expect to play with settings maxed out forever either. I have a 55" 1080p LED TV (for which my liquid-cooled 980 Ti barely hits 23% utilization and usually sits around 26°C in Crysis), but the real reason I bought the 980 Ti is for VR. StarVR, SteamVR, Oculus, and even a HoloLens will all eventually grace my machine, and I wanted it to keep up without needing more upgrades.

    All that being said, I'm not opposed to offloading my 980 Ti late next year, or keeping it as a spare, and rocking Pascal, since it will probably be a huge leap forward with such a large node jump (nearly 50% feature size reduction).
  • tcube - Thursday, June 18, 2015 - link

    HBM was used on 28nm not just because of the bandwidth, but because of the space saved on the GPU die itself, as the memory interfaces are a factor smaller than GDDR5's; they also did it to reduce the power requirements of the entire package. The HBM stacks use less power, as do the interfaces, and they put out less heat as well. I'd say it was a good move, as it let them put the power saved back into the cores and increase the core count by more than 50% over Hawaii while using LESS power. And no, you don't NEED a radiator; you could use an air-cooling solution, but they overengineered the board to be a premium/enthusiast part. You don't like that, fine: buy the air-cooled version! The cooling solution is a 500W monster, while the GPU under it apparently uses about 200W in games and has a TDP of 275W, which means you have huge headroom. The board can also be fed 375W, but that isn't needed either. As I said, it is engineered to be a premium device. If you don't like it, get the air-cooled version.
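
Taking the commenter's figures at face value, the headroom claim is easy to sanity-check: a cooler rated for roughly 500W sitting over a 275W-TDP GPU leaves about 500W − 275W = 225W of thermal margin, while the 375W the slot and two 8-pin connectors can deliver in-spec leaves 375W − 275W = 100W of electrical margin above the rated TDP.
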
  • sabrewings - Saturday, June 20, 2015 - link

    I'm not so sure I believe that their puny AIO cooler is going to acceptably dissipate 500W (at least to my standards). An XSPC RX360 barely pushes 500W at a 10°C delta-T (which is the upper limit IMO for coolant temps). With it my 980 Ti load temp is around 36°C. I could probably toss a second one in and see around 40°C load temps, but that's as high as I'd prefer to go. If you're really expecting that cooler to move 500W, you're going to have a huge delta-T, which shortens pump life as well as significantly raising your silicon temps.
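
As a rough check on that reasoning: at a fixed airflow, a radiator's heat rejection scales roughly linearly with the coolant-to-air delta-T, so a radiator that sheds about 500W at a 10°C delta-T would shed only around 250W at 5°C. A much smaller radiator shifts that whole relationship down, so pushing a 500W load through a compact AIO implies a considerably larger delta-T, and with it higher coolant and silicon temperatures.
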
  • FlushedBubblyJock - Tuesday, June 23, 2015 - link

    "probably uses about 200W in games"
    ROFLMAO
    another one, they're everywhere
  • hingsun - Wednesday, June 17, 2015 - link

    With stacked HBM RAM, I am really worried about heat issues, since normal chips have more surface area to dissipate heat, while the 2.5D stacks have a much lower surface-area-to-volume ratio for dissipating heat. We are trading card size against chip life. As far as I know, the more heat a chip takes, the shorter its life.
