AMD’s Full RX Vega Lineup Leaked: Fastest ‘Vega Nova’ Cheaper than GTX 1080 Ti at $599

Radeon RX Vega release date

The full RX Vega lineup has apparently leaked out. The information reportedly comes from very reliable sources within AMD’s headquarters.

Leak reveals AMD RX Vega Lineup – Three Enthusiast Cards to be available at Launch

AMD is expected to unveil its upcoming Vega-based graphics cards at this year’s Computex on 31st May. According to the leaked info, the reference model cards will be released on 5th June, which is in line with AMD’s confirmation of an RX Vega launch in Q2 2017. There will be three RX Vega cards at launch, with a naming scheme based on a space theme.

The first card will be called the RX Vega Core, which will start at $399. It is said to deliver performance on par with or better than Nvidia’s GeForce GTX 1070.

Next up is the RX Vega Eclipse, which will be priced at $499 and will compete head to head with the GTX 1080.

The RX Vega Nova will be the big Vega, retailing at $599 and rivaling the GTX 1080 Ti.

READ MORE: Only 16,000 AMD Radeon RX Vega GPUs to be available at Launch

Radeon RX Vega lineup - Vega on display

With these cards based on the next-gen Vega architecture, AMD will finally be able to compete with Nvidia in the enthusiast market segment. The company hasn’t released a proper high-end graphics card since the launch of the Radeon R9 Fury X in 2015.

Currently, the entire high-end sector is dominated by Nvidia thanks to its Pascal-based GeForce lineup comprising the GTX 1070, GTX 1080 (Ti), Titan X and Xp. These cards have no direct competitors from AMD; the fastest card it has to offer is the Polaris-based Radeon RX 580, which only competes against the $249 GTX 1060.

RX Vega Nova to offer 4K at only $599

What’s interesting about the Vega launch is that the new Core and Eclipse graphics cards will be priced higher than the GTX 1070 and GTX 1080. Right now, you can easily purchase a GTX 1070 for $350 on Amazon, so it’s a bit puzzling why the competing AMD card would cost more while offering similar performance.

Is AMD planning to release its own versions of Founders Edition cards? We don’t know yet, but it seems to be the only reasonable explanation for the higher price.

DON’T MISS: AMD’s 16-Core Ryzen Whitehaven CPUs spotted – 3.6GHz Clock, Quad Channel DDR4

Plus, previously leaked images of the reference Vega GPU show a polished silver design, which is again a hint that AMD could be heading that way.

RX Vega Lineup leak - Vega Reference card (Soul)

For now, it seems that the biggest of the three is also the baddest GPU in the RX Vega lineup. If the source’s $599 pricing for the Nova is correct, we could be looking at a potential 4K solution at a much lower price than Nvidia’s GTX 1080 Ti, with similar performance.

Furthermore, if the RX Vega Nova manages to outshine the competition, it could definitely be AMD’s best GPU since the Radeon R9 290X, which was released back in 2013.

  • ComradeHX

    inb4novidiafanboys

    • d0x360

      Still $599 at Best Buy. I saw 3 of them yesterday while I was there buying a Surface Dial.

      • ComradeHX

        *WorstBuy

  • A Harrop

    The higher price of the Core over the 1070 would be based on performance, not some silly founders edition nonsense. You really think THAT was the only reasonable explanation? Makes you seem new to the AMD vs Nvidia world to be honest…

    • d0x360

      Yea, this is an architectural evolution, not a tweaked and overclocked rebadge.

      It’s possible that the author knows something we don’t – like perhaps, instead of performance, AMD is now going to base pricing on what color the backplate is?

    • Michaelius

      Fury X says hello – it wouldn’t be the first time AMD released an overpriced product that couldn’t compete against its direct rival.

      • A Harrop

        Are you implying the Fury X was more expensive than the Nvidia card that most closely matched its performance? AMD has almost always had the better price/performance ratio. It’s how they’ve survived…

      • Andrew Lane

        Not sure what world you are living in, but the Fury X was (supposed to be) cheaper than the 980 Ti and, despite poor drivers, still managed to compete with a stock 980 Ti in most games at higher resolutions.
        The problem the Fury X had was that it was extremely scarce, so retailers had no problem pricing it higher than the 980 Ti due to supply and demand. I waited nearly 6 months to buy a Fury X but ended up buying a 980 Ti when they went on sale at my usual webshop. I did end up paying AUD $1000 for the 980 Ti instead of ~AUD $900 for the Fury X.

    • Steven De Bondt

      You know what they say:
      in the minds of the green team members (which is the bigger camp), Nvidia products only compete with other Nvidia products.

  • Chris

    If AMD’s drivers weren’t so awful, I’d consider an AMD graphics card in the future. Apparently the latest Crimson ReLive drivers still don’t work properly with my ASUS monitor. The RX 480 and RX 460 can’t detect the supported resolutions and refresh rates of my ASUS PG348Q monitor. I had to use 3rd-party software for both cards to detect the proper settings over HDMI and DisplayPort connections. So back to Amazon they went.

    My replacement Nvidia GTX 1070 detected the monitor’s resolutions and refresh rates properly as soon as the drivers were installed. It’s 2017, AMD; how about you release properly working drivers already?

    • The road ahead

      I have used a lot of AMD cards since the 5xxx series, and I’ve had zero driver issues. AMD’s recent drivers are much more issue-free than Nvidia’s.

      • Joseph Romus

        Must be nice. I’ve never owned a video card that didn’t have some sort of driver issue at some point.

    • Cooe

      Having used a variety of modern Nvidia and AMD cards extensively over the past year, in my experience I’ve found AMD’s drivers to be the more stable and hassle-free of the two. The old perception of AMD’s drivers being buggy and unstable is ridiculously outdated and completely false.

    • Brian Thompson

      I had the same impression several years ago. About 3 years ago, I decided to try AMD anyway and was pleasantly surprised. I got the Radeon R9 270 and it works perfectly with my three-screen setup. The new look is sleek and the drivers are stable. The same can’t entirely be said for my old GTX 460 in the computer I gave my brother.

    • Neilius Maximus

      Nvidia drivers are worse as of late. Who the hell still thinks AMD drivers are worse lol? RTG under Raja has been amazing for drivers.

      • Geers Tyresoil

        And you don’t get nagged for personal info when you install them!

    • Ant McLeod

      I’m having issues with Nvidia drivers RIGHT NOW.

    • NoCompanyIsBetterThenTheOther

      I’ve used both and had no issues with my last AMD CPU & GPU (an FX-6300 and an R9 390).
      I have had many issues with my 980, but I won’t complain, because when it’s working it’s 10x better than the 390 – but that might also be the CPU difference.

    • d0x360

      AMD used to have awful drivers, but even months before the Crimson drivers were released that stopped being true.

      I had a 290X and replaced it with a GTX 1080, and I’ve found Nvidia’s drivers to be way worse. The Nvidia control panel freezes when you apply settings, and it sometimes forgets settings for scaling or color as well. They also have issues with the latest HDCP revision: there is an error when it tries to make the handshake after an overclock, on both my LG OLED 4K TV and my Denon receiver. That, and their DX12 drivers still aren’t very good compared to AMD’s.

      On the flip side, AMD has fixed the DX11 driver issues for GPUs from the 2xx series forward, their control panel is fast, their DX12 drivers are flawless, and I never had any issues with HDCP or things forgetting what they were set to.

      If you’re thinking about bringing up the performance difference in DX11 on comparable cards, don’t bother, because it’s not a driver issue, it’s an architectural difference. AMD has been designing their GPUs to take advantage of the very properties that make Vulkan and DX12 potential performance monsters since the 2xx series of GCN. The downside to that architectural difference is slightly worse DX11 performance, BUT even that difference essentially vanishes after a few weeks, because the driver team will literally replace shader code in games to take better advantage of their hardware – which is a major reason why week-1 benchmarks on new games aren’t a very fair comparison.

      There is a reason people who have owned an AMD GPU talk about the “fine wine” effect, and it’s not to hide buyer’s remorse. My 290X is still running games at 1440p 60fps+ maxed out in a second gaming PC, and that’s the very reason why I’m replacing a barely-out-of-its-diapers 1080 with Vega – and I would do so even if they had identical out-of-the-box performance.

      Nvidia generally releases game-ready drivers and, unless there is a bug, that’s all they do. AMD, on the other hand, releases their game-ready drivers and then continues to fine-tune them for weeks, sometimes going from a few fps behind to a few fps ahead.

    • Geers Tyresoil

      Nvidia shills/fanboys out in force today, it appears.

    • Zenstrive

      Crimson driver is so much better than Catalyst.

    • PrimitivƎ

      Your cred just sank along with your biased nonsense.

      • Chris

        No bias. All facts. Your AMD fanboyism is showing.

    • Joshua Tryon

      This whole myth that AMD’s drivers are awful is completely false. Their drivers take some time to mature for a new architecture, it’s true, but their cards age much better over time as their drivers improve.

    • nashathedog

      AMD’s driver support has matched or exceeded Nvidia’s for around 2 years now. It’s been good enough that I committed to AMD by getting a FreeSync monitor. Your PG348Q is a G-Sync monitor, and from what I’ve read and heard, G-Sync monitors do not play nicely with AMD cards. Someone on my local forum got a 290X to try out with his 1440p Swift and it was problematic at best, so you shouldn’t blame AMD for a monitor with a G-Sync module not playing ball. Nvidia’s the problem: you bought into an ecosystem where they’ll do anything to make sure you only use Nvidia graphics cards.
      That is actually an example of the harm done to PC gaming by Nvidia’s cutthroat attitude.

    • Fatboy

      The story has changed over the past 2 years; AMD drivers now fare much better than Nvidia’s offerings. This is coming from someone who is currently using a GTX 1070 in my main system and an HD 7970 in my secondary system.

    • No

      If the Nvidia control panel wasn’t so awful, I still wouldn’t consider an Nvidia card any time soon after I got rid of my 3.5GB cripple in exchange for a Vega (Eclipse).

    • ComradeHX

      >buys G-sync monitor
      >complains about AMD

      AMD drivers are fine actually; it’s Nvidia drivers that have broken a lot of PCs (can’t boot properly… etc.).

      • Chris

        The integrated Intel HD graphics on the i3, i5, and i7 processors detect my monitor’s resolutions and refresh rates correctly. Yet the AMD graphics cards don’t. Seriously, if Intel can get it right, why can’t AMD even after several driver updates?

        • ComradeHX

          Because Intel iGPUs are expected to be connected to G-Sync displays while AMD GPUs aren’t. It could also easily be intentional on Nvidia’s side – reporting resolution and refresh rate options that only Intel and Nvidia GPUs would decipher correctly. Keep in mind that there is an Nvidia chip in between the monitor and the GPU.

          Also, have you tried CRU (Custom Resolution Utility)?

          The AMD driver does lack certain support (for example, Radeon Pro is no more), but not supporting a G-Sync monitor would be the last of my concerns.

        • Phartindust

          You are aware that Intel licenses Nvidia graphics tech right? So of course Intel’s HD graphics work.

    • So, let me get this straight. You bought a USD $1200 monitor that is G-Sync equipped, then bought an RX 480 and an RX 460, ran into the same issue with a G-Sync monitor refusing to cooperate with a non-Nvidia graphics card, and blame AMD’s drivers?

      Of course the Nvidia driver detected the monitor’s resolutions and refresh rates – the monitor is built on a scaler that Nvidia themselves built.

      Were you unaware that G-Sync equipped monitors frequently refuse to play nice with non-Nvidia graphics? Were you unaware that the RX 480 and RX 460 would not support G-Sync and probably wouldn’t have the oomph to push 3440x1440x100Hz gaming? Why would you buy an RX 480 – let alone an RX 460 – to drive a G-Sync 3440x1440x100Hz panel?

      This complaint simply doesn’t add up. Something’s missing here.

      • Chris

        The graphics cards were for a gaming article I was writing. There are some free-to-play games that run well on low settings at higher resolutions without using a $500-and-up graphics card. Also, the monitor supports 1920×1080, 2560×1080 and 2560×1440 in addition to the 3440×1440 resolution, which makes it easy to test game and graphics card performance at different settings.

        I also later ran into an issue with an older non-G-Sync Samsung monitor that supports 1920×1200 and 1920×1080 resolutions. The RX 460 and RX 480 wouldn’t allow the monitor to run at 1920×1080.

        The Intel integrated HD graphics on the 3XXX through 6XXX series processors all detect all the supported resolutions and refresh rates on both of the monitors. Yet the AMD graphics cards don’t. That seems like a major oversight when a graphics card doesn’t detect settings on plug-and-play monitors.

        I’m also not the only one with this issue; I found other people who have the same issue with AMD RX 4XX series graphics cards even when they aren’t using G-Sync monitors. The only solution they found was 3rd-party software to allow those supported resolutions to run on their AMD RX 4XX series GPUs.

        • Santiago Reina

          Why not just add a custom resolution, like everyone else?

  • Calipha S. Callender

    Inb4 “haha i told u novideo fanboys!!!”

    Remember, AMD’s products are ALWAYS slower than what they’re advertised as before release.

    • Neilius Maximus

      This guy ^ is an Nvidia shill who frequents wccftech. Don’t listen to his BS. I think he’s even banned there now.

      • Ant McLeod

        I remember him and he is running a GTX 560 too. Str8 BS’er

      • d0x360

        Oh wow, I have him blocked, and I don’t block anyone. I’ll put up with the most blatant of trolls spewing the most blatant of lies without blocking them, but this guy I blocked… must be a real winner.

        • Neilius Maximus

          exactly lol

      • Calipha S. Callender

        If I was banned then how am I still commenting you lunkhead?

        • Neilius Maximus

          Because this isn’t wccftech? Lmfao.. dumb as a brick.

          • Calipha S. Callender

            Too bad I’ve never commented on WCCFTECH. :c

          • Yeah you have.

          • Neilius Maximus

            Do you always lie or just this one time??

          • Calipha S. Callender

            Pic of me commenting using WCCFTECH instead of Disqus.

            I’ll wait.

          • Neilius Maximus

            Wccftech uses disqus…

          • Calipha S. Callender

            But according to you I’m banned from WCCFTECH.

          • Neilius Maximus

            ya which is why you cant comment on disqus there lmfao!

          • From looking around for a bit, I can see you are not banned from WCCFtech. In fact, you were commenting there not even a week ago. And you have a LONG comment history there.

            I have links. You are lying.

          • “I think X is true.” Later finding out X is not true does not make the statement a lie. It makes it a mistake.

            Wanna see what it ACTUALLY looks like when someone gets caught in a lie?

            https://uploads.disquscdn.com/images/e814bf2cc4b773d808ed9bebc658ba068627a76b083b53ca345089fb8b95fe2c.jpg

          • Calipha S. Callender

            I’m pretty sure everybody has noticed by now that I’ve been here for ages; you aren’t stating anything new.

          • This isn’t WCCFtech.

            You straight up said that you’ve never commented on WCCFtech. You have. A lot.

          • Calipha S. Callender

            Again… you aren’t stating anything new.

          • Calipha S. Callender

            If you look at the context, moron, you can clearly tell I was giving him a sarcastic answer due to him saying I was banned off WCCFTech.

      • larry fleming

        Blocked!!!

      • He’s not banned from wccftech. But he’s blatantly lying by claiming he’s “never commented on WCCFTECH”. I have links. He has a LONG comment history there.

        • Neilius Maximus

          Oh I know he does lol. But these Nvidia fanboy nut jobs lie through their teeth day and night. I don’t understand why…

          • Mostly because they don’t think they’ll get caught.

            Which is why I like to catch them.

    • Zenstrive

      Not Ryzen. It’s killing Intel’s processors.

      • Chris

        The articles linked right below your comment say, “Looking at the Ryzen 7 gaming benchmarks, we can see that AMD’s solutions are, of course, a competent performer, but overall, they are slower than their Intel counterparts in pure FPS,” and “Ryzen 5 1600X Lags behind Core i5-7600K in Gaming.”

        • Wirxaw

          Aha. And the FX 8320 performs better than the 2500K in multi-threaded applications, which are the norm nowadays.
          So go ahead, buy 4 threads in 2017. After your higher FPS gets rekt by stuttering and framerate dips – remember your choice.

          Come on, tell me I’m wrong.

        • Zenstrive

          Yes, lagging in FPS, with CPU CAPACITY TO SPARE.
          The i5s are suffocating at 100% per thread; the R5s are at most 50% per thread.

    • ComradeHX

      You mean like overhyped novidio gpus?

      You might not even get the same speed of vram as advertised (lol 3.5).

  • d0x360

    Oh my my my. I guess I’ll have a GTX 1080 for sale on June 5th if I’m lucky…very very lucky.

    OK everyone, the Nova version is going to be complete garbage, so don’t buy one. Tell everyone you know not to buy one. In fact, let’s crowdsource some billboards and podcast ads telling people not to buy anything called Nova in June.

    Thanks.

    • eaman

      If I were you, I’d wait for Vega to actually be available at a decent price before selling the card you have now; I’m thinking of the 500-series rebrand launch…

  • SkyAttacksX

    I really wanted to wait for Vega. Unfortunately, I sold my build (which had a Fury Nitro) a while ago and used the money from it, along with other cash I had, to get a new build with a 1070. I couldn’t keep watching RAM prices soar (coming from a 2500K with 8GB), and if I got a 580 it would be a downgrade from my Fury in terms of raw performance. I didn’t want another Fury as my room got really hot with it, so now I’m left with a 1070.

    First Nvidia card in around 8 years (last one was some Nvidia GeForce 7xxx thingy lol) but the drivers aren’t that bad and overall I can’t say I regret it for $340.

    I’m saying this because I think that Vega will be a very very good lineup, but I think it’s a bit bad timing… on the bright side I did get a Ryzen 7 1700, as I do want to support AMD, but they took a whole year to go head to head with Nvidia’s now year-old Pascal. With Volta on their doorstep, the timing couldn’t have been worse.

    I hope the best for AMD, and for those of you saying “I never had driver issues with Nvidia/AMD” – both have had driver issues. The amount one has over the other is quite possibly within the margin of error (e.g. 5,000 issues with Nvidia and 5,100 with AMD, or vice versa). Stop fanboying or you’ll pay the price of a 1080 Ti + Vega Nova for the performance of a 1060. That goes for both sides.

    Let’s be real, if AMD was in Nvidia’s position, they would be jacking prices up and raising margins as much as possible.

  • Japheth

    The 580 isn’t that much slower than a 1070, and it’s $200 cheaper…? Seems like a decent competitor to me.

    • Monitorm

      The RX 580 is a LOT slower than a GTX 1070; the RX 580 competes well against the 6GB GTX 1060.

      • BaronMatrix

        Only at 1080p…

  • John Smith

    The Nvidia 10xx line will be useless once Volta drops. Nvidia always gimps their cards while AMD improves theirs.

  • AMDfan

    This lineup is better than Nvidia’s offerings; the small one was the one shown to the world a couple of times.
    Even on early drivers it managed to be faster than the 1070 and a little under 1080 performance.
    The middle one will be between the 1080 and 1080 Ti, and the full Vega 10 will be faster than the 1080 Ti.

  • so the same watercooled nonsense on nova? gl with that

  • This is an amazing offering, but let’s see if anyone actually buys it, because overall few people bought the Fury / Fury X and the aftermarket resale value is pretty low.
    I’ve had a Fury a little over a year now and it’s an awesome card that runs everything I throw at it at 60fps in 4K on high settings.
    Honestly, this entire graphics card BS today is just marketing and swag (colors and LEDs on cards, as if that matters at all).
    Nvidia will do everything they can to downplay this release, again, the way they did with the Fury.

    AMD’s 4XX series basically should have made Nvidia an irrelevant brand, because there is no game you can’t play with a 480 at 60fps at least. Hell, you can even get over 100fps in most games, and that’s even at 1440p.
    You just can’t play all those same games at those resolutions at Ultra. But what most people don’t understand is that Ultra settings are a complete scam made by the graphics card industry to sell you graphics cards. You get about 3% increased visual fidelity at a performance cost of up to 1000%. And this is done specifically on purpose to artificially drive the need for graphics cards.

    My buddy plays games all day using my old Radeon 6850 and he’s getting 45-60fps in most games while playing with dual monitors. Now, he mostly plays HotS, WoW, LoL and emulators, but we are still talking about a card from ~7 generations ago driving dual 1080p monitors and putting out respectable framerates.

  • BaronMatrix

    I have done LOTS of research into this card… I’ve analyzed every review comparing RX 580 to 1070 and Fury X…

    Fury X has 4096 SPs, that’s 56% more than RX580, but the average margin of victory for Fury X is around 15-20%…

    As compared to 1070, Fury does just as well at 4K and very close at 1080p…

    If we assume 40% more throughput than RX 580 (AMD claims much more but it’s unclear if they mean Polaris or Fury) then that puts Vega Core at 1080 levels, not 1070…

    Few new generations are less than 60-70% faster so we can assume at least that much more perf at the same clock and Vega is looking to be 500MHz faster than Fury…

    I think it will end up twice as fast as Fury X in the fastest SKU… Maybe even more…