Sapphire Nitro+ RX 480 OC 8GB Graphics Card Review

Noise, Power Consumption and Temperatures


Noise

During idle states, the graphics card's 0dB fan mode comes into operation and allows for silent running, provided other components emit very little noise. This is an excellent addition and ensures system noise remains as low as possible when viewing online content or performing other basic tasks. Annoyingly, a driver bug prevented me from accurately assessing the load noise output, which is why there's no data displayed below. AMD's latest driver automatically overrides Sapphire's tuned fan profile, resulting in ridiculously fast fan speeds above 2300RPM. Apparently, this is because AMD targeted a 65C delta instead of the 75C which the Sapphire model's cooling apparatus is designed around.
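The effect of that mismatched target can be illustrated with a toy proportional fan controller. This is a minimal sketch under stated assumptions: the `fan_rpm` helper, the curve shape and all constants are illustrative, not Sapphire's or AMD's actual firmware values.

```python
# Toy proportional fan controller: the further the GPU sits above its
# target temperature, the faster the fan spins. All constants here are
# assumptions for illustration only.
def fan_rpm(gpu_temp, target_temp, base_rpm=900, gain=60, max_rpm=3000):
    error = max(0, gpu_temp - target_temp)
    return min(max_rpm, base_rpm + gain * error)

# Same hypothetical 90C load temperature, two different targets:
print(fan_rpm(90, target_temp=75))  # Sapphire-style 75C target -> 1800
print(fan_rpm(90, target_temp=65))  # driver's 65C target -> 2400
```

With the same linear curve, lowering the target by 10C raises the requested speed by gain × 10 RPM, which is the same kind of jump as the 2300RPM+ speeds described above.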

Power Consumption

When stressed, the total system wattage is just over 300 watts although it drops below this amount on a regular basis. While this isn’t the most efficient unit we’ve tested, it’s still a low figure given the X99 test system’s large power draw.


Temperatures

As previously mentioned, the Sapphire Nitro+ RX 480 OC’s revised cooling solution is a huge upgrade over AMD’s reference design and maintains a very respectable 75C average temperature.




151 Comments

    1. I’m afraid the card is going back to the manufacturer very soon, and time is quite limited when doing testing.

      Thanks.

  1. Complaining about RGB lighting is perhaps the stupidest thing I've ever heard of, when you consider that turning off the lights or setting them to whatever color you prefer is a viable option. I believe you were just nit-picking for nit-picking's sake, though.

    1. I was criticising the idea of just putting on RGB lighting and relying on it instead of trying to create a new design in other areas. I quite like subtle lighting and this is a subjective preference. If you enjoy lighting that’s great and you’re right that it can easily be turned off. In the review, I praise Sapphire’s implementation of RGB lighting and discuss how I personally enjoy it more than others due to the simplistic design.

    1. No problem, the maximum reading I monitored was 79C. Please note, the fan profile bug made the fans run rather fast so your experience may differ once this has been fixed.

      1. It reached 79C despite the high fan speed bug under loads? That’s not very promising for the cooling in place. Just 3C lower than the average of the reference cooler. Was expecting it to be more firmly under 70C like how well the R9 380 did. Could it be a bad factory thermal paste job under the heatsink assembly?

        1. To be fair, Gamer's Nexus and their scratch-built AIO RX 480 had a temp and fan speed reading glitch also, so maybe it's just GPU-Z/Afterburner fudging a sensor reading?
          But yeah, I too was expecting something around the 69-72°C mark.

          1. Yeah, the thermal readings are based off of a calibration done to the formula that uses change in voltage readings from thermal diodes (force current – measure voltage). If Tmon is calibrated wrong, the readings will be wrong. Sometimes the readings are fine in a certain low temperature range, but the moment the temps go up there needs to be a correction factored into the final value to give more accurate readings.

            Has anyone done FLIR analysis of the back of the card? The Nitro RX 480 reaching a 79C peak and a 75C average is just hard to digest for a card that has lower power draw than even the R9 380. That cooling looks well done on paper; it should be doing far better than that.
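The force-current / measure-voltage readout described above can be sketched in a few lines. The calibration constants and the `diode_temp` helper below are illustrative assumptions, not AMD's actual Tmon values.

```python
# A silicon diode's forward voltage drops roughly linearly with
# temperature (about -2 mV/C) at a fixed sense current. Tmon-style
# readouts invert that relationship; if V_REF or SLOPE are calibrated
# wrong, every reported temperature is wrong too.
V_REF = 0.70     # assumed forward voltage at T_REF, volts
T_REF = 25.0     # reference temperature, C
SLOPE = -0.002   # assumed forward-voltage change, volts per C

def diode_temp(v_measured, correction=0.0):
    """Convert a measured forward voltage to C; `correction` stands in
    for the high-temperature adjustment the comment mentions."""
    return T_REF + (v_measured - V_REF) / SLOPE + correction

print(round(diode_temp(0.60), 1))                  # -> 75.0
print(round(diode_temp(0.60, correction=4.0), 1))  # -> 79.0
```

A miscalibrated `V_REF` shifts every reading by a constant amount, while a wrong `SLOPE` skews readings more the further they sit from the reference point, which would match "fine at low temps, wrong under load".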

          2. The plot thickens, it seems… As for FLIR analysis, I haven't seen it, as the reviewers who received the 480 don't have them. I would assume Tom's will do it, as they usually take off the backplates (they did for the R9 Nitro series anyway).

      1. Problem is that it has only 4 gigs. When all cards on the market (except the 4GB 480) have 6+ gigs, then soon those 4 gigs may not be enough even for 1080p.

        1. That’s a very fair point! The R9 390X is a great choice and I really like the Sapphire Tri-X OC model.

  2. I was wondering, since the 480 vs 1060 hype is still here, don't you think it would be justified to include 1060 cards (and probably a reference 480)?

    1. The only 1060 I had was included; NVIDIA didn't provide me with a Founders Edition, and while I reviewed the RX 480 reference model, it's now in circulation with other reviewers. Sadly, I didn't have it at hand during the review, but I've asked AMD to return it pretty fast.

    1. I’d imagine the overclocking headroom to be better once Sapphire’s overclocking utility is updated. During the review, I used AMD Wattman and the voltage changes didn’t seem to apply even when selecting them. Saying that, I doubt there’s going to be huge headroom from other vendors, and the OC headroom is small because the factory overclock is already near the bleeding edge.

      1. Hello. Nice review. So it seems that AIB partner cards don't play nice with Wattman? In all honesty, I do not believe these cards are going to pass 1400MHz. At least until someone proves me wrong. Also, the card scales nicely with higher clocks.

        1. Hi, thank you so much! Well, it’s probably a little too early to say for sure, but it seems like the overclocking headroom is quite limited. According to Sapphire, there’s not much manual headroom because the factory overclock is quite aggressive. Judging from this, I’d say you’re right and it will be challenging to go beyond 1400MHz. It could be possible with different software like Sapphire’s upcoming overclocking utility. I’ve usually achieved better results using that than AMD’s own overclocking tool. Wattman has a lot of potential, but whenever I adjusted the voltage, nothing changed and now analysing the GPU-Z screenshots, it appears that the voltage increase wasn’t being applied. It’s not going to be easy to suddenly apply huge voltage increases due to thermal limitations.

          1. Anytime. Well, with this type of cooling some adjustments to voltage could be possible, but this year's FinFETs obviously don't play along with overclocking. On the AMD side it's more obvious: AMD already pushed the GPU high with the 1266MHz boost, and that was rarely reached because of power limits. But this is a great chip nonetheless, and with a great price it's a nice option for gaming. The GTX 1060 is also around, so there's something for everyone. I have some experience with Afterburner on AMD cards (I've used only Radeon cards for years now, that's why :D) and it is a mess, while TriXX proved itself to be great, so maybe there are a few MHz more with added voltage, but as you said the temps are gonna skyrocket fast.
            Basically, Afterburner does not allow changing voltage on my 270X, while it does on every Nvidia card as soon as it's updated with support for it, which is quite annoying.

          2. Ah, that's promising then! Hopefully there's some good headroom after all.

      2. A custom BIOS with unlocked voltage will come soon. At 1.20-1.25V, I guess the max will be 1450MHz, 1500MHz best case.

  3. About the article – it's great, really – thank you 🙂 but…
    1. can we have more Vulkan tests, with more cards?
    2. can we expect more OC results?
    3. can we expect OC after a vBIOS change with better voltage?

    1. Hi, thanks! The Vulkan situation wasn’t ideal since the code was provided to me so close to the NDA. However, I’m going to do an article comparing a whole host of AMD and NVIDIA cards running both Vulkan and OpenGL in Doom. Unfortunately, the sample is being returned very soon, but I should have some more 480s arriving in the next week or so. Honestly, I’d like to spend more time with some other cards before saying if there will be improved overclocking. I’m probably going to use something else instead of Wattman for the future though.

      1. Nice to hear. It would just be good to see how older GCN runs under Vulkan (CPU headroom, async) and, on the other hand, with more tessellation in other tests. To check how much better the new GCN is than the old one when talking about tessellators and ACEs.

    2. No mainstream reviewer other than HardOCP has Doom Vulkan test results. Don't know if this has anything to do with it, but Nvidia sent out instructions to reviewers. The RX 480 absolutely kills the 1060 in Vulkan Doom.
      Debates on the independence of these reviewers have already started, and so they should, as it does look very suspect; they all have their excuses.

      1. HardOCP is not reliable for me; they're doing for AMD exactly what Tom's Hardware is doing for NV – the truth is in the middle… more or less.
        ComputerBase had the result closest to the overall average when reviewing the 480 vs 1060.

        1. Nah, if you look at [H]'s history, they ragged pretty hard on AMD. This might be the first positive review from them regarding AMD in quite a while.

          Their cardinal sin, imo, was their VERY limited selection of games.

      2. I don't think it has to do with bias. John himself said that he got the Doom code pretty close to the NDA lift.
        It's realistic to assume that he was not the only one with this problem, and maybe the reviewers in question just couldn't manage to conduct enough tests to have some conclusive and compelling charts to show.

        John himself had to resort to a really tiny sample size because of that.

    1. Objective would be 60 games benchmarked, like PCGamer and other sites do, or at least a pick of reasonable titles.

          1. The thing is that 5, 10, 15 or 20 games doesn't really matter if you don't choose the games carefully.

            For example, TechPowerUp is a site that everyone checks, among others. They are also including 15 games. Well, guess what: most of the games in their collection are old. They do NOT have Doom, which is just a first indication of Vulkan performance, and they do NOT have DX12 games other than Tomb Raider. They even run Hitman in DX11. TechPowerUp's benchmarks are good if you're living in 2014-2015.

            On the other hand, eTeknix's are better suited for giving an indication of what a card can do in 2016-2017. Probably the only thing that eTeknix needs is to add GTA V to their benchmarks, to balance their suite with a DX11 game that everyone knows is more optimized for Nvidia's hardware. Because while Vulkan and DX12 are the APIs of the future, which means that this review will still be relevant a year from now, new APIs do favor AMD cards at the moment.

      1. As mentioned in the review, I received the Doom code the day before the NDA lifted, and the GTX 1060 had already been sent to another reviewer. Therefore, it seemed fairest to include the GTX 980 and RX 480, and it ensured I had some Vulkan data to share.

        1. Please buddy, don't make reviews if you don't have the resources, and don't make excuses. No problem with you or the review, but that excuse doesn't make any sense.

          1. Erm, how is that an excuse? Do you know how long it takes to test each card multiple times at different resolutions? Then do photographs and a write-up. Also, even the most popular reviewers with almost celebrity status do not get unlimited access to every sample and get to retain it. Part of being a reviewer involves a lot of behind-the-scenes issues: trying to acquire samples, knowing if they need returning. I'm sorry, but if I needed to use excuses just to avoid doing work, I would have stopped reviewing, because that's not in my character. Please try to understand that as reviewers everything doesn't go smoothly; we aren't showered in samples, and we often work to a very tight time-frame.

          2. I understand everything, but this thing is serious: you have to do it in the best way possible or not at all. If you do something, do it right and do it fully, or don't do it. Nobody forced you to be a reviewer; it's your job. Sorry, that's what I think, and not just about you or reviews but about every job.

        2. Kudos @disqus_0ziHiUsmBs for actually using Vulkan the proper way; a lot of other reviewers are still sticking to OpenGL, while some won't even acknowledge they haven't enabled TSSAA 8X for proper async compute. Those folks clearly have an agenda; perhaps they are religiously following certain green reviewer guides.

    1. Looks like "new" Maxwell 2.0 (Pascal) has trouble in Vulkan, so Nvidia has one order: "Don't test Maxwell 2.0 in Doom Vulkan. It's forbidden. If you test it, then no more free graphics cards."

      Nvidia does software async (pre-emption) in Maxwell 2.0, but they "forgot" to do it in Maxwell 1. As we can see, the new card has that "magical" driver for async; the old one does not.

      Shame on them.

    2. I mean Nvidia users might still be using dx11 and opengl if that slaps the rx480…no issue there lel

        1. Ye… there is no bottleneck unless you are on a socket 775 dual-core/quad-core or something…
          Running my 2600K at 4.5GHz there is still no bottleneck, and there won't be any bottlenecks with my gear lel

          1. Well for anyone wondering, DOOM (demo) now has Vulkan as well. If you are using Nvidia, don’t bother. Nvidia fails at modern APIs. If using AMD, you should get a nice boost in performance, so crank up dem details!

  4. "When stressed, the graphics card consumes just over 300 watts"?
    Nah, BS, it's a 100 watt GPU;
    the 256-bit memory bus uses 50% of the power.
    300 watts, no chance.

    1. It was worded incorrectly: it was TOTAL system power of 300 watts. The Nitro itself uses approx. 150-165 watts.

      1. It doesn’t add up. It looks like Heaven bench is especially hard on AMD cards, or taxes the CPU heavily if an AMD card is used.
        Looking at the chart, the card at idle consumes an extra 10 watts (83 W vs. 73 W). I suppose GTX 1060 consumes 8 watts, and RX 480, 18 watts. So far, so good.

        But during the stress test, the 1060 consumes an additional 130 W, while the Radeon consumes an additional 230 W? Assuming ~60 W power draw for the rest of the system, that would imply ca. 240 W power draw by the Radeon and 130 W by the GeForce. It doesn't look realistic, especially since the RX 480 would exceed at least one power draw limit (a total of 225 W allowed from the PCIe slot and PEG connector).

        The GeForce 980 is a previous-generation enthusiast-level card and should consume at least as much power as the RX 480, if not more, and yet it's over 50 watts lower. A 390X would consume 320 W. It just doesn't add up.

        The X99 system is mentioned to have high power draw, but how high? What happens to the rest of the system during the Heaven benchmark? If CPU power draw goes up as well, we could be looking at the GTX 1060 consuming less than 100 W during stress testing, which is simply not realistic.
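The arithmetic in this comment can be checked directly. A quick sketch using the commenter's chart readings and their assumed fixed rest-of-system draw; all figures are the comment's estimates, not measurements, and `gpu_draw_estimate` is a hypothetical helper.

```python
# Whole-system load draw = idle draw + extra draw under stress; the GPU
# share is what's left after subtracting an assumed fixed draw for the
# rest of the system. Numbers below are the comment's estimates.
def gpu_draw_estimate(idle_w, load_delta_w, rest_of_system_w=60):
    system_load = idle_w + load_delta_w
    return system_load, system_load - rest_of_system_w

print(gpu_draw_estimate(73, 130))  # GTX 1060 -> (203, 143)
print(gpu_draw_estimate(83, 230))  # RX 480 -> (313, 253)
```

The RX 480 estimate landing above the 225 W limit the commenter cites is precisely why the numbers look suspect, unless the rest-of-system draw also rises under load.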

    1. A LOT of these annoying reviewers only test 1 or 2 games, most of them pointless, old, watered-down games. It's almost as if it were deliberate *cough green marketing guides/dollars? cough*.

      A more respectable site that does a better job is Techspot. While their settings are not always comparable to other reviews, they do runs with OCs on at least 3-4 games. One issue I had with their reference RX 480 review was that they didn't OC the RAM; we already know a RAM OC yields even more performance and scales rather well, because Polaris is starved for memory bandwidth for the first time in a long while as far as GCN cards are concerned.

      1. If readers prefer to see overclocked results showing gaming performance instead of 3DMark, that will be looked into. I can't really see how I could be paid off by NVIDIA, since the review is extremely positive, I advise buying this instead of the GTX 1060, and I include numerous DirectX 12 benchmarks.

        1. No offense guy, but why would we not? Why even bother with 3Dmark?! It doesn’t use true Async, and it almost never reflects typical results. It. Is. Useless.

          Unbiased games like Metro: Last Light or Crysis 3 would be ideal.

          1. In an ideal world, I'd like to include more games, but I also review motherboards, processors, memory, systems and monitors. Therefore, I have to try to optimise my time and select the number of benchmarks accordingly. If you think the site should have a dedicated reviewer for graphics cards, I'd recommend contacting the owner with your comments. All feedback is welcome, of course.

          2. Your response is more than enough to trust you. Hope you can improve the tests; surely with time you will include more games and everything.

          3. Look buddy, you can look at what I said as an attack – or constructive criticism.

            I never once claimed you were lazy, only that you can do better (Everyone can always improve). Benchmarking programs are really only useful for stability tests considering no one plays them. I overclock A LOT, and I have access to a lot of cards. When it comes to checking overclocking gains games are what matters!

            P.S. Seriously though 3DMark is a joke. Look into the fake Async they have….

        2. Never implied you were paid off but I am more than convinced some others are. I even thanked you for doing the DOOM benches right with Vulkan and TSSAA8X switched on 🙂

          As for the OC performance, yeah, we really do need to see more game numbers; after all, that is the point of OC. We must not encourage these pointless benchers who only OC to see pointless numbers in 3DMark. Real-world numbers are what most of us are looking for when we OC, as this will factor in throttling (if any), temps, and how that affects real-world sustained clocks etc.

          Since you are already going through those games for the stock benchmarks, is it really that much of a time sink to repeat the same runs with the OCs in place? At least for 3-4 games would be good, you know.

          Thanks for this review once again!

        3. TechPowerUp is paid, or mad at AMD; some say they didn't receive a review card, so they are mad at them for that.

          People commented this in their forums.

      2. I love Techspot. They have pleasing graphs and a solid pool of benchmarks that isn't biased towards either side. They also tend to come to good, solidly written conclusions.

        I also like Techpowerup for their large testing scopes and diverse sample sizes.

        Used to love Tomshardware, but their reviews just completely suck now. Their power consumption graphs are unreadable and they always seem to throw an Nvidia-biased slant into everything.

        1. TechPowerUp is also guilty of very suspiciously showing selective settings in those titles where a different setting/API would have shown the AMD card's true strength. A prime example would be Hitman 2016: TechPowerUp claims they used DX11 because DX12 was "buggy", pure bollocks; if you smell a shill in their claims, you'd be right! They don't want to run Mantle in BF4. Their selection of games is questionable overall, as those are the more Nvidia-friendly ones. They do have better methods and proper scopes for power measurements, the only thing they are worth visiting for.

      3. The TechPowerUp comments section for the review was full of people pointing out weird stuff in the review, and the admin accused one of trolling and banned him just for "not making constructive criticism". So, bye TechPowerUp for me.

        1. Yeah, TechPowerUp is now a place to avoid; they have made their biases very clear. They haven't even tried to come up with a passable lie whilst using settings that are overly Nvidia-friendly. If they are going to do that, they need to add in all those games with AMD-friendly settings for balance, to portray no bias either way. Nope, they are not into that; we know who they aim to please now, which is a true shame.

        2. Yeah, what a shame. I loved their PSU reviews; now I have trouble taking them seriously. I understand that they need to make money to survive, and are probably taking money from Nvidia, but they are taking this too far.

      4. I dunno why reviewers aren't even aware of undervolting yet. It helps bring power draw down significantly and even boosts your fps.

        1. They don't want to spend time tweaking and exploring; they really should. It could make for a nice review article on just that aspect.

          1. Also would be helpful for most average users to become aware they can actually do this and get benefit.

    1. 1.25V will destroy the card over a short period of time; I speak from experience, having killed two HD 6970s. 14nm transistors were not designed to operate at that voltage, and I can't even imagine how bad the power draw on the PCIe lane is at that voltage.

      1. AMD said it will give the user the best possible voltage control. I do not know how it will work, but I don't think AMD would release something that damages the card over time. I am obviously considering a factory solution; anything outside that can bring risks.

  5. Buddy… erm… you shouldn't put it on automatic when you overclock stuff. Do you even know how to overclock?
    Auto will always increase power and voltage more than is really required. I tried setting "auto" on voltage for a CPU once, and man, the voltage went through the roof even on a slight clock increase. So it's a no-go if you want to really overclock something. You need to incrementally increase the voltage while the power limit is on max, then test, and in a trial-and-error kind of way see how much you'll actually reach. I bet it can do 1400 without modding.
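The trial-and-error procedure described here (max out the power limit, step the clock up, add voltage only when a step fails) can be sketched as pseudologic. `find_max_clock` and `is_stable` are hypothetical stand-ins; a real `is_stable` would be a stress-test run, and nothing below touches actual hardware.

```python
# Incremental overclocking sketch: raise the clock in small steps, and
# when a step fails the stability test, spend one voltage preset and
# retry. Returns the highest stable clock and the voltage steps used.
def find_max_clock(base_mhz, max_voltage_steps, is_stable, clock_step=10):
    clock, voltage = base_mhz, 0
    while True:
        if is_stable(clock + clock_step, voltage):
            clock += clock_step
        elif voltage < max_voltage_steps:
            voltage += 1  # bump voltage one preset, retry the same step
        else:
            return clock, voltage  # out of both clock and voltage headroom

# Toy stability model: each voltage step buys ~50MHz over a 1266MHz base.
model = lambda mhz, v: mhz <= 1306 + 50 * v
print(find_max_clock(1266, 2, model))  # -> (1406, 2)
```

Under this toy model, two voltage presets land just above 1400MHz, which is the kind of ceiling the thread keeps circling around.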

    1. Maybe I didn’t explain this well. I selected different voltages and even the maximum possible setting in AMD’s Wattman software had zero effect compared to automated voltage. It’s just the screenshots were taken when automatic was enabled. Of course, I always manually increase the voltage as seen on other reviews, but Wattman only offers set numbers instead of a gradual increase. To be clear, I also review processors/motherboards, and would never use automatic voltage or advise people to simply copy my settings. Honestly, the Wattman software caused me some problems and despite selecting through the higher wattage presets, it did absolutely nothing to improve the OC headroom. In the future, I’ll be using different software because it’s really caused some confusion.

  6. I preordered this card but I may cancel the preorder and just spend another £50 on the Fury. It seems the 480 custom isn’t the holy grail, this time.

    1. Where on the planet are you finding a Fury for just £50 more? A Fury is easily £150-200 more.

      1. Only if he’s in Portugal/Spain.
        We have a retailer here selling the last couple Fury X for 299 euros. Quite a steal.

          1. I was trying to find the link to the site again, but I've already lost it.
            Someone posted it on one of the podcasts from the YouTubers Kyle & Paul's Hardware. They were the last ones in stock, and it was at least a week ago.

      1. Funny thing is, the so-called Hyper and Ultra settings that call for more than 4GB of VRAM give next to nothing in terms of visual eye candy. ME Catalyst is one good example; you'd struggle to notice anything significant between Ultra and Hyper. It's like trying to run 4x MSAA at 4K resolution, pretty useless. At 1440p the Fury cards are a great buy now, given the price reductions.

        But if we really think about it, none of the Furys were 4K cards anyway, apart from AMD’s marketing slides that is. Even the Titan X Maxwell wasn’t able to do 4K @ 60 fps for demanding titles, same goes for GTX 1080/70. The new Titan X Pascal may well be the first true 4K card to achieve that.

        There is a really lame trend in games now: if the 'high' settings already give the level of detail the developers originally targeted for the performance market, the so-called Ultra settings are nothing more than contrived loads just to make a case for more GPU hardware, with virtually no improvement in game visuals. It's a bit like those cheap games where they can't come up with an original idea to spice up the difficulty levels, so they make the difficulty higher by throwing an unreasonably high number of bosses from previous levels at you at an unfair, nigh-on impossible rate that no human can deal with.

        If there were really big eye candy differences to be had that makes you go wow this is just awesome and I can’t go back to the high settings anymore, then yeah 4GB plus is needed. I guess the next true jump in visual fidelity will only happen for mainstream once the new higher end consoles will be out, even then it takes a while for developers to get the max out of the new gen consoles to push new novel ideas in rendering.

        1. You need 4k to really justify the ultra textures included in games for sure. If you don’t use filthy “post processing” and smudge fest AA methods like FXAA, then it makes the most difference too.

          It’s more just allowance to load a lot into the ram on the gpu. Eventually we are going to have the program 100% loaded into system memory (vram / dram replacement in future, probably hbm2.) It makes sense from a performance perspective, we already see databases loaded into huge banks of ECC RAM on the business end.

          It's just hardware economics as to why we don't run everything this way already. It's not a big deal though, tbh. All the new cards will be 8GB, and that is the de facto standard for Ultra in games this generation, as dictated by the hardware. Devs optimize for GCN now because the consoles use it, and so will an increasing number of things going forward.

          "Volta" is essentially Nvidia going GCN-style compute-based, like old Kepler: async and shader intrinsic functions ready to rock. The game has changed and AMD is in the driving seat now thanks to the API wars. Maybe NVIDIA can make some headway on Vulkan with Volta going forward; however, all signs point towards Vega dominating until they can respond.

          The Pascal Titan will be the last true Milkening though. AMD is going to right this wrong (Nvidia's behaviour in the market towards consumers, particularly the lower end and mainstream: the GTX 1060 launched cheaper because of the RX 480, the ONLY reason).

          1. On the subject of milking, I was checking out some videos of the RX 480 in The Witcher 3, and AMD have really improved their scaling issue, it seems. At Ultra without BadHairdoWorks switched on, it's matching and at times even beating what any of the Fury cards can do at 1080p. It's only when the resolution goes up to 1440p that the RX 480 starts falling back by a larger margin versus the Fury cards. Fury cards showed really poor performance at lower resolutions while shining at 4K and, to some degree, at 1440p. They had bottlenecks elsewhere in the GCN pipeline; it looks like those have been addressed with 4th-gen GCN. Vega should be a monster. The GTX 1060 was a rushed job, being very close to its max fmax for the 16FF+ process, just like the RX 480 is in terms of fmax headroom on 14nm GloFo. Nvidia is clearly rattled, as they have been forced to milk the GP106 silicon to its limits. The GTX 1060 makes for a poor choice no matter how much of an Nvidia fan one might be; if one must use Nvidia, then the GTX 1070 is the only worthwhile card to have. The RX 480 is the better buy at sub-$300. Of course, I am under no illusions here; we have seen before that common sense doesn't win in the markets against aggressive, cunning marketing.

          2. The 1070 will drop big time in price on Vega's launch and actually be priced to make sense as a small step up over an RX 480.

  7. So judging by the comments, it seems it might hit 1400MHz with some tweaking. Too bad their OC software is not ready yet, if I didn't misread. The high-quality materials are a must, though I will probably run it in quiet mode; the physical switch is good, and if I need more performance in the future I'll go for the higher-performance mode.

    With freesync tvs rumored to be on the horizon, this will be ace for a future htpc set up.

      1. FreeSync TVs? I'm unaware of any confirmed TVs adopting FreeSync over either DisplayPort or HDMI.

        1. Technically you can use pretty much any monitor as a TV these days. 😛
          However, it may be tricky making FreeSync work with a TV signal at first, but I am sure that monitor/TV manufacturers like Samsung may incorporate the tech in the near future. It's basically variable refresh rate, which is part of the VESA standard now. The HDMI version is a custom thing done by AMD, but they said they will make the code open source and available for anyone wanting to adopt it or make their own version.

  8. Hey, thanks for this great review!

    Just one question from my side:

    If I understand correctly, the 0dB mode allows the GPU to keep the fans completely stopped when under 50ºC. However, from your temperature measurements it seems like the card was actually reaching 52ºC in idle, which seems pretty high to me and in theory should force the fans to spin too.

    Can you clarify how it is that the card reached 52ºC in idle, and at which point the fans actually started to spin?

    I have an old MSI R6950 Twin Frozr III OC, which is on its way out already, inside a Mini-ITX case… yet it easily idles between 37 and 42ºC, so I’m a bit shocked by the 52ºC shown here.

    Thanks!

        1. Let's look at the title of the review. It's titled "Sapphire Nitro+ RX 480 OC 8GB Graphics Card Review".

          Hmm. I wonder if there is a clue in there somewhere that gives us a hint as to what "this" card may be.

          1. Well, the "Sapphire Nitro" costs 269.99 and the Gigabyte 1060 costs 249.99, so if you say it costs 280, of course I'm gonna ask which card, because… till now I still don't know which one, but… I assume you are referring to the Sapphire lol.

  9. This RX 480 Nitro graphics card is very good because it has the best price and more VRAM than the GTX 1060.

    Sorry, fans of the Nvidia brand.

    1. The 1060 is basically priced the same as the rx 480 8gb version and outperforms it in most games. Neither card can use all of its vram effectively. The only case where 8gb would be used would be in crossfire. So if you’re getting a single card, the 1060 is superior. If you want cheap price, then get the rx 480 4gb. Nobody should get the 8gb 480 unless they plan to crossfire.

      1. Directx 11 games:

        Tom Clancy’s The Division:
        http://www.guru3d.com/articles_pages/radeon_rx_480_vs_gtx_1060_fcat_frametime_analysis_review,10.html

        Middle Earth Mordor
        http://www.guru3d.com/articles_pages/radeon_rx_480_vs_gtx_1060_fcat_frametime_analysis_review,12.html

        Far Cry Primal
        http://www.guru3d.com/articles_pages/radeon_rx_480_vs_gtx_1060_fcat_frametime_analysis_review,15.html

        DX 12 games with async on:

        Hitman 2016:
        http://www.guru3d.com/articles_pages/radeon_rx_480_vs_gtx_1060_fcat_frametime_analysis_review,7.html

        Ashes of Singularity:
        http://www.guru3d.com/articles_pages/radeon_rx_480_vs_gtx_1060_fcat_frametime_analysis_review,8.html

        Total Warhammer:
        http://www.guru3d.com/articles_pages/radeon_rx_480_vs_gtx_1060_fcat_frametime_analysis_review,9.html

        Conclusion:

        “Frametime results tell us a lot about single and multi-GPU setups in the way they interact in framerate, latency and anomalies. You will, however, see less issues on a single GPU setup (obviously). But hey, if there is a problem, rest assured it would be exposed. Comparing apples to oranges, both cards each show minor oddities here and there, it however is a too close to call difference.

        Overall we can say that Pascal with the GeForce GTX 1060 and the Polaris 10 with the Radeon RX 480 performed really well in the eleven FCAT tests we ran it through. Four out of eleven tests were DirectX 12 enabled. All games passed our examination easily without any noticeable stutters or anomalies, and that is pretty good if you ask me. Any game and any title can show some game engine related stuff, but that’s not what we are looking at. The nice 6 GB frame-buffer on the GTX 1060 and the 8 GB on the Radeon RX 480 obviously helps out quite a bit in the more AA heavy environments.

        For now we end this article, obviously in the future we’ll be monitoring framepacing with other and newer titles. “

        1. Overall, the 1060 performs a bit better than the RX 480. The times when the extra 4GB of VRAM comes into play are when you’re already taking a huge performance hit due to a lack of GPU power – which is why CrossFire would be the optimal scenario for the RX 480 8GB version. Otherwise, get the 4GB version. I would’ve gotten the RX 480 4GB if any aftermarket versions were in stock.

          https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1060/26.html
          I’m most concerned about the more GPU-intensive games such as Crysis 3, where the fps difference matters more since you’re already below 60fps.

          https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1060/14.html

          https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1060/16.html

          Also the Sapphire Nitro reviews:
          http://www.kitguru.net/components/graphic-cards/zardon/sapphire-rx-480-nitro-oc-4gb-8gb-review/

  10. These 480s will only get much better as time goes on, just like all recent AMD cards, and with 95% of upcoming games running on DX12, going with AMD is the better investment in this price bracket. I’ve noticed lots of folks elsewhere bringing up the DX11 points, but honestly, people in this price bracket expect their cards to last a few years, unlike enthusiast users who upgrade every 12 months and buy top-tier cards – so the obvious choice here is AMD, no doubt. You’d be a fool to choose a 1060 as a long-term investment, for the simple fact that 95% of upcoming games are ports of current console titles built around the DX12 or Vulkan APIs; do the math. These RX 480s are designed for DX12/Vulkan, and both of those APIs were born from Mantle. AMD is the safer bet in this price bracket.

        1. No.. what’s dumb is Moses claiming many AIB cards would hit 1600MHz and be close to the 1070 performance wise. So I’m wondering where these unicorn RX 480s are. There’s nothing dumb about it.

          PS: is that a dutch wife?

      1. Well the 1060 doesn’t have an sli port and the rx 480 is either scratching the balls of the 1060 or stomping it in games.
        The 1060 only wins in firestrike and similar benchmarks because it is a more powerful card, but on the other hand, the RX480 in cfx is better than the 1080.
        Regardless, benchmarks mean nothing.

        The RX is the better deal in every conceivable way and unlike Nvidia, AMD is heavily focused on stabilising dual GPU setups and future proofing for DirectX 12 games.

  11. Prior to launch I estimated an average OC for the 480 to be about 1.45GHz. Seeing the GPU only manage about 1.38 – 1.4GHz at best, regardless of manufacturer, begs the question: are these GPUs being volted too high? 1.15v is really quite high for 14nm. Someone should try undervolting the GPU to about 1.05v and see what the results are. I also wonder if 14nm is truly ready for prime time, or if GCN is too much “brain” and not enough “brawn”. Lean too heavily one way or the other with your architectural design and it’s going to hamper performance and overclocking ability.

      1. I don’t see much info beyond the odd post claiming better results. People want details, screenshots, proof. Not just claims.

  12. Can anyone confirm that the driver has been patched/fixed for this card to reduce the noise? The whole point of getting this card is to replace my very noisy 290X.

  13. I have a question regarding the Zero DB Cooling feature. Silence at idle is a feature I would highly value, so it is great to see Sapphire enabling a complete fan turn-off below a set temp – I think it’s around 52 deg C. However, I have read that the RX 480 NITRO’s heatsink can’t passively keep the GPU temp below the fans’ turn-on temp when the card is idling, and they have to start spinning to get the card back under the turn-off temp. Did you see this happening yourself during the review, or has anyone had this happen to their card? The Zero DB Cooling feature would lose some of its lustre for me if the fans had to cycle on and off relatively frequently – it defeats the purpose somewhat, and suggests the heatsink/heatpipe assembly is undersized for the task.

    1. OK, I can answer my own question. I purchased the 11260-01 SKU (NITRO+ OC 8GB) to try it out. I am lovin’ it actually, and very happy to make the shift to team RED. I swapped it in for my EVGA GTX 1070 ACX 3.0 (non SC) to see what it was like. I can confirm that the idle temp does creep above 52 deg C, and the fans do come on periodically to cool things down to 47 deg C before it creeps back up again. The max fan speed reached is 677 rpm, which is not audible. I have attached an image of GPU-Z showing the periodic fan activation. I suspect the heatsink is a little smaller than it could have been, but it still does a good job.

        1. That is quite possible, but I am not a strong advocate for having your customers be your R&D department – the British car industry tried this in the 70s and it was not a great success :-). I am only referring to the ‘silent/fan off’ performance of the card out of the box, as set by Sapphire’s engineers and product development team – the guys who should know these issues. I have mine set on the ‘Performance’ BIOS position; I have not yet tried the ‘Silent’ BIOS position.

           As you can see from my GPU-Z trace, the card’s core and memory clocks idle at 300MHz. This is low enough to maintain 47-48 deg C as long as you don’t touch anything. However, I noticed that as soon as you move the mouse, or move or resize an app open on the desktop (say you have Word, Excel and a browser open and you are swapping between them or resizing them), the core and memory clocks immediately jump up. As they do, they add a little heat which the heatsink does not seem to dissipate passively – at least not quickly. The temps climb up to the 52-54 deg C mark until the fans come back on to lower the temp to 47, and the cycle starts again. You can see this in the top two GPU-Z traces for core and memory clock speeds – they are jumping up all the time, and this is not during gaming, just desktop productivity work. The fans are virtually inaudible when they do come back on – I needed GPU-Z to indicate that they were actually running.

           I will try the ‘Silent’ BIOS option, but it will frustrate me a little if I need to keep opening the side panel of my case to physically swap between the ‘Silent’ and ‘Performance’ BIOS switch positions depending on whether I am gaming or doing productivity work. Don’t get me wrong, I am really enjoying playing with this card. It is the card that has enticed me back to team RED, and I love the look of it. The only problem is that once it’s in my rig I don’t see its great looks anymore, I just see its performance, and Sapphire’s promise of silent/fan-off operation at low loads could have been slightly better executed. Just for information, I swapped this card in for my EVGA GeForce GTX 1070 ACX 3.0, which sat rock solid in the low 30s deg C with fans off for all desktop productivity work.

          1. The perfect card doesn’t exist.
            But you can get close by first carefully selecting and then buying the one that requires the least bit of tinkering for your purposes. Which you should have listed and weighted.
            It’s a bit like finding out about the edges of the cards, and then determining which ones you can live with: rough edges (easy to forgive and live with) and sharp edges (hard to live with, causing quick or immediate elimination). It’s similar to how most women (subconsciously) select a man.
            Looks like low noise and low temps should have had the highest priority and weighting on your requirements list.

          2. Trying an AMD card after many years of using nVidia cards was actually the highest priority on my list. I chose the Sapphire RX 480 NITRO+ OC card as I really liked the look of it and have never bought a Sapphire card before. Just to reiterate, I am only referring to my personal observation of my card’s ability to remain completely ‘fans off’ under light desktop/productivity loads. All of my case fans (2 x 140 front and 1 x 140 rear) are set to SILENT, and this may have an impact. Apart from this one observation, I am really happy to have bought this card and don’t want to replace it with anything until Vega cards arrive – AMD has certainly given me a thirst for more product.
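The on/off cycling discussed in this thread is classic temperature hysteresis: the fans spin up above one threshold and stay on until the GPU cools past a lower one. A minimal sketch of that logic, using the 52 deg C spin-up and 47 deg C spin-down points observed in GPU-Z above (these are observed values, not published Sapphire specs, and the function is purely illustrative):

```python
# Hypothetical sketch of the zero-dB fan hysteresis described in the
# comments above: fans turn on once the GPU passes the trigger temperature,
# then stay on until it cools back down to the cut-off temperature.

FAN_ON_TEMP = 52.0   # deg C, observed spin-up point (assumption from GPU-Z)
FAN_OFF_TEMP = 47.0  # deg C, observed spin-down point (assumption from GPU-Z)

def update_fan(temp_c: float, fan_running: bool) -> bool:
    """Return the new fan state for the current GPU temperature."""
    if not fan_running and temp_c >= FAN_ON_TEMP:
        return True           # too hot: spin the fans up
    if fan_running and temp_c <= FAN_OFF_TEMP:
        return False          # cooled down: fans off again
    return fan_running        # inside the hysteresis band: no change
```

The 5 deg C gap between the two thresholds is what stops the fans from rapidly toggling when the temperature hovers around a single value; the periodic cycling described above happens because light desktop loads push the idle temperature through that band faster than the heatsink can passively shed the heat.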

      1. Hi Big Nish’s PC Insights, can I ask you a lil favour?
        I’d like to flash my Sapphire Nitro+ OC RX 480 4GB with the BIOS from your card. On the internet I found only invalid BIOS files of 256KB rather than the correct 512KB size. Can you please email me a copy of your BIOS? You can use Atiflash 2.74 to extract the BIOS file from your card. Thanks thanks thanks 🙂 🙂

        1. Sure – no problem. Do you want the ‘Quiet’ BIOS (1266 MHz) or the ‘Performance’ BIOS (1342 MHz), or both?

          1. You have the bios? OMG i have been looking for this all over the internet!!! 😀 you would make this poor soul happy if you pass them to my Email
            Deathmetalero_666@hotmail.com
            Please xXxNoScopeMLGxXx i have the Rx480 nitro+OC 8gb 11260-01 SKU and i need the bios to flash this gpu that has a corrupt performance bios, do a backup with atiflash man please!! 😀

  14. I wish that Sapphire hadn’t kitted out the I/O area with 2 HDMI ports.
    While I can see plenty of reasons why. For example, simultaneously connecting a VR-headset and a surround sound system, or simultaneously connecting a (HDR) TV and a surround sound system.
    I do believe that 3 or 4 DisplayPort ports would have been more fitting. A good quality DP ==> HDMI adapter from Club 3D is not very expensive, with everything still intact: picture, sound, and HDCP.
    There will be far fewer people connecting that second port to a surround sound system than there will be people leaving that second HDMI port unused.

  15. wtf, why is no one talking about the Sapphire RX 480 8GB vs the new Sapphire RX 480 Nitro+? Which one is better? Should I buy an RX 480?

  16. Personally, I think an OC’d GTX 980 and an OC’d R9 390X compared to an OC’d RX 480 would be the most useful comparison. On paper the RX 480 should be pretty much equal to a GTX 980, but there’s no proper comparison to confirm whether that is the case or not.
