The launch of the Nvidia 30XX graphics cards earlier this week certainly gave us a lot to digest in terms of features and performance potential, but I don’t think it’s unfair to say that a number of significant technical points were either not discussed or not particularly well elaborated upon.
Well, following an official Q&A on Reddit, if you had any particular questions regarding these new graphics cards, the good news is that, while it’s too late to ask now, Nvidia might already have some answers ready for you!
While a lot of questions were asked, we’ve whittled them down to some of the more interesting or significant points. If you do, however, want to check out the full thread, click on the link here!
Q – Why only 10 GB of memory for the RTX 3080? How was that determined to be a sufficient amount when it is stagnant from the previous generation?
A – We’re constantly analyzing the memory requirements of the latest games and regularly review them with game developers to understand their memory needs for current and upcoming titles. The goal of the 3080 is to give you great performance at up to 4K resolution with all the settings maxed out at the best possible price. To do this, you need a very powerful GPU with high-speed memory, and enough memory to meet the needs of the games. A few examples – if you look at Shadow of the Tomb Raider, Assassin’s Creed Odyssey, Metro Exodus, Wolfenstein Youngblood, Gears of War 5, Borderlands 3 and Red Dead Redemption 2 running on a 3080 at 4K with max settings (including any applicable high-res texture packs) and RTX on, when the game supports it, you get in the range of 60-100 fps and use anywhere from 4 GB to 6 GB of memory. Extra memory is always nice to have, but it would increase the price of the graphics card, so we need to find the right balance.
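If you’re curious how much VRAM your own games actually use, the nvidia-smi utility that ships with the Nvidia driver can report it. Here’s a minimal Python sketch from our side that polls it once a second (the query fields are standard nvidia-smi options; press Ctrl+C to stop):

```python
import subprocess
import time

# Poll VRAM usage once per second via nvidia-smi (ships with the Nvidia
# driver). memory.used and memory.total are standard query fields.
while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    used, total = out.strip().splitlines()[0].split(", ")
    print(f"VRAM in use: {used} MiB of {total} MiB")
    time.sleep(1)
```

Run a game at your usual settings with this ticking away in the background and you can see for yourself how close you actually get to that 10 GB ceiling.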
Q – Does Ampere support HDMI 2.1 with the full 48 Gbps bandwidth?
A – Yes. The NVIDIA Ampere architecture supports the highest HDMI 2.1 link rate of 12 Gb/s per lane across all four lanes, and also supports Display Stream Compression (DSC) to be able to power up to 8K 60Hz in HDR.
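For a little context from our side, the arithmetic behind that answer is easy to check: four lanes at 12 Gb/s is where the headline 48 Gbps figure comes from, while an uncompressed 8K 60Hz HDR signal needs more than that, which is where DSC comes in. A rough back-of-the-envelope sketch (our own figures, not Nvidia’s):

```python
# Back-of-the-envelope HDMI 2.1 bandwidth check. Ignores blanking intervals
# and FRL link-encoding overhead, so real requirements are even higher.
LANES = 4
GBPS_PER_LANE = 12
link_rate = LANES * GBPS_PER_LANE  # the headline 48 Gbps

# 8K @ 60 Hz with 10-bit HDR RGB (30 bits per pixel)
width, height, refresh, bpp = 7680, 4320, 60, 30
uncompressed = width * height * refresh * bpp / 1e9  # Gbps

print(f"Link rate:      {link_rate} Gbps")
print(f"8K60 HDR (raw): {uncompressed:.1f} Gbps")  # ~59.7 Gbps
print(f"Needs DSC:      {uncompressed > link_rate}")
```

The raw signal alone already exceeds the 48 Gbps link rate, and that’s before any overhead, hence the need for DSC at 8K 60Hz HDR.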
Q – Will there be a certain SSD speed requirement for RTX IO?
A – There is no SSD speed requirement for RTX IO, but obviously, faster SSD’s such as the latest generation of Gen4 NVMe SSD’s will produce better results, meaning faster load times, and the ability for games to stream more data into the world dynamically. Some games may have minimum requirements for SSD performance in the future, but those would be determined by the game developers. RTX IO will accelerate SSD performance regardless of how fast it is, by reducing the CPU load required for I/O, and by enabling GPU-based decompression, allowing game assets to be stored in a compressed format and offloading potentially dozens of CPU cores from doing that work. Compression ratios are typically 2:1, so that would effectively amplify the read performance of any SSD by 2x.
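That last point is easy to quantify: assuming the roughly 2:1 ratio Nvidia cites, the effective read rate is simply the raw drive throughput multiplied by the compression ratio. A quick illustrative sketch (the drive speeds below are our ballpark figures, not benchmarks):

```python
# Effective read throughput when assets are stored compressed and the GPU
# handles decompression (Nvidia cites a typical ~2:1 ratio). Drive speeds
# are illustrative ballpark figures, not benchmarks.
COMPRESSION_RATIO = 2.0

drives = {
    "SATA SSD (~0.55 GB/s)": 0.55,
    "Gen3 NVMe (~3.5 GB/s)": 3.5,
    "Gen4 NVMe (~7.0 GB/s)": 7.0,
}

for name, raw_gbs in drives.items():
    effective = raw_gbs * COMPRESSION_RATIO
    print(f"{name}: ~{effective:.1f} GB/s of decompressed asset data")
```

In other words, even a modest drive gets a meaningful boost, while a fast Gen4 NVMe drive could, in principle, feed the GPU asset data far faster than it could ever read it uncompressed.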
Q – Does RTX IO allow use of SSD space as VRAM? Or am I completely misunderstanding?
A – RTX IO allows reading data from SSDs at much higher speeds than traditional methods, and allows the data to be stored and read in a compressed format by the GPU, for decompression and use by the GPU. It does not allow the SSD to replace frame buffer memory, but it allows the data from the SSD to get to the GPU and GPU memory much faster, with much less CPU overhead.
Q – Will PCIe 3.0 bottleneck the RTX 3090? Concerned because my Intel system does not support 4.0.
A – System performance is impacted by many factors, and the impact varies between applications. The impact is typically less than a few percent going from a PCIe 4.0 x16 to a PCIe 3.0 x16 connection. CPU selection often has a larger impact on performance. We look forward to new platforms that can fully take advantage of Gen4 capabilities for potential performance increases.
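For context on why the difference is so small: even PCIe 3.0 x16 offers nearly 16 GB/s of theoretical bandwidth per direction, which games rarely saturate. A quick sketch of the standard numbers (ours, not Nvidia’s):

```python
# Theoretical PCIe bandwidth per direction. PCIe 3.0 and 4.0 both use
# 128b/130b encoding, so usable bytes/s = transfer rate * (128/130) / 8.
def pcie_bandwidth_gbs(gt_per_s: float, lanes: int = 16) -> float:
    return gt_per_s * (128 / 130) / 8 * lanes

print(f"PCIe 3.0 x16: {pcie_bandwidth_gbs(8.0):.2f} GB/s")   # ~15.75 GB/s
print(f"PCIe 4.0 x16: {pcie_bandwidth_gbs(16.0):.2f} GB/s")  # ~31.51 GB/s
```

Doubling a pipe that isn’t full to begin with doesn’t gain you much, which squares with Nvidia’s “less than a few percent” figure.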
Q – Please clarify whether the slide saying the RTX 3070 is equal to or faster than the 2080 Ti is referring to traditional rasterization or DLSS/RT workloads? – It would be very important if you could clear this up, since no traditional rasterization benchmarks were shown, only RT/DLSS-supporting games.
A – We’re talking about both. Games that only support traditional rasterization and games that support RTX (RT+DLSS).
Q – Will the new GPUs and RTX IO work on Windows 7/8.1?
A – RTX 30-series GPUs are supported on Windows 7 and Windows 10; RTX IO is supported on Windows 10.
Q – How bad would it be to run the 3080 off a split connector instead of two separate cables? Would it be potentially dangerous to the system if I’m not overclocking?
A – The recommendation is to run two individual cables.
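For anyone wondering why, it comes down to simple power budgeting. The PCIe specification rates each 8-pin connector at 150 W and the slot itself at 75 W, while the RTX 3080 is rated for 320 W of board power. A rough sketch (spec figures, with the caveat that real boards don’t split the load perfectly evenly):

```python
# Rough power-budget check for an RTX 3080 (320 W rated board power).
# PCIe spec figures: 75 W from the slot, 150 W per 8-pin connector.
# Real boards do not split the load perfectly evenly; this is illustrative.
BOARD_POWER_W = 320
SLOT_W = 75
EIGHT_PIN_SPEC_W = 150

cable_load = BOARD_POWER_W - SLOT_W  # what the PSU cables must deliver

per_cable_separate = cable_load / 2  # two individual cables share the load
per_cable_split = cable_load         # one daisy-chained cable carries it all

print(f"Two cables:  ~{per_cable_separate:.0f} W each "
      f"(within the {EIGHT_PIN_SPEC_W} W per-connector spec)")
print(f"Split cable: ~{per_cable_split:.0f} W on one wire run "
      f"(well beyond a single connector's rating)")
```

Quality PSU cables do have some headroom in practice, but two separate runs keep each one comfortably within spec, which is presumably why Nvidia keeps the answer short and simple.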
Q – Any idea if the dual airflow design is going to be messed up by inverted cases? More than previous designs? It seems like it would blow the air down onto the CPU, but the CPU cooler would still blow it out of the case. Maybe it’s not so bad.
A – The new flow-through cooling design will work great as long as chassis fans are configured to bring fresh air to the GPU, and then move the air that flows through the GPU out of the chassis. It does not matter if the chassis is inverted. The Founders Edition RTX 3090 is quieter than both the Titan RTX and the Founders Edition RTX 2080 Super. We haven’t tested it against specific partner designs, but I think you’ll be impressed with what you hear… or rather, don’t hear. 🙂
So, with a lot of surprisingly detailed answers from Nvidia there, hopefully this has cleared up many of the technical questions you may have had surrounding the new 30XX GPU series. Rest assured, though, any remaining questions will more than likely be addressed in the next couple of weeks! As for us, we simply can’t wait to get some of these strapped down onto our test bench!
What do you think? Are you interested in getting an Nvidia 30XX graphics card? If so, which model is most tempting to you? – Let us know in the comments!