NVIDIA’s Ada Lovelace architecture represents quite the generational leap. While retail pricing has been met with less than warm responses from customers, there’s no denying the power driving the latest NVIDIA GPUs. How much faster is the new generation of GPUs compared to their predecessors?
Let’s pit the cream of the crop for Ampere, the RTX 3090 Ti, against the RTX 4080. While you might think there isn’t much to discuss between the two GPUs, there are some surprising differences to note between them.
RTX 4080 vs. 3090 Ti: Side-By-Side Comparison

Feature | RTX 4080 | RTX 3090 Ti |
---|---|---|
Architecture | Ada Lovelace | Ampere |
CUDA Cores | 9728 | 10752 |
Manufacturing Process | 4 nm | 8 nm |
Core Clock Speed | 2205 MHz | 1560 MHz |
Boost Clock Speed | 2505 MHz | 1860 MHz |
TDP | 320 watts | 450 watts |
Interface | PCI-E 4.0 x16 | PCI-E 4.0 x16 |
Memory Type | GDDR6X | GDDR6X |
VRAM Amount | 16 GB | 24 GB |
Memory Bus Width | 256-bit | 384-bit |
- 16 GB of GDDR6X VRAM
- 2.51 GHz GPU clock speed
- PCI Express 4.0 support
- Great for AI work
- 9,728 NVIDIA CUDA Cores
RTX 4080 vs. 3090 Ti: What’s The Difference?
There have been steady improvements across the board with NVIDIA’s latest GPU architecture. Where the RTX 4080 might be thought deficient on paper, it still exceeds the example set by the RTX 3090 Ti. This is surprising, honestly, since the RTX 4080 isn’t even NVIDIA’s most powerful offering in the current generation of GPUs.
Performance
Despite specs suggesting a gap in performance, the RTX 4080 acquits itself quite ably against NVIDIA’s former flagship GPU. Even with less VRAM and a narrower memory bus, the RTX 4080 outclasses the 3090 Ti. Actual benchmarks using commercially available games are the most revealing: on average, the 4080 performed around 15% faster than the RTX 3090 Ti.
Running both GPUs on the same test bench revealed a wide difference in performance at 4K with ultra settings. Games like Cyberpunk 2077 sail to a stable 60 frames per second at 4K Ultra, a sizable increase from the 3090 Ti’s 50 to 53 frames per second. This is seen across other games as well, with demanding titles like Red Dead Redemption 2 also maintaining a stable 60 frames per second on the 4080.
Despite the gulf in raw specs, the RTX 4080 is faster overall. Its higher memory and core clocks more than make up for the 3090 Ti’s advantages on paper. It is honestly kind of incredible how large a leap NVIDIA has managed with this generation of GPUs.
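As a quick sanity check, the frame rates cited above can be turned into a percentage uplift. This is a minimal sketch using only the figures from this section (the 3090 Ti’s 50–53 fps range is averaged to its midpoint), not an independent benchmark.

```python
# Rough percentage uplift from the 4K Ultra frame rates cited above.
# These are the article's figures, not new measurements.

def percent_uplift(new_fps: float, old_fps: float) -> float:
    """Return how much faster new_fps is than old_fps, in percent."""
    return (new_fps - old_fps) / old_fps * 100

rtx_4080_fps = 60.0
rtx_3090_ti_fps = (50.0 + 53.0) / 2  # midpoint of the cited 50-53 fps range

uplift = percent_uplift(rtx_4080_fps, rtx_3090_ti_fps)
print(f"{uplift:.1f}% faster")  # -> 16.5% faster
```

The Cyberpunk 2077 numbers alone work out to roughly 16.5%, which lines up with the ~15% average across the wider game suite.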
Power and Pricing
The RTX 4080 requires less power to actually drive it. While it still needs external power connectors from the PSU, it carries a 320-watt TDP. Contrast this with the 3090 Ti’s higher 450-watt TDP. You can get away with a less powerful power supply unit with the RTX 4080, and actually have a more powerful GPU to boot.
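A rough way to see the practical impact of those TDPs is to add an assumed draw for the rest of the system and some headroom for transient spikes. The 300 W system estimate and the 1.5x multiplier below are illustrative assumptions, not NVIDIA recommendations.

```python
# Hypothetical PSU sizing sketch: card TDP plus an assumed draw for
# the rest of the system, with a headroom multiplier for spikes.
# The 300 W estimate and 1.5x factor are assumptions for illustration.

CARD_TDP_WATTS = {"RTX 4080": 320, "RTX 3090 Ti": 450}
REST_OF_SYSTEM_WATTS = 300   # assumed CPU + motherboard + drives + fans
HEADROOM = 1.5               # leave room for transient power spikes

for card, tdp in CARD_TDP_WATTS.items():
    recommended = (tdp + REST_OF_SYSTEM_WATTS) * HEADROOM
    print(f"{card}: ~{recommended:.0f} W PSU")
```

Under these assumptions the 4080 lands around 930 W while the 3090 Ti pushes past 1,100 W, which tracks with the 1,000-watt PSU commonly paired with the 3090 Ti.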
On paper, some of its specifications would indicate the RTX 4080 should perform less ably in games and other tasks than the RTX 3090 Ti, yet it doesn't. The only real sticking point between the two GPUs is pricing. There is simply no getting around the pricing increase NVIDIA imposed on the Ada Lovelace GPUs. Even with that in mind, you’re likely to spend around $1,000 less purchasing the RTX 4080 than picking up the 3090 Ti.
While the 3090 Ti was the most powerful GPU on the market upon its introduction, the 4080 has replaced it for all intents and purposes, and that’s before even touching on the RTX 4090. It has more raw performance, it uses less power, and it costs less. While previous generations of GPUs can make for a good choice for the budget conscious, it is hard to actually recommend the 3090 Ti when the 4080 has so much to offer.
AI Performance
NVIDIA has been one of the leading forces behind AI and machine learning development. This is thanks in part to even their entry-level GPUs having CUDA cores, which allow massively parallel workloads to run at far higher throughput than a CPU could manage.
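The data-parallel style that CUDA cores accelerate can be sketched in plain NumPy: one operation applied independently to every element, which is exactly the shape of work a GPU spreads across thousands of cores. NumPy runs this on the CPU; GPU libraries such as CuPy or PyTorch dispatch the same style of expression to CUDA hardware.

```python
# Illustration of the data-parallel pattern CUDA cores accelerate:
# the same operation applied independently to every element.
import numpy as np

pixels = np.random.rand(1920 * 1080)  # a 1080p image's worth of values

# Element-wise gamma correction: no element depends on any other,
# so the work can be split across as many cores as are available.
corrected = pixels ** (1 / 2.2)

print(corrected.shape)  # (2073600,)
```

Because every element is independent, the 4080’s 9,728 CUDA cores (or the 3090 Ti’s 10,752) can each chew through a slice of the array at once, which is why this class of workload maps so well to GPUs.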
The gulf between the two GPUs isn’t quite as pronounced when using the Topaz AI suite to process images. The RTX 4080 edges out the 3090 Ti, but it’s a fairly tight race. Image processing is a demanding workload, and both GPUs are up to the task, but all told you’ll likely want to lean toward the RTX 4080. It’s the better performer overall, and it’s a good bit cheaper for anyone piecing together an AI rig for smaller research and development work.
- 24GB of G6X memory
- 2.4 GHz GPU clock speed
- Powered by Ampere—NVIDIA's 2nd gen RTX architecture
- Enhanced Ray Tracing Cores, Tensor Cores, and new streaming multiprocessors
RTX 4080 vs. 3090 Ti: 6 Must-Know Facts
- The RTX 4080 measures just 310mm in length, fitting comfortably in most cases
- The RTX 4080 supports HDMI 2.1
- The 3090 Ti needs a 1,000-watt power supply unit to function alongside other components
- The RTX 4080 has the same number of ROPs as the RTX 3090 Ti
- The RTX 3090 Ti is the most powerful Ampere card on the market
- The RTX 3090 Ti is ideal for workstations and gaming
RTX 4080 vs. 3090 Ti: Which One is Better? Which One Should You Choose?
The RTX 4080 represents a massive generational leap for NVIDIA’s GPUs, and it outclasses the RTX 3090 Ti in all considerations. It gives better gaming and AI performance, uses less power, and most importantly costs far less than the 3090 Ti. NVIDIA’s RTX 3090 Ti may have ruled the roost when it came to being the most powerful GPU upon its introduction, but the RTX 4080 is the clear winner.
If you’re in the market for a high-powered GPU, there is no reason to choose the RTX 3090 Ti. You’ll be spending far more for a GPU that performs at a lower standard than the RTX 4080.
The image featured at the top of this post is ©Kiklas/Shutterstock.com.