4 reasons why the dual-chip graphics card trend died (2024)

Key Takeaways

  • Dual-GPUs fell out of favor due to compatibility issues with deferred rendering, which exposed inter-GPU communication bottlenecks and inefficient use of VRAM.
  • Performance issues, including micro-stuttering, and lackluster support from game developers contributed to the decline of multi-GPU setups.
  • Single-chip GPUs improved rapidly, making dual-GPUs an impractical and costly investment compared to regularly upgrading GPUs.

Everybody knows about multi-GPU gaming setups, which were the spectacle of the previous decade. But did you know that mainstream GPUs with more than one chip on their PCB also used to exist? These rendering beasts had been around since 1997, with the Dynamic Pictures Oxygen 402 being one of the first to carry not two, but four GPU chips on a single PCB.

Companies like 3dfx brought dual-GPUs to fame, and afterward the mantle was taken up by ATI (later bought by AMD) and Nvidia, which continued making dual-GPUs through the better part of the last decade. Sadly, you won't find any consumer-grade dual-GPUs anymore. What exactly turned the once-revolutionary dual-GPU formula into a mere memory of a bygone era?

4 Deferred rendering

Dual-GPUs just didn't play well with deferred rendering

With how quickly video game graphics improved after the year 2000, developers started to opt for deferred rendering. Compared to the older forward rendering technique, deferred rendering handled scenes with many light sources far more efficiently and was less taxing on the graphics card, with little effect on visual quality. It did this by splitting the rendering pipeline into multiple stages: a geometry pass first writes the visible surfaces' attributes into a set of buffers (the G-buffer), and a later lighting pass shades each on-screen pixel from that stored data, skipping any unnecessary lighting work.
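
To see why the technique took over, a rough cost model helps. The sketch below is purely illustrative Python (every function name and number is made up for the example, not taken from any real engine): forward shading pays a lighting cost for every fragment of every object under every light, while deferred shading fills a G-buffer once and then lights only the pixels on screen.

```python
# Illustrative back-of-the-envelope model of shading work for forward vs.
# deferred rendering. All names and numbers are hypothetical.

def forward_shading_cost(fragments_per_object: int, num_objects: int,
                         num_lights: int) -> int:
    # Forward rendering evaluates every light while shading every fragment
    # of every object, so lighting work grows with objects * lights.
    return fragments_per_object * num_objects * num_lights

def deferred_shading_cost(fragments_per_object: int, num_objects: int,
                          num_lights: int, screen_pixels: int) -> int:
    # Deferred rendering first writes surface data (albedo, normals, depth)
    # for every fragment into a G-buffer, then lights each on-screen pixel
    # using only that stored data.
    geometry_pass = fragments_per_object * num_objects  # fill the G-buffer
    lighting_pass = screen_pixels * num_lights           # shade visible pixels
    return geometry_pass + lighting_pass

if __name__ == "__main__":
    # A busy scene: 500 objects, 50 dynamic lights, 1080p output.
    scene = dict(fragments_per_object=20_000, num_objects=500, num_lights=50)
    print("forward :", forward_shading_cost(**scene))                              # 500,000,000
    print("deferred:", deferred_shading_cost(**scene, screen_pixels=1920 * 1080))  # ~113,700,000
```

With dozens of lights, the deferred path does a fraction of the lighting work, which is exactly why developers embraced it as scenes grew more complex.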

The beef between deferred rendering and multi- or dual-GPUs was that, since the final frame depended on the earlier stages of its rendering, the data from those stages had to be passed from one GPU to the other. Initially, the PCIe interface was used to move that data, and dedicated SLI and CrossFire connectors were added later, but even they weren't enough for the sheer bandwidth the communication required. Another issue was the inefficient use of VRAM. Since each GPU had access only to its own VRAM, both had to store the same set of textures to render the same scene, which meant that two 4GB GPUs gave you only 4GB of effective VRAM.
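
The VRAM penalty is simple arithmetic. Here is a minimal sketch (the helper function is hypothetical, written only to make the point) showing why mirrored memory caps usable capacity at a single card's worth:

```python
# Illustrative sketch: effective VRAM when every GPU must mirror the same assets,
# as classic SLI/CrossFire-style rendering required.

def effective_vram_gb(vram_per_gpu_gb: int, num_gpus: int, mirrored: bool = True) -> int:
    # Classic multi-GPU rendering mirrors every texture and buffer into each
    # card's local VRAM, so usable capacity stays at one card's worth.
    if mirrored:
        return vram_per_gpu_gb
    # A hypothetical pooled-memory scheme would scale with the GPU count.
    return vram_per_gpu_gb * num_gpus

print(effective_vram_gb(4, 2))                  # 4 -> two 4GB GPUs still give 4GB
print(effective_vram_gb(4, 2, mirrored=False))  # 8 -> only with true memory pooling
```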

Both of these issues carried over to dual-GPU cards. Despite the two chips sitting on the same PCB, a communication gap still existed between the chips and their separate pools of VRAM. With enough engineering effort, that gap could have been narrowed, but with little incentive to do so, Nvidia and AMD never paid it much heed.

3 Performance Issues

The ever-present and infamous micro-stuttering

Among the issues that multi- and dual-GPU setups had, arguably the most notorious was micro-stuttering. It stemmed from how the two GPUs split the work of rendering frames, typically by alternating frames between them, and without proper support from developers there was no real workaround short of adding a third or fourth GPU to your setup.
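
As a rough illustration (with made-up timings and a deliberately simplified model of alternate frame rendering), the sketch below shows how two GPUs that each average the same frame time can still deliver frames in uneven bursts when their start offsets drift, which is what players perceived as micro-stutter:

```python
# Simplified simulation of frame pacing under alternate frame rendering (AFR),
# where two GPUs render even and odd frames. Timings are hypothetical.

def afr_frame_gaps(render_ms: float, offset_ms: float, frames: int) -> list:
    """Return gaps between consecutive displayed frames.

    Each GPU needs render_ms per frame; GPU 1 starts offset_ms after GPU 0.
    Ideal pacing is offset_ms == render_ms / 2, but without enforcement it drifts.
    """
    finish_times = []
    for i in range(frames):
        gpu = i % 2                                   # even frames on GPU 0, odd on GPU 1
        start = (i // 2) * render_ms + gpu * offset_ms
        finish_times.append(start + render_ms)
    return [b - a for a, b in zip(finish_times, finish_times[1:])]

print(afr_frame_gaps(render_ms=60, offset_ms=30, frames=8))  # [30.0, 30.0, ...] well paced
print(afr_frame_gaps(render_ms=60, offset_ms=10, frames=8))  # [10.0, 50.0, 10.0, ...] micro-stutter
```

Both runs average 30 ms per displayed frame, so an FPS counter reports the same framerate for each, yet the second alternates 10 ms and 50 ms gaps and feels far choppier.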

With the release of DirectX 12, support for multi-GPU actually improved. Yet this API is often credited as the final nail in the coffin for multi-GPU setups. That's because while DirectX 12 could significantly increase multi-GPU performance in games, all of that work had to be done by game developers. With only a minute percentage of gamers rocking setups that could leverage it, devs had no incentive to put in extra hours and money on a feature that only a few people could take advantage of. The few DX12 games that did add multi-GPU support, like Ashes of the Singularity and Shadow of the Tomb Raider, ran well on such builds.

Even if we ignore the high price and power-hungry nature of a multi-GPU setup, the gaming community's failure to readily adopt the technology led to lackluster support from game developers, which led to performance and compatibility issues, which led to people avoiding multi-GPU setups entirely. It was a vicious cycle that would only end with the death of multi-GPU and, consequently, dual-GPU graphics cards.

2 Rapid improvement and innovation in single-chip GPUs

Single-chip GPUs were the smarter purchase

In 2014, Nvidia released its last consumer dual-GPU card, the Titan Z, for $3,000. It was a dream card for many, boasting 5 TFLOPS of FP32 compute and 336 GB/s of memory bandwidth. Just four years later, in 2018, the RTX 2080 Ti launched for less than half the price of the Titan Z with 13.45 TFLOPS of FP32 and 616 GB/s of memory bandwidth, plus revolutionary features like DLSS and ray tracing.
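
Taking those figures at face value, and assuming the RTX 2080 Ti's roughly $999 launch MSRP (the article only says "less than half the price", so the exact number here is an assumption), a quick compute-per-dollar calculation shows how lopsided the comparison became:

```python
# Back-of-the-envelope value comparison using the FP32 figures quoted above.
# The RTX 2080 Ti price is assumed to be its ~$999 launch MSRP.

cards = {
    "Titan Z (2014)":     {"fp32_tflops": 5.00,  "price_usd": 3000},
    "RTX 2080 Ti (2018)": {"fp32_tflops": 13.45, "price_usd": 999},
}

for name, spec in cards.items():
    gflops_per_dollar = spec["fp32_tflops"] * 1000 / spec["price_usd"]
    print(f"{name}: {gflops_per_dollar:.1f} GFLOPS per dollar")
```

That works out to roughly 1.7 GFLOPS per dollar for the Titan Z against about 13.5 for the RTX 2080 Ti, an eightfold improvement in four years, before even counting DLSS and ray tracing.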

With single-chip GPUs improving that rapidly, sinking such an absurd amount of money into a dual-GPU card that might not stay relevant for more than a few years made little sense; periodically upgrading a single GPU was the better choice. So even if GPU manufacturers had kept making dual-GPUs, only enthusiast PC gamers would have bought them, which would neither have generated many sales nor been enough to incentivize game developers to optimize their games for such graphics cards.

1 Dual-GPUs stopped being relevant

They lost the advantage that once defined them

While dual-chip GPUs were once priced competitively and delivered a real performance uplift, the category soon became a race to see which company could build the most power-hungry and expensive GPU. At their peak, dual-GPUs offered much better performance for the money. The GTX 295, for example, was a dual-GPU card that launched at an MSRP of $499 and combined features of the GTX 260 and GTX 280, yet it outperformed two GTX 260s in SLI, a venture that would have set you back $900.

Nvidia's Titan Z in 2014 was little more than an exercise in hubris. With a $3,000 price tag and performance worse than two GTX 780s in SLI, each of which cost only $650, it was an utter failure. AMD's R9 380x2 was a similar story; with four 8-pin power connectors and a 580W TDP, it was destined to fail. While these cards did grab attention, hardly anyone bought them. Poor multi-GPU support meant that only one chip in these cards would do the heavy lifting in most games, making them extremely bad value for money. And since single-chip GPUs of those generations were adequately powerful, ran quieter, used less power, and cost less, everybody went in that direction. The only real advantage a dual-GPU card offered was the space it saved: instead of two separate cards in two PCIe slots, you got the same functionality from a single slot.

Will dual GPUs make a comeback?

You might be surprised to learn that dual GPUs are still being made today; AMD's Radeon Pro Vega II Duo, used in Apple's Mac Pro, is one example. Multi-GPU setups also still exist, but not in the form we used to know and love. Only professionals and industries that genuinely need such GPUs and setups use them, and thanks to AMD's mGPU support, the very few gamers who still want to run two GPUs can do so.

But for gaming, it's safe to say that dual-chip GPUs won't be making a comeback anytime soon, at least not in the form we remember. The CPU and GPU industry is heading toward chiplets, with AMD leading the way. In its current state, that is nothing like having two GPUs on one card, since the main chip is simply broken down into functional blocks to reduce production costs. But chiplets may eventually evolve to the point where we see something that genuinely counts as a dual-GPU.


FAQs

Why did dual GPUs fail?

Dual-GPUs fell out of favor due to compatibility issues with deferred rendering, causing communication gaps and inefficient use of VRAM. Performance issues, including micro-stuttering, and lackluster support from game developers contributed to the decline of multi-GPU setups.

What are the disadvantages of dual GPUs?

These include:

  • Running two cards close to each other consumes more power and produces more heat and additional noise.
  • Dual-card compatibility with all games is not guaranteed.
  • SLI and CrossFire can sometimes cause on-screen micro-stutters, making the video look choppy.

Why is my graphics card suddenly dying?

Here are a few reasons a GPU can die outright:

  • GPU components failing prematurely due to faulty manufacturing.
  • Improper installation of the graphics card.
  • Static discharge while installing the graphics card.

What caused the graphics card shortage?

Surges in cryptocurrency prices led to a boom in mining activities, increasing the demand for GPUs. Cryptocurrency miners often buy GPUs in bulk, exacerbating the shortage. Moreover, the profitability of mining has made GPUs a hot commodity, leading to inflated prices.

Why do graphics cards fail?

  • Overheating: excessive heat can damage the GPU and other card components.
  • Driver issues: outdated or corrupt drivers can cause performance problems.
  • Physical damage: mishandling or impacts can damage the card.
  • Power supply issues: inadequate power can affect graphics card performance.

Why is multi-GPU dead?

The death of multi-GPU

Ultimately, the direction of gaming hardware and software trends rang the death knell for multi-GPU setups. As games grew exponentially more complex, developers had little incentive to optimize for the shrinking multi-GPU market. The extra work for a minimal payoff wasn't worth it.

Is a dual GPU setup better than a single GPU?

Increased Performance: In certain scenarios, dual GPUs can indeed provide a substantial performance boost, especially in applications that are optimized for multi-GPU configurations, like 3D rendering, video editing, and specific games.

What are the disadvantages of a GPU?

Disadvantages of GPUs compared to CPUs include:

  • Multitasking: GPUs can perform one task at massive scale, but cannot perform general-purpose computing tasks.
  • Cost: individual GPUs are currently much more expensive than CPUs, and specialized large-scale GPU systems can reach costs of hundreds of thousands of dollars.

Are two GPUs overkill?

The ideal graphics card count depends on your use case: 1-2 GPUs are ideal for most high-end gaming rigs.

How can you tell when a GPU is dying?

Signs Your GPU is Dying

  • Your screen freezes or goes black.
  • You experience lags or frame rate drops.
  • Your screen glitches.
  • You see strange artifacts, like dots, lines, and patterns.

Why do GPUs go bad?

Common Causes of Video Card Failures

Often, overheating from dust or lint in your computer is to blame. Other factors can include faulty installation to the motherboard, frequent overclocking, or a power surge from an electrical outage. Just like everything else in your computer, your GPU is subject to wear and tear.

What causes a graphics card to crash?

Causes of Graphics Card Crashes

  • Overheating: excessive heat buildup can lead to crashes and performance degradation.
  • Outdated graphics drivers: using obsolete or incompatible drivers can trigger instability.
  • Hardware issues: faulty GPU connections, power supply problems, or defective components.

Will AI cause a GPU shortage?

Impact of AI on GPU supply

The rising demand for powerful processors, driven by the widespread adoption of AI technologies in sectors like healthcare, has caused significant supply chain challenges, and several recent reports have tied GPU shortages directly to AI development.

Who is buying up all the GPUs?

Zuckerberg's Meta Is Spending Billions to Buy 350,000 Nvidia H100 GPUs. In total, Meta will have the compute power equivalent to 600,000 Nvidia H100 GPUs to help it develop next-generation AI, says CEO Mark Zuckerberg.

Why did GPU prices drop?

Three years later, Nvidia's RTX 30-series GPUs remain capable cards, and prices have come down now that the supply chain shortages and cryptocurrency mining are over.

Why did SLI and CrossFire fail?

By the mid-2010s, there was barely any space inside your average chassis to include multiple graphics cards without taking a hit in the thermals. As newer APIs started making the rounds, it became even more difficult for game developers to optimize their games for SLI/CrossFire setups.

Why was SLI discontinued?

Nvidia retired SLI in part because it didn't want buyers pairing two cheaper cards to match the performance of a single expensive one, and in part because gamers increasingly expected multi-GPU support to simply work from a game's settings menu rather than requiring tinkering. That shift in expectations has been the biggest change of the last 20 years.

Will dual GPU come back?

So, while the days of having multiple GPUs for graphics rendering (we still do that for other types of jobs) or having two discrete chip packages on a single card are unlikely to ever return, the future looks more and more to be built from arrays of GPUs on a single die, acting as one.

Why was NVLink discontinued?

Nvidia CEO Jensen Huang stated that the NVLink connector was removed because the I/O was needed for "something else," and the company decided against spending the resources to wire out an NVLink interface.
