Recent silicon shortages have hit graphics chips especially hard, exacerbated by AI and cryptocurrency demand.
As many of you are likely already aware, semiconductor demand is significantly outstripping supply at the moment, particularly for ICs fabricated at foundries (versus captive capacity owned by Intel, Samsung, etc.), and doubly so for ICs fabricated on leading-edge lithographies. The problem's not new, although the downstream supply chain impacts are increasingly evident, partly because the shortage itself keeps worsening and partly because the situation's now regularly covered by various media outlets and government entities. TVs and other consumer electronics devices, along with new cars, are just some of the IC-inclusive systems in short supply, and the situation's expected to get worse before it gets better.
Big picture, multiple factors are behind this shortage: a pandemic-driven work-from-home (and learn-from-home) spike in demand for PCs and other electronics, leading-edge foundry capacity that takes years (and billions of dollars) to expand, and pandemic-disrupted supply chains, compounded by panic ordering and inventory stockpiling further downstream.
The availability situation is even worse for graphics cards and the various components they're based on. GPUs from AMD and Nvidia, specifically the highest-end and most profitable product families (and members of those families), tend to leverage the latest-and-greatest fabrication processes, as do the blazingly fast and ultra-dense memories that act as the on-board frame buffers mated to those GPUs. And demand for those graphics cards (as well as the PCs that contain them) has exceeded even the work-from-home consumption spike mentioned earlier, because they're also used for gaming and other entertainment.
But two additional, newer applications for high-end graphics are putting substantial incremental strain on supply. Ironically, both are showcase examples of the general-purpose GPU (GPGPU) computing concept that I first wrote about in depth more than 15 years ago (see “Instigating a platform tug of war: Graphics vendors hunger for CPU suppliers’ turf”). Nvidia in particular made significant long-term investments in GPGPU, from both silicon and software standpoints, with little return for a long time, making the recent success all the sweeter, however frustrating the current inability to meet the entirety of demand might be.
The first new application is deep learning, which I’ve written about quite a bit of late (for example, see “2020: A consumer electronics forecast for the year(s) ahead” and “Google’s AIY kits offer do-it-yourself artificial intelligence”). It turns out that the same sort of massively parallel arithmetic processing that GPUs use to transform polygons and other high-level primitives into pixels is equally beneficial in accelerating the initial model-training phase of a deep learning project. GPUs can also be used for subsequent inference processing, as Nvidia’s burgeoning success in the semi- and fully-autonomous vehicle market exemplifies. That said, inference’s comparatively more focused function requirements, coupled with more stringent power-consumption constraints (low power being something GPUs aren’t historically known for), open the door to a broader set of competitive acceleration alternatives.
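To make the parallelism point concrete, here’s a minimal Python sketch (assuming PyTorch is installed and, for the GPU measurement, that a CUDA-capable board is present; the time_matmul helper is my own invention, purely for illustration) that times the large matrix multiplications at the heart of model training on the CPU versus the GPU:

```python
import time
import torch

def time_matmul(device: str, n: int = 4096, iters: int = 10) -> float:
    """Average seconds per n-by-n matrix multiply on the given device."""
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    torch.matmul(a, b)  # warm-up pass so one-time setup doesn't skew timing
    if device == "cuda":
        torch.cuda.synchronize()  # GPU kernels launch asynchronously
    start = time.perf_counter()
    for _ in range(iters):
        torch.matmul(a, b)
    if device == "cuda":
        torch.cuda.synchronize()  # wait for all queued GPU work to finish
    return (time.perf_counter() - start) / iters

print(f"CPU: {time_matmul('cpu'):.4f} s per multiply")
if torch.cuda.is_available():
    print(f"GPU: {time_matmul('cuda'):.4f} s per multiply")
```

On typical hardware, the GPU figure usually comes in one to two orders of magnitude lower, which is exactly the advantage that training workloads exploit.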
The other, and even more significant, source of incremental GPU demand is the Bitcoin (and Ethereum, and other cryptocurrency) “mining” market. Here, GPUs’ massively parallel processing is once again advantageous (as, ironically, is the AVX-512 hardware and instruction set support in Intel’s latest CPUs). The GPUs’ video outputs find no use here, which explains Nvidia’s development of cryptocurrency processing-specific architectures and products. Nvidia has also striven to discourage consumer GPUs’ use as cryptocurrency “mining” accelerators by hampering processing performance when the GPUs detect they’re running a hashing algorithm, although initial attempts were quickly circumvented, aided in no small part by the company’s own misstep. And perhaps obviously, mining-related demand waxes and wanes with the rise and fall of cryptocurrency market values.
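To see why hashing maps so well onto GPUs, consider this deliberately simplified Python sketch of the Bitcoin-style proof-of-work concept (this is not actual mining code; the header contents, 8-byte nonce width, and difficulty encoding are all illustrative assumptions). Every candidate nonce can be hashed independently of every other, so the search parallelizes almost perfectly across thousands of GPU threads:

```python
import hashlib

def mine(header: bytes, difficulty_bits: int = 20) -> int:
    """Search for a nonce whose double-SHA-256 digest falls below the target."""
    target = 1 << (256 - difficulty_bits)  # smaller target = harder puzzle
    nonce = 0
    while True:
        digest = hashlib.sha256(
            hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
        ).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # found a digest below the difficulty target
        nonce += 1

print(mine(b"example block header"))
```

Real miners run this same brute-force search across enormous numbers of hash engines simultaneously, whether GPU shader cores or dedicated ASICs.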
How many graphics cards are “miners” buying right now? Longtime, well-respected market analyst firm Jon Peddie Research (JPR) estimates that cryptocurrency miners bought 700,000 graphics cards in the first quarter of this year alone. And although JPR forecasts (and supplier comments concur) that crypto-related demand is beginning to abate, partially because some newer cryptocurrencies leverage SSDs instead, we’ve been here before, and the boom-and-bust cycle will likely recur in the future.
How has this all affected me? Well, for the Intel-based system whose build planning I’ve been blogging about of late (see “Building a PC: Choose your chipset and motherboard”), I plan to use an AMD Radeon RX 570-based graphics board, the MSI Radeon RX 570 Armor 8G OC, which I’d thankfully bought back in the summer of 2019. I’d originally intended to use it in a Thunderbolt 3-based external graphics (eGPU) enclosure, a project that to date I haven’t gotten around to actualizing. At the time, the board cost me $139.99 new at Newegg; as I write this, there’s a new one exactly like mine listed for $600 on eBay, along with a used one for $399. (Here’s a hint: don’t buy a used graphics card right now, because you don’t know whether it’s previously been used for crypto “mining,” with its lifetime-shortening consequences.) Keep in mind as you read about this price inflation that the RX 570 was introduced in the spring of 2017; it’s more than four-year-old tech at this point.
For the AMD system, I’m still kicking myself that I ended up returning for a refund an Nvidia GeForce GTX 1650-based graphics card (the GIGABYTE GeForce GTX 1650 D6 OC Low Profile 4G) that I’d bought last August for $149.99. My original plan was to use it in one of my “Hackintosh” systems, until I realized post-purchase that macOS never supported the GTX 1650, only its GTX 1050 predecessor. I was still tempted to hold onto it for Windows-only use, until I noticed that although it claimed to require only the PCI Express bus as its power source (versus also needing a dedicated power supply feed, which the proprietary PSUs in the “Hackintosh” systems didn’t natively support), it drew 85W of peak power, well beyond the 75W maximum that the PCI Express slot specification allows. I’m kicking myself because that same graphics card, which I could instead have used in one of these built-by-me Windows-only systems, is now selling new on Newegg for $344.60 (plus $20.00 for shipping). The GTX 1650 was introduced in April 2019.
Instead, in January I found and purchased an open-box MSI GeForce GTX 1650 D6 AERO ITX card on eBay for $194.99 (and felt very fortunate to be able to do so, even at that inflated price; that same card, brand new, is now selling for $499.99 on Amazon as I type these words).
You lose some, but you also sometimes win some. Back in September of last year, I’d bought an Nvidia GeForce GTX 1050 Ti-based card, MSI’s GeForce GTX 1050 Ti 4GT OC, for $127.38 at Newegg (minus a further $10 for a manufacturer rebate).
My original intent was to hold onto it as a Hackintosh system “spare,” but after watching the price inflation of recent months, I eventually decided to offload my “investment” instead. On May 2 of this year (my birthday, ironically), I successfully sold it on eBay for $210.27. Even after subtracting out eBay’s fees, I still turned a tidy profit. Again, keep in mind that the GeForce GTX 1050 Ti was introduced in October 2016, and notably was more recently reintroduced in response to the ongoing shortage.
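If you’re curious, the back-of-the-envelope math looks something like the following; note that the ~12.9% plus $0.30 final value fee is my assumption for illustration purposes (eBay’s actual fees vary by category and seller status), not a figure from my transaction records:

```python
purchase_price = 127.38
rebate = 10.00
sale_price = 210.27

fees = sale_price * 0.129 + 0.30           # assumed eBay final value fee
net_proceeds = sale_price - fees           # ~ $182.85
profit = net_proceeds - (purchase_price - rebate)
print(f"Estimated profit: ${profit:.2f}")  # ~ $65 under these assumptions
```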
Based on this experience, you might be thinking that I’m also pondering selling the AMD Radeon RX 570-based graphics board I mentioned earlier, and then re-purchasing it (or a successor) once price sanity returns to the market. And you’d be right, although I haven’t (yet) pulled the trigger; the Intel CPU has integrated graphics, after all, so I could just rely on them in the near term. What do you think I should do? Sound off with your thoughts on this or anything else I’ve discussed in the comments!
This article was originally published on EDN.
Brian Dipert is Editor-in-Chief of the Embedded Vision Alliance, and a Senior Analyst at BDTI and Editor-in-Chief of InsideDSP, the company’s online newsletter.