There’s a risk-and-benefit tradeoff in choosing new parts versus older ones, and this is especially the case in the world of analog versus digital...
The Wall Street Journal recently ran two articles within a few days of each other on the same theme: the shortage of ICs due to fabs running at full capacity. The articles “Chips Are in Hot Demand—and That’s a Problem” and “Ford, Other Auto Makers Cut Output, Idle Workers on Chip Shortage” pointed out that demand has surged for ICs for various reasons. These include increased sales of PCs due to work- and school-from-home, more traffic and storage at data centers and network pipes than anticipated, and heavy demands from automakers as a result of all the electronics designed into today’s cars.
The basic metric of lead time has stretched to 26 weeks, and even to 40 weeks for some components, according to some sources cited. In fact, some production lines have had to slow down significantly or even shut down for a week or more due to a lack of electronic subassemblies which, in turn, lacked the needed parts.
Of course, such news always brings out the pundits who have very good hindsight and say “they” [IC vendors and their fabs] should have seen this coming. At least, they do acknowledge that you cannot add fab capacity easily or cheaply, so planning and spending years in advance is critical.
It’s worth noting that neither article uses the phrase “integrated circuit” anywhere. Instead, it’s always “chips” and this bothers me a great deal. I’ve been fighting a losing battle on this point for years. So please, to any of you journalists out there: at least once in your story, call them integrated circuits. Saying “chips” makes it sound like wood chips or potato chips that anyone can make at home.
Using only the term “chips” diminishes and even demeans the multidisciplinary expertise it takes to conceive, design, simulate, fabricate, test, and ship these components. If you are reading this column at Planet Analog, I don’t have to explain further, of course, but to the average reader, it makes ICs sound like “no big deal.” Once again, we have made the magic of creating semiconductor devices seem all too easy.
Older, cheaper processes
While I don’t expect these reporters to understand the technical differences between analog and digital, I would hope they and the commenting pundits would at least have some sense of them. One of the articles states, “Where the broader chip-industry shortages are being felt most acutely, in many cases, isn’t at the cutting edge of technology, but rather those products using older and cheaper processes that aren’t as lucrative for manufacturers, analysts and industry executives say.”
This is where the lack of understanding of the realities of the lifecycle and fabrication of analog versus digital ICs is frustrating. Sorry, but those “older and cheaper processes” can be quite lucrative. Yes, the latest and greatest processors and other digital-intensive ICs may have higher average selling prices (ASPs), but the margins on older analog ICs can be much greater. The reason is that all the design and start-up costs and uncertainties associated with these ICs are history, test and trim are under control, and yields are high.
Even if the statement had some truth—and certainly, there is some—the article doesn’t address why vendors would even make these less-lucrative products. The reason is simple: the various customer “interest groups” such as design, component and product-test engineers really, truly want them. These products have all their kinks worked out, their idiosyncrasies are known, and there’s a level of confidence and comfort when using an older product with a track record.
Design is about taking risks along many different axes, and one of these risks is using a hot new product that offers performance, feature, and price advantages. But balancing that risk also means sticking with some known products in the design, and that’s where the older, often analog components shine, as they mitigate risk.
That’s not to say that there’s no place for new, improved analog products, but there are places where older products are “good enough” and that maturity is a virtue in the system-design sense. Good designers are cautious, and analog-component users are perhaps more cautious than others, for many good reasons, including experience and judgment.
I’ve tried to explain this to people who think everything is digital and that there’s little room for analog in today’s high-tech designs, and frankly, it’s been a tough slog. The idea that you should immediately put all those new ICs on your bill of materials (BOM) seems attractive at first. After all, they offer the prospect of making things easier and better, even if reality is often different. That’s why IC vendors and their fabs still see a large place for older processes, especially for products on the analog IC side.
Have you ever chosen to use an older, mostly-analog IC when a newer, apparently better one was just announced? Have you ever been challenged to defend this decision at a design review by those who don’t understand the realities of using a just-released component and the need to balance the risk?