10 consumer technology breakthroughs from 2019

Article by Brian Dipert

As 2019 draws to a close, it's time for a review of the big consumer technology breakthroughs of the past year.

With another year drawing to a close, it’s time once again for a review of the big consumer technology breakthroughs of the past 12 months. Click through the list below, and sound off in the comments with your thoughts on my thoughts (which are in no particular order save for how they streamed out of my noggin), as well as anything you think I may have overlooked.

Device autonomy

Tesla Model 3

Much has been written and accomplished over the past year concerning self-driving cars, exemplified by Tesla's recently enabled "Smart Summon" feature and pending "early access" full self-driving capabilities (implemented in, for example, the Model 3 shown above). But, as I mentioned in a blog post earlier this year, partial-autonomy driver-assistance features are not only currently more pervasive but also, at least in the near term, more impactful from a driver-, passenger-, and pedestrian-safety standpoint. Broaden your perspective beyond cars to other types of vehicles, and autonomous freight-transport trucks (for example) come into view, promising to both relieve current driver shortages and reduce human driver errors (albeit not mechanical failures). Broaden it further, and other devices already in widespread deployment appear, such as computer vision-enhanced drones and vacuum cleaners.

Wi-Fi advancements

Google Nest Wi-Fi

TP-Link Archer AX1500 router

Two wireless networking innovations, with varying degrees of compatibility with (and therefore near-term impact on) LAN clients, began to hit their mainstream strides this year. First off was mesh networking, a niche oddity a few years ago when companies like Eero, Google (with the first-generation Google WiFi system, along with the OnHub mesh retrofit), and Luma launched their initial consumer-tailored stabs at it. Nowadays, pretty much every networking company I can think of has at least one mesh setup in its product portfolio; they're based on mature protocols for broad client compatibility and let you and your devices roam around a home without dropping the router connection or needing to switch SSIDs.

Then there's "Wi-Fi 6," i.e., 802.11ax. Under development for several years, it was officially unveiled by the Wi-Fi Alliance last October, and a flurry of chipset, software, and end-device announcements (based on those chipsets and software) immediately followed. Quoting Wikipedia, "To improve spectrum efficient utilization, the new version introduces better power control methods to avoid interference with neighboring networks, orthogonal frequency-division multiple access (OFDMA), higher order 1024-QAM, and up-link direction added with the down-link of MIMO and MU-MIMO to further increase throughput, as well as dependability improvements of power consumption and security protocols such as Target Wake Time and WPA3." It's backward-compatible with 802.11ac and prior-generation protocols, of course, albeit without the enhancements in those usage cases. And at least one router manufacturer (TP-Link with its AX1500, shown above) is already launching products based on it at prices below $70. At least for now, however, Google is unwilling to follow TP-Link there … company officials commented that they declined to include Wi-Fi 6 support in the new second-generation Nest Wi-Fi system (also shown above) due to cost.
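For a rough sense of what one of those features buys, here's a back-of-envelope calculation of my own (not from the article or the Alliance): the raw per-symbol gain in moving from 802.11ac's 256-QAM to Wi-Fi 6's 1024-QAM, before coding rate, channel width, and spatial-stream effects are factored in.

```python
# Each QAM constellation point carries log2(M) bits, so 1024-QAM lifts the raw
# per-symbol payload by 25% over 256-QAM -- a ceiling, not a real-world figure.
import math

bits_per_symbol_11ac = math.log2(256)    # 8 bits/symbol (802.11ac, 256-QAM)
bits_per_symbol_11ax = math.log2(1024)   # 10 bits/symbol (802.11ax, 1024-QAM)

gain = (bits_per_symbol_11ax - bits_per_symbol_11ac) / bits_per_symbol_11ac
print(f"Raw per-symbol gain from 1024-QAM: {gain:.0%}")  # -> 25%
```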

5G cellular

Motorola 5G mod

After years of hype, 5G cellular wireless is finally here (strictly speaking, for mobile devices … Verizon began limited sales for fixed broadband deployments last year, although it didn’t hit its stride until 2019, either). Shown above is the 5G mod for the Moto Z2 Force Edition, Moto Z3, and Moto Z4, phones introduced beginning several years ago (and a clever way to both extend the phones’ marketable life and upsell 5G into an installed base of handsets). Motorola also has native 5G phones on the way, and competitors such as Huawei and Samsung are already selling multiple 5G-capable models. Unsurprisingly, they’re bulky and heavy compared to LTE counterparts, per numerous reviews I’ve read, as well as prone to overheating (and dropping back to 4G transfer rates in response). And significant question marks remain regarding the rollout pace (and ultimate extent) of 5G’s coverage around the world, not to mention its consistent reliability within a given location (dependent in part on which band(s) are in use). But at least we’re now out of the starting blocks.

AMD’s CPU resurrection

AMD Ryzen Threadripper 3970X

AMD has long been a leading supplier of GPUs, by virtue of its 2006 acquisition of ATI Technologies. But until recently, the company had fallen notably behind longstanding nemesis Intel in CPUs from both performance and power consumption standpoints, and was therefore forced to undercut its competitor on price in order to maintain a modicum of market share. This all changed in 2019 … big time. AMD actually pre-announced its Zen microarchitecture three years ago and began shipping products based on it the following year. It's taken time for computer OEMs to grow confident in Zen's x86 compatibility and other characteristics, as well as to bring Zen-based system designs to market. Intel's ongoing product shortages have certainly not hurt AMD's aspirations. And AMD has used this time to its advantage, iterating both the microarchitecture and the manufacturing process it's built on.

The just-introduced 32-core AMD Ryzen Threadripper 3970X, for example, has an MSRP of $1999 and is targeted at the high-end desktop (HEDT) market, where it currently has no direct Intel consumer competition; the workstation-targeted Intel Xeon W-3175X has four fewer cores and costs 50% more ($2999). AMD is, at least currently, focusing its fire on market segments that are particularly profitable, given that it's reliant on limited-capacity foundry supply sources: HEDT, workstations, and servers. Intel's multi-fab network still gives it a volume edge in mainstream desktop segments, as does its longstanding emphasis on power consumption for various mobile computing form factors. And Intel promises that upcoming architecture evolutions, along with belated process shrinks, will enable it to regain the lead. But ultimately, this showcase isn't about AMD per se; it's a celebration of the return of true competition to the CPU market, a resurrection that will ultimately benefit everyone, even Intel, by forcing it to strengthen its product offerings.

7 nm semiconductor processing

Apple A13 announcement

As you may have sensed from the preceding entry, architecture innovation isn't the sole reason that AMD's CPUs are now once again competitive with Intel counterparts. AMD's access to advanced lithographies is equally critical; over the past few years, the company and its foundry partners have rapidly transitioned from 14 nm to 7 nm process nodes, leading to a larger cost-effective transistor budget, faster transistor switching, and lower per-transistor power consumption. Meanwhile, Intel has been pretty much stuck on its 14 nm node (which it claims is the dimensional equal of others' 10 nm processes), forced to iterate its product line based solely on architecture tweaks.

AMD isn’t the only semiconductor supplier running leading-edge products on TSMC’s 7 nm foundry line, either. The photo shown comes from Apple’s iPhone 11 introduction earlier this year, which showcased the company’s homegrown (Arm-based) A13 SoC. Speaking of transistor budgets, a picture paints a thousand words, doesn’t it? The A13 wasn’t actually the first Apple 7 nm-fabricated application processor; that honor goes to last year’s A12. And the A12 wasn’t even the first 7 nm-based product introduced (though it was the first commercially available in an end product); that honor goes to Huawei subsidiary HiSilicon’s Kirin 980.

Voice interfaces

Amazon Echo

Amazon's Echo smart assistants (shown above) and their counterparts from Apple (HomePod), Facebook (Portal), and Google (Nest Home) are at this point well covered in the media and widely adopted by consumers. Less mentioned but no less impactful, however, is the broader voice interface support built into Apple's macOS (Siri), Google's Android, and Microsoft's Windows 10 (Cortana), as well as into Amazon's shopping and Google's search apps for multiple mobile operating systems (the remote control for the Comcast Xfinity system at my relatives' house even embeds a microphone for voice-activated search and navigation). Personally, I never even use my smartphone's or tablets' pop-up virtual keyboards any more for search query or text message entry; once accuracy got good enough, the occasional correction became an acceptable tradeoff for the convenience. And that accuracy continues to inch toward 100% as user feedback gets baked into the algorithms' iterative AI training routines.

Personal health monitoring

Apple Watch health monitoring

Fitbit Inspire HR

Thanks in no small part to the burgeoning popularity of wearables such as the EKG-measuring Apple Watch Series 5 shown above, along with the lower-cost (but still capable of measuring steps, heart rate, sleep profiles, and the like) Fitbit Inspire HR below it, consumers are increasingly shouldering responsibility for monitoring their own health instead of relying solely on a yearly physical at the doctor's office. And, for the common good (in conjunction with privacy promises), they're growing increasingly comfortable with sharing the data their widgets collect, too … see, for example, the recently published results of Apple's Heart Study in partnership with Stanford University, or the three new studies Apple announced shortly thereafter.

Mainstream general-purpose GPU

Jetson Nano board

Back in late 2005, I authored the feature article "Instigating a Platform Tug of War: Graphics Vendors Hunger for CPU Suppliers' Turf," which (along with a series of companion blog posts) covered the GPGPU (general-purpose graphics processing unit) phenomenon. NVIDIA in particular was, even at that long-ago date, advocating the concept as a means of using parallel processing to accelerate algorithms that would otherwise execute serially, and more slowly, on the system CPU. But examples of widespread, successful implementation of the concept were few and far between: Apple's Core Image API, for example, along with a few still- and video-image processing functions in Adobe's application suite.

Fast-forward nearly 15 years, and the GPGPU landscape has certainly matured. NVIDIA launched its CUDA development platform and associated API two years after my article was published, for example, and it has been enthusiastically adopted by the supercomputing and other scientific computing communities, among others. GPGPU then shifted into high gear with the emergence of various deep learning techniques as an alternative to traditional algorithm development approaches. NVIDIA GPUs found use first, and remain widespread to this day, as deep learning model training acceleration platforms, and they're now increasingly being used to accelerate subsequent deep learning inference routines as well. The $99-or-less Jetson Nano developer kit shown above, for example, is based on an SoC that integrates a 128-core Maxwell-generation GPU and a quad-core Arm Cortex-A57 CPU, the latter running at 1.43 GHz.
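To make the concept concrete, here's a minimal sketch of the data-parallel pattern described above: one GPU thread per array element, in place of the serial loop a CPU would otherwise run. It assumes the Numba library and a CUDA-capable GPU (the Jetson Nano's 128-core Maxwell part would do); neither is specified in this article, and real deep learning workloads would of course use far more elaborate kernels or frameworks.

```python
# Minimal GPGPU sketch (assumes Numba plus a CUDA-capable GPU): element-wise
# vector addition, with each GPU thread handling one array index.
import numpy as np
from numba import cuda

@cuda.jit
def vector_add(a, b, out):
    i = cuda.grid(1)          # this thread's global index
    if i < out.size:          # guard threads that fall past the array's end
        out[i] = a[i] + b[i]

n = 1_000_000
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)

d_a = cuda.to_device(a)                # copy operands into GPU memory
d_b = cuda.to_device(b)
d_out = cuda.device_array_like(d_a)    # allocate the result on the GPU

threads_per_block = 128
blocks = (n + threads_per_block - 1) // threads_per_block
vector_add[blocks, threads_per_block](d_a, d_b, d_out)

result = d_out.copy_to_host()          # bring the result back to the CPU
assert np.allclose(result, a + b)
```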

High-capacity, inexpensive SSDs

Back in the early 1990s, I was part of a development team at Intel that designed, patented, and introduced one of the world's first flash memory-based hard drives, which we now know of as SSDs (solid-state drives). That trendsetter was based on dual-die 32 Mbit (yes, that's right folks … 4 MByte) NOR flash memory components, and given how many (or if you prefer, few) chips one could squeeze into a 2.5″ HDD form factor, you can imagine how small in capacity, and how expensive, the resultant drive was! At the time, I'm sure plenty of press and analyst folks, systems engineers, semiconductor competitors, and others chuckled at the seeming folly of our "tilting at windmills" against rotating magnetic hard drives.

Time, I’d argue, has borne out our vision, however. The 1 TByte 2.5″ SSD shown here, for example, is regularly advertised on sale for around $90; lower-end product variants from ADATA that dispense with the DRAM buffer ahead of the flash memory array are even less expensive. And SSDs are now pervasive, especially in mobile computing form factors, thanks to their inherent comparative ruggedness, low power consumption, and high random access speeds versus HDDs. In part, this dramatic cost-per-bit drop (along with the associated capacity growth) comes from the same earlier-mentioned lithography-shrink trends that have also benefitted logic devices, along with the ascendance of more silicon-efficient NAND flash memory. It’s been further accelerated by the ability to store multiple bits of information in each cell (I was also part of the first-generation Intel StrataFlash two-bit-per-cell product team), with latest-generation QLC (quad-level cell) chips capable of storing four bits per transistor. And then there’s “3D NAND,” which stacks multiple storage transistors vertically within the same planar area originally devoted to just one.
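As a quick worked comparison (my own back-of-envelope, using binary units throughout; drive makers quote decimal terabytes, so the exact ratio differs slightly), here's how a 1 TByte drive stacks up against one of those 4 MByte NOR chips, plus the density multiplier that multi-bit-per-cell storage alone contributes.

```python
# Back-of-envelope only: the 4 MByte figure corresponds to the 32 Mbit NOR
# devices mentioned above; the 1 TByte figure to the drive shown (binary units).
early_nor_chip_bytes = 4 * 1024**2     # 32 Mbit NOR flash chip ~= 4 MByte
modern_ssd_bytes = 1024**4             # 1 TByte 2.5-inch SSD

ratio = modern_ssd_bytes // early_nor_chip_bytes
print(f"1 TByte SSD vs. one early 4 MByte NOR chip: {ratio:,}x")  # 262,144x

# Bits stored per cell: SLC = 1, MLC = 2, TLC = 3, QLC = 4, so QLC alone
# quadruples capacity for a given cell count (before 3D stacking).
qlc_bits_per_cell, slc_bits_per_cell = 4, 1
print(f"QLC vs. SLC density multiplier: {qlc_bits_per_cell // slc_bits_per_cell}x")
```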

Subscription content

Roku streaming stick

It wasn’t so long ago that the music industry was decrying online streaming and other subscription services as its potential death knell. How times have changed … nowadays, per a recent report, streaming makes up 80 percent of that industry’s revenues (how much of that revenue actually makes its way to the artists and songwriters, however, is a different story).

Much the same goes for online video. I was back in my birth state (and city) for the Thanksgiving holidays, and was struck by how few movie theaters are left, and how scant the attendance is at the remaining venues. We saw a Thanksgiving evening showing of Joker and were three of only about 10 total attendees … the theater hadn't even bothered to staff the entrance to check tickets and confirm purchases (via Fandango, in our case). We could have strolled right in without paying. This particular theater seemed unchanged from its years-ago original design; its owners have likely decided not to invest in upgrades but instead to milk whatever profits they can for as long as they can. Conversely, other theaters have consciously ripped out rows of seats and replaced them with fewer, plusher chairs, now serve food and beverages right at those seats, have updated their projection rooms' sound systems, etc. Meanwhile, Disney and other content studios, which previously decried Netflix and other upstarts, are now joining them in the battle for online eyeballs.

As I said in the introduction, please sound off in the comments with your thoughts on my thoughts. And thanks as always for reading!

Brian Dipert is Editor-in-Chief of the Embedded Vision Alliance, and a Senior Analyst at BDTI and Editor-in-Chief of InsideDSP, the company’s online newsletter.

