ATSC 1.0 has finally hit a sustainable stride, but will ATSC 3.0 ever amount to anything?
Long-time readers may recall that I've long covered the development and roll-out of ATSC (Advanced Television Systems Committee) digital television in the United States, both conceptually and hands-on (not to mention teardowns), the latter in both California and Colorado, and in both fixed and mobile reception forms. And courtesy of the cord-cutting phenomenon, a growing number of folks still desirous of local news and other regional programming are discovering "free TV" courtesy of an antenna on the roof or wall and the ATSC tuner built into their television. So it seems that ATSC has now hit its stride; not a full-blown sprint, mind you, more like a comfortable lope, but still sustainable.
Technology doesn't tend to stand still, however, which is why the standards committee is now pushing ATSC 3.0 at us (with specs finalized in time for announcement at the 2018 CES, subsequent to formal approval by the U.S. FCC (Federal Communications Commission) the prior November). The obvious upfront question that many of you are likely thinking right now is, what happened to ATSC 2.0? Within that answer is the core of my skepticism as to whether ATSC 3.0 will ever amount to much, either ... but I'm getting ahead of myself.
Common to both ATSC 2.0 and 3.0, to quote Wikipedia, are the following attributes:
The standard will allow interactive and hybrid television technologies by connecting the TV with the Internet services and allowing interactive elements into the broadcast stream. Other features include advanced video compression, audience measurement, targeted advertising, enhanced programming guides, video on demand services, and the ability to store information on new receivers, including Non-realtime (NRT) content.
Not mentioned in the above quote, but documented elsewhere, was the fact that ATSC 2.0 was also intended to have more robust mobile reception support, thereby folding key lessons from the industry's past ATSC-M/H experiment into the spec. However, Wikipedia also accurately noted that ATSC 2.0 was intended to be "backward compatible with ATSC 1.0." This meant that it still used MPEG-2 as its video codec, for example, just with more sophisticated encoding ("advanced video compression") to make more efficient use of the available bitrate, along with sticking with Dolby Digital as the audio codec. It also meant that ATSC 2.0 retained 8VSB (8-level vestigial sideband) modulation for its physical layer, preserving compatibility with existing first-generation ATSC broadcast and reception equipment.
Sounds pretty good, right? Only one problem: the television manufacturers (and Blu-ray player manufacturers, and ... ) were evolving their technologies at the same time. Work on ATSC 2.0 began around the turn of the decade, but by the time spec finalization drew near, "4K" televisions and UHD (ultra-high definition) blue-laser optical media (along with a few satellite and streaming services) were becoming increasingly commonplace. Unfortunately, the combination of MPEG-2 and 8VSB, along with the constrained bitrate of ATSC broadcast, didn't allow for UHD presentations with the necessary frame rates and image quality that discerning consumers would demand ... especially with all-important fast-action sports and other similar programming.
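To see why the bitrate math doesn't work, here's a back-of-envelope sketch. The figures are illustrative assumptions, not spec numbers: a decent 1080i MPEG-2 feed at roughly 15 Mbps, UHD carrying 4x the pixels of 1080-line HD at (conservatively) double the frame rate, and HEVC assumed to be very roughly 4x as bitrate-efficient as MPEG-2.

```python
# Back-of-envelope check: could UHD ever fit in an ATSC 1.0 channel?
# All figures below are rough illustrative assumptions, except the
# 8VSB payload rate, which is fixed by the ATSC 1.0 physical layer.

ATSC1_PAYLOAD_MBPS = 19.39   # fixed 8VSB payload in a 6 MHz channel
HD_MPEG2_MBPS = 15.0         # assumed bitrate of a decent 1080i MPEG-2 feed

pixel_factor = 4.0           # 3840x2160 has 4x the pixels of 1920x1080
frame_rate_factor = 2.0      # e.g., 60 fps vs. 30 fps (conservative)
hevc_gain = 4.0              # assumed MPEG-2 -> HEVC efficiency gain

# Hypothetical UHD bitrate if broadcasters stayed with MPEG-2:
uhd_mpeg2_mbps = HD_MPEG2_MBPS * pixel_factor * frame_rate_factor
print(f"UHD over MPEG-2: ~{uhd_mpeg2_mbps:.0f} Mbps "
      f"(channel carries {ATSC1_PAYLOAD_MBPS} Mbps)")

# The same content compressed with HEVC instead:
uhd_hevc_mbps = uhd_mpeg2_mbps / hevc_gain
print(f"UHD over HEVC:  ~{uhd_hevc_mbps:.0f} Mbps")
```

Under these assumptions, MPEG-2 UHD would need on the order of 120 Mbps, six times what 8VSB can deliver, and even HEVC's ~30 Mbps still overshoots the fixed 19.39 Mbps payload. That's why ATSC 3.0 replaces the physical layer too, not just the codec.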
Enter ATSC 3.0, which dispenses with transmitter and receiver backwards compatibility in exchange for a switch to more bitrate-efficient OFDM (orthogonal frequency-division multiplexing) modulation in combination with variable-rate LDPC (low-density parity-check code) FEC (forward error correction). ATSC 3.0 also embraces more modern codecs: H.265 aka HEVC for video, along with Dolby AC-4 and MPEG-H 3D Audio as options for audio. The result? To quote Wikipedia, "Video channels of up to 2160p 4K resolution at 120 frames per second," along with support for "wide color gamut."
Perhaps obviously, given its lack of backwards compatibility, ATSC 3.0 content needs to be broadcast using a different UHF or VHF channel than that already in use for ATSC 1.0. And in fact, as part of its November 2017 approval process, the FCC explicitly stipulated that any transition from ATSC 1.0 to 3.0 would be voluntary, not mandatory as with the prior NTSC (analog)-to-ATSC transition. In fact, again quoting Wikipedia:
Stations that choose to deploy ATSC 3.0 services must continue to maintain an ATSC-compatible signal that is "substantially similar" in programming to their ATSC 3.0 signal (besides programming that leverages ATSC 3.0 features, and advertising), and covers the station's entire community of license (the FCC stated that it would expedite approval for transitions if the loss in over-the-air coverage post-transition is 5% or less). This clause will remain in effect for at least five years; permission from the FCC must be obtained before a full-power station can shut down its ATSC signal.
Other changes this time around include that "the FCC will not require ATSC 3.0 tuners to be included in new televisions, and there will not be a subsidy program for the distribution of ATSC 3.0-compatible equipment."
Program tweaks aside, there's notable implementation evidence available even at this early stage, and not just in the U.S., either. South Korea launched ATSC 3.0 service in May 2017, in advance of broadcasts during the 2018 Winter Olympics. And stateside, small-scale deployments (more like "experiments") are beginning; just the other day, for example, I saw an announcement that multiple Phoenix, AZ-area stations had banded together to time-share a single channel's spectrum (and associated equipment) for testing purposes, with the FCC's blessing.
So what's behind my earlier "skepticism" tease? Quite a few things, actually, most of them captured by a single phrase: "chicken-and-egg." Particularly given that the FCC this time around isn't going to make the transition to ATSC 3.0 mandatory, a lot of things are going to need to happen for it to hit critical mass:
- Content provider (i.e. network broadcaster) support with respect to creation and distribution of true UHD content, versus up-scaled HD live-capture and archive material
- Local broadcaster support with respect to purchases of new equipment and licenses (assuming that suitable channel spectrum, given terrain and existing allocations, still exists in a given market), along with a willingness to redundantly (and expensively) retain ATSC 1.0 broadcasts for a long time to come (in case anyone hasn't noticed, local broadcasters aren't exactly making tons of profit nowadays, even as is)
- TV manufacturers' willingness to balloon their equipment's bill-of-materials costs with even more receiver and decoder circuitry (and software licenses) than already exists
- And consumers' willingness to pull out their wallets and buy brand new, even more expensive ATSC 3.0-supportive televisions to replace the UHD displays they probably just bought recently ... or to tether yet another set-top box to their existing TVs
If any one of these four lags, critical mass flat-out won't happen. And focusing only on the consumer factor, I feel like I'm making the same argument I previously made against UHD-vs-HD Blu-ray, or even HD Blu-ray-vs-DVD (because I am). Unless your TV is really huge or you sit really close to it, you're not going to be able to tell any difference between a "true" UHD presentation and an HD version up-scaled upon reception at the TV. The vast majority of folks now with "4K" TVs, I'd in fact wager, aren't watching any native UHD material at all ... only up-scaled HD Blu-rays, satellite and cable TV broadcasts, and over-the-air ATSC 1.0 channels. So why would you drop a thousand dollars or more on a new ATSC 3.0 TV ... or even clutter your entertainment center with yet another HDMI-tethered standalone receiver?
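The "unless your TV is huge or you sit close" claim can be made quantitative with a little visual-acuity geometry. A common model puts 20/20 acuity at about 1 arcminute of resolvable detail; the 55-inch screen size and that acuity figure are illustrative assumptions, not measurements:

```python
import math

# Rough visual-acuity sketch: past what viewing distance can a 20/20
# eye no longer resolve individual pixels? Assumes a 16:9 panel and
# ~1 arcminute of resolvable detail (both simplifying assumptions).

ARCMIN = math.radians(1 / 60)  # 1 arcminute, in radians

def max_useful_distance_m(diagonal_in, horizontal_pixels):
    """Distance (meters) beyond which adjacent pixels blur together."""
    width_m = diagonal_in * 0.0254 * (16 / math.hypot(16, 9))
    pixel_m = width_m / horizontal_pixels
    return pixel_m / math.tan(ARCMIN)

for label, pixels in (("1080p HD", 1920), ("2160p UHD", 3840)):
    d = max_useful_distance_m(55, pixels)
    print(f'55" {label}: pixels merge beyond ~{d:.1f} m')
```

Under these assumptions, a 55-inch 1080p panel's pixels become unresolvable beyond roughly 2.2 meters, so from a typical couch, a native-UHD broadcast buys you nothing a good up-scaler doesn't.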
Don't think that mobile reception will be ATSC 3.0's salvation, either. Cellular service providers are loath to sanction anything that doesn't leverage their revenue bread-and-butter, the data plan. That's why there are plenty of smartphones out there today that have dormant FM receivers built into them, never to be activated; cellular carriers want you to get your music fix via lucrative-to-them data-streamed Pandora, Spotify, and the like instead. Of course, those smartphones don't have built-in FM reception antennas, either, but if you're a manufacturer, why would you bother cost-burdening your design for a feature that is never going to be allowed by your cellular service "partner" to be turned on?
Same goes for mobile television, as I've consistently prognosticated through my years' worth of coverage of ATSC-M/H and its competitors. Cellular service providers want you to get your video fix from Hulu, Netflix and the like ... all streamed via data services. Anything that bypasses that profitable path is anathema. And in cars? Cellular data is already there via smartphones, tablets, and even built-in hot spots. And anyway, the driver's not supposed to be watching TV while driving (like I've said before, folks, don't hold your breath on full autonomy's widespread arrival any time soon).
The NTSC-to-ATSC transition, like other past analog-to-digital conversions (LP to CD, VHS to DVD, legacy radio to Sirius XM, etc.), made lots of sense (and yes, the FCC mandate didn't hurt, either) ... you might no longer get a "fuzzy" picture in fringe-reception areas, but if you did receive a signal, it was crystal-clear. And the higher resolution and surround sound of ATSC were notable bonuses, too. This time around, on the other hand (and reminiscent of CD to SACD and DVD-Audio, and DVD to HD-DVD and Blu-ray), the incremental benefits are much less notable. And given the costs (and the timeframe over which they'd be incurred) necessary to make the transition happen, I seriously doubt it'll come to pass with the critical mass needed to ensure its longevity.
With that all being said, I'll also admit that I've been told a time-or-few in the past that I tend to be a conservative, skeptical curmudgeon when it comes to acceptance of the claimed promises of new technologies. So my mind is open to being changed ... well, a bit, at least. Agree or disagree, let me know your thoughts in the comments. Thanks-in-advance as always!
—Brian Dipert is Editor-in-Chief of the Embedded Vision Alliance, and a Senior Analyst at BDTI and Editor-in-Chief of InsideDSP, the company's online newsletter.