Category: The Front Line

HPA Tech Retreat 2019: 8K Is Here, Ready Or Not…

As I write this, the second day of the annual HPA Tech Retreat is underway. So far, we’ve learned about deep fakes, film restoration at 12 million frames per second, how to make solid cinema screens work as sound transducers, and how lucrative the market is for media developed for subway systems. Artificial intelligence is a big topic here, used for everything from analyzing film frames for color and gamma correction to flying drones that capture “point cloud” imagery for virtual backgrounds.

Indeed, artificial intelligence is becoming a valuable tool for searching video footage and finding clips, a much faster process than any conventional search using your eyeballs. TV manufacturers are relying more on basic forms of AI to analyze incoming video streams and perform a variety of transformations to scale and size it to Ultra HD (and eventually 8K) video screens.

In addition to my annual review of the Consumer Electronics Show, I presented a talk on “8K: How’d We Get Here So Quickly?” I casually tossed out this concept last fall when suggesting a session topic, and it was accepted. My research came up with a lot more points than could be fit into 20 minutes, but here are the takeaways:

(1) The migration from 4K to 8K is largely being driven by supply chain decisions in Asia. More specifically, the collapse of profitability in 4K panel and TV manufacturing is leading large Chinese fabs (TCL, Hon Hai, BOE) to build Generation 10.5 and 11 LCD fabs with the intent of cranking out 65-inch and larger 8K TV panels, anticipating over 5 million 8K TV shipments worldwide by 2022.

(2) There are more than a few 8K professional cameras, but all are using 4K lenses. Lenses to fit full-frame 8K sensors are way off in the future and will be challenging and expensive to manufacture, particularly zoom lens designs.

(3) Current display interfaces aren’t nearly fast enough for even basic 8K formats. Samsung’s 85-inch 8K offering is currently equipped with one HDMI 2.0 input (maximum 18 Gb/s), which is fast enough to support 8K (4320p) video @ 30Hz with 8-bit 4:2:0 color. That’s it. HDMI 2.1 won’t make an appearance on most TVs until 2020, and even LG’s 2019 models have to convert a v2.1 input into four v2.0 lanes to drive the displays. DisplayPort 1.4 is fast enough to handle 4320p/30 with 4:2:2 10-bit color, but that’s about it.
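As a sanity check on those numbers, the raw payload of an 8K signal is easy to estimate. The sketch below is a rough approximation (it ignores blanking intervals), but it shows why an 18 Gb/s HDMI 2.0 port tops out at 8K/30 with 8-bit 4:2:0 color: TMDS uses 8b/10b encoding, so only about 14.4 Gb/s of that 18 Gb/s actually carries video.

```python
# Approximate video payload for 8K formats vs. the HDMI 2.0 limit.
# Active pixels only -- blanking intervals are ignored, so real
# links need somewhat more headroom than shown here.

def payload_gbps(width, height, fps, bit_depth, chroma):
    """Raw video payload in Gb/s for a given chroma subsampling."""
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * fps * bit_depth * samples_per_pixel / 1e9

# TMDS 8b/10b encoding: 80% of the 18 Gb/s link rate is video payload
HDMI20_EFFECTIVE = 18.0 * 0.8   # 14.4 Gb/s

for fps, depth, chroma in [(30, 8, "4:2:0"), (60, 8, "4:2:0"), (60, 10, "4:2:0")]:
    rate = payload_gbps(7680, 4320, fps, depth, chroma)
    verdict = "fits" if rate <= HDMI20_EFFECTIVE else "does NOT fit"
    print(f"8K/{fps}, {depth}-bit {chroma}: {rate:.1f} Gb/s -> {verdict}")
```

Only the first format squeezes under the 14.4 Gb/s payload ceiling, which lines up with the single-format limitation on Samsung’s HDMI 2.0 input described above.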

(4) Newer codecs will be needed to pack down 8K signals into more manageable sizes. JPEG XS has been shown compressing 8K/60 10-bit 4:2:0 video by a ratio of 5:1 to fit the signal through a 10-gigabit network switch. For distribution, where higher latency is acceptable, HEVC (H.265) and the new Versatile Video Codec (VVC) will be required to do the heavy lifting.
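That 10-gigabit figure is easy to verify with the same kind of arithmetic. This sketch considers only the raw video payload (no network or blanking overhead), so the margin shown is approximate:

```python
# Checking the JPEG XS example: 8K/60 10-bit 4:2:0 compressed 5:1
# to fit through a 10-gigabit network switch port.

def payload_gbps(width, height, fps, bit_depth, samples_per_pixel):
    return width * height * fps * bit_depth * samples_per_pixel / 1e9

raw = payload_gbps(7680, 4320, 60, 10, 1.5)   # 4:2:0 = 1.5 samples/pixel
compressed = raw / 5                           # 5:1 JPEG XS ratio
print(f"raw: {raw:.1f} Gb/s, after 5:1 compression: {compressed:.1f} Gb/s")
```

Roughly 29.9 Gb/s of raw video shrinks to about 6 Gb/s, leaving comfortable headroom on a 10 GbE link.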

Most attendees don’t understand this mad rush to 8K, but in my talk I pointed out that 8K R&D has been going on for over 20 years and the first 8K camera sensors were shown at NAB in 2006 – thirteen years ago. Sharp exhibited an 85-inch 8K LCD display at CES in 2012 – 7 years ago. And we appear to be stuck on a 7-year cycle to the next-higher TV resolution, one that started way back in 1998 when the first 720p plasma TVs were coming to market.

Overshadowing everything is 8K content. Where will it come from? Probably not optical disc, but more likely from the cloud over fast networks. NHK launched an 8K Super Hi-Vision satellite channel last December for viewers in Japan, but that’s it. For that matter, does it even matter that we have 8K content? The scaling engines being shown on 2019 8K TVs make extensive use of artificial intelligence to re-size 4K, Full HD, and even standard definition video for viewing on an 8K set.

My closing point was that we should just stop obsessing over pixel resolution. Most viewers sit so far away that they would never spot the pixel structure on an Ultra HDTV, let alone 8K. Panel manufacturers may choose to push ever higher with pixels (Innolux showed a 15K display in August of 2018), but we should turn our attention to more important display metrics – color accuracy, consistent tone mapping with HDR content, and improved motion rendering, particularly with high frame rate (HFR) video on the way.

I’ve used the expression “building the plane while flying it” to describe the evolution of 4K and Ultra HD. It’s even more appropriate to describe the world of 8K: Some pieces are in place, others are coming, and some have yet to be developed and are years off.

Yet, here we go, ready or not…

HDMI 2.1 Update – Pretty Much Status Quo

Last Thursday, a joint press conference was held in New York City by the HDMI Licensing Administrator to update attendees on the latest version of HDMI – version 2.1.

V2.1, which was officially announced at CES in 2017, represents a quantum leap over earlier versions. It’s the first HDMI architecture to use a packet-based signaling structure, unlike earlier versions that employed transition-minimized differential signaling (TMDS). By moving to a packet transport (an architecture that, according to my sources, borrows heavily from DisplayPort), the maximum data rate could be expanded several-fold from the previous cap of 18 gigabits per second (Gb/s) to a stratospheric 48 Gb/s.

What’s more, the clock reference can now travel embedded in one of the four lanes. Previously, HDMI versions up to 2.0 were limited to three signal lanes and one clock lane. And of course, a digital packet-based signal stream lends itself well to compression, accomplished with VESA’s Display Stream Compression (DSC) system that is also the basis for Aptovision’s Blue River NT technology.

The HDMI Forum simply had to kick up the performance of the interface. Version 2.0, announced five years ago, was perceived by many (including me) to be too slow right out of the gate, especially when compared to DisplayPort 1.2 (18 Gb/s vs. 21.6 Gb/s). That criticism has proven prescient: Just half a decade later, Ultra HDTVs are rapidly approaching the unit shipment numbers of Full HD models, and the bandwidth demands of high dynamic range (HDR) imaging with wide color gamuts (WCG) need much faster highways, especially with RGB (4:4:4) color encoding and 10-bit and 12-bit color rendering.

And if we needed any more proof that a faster interface was overdue, along comes 8K. Samsung is already shipping an 8K TV in the U.S. as of this writing, and Sharp has introduced a model in Japan. LG’s bringing out an 8K OLED TV in early 2019, and Dell has a 32-inch 8K LCD monitor for your desktop.

To drive this point home, IHS analyst Paul Gagnon showed numbers that call for 430,000 shipments of 8K TVs in 2019, growing to 1.9 million in 2020 and to 5.4 million in 2022. 70% of those shipments are expected to go to China, with North America making up 15% market share and western Europe 7%. Presumably, at least one of the signal inputs on these TVs will support HDMI 2.1, as even a basic 8K video signal (60p, 10-bit 4:2:0) will require a data rate of about 36 Gb/s, while a 4:2:2 version demands 48 Gb/s – right at the red line. (DSC would cut both of those rates in half.)
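Those 36 and 48 Gb/s figures can be reproduced with a rough calculation. The 20% overhead factor below is my own assumption, standing in for blanking intervals and link encoding combined; the exact overhead depends on the video timing used.

```python
# Approximate link rate for 8K/60 formats, assuming ~20% combined
# overhead for blanking intervals and link encoding (an assumption;
# the true overhead depends on the video timing).

def link_gbps(width, height, fps, bit_depth, samples_per_pixel, overhead=1.2):
    return width * height * fps * bit_depth * samples_per_pixel * overhead / 1e9

r420 = link_gbps(7680, 4320, 60, 10, 1.5)   # 8K/60, 10-bit 4:2:0
r422 = link_gbps(7680, 4320, 60, 10, 2.0)   # 8K/60, 10-bit 4:2:2
print(f"4:2:0: {r420:.0f} Gb/s, 4:2:2: {r422:.0f} Gb/s")
print(f"with 2:1 DSC: {r420 / 2:.0f} Gb/s and {r422 / 2:.0f} Gb/s")
```

The 4:2:2 case lands right at HDMI 2.1’s 48 Gb/s ceiling, and even a modest 2:1 DSC ratio pulls both formats comfortably below it.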

Aside from stating that over 900 million HDMI-equipped devices are expected to ship in 2019 (including everything from medical cameras to karaoke machines), HDMI Licensing CEO Rob Tobias didn’t offer much in the way of real news. But I had a few deeper questions, the first of which was “Is there now native support for optical interfaces in the HDMI 2.1 standard?” (Answer – no, not yet.)

My next question was about manufacturers of V2.1 transmitter/receiver chipsets. Had any been announced that could actually support 48 Gb/s? According to Tobias, HDMI Forum member Socionext, a chip manufacturer in Japan, has begun production on said chipsets. I followed that reply up with a question about manufacturer support for DSC in televisions and other CE devices, but couldn’t get a specific answer.

Much of the discussion among panel members David Meyer (director of technical content for CEDIA), Brad Bramy (VP of marketing for HDMI LA), and Scott Kleinle (director of product management for Legrand, a supplier to the CEDIA industry) focused on future-proofing residential installations that use HDMI interconnects.

But why not just go optical for all HDMI 2.1 connections and guarantee future-proofing? The responses I got to my last question were mostly along the line of “The installer just wants it to work the first time.” Yes, there are faster (Ultra High Speed) HDMI cables available now to work with V2.1 connections. But an HDMI cable that has to run 20, 30, or 40 feet at over a GHz clock rate is a pretty fat cable!

Multimode fiber cable is inexpensive compared to Cat 6 cable, and the terminations are not difficult to install. Running strands of fiber through conduit, stone, and behind walls seems to be the most logical solution at the required speeds, and it’s certainly what I’d recommend to installers in the commercial AV market. Properly terminated, optical fiber works the first time and every time, and can run over a mile without significant signal degradation.

Once again, the HDMI Forum will have a booth at CES in the lower South Hall. With a new display wrinkle lurking in the shadows – high frame rate (HFR) video – there will be more upward pressure than ever on data rates for display connections. HDMI 2.1 may be up to the task (most likely aided by DSC), so I will be curious to see if there are any 8K/120 demos in Las Vegas. – PP

R.I.P For Home Theater Projectors?

Recent trends in large flat screen displays have me wondering if we are seeing the beginning of the end for home theater front projection. (We are already seeing pressure on front projection for commercial markets, but that’s a topic for another time.)

Earlier this month, both Samsung and LG announced they would release 80-inch-class 8K displays for the home. For Samsung, it’s an 85-inch 8K LCD with a quantum dot-enhanced backlight to support high dynamic range, while LG moves forward with an 88-inch 8K OLED, also HDR-compatible but not nearly as bright as the Samsung offering.

“Wait – what? 8K TVs for the home!?” you’re probably thinking. Yep, 8K is here, and wow, did it arrive in a hurry! That’s because the Chinese manufacturers have basically collapsed pricing in the Ultra HDTV market over just three short years. You’d be nuts NOT to buy a new Ultra HDTV with prices this low, as some models can be had with HDR support for just $9 per diagonal inch.

We already have an abundance of 80-inch-class Ultra HD flat screen displays and their prices are quite reasonable. A quick check of the Best Buy Web site shows Sony’s XBR85X850F for $3,999. It’s an 85-inch LCD with HDR and “smart” connectivity. The same page listed a Samsung QN82Q6FNAFXZA (82 inches, QLED) for $3,499 and Samsung’s UN82NU8000FXZA (82 inches, HDR, QLED) for $2,999.

Got a few more bucks in your pocket? For $19,999, you can have the new Samsung QN85Q900RAFXZA, a top-of-the-line 8K QLED TV. For $14,999, you can pick up LG’s OLED77W8PUA 77-inch OLED (not quite 80 inches, but close enough). (And for you cheapskates, there were several Ultra HDTVs in the 75-inch class for less than $2,500.)

Sony’s 85-inch XBR85X850F has the same retail price as a Full HD LCD projector did ten years ago. And you can lose the screen.

If you currently have a home theater, chances are the projection screen is in the range of 80 to 90 inches. Just two years ago, replacing that setup with a flat screen LCD would have been quite an expensive proposition. But today, you can purchase one of those 80+ inch beauties for less than what a 50-inch Pioneer Elite plasma would have cost ten years ago. (And 50 inches seems pretty small now, doesn’t it?)

When I last upgraded my home theater (around 2006-2007), I replaced a Sony CRT projector with a Mitsubishi HC5000 (later an HC6000). That was a Full HD 3LCD model with beautiful color management. I’ve thought about upgrading it over the years, even though I hardly use the theater anymore. But looking at these prices, I’d probably be better off just removing the projector and screen and moving to a one-piece flat screen setup.

There are a bunch of reasons why that would be a good idea. For one thing, I have a few older home theater projectors left in my studio, and all of them use short-arc lamps that contain metal halides and mercury. If I were to upgrade to a new projector, it would have to use an LED illumination system – and those are still more expensive with 4K resolution than flat screen TVs.

Second, I could get rid of my 92-inch projection screen and hang some more art on the wall. (It previously replaced an 82-inch screen, and frankly, that was large enough for the room.) I could also eliminate a ceiling power and AC connection and a bunch of wiring from my AV receiver. All of that stuff would be consolidated in a small space under the new TV. (Who knows? I might even go ‘commando’ and just use a soundbar/subwoofer combination!)

I’m sure I’m not the only person who (a) built a home theater in the late 1990s, (b) upgraded the main family room/living room TV to a large, cheap flat screen a decade later, and (c) now spends more time watching that family/living room TV than using the home theater. Mitsubishi exited the projector business almost eight years ago, so I’d never be able to get my 6000 fixed. (But I hardly use it anyway, so who cares?)

Even a 75-inch TV would work, and there are plenty of those available at bargain-basement prices. Hisense showed an HDR Ultra HD model (75EU8070) for just a hair over $1,000 and Vizio’s E75-E3 will set you back only $300 more. For those prices, you can hardly go wrong – if you don’t like it a year from now, just recycle it and buy a new one (for less money).

There’s a parallel trend in movie theaters, where the first fine-pitch LED displays are making tentative steps toward replacing high-powered projectors. Pacific Theatres Winnetka in Chatsworth, California installed a 34×17 Samsung fine-pitch LED screen last year and claims it can hit higher levels of peak brightness (3,000 – 4,000 cd/m2 shouldn’t be difficult) for true high dynamic range. And of course, LEDs can achieve an enormous color gamut and very deep blacks when switched off – characteristics of emissive displays.

With ongoing developments in LED technology, we’re likely to see more theaters adopt the LED platform – no projection lamp to replace, because there’s no projector to operate. There are issues about aspect ratios and content formatting to resolve, but we figured them out for digital cinema when we turned our backs on motion picture film.

So why not have our home theater work the same way and get rid of the projector? For that matter, it’s possible and even likely within a decade that LCD and OLED TVs will both be replaced by fine-pitch or ‘micro’ LED displays, giving us the same experience as a state-of-the-art theater.

And home theater projectors will wind up as curiosities of an earlier age, like Super 8mm and slide projectors…something Grandpa and Grandma used, along with optical disc players…

Spectrum Repacking and Channel Scans

In the wake of last year’s big spectrum auction, the FCC is chopping even more spectrum away from UHF TV stations and expecting (somehow) to jam all the remaining TV stations into low band VHF (2-6), high band VHF (7-13), and truncated UHF (14-36) channels.

In my neighborhood, stations are already packing up and moving. While conducting a recent test of a “smart” indoor UHF TV antenna, I grabbed some spectrum analyzer plots of all three television frequency bands. As expected, the RF spectrum from 54 to 88 MHz (channels 2-6) was largely unusable due to high levels of impulse and man-made noise.

The high band VHF spectrum wasn’t much better, with some continuous RFI kicking up the noise floor by almost 20 dB. But it was the UHF spectrum I was interested in, and several former broadcasters were noticeable by their absence. Channels 29, 35, and 39 – previously in use for Univision, independent, and PBS stations – had all gone dark.

To get around the lack of available channels, TV stations are “channel sharing,” something the FCC frowned on as recently as a decade ago. What that means is that stations divvy up the available bits in an MPEG-2 encoder and multicast several minor channels on one physical RF channel. This technique was almost impossible to pull off twenty years ago, when digital TV broadcasts and HDTV were just getting started.

Now, thanks to very powerful processors and tricks like adaptive variable bitrate encoding and statistical multiplexing (a/k/a “stat muxing”), it’s not difficult at all – although the jury is still out on the quality of HD and SD video at bit rates much lower than were possible in 1998. NBC has done this in Philadelphia and New York, combining Telemundo channels with NBC programming and making room for one HD service from each.
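To make the bit-budget arithmetic concrete, here is a simple, static illustration of one shared RF channel. The 19.39 Mb/s payload is fixed by the ATSC 1.0 standard; the per-service splits below are hypothetical, and a real stat mux reallocates them moment to moment based on scene complexity.

```python
# Hypothetical division of one ATSC 1.0 channel's fixed 19.39 Mb/s
# payload among channel-sharing services. A real statistical
# multiplexer adjusts these allocations dynamically.
ATSC_PAYLOAD_MBPS = 19.39

services = {
    "HD main service": 11.0,
    "SD subchannel 1": 3.5,
    "SD subchannel 2": 3.0,
    "audio, PSIP, and overhead": 1.5,
}

used = sum(services.values())
for name, mbps in services.items():
    print(f"{name}: {mbps:.1f} Mb/s")
print(f"total: {used:.2f} of {ATSC_PAYLOAD_MBPS} Mb/s "
      f"({ATSC_PAYLOAD_MBPS - used:.2f} Mb/s headroom)")
```

Squeeze in a fourth or fifth minor channel and something has to give – which is exactly the quality question raised above.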

Locally, an independent station in Allentown (WFMZ) will relinquish its 5-megawatt signal on UHF-46 and move to VHF-9, sharing bits with WBPH and the Lehigh Valley PBS station, WLVT (formerly on channel 39). This will have happened by the time you read this column and I’ll be curious to see just how much image quality has deteriorated for each minor channel after the new transmitter lights up.

Keep in mind that many stations auctioned off their channels in return for a nice payday. Public stations in particular pocketed some serious change, money that went into facilities upgrades and balancing their budgets. If their multicast services hold up well with the latest in MPEG-2 encoding, they’ll come out of this smelling like a rose.

What this means to you as an OTA viewer is that you’ll need to re-run channel scans to catch all of these moves – otherwise, you’ll tune to a channel that has gone dark and stand there scratching your head in bewilderment. I’d perform a channel scan twice a month from now through the end of the year. (You might also pick up some newer, low-power translators and repeaters along the way, and you may find some channels are gone for good.)