Posts Tagged ‘Ultra HD’

True Facts

A recent post on LinkedIn led me to write this, and it has to do with 4K video and imaging. Or, at least how marketing types have redefined it as “True” 4K or “Faux” 4K.

The post in question had to do with a projector manufacturer’s 4K offerings and how other manufacturers may not be offering a “true” 4K product by comparison, calling those other products “faux” 4K (or “faux” K to be clever). That prompted more than a few comments about what “true” 4K is in the first place.

One comment pointed out that the projector brand behind the original post doesn’t even have a “true” 4K imaging device in its projector, as it uses Texas Instruments’ .66” DMD with 2716×1528 micromirrors and requires image shifting to create an image with full 4K resolution. (Some irony in that?)

Now, I know more than a few marketing folks in the AV industry, and they work very hard and diligently to promote their company’s products. However, sometimes they step out of bounds and create more confusion, particularly with new technologies. Which, by the way, was one reason I started teaching technology classes at InfoComm and other trade shows two decades ago – as a way to counter marketing hype with facts.

What, exactly, is “true” 4K? If you use spatial resolution as your benchmark, then your imager must have at least 4,000 pixels of horizontal resolution. The fact is, very few displays today offer that much resolution, save for a limited number of digital cinema projectors, a handful of home theater projectors, and a small selection of reference and color grading monitors. All of which will set you back quite a few $$$.

Most displays that are lumped into the 4K category are really Ultra HD displays, having a fixed resolution of 3840 horizontal and 2160 vertical pixels. This would include every so-called 4K consumer TV, many digital signage displays, and production monitors. Are they “true” 4K? Going by spatial resolution, no.

What makes things even more confusing is projection specsmanship. Sony’s original SXRD projectors had Ultra HD resolution. Although Epson has shown a prototype HTPS LCD chip with UHD resolution, they’ve never brought it to market. And the only DMD that Texas Instruments makes with 4K resolution is the 1.38” dark chip they sell into the digital cinema marketplace.

What projector manufacturers do instead to get to 4K is to use lower-resolution chips and shift the image with very fast refresh rates to effectively create 4K images. I’ve seen demos of the .66” DMD creating 4K images vs. a native UHD imager and you can see the difference between native and shifted images, particularly with fine text and detail. But it represents a low-cost way to get something approaching UHD resolution.
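For a rough sense of the math behind pixel shifting, here is a back-of-the-envelope sketch in Python. It is purely illustrative; the mirror count is the 2716×1528 figure cited above, and everything else is simple arithmetic:

# Why a 2716 x 1528 DMD plus diagonal pixel shifting can claim "4K" output.
# Illustrative arithmetic only, not vendor specifications.
dmd_w, dmd_h = 2716, 1528          # micromirrors on the 0.66" DMD
uhd_w, uhd_h = 3840, 2160          # Ultra HD pixel grid

mirrors = dmd_w * dmd_h            # ~4.15 million physical mirrors
shifted_positions = mirrors * 2    # two diagonally offset sub-frames per frame

print(f"Physical micromirrors:       {mirrors:,}")
print(f"Addressable positions (x 2): {shifted_positions:,}")
print(f"Native UHD pixels:           {uhd_w * uhd_h:,}")

The two shifted sub-frames address roughly as many positions as a UHD frame contains (about 8.30 million versus 8.29 million), but each sub-frame is still built from only ~4.15 million distinct mirrors – which is why fine text and detail look softer than on a native UHD imager.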

Panasonic also did this with their PT-RQ32U 4K DLP projector, using devices with 2560×1600 resolution and mapping quadrants to get to 5120×3200 total pixels. Presumably, they’ve retained this trick on their newer 4K models shown at InfoComm 2019.

Is that “true” 4K? Not when it comes to spatial resolution. But what if you base your claims on each finished frame of video, after all sub-fields are created? In that case, you could argue that your device is actually creating 4K video. Since our eyes can’t keep up with refresh rates much past 60 Hz, we’re not likely to see any flicker from this technique (also known as “wobbulation” and used by such luminaries as JVC and Hewlett-Packard on their display products in the past).

In fact, Digital Projection’s Insight Laser 8K projector employs three 1.38” dark chip DMDs and some clever image shifting to get from native 4096 x 2160 resolution to 8K (presumably 8192 x 4320 pixels in the finished images). Native 8K DMDs don’t exist, and like 8K camera sensors, wouldn’t come cheap if they did. Scaling down, it would make no sense financially to try and ship single-chip 4K DLP projectors with the 1.38” 4K DMD, not to mention the optical engine would have to be a lot larger, resulting in a bigger and heavier projector.

At this point, we should stop using the nomenclature “4K” altogether and switch to the more accurate CTA designation for Ultra HD (3840 x 2160) when we talk about the next generation of displays past Full HD (1920 x 1080) and 2K (2048 x 1080). Also, SMPTE designates two sets of resolutions that go beyond Full HD – UHD-1, or anything up to and including 3840 x 2160, and UHD-2, anything beyond UHD-1 up to 8K (7680 x 4320) and beyond.
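If it helps keep the designations straight, here is a small Python reference of the pixel grids mentioned above (the labels follow the CTA and broadcast designations cited in the text):

# Pixel grids for the display formats discussed above.
FORMATS = {
    "Full HD":          (1920, 1080),
    "2K (DCI)":         (2048, 1080),
    "Ultra HD (UHD-1)": (3840, 2160),
    "4K (DCI)":         (4096, 2160),
    "8K (UHD-2)":       (7680, 4320),
}

for name, (w, h) in FORMATS.items():
    print(f"{name:18s} {w} x {h}  ({w * h / 1e6:.2f} megapixels)")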

From my perspective: if your imaging device can show me a complete frame of video with at least 3840 x 2160 pixels, refreshed at 60 Hz, then I’m okay with calling it UHD (NOT 4K). But there’s a catch: High frame rate video is going to be a big thing with UHD-1 and UHD-2 and will require refresh rates of 90, 100, 120 Hz, and even 240 Hz. Can your current projector show me a complete video frame with at least 3840 x 2160 pixels of spatial resolution when refreshed at 240 Hz? 120 Hz?

Boy, I can hardly wait for 8K projector marketing campaigns to start…

(This article originally appeared on 9/19/2019 in Display Daily.)

R.I.P For Home Theater Projectors?

Recent trends in large flat screen displays have me wondering if we are seeing the beginning of the end for home theater front projection. (We are already seeing pressure on front projection for commercial markets, but that’s a topic for another time.)

Earlier this month, both Samsung and LG announced they would release 80-inch-class 8K displays for the home. For Samsung, it’s an 85-inch 8K LCD with quantum dot backlights for supporting high dynamic range, while LG moves forward with an 88-inch 8K OLED, also HDR-compatible but not nearly as bright as the Samsung offering.

“Wait – what? 8K TVs for the home!?!?” you’re probably thinking. Yep, 8K is here, and wow, did it arrive in a hurry! That’s because the Chinese manufacturers have basically collapsed pricing in the Ultra HDTV market over just three short years. You’d be nuts NOT to buy a new Ultra HDTV with prices this low, as some models can be had with HDR support for just $9 per diagonal inch.

We already have an abundance of 80-inch-class Ultra HD flat screen displays and their prices are quite reasonable. A quick check of the Best Buy Web site shows Sony’s XBR85X850F for $3,999. It’s an 85-inch LCD with HDR and “smart” connectivity. The same page listed a Samsung QN82Q6FNAFXZA (82 inches, QLED) for $3,499 and Samsung’s UN82NU8000FXZA (82 inches, HDR, QLED) for $2,999.

Got a few more bucks in your pocket? For $19,999, you can have the new Samsung QN85Q900RAFXZA, a top-of-the-line Ultra HD QLED TV. For $14,999, you can pick up LG’s OLED77W8PUA 77-inch OLED (not quite 80 inches, but close enough). (And for you cheapskates, there were several Ultra HDTVs in the 75-inch class for less than $2,500.)

Sony’s 85-inch XBR85X850F has the same retail price as a Full HD LCD projector did ten years ago. And you can lose the screen.

If you currently have a home theater, chances are the projection screen is in the range of 80 to 90 inches. Just two years ago, replacing that setup with a flat screen LCD would have been quite an expensive proposition. But today, you can purchase one of those 80+ inch beauties for less than what a 50-inch Pioneer Elite plasma would have cost ten years ago. (And 50 inches seems pretty small now, doesn’t it?)

When I last upgraded my home theater (which was around 2006-2007), I replaced a Sony CRT projector with a Mitsubishi HC5000 (later an HC6000). That was a Full HD 3LCD model with beautiful color management. I’ve thought about upgrading it over the years even though I hardly use the theater anymore. But looking at these prices, I’d probably be better off just removing the projector and screen and moving to a one-piece flat screen setup.

There are a bunch of reasons why that would be a good idea. For one thing, I have a few older home theater projectors left in my studio and all of them use short-arc lamps that contain mercury and metal halides. If I were to upgrade to a new projector, it would have to use an LED illumination system – and those are still more expensive with 4K resolution than flat screen TVs.

Second, I could get rid of my 92-inch projection screen and hang some more art on the wall. (It previously replaced an 82-inch screen, and frankly, that was large enough for the room.) I could also eliminate the ceiling-mounted AC power connection and a bunch of wiring from my AV receiver. All of that stuff would be consolidated in a small space under the new TV. (Who knows? I might even go ‘commando’ and just use a soundbar/subwoofer combination!)

I’m sure I’m not the only person who (a) built a home theater in the late 1990s, (b) upgraded the main family room/living room TV to a large, cheap flat screen a decade later, and (c) now spends more time watching that family/living room TV than using the home theater. Mitsubishi exited the projector business almost eight years ago, so I’d never be able to get my HC6000 fixed. (But I hardly use it anyway, so who cares?)

Even a 75-inch TV would work, and there are plenty of those available at bargain-basement prices. Hisense showed an HDR Ultra HD model (75EU8070) for just a hair over $1,000 and Vizio’s E75-E3 will set you back only $300 more. For those prices, you can hardly go wrong – if you don’t like it a year from now, just recycle it and buy a new one (for less money).

There’s a parallel trend in movie theaters, where the first fine-pitch LED displays are making tentative steps toward replacing high-powered projectors. Pacific Theatres Winnetka in Chatsworth, California installed a 34-by-17-foot Samsung fine-pitch LED screen last year and claims it can hit higher levels of peak brightness (3,000 – 4,000 cd/m2 shouldn’t be difficult) for true high dynamic range. And of course, LEDs can achieve an enormous color gamut and very deep blacks – individual pixels simply switch off – both characteristics of emissive displays.

With ongoing developments in LED technology, we’re likely to see more theaters adopt the LED platform – no projection lamp to replace, because there’s no projector to operate. There are issues about aspect ratios and content formatting to resolve, but we figured them out for digital cinema when we turned our backs on motion picture film.

So why not have our home theater work the same way and get rid of the projector? For that matter, it’s possible and even likely within a decade that LCD and OLED TVs will both be replaced by fine-pitch or ‘micro’ LED displays, giving us the same experience as a state-of-the-art theater.

And home theater projectors will wind up as curiosities of an earlier age, like Super 8mm and slide projectors…something Grandpa and Grandma used, along with optical disc players…

Blu-Ray: On The Endangered Species List?

One of the problems with market research is that you often wind up with conflicting data from two or more sources. Or, the data presents a “conclusion” that’s all too easy to “spin” to advance an argument or make a point.

Ever since the two adversaries in the blue laser optical disc format squared off with pistols at twenty paces in 2008 (and one lost), the clear trend of media consumption has favored streaming and digital downloads. Entire business models have collapsed as a result, including Hollywood Video and Blockbuster Video sales and rental stores. The last two Blockbuster outlets in Alaska are closing, leaving just one solitary brick-and-mortar operation in Oregon.

With Netflix now serving over 100 million subscribers around the world and Amazon rumored to be working on a smart TV for delivering Prime video, the tide hasn’t stopped rising. Purchases of digital downloads and streaming media surpassed physical media in dollar value way back in 2015 and the gap continues to widen as more customers take advantage of fast broadband, smarter DVRs, and improved codecs for reliable delivery of Full HD AND 4K video over networks.

My industry colleague Greg Tarr recently posted a story on the HD GURU Web site quoting NPD Group analyst Stephen Baker as saying that, “…Ultra HD Blu-ray player sales increased by more than 150% over 2017 and the revenue is up 61%. The ASP [average selling price] is $165 this year compared to $272 for the first 5 months of 2017.” Baker further pointed out that sales of Ultra HD Blu-ray players in the United States increased 82% in May and revenue increased 13% with an ASP of $168. NPD estimates that 4K Ultra HD players represented about 15% of Blu-ray unit sales for the first five months of 2018.

Well, that certainly sounds like great news, doesn’t it? But some perspective is in order.

First off, all of these $168 players (which cost north of $300 to $500 not long ago) also have built-in WiFi connections and can stream content from the likes of Netflix, Amazon, YouTube, and Hulu. And of course, they’re backward-compatible with standard Blu-ray, DVD, and CD audio formats.

Given the ridiculously low prices on Ultra HDTVs these days (such as 55-inch models with HDR10 support for as low as $450), many consumers may simply be in a major TV and home entertainment upgrade cycle. I bought my first 1080p TV in 2008, a 42-inch Panasonic plasma, for about $1200. And I’m now ready to upgrade from a 2012-vintage, 47-inch 1080p LCD model to a 55-inch or 60-inch smart 4K set, which with HDR support will cost me about as much as that 42-inch Panasonic from 2008.

Will I pick up an Ultra HD player too? Hey, for $150, why not? And will I watch a lot of UHD Blu-ray discs on it? Probably not, since I will be able to stream Netflix and Prime video at 4K resolution. Will that streamed 4K content look as good as a physical disc playing out at more than 100 Mb/s? Maybe not, but on the other hand, I won’t have to buy or rent any more discs. And based on my experience the other night watching “The Catcher Was A Spy” from Amazon Prime, I will be quite happy with the result.

Yes, you can buy a 4K TV at Shop Rite, available in the bread aisle. (Photo courtesy Norm Hurst)

As the saying goes, facts are stubborn things. The facts are: physical media sales have been in slow, steady decline for over a decade (and continue to decline), and Ultra HD BD disc sales constitute a small portion of overall media consumption. For that matter, so do sales of players: Research firm Futuresource predicts that global UHD Blu-ray player unit shipments should hit just 2.3 million units, with more than 50% of those sales taking place in North America.

To put that in perspective, ABI Research forecasts that worldwide Ultra HD flat panel TV shipments will surpass 102 million in 2018, representing 44% of all WW flat panel TV shipments (about 232 million). So even with “record” sales growth, Ultra HD Blu-ray player sales will only constitute about 2.2% of Ultra HDTV sales, with the bulk of those player sales taking place in North America and Europe.

ABI also predicts that just shy of 200 million Ultra HDTVs will be sold in 2023 worldwide, with the majority taking place in China (which doesn’t use our Blu-ray format but instead relies on “China Blue,” the old HD-DVD standard). Coincidentally, Tarr’s article states that, “…market research predicts that blue laser optical disc player shipments will decrease from 72.1 million in 2017 to 68 million in 2023. Unit shipments for the global Blu-ray media market are expected to decrease from 595 million in 2017 to 516 million in 2023.”

That trend would seem to be at odds with TV purchases, according to an April press release from Futuresource. “We believe 4K UHD TV sets will ship over 100 million units this year, equivalent to two-thirds of the entire large screen market,” comments David Tett, Market Analyst at Futuresource Consulting. “Consumers increasingly want larger screens, and this is playing nicely into the 4K UHD proposition. HDR is expected to be present in 60% of 4K UHD sets this year.”

Digesting all of this data reveals that (a) 4K TV sales continue to grow worldwide (which is also being driven by a changeover from Full HD to 4K TV fab production, but that’s another story), (b) 4K TV sales will constitute an ever-larger percentage of overall TV sales by 2023 – if not close to 90%, (c) more and more consumers are streaming and downloading digital video rather than purchasing optical discs, (d) even with strong sales through the first six months of 2018, Ultra HD Blu-ray players are selling at a rate of just two for every 100 Ultra HDTVs purchased, and (e) overall sales of Blu-ray players of all kinds are in steady decline.

I fully expect to hear all of the arguments for UHD Blu-ray, picture quality being one of them. But if I can stream UHD content with HDR at acceptable quality levels, why do I need to buy discs? I’ll have access to an enormous cloud library and I’ll be more environmentally conscious, too. Besides, I rarely watch a movie more than once (look at the piles of old DVDs people try to get rid of at garage sales or foist on libraries). There’s plenty of good content available from video-on-demand.

Ultra HD video content with HDR @ 16 Mb/s that looks as good as UHD Blu-ray? Yep, Fraunhofer showed it at NAB 2016.

And UHD BD supporters neglect to consider all of the continual advancements being made with codecs. A couple of years ago, Fraunhofer showed absolutely stunning Ultra HD video with dynamic HDR on a 65-inch UHDTV, encoded with HEVC H.265 at an average bit rate of 16 Mb/s – 15% of the peak streaming rate for Ultra HD Blu-ray – and they were encoding tricky stuff like confetti, wind-whipped waves, and moving objects with plenty of changing specular highlights. All heavy lifting.

Granted, it took two computers to do the software encoding and decoding. But those two computers can easily be reduced to a set of chips with firmware and a powerful CPU and installed inside my next TV.

So what would I need an optical disc player for?

Heads Up! Here Comes 8K TV (or, The Case Of The Amazing Vanishing Pixels)

Yes, you read that right: 8K displays are coming. For that matter, 8K test broadcasting has been underway in Japan since 2016, and several companies are developing 8K video cameras to be shown at next month’s NAB show in Las Vegas.

“Hold on a minute!” you’re probably thinking. “I don’t even own a 4K TV yet. And now they’re already on the endangered species list?”

Well, not exactly. But two recent press releases show just how crazy the world of display technology has become.

The first release came from Insight Media in February and stated that, “The 2020 Tokyo Olympics will be a major driver in the development of 8K infrastructure with Japanese broadcaster NHK leading efforts to produce and broadcast Olympic programming to homes…cameras from Hitachi, Astrodesign, Ikegami, Sharp and Sony address the many challenges in capturing 8K video…the display industry plans for massive expansion of Gen 10.5 capacity, which will enable efficient production of 65″ and 75″ display panels for both LCD and OLED TV…. sales of 8K Flat Panel TVs are expected to increase from 0.1 million in 2018 to 5.8 million in 2022, with China leading the way representing more than 60% of the total market during this period.”

Read it again. Almost 6 million 8K LCD and OLED TVs are expected to be sold four years from now, and over 3 million of those sales will be in China.

But there’s more. Analyst firm IHS Markit issued their own forecasts for 8K TV earlier this month, predicting that, “While ultra-high definition (UHD) panels are estimated to account for more than 98 percent of the 60-inch and larger display market in 2017, most TV panel suppliers are planning to mass produce 8K displays in 2018. The 7680 x 4320-pixel resolution display is expected to make up about 1 percent of the 60-inch and larger display market this year and 9 percent in 2020.”

According to IHS Markit, companies with skin in the 8K game include Innolux, which will supply 65-inch LCD panels to Sharp for use in consumer televisions and in commercial AV displays. Meanwhile, Sharp – which had previously shown an 85-inch 8K TV prototype – will ramp up production of a new 70-inch 8K LCD display (LV-70X500E) in their Sakai Gen 10 LCD plant. This display was shown in Sharp’s booth at ISE, along with their new 8K video camera.

Sharp showed this 8K camera (8C-B60A) at ISE…


…feeding this 70-inch 8K LCD monitor (LV-70X500E), a new glass cut from the Sakai Gen 10 fab.

Sony and Samsung are also expected to launch 8K LCD TVs this year. Both companies showed prototypes at CES with Samsung’s offering measuring about 85 inches. Sony’s prototype also measured 85 inches but included micro light-emitting diodes (LEDs) in the backlight to achieve what Sony described as “full high dynamic range,” achieving peak (specular) brightness of 10,000 nits. (That’ll give you a pretty good sunburn!)

Other players in 8K include LG Display, who already announced an 88-inch 8K OLED TV prior to CES, and panel fabricators BOE, AUO, and China Electronics Corporation (CEC). What’s even more interesting is that some of these 8K LCD and OLED panels will be equipped with indium gallium zinc oxide (IGZO) switching transistors.

No, IGZO isn’t a cure for aging. But what it does is provide much higher pixel density in a given screen size with lower power consumption. More importantly, it will allow these 8K TVs to refresh their pictures as fast as 120 Hz – double the normal refresh rate we use today. And that will be important as High Frame Rate (HFR) video production ramps up.

LG Display’s 88-inch 8K OLED display was a real eye-catcher at CES 2018.

Predictably, prices for TVs and monitors using panels with 4K resolution are collapsing. In the AV channel, 4K (Ultra HD) displays are only beginning to show up in product lines, but manufacturers are well aware of pricing trends with Ultra HD vs. Full HD (1920x1080p). With some consumer models now selling for as little as $8 per diagonal inch, the move from Full HD to 4K / Ultra HD will pick up lots of steam.

And with 8K displays now becoming a ‘premium’ product, 4K / Ultra HD will be the ‘everyday’ or mainstream display offering in screen sizes as small as 40 inches and as large as – well, you name it. We’ve already seen 84-inch, 88-inch, and 98-inch commercial displays, and prototypes as large as 120 inches – yes, 10’ of diagonal screen, wrap your head around that – have been exhibited at CES and other shows.

We saw quite a few demonstrations of 4K commercial displays at ISE and expect to see a whole lot more at InfoComm in June, along with the inevitable price wars. And there will be the usual “my encoder handles 4K better than yours with less latency” battles, shoot-outs, and arguments. But that could ultimately turn out to be the appetizer in this full-course meal.

For companies manufacturing signal distribution and switching equipment, 4K / Ultra HD already presents us with a full plate. 8K would be too much to bite off at present! Consider that an 8K/60 video signal using 12-bit RGB color requires a data rate approaching 100 gigabits per second (Gb/s), as compared to a 12-bit, 60 Hz Full HD signal’s rate of about 6 Gb/s, and you can see we will have some pretty steep hills to climb to manage 8K.
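If you want to check those numbers yourself, here is a quick Python sketch of the uncompressed payload rates (active pixels only; real links add blanking and line-coding overhead, which pushes the totals higher):

# Rough uncompressed video payload rates, ignoring blanking and link overhead.
def payload_gbps(w, h, fps, bits_per_component, components=3):
    return w * h * fps * bits_per_component * components / 1e9

print(f"Full HD, 60 Hz, 12-bit RGB: {payload_gbps(1920, 1080, 60, 12):5.1f} Gb/s")
print(f"UHD, 60 Hz, 12-bit RGB:     {payload_gbps(3840, 2160, 60, 12):5.1f} Gb/s")
print(f"8K, 60 Hz, 12-bit RGB:      {payload_gbps(7680, 4320, 60, 12):5.1f} Gb/s")

That works out to roughly 4.5, 17.9, and 71.7 Gb/s respectively before overhead – hence the “approaching 100 Gb/s” figure for 8K once blanking and link coding are added.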

Distributing 8K over a network will be equally challenging and will require switching speeds somewhere north of 40 Gb/s even for a basic form of 8K video, which (we assume) will also incorporate high dynamic range and wide color gamuts. 40 Gb/s switches do exist but are pricey, and full-rate 8K signals would need to be compressed by a factor of at least 2.5:1 to be manageable. And they’d certainly use optical fiber for all their connections.

To be sure, 4K / Ultra HD isn’t on the endangered species list just yet. (For that matter, you can still buy Full HD monitors and TVs, if that’s any comfort.) But whether it makes sense or not – or whether we’re ready or not – it’s “full speed ahead” for 8K displays as we head into the third decade of the 21st century…

High Dynamic Range: It’s Here!

Ever since the launch of high definition television in the 1990s, it seems as if some new ‘bell and whistle’ enhancement comes along every few years. First it was the changeover to flat screen plasma displays in the late 1990s, followed by a shift to 1080p and Wide UXGA resolution in the early 2000s.

The industry transitioned to liquid-crystal display (LCD) panels for TVs and monitors a few years later. UHD (4K) imaging popped into our consciousness in 2012. And of course, 3D made its “once every few sunspot cycles” appearance in 2009, followed by virtual reality last year.

Some of these trends actually stuck, like 4K: Display manufacturers are slowing down production of Full HD (1920×1080) display panels in favor of UHD (3840×2160) as consumers increasingly choose the higher resolution. That, in turn, means that the displays we select for digital signage, classrooms, meeting rooms, and other applications will also be of the 4K variety.

The latest trend to rear its head is high dynamic range (HDR), which is accompanied by wide color gamut (WCG) imaging. In a nutshell, HDR means a greatly expanded range of tonal values that can be shown in still and moving images. Conventional cameras and imaging systems can capture anywhere from 9 to 11 f-stops of light. (Each f-stop increase represents a luminance value twice as bright as the previous one.)

HDR takes that to a higher level by capturing as many as 22 f-stops of light, and reproducing those tonal values becomes a real challenge to displays that employ conventional backlight or illumination systems. Hence, we are now seeing a new crop of LCD TVs with turbocharged backlights to reproduce the extreme dynamic ranges of HDR images. On the emissive display side, organic light-emitting diode (OLED) TVs can also reproduce HDR content, although with lower peak brightness levels.
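Since each f-stop doubles the luminance, the captured dynamic range grows exponentially with the number of stops. A quick Python illustration:

# Each f-stop doubles luminance, so N stops correspond to a 2**N : 1 contrast range.
for stops in (9, 11, 22):
    print(f"{stops:2d} stops -> {2**stops:>9,} : 1")

Nine stops is about 512:1, eleven stops about 2,048:1, and 22 stops roughly 4,194,304:1 – which is why HDR content overwhelms displays built around conventional backlight and illumination systems.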

At NAB 2017, NEC showed this 4K HDR encoder prototype, streaming 77 Mb/s with 99 ms latency.

For some perspective, the venerable CRT display had a peak brightness level somewhere around 29 foot-Lamberts (100 candelas per square meter), which represented close to 100% diffuse white. In an HDR display, that value largely holds, but more intense specular highlights (like the sun reflecting off a pane of glass or the water, or a bright streetlight at nighttime) can hit peaks much, much higher – in the thousands of cd/m2.

And HDR isn’t just about extreme brightness. The entire grayscale is expanded, so we should see more shadow details along with intense specular light sources. When done correctly, HDR images are quite the departure from ‘everyday’ HDTV, and more closely resemble the range of tonal values our eyes can register – with their visual contrast ratio approaching 1,000,000:1.

There are numerous ways to achieve higher levels of brightness. Dense arrays of light-emitting diodes can do it when used in a direct-illumination architecture. However, the favored approach is to employ a special optical film embedded with nano-sized red and green quantum dot particles, stimulated by an array of blue LEDs. 2017 TV models using this approach can achieve peak small-area brightness values of 2,000 cd/m2.

For perspective, consider that an LED (emissive) videowall for indoor use will routinely hit 3,000 cd/m2 brightness with full white images, and you can appreciate just how much of a leap HDR represents over current imaging technology. What’s more significant is how quickly the prices for HDR displays are coming down, particularly as Chinese TV manufacturers enter the marketplace.

Just prior to the Super Bowl – the best time to score a deal on a new TV, by the way – it was possible to purchase a 55-inch ‘smart’ Ultra HDTV for just $499 from a Tier 1 manufacturer. And a 65-inch model with basic HDR (static metadata) could be had from a Chinese brand for less than $700, while a Tier 1 HDR model of the same screen size was less than $900.

I mentioned wide color gamut earlier. It stands to reason that if a camera can capture a much wider range of luminance values, it can also record a much wider range of color shades. And that’s exactly what winds up happening. With the current 8-bit color system widely in use for everything from broadcast and cable television to Blu-ray discs and streaming media, a total of 16.7 million colors can be represented.

With HDR and WCG, the playing field is expanded considerably and now requires 10 bits per color pixel, resulting in 1,073,741,824 colors – over 1 billion color shades! That’s too much heavy lifting for LCD displays that use white LEDs with color filters, but it’s within reach of quantum dot LCDs and OLEDs.
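The color math is straightforward – the number of representable colors is two raised to the bit depth, cubed across the three color components. A quick Python check:

# Colors representable at a given bit depth per color component (RGB).
for bits in (8, 10, 12):
    print(f"{bits:2d} bits/component -> {(2 ** bits) ** 3:,} colors")

That’s 16,777,216 colors at 8 bits, 1,073,741,824 at 10 bits, and 68,719,476,736 at 12 bits.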

The availability of HDR/WCG content has also forced a speed upgrade to display interfaces. HDMI 1.3/1.4 simply can’t handle a 4K HDR signal, so we must use HDMI 2.0 to do the job. And even version 2.0 is barely fast enough – if the 4K video signal uses lower color resolution (4:2:0, 4:2:2), then it can transport HDR signals as fast as 60 Hz. But switch to RGB (4:4:4) color mode – such as we’d see with 4K video from a computer video card – and HDMI 2.0 can’t pass a 60 Hz signal with anything more than 8-bit color.
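Here’s a simplified Python check of why HDMI 2.0 runs out of headroom at 4K/60 with deep color. It assumes the standard 594 MHz pixel clock for 4K/60 (blanking included) and roughly 14.4 Gb/s of usable TMDS data on HDMI 2.0 (18 Gb/s less 8b/10b coding); real-world 4:2:2 packing differs slightly, so treat this as a sketch rather than gospel:

# Bandwidth needed for 4K/60 at various bit depths vs. HDMI 2.0's usable data rate.
HDMI20_DATA_GBPS = 14.4
PIXEL_CLOCK_HZ = 594e6            # CTA-861 timing for 3840 x 2160 @ 60 Hz

def required_gbps(bits_per_component, chroma_factor):
    # chroma_factor: 3.0 for RGB/4:4:4, 1.5 for 4:2:0
    return PIXEL_CLOCK_HZ * bits_per_component * chroma_factor / 1e9

cases = [("RGB 4:4:4, 8-bit", 8, 3.0),
         ("RGB 4:4:4, 10-bit", 10, 3.0),
         ("YCbCr 4:2:0, 10-bit", 10, 1.5)]

for name, bits, factor in cases:
    need = required_gbps(bits, factor)
    verdict = "fits" if need <= HDMI20_DATA_GBPS else "does NOT fit"
    print(f"{name:20s} needs {need:5.2f} Gb/s -> {verdict} on HDMI 2.0")

The 8-bit RGB case squeaks through at about 14.26 Gb/s, 10-bit RGB needs roughly 17.8 Gb/s and doesn’t fit, while 10-bit 4:2:0 sails through at under 9 Gb/s – matching what we see in practice.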

On the DisplayPort side, things are somewhat more accommodating. Version 1.2 (the current one) can pass a 3840x2160p/60 signal with 10-bit RGB (4:4:4) color, but nothing more. The newest DP version – 1.3 – raises its maximum speed to 32.4 Gb/s, which makes imaging 12-bit and even 16-bit 4K HDR content possible. However, version 1.4 is required to recognize the HDR ‘flags’ that travel with the content and must be passed on to the display. (HDMI uses extensions for HDR and WCG, with ‘a’ used for static HDR metadata and ‘b’ used for dynamic metadata.)

Marketing folks have a field day confusing people with new display tech, and apparently they’re going to town with HDR. We’re now hearing about “HDR-compatible” products, particularly in signal interfacing. Nothing to see here, folks – if the signal distribution and switching equipment is fast enough to pass the required clock rate and hands over HDR metadata (CTA-861.3) to the display without alteration, then it is indeed “HDR compatible.” Simple as that.

I return to my original question: Will HDR have an impact on our industry? The answer is an emphatic “yes!” There are many customers who would realize a benefit from HDR imaging – the medical, surveillance, military, research, virtual reality, and simulation verticals will embrace it pretty quickly, and others will follow.