Category: The Front Line

LG Is “All In” With OLEDs (Updated)

Last month, I was invited to visit LG Display’s manufacturing and R&D facilities in Paju, Korea – something I had been hoping to do for several years. The trip was motivated by the many display exhibits I’ve seen at LGD’s CES suite every year – a suite we often refer to as “the candy store” because so many cool display concepts are in the spotlight.

At Paju, our small group was able to take a quick look at the mostly-automated LCD assembly process in one of the large fabs (P7) and tour an extensive Innovation exhibit showing off the company’s latest developments in both LCD and OLED tech. Nearby, an enormous P10 fab was under construction, one that will be used to manufacture hundreds of thousands of OLED display panels each month.

While there are several different companies engaged in OLED research and development and others that manufacture smaller quantities of smaller OLED screen sizes, LG Display has decided to put all of its chips on this technology for televisions. The underlying science originally came from Eastman Kodak’s work on white OLED emitters in the 1970s and 1980s, and LG Display acquired the rights to that intellectual property several years ago.

Given the plethora of LCD manufacturing fabs in Asia and the plummeting costs of LCD TVs (largely due to China’s increasing dominance in the market), the LCD manufacturing and TV business is fast becoming a zero-sum game. It’s why legacy brands including Toshiba, Hitachi, Mitsubishi, and Panasonic have exited the consumer television market – the profit just isn’t there.

So a big investment in OLEDs is quite a gamble, especially since televisions using this technology sell for higher prices than comparably-sized LCD sets. Right now, a 55-inch Ultra HDTV using OLED technology retails between $2,000 and $3,000, depending on whether there is a sale or special promotion.

LG’s massive P10 OLED fab, under construction in Paju and seen from the conference suite atop the P7 LCD fab.

65-inch sets are even more expensive: A quick check on the Best Buy / Magnolia Web site shows a variety of OLED TV models (including one from Sony that uses LG Display panels) ranging in price from $3,000 to $5,500. There’s also an LG 77-inch model for $15,000, if you want to go that big.

Some perspective is useful here. Way, WAY back in the mid-1990s, a 50-inch 720p/768p plasma TV would set you back between $25,000 and $30,000. Before plasma manufacturing stopped just 15 years later, you could pick up a 50-inch 1080p model with “smart” functions for as little as $500. So it’s reasonable to assume the downward price curve will come into play once again as demand ramps up.

OLED display technology is a very different beast from LCD and has more in common with CRTs and plasma, since it uses an array of emissive color pixels (plus white pixels for higher brightness). Emissive architectures don’t have the issues that hamstring LCDs, such as contrast flattening and color de-saturation when images are viewed off-axis. As we saw in an eye-opening demo using a star field on a deep black background, edge-illuminated HDR LCD TVs can create an unwanted “shaft of light” artifact while trying to modulate black levels across a small local area and still maintain high brightness and contrast. OLEDs have no such difficulty with this type of content – the high-contrast star field images reproduced as you’d see them in real life, displayed as intense point sources of light on a near-black field.

This massive curved, tiled OLED videowall greets visitors to the Seoul Tower lobby.


Here is a “cylinder” display, made up from large UHD OLEDs.

The OLED “stack” is also much thinner than an LCD – in the case of the new LG “wallpaper” OLED displays, it’s just 4 mm. Since it’s also possible to form OLED pixels on plastic surfaces, we can create a different class of flexible displays. I’ve mentioned this on previous occasions: The largest market in the world for displays is transportation. Think of anything that moves and transports people and goods – ships, trains, planes, cars, trucks, subways. G-forces and sustained vibrations are the two biggest threats to displays in these applications. By making the substrates thin and flexible, we can minimize the long-term effects of both threats.

The behavior of OLEDs as they slowly come out of black (quiescent state) to full white is very predictable – these are low-voltage devices, after all – and it’s possible to show images with very low luminance values that are not limited by the inherent characteristics of LCDs. Even the best plasma displays that used fractional, low-luminance pulse-width modulation techniques can’t equal the performance of an OLED.

There’s always a ‘catch,’ though. OLEDs (like any other emissive display) have a practical brightness limit. For a full white screen, that’s in the range of 100 to 200 cd/m2, while a 10% window can hit about 600 cd/m2. Contrast that with an LCD TV equipped with quantum dots, which can achieve a peak small-area brightness measurement in excess of 1000 cd/m2.

But everything is relative. You may need more horsepower if you are viewing images in a brightly-lit room, but in a darkened room or in the evening, 500 cd/m2 peak brightness is more than adequate. And the ability of OLED displays to render shadow detail down to near-black (close to .0005 cd/m2) means they can show images with high dynamic range – nearly 20 stops of light can be displayed from just above black to 100% white.
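The stops figure follows directly from the ratio of peak to near-black luminance – each stop is a doubling, so the count is a base-2 logarithm. A quick back-of-the-envelope check using the figures above (500 cd/m2 peak, 0.0005 cd/m2 near-black):

```python
import math

def dynamic_range_stops(peak_nits: float, black_nits: float) -> float:
    """Each photographic stop doubles luminance, so the span from
    black to peak covers log2(peak / black) stops."""
    return math.log2(peak_nits / black_nits)

# Figures from the text: ~500 cd/m2 peak, ~0.0005 cd/m2 near-black
print(round(dynamic_range_stops(500, 0.0005), 1))  # → 19.9
```

A million-to-one luminance ratio works out to roughly 20 stops, which lines up with the dynamic range claimed for these OLED panels.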

This OLED tunnel is similar to the one shown at CES 2017. It’s also located in the Seoul Tower.


This 6×6 OLED wall uses older 55-inch 1080p panels and is positioned in the entrance lobby to the Paradise Casino near Incheon Airport.

The other ‘catch’ has intrigued us display analysts for years, and that is differential aging of different OLED colors. While green and red appear to have acceptable half-life spans, dark blue has always been a challenge. Sony’s original 25-inch and 17-inch TriMaster RGB OLED monitors acknowledged as much by raising the power consumption over time to maintain a 100 cd/m2 brightness target as the blue emitters decayed. Samsung’s first 55-inch curved OLED TV – shown about 4 years ago – used dual blue emitters that were twice the size of the red and green emitters, another way to tackle differential aging.

LG Display’s technology uses a white OLED emitter that combines two doped compounds. One by itself emits blue light, while the other emits yellow light. Together, they create white light, and RGB color filters round out the package. The combination works very well, although LG Display adds a white pixel for every red, green, and blue combination to boost brightness. (Presumably this trick sacrifices a bit of color saturation and certainly makes for plenty of animated discussions among home theater enthusiasts!)

While speaking with Dr. Jang Jin Yoo, who heads up LG Display’s Image Quality Development Department, I asked if there was any measurable decay in the blue doped compound over time. If so, this would result in images with a yellowish tint. His reply was that the white OLED emitters should last 20,000 hours before any such decay might be observed, at which point the entire OLED array might have reached 50% of initial brightness. (And it’s likely that you would have replaced the TV long before then anyway.)

Here is a flexible plastic OLED dashboard for cars and trucks. It has excellent brightness, contrast, and color saturation and will simply flex with bumps and vibration.

My other question had to do with something I observed at the 2016 CE Week TV Shoot-Out conducted by Bob Zohn of Value Electronics. I didn’t make this year’s event as I was in Korea, but at last year’s running I had observed a greenish tint on a 55-inch LG OLED TV when viewed off-axis at just 30 degrees. The folks at LG Display in Paju had not heard of this previously, so I loaded up a few photos on my phone to show them what I saw. (This phenomenon was confirmed by Joe Kane and a few other golden eyes.)

The answer? There wasn’t any forthcoming during our visit, but I’m hopeful I may get one eventually. It may have been a production issue with 2016 models, and I don’t know if the same thing was seen this year at the Shoot-Out. My theory was that something in the TV’s optical path was acting like a bandpass filter, attenuating red and blue light output off-axis while boosting green response.

The green color cast I observed on an LG 2016 55-inch Ultra HDTV last year. My viewing angle is about 30 degrees off the centerline.


Here’s another view of the green shift. To date, I still haven’t gotten an explanation from anyone at LG as to what might have caused this and if it’s been addressed.

If you are a display aficionado, you know that the best images consistently come from emissive displays – bright, saturated colors, wide viewing angles in both axes, low black levels, and excellent contrast. Over time, we’ve woven back and forth from motion picture film (transmissive) to CRT televisions (emissive) to rear-projection television (transmissive) to plasma TVs (emissive) to LCD TVs (transmissive).

With wider adoption of OLED technology, we are swerving back into the emissive lane again. And if you need any convincing that emissive is here to stay, look at the wave of fine-pitch, super-bright inorganic LED displays now available for commercial displays: They’ve already put a dent in sales and rentals of high-brightness front projectors and are becoming the preferred backdrop display for TV news broadcasts.

We’re a long way off from micro LED TVs in our living rooms. For now, the next best thing is OLED technology. In addition to LG’s lineup, Sony is now selling two models with LGD panels – will others jump on board? Panasonic had a nice 65-inch offering using the LGD panels, but took it off the market. Will it come back as a companion to Panasonic’s UHD Blu-ray players?

From what I could tell, LG Display believes the future of displays is OLED technology – that’s where the bulk of their capital investments are going. They expect to finish their massive P10 OLED fab line sometime late this year or early next year, at which point they will be able to roll larger sheets of motherglass and yield more cuts. That should bring 55-inch and 65-inch UHDTV set prices down to a level that consumers expect.

Those are my observations from Korea for now. Look for more thoughts on OLEDs in future columns…

My two able guides, Sue Kim of Insight Communications and Jean Lee from LG Display PR. We’re posing at 486 meters above the ground in the Lotte World Tower – the 5th-tallest building in the world. (It has a few LG OLED displays in its lobby!)


How tall is 486 meters? THIS tall…

EDITOR’S NOTE: This article has been updated to correct the thickness of a 65-inch OLED panel (4mm, not 2.5mm) and peak brightness in a 10% window (600 nits, not 500 nits).

*LG Display also contacted me after publication to state that the expected time to half-brightness for white OLEDs is officially 50,000 hours, not the 20,000 hours quoted in this article. As a veteran of many years in the display industry, I have seen half-brightness specifications like 10,000, 20,000, and 50,000 hours used far too liberally in the past to believe most of them. I would have to see more specific aging studies that could substantiate a half-brightness claim of 50,000 hours, and as I wrote in my article, it’s unlikely anyone would keep a television long enough to rack up even 20,000 hours on it in any case.

If you watched television for 6 hours a day, 365 days a year, that would amount to 2,190 hours a year, and at that rate, you’d hit the 20,000-hour mark in just over 9 years. It’s highly likely you would have replaced your television by then.

A New QLED Artifact

During the week of July 9, I, along with three other media/analyst types, was LG Display’s guest in Seoul and Paju for a tour of facilities and interviews with executives. During that trip, LGD engineers showed me a display artifact in Samsung’s Q Series of quantum-dot LCD-TVs previously unknown to me and my colleague Pete Putman, who was also on the tour.

LGD and Samsung are in a pitched battle to convince consumers (and the media) as to whether the Q Series or LGD’s OLED-TV offers the superior image, so the Q Series is what LGD focused on in the darkened lab in LGD’s huge Paju panel manufacturing complex. But, as you will see, it is likely that the artifact would be seen on any LCD-TV that uses edge-lighting and one-dimensional (1D) local area dimming. The Q Series uses such a system, with the LEDs distributed along the bottom edge. (Although the artifact was new to Pete and me, I have since learned that Ray Soneira of DisplayMate described a related artifact discovered in tests and analysis he performed in 2015 and 2016. Since he used a different test pattern, the details of the artifact were not the same as what I observed, but both artifacts have the same origin.)

Photos were not permitted in the lab, but the concept of 1D dimming is shown in this photograph taken in the Corning booth at the most recent SID Display Week. Each local dimming zone is the vertical beam of light emanating from the LED on the bottom. The artifact was demonstrated by an image of a bright moon in a star-filled black sky. The LED for the zone containing the moon brightened in order to produce the desired high luminance for the moon, but in doing so it made all of the stars within the beam brighter, and created the appearance of the sky within this zone being less black. If the moon moves across the screen, so does this “searchlight beam,” which all of us were soon calling a “halo” or “halation.” The effect was obvious and not at all subtle. The effect can arise whenever there is an array of bright points on the screen combined with a larger or brighter object, or even if the distribution of points is significantly uneven.
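A toy model captures the mechanism. In 1D edge dimming, a single LED sets the backlight level for an entire vertical zone, and the LCD in front of it can only block so much light (its native contrast ratio). When a bright object forces a zone’s LED up, every “black” pixel in that zone leaks proportionally more light. All of the numbers below are hypothetical, chosen only to illustrate the effect:

```python
# Toy model of 1D (column-zone) local dimming. All figures are
# hypothetical: 5000:1 native panel contrast, 500 cd/m2 peak white.
PANEL_CONTRAST = 5000.0
PEAK_WHITE = 500.0

def black_level(backlight_drive: float) -> float:
    """Luminance of a 'black' pixel in a zone whose LED runs at
    backlight_drive (0.0 to 1.0). Leakage scales with the backlight."""
    return PEAK_WHITE / PANEL_CONTRAST * backlight_drive

starfield_zone = black_level(0.05)  # LED dimmed to 5% for a dark zone
moon_zone = black_level(1.00)       # LED at full drive for the moon's zone

# The moon's zone shows blacks many times brighter - the "searchlight"
print(round(moon_zone / starfield_zone))  # → 20
```

Shrinking the zones (as full-array 2D dimming does) confines that elevated black level to a small region around the bright object instead of sweeping the full height of the screen.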

These images were shown by Corning at Display Week to demonstrate how the better local dimming index of Iris glass produces a better defined LED beam (right) for more effective 1D local area dimming. The contours of such beams should not be visible in the TV image seen by the viewer, but they were strongly suggested by the “halation pattern” described in the article. (Photo: Ken Werner)

Although this effect is likely to be most disturbing to buyers of expensive premium LCD-TVs, there is no reason to think that the artifact is caused by quantum dots. It should be visible on any edge-lit LCD-TV. But, ironically, because less expensive sets are likely to have a lower “local area index” — see the left side of the photo — the local-area dimming will be less effective and the artifact should be less obvious.

The artifact could be removed by using two-dimensional (2D) dimming implemented with a full-array backlight. With this type of backlight, the LEDs are arrayed evenly behind the LCD in a backlighting rather than edge-lighting configuration. Less expensive sets use a relatively small number of LEDs, resulting in low cost and less effective local dimming. Expensive sets use a large number of LEDs for excellent results at a higher cost. (Reasonably well-populated full-array backlights can have their own artifacts, but the ones I’ve seen are more subtle than the halation I saw at Paju.)

So why not use a full array? I have not discussed this with Samsung specifically, but full-array backlights, although thinner than they used to be, are considerably thicker than edge-lights, particularly those implemented with Corning’s Iris glass instead of acrylic. And, if you go for quality rather than economy, they require more LEDs and therefore cost more.

Isn’t a full-array backlight’s modest increase in thickness worthwhile if it permits images with fewer artifacts? For many readers, the answer is probably yes. But to the marketing imagination “thin” equals “quality,” and when your competition includes an OLED-TV so thin and light in weight it can be mounted directly on a wall, deciding to go thicker is likely not an easy call.


Ken Werner is Principal of Nutmeg Consultants, specializing in the display industry, manufacturing, technology, and applications, including mobile devices and television. He consults for attorneys, investment analysts, and companies re-positioning themselves within the display industry or using displays in their products. He is the 2017 recipient of the Society for Information Display’s Lewis and Beatrice Winner Award. You can reach him at

InfoComm 2017 In The Rear View Mirror

InfoComm 2017 has come and gone, and left us with lots to think about.

For me, this year’s show was hectic, to say the least. I presented my annual Future Trends talk on Tuesday to kick off the Emerging Trends session, then conducted a 3-hour workshop on RF and wireless that afternoon to the largest crowd I’ve ever had for the class. (It may be the largest crowd I ever get as I’m thinking of shelving this class.)

Bright and early on Wednesday morning, I taught a 2-hour class on AV-over-IT (the correct term; you could also use “AV-with-IP”) to a full house. There were even some folks standing in the back of the room. I guessed at least 200 were in attendance.

Thursday morning found me back in the same space, talking about 4K and Ultra HDTV to a smaller crowd (maybe not as “hot” a topic?) and urging them to set their BS meters to “high” when they headed to the show floor to talk to manufacturers about 4K-compatible/ready/friendly products.

With other presentation commitments, it worked out to nearly 15 hours standing in front of crowds and talking. Tiring to say the least, but I did get a ton of great follow-up questions after each session. People were paying attention!

AV-over-IT was a BIG theme at InfoComm, and it was hard to miss.

Mitsubishi had a very nice fine-pitch LED display at the show – one of the few that are not built in China.

The migration to using TCP/IP networks to transport video and audio instead of buying and installing ever-larger and more complex HDMI switchers and DAs is definitely gathering steam. My colleagues and I have only been talking about this for over a decade, and it’s rewarding to see that both manufacturers and end-users are buying in.

And why not? Computer hardware couldn’t get much cheaper. For my AV/IT demo, I was streaming a local TV station, broadcasting in the 720p HD format, using an H.264 AVC encoder/decoder pair running through a 1GigE NetGear managed switch. The streaming rates were in the range of 15 – 18 Mb/s, so I had plenty of headroom.

It worked like a champ. I was able to show how adjusting the group of pictures (GOP) length affected latency, along with the effects of constant bitrate (CBR) vs. variable bitrate (VBR) encoding. If I could have dug the gear up in time, I would have demonstrated UHD content through a 10 Gb/s switch – same principles, just a faster network.
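The headroom in that demo is easy to quantify. Here is a rough sketch of how many such streams a single switch link could carry – the 80% utilization ceiling is my own rule-of-thumb assumption, not a switch spec:

```python
def max_streams(link_mbps: float, stream_mbps: float,
                utilization: float = 0.8) -> int:
    """Streams that fit on one link, leaving a safety margin
    (the 80% ceiling is an assumed rule of thumb)."""
    return int(link_mbps * utilization // stream_mbps)

# 1GigE link, 18 Mb/s H.264 stream (the demo's worst case)
print(max_streams(1000, 18))   # → 44
# Same arithmetic for UHD over a 10 Gb/s switch at an assumed 80 Mb/s
print(max_streams(10000, 80))  # → 100
```

Even at the top of the 15–18 Mb/s range, one gigabit port has room for dozens of streams – which is why a single managed switch can replace a rack of HDMI distribution gear.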

I saw more companies than ever this year showing some sort of AV-over-IT solution. (Almost as many as those showing LED walls!) Lots of encoders and decoders, using H.264, Motion JPEG, and JPEG2000 formats; connected through fast switches and driving everything from televisions to projectors.

If it’s REALLY happening this time, then this is BIG. Migration to AV-over-IT is a big shot across the bow of companies that sell large HDMI-based matrix switches, not to mention distribution amplifiers and signal extenders – both made obsolete by this new technology. With AV on a network, all you need is a fast switch and a bunch of category cable. For longer runs, just run optical fiber to SFP ports on the switch.

LG showed off its unique curved OLED displays – and they’re dual-sided.

Meanwhile, Samsung unveiled the first digital signage monitors to use quantum dot backlight technology for high dynamic range and wide color gamuts.

Hand-in-hand with this migration to an IT-based delivery system is a steady decline in the price of hardware, which has hit the consumer electronics industry even harder. Consider that you can now buy a 65-inch Ultra HDTV (4K) with “smart” capabilities and support for basic high dynamic range video for about $800.

That’s even more amazing when you consider that the first Ultra HD displays arrived on our shores in 2012 with steep price tags around $20,000. But the nexus of the display industry has moved to mainland China, creating an excess of manufacturing capacity and causing wholesale and retail prices to plummet.

There is no better example of China’s impact on the display market than LED display tiles and walls. These products have migrated from expensive, coarse-resolution models to super-bright thin tiles with dot pitches below 1 millimeter – about the same pitch as a 50-inch plasma monitor two decades ago.

Talk to projector manufacturers and they’ll tell you that LED displays have cut heavily into their business, especially high-brightness projectors for large venues. LED wall manufacturers were prominent at the show, and some are hiring industry veterans to run their sales and marketing operations, removing a potential barrier to sales in this country by presenting customers with familiar faces.

Panasonic showed there are still plenty of applications for projection, especially on curved surfaces.

Absen is an up-and-coming LED brand, and they’re hiring veterans of the U.S. AV market to push sales along.

At the other end, large and inexpensive LCD displays with Full HD resolution have killed off much of the “hang and bang” projector business, and large panels with Ultra HD resolution are now popping up in sizes as large as 98 inches. The way things are going in Asia, Full HD panel production may disappear completely by the end of the decade as everyone shifts to Ultra HD panel production.

Even the newest HDR imaging technology – quantum dots – made an appearance in Orlando in a line of commercial monitors with UHD resolution. Considering that QD-equipped televisions have only been around for a couple of years, that’s an amazingly accelerated timeline. But compressed timelines between introduction and implementation are the norm nowadays.

This was my 24th consecutive InfoComm and the 21st show (so far as I can remember) where I taught at least one class. When I went to my first show in Anaheim, CRT projectors were still in use, a ‘bright’ light valve projector could generate maybe 2000 lumens, LCD projectors cost ten grand and weighed 30 pounds, and composite video and VGA resolution ruled the day. RS232 was used to control everything and stereo was about as ‘multichannel’ as audio got.

All of that has passed into oblivion (except for RS232 and VGA connectors) as we continue to blow by resolution, size, speed, and storage benchmarks. The transition to networked AV will result in even more gear being hauled off to recycling yards, as will advances in wireless high-bandwidth technology, flexible displays, cloud media storage and delivery, and object-based control systems.

Can’t wait for #25…

InfoComm Tech Trends for 2017

Although I’ve been working in the AV industry since 1978 (the good old days of tape recorders, CRT projectors, and multi-image 35mm slide projection), I only started attending InfoComm in 1994.

At that time, the Projection Shoot-Out was picking up steam with the first solid-state light modulators (LCDs). Monitors still used CRTs, and some new-fangled and very expensive ‘plasma’ monitors were arriving on our shores. “HD resolution” meant 1024×768 pixels, and a ‘light valve’ projector could crank out at best about 2,000 lumens. The DB15 and composite video interfaces dominated connections, and a ‘large’ distribution amplifier had maybe four output ports on it.

I don’t need to tell you what’s transpired in the 23 years since then. This will be my 24th InfoComm, and it might be the most mind-boggling in terms of technology trends. We’ve come a long way from XGA, composite video, CRTs, 35mm slides, analog audio, and RS232. (Okay, so that last one is still hanging around like an overripe wine.)

I’ve mentioned many of the trends in previous columns, so I’ll list what I think are the most impactful and exactly why I feel that way. I should add that I’m writing this just after attending the NAB 2017 show, where many of my beliefs have been confirmed in spades.

Light-emitting Diodes (LEDs) are taking over (the world): This is an obvious one, but now they’re simultaneously threatening both the large venue projection and direct-view display markets. I saw at least a dozen LED brands at NAB – most of them from mainland China – offering so-called ‘fine pitch’ tiled displays. These range from 1.8mm all the way down to .9mm, which is about the same pitch as a 50-inch plasma TV had 17 years ago.

The challenge for anyone here is who to buy from and which products are reliable. You wouldn’t recognize most of these companies, as they are largely set up to market LED tiles to the outside world. And some of them supply companies you do know in the LED marketplace. With brightness levels hitting 400 – 800 nits for fine pitch (and over 2,000 nits for coarser pixel arrays), it’s no wonder that more applications are swinging away from front projection to tiles.

And there are even finer screens in the works with pixel pitches at .8mm and smaller. That’s most definitely direct-view LCD territory, at least at greater viewing distances. But the LCD guys have some tricks of their own…

Cheaper, bigger, 1080p and UHD flat screens: Right now, there are too many LCD ‘fabs’ running in Asia, making too much ‘glass.’ More and more of that ‘glass’ will have Ultra HD resolution. That, in turn, is forcing down prices of 1080p LCD panels, making it possible for consumers to buy super-cheap 60-inch, 65-inch, and 70-inch televisions.

Consequently, it will be easy to pick up 65-, 70-, and even 85-inch LCD screens for commercial installations at dirt-cheap prices. We’re talking about displays that can be amortized pretty quickly – if they last a couple of years, great. But even if they have to be replaced after a year, the replacement costs will be lower. And with the slow migration to UHD resolution in larger sizes (it’s a matter of manufacturing economies), you can put together tiled 8K and even 16K displays on a rational budget.

Don’t expect OLEDs to make too many inroads here. They don’t yet have the reliability or sheer brightness of LCDs, and you’re going to start seeing some high-end models equipped with quantum dot enhancements for high brightness and high dynamic range (HDR) support. Speaking of which…

High dynamic range and wide color gamut technologies were all over the place at NAB. There is so much interest in both (they go hand-in-hand anyway) that you will see numerous demos of them in Orlando. Who will use HDR and WCG? Anyone who wants a more realistic way to show images with brightness, color saturation, and contrast levels that are closer to what the human eye can perceive.

Obviously, higher resolution is very much part of this equation, but you don’t always need 4K to make it work. Several companies at NAB, led by Hitachi, had compelling demos of 2K (1080p) HDR. On a big screen, the average viewer might not even know they’re looking at a 1080p image. And yes, both enhancements do make a difference – they’re not just bells and whistles.

AV distribution over networks: I’ve been teaching classes in networked AV for over a decade, but it has finally arrived. You won’t hear nearly as much about HDMI switching and distribution in Orlando as you will about JPEG2000, latency, network switch speeds, and quality of service issues.

That’s because our industry has finally woken up and smelled the coffee: Signal management and distribution over TCP/IP networks is the future. It’s not proprietary HDMI formats for category wire. It’s not big, bulky racks full of HDMI hardware switches. No, our future is codecs, Layer 2/3 switches, cloud servers and storage, faster channel-bonding WiFi, and distribution to mobile devices.

You couldn’t throw a rock at NAB without hitting a company booth that was showcasing a codec or related software-based switching (SBS) product. More and more of them are using the HEVC H.265 codec for efficiency or M-JPEG2000 for near-zero latency. Some companies demonstrated 25 Gb/s network hardware for transport and workflows, while others had scheduling and playout software programs.

Internet of Things control for AV: You can defend proprietary control systems all day long, but I’m sorry to tell you that you’re on the losing end of that argument. IoT is running wild in the consumer sector, which of course wields great influence over our market. App-based control has never been easier to pull off, which is why the long-time powers in control are scrambling to change gears and keep up with the crowd.

In short, if it has a network interface card or chip, it can be addressed over wired and wireless networks with APIs and controlled from just about any piece of hardware. And control systems have gotten smart enough that you can simply connect a piece of AV hardware to a network and it will be identified and configured automatically. You won’t have to lift a finger to do it.

It is a sobering thought to realize I’m in my 40th year working in this industry. Yet, I have never seen the technology changes coming as hard and as fast as I have in the past decade (remember, the first iPhone appeared in 2007). It’s all migrating to networks, software control, and displays that have LEDs somewhere in the chain. Tempus fugit…

High Dynamic Range: It’s Here!

Ever since the launch of high definition television in the 1990s, it seems as if some new ‘bell and whistle’ enhancement comes along every few years. First it was the changeover to flat screen plasma displays in the late 1990s, followed by a shift to 1080p and Wide UXGA resolution in the early 2000s.

The industry transitioned to liquid-crystal display (LCD) panels for TVs and monitors a few years later. UHD (4K) imaging popped into our consciousness in 2012. And of course, 3D made its “once every few sunspot cycles” appearance in 2009, followed by virtual reality last year.

Some of these trends actually stuck, like 4K: Display manufacturers are slowing down production of Full HD (1920×1080) display panels in favor of UHD (3840×2160) as consumers increasingly choose the higher resolution. That, in turn, means that the displays we select for digital signage, classrooms, meeting rooms, and other applications will also be of the 4K variety.

The latest trend to rear its head is high dynamic range (HDR), which is accompanied by wide color gamut (WCG) imaging. In a nutshell, HDR means a greatly expanded range of tonal values that can be shown in still and moving images. Conventional cameras and imaging systems can capture anywhere from 9 to 11 f-stops of light. (Each f-stop increase represents a luminance value twice as bright as the previous one.)

HDR takes that to a higher level by capturing as many as 22 f-stops of light, and reproducing those tonal values becomes a real challenge to displays that employ conventional backlight or illumination systems. Hence, we are now seeing a new crop of LCD TVs with turbocharged backlights to reproduce the extreme dynamic ranges of HDR images. On the emissive display side, organic light-emitting diode (OLED) TVs can also reproduce HDR content, although with lower peak brightness levels.
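Because each added stop doubles the luminance range, going from roughly 11 stops of conventional capture to 22 stops of HDR capture doesn’t double the range – it squares it. A quick illustration:

```python
def contrast_ratio(stops: int) -> int:
    """An n-stop range spans a 2**n : 1 luminance ratio."""
    return 2 ** stops

print(f"{contrast_ratio(11):,}:1")  # conventional capture → 2,048:1
print(f"{contrast_ratio(22):,}:1")  # HDR capture → 4,194,304:1
```

That multi-million-to-one span is what conventional backlights can’t reproduce, and why HDR sets need the turbocharged illumination described above.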

At NAB 2017, NEC showed this 4K HDR encoder prototype, streaming 77 Mb/s with 99 ms latency.

For some perspective, the venerable CRT display had a peak brightness level somewhere around 29 foot-Lamberts (100 candelas per square meter), which represented close to 100% diffuse white. In an HDR display, that value largely holds for diffuse white, but more intense specular highlights (like the sun reflecting off a pane of glass or the water, or a bright streetlight at nighttime) can hit much higher peaks, in the thousands of cd/m2.

And HDR isn’t just about extreme brightness. The entire grayscale is expanded, so we should see more shadow details along with intense specular light sources. When done correctly, HDR images are quite the departure from ‘everyday’ HDTV, and more closely resemble the range of tonal values our eyes can register – with their visual contrast ratio approaching 1,000,000:1.
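The foot-Lambert figure above converts to candelas per square meter by a fixed factor (1 fL ≈ 3.426 cd/m2), which is how 29 fL works out to the familiar “100-nit” CRT white:

```python
FL_TO_NITS = 3.426  # 1 foot-Lambert is approximately 3.426 cd/m2 (nits)

def fl_to_nits(foot_lamberts: float) -> float:
    """Convert foot-Lamberts to candelas per square meter."""
    return foot_lamberts * FL_TO_NITS

print(round(fl_to_nits(29)))  # about 99 cd/m2 -- the classic CRT peak white
```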

There are numerous ways to achieve higher levels of brightness. Dense arrays of light-emitting diodes can do it when used in a direct-illumination architecture. However, the favored approach is to employ a special optical film embedded with nano-sized red and green quantum dot particles, stimulated by an array of blue LEDs. 2017 TV models using this approach can achieve peak small-area brightness values of 2,000 cd/m2.

For perspective, consider that an LED (emissive) videowall for indoor use will routinely hit 3,000 cd/m2 brightness with full white images, and you can appreciate just how much of a leap HDR represents over current imaging technology. What’s more significant is how quickly the prices for HDR displays are coming down, particularly as Chinese TV manufacturers enter the marketplace.

Just prior to the Super Bowl – the best time to score a deal on a new TV, by the way – it was possible to purchase a 55-inch ‘smart’ Ultra HDTV for just $499 from a Tier 1 manufacturer. And a 65-inch model with basic HDR (static metadata) could be had from a Chinese brand for less than $700, while a Tier 1 HDR model of the same screen size was less than $900.

I mentioned wide color gamut earlier. It stands to reason that if a camera can capture a much wider range of luminance values, it can also record a much wider range of color shades. And that’s exactly what winds up happening. With the current 8-bit color system widely in use for everything from broadcast and cable television to Blu-ray discs and streaming media, a total of 16.7 million colors can be represented.

With HDR and WCG, the playing field is expanded considerably and now requires 10 bits per color channel, resulting in 1,073,741,824 colors – over 1 billion color shades! That’s too much heavy lifting for LCD displays that use white LEDs with color filters, but it’s within reach of quantum dot LCDs and OLEDs.
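The color counts quoted here fall straight out of the bit depth – three channels at 8 or 10 bits each:

```python
# Representable colors = 2 ** (bits per channel * 3 channels)
def color_count(bits_per_channel: int) -> int:
    return 2 ** (bits_per_channel * 3)

print(f"{color_count(8):,}")   # 8-bit:  16,777,216 colors
print(f"{color_count(10):,}")  # 10-bit: 1,073,741,824 colors
```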

The availability of HDR/WCG content has also forced a speed upgrade to display interfaces. HDMI 1.3/1.4 simply can’t handle a 4K HDR signal, so we must use HDMI 2.0 to do the job. And even version 2.0 is barely fast enough – if the 4K video signal uses reduced color resolution (4:2:0 or 4:2:2 subsampling), it can transport HDR signals at refresh rates up to 60 Hz. But switch to RGB (4:4:4) color mode – such as we’d see with 4K video from a computer video card – and HDMI 2.0 can’t pass a 60 Hz signal with anything more than 8-bit color.

On the DisplayPort side, things are somewhat more accommodating. Version 1.2 (the current one) can pass a 3840×2160p/60 signal with 10-bit RGB (4:4:4) color, but nothing more. The newest DP version – 1.3 – raises the maximum speed to 32.4 Gb/s, which makes imaging 12-bit and even 16-bit 4K HDR content possible. However, version 1.4 is required to recognize the HDR ‘flags’ that travel with the content and must be passed on to the display. (HDMI addresses this with extensions – ‘a’ for static HDR metadata and ‘b’ for dynamic metadata.)
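As a rough sanity check on those HDMI limits, one can compare the on-wire bit rate of a 4K/60 RGB signal against HDMI 2.0’s 18 Gb/s ceiling. The sketch below is a simplified model (assuming the standard 4400×2250 total CEA raster for 4K/60 and the 25% overhead of TMDS 8b/10b line coding), not a full link-budget calculation:

```python
def tmds_bit_rate_gbps(h_total: int, v_total: int,
                       refresh_hz: int, bits_per_pixel: int) -> float:
    """On-wire TMDS bit rate in Gb/s, including 8b/10b (25%) coding overhead."""
    payload_bps = h_total * v_total * refresh_hz * bits_per_pixel
    return payload_bps * 10 / 8 / 1e9

# 4K/60 over HDMI uses a 4400 x 2250 total raster (blanking included).
rate_8bit = tmds_bit_rate_gbps(4400, 2250, 60, 24)   # 8 bits x 3 channels
rate_10bit = tmds_bit_rate_gbps(4400, 2250, 60, 30)  # 10 bits x 3 channels

HDMI_2_0_LIMIT = 18.0  # Gb/s, aggregate across the three TMDS lanes
print(f"8-bit RGB:  {rate_8bit:.2f} Gb/s, fits: {rate_8bit <= HDMI_2_0_LIMIT}")
print(f"10-bit RGB: {rate_10bit:.2f} Gb/s, fits: {rate_10bit <= HDMI_2_0_LIMIT}")
```

The 8-bit case squeaks in at about 17.8 Gb/s, while 10-bit RGB needs roughly 22.3 Gb/s – exactly why HDMI 2.0 tops out at 8-bit color for 4K/60 RGB, while DP 1.3’s 32.4 Gb/s link clears it with room to spare.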

Marketing folks have a field day confusing people with new display tech, and apparently they’re going to town with HDR. We’re now hearing about “HDR-compatible” products, particularly in signal interfacing. Nothing to see here, folks – if the signal distribution and switching equipment is fast enough to pass the required clock rate and hands over HDR metadata (CTA-861.3) to the display without alteration, then it is indeed “HDR compatible.” Simple as that.

I return to my original question: Will HDR have an impact on our industry? The answer is an emphatic “yes!” There are many customers who would realize a benefit from HDR imaging – the medical, surveillance, military, research, virtual reality, and simulation verticals will embrace it quickly, and others will follow.