Category: The Front Line

InfoComm 2017 In The Rear View Mirror

InfoComm 2017 has come and gone, and left us with lots to think about.

For me, this year’s show was hectic, to say the least. I presented my annual Future Trends talk on Tuesday to kick off the Emerging Trends session, then conducted a 3-hour workshop on RF and wireless that afternoon to the largest crowd I’ve ever had for the class. (It may be the largest crowd I ever get as I’m thinking of shelving this class.)

Bright and early on Wednesday morning, I taught a 2-hour class on AV-over-IT (the correct term; you could also use “AV-with-IP”) to a full house. There were even some folks standing in the back of the room. I guessed at least 200 were in attendance.

Thursday morning found me back in the same space, talking about 4K and Ultra HDTV to a smaller crowd (maybe not as “hot” a topic?) and urging them to set their BS meters to “high” when they headed to the show floor to talk to manufacturers about 4K-compatible/ready/friendly products.

With other presentation commitments, it worked out to nearly 15 hours standing in front of crowds and talking. Tiring to say the least, but I did get a ton of great follow-up questions after each session. People were paying attention!

AV-over-IT was a BIG theme at InfoComm, and it was hard to miss.

Mitsubishi had a very nice fine-pitch LED display at the show – one of the few that are not built in China.

The migration to using TCP/IP networks to transport video and audio instead of buying and installing ever-larger and more complex HDMI switchers and DAs is definitely picking up steam. My colleagues and I have only been talking about this for over a decade and it’s rewarding to see that both manufacturers and end-users are buying in.

And why not? Computer hardware couldn’t get much cheaper. For my AV/IT demo, I was streaming a local TV station, broadcasting in the 720p HD format, using an H.264 AVC encoder/decoder pair running through a 1GigE NetGear managed switch. The streaming rates were in the range of 15 – 18 Mb/s, so I had plenty of headroom.

It worked like a champ. I was able to show how adjusting the group of pictures (GOP) length affected latency, along with the effects of constant bitrate (CBR) vs. variable bitrate (VBR) encoding. If I could have dug the gear up in time, I would have demonstrated UHD content through a 10 Gb/s switch – same principles, just a faster network.
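For readers who want to put rough numbers on that GOP/latency relationship, here is a minimal back-of-the-envelope sketch. The frame rates, GOP lengths, and function names are my own illustrative assumptions, not the exact settings used in the demo.

```python
# Rough sketch: a longer GOP means the encoder/decoder pipeline buffers more frames
# before a complete, decodable group exists, which shows up as added latency.
# All figures below are illustrative assumptions, not the demo's exact settings.

def gop_buffering_ms(gop_frames: int, fps: float) -> float:
    """Approximate worst-case delay contributed by GOP buffering alone."""
    return gop_frames / fps * 1000.0

def link_headroom_pct(stream_mbps: float, link_mbps: float = 1000.0) -> float:
    """Percentage of a switch port left over after carrying one stream."""
    return (1.0 - stream_mbps / link_mbps) * 100.0

if __name__ == "__main__":
    for gop in (15, 30, 60):
        print(f"GOP of {gop} frames at 60 fps ≈ {gop_buffering_ms(gop, 60):.0f} ms of buffering")
    # An 18 Mb/s H.264 stream barely dents a 1 GigE port:
    print(f"Headroom on a 1 GigE port: ~{link_headroom_pct(18.0):.0f}%")
```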

I saw more companies than ever this year showing some sort of AV-over-IT solution. (Almost as many as those showing LED walls!) Lots of encoders and decoders, using H.264, Motion JPEG, and JPEG2000 formats; connected through fast switches and driving everything from televisions to projectors.

If it’s REALLY happening this time, then this is BIG. Migration to AV-over-IT is a big shot across the bow of companies that sell large HDMI-based matrix switches, not to mention distribution amplifiers and signal extenders – both made obsolete by this new technology. With AV on a network, all you need is a fast switch and a bunch of category cable. For longer runs, just run optical fiber to the SFP ports on the switch.
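To see why encoders (or faster ports) are needed at all, here is a quick link-budget sanity check. The arithmetic is mine and ignores blanking, protocol overhead, and audio, so treat it as directional only.

```python
# Approximate uncompressed video payloads versus common switch port speeds.
# Blanking intervals, network overhead, and audio are ignored for simplicity.

def raw_video_gbps(width: int, height: int, fps: float, bits_per_pixel: int = 24) -> float:
    return width * height * fps * bits_per_pixel / 1e9

if __name__ == "__main__":
    print(f"1080p60, 8-bit RGB: ~{raw_video_gbps(1920, 1080, 60):.1f} Gb/s")   # ~3 Gb/s
    print(f"2160p60, 8-bit RGB: ~{raw_video_gbps(3840, 2160, 60):.1f} Gb/s")   # ~12 Gb/s
    # Neither fits a 1 GigE port uncompressed - hence H.264/JPEG2000 encoders for
    # 1 GigE, or 10 GigE ports (copper or SFP fiber) for light or no compression.
```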

LG showed off its unique curved OLED displays – and they’re dual-sided.

Meanwhile, Samsung unveiled the first digital signage monitors to use quantum dot backlight technology for high dynamic range and wide color gamuts.

Hand-in-hand with this migration to an IT-based delivery system is a steady decline in the price of hardware, which has hit the consumer electronics industry even harder. Consider that you can now buy a 65-inch Ultra HDTV (4K) with “smart” capabilities and support for basic high dynamic range video for about $800.

That’s even more amazing when you consider that the first Ultra HD displays arrived on our shores in 2012 with steep price tags around $20,000. But the nexus of the display industry has moved to mainland China, creating an excess of manufacturing capacity and causing wholesale and retail prices to plummet.

There is no better example of China’s impact on the display market than LED display tiles and walls. These products have migrated from expensive, coarse-resolution models to super-bright thin tiles with dot pitches below 1 millimeter – about the same pitch as a 50-inch plasma monitor two decades ago.

Talk to projector manufacturers and they’ll tell you that LED displays have cut heavily into their business, especially high-brightness projectors for large venues. LED wall manufacturers were prominent at the show, and some are hiring industry veterans to run their sales and marketing operations, removing a potential barrier to sales in this country by presenting prospective customers with familiar faces.

Panasonic showed there are still plenty of applications for projection, especially on curved surfaces.

Absen is an up-and-coming LED brand, and they’re hiring veterans of the U.S. AV market to push sales along.

At the other end, large and inexpensive LCD displays with Full HD resolution have killed off much of the “hang and bang” projector business, and large panels with Ultra HD resolution are now popping up in sizes as large as 98 inches. The way things are going in Asia, Full HD panel production may disappear completely by the end of the decade as everyone shifts to Ultra HD panel production.

Even the newest HDR imaging technology – quantum dots – made an appearance in Orlando in a line of commercial monitors with UHD resolution. Considering that QD-equipped televisions have only been around for a couple of years, that’s an amazingly accelerated timeline. But compressed timelines between introduction and implementation are the norm nowadays.

This was my 24th consecutive InfoComm and the 21st show (so far as I can remember) where I taught at least one class. When I went to my first show in Anaheim, CRT projectors were still in use, a ‘bright’ light valve projector could generate maybe 2000 lumens, LCD projectors cost ten grand and weighed 30 pounds, and composite video and VGA resolution ruled the day. RS232 was used to control everything and stereo was about as ‘multichannel’ as audio got.

All of that has passed into oblivion (except for RS232 and VGA connectors) as we continue to blow by resolution, size, speed, and storage benchmarks. The transition to networked AV will result in even more gear being hauled off to recycling yards, as will advances in wireless high-bandwidth technology, flexible displays, cloud media storage and delivery, and object-based control systems.

Can’t wait for #25…

InfoComm Tech Trends for 2017

Although I’ve been working in the AV industry since 1978 (the good old days of tape recorders, CRT projectors, and multi-image 35mm slide projection), I only started attending InfoComm in 1994.

At that time, the Projection Shoot-Out was picking up steam with the first solid-state light modulators (LCDs). Monitors still used CRTs, and some new-fangled and very expensive ‘plasma’ monitors were arriving on our shores. “HD resolution” meant 1024×768 pixels, and a ‘light valve’ projector could crank out at best about 2,000 lumens. The DB15 and composite video interfaces dominated connections, and a ‘large’ distribution amplifier had maybe four output ports on it.

I don’t need to tell you what’s transpired in the 23 years since then. This will be my 24th InfoComm, and it might be the most mind-boggling in terms of technology trends. We’ve come a long way from XGA, composite video, CRTs, 35mm slides, analog audio, and RS232. (Okay, so that last one is still hanging around like an overripe wine.)

I’ve mentioned many of the trends in previous columns, so I’ll list what I think are the most impactful and exactly why I feel that way. I should add that I’m writing this just after attending the NAB 2017 show, where many of my beliefs have been confirmed in spades.

Light-emitting Diodes (LEDs) are taking over (the world): This is an obvious one, but now they’re simultaneously threatening both the large venue projection and direct-view display markets. I saw at least a dozen LED brands at NAB – most of them from mainland China – offering so-called ‘fine pitch’ tiled displays. These range from 1.8mm all the way down to .9mm, which is about the same pitch as a 50-inch plasma TV had 17 years ago.
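As a rough illustration of what those pitch numbers mean for screen size (my own arithmetic, not any manufacturer’s spec), here is how wide an LED wall has to be to show a full 1080p or UHD raster pixel-for-pixel:

```python
# Wall width needed for a 1:1 pixel map at a given LED pixel pitch.
# Simple geometry; real tile products come in fixed cabinet sizes.

def wall_width_m(horizontal_pixels: int, pitch_mm: float) -> float:
    return horizontal_pixels * pitch_mm / 1000.0

if __name__ == "__main__":
    for pitch in (1.8, 1.2, 0.9):
        print(f"{pitch} mm pitch: 1080p ≈ {wall_width_m(1920, pitch):.1f} m wide, "
              f"UHD ≈ {wall_width_m(3840, pitch):.1f} m wide")
    # Even at 0.9 mm, a native-UHD wall is still roughly 3.5 m (11+ feet) across.
```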

The challenge for anyone here is who to buy from and which products are reliable. You wouldn’t recognize most of these companies, as they are largely set up to market LED tiles to the outside world. And some of them supply companies you do know in the LED marketplace. With brightness levels hitting 400 – 800 nits for fine pitch (and over 2,000 nits for coarser pixel arrays), it’s no wonder that more applications are swinging away from front projection to tiles.

And there are even finer screens in the works with pixel pitches at .8mm and smaller. That’s most definitely direct-view LCD territory, at least at greater viewing distances. But the LCD guys have some tricks of their own…

Cheaper, bigger, 1080p and UHD flat screens: Right now, there are too many LCD ‘fabs’ running in Asia, making too much ‘glass.’ More and more of that ‘glass’ will have Ultra HD resolution. That, in turn, is forcing down prices of 1080p LCD panels, making it possible for consumers to buy super-cheap 60-inch, 65-inch, and 70-inch televisions.

Consequently, it will be easy to pick up 65-, 70-, and even 85-inch LCD screens for commercial installations for dirt-cheap prices. We’re talking about displays that can be amortized pretty quickly – if they last a couple of years, great. But even if they have to be replaced after a year, the replacement costs will be lower. And with the slow migration to UHD resolution in larger sizes (it’s a matter of manufacturing economies), you can put together tiled 8K and even 16K displays for a rational budget.

Don’t expect OLEDs to make too many inroads here. They don’t yet have the reliability or sheer brightness of LCDs, and you’re going to start seeing some high-end LCD models equipped with quantum dot enhancements for high brightness and high dynamic range (HDR) support. Speaking of which…

High dynamic range and wide color gamut technologies were all over the place at NAB. There is so much interest in both (they go hand-in-hand anyway) that you will see numerous demos of them in Orlando. Who will use HDR and WCG? Anyone who wants a more realistic way to show images with brightness, color saturation, and contrast levels that are comparable to what the human eye can perceive.

Obviously, higher resolution is very much part of this equation, but you don’t always need 4K to make it work. Several companies at NAB, led by Hitachi, had compelling demos of 2K (1080p) HDR. On a big screen, the average viewer might not even know they’re looking at a 1080p image. And yes, both enhancements do make a difference – they’re not just bells and whistles.

AV distribution over networks: I’ve been teaching classes in networked AV for over a decade, but it has finally arrived. You won’t hear nearly as much about HDMI switching and distribution in Orlando as you will about JPEG2000, latency, network switch speeds, and quality of service issues.

That’s because our industry has finally woken up and smelled the coffee: Signal management and distribution over TCP/IP networks is the future. It’s not proprietary HDMI formats for category wire. It’s not big, bulky racks full of HDMI hardware switches. No, our future is codecs, Layer 2/3 switches, cloud servers and storage, faster channel-bonding WiFi, and distribution to mobile devices.

You couldn’t throw a rock at NAB without hitting a company booth that was showcasing a codec or related software-based switching (SBS) product. More and more of them are using the HEVC H.265 codec for efficiency or M-JPEG2000 for near-zero latency. Some companies demonstrated 25 Gb/s network hardware for transport and workflows, while others had scheduling and playout software programs.

Internet of Things control for AV: You can defend proprietary control systems all day long, but I’m sorry to tell you that you’re on the losing end of that argument. IoT is running wild in the consumer sector, which of course wields great influence over our market. App-based control has never been easier to pull off, which is why the long-time powers in control are scrambling to change gears and keep up with the crowd.

In short, if it has a network interface card or chip, it can be addressed over wired and wireless networks with APIs and controlled from just about any piece of hardware. And control systems have gotten smart enough that you can simply connect a piece of AV hardware to a network and it will be identified and configured automatically. You won’t have to lift a finger to do it.
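As a simple illustration of what app/API-style control looks like in practice, here is a minimal sketch. The IP address, endpoint path, and payload are hypothetical placeholders; a real device would use whatever its manufacturer’s API documents.

```python
# Minimal sketch of IP-based device control over a (hypothetical) REST API.
# The address, endpoint, and JSON payload are placeholders, not a real product's API.
import json
import urllib.request

def set_power(device_ip: str, on: bool) -> int:
    """Send a power on/standby command to a network-attached display."""
    payload = json.dumps({"power": "on" if on else "standby"}).encode("utf-8")
    req = urllib.request.Request(
        url=f"http://{device_ip}/api/v1/power",       # hypothetical endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status                            # HTTP 200 on success

if __name__ == "__main__":
    # Any device with a NIC and a documented API can be driven this way -
    # no proprietary control processor required.
    print(set_power("192.168.1.50", True))
```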

It is a sobering thought to realize I’m in my 40th year working in this industry. Yet, I have never seen the technology changes coming as hard and as fast as I have in the past decade (remember, the first iPhone appeared in 2007). It’s all migrating to networks, software control, and displays that have LEDs somewhere in the chain. Tempus fugit…

High Dynamic Range: It’s Here!

Ever since the launch of high definition television in the 1990s, it seems as if some new ‘bell and whistle’ enhancement comes along every few years. First it was the changeover to flat screen plasma displays in the late 1990s, followed by a shift to 1080p and Wide UXGA resolution in the early 2000s.

The industry transitioned to liquid-crystal display (LCD) panels for TVs and monitors a few years later. UHD (4K) imaging popped into our consciousness in 2012. And of course, 3D made its “once every few sunspot cycles” appearance in 2009, followed by virtual reality last year.

Some of these trends actually stuck, like 4K: Display manufacturers are slowing down production of Full HD (1920×1080) display panels in favor of UHD (3840×2160) as consumers increasingly choose the higher resolution. That, in turn, means that the displays we select for digital signage, classrooms, meeting rooms, and other applications will also be of the 4K variety.

The latest trend to rear its head is high dynamic range (HDR), which is accompanied by wide color gamut (WCG) imaging. In a nutshell, HDR means a greatly expanded range of tonal values that can be shown in still and moving images. Conventional cameras and imaging systems can capture anywhere from 9 to 11 f-stops of light. (Each f-stop increase represents a luminance value twice as bright as the previous one.)

HDR takes that to a higher level by capturing as many as 22 f-stops of light, and reproducing those tonal values becomes a real challenge to displays that employ conventional backlight or illumination systems. Hence, we are now seeing a new crop of LCD TVs with turbocharged backlights to reproduce the extreme dynamic ranges of HDR images. On the emissive display side, organic light-emitting diode (OLED) TVs can also reproduce HDR content, although with lower peak brightness levels.
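To put numbers on those f-stop figures (simple arithmetic from the “each stop doubles luminance” rule above, not a measurement):

```python
# Each f-stop doubles luminance, so n stops span a contrast ratio of 2**n.

def stops_to_contrast(stops: int) -> int:
    return 2 ** stops

if __name__ == "__main__":
    print(f"11 stops ≈ {stops_to_contrast(11):,}:1")   # 2,048:1 (conventional capture)
    print(f"22 stops ≈ {stops_to_contrast(22):,}:1")   # 4,194,304:1 (HDR capture)
```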

At NAB 2017, NEC showed this 4K HDR encoder prototype, streaming 77 Mb/s with 99 ms latency.

For some perspective, the venerable CRT display had a peak brightness level somewhere around 29 foot-Lamberts (100 candelas per square meter), which represented close to 100% diffuse white. In an HDR display, that value largely holds, but more intense specular highlights (like the sun reflecting off a pane of glass or the water, or a bright streetlight at nighttime) can hit peaks much, much higher, reaching into the thousands of cd/m2.
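The foot-Lambert figure is just a unit conversion from the candela value; here is a quick check, using 1 fL ≈ 3.426 cd/m2:

```python
# Convert candelas per square meter to foot-Lamberts: 1 fL ≈ 3.426 cd/m2.

def cd_m2_to_fl(cd_m2: float) -> float:
    return cd_m2 / 3.426

if __name__ == "__main__":
    print(f"100 cd/m2 ≈ {cd_m2_to_fl(100):.0f} fL")   # ≈ 29 fL, matching the CRT figure above
```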

And HDR isn’t just about extreme brightness. The entire grayscale is expanded, so we should see more shadow details along with intense specular light sources. When done correctly, HDR images are quite the departure from ‘everyday’ HDTV, and more closely resemble the range of tonal values our eyes can register – with their visual contrast ratio approaching 1,000,000:1.

There are numerous ways to achieve higher levels of brightness. Dense arrays of light-emitting diodes can do it when used in a direct-illumination architecture. However, the favored approach is to employ a special optical film embedded with nano-sized red and green quantum dot particles, stimulated by an array of blue LEDs. 2017 TV models using this approach can achieve peak small-area brightness values of 2,000 cd/m2.

For perspective, consider that an LED (emissive) videowall for indoor use will routinely hit 3,000 cd/m2 brightness with full white images, and you can appreciate just how much of a leap HDR represents over current imaging technology. What’s more significant is how quickly the prices for HDR displays are coming down, particularly as Chinese TV manufacturers enter the marketplace.

Just prior to the Super Bowl – the best time to score a deal on a new TV, by the way – it was possible to purchase a 55-inch ‘smart’ Ultra HDTV for just $499 from a Tier 1 manufacturer. And a 65-inch model with basic HDR (static metadata) could be had from a Chinese brand for less than $700, while a Tier 1 HDR model of the same screen size was less than $900.

I mentioned wide color gamut earlier. It stands to reason that if a camera can capture a much wider range of luminance values, it can also record a much wider range of color shades. And that’s exactly what winds up happening. With the current 8-bit color system widely in use for everything from broadcast and cable television to Blu-ray discs and streaming media, a total of 16.7 million colors can be represented.

With HDR and WCG, the playing field is expanded considerably and now requires 10 bits per color pixel, resulting in 1,073,741,824 colors – over 1 billion color shades! That’s too much heavy lifting for LCD displays that use white LEDs with color filters, but it’s within reach of quantum dot LCDs and OLEDs.
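The color counts quoted above fall directly out of the bit depth; here is the arithmetic for anyone who wants to check it:

```python
# Three primaries at b bits each yield 2**(3*b) distinct color values.

def color_count(bits_per_primary: int) -> int:
    return 2 ** (3 * bits_per_primary)

if __name__ == "__main__":
    print(f" 8-bit: {color_count(8):,} colors")    # 16,777,216
    print(f"10-bit: {color_count(10):,} colors")   # 1,073,741,824
    print(f"12-bit: {color_count(12):,} colors")   # 68,719,476,736
```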

The availability of HDR/WCG content has also forced a speed upgrade to display interfaces. HDMI 1.3/1.4 simply can’t handle a 4K HDR signal, so we must use HDMI 2.0 to do the job. And even version 2.0 is barely fast enough – if the 4K video signal uses lower color resolution (4:2:0, 4:2:2), then it can transport HDR signals at refresh rates up to 60 Hz. But switch to RGB (4:4:4) color mode – such as we’d see with 4K video from a computer video card – and HDMI 2.0 can’t pass a 60 Hz signal with anything more than 8-bit color.

On the DisplayPort side, things are somewhat more accommodating. Version 1.2 (the current one) can pass a 3840x2160p/60 signal with 10-bit RGB (4:4:4) color, but nothing more. The newest DP version – 1.3 – raises its maximum speed to 32.4 Gb/s, which makes imaging 12-bit and even 16-bit 4K HDR content possible. However, version 1.4 is required to recognize the HDR ‘flags’ that travel with the content and must be passed on to the display. (HDMI uses extensions for HDR and WCG, with ‘a’ used for static HDR metadata and ‘b’ used for dynamic metadata.)
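Here is a rough bandwidth check behind those HDMI 2.0 and DisplayPort comparisons. The numbers ignore blanking intervals and link-layer encoding overhead, so treat them as directional estimates rather than spec citations.

```python
# Approximate pixel payload for a video signal, ignoring blanking and encoding overhead.
# chroma_factor: 3.0 for RGB/4:4:4, 2.0 for 4:2:2, 1.5 for 4:2:0.

def signal_gbps(width: int, height: int, hz: float, bits_per_primary: int,
                chroma_factor: float = 3.0) -> float:
    return width * height * hz * bits_per_primary * chroma_factor / 1e9

if __name__ == "__main__":
    print(f"4K/60 4:2:0 10-bit ≈ {signal_gbps(3840, 2160, 60, 10, 1.5):.1f} Gb/s")  # ~7.5
    print(f"4K/60 4:4:4  8-bit ≈ {signal_gbps(3840, 2160, 60, 8, 3.0):.1f} Gb/s")   # ~11.9
    print(f"4K/60 4:4:4 10-bit ≈ {signal_gbps(3840, 2160, 60, 10, 3.0):.1f} Gb/s")  # ~14.9
    # Add blanking and encoding overhead and the 10-bit RGB case overruns HDMI 2.0's
    # ~18 Gb/s link, while DisplayPort 1.3's 32.4 Gb/s handles it with room to spare.
```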

Marketing folks have a field day confusing people with new display tech and apparently they’re going to town with HDR. We’re now hearing about “HDR-compatible” products, particularly in signal interfacing. Nothing to see here, folks – if the signal distribution and switching equipment is fast enough to pass the required clock rate and hands over HDR metadata (CTA-861.3) to the display without alteration, then it is indeed “HDR compatible.” Simple as that.

I return to my original question: Will HDR have an impact on our industry? The answer is an emphatic “yes!” There are many customers that would realize a benefit from HDR imaging – medical, surveillance, military, research, virtual reality, and simulation verticals will embrace it pretty quickly, and others will follow.

Three Premium 2017 LCD-TVs Plot Different Paths to Enhanced Performance

The white LEDs used in conventional LCD-TV backlight units — which are actually blue LEDs with a broadband yellow phosphor inside the package — produce a white light that has insufficient output and purity in the green and red. All current approaches to improving LCD-TV color gamut and purity involve improving the intensity and purity of the red and green light that is produced by the BLU and makes its way to the matrix color filter.

Interestingly, three premium 2017 sets now on the market adopt completely different approaches to solving this problem.

Samsung’s “QLED” Q Series uses a new type of quantum dot (QD) for converting some of the light from the backlight’s blue LEDs to red and green. The novelty here is not the architecture of the QD-enhanced backlight, but the design of the Nanosys-developed QDs. Samsung says the QDs are fabricated with a metal alloy and end up with a light-transmissive metal alloy shell. Industry sources say the fabrication process produces “perfectly spherical” dots, which produces a purer chromatic emission.

The top-of-the-line Q9 model delivers a peak white luminance of more than 2000 nits and a color gamut of more than 100% DCI P3. The luminance of the Q7 and Q8 models is slightly less. In side-by-side comparisons with an OLED-TV and a non-QD LCD-TV, the Q9 was impressive. And it should be. With an MSRP of nearly $6000 for the 65-inch version, this is a super-premium TV that costs only a thousand dollars less than the 65-inch LG Signature OLED TV G.

These three premium TV sets use different technologies for increasing red and green intensity and purity. Data from various sources, some of which requested anonymity. (Table: Nutmeg Consultants)

LG does not use QDs in any of its LCD-TV sets, but it had to do something if it was to compete in the new world of high-dynamic-range and enhanced-color-gamut premium LCDs. LG’s solution is “Nano Cell” technology, which panel supplier LG Display calls “IPS Nanocolor I.” The technology consists of a film having a uniform distribution of 1- to 2-nanometer particles. The film, which sits beneath the front polarizer, acts as an additional color filter to produce an RGB distribution with increased purity. The film also appears to act as a diffuser for improved viewing angle. Says LG: “accurate colors maintained at any angle…colors look truer and black levels look deeper.”

But the additional filter necessarily absorbs light, making for a less efficient display. LGD claims 90% of DCI P3, while an industry source says the peak white luminance is less than 450 nits. LGD says the color gamut at an 80° viewing angle is 91% of what it is when viewed straight on, a benefit of the Nano Cell film. The MSRP of a 65-inch Model SJ8500 is just shy of $2800.

Samsung reserves the Q and QLED brands for sets with QDs in them, but it needed a premium series to fit below the Q sets at more affordable pricing. This role is filled by the MU series, which gets its improved chromatic performance from using red and green phosphors in the LED packages rather than broadband yellow. This is not a new idea in shipping TV sets. Sharp, calling it SPECTROS, used the technology in its 2015 premium UH30 series, and claimed a color gamut of 100% DCI.

My sources tell me that Samsung gets just a bit more than 94% DCI, together with a 1000-nit peak white luminance, in its MU-8500 models, which carry an MSRP of just under $2200.

These three technologies don’t exhaust the possible approaches for further enhancing LCD-TVs. LGD is working on “IPS Nanocolor II,” which will place a film containing red and green phosphor nano-particles between the backlight and the LCD cells.

Dot-on-chip, which places the QDs right on the LED chip, is under active development. The issue is that QDs are sensitive to heat and high luminous flux, but TCL’s Ranjit Gopi says that TCL’s product plans include a dot-on-chip set “sooner rather than later.”

And using quantum dots to replace the color filter for an estimated doubling of LC module efficiency is also under active development. The issue here is that to be patterned for an MCF replacement, the dots must be stable in air. However, both Nanosys and Crystalplex have demonstrated air-stable technologies.

So, now and in the future, we have multiple technologies for enhancing LCD-TV performance substantially. In light of these developments, will OLED be able to retain its reputation as the display technology par excellence? We’ll save the answer to that question for a future post.

Ken Werner is Principal of Nutmeg Consultants, specializing in the display industry, manufacturing, technology, and applications, including mobile devices and television. He consults for attorneys, investment analysts, and companies re-positioning themselves within the display industry or using displays in their products. He is the 2017 recipient of the Society for Information Display’s Lewis and Beatrice Winner Award. You can reach him at kwerner@nutmegconsultants.com.

There’s More To The Story (There Always Is!)

A couple of weeks ago, I posted a story about a particularly irritating problem I was having with my high-speed Comcast internet service dropping out. After lots of troubleshooting, I thought I had cornered the culprit – a Samsung-manufactured set-top box that Comcast was using to deliver basic Xfinity services (no DVR).

When I connected the cable TV input connection to a spectrum analyzer, I saw some pretty nasty bursts of spectral noise that ranged from 11 MHz all the way up to 400 MHz. I figured this might have been the cause of the dropouts and promptly returned it to the local Comcast Store, only to find out that particular model wasn’t in use any more and that I’d have a much smaller, flatter version to take home – one that would link automatically to my main X1 DVR.

Coincidentally, the dropout problem stopped, so I did what any other reasonable person would do: I assumed that was the end of it.

Except it wasn’t. A few days after I replaced the box, the WAN connection started dropping out again. Some days it dropped only a couple of times, but on April 13, it dropped out almost fourteen times in four hours. Out came the test equipment (and plenty of expletives) as I started testing every line in the house, taking more and more things off-line.

At one point, the only thing connected to the Comcast drop was my wireless gateway and my spectrum analyzer, through a brand-new 2-way splitter good to 1.5 GHz. Sure enough, the WAN connection dropped again – but this time, I caught something on the analyzer I hadn’t seen before.

Figure 1. The entire system (two cable boxes and a wireless gateway) is working just fine with these signals.

Figure 1 shows the ‘normal’ levels of QAM carriers coming through the drop. There’s a little up-and-down there, but the entire system – in particular, the downstream QAM carriers above 650 MHz – all measured at least 32 dB above the noise floor (about -87 to -88 dBm). In this condition, the wireless gateway was chugging along just fine and broadband speeds were pretty fast.
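For context, the margin figure is just the difference between carrier level and noise floor; here is a small helper with illustrative values (not my exact analyzer readings):

```python
# Carrier-to-noise margin is simply the difference between the two levels in dB.

def margin_db(carrier_dbm: float, noise_floor_dbm: float) -> float:
    return carrier_dbm - noise_floor_dbm

if __name__ == "__main__":
    # A QAM carrier near -55 dBm over a -87 dBm noise floor gives the ~32 dB margin
    # described above (illustrative numbers, not the exact readings).
    print(f"{margin_db(-55.0, -87.0):.0f} dB")
```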

Just ten minutes later – while I was watching the analyzer screen – the QAM carriers from 50 MHz through 400 MHz dropped precipitously, as seen in Figure 2. Right on schedule, the WAN connection stopped working! Yet, I hadn’t touched, changed, or re-wired anything. This was starting to look like a classic ghost in the machine, and likely an issue outside my house. (Yes, the Samsung box did need to be replaced in any case – it was quite dirty, RF-wise.)

Figure 2. Ten minutes later, KABOOM – the WAN connection dropped and my analyzer showed some nasty QAM waveforms below 400 MHz. What was causing this?

Well, after escalating this problem to Comcast’s Special Operations unit (yes Virginia, they do have Special Ops guys), I was visited by Jason Litton and Fredrick Finger of Comcast. I asked them to replace the DC block/ground block outside the house and also to sweep the underground cable coming back to the house. I had previously gone out to check the ground block and discovered (a) it wasn’t grounded (the wire had come loose) and (b) there was a tiny bit of play in the connections at either end, which I tightened up before they arrived.

Long story short, the block was eventually replaced, re-grounded, and new connectors were installed at either end of the underground drop. During testing, Jason spotted noise coming from a neighbor’s coaxial drop and proceeded to install several more new connectors. I also took the opportunity to have them put in two brand-new splitters in my basement (overkill, but what the heck) and run a new coaxial line to my workbench.

And that finally did the trick. Whatever phantom was haunting my system had finally been exorcised for good. Using Comcast’s broadband speed test to New Castle, Delaware and Secaucus, New Jersey, I saw wired LAN and 5 GHz 802.11ac download speeds hitting 100 Mb/s. Using the popular TestMy.net server in Dallas, Texas, I measured download speeds around 30 – 48 Mb/s. Upload speeds to all servers were in the range of 10 – 12 Mb/s.

Figure 3. The final setup with all QAM levels where they should be!

 

Figure 4. Best of all, there’s no noise below 50 MHz on the upstream channels. FINALLY!

So what was the culprit? Most likely the cheapest thing in the system – the DC block. Noise from the Samsung STB didn’t help, and apparently neither did the noise coming from my neighbor’s cable drop. But the block probably had an intermittent connection and was creating some nasty standing waves, causing tilt on the lower QAM carriers and noise at the upstream frequencies around 30 MHz.

I’ll have more details on this unfortunate series of events during my RF/Wireless class at InfoComm in June. Until then, things are working well (knock on wood, or metal, or coax, or modem…)