Posts Tagged ‘4K’

High Dynamic Range: It’s Here!

Ever since the launch of high definition television in the 1990s, it seems as if some new ‘bell and whistle’ enhancement comes along every few years. First it was the changeover to flat screen plasma displays in the late 1990s, followed by a shift to 1080p and Wide UXGA resolution in the early 2000s.

The industry transitioned to liquid-crystal display (LCD) panels for TVs and monitors a few years later. UHD (4K) imaging popped into our consciousness in 2012. And of course, 3D made its “once every few sunspot cycles” appearance in 2009, followed by virtual reality last year.

Some of these trends actually stuck, like 4K: Display manufacturers are slowing down production of Full HD (1920×1080) display panels in favor of UHD (3840×2160) as consumers increasingly choose the higher resolution. That, in turn, means that the displays we select for digital signage, classrooms, meeting rooms, and other applications will also be of the 4K variety.

The latest trend to rear its head is high dynamic range (HDR), which is accompanied by wide color gamut (WCG) imaging. In a nutshell, HDR means a greatly expanded range of tonal values that can be shown in still and moving images. Conventional cameras and imaging systems can capture anywhere from 9 to 11 f-stops of light. (Each f-stop increase represents a luminance value twice as bright as the previous one.)

HDR takes that to a higher level, capturing as many as 22 f-stops of light. Reproducing those tonal values is a real challenge for displays that employ conventional backlight or illumination systems; hence, we are now seeing a new crop of LCD TVs with turbocharged backlights to reproduce the extreme dynamic range of HDR images. On the emissive display side, organic light-emitting diode (OLED) TVs can also reproduce HDR content, although at lower peak brightness levels.
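Since each stop doubles the captured luminance range, the contrast ratio these figures imply is just a power of two. A quick sketch in Python, using the stop counts from above:

```python
# Each f-stop doubles the captured luminance range, so the implied
# contrast ratio is simply 2 ** stops.
for stops in (11, 22):
    print(f"{stops} stops -> {2 ** stops:,}:1")

# 11 stops -> 2,048:1       (conventional capture)
# 22 stops -> 4,194,304:1   (HDR capture)
```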

At NAB 2017, NEC showed this 4K HDR encoder prototype, streaming 77 Mb/s with 99 ms latency.

For some perspective, the venerable CRT display had a peak brightness level somewhere around 29 foot-lamberts (100 candelas per square meter), which represented close to 100% diffuse white. In an HDR display, that value largely holds for diffuse white, but more intense specular highlights (like the sun reflecting off a pane of glass or the water, or a bright streetlight at night) can hit peaks in the thousands of cd/m2.
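(The foot-lambert and cd/m2 figures are the same measurement in two unit systems; for anyone who wants to check the math, 1 fL ≈ 3.4263 cd/m2:)

```python
# Unit check: 1 foot-lambert = 3.4263 cd/m^2 ("nits").
NITS_PER_FL = 3.4263
print(100 / NITS_PER_FL)   # ~29.2 fL for 100 cd/m^2 -- the classic CRT diffuse white
```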

And HDR isn’t just about extreme brightness. The entire grayscale is expanded, so we should see more shadow details along with intense specular light sources. When done correctly, HDR images are quite the departure from ‘everyday’ HDTV, and more closely resemble the range of tonal values our eyes can register – with their visual contrast ratio approaching 1,000,000:1.

There are numerous ways to achieve higher levels of brightness. Dense arrays of light-emitting diodes can do it when used in a direct-illumination architecture. However, the favored approach is to employ a special optical film embedded with nano-sized red and green quantum dot particles, stimulated by an array of blue LEDs. 2017 TV models using this approach can achieve peak small-area brightness values of 2,000 cd/m2.

For perspective, consider that an LED (emissive) videowall for indoor use will routinely hit 3,000 cd/m2 brightness with full white images, and you can appreciate just how much of a leap HDR represents over current imaging technology. What’s more significant is how quickly the prices for HDR displays are coming down, particularly as Chinese TV manufacturers enter the marketplace.

Just prior to the Super Bowl – the best time to score a deal on a new TV, by the way – it was possible to purchase a 55-inch ‘smart’ Ultra HDTV for just $499 from a Tier 1 manufacturer. And a 65-inch model with basic HDR (static metadata) could be had from a Chinese brand for less than $700, while a Tier 1 HDR model of the same screen size was less than $900.

I mentioned wide color gamut earlier. It stands to reason that if a camera can capture a much wider range of luminance values, it can also record a much wider range of color shades. And that’s exactly what winds up happening. With the current 8-bit color system widely in use for everything from broadcast and cable television to Blu-ray discs and streaming media, a total of 16.7 million colors can be represented.

With HDR and WCG, the playing field is expanded considerably and now requires 10 bits per color channel (30 bits per pixel), resulting in 1,073,741,824 colors – over 1 billion color shades! That’s too much heavy lifting for LCD displays that use white LEDs with color filters, but it’s within reach of quantum dot LCDs and OLEDs.
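Both color counts fall straight out of the bit depth; a two-line check:

```python
# Total RGB colors = 2 ** (bits per channel * 3 channels).
for bits in (8, 10):
    print(f"{bits}-bit: {2 ** (bits * 3):,} colors")

# 8-bit:  16,777,216 colors
# 10-bit: 1,073,741,824 colors
```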

The availability of HDR/WCG content has also forced a speed upgrade to display interfaces. HDMI 1.3/1.4 simply can’t handle a 4K HDR signal, so we must use HDMI 2.0 to do the job. And even version 2.0 is barely fast enough: if the 4K video signal uses lower color resolution (4:2:0 or 4:2:2), it can transport HDR signals at frame rates up to 60 Hz. But switch to RGB (4:4:4) color mode – such as we’d see with 4K video from a computer video card – and HDMI 2.0 can’t pass a 60 Hz signal with anything more than 8-bit color.

On the DisplayPort side, things are somewhat more accommodating. Version 1.2 (the current one) can pass a 3840x2160p/60 signal with 10-bit RGB (4:4:4) color, but nothing more. The newest version, DP 1.3, raises the maximum speed to 32.4 Gb/s, which makes 12-bit and even 16-bit 4K HDR content possible. However, version 1.4 is required to recognize the HDR ‘flags’ that travel with the content and must be passed on to the display. (HDMI handles HDR and WCG through extensions to version 2.0, with 2.0a adding static HDR metadata support and 2.0b extending HDR signaling further.)
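These interface claims are easy to sanity-check: uncompressed data rate is just pixel clock times bits per pixel. The sketch below uses the CTA-861 4K/60 pixel clock (594 MHz), DisplayPort’s reduced-blanking clock for the same raster (roughly 533 MHz), and approximate payload capacities after line-coding overhead (HDMI 2.0 ≈ 14.4 Gb/s, DP 1.2 ≈ 17.28 Gb/s, DP 1.3 ≈ 25.92 Gb/s); treat the exact figures as back-of-the-envelope numbers:

```python
# Uncompressed video data rate = pixel clock x bits per pixel (3 channels).
def gbps(pixel_clock_mhz, bits_per_channel):
    return pixel_clock_mhz * 1e6 * bits_per_channel * 3 / 1e9

print(gbps(594, 8))       # ~14.3 Gb/s -> just fits HDMI 2.0's ~14.4 Gb/s payload
print(gbps(594, 10))      # ~17.8 Gb/s -> too fast for HDMI 2.0
print(gbps(533.25, 10))   # ~16.0 Gb/s -> fits DP 1.2 (~17.28 Gb/s), with little headroom
```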

Marketing folks have a field day confusing people with new display tech, and apparently they’re going to town with HDR. We’re now hearing about “HDR-compatible” products, particularly in signal interfacing. Nothing to see here, folks: if the signal distribution and switching equipment is fast enough to pass the required clock rate and hands the HDR metadata (CTA-861.3) over to the display without alteration, then it is indeed “HDR compatible.” Simple as that.

I return to my original question: Will HDR have an impact on our industry? The answer is an emphatic “yes!” There are many customers that would realize a benefit from HDR imaging – medical, surveillance, military, research, virtual reality, and simulation verticals will embrace it pretty quickly, and others will follow.

InfoComm 2016 In The Rearview Mirror

Another InfoComm show has come and gone. This was my 23rd InfoComm, and it’s hard to imagine, when I first set foot in Anaheim way back in 1994 – ostensibly to cover the now-defunct Projection Shoot-Out – that I’d still be making the treks to Orlando and Las Vegas, let alone teaching classes and joining the InfoComm faculty.

For this recap, I’ll focus on trends I saw at the show that will continue to impact our industry for some time to come. And there were plenty of them, everywhere you looked.

First off: I’ve been saying for several years now that software is becoming more important than hardware in our industry (and across all market segments – look at how inexpensive Ultra HDTVs have become already), and that we’d start to see less focus on expensive hardware and more emphasis on software and managed services.

And that’s exactly what I spotted in Las Vegas. Astute observers noticed that the once humongous booths set up by the likes of Sony, Panasonic, Crestron, LG, Samsung, Hitachi, and other companies have gotten a bit smaller. (NEC, Da-Lite, and Christie were exceptions to the rule.)

AMX, when it was a stand-alone company, used to have an enormous booth at the show (not to mention a huge party every year). Now, AMX is limited to a few small stands within the Harman booth. Walk the show floor these days and you’ll recognize other once-mighty brands that have been acquired by holding companies and now occupy much smaller footprints.

And this trend shouldn’t be any surprise. When hardware used to sell for four and five figures (and in some cases, six figures), you could justify those million-dollar booths that looked like mini-malls. (Remember the huge tented Sanyo projector booths?) But that’s not the case anymore.

Kramer’s huge booth at InfoComm, touting a shift away from “big boxes” to software and the cloud, was one of the exceptions to the trend to go smaller.

LG is doing some very cool things with curved displays, thanks to advancements in OLED and LCD manufacturing.

Practically speaking, how much real estate do you need to talk about software programs and managed services? The same thing is happening at NAB, where once humongous companies like Harris (now Imagine) are largely touting services and not hardware.

Even Digital Projection has scaled back its enormous multi-tier InfoComm booth. And projectiondesign has shed some square footage since being acquired by Barco, which has itself gone on a square footage diet. Ditto Sharp, which had one of the smallest booths ever at this show, perhaps related to the company’s ongoing financial challenges.

Surprisingly, Toshiba proved there is indeed a second act, showing up with a nice-size booth full of LCD monitors for tiled display walls. That’s not exactly an easy market to compete in, what with LG, Samsung, and NEC having big footprints. But they’re giving it a shot.

Toshiba has re-entered the super-competitive world of display walls…a market it once dominated 20 years ago.

The “surfer dude engineers” from Santa Barbara have a very nice 4K-over-IP encoder/decoder line-up!

Another trend that’s really picking up speed is the move away from projection lamps to solid-state illumination systems, most often lasers with color phosphor wheels. The availability of large, inexpensive LCD displays has cut deeply into sales of projectors – particularly in small classrooms and meeting rooms, where we used to put in “hang and bang” projection systems.

If you talk to people who’ve made the switch away from projection to direct-view, the reason they most frequently cite is that they don’t have to change out lamps anymore, and the LCD displays can be used under normal room lighting and turn on instantly.

Well, projector manufacturers have gotten the message and are moving en masse to solid state light sources. Early adopters like Casio have reaped the benefits, but now everyone from Sony and Panasonic to Vivitek and Optoma is on board.

Even so, the corner wasn’t really turned until this year when Epson – one of the big manufacturers of projection lamps – showed a 25,000-lumen 3LCD projector powered by a laser light engine. And I saw more than one UHD-resolution projector using the laser-phosphor combination, even in ultra-short throw configurations.

Epson finally got religion and showed its first laser/phosphor 3LCD projector this year – a 25,000-lumen model.

And Panasonic harnessed laser/phosphor technology in a new high-brightness 4K projector.

How much longer will we be changing out lamps? I don’t think it will be more than a few years before the majority of projectors offered for sale will use laser or LED light engines (or both). There will be exceptions for certain models, but for all intents and purposes, short-arc lamps are toast.

Here’s another trend – LED walls. I tried to count all of the LED wall exhibitors at InfoComm and lost track after wandering through the North Hall. Just about every single one was based in China, with names you would not recognize. Were they looking for U.S. dealer/distributor partners? It’s not likely many would pick up customers here, and that may be why Leyard (another Chinese manufacturer) bought Planar last year – everyone knows who Planar is.

I also saw LED walls with pixel pitches as fine as 0.9mm. That’s approaching the pixel pitch of a 50-inch 1366×768 plasma monitor! And if anyone continues to go big with their booths, it’s the LED wall manufacturers. (Not like they have any choice!) Leyard’s 100’+ 8K LED wall was a perfect example of why bigger is still better when it comes to a booth.
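(A quick pitch calculation, assuming a 16:9 panel, shows why that comparison holds up – the plasma works out to roughly 0.8mm:)

```python
import math

# Pixel pitch of a flat panel from its diagonal size and resolution.
def pitch_mm(diag_inches, h_pixels, v_pixels):
    aspect = h_pixels / v_pixels
    width_inches = diag_inches * aspect / math.hypot(aspect, 1)
    return width_inches * 25.4 / h_pixels

print(pitch_mm(50, 1366, 768))   # ~0.81 mm -- right in fine-pitch LED wall territory
```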

And Sony’s CLEDIS 8K x 2K LED wall showed just how far this technology has come, creating what appeared to be a perfectly seamless, pixel-free panoramic image that dazzled with bright, super-saturated colors.

Sony’s CLEDIS 8K x 2K LED wall did an excellent job of hiding its seams – and pixels.

Planar (Leyard) is building some amazingly big and bright display walls. And they’ve got 8K resolution, too, thanks to using 16 2K panels.

The Chinese dominance in LED displays shouldn’t be surprising. They’re moving to a similar level in the manufacturing of LCD panels, monitors, and televisions, undermining the Korean manufacturers (who undermined the Japanese, who took our U.S.-based television business away in the 1980s).

In fact, so much of our hardware is fabricated, soldered, and assembled in China and Southeast Asia these days that it should be no surprise prices have dropped as much as they have. Back in the day, a quality line doubler (remember those?) would set you back as much as $5,000 to $8,000. Today, you can buy a compact scaler that works to 1080p and Wide UXGA for a few hundred bucks.

My last trend has to do with the slow migration of video and audio signal distribution and switching away from hardware-intensive platforms based on display interface standards to software-based platforms that use IT switches, encoders, and decoders. Wow, did I spot a lot of those products at the show, even from some previously-vigorous defenders of HDMI-based architectures.

The interest in learning how to move to an “open” IP-type AV distribution architecture must be considerable: I taught a class on AV-over-IP this year at InfoComm and was astounded to see that 185 people had signed up to attend. And there were very few no-shows, as I found out when I had attendees sitting on the floor and standing along the back wall for almost the entire 90-minute class.

You know there’s considerable interest in AV-over-IP when these guys show up.

RGB Spectrum’s new Zio AV-over-IP system has one of the most user-friendly interfaces I’ve seen to date – touch and swipe to connect video streams.

What’s more, a substantial portion of those attendees came from the higher education market segment, and an informal poll revealed that most of them were still upgrading from older analog systems to all-digital infrastructure. In essence, they were telling me that they preferred to skip past HDMI-based solutions and move directly to an IP-type solution.

Hand-in-hand with this discovery came more responses about transitioning to app-based AV control systems and away from proprietary, code-based control that requires specialized programming. Well, there were a few companies showing app-based AV control products in Vegas that had super-simple GUIs – software that just about anyone could learn to use in a few hours.

Throw in the accelerating transition to UHD resolution displays (they’ll largely replace Full HD within a year), and you have some very interesting times in store for the AV industry as this decade winds on…

AV-over-IP: It’s Here. Time To Get On Board!

At InfoComm next week in Las Vegas, I look forward to seeing many familiar faces – both individuals and manufacturers – that have frequented the show since I first attended over 20 years ago. And I also expect to find quite a few newcomers, based on the press releases and product announcements I’ve been receiving daily.

Many of those newcomers will be hawking the latest technology – AV-over-IP. More specifically, transporting video, audio, and metadata that are encoded into some sort of compressed or lightly-compressed format, wrapped with IP headers, and transported over IP networks.

This isn’t exactly a new trend: The broadcast, telecom, and cable/satellite worlds have already begun or completed the migration to IT infrastructures. The increasing use of optical fiber and lower-cost, fast network switches are making it all possible. Think 10 gigabit Ethernet with single-mode fiber interconnections, and you can see where the state-of-the-art is today.

You’ve already experienced this AV-over-IP phenomenon if you watch streaming HD and 4K video. Home Internet connection speeds have accelerated by several orders of magnitude since the first “slow as a snail” dial-up connections got us onto AOL two decades ago. Now, it’s not unusual to have sustained connections of 10, 15, 25, and even 50 megabits per second (Mb/s) to the home – fast enough to stream Ultra HD content with multichannel sound.

And so it goes with commercial video and audio transport. Broadcast television stations had to migrate to HD-SDI starting nearly 20 years ago when the first HDTV broadcasts commenced. (Wow, has it really been that long?) Now, they’re moving to IP and copper/fiber backbones to achieve greater bandwidth and to take advantage of things like cloud storage and archiving.

So why hasn’t the AV industry gotten with the program? Because we still have a tendency to cling to old, familiar, and often outdated or cumbersome technology, rationalizing that “it’s still good enough, and it works.” (You know who you are…still using VGA and composite video switching and distribution products…)

I’ve observed that there is often considerable and continual aversion in our industry to anything having to do with IT networks and optical fiber. And it just doesn’t make any sense. Maybe it originates from a fear of losing control to IT specialists and administrators. Or, it could just be a reluctance to learn something new.

The result is that we’ve created a monster when it comes to digital signal management. Things were complicated enough when the AV industry was dragged away from analog to digital and hung its hat on the HDMI consumer video interface for switching and distribution. Now, that industry has created behemoth switch matrices to handle the current and next flavors of HDMI (a format that was never suitable for commercial AV applications).

We’ve even figured out a way to digitize the HDMI TMDS signal and extend it using category wire, up to a whopping 300 feet. And somehow, we think that’s impressive? Single-mode fiber can carry an HD video signal over 10 miles. Now, THAT’S impressive – and it’s not exactly new science.

So, now we’re installing ever-larger racks of complex HDMI switching and distribution gear that is expensive and also bandwidth-capped – not nearly fast enough for the next generation of UHD+ displays with full RGB (4:4:4) color, high dynamic range, and high frame rates. How does that make any sense?

What’s worse, the marketing folks have gotten out in front, muddying the waters with all kinds of nonsensical claims about “4K compatibility,” “4K readiness,” and even “4K certified.” What does that even mean? Just because your switch or DA product can support a very basic level of Ultra HD video with slow frame rates and reduced color resolution, it’s considered “ready” or “certified?” Give me a break.

Digitizing HDMI and extending it 300 feet isn’t future-proof. Neither is limiting Ultra HD bandwidth to 30 Hz 8-bit RGB color, or 60 Hz 8-bit 4:2:0 color. Not even close. Not when you can already buy a 27-inch 5K (yes, 5K!) monitor with 5120×2880 resolution and the ability to show 60 Hz 10-bit color. And when 8K monitors are coming to market.

So why do we keep playing tricks with specifications and working with Band-Aid solutions? We shouldn’t. We don’t need to. The answer is already at hand.

It’s time to move away from the concept of big, bulky, expensive, and basically obsolete switching and distribution hardware that’s based on a proprietary consumer display interface standard. It’s time to move to a software-based switching and distribution concept that uses an IT structure, standard codecs like JPEG2000, M-JPEG, H.264, and H.265, and everyday off-the-shelf switches to move signals around.

Now, we can design a fast, reliable AV network that allows us to manage available bandwidth and add connections as needed. Our video can be lightly compressed with low latency, or more highly compressed for efficiency. The only display interfaces we’ll need will be at the end points where the display is connected.
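To make the software-based switching idea concrete, here’s a toy Python sketch of the multicast model that IP-based AV systems typically use. The group address, port, and payload are made up for illustration; a real system would carry compressed video in RTP packets:

```python
import socket, struct

GROUP, PORT = "239.1.1.1", 5004   # hypothetical multicast group for one AV stream

# Decoder side: join the multicast group. On a managed network, the Ethernet
# switch forwards the stream only to ports that have joined (IGMP snooping).
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
rx.bind(("", PORT))
mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
rx.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

# Encoder side: push a packet of (notionally compressed) video to the group.
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
tx.sendto(b"one packet of H.264 video", (GROUP, PORT))

print(rx.recv(2048))
```

The point: “switching” a display to a different source is just leaving one multicast group and joining another – the network switch, not a crosspoint matrix, does the routing.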

Even better, our network also provides access to monitoring and controlling every piece of equipment we’ve connected. We can design and configure device controls and interfaces using cloud-based driver databases. We can access content from remote servers (the cloud, again) and send it anywhere we want. And we can log in from anywhere in the world to keep tabs on how it’s all functioning.

And if we’re smart and not afraid to learn something new, we’ll wire all of it up with optical fiber, instead of bulky copper cables or transmitter/receiver pairs that convert the signals back and forth. (Guess what? AV-over-IP is already digital! You can toss out those distance-limited HDMI extenders, folks!)

For those who apparently haven’t gotten the memo, 40 Gb/s network switches have been available for a few years, with 100 Gb/s models now coming to market. So much for speed limit issues…

To the naysayers who claim AV-over-IP won’t work as well as display interface switching: That’s a bunch of hooey. How are Comcast, Time Warner, NBC, Disney, Universal, Netflix, Amazon, CBS, and other content originators and distributors moving their content around? You guessed it.

AV-over-IP is what you should be looking for as you walk the aisles of the Las Vegas Convention Center, not new, bigger, and bulkier HDMI/DVI matrices. AV-over-IP is the future of our industry, whether we embrace it or are dragged into it, kicking and screaming.

Are you on board, or what?

CES 2016: Some Second Thoughts

We’re almost a month removed from the 2016 International CES, which was quite the crowded bazaar of electronic gadgets. I’ve already reported on what I saw at the show; now, I want to take a few minutes to do some “Monday morning quarterbacking.”

Quarterly reports came in this week from two of the CE world’s titans – Apple and Samsung – and they aren’t very rosy. In fact, both companies are predicting a slowdown in sales of smartphones, which was arguably the hottest CE category over the past six years (even more so than televisions). Although shipments of smartphones are predicted to rise this year, consumer demand for them is in decline.

That shouldn’t be surprising. I bought a Samsung Galaxy V in December of 2014 and it’s still serving me well. In fact, it can do more things than I need, so I’m not likely to replace it when my service contract expires this coming December. (Yep, I’m one of a dying breed of two-year service contract holders!) And I suspect that many other smartphone owners feel the same way.

Tablets were also supposed to be hot prospects for 2015, with some analysts predicting 18% year-over-year growth. Yet, tablet shipments actually went into decline, while sales of laptop computers exceeded predictions. Once again, if you have a tablet that’s a couple of years old, there’s no real reason to replace it unless the battery goes dead.

The only drawback with some of these products is inadequate storage capacity. Most phones and tablets start with 16 GB of storage, expandable with microSD cards. Yet, given how quickly apps and downloads can gobble up that space, it’s wiser to start with 32 GB or even 64 GB these days. After all, memory is cheap (unless you buy it from Apple).

So – mobile devices aren’t providing the stellar sales and returns we all hoped for. How about televisions?

There’s no question that shipments and sales of 1080p TVs are in a slow decline, and have been for a few years. Practically speaking, if you bought a big (46” and larger) “smart” Full HD LCD TV in the past five years, you already have fast Wi-Fi connectivity, Netflix and possibly Amazon streaming, and three or four HDMI inputs – most of which you’re probably not using, if you stream video.

So why would you shell out money for a new Full HDTV? You wouldn’t, except that you can now buy a much larger screen for the money. But that’s not what’s happening – people are opting to move up to Ultra HD resolution, as the prices for these sets have just about reached parity with same-size Full HDTVs. And not surprisingly, Ultra HDTV sales have been strong and are growing by double digits each year. Still a small portion of overall TV shipments, but essential to the bottom line of Samsung (37% UHDTV market share through June 2015), LG (17% share), and Sony (10% share).

What’s new this year is a stronger presence from China Inc. brands, notably TCL and Hisense. The former acquired the Sanyo brand and factory from Panasonic, while the latter now owns Sharp’s US TV business and a former assembly plant in Mexico.

Excepting Ultra HDTV, it’s very difficult to make any money in the TV biz these days. What we’re seeing is more manufacturing and display panel sourcing from China, as the quality of LCD panels for TVs made at BOE, CSOT, Hisense, and TCL is very good. (And they’re cranking out Ultra HD panels, too.)

2016 will be the year that OLED TV technology finally goes mainstream. LG has placed some big bets on their white OLED / RGBW process and is also selling OLED panels to five of the largest Chinese TV manufacturers. Prices continue to fall stateside; LG just announced a Super Bowl promotion through February 13 that will snag you a 55-inch Full HD curved set for $1,999 and a flat or curved 55-inch Ultra HD model for $2,999.

OLEDs are already in wide use in smartphones and tablets (both my Samsung tablet and smartphone use them) and we’re seeing them in smart watches, too. LG Display’s demonstrations of super-curved, warped, and roll-up OLED displays at CES show the promise of this technology for mobile displays, particularly in transportation applications.

For displays, we can expect more of the same in 2016 – ever-larger TVs at lower prices, as retailers try to stir up sales of what has become a disposable commodity. You can buy a 50-inch Hisense Full HD set now for $399, amazingly, and 42-inch TVs are getting ever closer to the $200 price barrier.

So what’s going to change? It will take a while, but the 60 GHz wireless technology demos I saw in Las Vegas are very promising. Imagine streaming Ultra HD content with high dynamic range from your Ultra HD Blu-ray player to your 65-inch 4K OLED without cables. Or showing video clips from your phone or tablet the same way.

Better yet, how about downloading an HD movie in just 5 to 10 seconds before you travel? It’s possible with the new 60 GHz 802.11ad protocol – as demonstrated by Qualcomm at CES with a bumper crop of tri-band (2.4/5/60 GHz) modems – and a suitably-equipped phone or tablet. This one’s a game-changer, but I don’t think you’ll see many products with this feature until a year from now. Peraso’s aftermarket 60 GHz USB wireless links might help, as they can retrofit to any laptop or desktop computer.
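The arithmetic behind that claim holds up, assuming 802.11ad’s oft-quoted peak rate of about 4.6 Gb/s and a hypothetical 5 GB HD movie file (real-world throughput will be lower):

```python
# Rough transfer-time estimate over an 802.11ad link.
movie_gigabytes = 5      # hypothetical HD movie file size
link_gbps = 4.6          # 802.11ad peak PHY rate; actual throughput is lower
print(movie_gigabytes * 8 / link_gbps)   # ~8.7 seconds
```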

The other category you’ll want to keep your eye on is the Internet of Things. It seems like every gadget has an IP address and can be controlled by an app. Throw in Wi-Fi, and you have home security systems you can install yourself for about $250. Or wireless doorbell cameras, or LED bulbs that double as cameras and motion detectors. (And even alarms that monitor your alarms.)

This continual downward pricing pressure (again, led by Chinese manufacturing) will shift profitability away from hardware to software. Verizon Wireless, the last company to abandon annual service contracts, doesn’t really care what you send over your phone. They just want that recurring monthly revenue stream you generate. (Notice how nobody charges for voice calling and texting anymore, just blocks of data? The increasing use of Wi-Fi for smartphone connectivity has a lot to do with it.)

I’ve said it before, and I’ll say it again: “Hardware is cheap, and anyone can make it.” Software and services are where the growth lies as we enter the second half of this decade. And you’ll see just how low prices can fall a year from now, when you’ll be able to buy a fully-featured smartphone for $300, score a 65-inch Ultra HD “smart” TV with HDR and WCG support for $800, and pick up a 4K “action” camera for less than $150.

May you live in interesting times!

Display Interfacing: Welcome to Babylon

For many years, ‘interfacing’ a video signal meant plugging in a yellow RCA or silver BNC connector that carried composite video. As picture resolution went up, computers became commonplace at work and home, and the term ‘progressive scan’ entered the lexicon, we saw the birth of S-video and then component video (YPbPr and RGB).

So we adapted, building switching and distribution gear that could handle one-, two-, and three-wire formats. All was well and good…until ‘digital’ made its grand entrance about 15 years ago.

Now, we have digital versions of component and RGB video, starting with the Digital Visual Interface (DVI) and moving to the High-Definition Multimedia Interface (HDMI), DisplayPort, and the new superMHL interface that (according to the MHL Alliance) will start appearing on televisions as soon as December.

If I’m a consumer, I mostly don’t care about any of this. As long as I can plug in my set-top box, Blu-ray player, and other gadgets with the right cables I can find at Best Buy, this is just a bunch of alphabet soup.

However, if I’m an integrator (consumer or commercial), then I care VERY much about where all of this is heading. And if I’m paying any attention at all to the growing market for 4K and UHD, then I’m rightfully concerned about the impending problems with interfacing these signals.

superMHL is certainly fast enough to handle UHD. But you can’t find it in pro AV gear yet. Is there a better way?

Consider that:

*Even though HDMI 2.0 was announced in September of 2013 – TWO FULL YEARS AGO – virtually no manufacturer in the pro AV space supports this interface on their switchers and distribution amplifiers. Instead, the vast majority are still providing version 1.4 while claiming these products are “4K compatible” or “4K ready” because version 1.4 is just fast enough to pass an Ultra HD (3840×2160) signal at 30 Hz with 8-bit RGB color. That’s setting the bar kinda low, isn’t it?

*Some computer manufacturers don’t even support HDMI, like Apple (DisplayPort) and Lenovo (also DisplayPort). So, now you have to carry dongles everywhere you go?

*HDMI 2.0 arrives hand-in-hand with a new version of copy protection (HDCP 2.2) which is much more rigorous than versions 1.3 and 1.4. If a valid HDCP key exchange isn’t made within 20 milliseconds, the connection will shut down. Period.

*HDMI 2.0 isn’t fast enough for what UHD is turning out to be – a real departure from 1080p and Wide UXGA, with a move to 10-bit color to support high dynamic range (HDR), wide color gamut (WCG), and high frame rates (HFR). DisplayPort 1.2 can barely support these requirements; DP version 1.3 and superMHL are better positioned to handle the job.

*The intellectual property behind HDMI and superMHL is owned by the same company – Lattice Semiconductor – and whereas once there were clear dividing lines between the two interfaces (MHL was designed originally for smartphones and tablets), they are now competing against each other. I’ve even sat in on presentations where it was explained that both could exist on consumer TVs. (And why would that make sense, again, when neither interface has been widely deployed to date, and one is clearly an improvement over the other?)

You can imagine what this trend is doing to product designers and manufacturers. Sure, HDMI is a “safe bet” for now, but what if our UHD needs quickly outstrip its maximum clock speed? DP is certainly faster, and there appears to be more support for it from computer manufacturers. But superMHL is faster still. Shouldn’t your interfaces at least have a head start on display manufacturers?

This reliance on HDMI has led several manufacturers into a potential trap, investing heavily in signal distribution architectures that may quickly run into a “future-proofing” problem. In contrast, outside the commercial AV industry, everyone from cable TV system operators to broadcasters and telecom operators is busy migrating to an IP-based architecture.

Not only does IP-based architecture have the advantage of being a relatively open system, it also solves many of the speed issues as 1-gigabit and 10-gigabit networks are becoming more commonplace. (Heck, Comcast just upgraded my home Internet speeds to 75 Mb/s on downloads, which is more than fast enough for me to stream 4K content from Netflix and Amazon!)

So, why don’t we do the same in the commercial AV industry? It’s not for a lack of products – there are several companies offering AV-over-IP transmitters and receivers, along with encoders and decoders. I’ve also seen impressive demos of “middleware” used to locate, switch, and play out media assets over IP networks. All of these guys were at InfoComm 2015.

The big players in HDMI-based switching and distribution argue against AV-over-IP for in-room and short-run signal distribution, citing latency and compression issues. Well, we now have a new codec (HEVC/H.265) to handle that end of things, and it’s possible to stream video at high resolutions with low latency. (How does 1920x1080p/60 at 1 to 2 Mb/s sound to you? Thought so.)

High latency is often the result of over-compression and heavy forward error correction (FEC). But if video and audio assets are streaming on bandwidth-managed, private IP networks, there isn’t a lot of forward error correction required, and Group of Pictures (GOP) sizes can be kept short to trim latency further. So latency is sort of a “straw man” argument. (And HDMI 2.0 will have plenty of issues with HDCP 2.2, trust me. Talk about latency…)
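For a feel of the GOP/latency tradeoff, here’s a rough worst-case figure for a decoder that joins a stream mid-GOP and must wait for the next keyframe; the GOP lengths and frame rate are purely illustrative:

```python
# Worst-case wait for the next keyframe when joining a stream mid-GOP.
def max_join_wait_ms(gop_frames, fps):
    return gop_frames / fps * 1000

print(max_join_wait_ms(120, 60))   # 2-second GOP -> up to 2000 ms
print(max_join_wait_ms(15, 60))    # short GOP    -> up to 250 ms
```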

As for copy protection: video and audio assets streaming over IP connections have their own security protocols. Practically speaking, what could be more secure than video content streaming directly into a UHDTV through an Ethernet connection? And unless you opt for a wired Ethernet hookup, you don’t even have to plug in a cable to make it work. Bandwidth issues? Well, how about 5 GHz 802.11ac channel-bonding routers? I’m getting 70+ Mb/s download speeds from mine with wired connections, and 25 – 30 Mb/s at some distance over my 5 GHz wireless link.

Again, looking outside our industry, the two most common signal distribution and switching architectures are based on HD-SDI or IP (or both). Not HDMI, and certainly not HDMI-derived, structured-wire systems like HDBaseT. When the rest of the world wants to multiplex video, audio, metadata, and other low-bitrate control signals, it does so over optical fiber. (Did you know that multimode fiber is cheaper than Category 6 wire?)

I’ll wrap things up by saying that the smart move is for commercial AV integrators to move to an AV-over-IP signal distribution system at the core, like everyone else, leaving the HDMI, DisplayPort, superMHL, and “whatever comes next” connections for the far ends, near the displays (if those far-end conversions are even needed at all).

Leave the core as a high-speed, software-based switcher running on a copper or optical bus. If there’s enough bandwidth (and there should be), that system can also carry local TCP/IP traffic, SMTP alerts from connected devices, and control signals to all devices. Not only does this approach free everyone from the “closed world” paradigm of HDMI, it also makes the system infinitely more appealing to end-users and facility administrators, an increasing number of whom come from the IT world.

Consider this ad that was posted recently on a listserv for higher education:

“We are looking for an experienced AV-IT Engineer for the role of Technical Guru. The position will provide planning and support for AV-IT systems used in teaching and learning spaces big and small. The person in this position will focus on design, installation, and troubleshooting of AV-IT systems in a variety of venues, including traditional classrooms, active learning classrooms, large auditoria, computer labs, and even Makerspaces…We are looking for a seasoned professional with a solid background in AV-IT systems. This is a great opportunity for a doer who is excited about not just maintaining but also shaping the future of AV-IT technology as a key element of the teaching mission of one of the world’s top universities.”

I rest my case. It’s time for the commercial AV industry to get in step with the rest of the world and move to AV-over-IP signal distribution.

Wake up. Have you smelled the coffee yet?